Project description
Our client, a leading bank in Asia with a global network of more than 500 branches and offices in 19 countries and territories across Asia Pacific, Europe, and North America, is looking for consultants to join the project. The Technology and Operations function comprises five teams of specialists with distinct capabilities: business partnership, technology, operations, risk governance, and planning support and services. These teams work closely together to harness the power of technology to support the bank's physical and digital banking services and operations, including developing, centralising, and standardising technology systems and banking operations in Malaysia and overseas branches. The client has more than 80 years of history in the banking industry and is expanding its footprint in Malaysia. You will work in a newly established technology centre in Kuala Lumpur as part of Technology and Operations, delivering innovative financial technology solutions that enable business growth and technology transformation.
Responsibilities
- Design, develop, and maintain data pipelines and ETL workflows using Informatica Data Integration Suite, Python, and R.
- Build and optimize large-scale data processing systems on Cloudera Hadoop (6.x) and Teradata Inteliflex platforms.
- Implement data ingestion, transformation, and storage solutions integrating diverse data sources, including Oracle, SQL Server, PostgreSQL, and AS400.
- Develop and deploy dashboards and analytics solutions using QlikSense, Microsoft Power BI, and other visualization tools.
- Collaborate with business teams to deliver analytics and decision-support solutions across domains like Credit Risk Analytics, Credit Scoring, Treasury & Wealth Management, and Trade Finance.
- Leverage data science tools (Python, R Studio, Kafka, Spark) to support predictive modeling, scoring, and advanced analytics use cases.
- Participate in code reviews, performance tuning, and data quality validation using tools like QuerySurge, SonarQube, and JIRA.
- Automate workflows, deployments, and job scheduling using Jenkins, Control-M, and Bitbucket.
- Ensure scalability, security, and governance of data solutions in production environments across Linux, AIX, Windows, and AS400 platforms.
Must have
3 to 5 years of experience in the following:
- Big Data & Data Engineering: Cloudera Hadoop (6.x), Spark, Hive, HUE, Impala, Kafka
- ETL & Data Integration: Informatica (BDM, IDQ, IDL), QuerySurge
- Databases: Teradata Inteliflex, Oracle, SQL Server, PostgreSQL
- Data Visualization: QlikSense Discovery, Microsoft Power BI
- Programming & Analytics: Python, R, R Studio
- Version Control & Automation: Jenkins, Bitbucket, Control-M
- OS: AS400, AIX, Linux, Windows
- Domain Knowledge: minimum of 1 of the following:
  - Credit Risk Analytics
  - Credit Scoring & Decision Support
  - Treasury & Wealth Management (Murex)
  - Trade Finance & Accounts Receivable (FITAS, ARF)
  - Retail Banking & Cards (Silver Lake)
  - Data Modeling (FSLDM / Data Marts)
Nice to have
- AS400
- Experian PowerCurve
- SAS