AVP, Data Integration (pySpark, NiFi, Hadoop)

Maybank, Kuala Lumpur, Malaysia
30+ days ago
Job description

  • Implement ETL systems that are operationally stable, efficient, and automated. This includes technical solutions that are scalable, aligned with the enterprise architecture, and adaptable to business changes.
  • Collaborate with internal and external teams to define requirements for data integrations, specifically for Data Warehouse / Data Marts implementations.

Responsibilities of the Role

  • Review business and technical requirements to ensure the data integration platform meets specifications.
  • Apply industry best practices for ETL design and development.
  • Produce technical design documents, system testing plans, and implementation documentation.
  • Conduct system testing: execute job flows, investigate and resolve system defects, and document results.
  • Work with DBAs, application specialists, and technical support teams to optimize ETL system performance and meet SLAs.
  • Assist in developing, documenting, and applying best practices and procedures.

Requirements of the Role

  • Strong SQL writing skills are required.
  • Familiarity with ETL tools such as pySpark, NiFi, Informatica, and Hadoop is preferred (a brief pySpark sketch follows this list for illustration).
  • Understanding of data integration best practices, including master data management, entity resolution, data quality, and metadata management.
  • Experience with data warehouse architecture, source system data analysis, and data profiling.
  • Ability to work effectively in a fast-paced, adaptive environment.
  • Financial domain experience is a plus.
  • Ability to work independently and communicate effectively across various levels, including product owners, executive sponsors, and team members.
  • Experience working in an Agile environment is advantageous.
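
As a rough illustration of the kind of pySpark ETL work named above, the sketch below reads a raw source extract, applies basic cleansing, and writes to a Data Warehouse staging area. The file paths, column names, and staging layout are assumptions for illustration only, not details from this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dw_staging_load").getOrCreate()

    # Extract: read a raw source extract (path and schema are illustrative assumptions)
    raw = spark.read.option("header", True).csv("/data/raw/transactions.csv")

    # Transform: deduplicate, type-cast, and drop unusable rows
    cleaned = (
        raw.dropDuplicates(["txn_id"])
           .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())
    )

    # Load: write to the warehouse staging area as Parquet
    cleaned.write.mode("overwrite").parquet("/data/warehouse/staging/transactions")

    spark.stop()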

Qualifications

  • Bachelor’s Degree in Computer Science, Information Technology, or equivalent.
  • Over 5 years of total work experience, including programming ETL processes using Informatica, NiFi, pySpark, and Hadoop.
  • At least 4 years of experience in data analysis, profiling, and designing ETL systems/programs.

Seniority level

  • Mid-Senior level

Employment type

  • Full-time

Job function

  • Information Technology

Industries

  • Banking
