DevOps Engineer - Data Lake at PayNet (Payments Network Malaysia)
As a Principal DevOps Engineer supporting DataOps, you will play a crucial role in enabling efficient and reliable data pipeline building, data analysis, and data governance. Your primary responsibility will be to ensure smooth data ingestion into the data lake or data warehouse, enabling downstream data consumption by Data Scientists, Analysts, and other stakeholders. You will utilize various AWS services such as MSK/Kafka, S3, EMR or Glue, Lake Formation, Athena, and QuickSight, and manage ingestion from databases such as MS SQL, MySQL, Oracle, and PostgreSQL.
Responsibilities
Collaborate with Data Engineers to design, implement, and optimize data pipelines.
Ensure efficient and reliable data ingestion processes from various sources into the data lake or data warehouse.
Implement data validation and quality checks to maintain data integrity throughout the pipeline.

Data Analysis Support
Assist Data Scientists and Analysts in their routine analytical work by providing data access and query support.
Optimize data storage and retrieval mechanisms to enhance performance and efficiency.
Collaborate with stakeholders to identify and implement solutions for data analysis and reporting requirements.

Data Governance and Security
Utilize AWS Lake Formation and other relevant tools to establish and enforce data governance policies.
Implement security measures and access controls to protect sensitive data.
Collaborate with stakeholders to ensure compliance with data privacy regulations.

Functional Competencies
Technical Expertise
Strong expertise in AWS services such as MSK/Kafka, S3, EMR or Glue, Lake Formation, Athena, and QuickSight.
Proficient in database management systems such as MS SQL, MySQL, Oracle, and PostgreSQL.
Familiarity with Change Data Capture (CDC) mechanisms and practices.
Solid understanding of data pipeline architecture, ETL processes, and data integration techniques.

Collaboration and Communication
Excellent collaboration skills to work closely with Data Engineers, Data Scientists, Analysts, and stakeholders.
Strong communication skills to effectively convey technical concepts to both technical and non-technical audiences.
Ability to collaborate with cross-functional teams and promote a culture of knowledge sharing and continuous improvement.

Problem-Solving and Troubleshooting
Proven ability to identify and resolve complex technical issues related to data ingestion, processing, and analysis.
Experience in performance tuning, optimizing data storage and retrieval, and addressing scalability challenges.
Strong analytical and problem-solving skills to diagnose and troubleshoot system-level issues.

Automation and Infrastructure as Code (IaC)
Proficiency in infrastructure automation using tools such as Terraform and CloudFormation.
Experience in managing and administering Kubernetes on-premises or EKS in AWS.
Experience with version control systems (e.g., Git) and CI/CD pipelines for automated deployments.
Understanding of Infrastructure as Code (IaC) principles to maintain reproducible and scalable environments.
Proficient in programming languages such as Python, Golang, and Java.

Minimum Qualifications
Bachelor's Degree in Computer Science, Information Systems Technology, or Software Engineering.
More than 7 years of relevant experience in Information Technology, especially in DevOps.
Relevant work experience in the Financial Services and/or Technology sectors would be an added advantage.
Excellent command of both English and Bahasa Malaysia.

Seniority level
Mid-Senior level

Employment type
Full-time

Job function
Engineering and Information Technology