Work schedule: Standard hours
Base location: CG office @ Jln Tun Razak
Mode: Hybrid
Responsibilities:
Build and maintain scalable, efficient, and resilient data pipelines and ETL/ELT workflows using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Composer (see the orchestration sketch after this list).
Partner closely with the Technical Lead (client-facing) to translate client requirements into well-architected data solutions.
Collaborate in solution design sessions, offering technical input to ensure alignment with client expectations and business goals.
Implement and maintain data security, access control, and authentication standards in compliance with organizational and client policies.
Create and maintain detailed technical documentation covering data architecture, workflows, and operational procedures.
Define and uphold coding standards, data modeling best practices, and operational excellence in the data engineering lifecycle.
Support version upgrades, testing, and validation of data tools and platforms, ensuring minimal disruption to business operations.
Monitor and troubleshoot data pipelines and infrastructure for performance, reliability, and cost-efficiency.
Investigate and resolve issues related to data quality, pipeline failures, latency, and system performance, especially within GCP environments.
Identify opportunities to improve data workflow automation and implement enhancements to reduce manual overhead.
Support the ongoing development and evolution of the enterprise data warehouse (BigQuery), ensuring scalability, integrity, and business value.
Communicate technical challenges and proposed solutions clearly to both technical and non-technical stakeholders.
Work with business units and technology partners to troubleshoot and reconcile data issues, ensuring accuracy and traceability.
Actively contribute to a culture of continuous improvement in data operations and engineering processes.
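For illustration only, pipeline orchestration of the kind described in the first bullet could take the shape of a minimal Cloud Composer (Airflow) DAG like the sketch below. The DAG id, bucket, dataset, and table names are hypothetical placeholders, and the sketch assumes the Airflow Google provider package is installed; it is not a description of any actual client codebase.

# Minimal sketch: a daily Cloud Composer (Airflow) DAG loading a Cloud
# Storage extract into BigQuery. All resource names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_load",               # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's partition of CSV files from GCS into a raw BigQuery table.
    GCSToBigQueryOperator(
        task_id="gcs_to_bigquery",
        bucket="example-landing-bucket",               # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],       # one folder per run date
        destination_project_dataset_table="example_dw.sales_raw",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",            # idempotent daily reload
        autodetect=True,
    )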
Preferred Qualifications:
1-3 years of experience in data engineering or a related field, with hands-on experience designing and deploying scalable data pipelines and ETL/ELT processes.
Proven expertise in Google Cloud Platform (GCP) services, especially BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer (Airflow).
Strong programming skills in Python and SQL, plus familiarity with data pipeline frameworks (e.g., Apache Beam, Airflow, dbt); see the Beam sketch after this list.
Experience working with data warehouse architecture and dimensional data modeling in cloud-based environments.
Solid understanding of data security, IAM, encryption, and compliance in a cloud environment.
Familiarity with CI/CD practices, version control (e.g., Git), and infrastructure as code.
Ability to troubleshoot complex data issues, perform root cause analysis, and implement long-term solutions.
Strong communication skills with the ability to clearly articulate technical concepts to both technical and non-technical stakeholders.
Demonstrated ability to work effectively in collaborative, cross-functional teams.
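As a concrete illustration of the framework skills listed above, a minimal Apache Beam streaming pipeline (the SDK behind Dataflow) that moves Pub/Sub events into BigQuery might look like the sketch below. The project, subscription, table, and schema are hypothetical placeholders chosen for the example.

# Minimal sketch: an Apache Beam streaming pipeline (runnable on Dataflow)
# reading JSON events from Pub/Sub and appending them to BigQuery.
# All resource names and the schema are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Pass --runner=DataflowRunner (plus project/region options) to run on Dataflow.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/events-sub"
        )
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            table="example-project:example_dw.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )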
Job Type: Contract
Pay: RM2,… to RM9,400.98 per month