CloudMile. Located in the Federal Territory of Kuala Lumpur, Malaysia.
We are seeking an experienced Data Engineer to join our technical team. In this role, you will own our data architecture, designing and optimizing robust ETL/ELT pipelines on modern cloud platforms (AWS, GCP, or Snowflake). Your work will bridge raw data and actionable insights, ensuring scalable, cost-effective, and fault-tolerant infrastructure.
Key Responsibilities
- Advanced Pipeline Development: Design, build, and maintain scalable ETL/ELT pipelines using SQL and Python, moving data from APIs, databases, and logs into the data warehouse.
- Cloud Architecture & Management: Manage and optimize cloud infrastructure on AWS (Redshift, Glue, Lambda, S3), GCP (BigQuery, Dataflow, Cloud Composer), or Snowflake; configure access and storage.
- Performance Tuning & Cost Optimization: Monitor query performance, refactor inefficient code, and implement cost-saving strategies.
- Data Modeling: Build efficient star/snowflake schemas and semantic layers that serve as the company's single source of truth.
- BI Infrastructure & Semantic Layer: Support visualization tools (Tableau, Power BI, Looker, AWS QuickSight) by defining reusable semantic models (LookML or Power BI datasets).
- Orchestration & Automation: Build workflow orchestrations with Apache Airflow, AWS Glue, or similar tools; implement CI/CD for data pipelines.
- Data Quality & Governance : Enforce quality standards and create automated tests (Great Expectations, dbt tests) to detect anomalies.
- End-to-End Dashboard Development: Own the "last mile" of data delivery: gather stakeholder requirements, design layouts, and build dashboards.
Requirements
- Experience: 3–6 years of professional data engineering experience.
- Advanced SQL & Python: Expert SQL (window functions, CTEs, stored procedures) and Python scripting with Pandas and API integration.
- Cloud Expertise: Hands-on experience with at least one major cloud ecosystem:
  - AWS: Redshift, EMR, Glue, Lambda, S3.
  - GCP: BigQuery, Cloud Composer, Dataflow, Cloud Functions.
  - Snowflake: Snowpipe, Streams & Tasks, Data Sharing.
- Orchestration: Production workflows in Apache Airflow or equivalent.
- Data Modeling: Deep knowledge of dimensional modeling, data warehousing, and data lakes.
- Dashboard Creation: Ability to build polished, professional dashboards from scratch.
- Design Best Practices: Apply layout, color theory, and UI/UX principles for clarity.
Preferred Qualifications (Nice to have)
- Familiarity with dbt for transformation.
- Experience with Docker and Kubernetes.
- Knowledge of streaming data (Kafka, Kinesis, Pub/Sub).
Additional Information
- Location: Kuala Lumpur, Malaysia.
- Employment Type: Contract.
- Seniority Level: Associate.
- Industry: IT Services & Consulting.