Benefits
13th Month Bonus
Staff Insurance Provided
Key Responsibilities
Pipeline Development: Design, build, and maintain batch and streaming data pipelines using GCP services such as BigQuery, Dataflow, Dataproc, Composer, Dataform, and Cloud Functions.
Data Modeling & Optimization: Implement and optimize data models in BigQuery to support analytics, BI reporting, and machine learning workloads.
Data Integration: Connect and transform data from multiple sources, including APIs, databases, event streams, and flat files.
Platform Reliability: Monitor and troubleshoot data pipelines, ensuring high availability, scalability, and cost efficiency.
Governance & Quality: Implement data validation, quality checks, and security best practices to ensure trusted data.
Collaboration: Work closely with analysts, BI developers (Tableau, MicroStrategy), and business teams to enable reporting and self-service analytics.
Legacy Support (Light): Provide occasional support for legacy systems (Oracle, MicroStrategy) where needed, focusing on data extraction and gradual modernization.
Key Requirements
Experience: 4–8 years of hands-on experience in data engineering, ETL/ELT development, or related roles.
Technical Skills
Data Modeling: Understanding of relational, dimensional, and modern data modeling concepts, with an eye for performance optimization.
Cloud Knowledge: Exposure to the Azure data stack (Synapse, Data Factory, Databricks) is a plus.