We are seeking a skilled and detail-oriented data professional to design, build, and maintain scalable data solutions that enable data-driven decision-making across the organization. The ideal candidate will have hands-on experience with Azure Data Factory (ADF), Azure Databricks, and CI/CD pipelines, along with strong analytical and data modeling capabilities.
Key Responsibilities
- Design, develop, and maintain robust data pipelines using Azure Data Factory (ADF) and Databricks.
- Collaborate with business and technical stakeholders to gather, analyze, and translate data requirements into scalable solutions.
- Develop and optimize ETL/ELT workflows to integrate data from various sources into centralized data models or data warehouses.
- Build and maintain data models (conceptual, logical, and physical) that ensure consistency, quality, and accessibility of enterprise data.
- Implement and maintain CI/CD pipelines for automated deployment and version control of data solutions.
- Conduct data analysis and validation to ensure accuracy, integrity, and completeness of data across environments.
- Create dashboards, reports, or analytics views to support business insights and data visualization needs.
- Work closely with cross-functional teams (data scientists, analysts, business users) to improve data quality, performance, and governance.
- Ensure security, compliance, and documentation standards are followed throughout the data lifecycle.
Required Skills & Experience
- Bachelor’s or Master’s degree in Computer Science, Information Systems, Statistics, or a related field.
- 3–8 years of experience in data engineering, analytics, or modeling (depending on role level).
- Strong proficiency with Azure Data Factory (ADF) and Azure Databricks for pipeline orchestration and data processing.
- Hands-on experience with SQL, Python, and data transformation frameworks.
- Solid understanding of data warehousing concepts, data modeling (Dimensional, 3NF), and ETL best practices.
- Experience with CI/CD tools (Azure DevOps, Git, or similar) for automated deployment.
- Knowledge of cloud platforms (Azure, AWS, or GCP) and related data ecosystem tools.
- Strong analytical thinking, problem-solving, and communication skills.
Preferred (Nice-to-Have)
- Experience with Power BI, Synapse Analytics, or Snowflake.
- Exposure to Data Vault, Medallion Architecture, or Lakehouse design principles.
- Understanding of data governance, metadata management, and data security practices.
- Familiarity with Agile/Scrum project methodologies.
Seniority level
Mid-Senior level
Employment type
Contract
Job function
Business Development
Industries
Data Infrastructure and Analytics