Tranglo WP, Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia
Key responsibilities include:
- Drive the implementation and optimization of centralized data repositories and data lake architectures, supporting enterprise-wide analytics and reporting.
- Identify, diagnose, and resolve data-related issues, ensuring data quality, integrity, and availability for downstream analytics and business applications.
- Apply advanced data modeling techniques to solve real-world business challenges, ensuring data consistency, integrity, and performance.
- Provide advanced support and troubleshooting for data pipelines, ETL/ELT processes, and data integration workflows across cloud and on-premise environments.
- Enhance and mentor others in organizational, communication, and analytical skills, fostering a collaborative and data-driven engineering culture.
- Gain exposure to the end-to-end machine learning lifecycle, from experimentation to production, by enabling robust and scalable data access for model training and inference.
- Develop, optimize, and maintain robust data pipelines (ETL/ELT) across both on-premise and cloud environments.
- Work closely with business and technical teams to ensure that data mapping aligns with business rules and reporting needs.
- Maintain data lake and data warehouse solutions.
- Develop detailed reports and documentation of incidents and resolutions to contribute towards continuous improvement processes within the team.
- Provide technical guidance to junior staff and implement coding best practices and standards.
- Create and maintain data mapping documents to define the relationships between source data and target schemas.
- Develop and support ingestion of data from multiple sources into a centralized data repository or data lake.
- Perform basic data cleaning, transformation, and validation to ensure data quality and integrity.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering, data warehousing, or big data environments.
- Strong understanding of data structures, performance tuning, and optimization.
- Strong proficiency in SQL and one or more programming languages (Python, Java, or Scala).
- Ability to troubleshoot and improve complex data workflows.
- Experience designing systems for scalability, reliability, and maintainability.
- Strong communication and documentation skills.
- Solid understanding of data governance, security, and compliance principles.
- Exposure to BI tools such as Power BI, Tableau, or Looker for data consumption and visualization.
- Exposure to machine learning project implementation.
- Nice to have: Experience with streaming platforms.
Salary: Up to RM11,000.
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Analyst, Information Technology, and Product Management
Industries: Financial Services, Technology, Information and Media, and Banking