Responsibilities
- Administer and maintain Cloudera CDH / CDP clusters, including installation, configuration, upgrades, and patching.
- Monitor cluster health and performance using Cloudera Manager and other tools.
- Manage and optimize Hadoop ecosystem components (HDFS, YARN, Hive, Impala, Spark, HBase, etc.).
- Implement and manage Kubernetes clusters for containerized workloads and services.
- Ensure high availability, disaster recovery, and data security across Big Data platforms.
- Troubleshoot and resolve system and application issues in a timely manner.
- Collaborate with data engineers, developers, and DevOps teams to support data pipelines and analytics workloads.
- Maintain documentation for configurations, procedures, and troubleshooting guides.
- Stay current with industry trends and best practices in Big Data and container orchestration.
Required Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Minimum 5 years of hands‑on experience administering Cloudera CDH / CDP environments.
- Strong experience with the Hadoop ecosystem and Big Data technologies.
- Proficiency in Kubernetes administration and container orchestration.
- Solid understanding of Linux system administration.
- Experience with monitoring tools and performance tuning.
- Excellent problem‑solving and communication skills.
Preferred Qualifications
- Cloudera certifications (e.g., CCA Administrator, CCA Spark and Hadoop Developer).
- Experience with cloud platforms (AWS, Azure, GCP).
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with Apache NiFi, Kafka, and other data ingestion tools.
Employment Details
Seniority level: Mid‑Senior level
Employment type: Contract
Job function: Information Technology
Industries: IT Services and IT Consulting
Location
Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia