Your mission & challenges
- Design, implement, and maintain ETL data pipelines at scale
- Build and optimize data models for robotics applications
- Ensure data quality, governance, and security across all platforms
- Develop data workflows using scalable processing, streaming, and dataset curation technologies
- Collaborate with cross-functional teams to deliver high-quality datasets
- Evaluate and integrate emerging data engineering technologies and best practices
What we look forward to
- Master’s degree in Computer Science, Information Systems, or a related field
- 7+ years of experience in data engineering or related roles
- Strong programming skills in Python and SQL; experience with Java or Scala for big data frameworks
- Experience with modern data technologies (Spark, Kafka, Airflow) and NoSQL databases (e.g., MongoDB)
- Cloud expertise (AWS, Azure, or GCP) and familiarity with data lake / data warehouse solutions
- Proficiency in containerization and orchestration (Docker, Kubernetes)
- Excellent problem-solving and debugging skills
- Ability to work independently and as part of a team
- A perfect command of English and, ideally, good German