Company: Virtej Technologies
Job Title: Data & AI Ops Engineer
Employment Type: Full-time, Contractual (renewable)
Experience Level: Mid to Senior (6+ years of experience)
About the Role: We are looking for a Data & AI Ops Engineer to design, develop, and optimize scalable data pipelines and integration workflows powering analytics, reporting, and AI-driven initiatives. This role bridges data engineering, AI operations, and systems integration, ensuring that data flows seamlessly across platforms to support real-time business intelligence and decision-making.
You’ll work closely with application developers, business analysts, and data scientists to build reliable, high‑performance data solutions aligned with enterprise goals.
Key Responsibilities:
- Design, develop, and maintain data pipelines supporting analytics, reporting, and operational systems.
- Ensure data quality, consistency, and reliability through validation and transformation processes.
- Collaborate with cross‑functional teams to define data requirements and deliver integrated solutions.
- Optimize data storage and retrieval performance across databases, data lakes, and cloud platforms.
- Deploy and enhance data models, APIs, and ETL / ELT frameworks, aligned with architectural standards.
- Monitor and resolve data-related incidents and service requests in accordance with SLAs.
- Document data flows, schemas, and integration logic for compliance and knowledge sharing.
- Stay current with emerging data and AI technologies to improve efficiency and solution effectiveness.
Qualifications:
- Bachelor's degree in Computer Science, Data Analytics, or a related field with 6+ years of relevant experience, OR a Diploma in a relevant field with 8+ years of practical experience in data engineering and integration.
- Minimum 3–4 years in enterprise-grade data pipeline design and maintenance.
Required Skills & Experience:
- Proficiency in data architecture principles, data modeling, and pipeline design.
- Hands-on experience with RDBMS, ETL/ELT frameworks, and distributed data processing.
- Experience with Azure Data Factory, Databricks, and Power BI (preferred).
- Familiarity with API-based integrations (REST, SOAP, MQ).
- Knowledge of Azure/AWS/GCP cloud environments.
- Understanding of data governance, privacy, and compliance.
- Awareness of AI/ML fundamentals, especially for chatbots and predictive analytics.
- Experience with business intelligence tools and reporting frameworks.
- Excellent communication skills for both technical and non-technical audiences.
- Strong documentation and collaboration skills.