
R C W M A S Global
Naukri
AI Automation Engineer
Hyderabad, Kolkata, Mumbai, New Delhi, Pune, Chennai, Bengaluru
Entry Level
Full Job Description
AI Automation Engineer - Hyderabad, Kolkata, Mumbai, New Delhi, Pune, Chennai, Bengaluru
R C W M A S Global is seeking an AI Automation Engineer to join our team in multiple locations across India, including Hyderabad, Kolkata, Mumbai, New Delhi, Pune, Chennai, and Bengaluru.
About the Role:
As an AI Automation Engineer, you will architect and oversee the deployment of AI-driven automation across data ingestion, transformation, model training, serving, and monitoring pipelines. You'll ensure all processes meet high standards for data security, privacy, and regulatory compliance.
Core Responsibilities:
- Vertex AI Pipeline Development: Build, manage, and scale Vertex AI Pipelines (Kubeflow / Vertex Workbench) to enable reproducible, robust ML/AI workflows.
- Data Ingestion & Orchestration: Engineer data ingestion flows from various sources into Cloud Storage (GCS) or BigQuery, using Dataflow, Pub/Sub, Cloud Composer (Airflow), and Cloud Functions.
- Secure Data Handling: Implement data classification, encryption (at rest and in transit), IAM governance, and audit logging using Cloud KMS, VPC Service Controls, and Cloud DLP.
- CI/CD for ML: Automate model builds, testing, and deployment using Vertex AI Model Registry, Container Registry, Cloud Build, GitOps tools, and open-source CI/CD.
- Infrastructure as Code (IaC): Use Terraform, Deployment Manager, or CDK to define data and AI infrastructure, incorporating least-privilege policies and reproducibility.
- Monitoring & Observability: Deploy logging and monitoring using Cloud Monitoring, Cloud Logging, APM, and Vertex AI Model Monitoring, with alerting for data drift, resource issues, and SLI/SLO breaches.
- Security Reviews & Compliance: Conduct threat modeling, risk assessments, and align with SOC 2, ISO 27001, HIPAA, or GDPR requirements as relevant.
- Team Leadership & Collaboration: Mentor junior engineers, define best practices, and collaborate cross-functionally with Data Engineering, MLOps, Security, and Product teams.
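To make the monitoring responsibility above more concrete, here is a minimal, self-contained sketch of one common data-drift metric, the Population Stability Index. The function name, bin count, and the 0.2 threshold mentioned in the docstring are illustrative assumptions, not part of the role description:

```python
import math

def population_stability_index(expected, actual, bins=10):
    """Rough drift score between a baseline sample and a live sample.

    A common rule of thumb treats PSI > 0.2 as significant drift;
    both the metric choice and the threshold are illustrative here.
    """
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # guard against constant data

    def bin_fraction(sample, i):
        left = lo + i * width
        right = left + width
        count = sum(
            1 for x in sample
            if left <= x < right or (i == bins - 1 and x == hi)
        )
        return max(count / len(sample), 1e-6)  # avoid log(0)

    return sum(
        (bin_fraction(actual, i) - bin_fraction(expected, i))
        * math.log(bin_fraction(actual, i) / bin_fraction(expected, i))
        for i in range(bins)
    )
```

In a production pipeline a score like this would typically feed an alerting policy (e.g. in Cloud Monitoring) rather than be computed ad hoc.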
Qualifications & Skills:
Must-Have:
- 0 to 3 years in engineering or MLOps roles, with hands-on experience building production workflows in GCP.
- Hands-on experience with Vertex AI, Kubeflow Pipelines, or Kubeflow on GKE.
- Proficiency in Python, SQL, and Terraform (or comparable IaC tools).
- Strong knowledge of GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Functions, Cloud Storage, Secret Manager, IAM, KMS, VPC, etc.
- Expertise in secure data workflows: encryption, compliance frameworks, identity and access management.
- Experience implementing CI/CD automation for AI/ML systems.
Nice-to-Have:
- Certifications such as Google Cloud Professional Data Engineer, Professional Cloud Architect, or MLOps Engineering Specialist.
- Familiarity with Docker, Kubernetes, and Kubernetes-native orchestration.
- Knowledge of GitOps tooling: ArgoCD, Flux, or Jenkins X.
- Experience with data cataloging and data-quality tools such as Data Catalog, DataGov, Great Expectations, or similar.
- Statistical understanding of model evaluation, drift detection, and bias mitigation.
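As a rough illustration of the data-quality tooling mentioned above, the sketch below hand-rolls the style of rule-based checks that tools like Great Expectations automate; the column names and rules are made up for the example:

```python
def check_expectations(rows, rules):
    """Validate a list of row dicts against per-column predicates.

    Returns (row_index, column) pairs for every failed check; a real
    tool would also handle typing, profiling, and reporting.
    """
    failures = []
    for i, row in enumerate(rows):
        for column, predicate in rules.items():
            if column not in row or not predicate(row[column]):
                failures.append((i, column))
    return failures

rows = [
    {"age": 34, "email": "a@example.com"},
    {"age": -1, "email": ""},  # both fields violate the rules below
]
rules = {
    "age": lambda v: 0 <= v < 130,
    "email": lambda v: "@" in v,
}
```

Here `check_expectations(rows, rules)` returns `[(1, "age"), (1, "email")]`, flagging the second row's out-of-range age and malformed email.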