
MLOps Engineer
Qualifications
Experience Level: Mid Level
- Proficiency in Python, TensorFlow, PyTorch, and CI/CD pipelines
- Hands-on experience with cloud ML platforms (AWS SageMaker, GCP Vertex AI, Azure ML)
- Expertise in monitoring tools (MLflow, Prometheus, Grafana)
- Knowledge of distributed data processing (Spark, Kafka)
- Bonus: experience in A/B testing and canary deployments
Full Job Description
Are you a seasoned MLOps Engineer looking for your next exciting challenge? Soul AI, by Deccan AI, is assembling an elite cohort of AI professionals and is seeking talented individuals to join our network and contribute to groundbreaking projects.

In this role, you will be instrumental in architecting and optimizing robust ML infrastructure using tools like Kubeflow, MLflow, and SageMaker Pipelines. You will build and maintain efficient CI/CD pipelines leveraging GitHub Actions, Jenkins, or GitLab CI/CD, and automate complex ML workflows, including feature engineering, retraining, and deployment. Scalability will be key as you deploy and manage ML models using Docker, Kubernetes, and Airflow. Furthermore, you'll be responsible for ensuring model observability, security, and cost optimization within cloud environments such as AWS, GCP, or Azure.

This role requires strong proficiency in Python, TensorFlow, and PyTorch, coupled with hands-on experience in cloud ML platforms like AWS SageMaker, GCP Vertex AI, or Azure ML. Expertise in monitoring tools such as MLflow, Prometheus, and Grafana is essential. A solid understanding of distributed data processing with Spark and Kafka is highly valued. Bonus points for experience with A/B testing, canary deployments, and serverless ML architectures.

To apply, please register on the Soul AI website.
Company
Soul AI
Soul AI, powered by Deccan AI, is dedicated to building a premier network of artificial intelligence professionals. We connect top-tier AI talent with innovative and cutting-edge projects in the rapid...