About the Role
Boston Consulting Group (BCG) is seeking a Global IT GenAI Senior Specialist based in Gurgaon, Haryana, India. The role sits within the Data Product Portfolio (Data Ingestion and Analyst Chapter) and focuses on designing, developing, deploying, and optimizing Generative AI solutions at scale, while contributing to data ingestion engineering as needed.
Responsibilities
- Design, develop, and deploy scalable, impactful GenAI solutions.
- Act as a hands-on coding contributor, driving AI/GenAI use cases from prototype to production.
- Build and maintain scalable data ingestion pipelines for structured, semi-structured, and unstructured data.
- Implement ETL/ELT workflows using dbt.
- Ensure data quality and compliance.
- Curate and prepare datasets for LLM fine-tuning, embeddings, and retrieval-augmented generation (RAG) pipelines.
- Work with vector databases to support embedding storage and similarity search (a minimal retrieval sketch follows this list).
- Contribute to MLOps/LLMOps practices.
- Collaborate with data engineers, AI engineers, and product owners.
- Share insights and stay current with GenAI frameworks such as LangChain and LlamaIndex.
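To make the RAG-related responsibilities above more concrete, here is a minimal, framework-agnostic Python sketch of the retrieval step: embed documents, index them, and return the nearest matches for a query. The embed() function, the in-memory document list, and the vector dimension are illustrative placeholders rather than BCG's implementation; in practice this step would use a real embedding model and a vector database, typically accessed through a framework such as LangChain or LlamaIndex.

```python
import numpy as np

# Placeholder embedding function: in a real pipeline this would call an
# embedding model (e.g. via LangChain or LlamaIndex) rather than hashing text.
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

# Toy corpus standing in for curated, ingested documents.
documents = [
    "Quarterly revenue grew 12% year over year.",
    "The ingestion pipeline loads semi-structured JSON into the warehouse.",
    "RAG combines retrieval over a vector index with LLM generation.",
]

# Build the index: embed each document and normalise for cosine similarity.
doc_vectors = np.stack([embed(d) for d in documents])
doc_vectors /= np.linalg.norm(doc_vectors, axis=1, keepdims=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents whose embeddings are closest to the query."""
    q = embed(query)
    q /= np.linalg.norm(q)
    scores = doc_vectors @ q  # cosine similarity against every document
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

if __name__ == "__main__":
    # With a real embedding model the scores reflect semantic similarity;
    # here they only demonstrate the mechanics of the retrieval step.
    print(retrieve("How does retrieval-augmented generation work?"))
```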
Qualifications
Candidates should possess:
- 5+ years in technology, including 4+ years in data engineering and 1+ year in GenAI/AI product development at scale.
- Experience with Agentic AI architectures or RAG pipelines.
- Hands-on coding in Python and experience with GenAI frameworks (LangChain, LlamaIndex).
- Strong understanding of LLMOps/MLOps.
- Experience deploying solutions on AWS (preferred), Azure, or GCP, with containerization (Docker/Kubernetes).
- Strong knowledge of Python, SQL, dbt, and Snowflake for building and maintaining data ingestion pipelines.
- Experience with AWS Glue and S3 (see the ingestion sketch after this list).
- Experience working with structured, semi-structured, and unstructured datasets.
- Ability to analyze performance bottlenecks and optimize data pipelines.
- Exposure to version control (GitHub/Bitbucket) and Agile methodologies.
- Good communication and stakeholder engagement skills.
- Experience working in a fast-paced, global company with diverse teams.
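As an illustration of the Python/S3/Snowflake ingestion stack listed above, the following is a minimal sketch that reads a JSON file from S3, flattens it with pandas, and loads it into a Snowflake table. It assumes boto3 and the Snowflake Python connector (with its pandas extras) are installed; the bucket, key, credentials, and table name are placeholders, and a production pipeline would add schema validation, incremental loading, and dbt transformations downstream.

```python
import json

import boto3
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Placeholder locations and credentials -- replace with real values/secrets.
BUCKET, KEY = "example-bucket", "landing/events.json"

# 1. Pull the raw semi-structured file from S3.
s3 = boto3.client("s3")
raw = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
records = json.loads(raw)

# 2. Flatten nested JSON into a tabular DataFrame.
df = pd.json_normalize(records)

# 3. Load the DataFrame into Snowflake (assumes the target table exists).
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="EXAMPLE_WH",
    database="EXAMPLE_DB",
    schema="RAW",
)
try:
    success, _, nrows, _ = write_pandas(conn, df, "EVENTS_RAW")
    print(f"Loaded {nrows} rows (success={success})")
finally:
    conn.close()
```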
BCG is an Equal Opportunity Employer.