EY
5h ago · Naukri

EY - GDS Consulting - AI And DATA -...

Pune
Senior Level

Full Job Description

Opportunity: EY's Global Delivery Services (GDS) Consulting practice is expanding its AI and Data capabilities and is seeking an experienced AWS Data Engineering Manager. This is a key leadership role within a dynamic and growing team, offering the chance to lead impactful, enterprise-grade data initiatives. We are looking for individuals with deep expertise in PySpark, SQL, ETL engineering, AWS data services, and modern data lakehouse architectures.

Key Responsibilities:

  • Technical Leadership:
      • Architect, design, and oversee scalable ETL/ELT pipelines using PySpark, SQL, Python, and AWS data services.
      • Lead the implementation of robust data lakehouse solutions leveraging AWS S3, Glue, Apache Iceberg, and other cloud-native components.
      • Drive the migration of on-premises data workloads to AWS, focusing on performance, reliability, scalability, and cost optimization.
      • Define and standardize metadata-driven ingestion frameworks and implement medallion (Bronze/Silver/Gold) architecture patterns.
      • Provide expert direction on Spark job optimization, distributed processing, and performance tuning.
  • Delivery & Governance:
      • Lead teams in building and operationalizing data pipelines using orchestration tools such as Astronomer (Airflow), AWS Step Functions, and managed workflows.
      • Ensure strict adherence to data quality frameworks, industry best practices, and coding standards.
      • Conduct thorough reviews of architecture, design, and code artifacts, and troubleshoot complex technical challenges.
      • Collaborate with cross-functional teams, including BI, data science, and product, to ensure efficient data delivery.
  • People & Stakeholder Management:
      • Mentor and guide data engineers and senior engineers, fostering their technical growth and delivery excellence.
      • Engage directly with business and technical stakeholders, translating their requirements into scalable, effective data solutions.
      • Facilitate Agile/Scrum delivery across multi-functional teams.

Optional (Good to Have):

  • Experience with Databricks (Delta Lake, PySpark notebooks, Unity Catalog).
  • Familiarity with modern governance frameworks and MLOps/DevOps integrations.

Skills and Attributes for Success:

  • Minimum 9 years of overall IT experience, with at least 5 years dedicated to AWS-based data engineering.
  • A minimum of 2 years in a leadership or managerial capacity.
  • Advanced hands-on expertise in:
      • PySpark, SQL, and Python
      • AWS S3, Glue, Lambda, Step Functions, and CloudWatch
      • ETL/ELT design and data lake/lakehouse architectures
      • Apache Iceberg or similar open table formats
      • Airflow/Astronomer or equivalent orchestration tools
  • Strong understanding of structured and semi-structured data formats (e.g., Parquet, JSON, CSV, XML).
  • In-depth knowledge of Data Warehousing concepts, dimensional modeling, and performance optimization techniques.
  • Practical experience with CI/CD tooling (e.g., GitHub Actions, Azure DevOps, Jenkins).
  • Proven analytical, problem-solving, and troubleshooting capabilities.
  • Excellent communication, leadership, and stakeholder management skills.

To Qualify, You Must Have:

  • Bachelor's or Master's degree in Computer Science, IT, or a related field.
  • 9+ years of industry experience with significant hands-on exposure to cloud data engineering.
  • Demonstrated experience designing and managing production-grade AWS data platforms.
  • Proven success leading Agile/Scrum delivery teams.
  • Ability to own deliverables end-to-end with a proactive, self-driven approach.

Ideally, You'll Also Have:

  • Prior client-facing experience and the ability to influence senior stakeholders.
  • Experience delivering complex projects within multi-environment, large-scale enterprise data landscapes.
  • Exposure to Databricks, Delta Lake, or governance frameworks like Unity Catalog.

What We Look For:

We are seeking technically strong, innovative, and adaptable leaders who are passionate about mentoring teams, solving complex data challenges, and driving continuous improvement within a fast-paced environment.

What Working at EY Offers:

  • Opportunities to work on diverse, industry-leading, and high-impact data programs.
  • Access to continuous learning, coaching, and tailored career development.
  • A collaborative, inclusive, and global work environment.
  • Flexibility to manage your work in a way that suits you best.
  • A culture that champions innovation, knowledge-sharing, and personal growth.

Company

EY

Ernst & Young (EY) is a global leader in assurance, tax, transaction and advisory services. We are committed to building a better working world. Our team in Pune contributes to this global mission by ...
