
Experience Level: Senior Level
Full Job Description
EY GDS TechOps is seeking a Senior Data Engineer specializing in CloudOps and Azure Data Lake to join our team in Noida, India. This role focuses on application management and ensuring the optimal performance of modern data architectures.
You will be instrumental in supporting, optimizing, and maintaining our end-to-end data ecosystem, which includes Informatica CDI, Azure Data Factory, Azure Data Lake, and Databricks. Your responsibilities will involve providing technical leadership and application management expertise to global clients, ensuring the seamless operation of data platforms, resolving incidents promptly, and implementing enhancements that align with business objectives. Collaboration with cross-functional teams will be key to driving data reliability and delivering value through best practices and innovation.
Key Responsibilities:
- Provide daily Application Management Support for the full data stack including Informatica CDI, Azure Data Factory, Azure Data Lake, and Databricks, addressing service requests, incidents, enhancements, and changes.
- Lead and coordinate the resolution of complex data integration and analytics issues through thorough root cause analysis.
- Collaborate with technical and business stakeholders to support and optimize data pipelines, models, and dashboards.
- Maintain detailed documentation such as architecture diagrams, troubleshooting guides, and test cases.
- Demonstrate flexibility for shift-based work or on-call duties, adapting to client needs and critical business periods.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Data Analytics, or a related field, or equivalent work experience.
- 3–7 years of experience in modern data ecosystems with hands-on proficiency in:
  - Informatica CDI for data ingestion and transformation.
  - Azure Data Factory (ADF) for pipeline orchestration.
  - Azure Data Lake (ADLS) for cloud storage and data lake architecture.
  - Databricks for large-scale data processing using SQL, PySpark, and Delta Lake.
- Proven experience in application management support, including incident resolution, enhancements, monitoring, and optimization.
- Strong root cause analysis skills for data pipelines, storage layers, and reporting.
- Excellent stakeholder collaboration skills to translate business needs into scalable technical solutions.
- Solid understanding of data governance, performance tuning, and cloud-based data architecture best practices.
- Experience working in global delivery models and distributed teams.
- Flexibility to manage work hours, including shifts and on-call availability, due to the dynamic nature of Application Management.
Preferred Qualifications:
- Experience integrating data from diverse sources like ERP, CRM, POS, and third-party APIs.
- Familiarity with DevOps/CI-CD pipelines within a data engineering context.
- Industry experience in retail, finance, or consumer goods.
- Relevant certifications such as Informatica Certified Developer, Microsoft Certified: Azure Data Engineer Associate, or Databricks Certified Data Engineer.
Join EY in Noida, Uttar Pradesh, India, and contribute to a world-class, multidisciplinary team delivering data excellence to global businesses. We offer opportunities for professional growth, skill development, and a flexible work environment.
Company
EY
EY is a global leader in professional services, dedicated to building a better working world. We empower our clients, from emerging startups to Fortune 500 companies, to achieve their goals. Our diver...