Optum
Indeed

Senior Data Engineering Consultant

Gurugram, Haryana
Full Time
Senior Level

Full Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:

  • Architect all phases of software engineering including requirements analysis, application design, code development and testing with a focus on business intelligence dataset development
  • Support the full data engineering lifecycle including research, designing, developing, testing and maintaining end-to-end data processing systems and data management solutions
  • Design reusable components, frameworks, and libraries
  • Contribute and make recommendations to the design and architecture to enable secure, scalable, and maintainable solutions
  • Support the implementation of a modern data framework that facilitates business intelligence reporting and advanced analytics
  • Work collaboratively with People Analytics on the development and production of standard datasets to drive actionable decision making and reporting stability
  • Work towards eliminating unwarranted complexity and unneeded interdependencies
  • Data Pipeline Development: Design, build, and maintain scalable data pipelines to process large volumes of data from various sources
  • Data Integration: Integrate data from multiple sources, ensuring data quality and consistency
  • Database Management: Develop and manage databases, data warehouses, and data lakes
  • Data Quality: Utilize data quality tools to ensure the accuracy, reliability, and integrity of data throughout its lifecycle
  • Data Governance: Implement data governance practices to ensure data security, privacy, and compliance
  • Conduct design and code reviews to ensure code developed meets business needs, coding best practices guidelines, unit testing, security, and scalability and maintainability guidelines
  • Work very closely with architecture groups and drive optimized solutions
  • Design and manage complex workflows using Apache Airflow
  • Develop and maintain DAGs (Directed Acyclic Graphs) for orchestrating data pipelines
  • Documentation: Prepare high level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation and data movement
  • Use engineering best practices following an Agile methodology to deliver high-quality emerging tech solutions
  • Communicate with impact - influence and negotiate effectively with all internal and external stakeholders to achieve win-win solutions that advance organizational goals
  • Analyze project requirements and develop detailed specifications for new data warehouse reporting requirements
  • Assess and interpret customer requests for feasibility, priority, and complexity
  • Support projects and change initiatives aligned to key priorities of People Analytics and Insights and People Technology customers
  • Proactively keep data secure and decommission legacy content in our environment
  • Serve as a resource to others within the People Analytics and Insights and People Tech community; mentor other data engineers; provide explanations and information to others on difficult issues, problems, and solutions
  • Work with minimal guidance; seek guidance on only the most complex tasks
  • Coach, provide feedback, and guide others within the People Analytics community
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
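Much of the orchestration work described above centres on DAGs (Directed Acyclic Graphs), where each pipeline task declares its upstream dependencies and a scheduler runs tasks in dependency order. As a rough illustration of that idea — using Python's standard library rather than Airflow itself, with hypothetical task names not taken from the posting — the ordering logic might be sketched as:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG declares task ordering.
dag = {
    "extract": set(),
    "validate": {"extract"},          # data-quality check runs after extract
    "transform": {"validate"},
    "load_warehouse": {"transform"},
    "refresh_reports": {"load_warehouse"},
}

def run_pipeline(dag):
    """Execute tasks in dependency order (a stand-in for a real scheduler)."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

run_pipeline(dag)
```

In Airflow the same structure is declared with operators and `>>` dependencies, and the scheduler — not a loop like this — decides when each task runs; the sketch only shows the dependency-ordering concept behind the role's DAG work.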

Required Qualifications:

  • 9+ years of data engineering experience
  • 7+ years of full lifecycle application, software development experience
  • 6+ years of experience with modern programming languages and frameworks such as Python, Java, Scala, and Spark
  • 6+ years of SDLC experience in an Agile environment
  • 5+ years of solid hands-on experience with Snowflake and Azure Databricks
  • Experience with Cloud technologies and platforms such as Docker, Podman, OSE, Kubernetes, AWS, Snowflake, and Azure
  • Experience with Jenkins, GitHub, Big Data technologies like Spark, PySpark
  • Experience using IDEs such as Eclipse, JBoss, IntelliJ
  • Relational database experience
  • Experience ingesting and working with large and complex datasets
  • Experience gathering requirements from end users
  • Experience building data pipelines on Azure Databricks/Snowflake, following best practices in Cloud deployments
  • Experience in Data Integration and Data Warehousing, working on Public Cloud (Azure)
  • Working knowledge of the following business and technology concepts: APIs, CI/CD, Big Data ecosystem, cloud data warehousing, data architecture and data governance
  • Grow and maintain knowledge of emerging technologies and leverage them
  • Understand priorities and organize prescribed and non-prescribed work to meet or exceed deadlines and expectations
  • Familiarity with Azure Services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, Key Vault, etc.

Preferred Qualifications:

  • Master's degree in Computer Science, Engineering, or Technology
  • Relevant certifications in data engineering or cloud platforms (Azure and Databricks)
  • Experience with People Data (Human Capital domain)
  • Experience with disaster and recovery models
  • Experience creating user stories in an agile tool using the Gherkin format

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone—of every race, gender, sexuality, age, location and income—deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

