
Senior Data Engineer
Experience Level: Senior Level
Zinnia is building a robust data team comprising data engineers, analysts, and scientists to uncover opportunities and drive informed decision-making through data. We collaborate across all company departments to develop advanced behavioral predictors, generate insights that shape business strategy, and construct solutions that optimize both internal and external experiences.
As a Senior Data Engineer, you will play a pivotal role in shaping Zinnia's data landscape. Your responsibilities will include overseeing technological choices and implementing data pipelines and warehousing strategies. You will lead cross-organizational projects focused on automating our data value chain processes and promote best practices within the data organization.
A key aspect of this role involves designing simple, maintainable data architectures that enable Data Analysts, Data Scientists, and stakeholders to work efficiently. You will mentor junior data team members in architecture and coding techniques, serving as a knowledge hub for process improvement, automation, and the adoption of new technologies to enhance data timeliness and coverage.
You will design and implement data pipelines using ETL tools, event-driven software, and streaming technologies. Close collaboration with data scientists and engineers is essential to bring innovative concepts to fruition, so you must communicate effectively from both statistical and software engineering perspectives. Ensuring the reliability of data pipelines and enforcing data governance, security, and customer data protection while managing technical debt will be crucial.
We encourage a mindset of innovation, customer focus, and experimentation. You will partner with product and engineering teams to design data models for optimal downstream data utilization and evaluate/champion new engineering tools that enhance team velocity and scalability.
What you'll need:
- A technical Bachelor's or Master's degree and 5+ years of experience in data engineering (data pipelining, warehousing, ETL tools).
- Extensive experience with data engineering techniques, Python, and SQL.
- Working knowledge of Airflow and dbt.
- Expertise in data engineering tooling such as Jira, Git, Buildkite, and containers, along with the GCP suite, Terraform, Kubernetes, and Cloud Functions.
- Understanding of standard ETL patterns, modern data warehousing concepts (e.g., data mesh, data vaulting), and data quality practices including test-driven design and data observability.
- A blend of high-level architectural thinking and hands-on coding ability.
- Passion for all aspects of data: big data, small data, transformation, quality, accessibility, and delivering value.
- A desire for ownership to solve problems and lead a team in delivering modern, efficient data pipeline components.
- A commitment to a culture of learning and teaching.
- A drive for continuous improvement and knowledge sharing.
- A willingness to take risks and explore novel solutions to complex problems.
Technologies you will use:
- Python for data pipelining and automation.
- Airbyte for ETL.
- Google Cloud Platform, Terraform, Kubernetes, Cloud SQL, Cloud Functions, BigQuery, Datastore.
- Airflow and dbt for data pipelining.
- Tableau and Power BI for data visualization.
Join Zinnia in Gurugram or Noida and contribute to transforming the insurance industry with cutting-edge technology and data insights.
Company
Zinnia
Zinnia is a leading technology platform dedicated to accelerating growth in the life and annuities insurance sector. Through innovative enterprise solutions and data insights, Zinnia streamlines the p...