Mid-Level Data Engineer
Experience Level: Mid Level
Full Job Description
We are seeking a Mid-Level Data Engineer to join our team in Chennai. In this role, you will analyze requirements and develop and maintain robust data pipelines and ETL processes to ensure efficient, reliable data extraction, transformation, and loading from diverse sources into appropriate data repositories. You will create and manage logical and physical data models that support our data architecture and business objectives, defining schemas, tables, and relationships for optimal data retrieval and analysis.

Collaboration with cross-functional teams and stakeholders will be key to ensuring data security, privacy, and regulatory compliance. You will work closely with downstream application teams to understand their needs and optimize data storage solutions accordingly, and with business stakeholders to translate data requirements into technical solutions.

Identifying and mitigating risks, solving complex problems, and ensuring on-time delivery are crucial aspects of this position. Familiarity with Agile methodologies (Scrum/Kanban) and experience with software development best practices, including secure coding, unit testing, code coverage, and quality gates, are essential. You should be able to lead and deliver change effectively, conduct technical discussions with customers to identify optimal solutions, and work closely with Project Managers and Solution Architects, including client communication. You will proactively identify opportunities for task automation, develop reusable frameworks, and collaborate with team members to ensure service reliability, maintainability, and integration. You will also create technical write-ups and drawings to document proposed solutions.
What you should bring along
Must Have
- Knowledge of Data Analytics.
- Experience building applications using AWS Services.
- 10+ years of hands-on expertise in AWS services, including S3, Lambda, Glue, Athena, RDS, Step Functions, Amazon Q, SNS, SQS, and API Gateway, as well as security, access and role permissions, and logging and monitoring services.
- Proficiency in Python, Spark, Hive, Unix, and AWS CLI.
- Hands-on experience with Terraform, Git, GitHub Actions, and CI/CD pipelines.
- Excellent knowledge of Data Modeling and ETL pipeline design.
- Strong knowledge of databases such as MySQL, Oracle, and writing complex SQL queries.
- Strong experience working in a Continuous Integration and Deployment process.
- Quick learner; organized and detail-oriented.
Nice to Have
- Experience implementing AI projects and applying AI methods.
Must have technical skills
- PySpark, AWS, SQL
Good to have technical skills
- Terraform, Git, GitHub Actions, CI/CD pipelines, AI
Company
BMW TechWorks India
BMW TechWorks India, a joint venture between the BMW Group and Tata Technologies, is at the forefront of transforming the automotive software industry. Specializing in Digital Car, Digital Company, an...