Principal Engineer
Experience Level: Senior
Full Job Description
Level AI is seeking an experienced Principal Software Engineer for its Noida location. This role will lead the design and development of our data warehouse and analytics platform, while also elevating the engineering bar across our entire technology stack, including applications, platform, and infrastructure. You will collaborate with team members and the broader Level AI engineering community to build highly scalable and performant systems.
As a technical thought leader, you will drive solutions to complex current and future problems through the design and implementation of simple, elegant systems. You will coach and mentor junior engineers and champion engineering best practices. Close collaboration with product managers and other internal and external stakeholders is essential.
Key Competencies:
- Data Modelling: Expertise in designing data warehouse schemas (e.g., star and snowflake schemas), fact and dimension tables, and normalization/denormalization techniques.
- Data Warehousing & Storage Solutions: Proficiency with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.
- ETL/ELT Processes: Strong experience with ETL/ELT tools like Apache NiFi, Apache Airflow, Informatica, Talend, and dbt for data movement.
- SQL Proficiency: Advanced SQL skills for complex querying, indexing, and performance tuning.
- Programming Skills: Strong command of Python or Java for custom data pipeline development and advanced data transformations.
- Data Integration: Experience with real-time data integration tools including Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch.
- Data Pipeline Management: Familiarity with workflow automation tools like Apache Airflow and Luigi for orchestrating and monitoring data pipelines.
- APIs and Data Feeds: Knowledge of API-based integrations, particularly for aggregating data from distributed sources.
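The schema-design competency above can be illustrated with a minimal sketch: a star schema with one fact table and two dimension tables, queried with a dimensional join and aggregation. All table and column names are hypothetical, and SQLite is used here for brevity in place of a warehouse platform such as Snowflake or BigQuery.

```python
import sqlite3

# In-memory database standing in for a warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables: descriptive attributes, one row per entity.
cur.execute(
    "CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT)"
)
cur.execute(
    "CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT, month TEXT)"
)

# Fact table: numeric measures plus foreign keys to each dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id INTEGER REFERENCES dim_date(date_id),
        amount REAL
    )
""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'APAC')")
cur.execute("INSERT INTO dim_date VALUES (10, '2024-01-05', '2024-01')")
cur.execute("INSERT INTO fact_sales VALUES (100, 1, 10, 250.0)")

# Typical star-schema query: join facts to dimensions, then aggregate.
cur.execute("""
    SELECT c.region, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_id = c.customer_id
    JOIN dim_date d ON f.date_id = d.date_id
    GROUP BY c.region, d.month
""")
rows = cur.fetchall()
```

The fact table stays narrow (keys and measures) while descriptive attributes live in the dimensions, which is what keeps aggregation queries like the one above fast to index and tune.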
Responsibilities:
- Design and implement analytical platforms to deliver insightful customer dashboards.
- Develop and maintain data warehouse schemas (e.g., star schemas, fact tables, dimensions) for efficient querying and data access.
- Oversee data propagation from source databases to warehouse tools, ensuring data accuracy, reliability, and timeliness.
- Ensure architectural designs are extensible and scalable for future needs.
Requirements:
- Education: B.E./B.Tech/M.E./M.Tech/PhD from a Tier 1/2 Engineering institute with relevant experience at a top technology company.
- Experience: 9+ years of backend and infrastructure experience with a proven track record in development, architecture, and design.
- Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
- Ability to navigate and understand large-scale systems, complex codebases, and architectural patterns.
- Proven experience in building high-scale data platforms.
- Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).
- Experience with data movement, transformation, and integration tools for cross-system data propagation.
- Ability to evaluate and implement best practices in data architecture for scalable solutions.
- Experience mentoring and providing technical leadership to engineering teams.
Nice to Have:
- Experience with Google Cloud, Django, Postgres, Celery, Redis.
- Some experience with AI Infrastructure and Operations.
Learn more about us at https://thelevel.ai/.
Company
Level AI
Level AI, founded in 2019 and headquartered in Mountain View, California, is a Series C startup focused on transforming customer engagement. The company's AI-native platform leverages advanced technol...