Senior Backend Engineer
Experience Level: Senior Level
Level AI is seeking a Senior Backend Engineer specializing in Analytics to join our team in Noida. This role is crucial for designing and implementing analytical platforms that provide insightful dashboards to our customers. You will be responsible for developing and maintaining robust data warehouse schemas, including star schemas, fact tables, and dimensions, to ensure efficient querying and data access. A key part of your role will involve overseeing data propagation processes from source databases to warehouse-specific tools, guaranteeing data accuracy, reliability, and timeliness. We are looking for someone who can ensure our architectural designs are extensible and scalable to meet future demands.
Key Responsibilities:
- Design and implement analytical platforms delivering insightful customer dashboards.
- Develop and maintain data warehouse schemas (e.g., star schemas, fact tables, dimensions) for efficient data access.
- Manage data propagation from source databases to warehouse tools, ensuring accuracy and timeliness.
- Ensure architectural designs are scalable and extensible for future growth.
Required Skills and Experience:
- B.E/B.Tech/M.E/M.Tech/PhD from a tier 1 Engineering institute with relevant experience at a top technology company.
- A minimum of 3 years of Backend and Infrastructure experience, with a strong background in development, architecture, and design.
- Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
- Proven ability to navigate and understand large-scale systems, complex codebases, and architectural patterns.
- Demonstrated experience in building high-scale data platforms.
- Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).
- Experience with data movement, transformation, and integration tools for cross-system data propagation.
- Ability to evaluate and implement best practices in data architecture for scalable solutions.
Competencies:
- Data Modelling: Designing data warehouse schemas (star and snowflake), fact/dimension tables, normalization/denormalization.
- Data Warehousing & Storage: Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics.
- ETL/ELT Processes: Apache NiFi, Apache Airflow, Informatica, Talend, dbt.
- SQL Proficiency: Advanced SQL for complex queries and performance tuning.
- Programming Skills: Strong proficiency in Python or Java for building data pipelines and transformations.
- Data Integration: Apache Kafka, Apache Spark, AWS Glue, Fivetran, Stitch.
- Data Pipeline Management: Apache Airflow, Luigi for workflow automation.
- APIs and Data Feeds: API-based integrations for aggregating distributed data.
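To make the data-modelling competency concrete, here is a minimal sketch of the star schema pattern the posting refers to: a central fact table joined to dimension tables, queried the way a dashboard would aggregate it. All table and column names (`fact_calls`, `dim_agent`, `dim_date`, etc.) are illustrative assumptions, not taken from Level AI's actual platform, and SQLite stands in for a real warehouse such as Snowflake or BigQuery.

```python
import sqlite3

# Build an in-memory database with one fact table and two dimension tables.
# This is the classic star layout: measures live in the fact table; descriptive
# attributes live in (denormalized) dimension tables keyed by surrogate keys.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_agent (
    agent_key  INTEGER PRIMARY KEY,
    agent_name TEXT,
    team       TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240101
    full_date TEXT,
    month     TEXT
);
CREATE TABLE fact_calls (
    call_id            INTEGER PRIMARY KEY,
    agent_key          INTEGER REFERENCES dim_agent(agent_key),
    date_key           INTEGER REFERENCES dim_date(date_key),
    duration_sec       INTEGER,
    satisfaction_score REAL
);
""")

cur.executemany("INSERT INTO dim_agent VALUES (?, ?, ?)",
                [(1, "Asha", "Support"), (2, "Ravi", "Sales")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", "2024-01")])
cur.executemany("INSERT INTO fact_calls VALUES (?, ?, ?, ?, ?)",
                [(1, 1, 20240101, 300, 4.5),
                 (2, 1, 20240101, 200, 4.0),
                 (3, 2, 20240101, 400, 3.5)])

# A typical dashboard query: aggregate facts, sliced by a dimension attribute.
rows = cur.execute("""
    SELECT a.team, COUNT(*) AS calls, AVG(f.satisfaction_score) AS avg_csat
    FROM fact_calls f
    JOIN dim_agent a ON a.agent_key = f.agent_key
    GROUP BY a.team
    ORDER BY a.team
""").fetchall()
print(rows)  # [('Sales', 1, 3.5), ('Support', 2, 4.25)]
```

The key design choice the schema illustrates: analytical queries touch the narrow fact table for measures and join out to small dimensions for labels, which keeps scans cheap and makes the schema easy to extend with new dimensions.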
Nice to Have:
- Experience with Google Cloud, Django, Postgres, Celery, Redis.
- Familiarity with AI Infrastructure and Operations.
Learn more about Level AI: Website | Funding | LinkedIn | AI Platform Demo
Company
Level AI
Level AI, founded in 2019 and headquartered in Mountain View, California, is a Series C startup revolutionizing customer engagement. Our AI-native platform transforms contact centers into strategic assets.