
Senior Data Engineer
Experience Level: Senior Level
Sumo Logic is seeking a Senior Data Engineer to join our team in Bengaluru, Karnataka, India. In this role, you will contribute to, execute, and lead the design and development of data engineering solutions within IT. You will play a crucial role in solving complex data integrity, security, process, and sanitization challenges across Sumo Logic's extensive cloud services, data tooling, and business applications. This position reports to the Senior Manager of Data Engineering and is part of the global IT organization.
We are looking for a passionate data engineer skilled in creating scalable, resilient, secure, and cost-efficient data solutions. You should be adept at writing clean, maintainable, robust, and well-tested code, and should thrive in a collaborative team environment, contributing experienced guidance and insights. Experience with the performance, scalability, and reliability demands of 24x7 uptime systems is highly desirable.
Key Responsibilities:
- Design and implement complex, well-architected enterprise data solutions for mid-to-large organizations.
- Enhance existing data warehouse systems, solutions, and processes through architectural reviews and design improvements.
- Apply DevOps methodologies to infrastructure, cloud, and business application data.
- Implement systems for strict data compliance, robust security guardrails, and cost optimization.
- Leverage strong understanding and hands-on experience with AWS data infrastructure and compute services supporting our data platform.
- Design and manage data schemas and data flow across corporate systems and applications to ensure compliance, integrity, and security.
- Deliver well-architected, end-to-end data solutions for a growing enterprise across multiple infrastructure environments, data sources, and business applications.
- Build strong partnerships with other Sumo Logic teams to create mutually beneficial solutions and drive increased value.
- Ensure global delivery and alignment of data initiatives, and maintain 24/7 system uptime.
Required Qualifications:
- Extensive experience with Databricks, Spark, and core AWS services (EC2, RDS, Aurora, DynamoDB, S3, Kinesis).
- Proven experience developing scalable, secure, and resilient data architectures and implementations.
- A minimum of 5 years of industry experience with a demonstrated track record of ownership and delivery.
- Proficiency in Python scripting, PySpark, and other data frameworks/tools.
- Strong SQL and data schema experience.
- Experience with API calls and API-based data ingestion.
- Deep, hands-on experience with Git and GitHub.
- Proven experience in AWS infrastructure management and deployment for data platforms, including EC2, S3, RDS, VPC, network adjustments, KMS, and PrivateLink.
- Experience across data ingestion, data storage, and data consumption layers.
- Familiarity with both structured and unstructured data.
- Experience with Agile development methodologies, including Jira, sprints, and story pointing.
- Experience building robust, well-architected designs for enterprise-scale data architectures and workflows.
- A passion for continuous learning and deep technological curiosity.
- Excellent verbal and written communication skills.
- Experience and comfort with an on-call schedule for enterprise systems.
Desirable Skills:
- Experience in AI/ML, Data Science, LLMs, Contextualization, Amazon Bedrock, Amazon SageMaker, and Iceberg.
- Terraform experience is highly valued.
- Experience with the following technologies is a plus: Tableau, Looker, AWS Quicksight, HDFS, Hive, HBase, Yarn, and Oozie.
Company
Sumo Logic
Sumo Logic, Inc. empowers organizations to secure, accelerate, and enhance the reliability of their digital operations through its Intelligent Operations Platform.