
Technical Architect
Experience Level: Senior
Full Job Description
DataWeave is seeking a highly skilled Technical Architect to lead the design and ongoing development of our large-scale analytics and data-driven SaaS platforms. This pivotal role demands deep expertise in distributed systems, data engineering, and scalable architectures. The ideal candidate will focus on building systems capable of high-volume data processing, complex computation, and real-time analytics. You will collaborate closely with our engineering, data science, product, and platform teams to translate complex business requirements into robust, scalable, and cost-effective technical solutions.
Key Responsibilities:
- Oversee the complete technical architecture for our analytics and data products.
- Architect and implement scalable, fault-tolerant, and high-performance systems for extensive computation and data processing.
- Establish and uphold reference architectures, design standards, and best practices for the engineering teams.
- Evaluate and select optimal technologies for data storage, processing, analytics, and machine learning workloads.
- Champion the adoption of cloud-native and distributed architectural patterns.
- Ensure all systems rigorously adhere to non-functional requirements, including scalability, reliability, performance, security, and cost efficiency.
- Provide strong technical leadership and mentorship to engineering teams.
- Serve as a primary technical escalation point for critical and complex system issues.
- Lead performance benchmarking initiatives and capacity planning efforts.
- Design and deliver solutions for: large-scale batch and streaming data pipelines, high-throughput APIs, and real-time/near-real-time analytics systems.
Required Qualifications:
- Substantial experience in designing distributed systems and SaaS platforms.
- In-depth knowledge of modern data platforms and analytics architectures.
- Proven expertise in at least one of the following technology areas:
  - Streaming/messaging: Kafka, Google Cloud Pub/Sub, or Kinesis
  - Data lakes: S3, ADLS, or GCS with Delta Lake, Iceberg, or Hudi
  - Columnar stores: BigQuery, Redshift, Snowflake, or ClickHouse
  - Relational databases: MySQL or Postgres
  - NoSQL databases: DynamoDB or HBase
  - Distributed processing: Spark
- Extensive hands-on experience with Amazon Web Services (AWS).
- Proficiency in creating comprehensive technical documentation, including architecture views, technology architecture blueprints, and detailed design specifications.
- Expertise in containerization technologies such as Docker and Kubernetes, coupled with a strong understanding of CI/CD, DevOps practices, and monitoring/alerting tools like CloudWatch, DataDog, or Sentry.
- Strong coding proficiency in Python and/or Go.
- Solid grasp of API design principles, microservices architecture, and event-driven systems.
- Deep understanding of data modeling techniques (OLAP, dimensional, event-based), feature engineering, and ML/LLM pipelines.
- Previous experience building or supporting AI-powered analytics systems is a significant advantage.
Company
DataWeave
DataWeave specializes in building sophisticated data products derived from publicly available web data. We empower businesses with actionable intelligence by aggregating, curating, and analyzing data ...