
Teradata Developer
Experience Level: Mid Level
Role Title: Data Analyst
Function: Data & Analytics
Role Type: Permanent
About this Role
Join the Data & Analytics (D&A) team as a Data Analyst and play a crucial role in developing and maintaining the Application Layer, the direct interface for D&A customers. This position requires a deep understanding of business context and the ability to translate evolving requirements into robust data solutions. Responsibilities include backend development: creating the data structures, data marts, and transformation pipelines essential to regular service operations. You will also develop ETL processes from source systems into the Data Warehouse, ensuring data consistency and quality within an agile framework.
Who you are
You are a detail-oriented data professional passionate about solving complex data challenges. You excel in collaborative environments, are comfortable working with large datasets, and possess a strong foundation in SQL and cloud-native technologies. Proactive, adaptable, and committed to delivering high-quality solutions, you aim to meet and exceed stakeholder needs.
What you will do
- Develop and maintain the Application Layer, focusing on backend data structures and transformation logic.
- Design and implement ETL processes from source systems into the Data Warehouse.
- Collaborate with cross-functional teams to clarify business requirements and ensure system stability.
- Support operations teams in resolving data quality and consistency issues.
- Drive initiatives for system optimization and simplification.
- Provide a seamless and efficient experience for business stakeholders.
Required Skills
- Primary Skills: Advanced SQL scripting (Teradata SQL preferred), ETL development and maintenance, Linux/Unix shell scripting.
- Preferred Skills: Version control tools (e.g., Git), scheduling tools (e.g., Control-M), Google Cloud Platform (BigQuery, Dataform, Dataproc).
- Core Competencies: Strong data interpretation and analytical skills; experience with large datasets (tens of millions of rows); excellent communication skills for explaining technical concepts to non-technical stakeholders; proven ability to manage multiple stakeholders and deliver on commitments; strong problem-solving and decision-making capabilities.
- Experience: 2–7 years in data engineering, data warehousing, or ETL development; designing and modifying data models; hands-on work with cloud-native platforms (GCP preferred). Exposure to the telecommunications industry is advantageous.
- Technical Qualifications: Proficiency in SQL and Python scripting. Familiarity with structured and unstructured data handling. Understanding of end-to-end system landscapes.
What you will learn
- Advanced cloud-native data engineering practices.
- Agile methodologies for data product development.
- Cross-functional collaboration in a global enterprise environment.
- Continuous improvement and automation in data operations.
- Exposure to cutting-edge tools and technologies in the data ecosystem.
Key Performance Indicators
- Technical expertise and delivery quality.
- Accountability and ownership of assigned tasks.
- Effective stakeholder management and communication.
- Responsiveness to risks and issues.
- Proactive identification of improvement opportunities.
- Collaborative team contribution and problem-solving mindset.
Company
VOIS
VOIS, a strategic arm of Vodafone Group Plc, is a leading global provider of intelligent solutions. As the largest shared services organization in the telecommunications industry with 30,000 FTE, VOIS...