Experience Level: Senior Level
Full Job Description
AIA-Pune is seeking a highly skilled Senior Developer with 7 to 10 years of experience to join its team. The ideal candidate will have expertise in Databricks Unity Catalog, Azure Data Lake Store, Azure DevOps, Python, Databricks SQL, Databricks Workflows, and PySpark. Experience in Data Management, Hedge Fund Accounting, or Account Management is a plus. This is a hybrid, day-shift role with no travel requirements.
Responsibilities:
- Develop and maintain scalable data solutions using Databricks Unity Catalog and Azure Data Lake Store to enhance data accessibility and security.
- Collaborate with cross-functional teams to integrate Azure DevOps into the development lifecycle, ensuring seamless deployment and continuous integration.
- Use Python and PySpark to design and implement efficient data processing pipelines, optimizing performance and reliability.
- Create and manage Databricks SQL queries for data extraction, transformation, and loading to support business intelligence and analytics initiatives.
- Oversee the execution of Databricks Workflows to ensure timely and accurate data processing.
- Provide technical expertise and support to team members to foster a collaborative environment.
- Analyze complex data sets to identify trends and insights that contribute to data-driven decision-making.
- Ensure data quality and integrity through robust validation and error-handling mechanisms.
- Stay current with industry trends and technologies, applying new knowledge to improve systems.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
- Document technical designs, processes, and procedures to facilitate knowledge sharing.
- Support the development of data management strategies and contribute to the continuous improvement of development practices.
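The data quality and error-handling responsibility can be sketched as a minimal validation routine. This is an illustrative sketch only, not part of the posting: all function names, field names, and rules here are assumptions, and in practice such checks would typically run inside a PySpark pipeline before writing to Unity Catalog tables.

```python
# Illustrative sketch of row-level validation for an ETL step.
# All names and rules are hypothetical examples, not from the posting.

def validate_row(row, required_fields, numeric_fields=()):
    """Return a list of error strings for one record; empty list means valid."""
    errors = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            errors.append(f"missing required field: {field}")
    for field in numeric_fields:
        value = row.get(field)
        if value is not None:
            try:
                float(value)
            except (TypeError, ValueError):
                errors.append(f"non-numeric value in {field}: {value!r}")
    return errors


def partition_rows(rows, required_fields, numeric_fields=()):
    """Split records into (valid_rows, rejected) so bad rows can be quarantined.

    rejected is a list of (row, errors) pairs, preserving the reasons
    each record failed so they can be logged or written to an error table.
    """
    valid, rejected = [], []
    for row in rows:
        errors = validate_row(row, required_fields, numeric_fields)
        if errors:
            rejected.append((row, errors))
        else:
            valid.append(row)
    return valid, rejected
```

Quarantining rejected rows with their error reasons, rather than failing the whole batch, is one common way to keep pipelines timely while preserving an audit trail of data-quality issues.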
Qualifications:
- Demonstrated proficiency in Databricks Unity Catalog, Azure Data Lake Store, and Azure DevOps.
- Strong programming skills in Python and PySpark for data processing and analysis.
- Experience with Databricks SQL and Databricks Workflows for data management and analytics.
- Background in Data Management, Hedge Fund Accounting, or Account Management is a plus.
- Ability to work in a hybrid model, with excellent communication and collaboration skills.
Required certifications: Databricks Certified Data Engineer Associate and Microsoft Certified: Azure Data Engineer Associate.