
Azure Databricks
- Pune, Maharashtra
- Permanent
- Full-time
Role Description: A minimum of 5 years of experience with large SQL data marts and expert-level relational database experience. The candidate should demonstrate the ability to navigate massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners. Experience in troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues and user access/management; designing database backup, archiving, and storage; performance tuning; ETL import of large data volumes extracted from multiple systems; and capacity planning.
Competencies: Python, Databricks, PySpark, Azure Data Factory, MySQL
Experience (Years): 6-8
Essential Skills: Strong knowledge of Extraction, Transformation, and Loading (ETL) processes using frameworks such as Azure Data Factory, Synapse, or Databricks; establishing cloud connectivity between different systems such as ADLS, ADF, Synapse, and Databricks. Databricks Architect, Cloud Architect, Python, SQL.
Desirable Skills: Design and develop ETL processes based on functional and non-functional requirements in Python/PySpark within the Azure platform. Databricks Architect, Cloud Architect, Python, SQL.
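To illustrate the kind of ETL work this role involves, below is a minimal sketch of an extract-transform-load pipeline in plain Python, with an in-memory SQLite database standing in for the data mart. All table and column names (`sales_mart`, `region`, `amount`) are illustrative assumptions, not part of the role description; a production pipeline at this employer would instead use PySpark, Azure Data Factory, or Databricks as listed in the competencies.

```python
import sqlite3


def run_etl(source_rows, conn):
    """Tiny ETL sketch: extract raw rows, transform them, load into a mart table."""
    # Extract: in a real Azure pipeline this would read from ADLS or another system.
    raw = list(source_rows)
    # Transform: drop rows with missing amounts and normalize region names.
    cleaned = [
        (r["id"], r["region"].strip().upper(), float(r["amount"]))
        for r in raw
        if r.get("amount") is not None
    ]
    # Load: idempotent create plus bulk insert into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales_mart "
        "(id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales_mart VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    rows = [
        {"id": 1, "region": " west ", "amount": "10.5"},
        {"id": 2, "region": "east", "amount": None},  # dropped by the transform step
        {"id": 3, "region": "East", "amount": 7},
    ]
    print(run_etl(rows, conn))  # number of rows loaded
```

The same extract/transform/load split maps directly onto a PySpark job on Databricks: read from ADLS, apply DataFrame transformations, and write to the mart.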