
Data Engineer, Azure Databricks
- Pune, Maharashtra
- Permanent
- Full-time
- design, develop, and improve the digital products and technology services we provide to our clients and employees.
- apply a broad range of software engineering practices, from analyzing user needs and developing new features to automated testing and deployment.
- ensure the quality, security, reliability, and compliance of our solutions by applying our digital principles and implementing both functional and non-functional requirements.
- build observability into our solutions to monitor production health, help to resolve incidents, and remediate the root cause of risks and issues.
- continuously up-skill, learn new technologies and practices, reuse strategic platforms and standards, evaluate options, and make decisions with long-term sustainability in mind.
- 10-14 years of hands-on experience in the data engineering space, with the ability to translate a working solution into an implementable package and/or build new solutions on the Azure platform
- high proficiency and experience in designing/developing data analytics and data warehouse solutions with Azure Data Factory (ADF) and Azure Databricks, Azure Data Lake Storage Gen2, Azure SQL/PostgreSQL, and related Azure analytics services, plus a good understanding of Azure identity (SPN, SAMI, UAMI, etc.)
- experience designing large-scale data distribution, integration with service-oriented architectures, data mesh, data lake, and data analytics solutions using Azure data services with large, multi-format data
- proficient coding experience using Spark (Python/Scala) and SQL (see the PySpark sketch after this list)
- hands-on experience with at least one scripting environment: Linux shell or PowerShell
- prior ETL development experience with an industry tool, e.g. Informatica, SSIS, or Talend; knowledge of Kafka streaming and Azure infrastructure is good to have
- a proven track record of cloud work experience (Microsoft Azure preferred) in an agile SDLC environment, leveraging modern programming languages, DevOps, and test-driven development
- exposure to Docker and Git
- hands-on experience with pipeline orchestration tools such as Apache Airflow, AutoSys, or Control-M, with Apache Airflow preferred (see the Airflow sketch after this list)
- proficient at working with large and complex codebases using GitLab/GitHub, the fork/pull model, and CI/CD
- knowledge of DevOps tools and technologies (such as GitLab CI/CD pipeline creation, YAML, Terraform, ARM templates, Puppet); experience with Azure Function Apps and Logic Apps is a plus
- working experience with Agile methodologies (Scrum, XP, Kanban), working within an agile SDLC environment to deliver high-quality data solutions
- you should hold a relevant bachelor's degree or equivalent, and have excellent communication skills with fluency in English for both presentations and technical writing
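
To give a flavor of the Spark (Python) and SQL proficiency described above, here is a minimal PySpark sketch of a typical curation step on Azure Data Lake Storage Gen2. The storage account, container paths, and column names are hypothetical placeholders, not part of any actual codebase for this role.

```python
# Minimal sketch of a Spark curation step on Azure Data Lake Storage Gen2.
# Paths, storage account, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

# Hypothetical ADLS Gen2 locations (abfss scheme).
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders/"

# Read raw JSON, deduplicate on the business key, derive a partition column.
orders = (
    spark.read.json(raw_path)
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
)

# SQL works against the same DataFrame via a temp view.
orders.createOrReplaceTempView("orders")
daily_counts = spark.sql(
    "SELECT order_date, COUNT(*) AS n_orders FROM orders GROUP BY order_date"
)
daily_counts.show()

# Write as Delta (the default table format on Databricks), partitioned by date.
(
    orders.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(curated_path)
)
```

Partitioning by the derived date column keeps downstream reads selective; the same step could equally be written in Scala or invoked from an ADF pipeline.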
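Similarly, a minimal Apache Airflow sketch of the orchestration experience listed above: a daily DAG that triggers a Databricks job via the Databricks provider. The connection ID and job ID are assumptions for illustration only.

```python
# Minimal sketch of a daily Airflow DAG that triggers a Databricks job
# via the Databricks provider. The connection ID and job ID are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    run_curation = DatabricksRunNowOperator(
        task_id="run_curation_job",
        databricks_conn_id="databricks_default",  # hypothetical connection
        job_id=12345,  # hypothetical Databricks job ID
    )
```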