
Senior Data Engineering Lead - Python, Spark, Azure
- Gurgaon, Haryana
- Permanent
- Full-time
- Manage and mentor a team of data engineers, fostering a culture of innovation and continuous improvement
- Design and maintain robust data architectures, including databases and data warehouses
- Oversee the development and optimization of data pipelines for efficient data processing
- Implement measures to ensure data integrity, including validation, cleansing, and governance practices
- Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions
- Analyze, synthesize, and interpret data from a variety of data sources, investigating, reconciling and explaining data differences to understand the complete data lifecycle
- Architect solutions on a modern technology stack and design public cloud applications on Azure
- Apply a structured, standards-based approach to work
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
- Undergraduate degree or equivalent experience.
- 12+ years of implementation experience on time-critical production projects following key software development practices
- 8+ years of programming experience in Python or another programming language
- 6+ years of hands-on programming experience in Spark using Scala or Python
- 4+ years of hands-on experience with Azure services such as:
  - Azure Databricks
  - Azure Data Factory
  - Azure Functions
  - Azure App Service
- Good knowledge of writing SQL queries
- Good knowledge of building REST APIs
- Good knowledge of tools such as Azure DevOps and GitHub
- Ability to understand an existing application codebase, perform impact analysis, and update the code as required by business logic changes or for optimization
- Ability to learn modern technologies and work in fast-paced teams
- Excellent analytical and communication skills (both verbal and written)
- Proficiency with AI-powered development tools such as GitHub Copilot, Amazon CodeWhisperer, Google’s Codey (Duet AI), or other relevant tools is expected. Candidates should be adept at integrating these tools into their workflows to accelerate development, improve code quality, and enhance delivery velocity.
- Good knowledge of Docker and Kubernetes