
EY - GDS Consulting - AI and Data - Azure DBX - Senior
- Kolkata, West Bengal
- Permanent
- Full-time

Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines using Databricks and Python.
- Develop ETL processes that ingest and transform data from various sources into a structured and usable format.
- Collaborate with cross-functional teams to gather requirements and deliver data engineering solutions that support business objectives.
- Write and optimize Python scripts for data extraction, transformation, and loading tasks.
- Ensure data quality and integrity by implementing best practices and standards for data engineering.
- Monitor and troubleshoot ETL processes, performing root cause analysis and implementing fixes to improve performance and reliability.
- Document data engineering processes, creating clear and concise technical documentation for data pipelines and architectures.
- Stay current with industry trends and advancements in data engineering technologies and methodologies.
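The responsibilities above center on ETL work in Python. As a minimal, illustrative sketch of the extract/transform/load pattern they describe (all data, column names, and the target table here are hypothetical, and the standard library stands in for what would typically be PySpark on Databricks):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would come from a source system.
RAW_CSV = """order_id,amount,currency
1,100.50,USD
2,80.00,EUR
3,abc,USD
"""

def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and drop malformed rows (basic data-quality check)."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append((int(row["order_id"]), float(row["amount"]), row["currency"]))
        except ValueError:
            continue  # skip rows that fail type validation
    return cleaned

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write structured rows into a target table and report the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2 — the malformed third row is rejected during transform
```

On Databricks, the same three stages would usually be expressed as PySpark DataFrame reads, transformations, and writes to Delta tables, but the extract/transform/load separation shown here carries over directly.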

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience in data engineering, with a focus on Databricks and Python scripting for ETL implementation.
- Strong understanding of data warehousing concepts and experience with SQL and NoSQL databases.
- Proficiency in Python and familiarity with data engineering libraries and frameworks.
- Experience with cloud platforms (e.g., AWS, Azure) and big data technologies is a plus.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a collaborative team.

What we offer:
- Innovative and dynamic work environment with a strong emphasis on delivering high-quality data solutions.
- Opportunity to work with a diverse team of data professionals and contribute to impactful projects.