Consulting - BO - Cloud Engineering - Consultant - Azure Data Engineer
Deloitte
- Bangalore, Karnataka
- Permanent
- Full-time
Deloitte works with global customers on cloud technologies, helping them unlock growth, stability, and sustainability by spotting unseen business trends through the curation, transformation, and blending of data. As we continue to expand, we're searching for like-minded individuals to help us 'take it to the next level'. In this exciting opportunity for an experienced developer, you will join a team delivering a transformative cloud-hosted data platform for some of the world's biggest organizations. The candidate we seek needs a proven track record of implementing data ingestion and transformation pipelines on Microsoft Azure, deep technical skills and hands-on experience with Azure Databricks, familiarity with data modelling concepts, and exposure to Synapse.
You will also be required to participate in stakeholder management, highlight risks, propose delivery plans, and estimate time and team size based on requirements. Strong communication skills and relevant experience in handling such situations are therefore desired.
Scope of work
Your main responsibilities will be:
- Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark and/or Azure Databricks
- Delivering and presenting proofs of concept of key technology components to project stakeholders.
- Developing scalable and re-usable frameworks for ingesting and enriching datasets
- Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring data quality and consistency are maintained at all times
- Working with event-based / streaming technologies to ingest and process data
- Working with other members of the project team to support delivery of additional project components (API interfaces, Search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
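As a rough illustration of the ingest-and-enrich work described above, here is a minimal sketch in plain Python. On the platform itself this logic would live in a Spark/Databricks job operating on DataFrames; the column names (`order_id`, `quantity`, `unit_price`) and the quality rule are made-up examples, not part of the role description.

```python
import csv
import io

def ingest_and_enrich(source_csv: str) -> list[dict]:
    """Read raw rows from a source extract, apply a simple quality gate,
    and add a derived column -- a miniature ingest-and-enrich step.
    (Illustrative only; a real pipeline would use Spark on Databricks.)"""
    rows = []
    for row in csv.DictReader(io.StringIO(source_csv)):
        # Quality gate: drop rows missing the business key.
        if not row.get("order_id"):
            continue
        # Enrichment: derive a total from quantity * unit_price.
        row["total"] = float(row["quantity"]) * float(row["unit_price"])
        rows.append(row)
    return rows

raw = """order_id,quantity,unit_price
1001,2,9.50
,1,4.00
1002,3,2.00
"""
print(ingest_and_enrich(raw))  # the row with no order_id is dropped
```

The same pattern (filter for quality, derive enriched columns, land in a target store) scales up directly to Spark transformations on Azure Databricks.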
Required skills and experience:
- Strong knowledge of Data Management principles
- 2.5 to 4.5 years of total experience
- Experience in building ETL / data warehouse transformation processes
- Direct experience in building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks).
- Experience with MS SQL (RDBMS) and the SQL query language
- Experience using Apache Spark and associated design and development patterns
- Microsoft Azure Big Data Architecture certification is an advantage.
- Hands-on experience designing and delivering solutions using Azure Storage, Azure SQL Data Warehouse, Azure Data Lake
- Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Terraform, etc.
In addition, you will be expected to:
- Develop yourself by actively seeking opportunities for growth, share knowledge and experience with others, and act as a strong brand ambassador
- Understand objectives for clients and Deloitte, align your own work to those objectives, and set personal priorities
- Seek opportunities to challenge self
- Collaborate with others across businesses and borders to deliver and take accountability for own and team results
- Identify and embrace our purpose and values, and put these into practice in your professional life
- Build relationships and communicate effectively in order to positively influence peers and other stakeholders
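The ETL / data-warehouse transformation experience asked for above can be pictured with a minimal staging-to-warehouse sketch. SQLite stands in here for an MS SQL database purely so the example is self-contained; the table and column names (`stg_sales`, `fact_sales_by_region`) are invented for illustration.

```python
import sqlite3

# Miniature ETL transform: aggregate a staging table into a fact table.
# SQLite is used as a stand-in for MS SQL; schema names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (region TEXT, amount REAL);
    INSERT INTO stg_sales VALUES ('EMEA', 100.0), ('EMEA', 50.0), ('APAC', 75.0);
    CREATE TABLE fact_sales_by_region (region TEXT PRIMARY KEY, total REAL);
""")

# The "T" of ETL: summarise staging rows into the warehouse fact table.
conn.execute("""
    INSERT INTO fact_sales_by_region (region, total)
    SELECT region, SUM(amount) FROM stg_sales GROUP BY region
""")

for region, total in conn.execute(
        "SELECT region, total FROM fact_sales_by_region ORDER BY region"):
    print(region, total)
```

In a production Azure pipeline, an equivalent transform would typically be orchestrated by Azure Data Factory and executed in Spark or Azure SQL Data Warehouse rather than run in-process like this.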