Consulting-BO-Cloud Engineering-Senior Consultant-GCP Big data Data Engineer-Delhi
Deloitte
- Delhi
- Permanent
- Full-time
- Work with cloud engineers and customers to solve big data problems by developing utilities for migration, storage, and processing on Google Cloud.
- Work with various NoSQL databases such as MongoDB, Cassandra, or HBase to store and retrieve data as per application requirements.
- Design and build a cloud migration strategy for cloud and on-premises applications.
- Diagnose and troubleshoot complex distributed systems problems and develop solutions with a significant impact at massive scale.
- Build ingestion tools and processing jobs that handle several terabytes to petabytes of data per day.
- Design and develop high-performance, scalable data processing applications using Java and Apache Spark.
- Design and develop next-gen storage and compute solutions for several large customers.
- Communicate with a wide set of teams, including Infrastructure, Network, Engineering, DevOps, SiteOps teams, and cloud customers.
- Utilize Google Cloud Platform services such as BigQuery, Dataflow, and Pub/Sub for building data pipelines and analytics solutions.
- Stay updated with the latest technologies and best practices in Big Data processing and cloud computing.
- Build advanced tooling for automation, testing, monitoring, administration, and data operations across multiple cloud clusters.
- 4+ years of hands-on experience in data structures, distributed systems, Apache Spark, and SQL and NoSQL databases
- Experience writing ETL jobs in Java or Scala
- Experience building and deploying cloud-based solutions at scale.
- Experience in developing Big Data solutions (migration, storage, processing)
- BS, MS or PhD degree in Computer Science or Engineering, and 3+ years of relevant work experience in Big Data and cloud systems.
- Experience building and supporting large-scale systems in a production environment
- Cloud platforms: GCP
- Distributed processing frameworks: any of Apache Spark, CDH, HDP, EMR, Google Dataproc, or HDInsight
- One or more of MapReduce, Apache Spark, Apache Storm, Apache Flink
- Database/warehouse: Hive, HBase, Bigtable, and at least one cloud-native service
- Orchestration frameworks: any of Airflow, Oozie, Apache NiFi, Google Dataflow
- Message/event solutions: any of Kafka, Kinesis, Cloud Pub/Sub
- Container orchestration (good to have)
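
To illustrate the kind of map-reduce style processing named in the requirements above, here is a minimal sketch in plain Java streams (not an actual Spark job — a real ETL job for this role would use Spark's `Dataset`/`JavaRDD` APIs, which need the Spark dependency; the class and method names here are purely illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative sketch: a map-reduce style word-count aggregation using
// plain Java streams, mirroring the shape of the Spark/MapReduce ETL
// work described in the requirements. Not production Spark code.
public class WordCountSketch {
    static Map<String, Long> wordCounts(List<String> lines) {
        return lines.stream()
                // "map" phase: tokenize each line into lowercase words
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(word -> !word.isEmpty())
                // "reduce" phase: group by word and count occurrences per key
                .collect(Collectors.groupingBy(word -> word, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = wordCounts(List.of("big data", "data pipelines"));
        System.out.println(counts.get("data")); // prints 2
    }
}
```

The same tokenize-group-count shape carries over almost line for line to Spark's `flatMap`/`groupBy`/`count` operators, which is why it is a common screening exercise for roles like this one.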
- Develop self by actively seeking opportunities for growth, share knowledge and experiences with others, and act as a strong brand ambassador
- Understand objectives for clients and Deloitte, align own work to objectives and set personal priorities
- Seek opportunities to challenge self
- Collaborate with others across businesses and borders to deliver and take accountability for own and team results
- Identify and embrace our purpose and values and put these into practice in your professional life
- Build relationships and communicate effectively in order to positively influence peers and other stakeholders