Hiring for Big Data Hadoop Developer -- Mumbai

Job Summary:
We are seeking an experienced Big Data Hadoop Developer with strong expertise in the Hadoop ecosystem and proven experience in managing and developing on Hadoop clusters. You will be responsible for designing, developing, optimizing, and maintaining big data solutions, as well as ensuring cluster health and performance.

Key Responsibilities:
- Design and develop scalable big data solutions using Hadoop ecosystem tools such as HDFS, Hive, Pig, Sqoop, and MapReduce.
- Administer, configure, and optimize Hadoop clusters (Cloudera, Hortonworks, or Apache).
- Develop and maintain ETL pipelines to ingest, process, and analyze large datasets.
- Implement and monitor data security, backup, and recovery strategies on Hadoop clusters.
- Collaborate with data engineers, data scientists, and business analysts to deliver data solutions.
- Perform cluster performance tuning and troubleshoot issues across Hadoop services (YARN, HDFS, Hive, etc.).
- Write and optimize complex HiveQL queries and Spark jobs.
- Support production deployments and post-deployment monitoring.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience with the Hadoop ecosystem.
- Experience in Hadoop cluster setup, administration, and troubleshooting.
- Strong knowledge of Hive, HDFS, Pig, Sqoop, Oozie, and YARN.
- Experience with Spark, Kafka, and HBase is a plus.
- Strong programming skills in Java, Scala, or Python.
- Experience with Linux shell scripting and DevOps tools (e.g., Jenkins, Git).
- Familiarity with cloud platforms (AWS EMR, Azure HDInsight, or GCP Dataproc) is a plus.
- Excellent problem-solving and communication skills.