Hiring for Kafka Engineer - Mumbai

Job Summary:
We are seeking a skilled and motivated Kafka Engineer to design, implement, and maintain robust event-driven data pipelines and real-time streaming applications. The ideal candidate will have a deep understanding of Apache Kafka, data streaming architectures, and distributed systems, and will play a critical role in ensuring the high availability, performance, and scalability of our data infrastructure.

Key Responsibilities:
- Design, develop, and manage Kafka-based data streaming solutions.
- Set up, configure, and maintain Kafka clusters, brokers, topics, and schemas.
- Monitor, optimize, and ensure the availability and performance of Kafka services.
- Implement Kafka Connect, Kafka Streams, and Schema Registry as needed.
- Develop producers and consumers for real-time data ingestion and processing.
- Collaborate with DevOps, Data Engineering, and Backend teams to integrate Kafka with other systems (e.g., Spark, Flink, MongoDB, PostgreSQL).
- Automate deployment, scaling, and recovery of Kafka components.
- Implement monitoring and alerting tools (e.g., Prometheus, Grafana, Splunk).
- Ensure data security, compliance, and governance across Kafka pipelines.
- Participate in on-call rotations and provide support for production systems.

Required Skills & Qualifications:
- 2+ years of hands-on experience with Apache Kafka in production environments.
- Strong programming skills in Java, Scala, or Python.
- Experience with Kafka Connect, Kafka Streams, and Kafka REST Proxy.
- Solid understanding of distributed systems, messaging systems, and real-time data processing.
- Familiarity with Docker, Kubernetes, and CI/CD pipelines.
- Experience with monitoring tools such as Confluent Control Center, Prometheus, Grafana, or Datadog.
- Proficiency with Linux/Unix systems and scripting.