
Confluent Kafka Specialist
- Pune, Maharashtra
- Permanent
- Full-time
- Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems.
- Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance.
- Integrate CDC pipelines with core banking applications, databases, and enterprise systems.
- Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS).
- Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing.
- Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate.
- Collaborate with infrastructure teams, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives.
- Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures.
- Stay current with emerging technologies and drive innovation in real-time banking data solutions.
- Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions.
- Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry.
- Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate.
- Hands-on experience with IBM Analytics.
- Solid understanding of core banking systems, transactional databases, and financial data flows.
- Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud).
- Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations.
- Strong experience in event-driven architectures, microservices, and API integrations.
- Familiarity with security protocols, compliance, and data governance in banking environments.
- Excellent problem-solving, leadership, and stakeholder communication skills.