
Specialist, Data Infrastructure Engineer
- Bangalore, Karnataka
- Permanent
- Full-time
- Architect and Lead: Spearhead the design, evolution, and implementation of our core data infrastructure, centered around Snowflake, Kafka, and Google Cloud Platform. You will be the go-to expert for scalable and resilient data solutions.
- Drive Strategic Initiatives: Own and deliver on critical platform initiatives that reduce friction in data access, improve data quality, and enable self-service analytics, directly impacting product development and business strategy.
- Mentor and Elevate: Act as a force multiplier for the team. Mentor senior and junior engineers, establish best practices through code reviews and design discussions, and foster a culture of engineering excellence.
- Automate at Scale: Champion an "automation-first" mindset. Lead the development of sophisticated automation and Infrastructure as Code (IaC) to manage our complex data ecosystem, enhancing reliability, security, and performance.
- Solve a New Level of Challenges: Troubleshoot and resolve our most complex and critical data systems issues, ensuring the integrity and availability of data that powers our business.
- Innovate and Influence: Stay at the forefront of data engineering technologies and trends. Evaluate and prototype new tools and frameworks to continuously innovate and steer the technical direction of Clover's data platform.
- A proven track record with 8+ years of experience in building and managing large-scale data infrastructure.
- Deep, hands-on expertise with cloud-native data warehouses, especially Snowflake, BigQuery, or Amazon Redshift.
- Extensive experience with at least one major cloud provider (GCP, AWS, or Azure) and its data services.
- Strong command of event-driven architecture and hands-on experience with data streaming technologies like Kafka.
- Demonstrable experience in designing and implementing robust, petabyte-scale ELT/ETL solutions.
- Proficiency with Infrastructure as Code (Terraform is highly preferred) and configuration management tools (e.g., Puppet, Ansible).
- A bachelor’s or master’s degree in Computer Science, a related technical field, or equivalent practical experience.
- You have experience with a broader set of data tools and technologies, such as Airflow, BigQuery, Cloud SQL, GKE, or other streaming solutions (e.g., Striim, Flink).
- You possess a strong understanding of cloud networking principles (VPC, firewalls, peering) within GCP.
- You are passionate about data governance and have experience implementing data quality frameworks and best practices in a production environment.
- You are a natural leader who enjoys mentoring others and can drive technical decisions across multiple teams.
- You have excellent communication skills and can articulate complex technical concepts to both technical and non-technical audiences.
- Impact: You'll build the data foundation for a platform that millions of merchants rely on to run their business.
- Challenge: You'll solve complex, large-scale data problems that will stretch your skills and expertise.
- Growth: We invest in our engineers with opportunities for continuous learning, professional development, and career advancement.
- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable; both are preferred).