
Tech Lead - Datalab
- Bangalore, Karnataka
- Permanent
- Full-time
- Supply chain optimization
- Pricing strategy
- Service fulfillment (e.g., warranty management, field service management, service parts management, knowledge management)
- Architect, design, and build microservices with a focus on scalability, maintainability, and high performance.
- Lead a team of software developers, conduct design reviews, code reviews, and mentor team members.
- Translate complex business requirements into technical specifications and deliverables.
- Collaborate with product managers, QA engineers, architects, and other stakeholders to ensure smooth execution of product roadmap items.
- Set coding standards and lead by example in writing clean, testable, and efficient code.
- Implement robust observability, logging, and monitoring practices.
- Contribute to platform-wide technical improvements, innovation initiatives, and architectural decisions.
- Promote DevOps practices and CI/CD pipelines to ensure quick and safe software delivery.
- Guide resolution of production issues with urgency and root-cause analysis.
- 9+ years of hands-on software development experience, including 3+ years in technical leadership roles.
- Strong programming expertise in Python with sound knowledge of object-oriented programming, design patterns, and system design.
- Proficient in designing and implementing microservices architectures and RESTful APIs with a DDD approach, from bounded contexts to deployment units and communication.
- Deep understanding of Docker, Kubernetes, Istio service mesh, and Calico for enhanced cluster functionality.
- Hands-on experience with Kubernetes serverless orchestration (Knative) to build cost-effective, scalable solutions.
- Solid platform engineering knowledge with a passion for evangelizing adoption and best practices.
- Experienced in CI/CD pipelines, focusing on code quality, security posture, and test coverage.
- Skilled with tools like GitHub Enterprise, Sonar, and Snyk for secure, high-quality code delivery.
- Experience creating Python libraries using Poetry, with packaging compatible with tools such as Artifactory.
- Proficient in AWS, with infrastructure automation using AWS CDK or similar IaC tools.
- Experienced in Aurora Postgres, DynamoDB, and effective data modeling for single-responsibility microservices.
- Familiar with both relational and non-relational databases and their use in scalable systems.
- Exposure to ML model management tools such as MLflow and Evidently.
- Comfortable working in Agile/Scrum environments and collaborating across globally distributed teams.
- Experience with multi-tenant system design, and running Jupyter notebooks in containerized environments.
- Understands how distributed systems function end-to-end and delivers secure, scalable solutions.
- Experience with observability and monitoring tools such as Loki, Tempo, Prometheus, and Grafana.
- Keeps knowledge up to date with new technologies in AWS and open source.
- Good knowledge of the machine learning lifecycle, including how an ML model is taken from development to production.
- Familiarity with event-driven architectures and message brokers.
- Knowledge of end-to-end ML platforms such as Kubeflow, SageMaker, H2O, and Metaflow.
- Ability to write simple ML experiments in Jupyter notebooks to demonstrate data science workflows.
- Knowledge of scientific packages such as scikit-learn and PyTorch.
- Work with the chief architects to understand the technical architecture, review designs, create decision logs, and drive the implementation forward.
- Ability to create custom solutions on Kubernetes and AWS by modifying open-source components, and to guide such implementations.
- Passion for clean architecture, code quality, and mentoring junior developers.
- Strong experience with automated testing strategies and frameworks.
#LI-Remote
#LI-Hybrid