
Lead Data Engineer / Architect (Snowflake, DBT, Airflow, Terraform)
- Bangalore, Karnataka
- Permanent
- Full-time
We are seeking a highly skilled Lead Data Engineer / Architect with 10+ years of industry experience, including 5+ years working extensively on Snowflake and modern data engineering platforms. The ideal candidate will design, optimize, and lead large-scale data engineering solutions using Snowflake, dbt, Airflow, Snowpark, and Terraform. The role requires strong expertise in advanced SQL, data pipeline orchestration, and CI/CD automation, along with proven leadership in driving architecture decisions and mentoring teams.
Key Responsibilities
- Build and optimize ELT/ETL pipelines leveraging Snowpark, dbt, and Python.
- Write complex SQL queries including window functions, CTEs, and advanced ANSI SQL for business-critical transformations.
- Develop reusable frameworks for data quality, lineage, and monitoring.
- Implement CI/CD pipelines for Snowflake, dbt, and Airflow using Git and Terraform.
Optimization & Performance Tuning
- Monitor and optimize query performance, warehouse costs, and storage usage.
- Implement query tuning strategies using clustering keys, micro-partition pruning, caching, and search optimization.
- Drive efficiency in batch and streaming ingestion using COPY INTO, Snowpipe, and orchestration frameworks.
Required Skills & Experience
- 10+ years of total IT experience, with 5+ years of relevant Snowflake data engineering experience.
Expertise in:
- Snowflake (Snowpipe, COPY INTO, Snowpark, SnowSQL)
- dbt for modular transformations and version control
- Airflow for orchestration and scheduling
- Terraform for infrastructure automation
- Advanced ANSI SQL including window functions and performance tuning
- Strong knowledge of CI/CD pipelines and DevOps practices for data engineering.
- Hands-on experience with data architecture, governance, and cost optimization strategies.
- SnowPro Advanced: Data Engineer Certification required.
- Strong communication, stakeholder management, and leadership skills.