
Snowflake Developer
- Chennai, Tamil Nadu
- Permanent
- Full-time
Min Experience: 6 years
Location: Chennai, Hyderabad, Bengaluru
JobType: full-time

Requirements:
We are seeking an experienced and detail-oriented Snowflake Developer with over 6 years of expertise in data engineering and data warehouse development. The ideal candidate will have strong proficiency in the Snowflake Data Cloud and a proven track record of designing, implementing, and optimizing large-scale data solutions. This role requires deep technical skills in Snowflake along with a solid understanding of ETL/ELT pipelines, SQL performance tuning, and cloud-based data ecosystems. You will be responsible for enabling efficient data flow, supporting analytics, and ensuring the business has access to clean, reliable, and timely data for decision-making.

Key Responsibilities
- Snowflake Development:
- Design and develop high-performance data models in Snowflake.
- Create and manage databases, schemas, tables, views, streams, and tasks in Snowflake.
- Build and maintain ETL/ELT workflows using Snowflake and integration tools.
- Implement Snowpipe for continuous (near-real-time) and batch data ingestion (a sketch of this appears after this list).
- Performance Optimization:
- Write and optimize complex SQL queries, stored procedures, and UDFs.
- Monitor query performance and tune workloads to make efficient use of Snowflake compute and storage (see the query-monitoring sketch after this list).
- Tune virtual warehouses and optimize storage and compute costs.
- Data Quality, Security & Governance:
- Ensure data integrity, accuracy, and consistency across data pipelines.
- Implement role-based access control (RBAC), data masking, and security policies in Snowflake (see the governance sketch after this list).
- Maintain compliance with audit, regulatory, and governance standards.
- Collaboration & Support:
- Work closely with data architects, BI teams, and business stakeholders to gather requirements and deliver solutions.
- Support integration of Snowflake with BI tools (Tableau, Power BI, Looker, etc.).
- Provide technical expertise to cross-functional teams and mentor junior developers.
- Continuous Improvement:
- Stay updated with new Snowflake features and industry best practices.
- Drive automation and process improvements in data pipeline management.
- Contribute to overall data architecture strategies and modernization initiatives.
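The sketch below illustrates the kind of Snowflake development work described above: creating a stream, a scheduled task, and a Snowpipe pipe through snowflake-connector-python. It is a minimal, illustrative example only; every object name (RAW_ORDERS, CURATED_ORDERS, ORDERS_STREAM, ORDERS_MERGE_TASK, ORDERS_PIPE, @RAW_STAGE) and credential placeholder is hypothetical and not taken from this posting.

```python
# Illustrative sketch only; all object names are hypothetical placeholders,
# and the RAW_ORDERS / CURATED_ORDERS tables are assumed to already exist.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="DEV_WH",
    database="ANALYTICS",
    schema="STAGING",
)

ddl_statements = [
    # Change-data-capture stream over the raw landing table.
    "CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS",
    # Scheduled task that moves newly inserted rows into the curated table.
    """
    CREATE OR REPLACE TASK ORDERS_MERGE_TASK
      WAREHOUSE = DEV_WH
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO CURATED_ORDERS (ORDER_ID, ORDER_TS, AMOUNT)
      SELECT ORDER_ID, ORDER_TS, AMOUNT
      FROM ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Snowpipe definition for continuous ingestion from an external stage.
    """
    CREATE OR REPLACE PIPE ORDERS_PIPE AUTO_INGEST = TRUE AS
      COPY INTO RAW_ORDERS
      FROM @RAW_STAGE/orders/
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """,
    # Tasks are created suspended; resume to activate the schedule.
    "ALTER TASK ORDERS_MERGE_TASK RESUME",
]

cur = conn.cursor()
for stmt in ddl_statements:
    cur.execute(stmt)
cur.close()
conn.close()
```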
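For the performance-optimization duties, a common starting point is the ACCOUNT_USAGE.QUERY_HISTORY view. The fragment below is a minimal sketch of pulling the slowest queries of the last 24 hours, assuming the connecting role has access to the SNOWFLAKE shared database.

```python
# Illustrative sketch only; credentials and warehouse name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="DEV_WH",
)

cur = conn.cursor()
cur.execute(
    """
    SELECT query_id,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_seconds,
           query_text
    FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
    WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20
    """
)
# Print a short summary line per slow query for triage.
for query_id, warehouse, seconds, text in cur.fetchall():
    print(f"{query_id} ({warehouse}): {seconds:.1f}s  {text[:80]}")
cur.close()
conn.close()
```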
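For the governance responsibilities, the sketch below shows a column masking policy plus a handful of role-based grants. The role and object names (PII_ANALYST, REPORTING_RO, CURATED_ORDERS.CUSTOMER_EMAIL) are hypothetical, and dynamic data masking assumes a Snowflake edition that supports it (Enterprise or higher).

```python
# Illustrative sketch only; role, table, and column names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="DEV_WH",
    database="ANALYTICS",
    schema="CURATED",
)

governance_statements = [
    # Mask email addresses for every role except a privileged analyst role.
    """
    CREATE OR REPLACE MASKING POLICY EMAIL_MASK AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
        ELSE '***MASKED***'
      END
    """,
    # Attach the policy to the sensitive column.
    "ALTER TABLE CURATED_ORDERS MODIFY COLUMN CUSTOMER_EMAIL SET MASKING POLICY EMAIL_MASK",
    # Role-based access control: read-only access for a reporting role.
    "GRANT USAGE ON WAREHOUSE DEV_WH TO ROLE REPORTING_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE REPORTING_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO ROLE REPORTING_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE REPORTING_RO",
]

cur = conn.cursor()
for stmt in governance_statements:
    cur.execute(stmt)
cur.close()
conn.close()
```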
Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum of 6 years of experience in data engineering or database development.
- Minimum of 3 years of hands-on experience with the Snowflake Data Cloud.
- Strong proficiency in:
- Snowflake features (Virtual Warehouses, Time Travel, Fail-Safe, Streams, Tasks, Snowpipe).
- Advanced SQL development and performance tuning.
- ETL/ELT tools such as Informatica, Talend, dbt, Matillion, or similar.
- Working with cloud platforms (AWS, Azure, or GCP).
- Knowledge of data modeling, warehousing concepts, and BI integration.
- Familiarity with scripting languages like Python or Shell for data transformation (preferred).
- Excellent problem-solving, analytical, and communication skills.