
Snowflake Data Engineer
- India
- Permanent
- Full-time
Responsibilities:
- Data Modeling & Warehousing: Design, implement, and optimize scalable data models and data warehouses using Snowflake to meet analytical and business requirements.
- ETL Development: Build, maintain, and optimize ETL/ELT pipelines with SnapLogic, dbt, and Python, ensuring efficient ingestion and transformation of structured and unstructured data.
- Lakehouse Architecture: Contribute to the development of a modern Lakehouse architecture, enabling unified data storage and advanced analytics.
- Performance Optimization: Monitor, troubleshoot, and fine-tune queries, pipelines, and workflows to ensure high performance and reliability.
- Automation & Integration: Develop reusable frameworks for automation of data workflows, orchestration, and integration with enterprise systems.
- Data Quality & Governance: Implement data validation, error handling, and monitoring mechanisms to ensure data accuracy, integrity, and compliance with governance policies.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions that meet organizational goals.
- Documentation & Best Practices: Maintain clear documentation of designs, workflows, and processes while advocating best practices for data engineering.
Requirements:
- 3–5 years of professional experience in data engineering or data warehousing roles.
- Strong expertise in Snowflake, including performance tuning, query optimization, and warehouse management.
- Proficiency in dbt (data build tool) for modeling, transformations, and version-controlled data workflows.
- Hands-on experience with ETL/ELT pipeline design using SnapLogic or similar tools.
- Solid understanding of Lakehouse architecture and modern data platform concepts.
- Strong programming skills in Python for data transformation, automation, and API integrations.
- Strong knowledge of SQL, including advanced query writing for analytics and reporting.
- Familiarity with data governance, data security, and compliance frameworks.
- Strong problem-solving skills with the ability to analyze complex data challenges and propose effective solutions.
- Excellent communication and collaboration skills, with the ability to work in cross-functional teams.
Nice to have:
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Exposure to orchestration tools such as Airflow, Prefect, or Dagster.
- Familiarity with CI/CD pipelines for data engineering.
- Experience working in an agile, fast-paced product or data-driven organization.