Senior Data Engineer
Assurant
- Bangalore, Karnataka
- Permanent
- Full-time

Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory, Snowflake Tasks, and Snowpipe for real-time and batch ingestion.
- Implement best practices for data modeling, transformation, and performance tuning within Snowflake.
- Build and manage pipelines connecting multiple structured and unstructured data sources across cloud and on-prem environments.
- Automate data quality checks, data lineage tracking, and error handling within ETL workflows.
- Develop and maintain Snowflake schemas, views, stored procedures, and materialized views.
- Configure and optimize Snowpipe for continuous data loading.
- Utilize Snowsight for monitoring query performance, cost optimization, and workload analysis.
- Implement role-based access control and ensure data security in Snowflake.
- Integrate Azure Data Factory with other Azure services (Blob Storage, Synapse, Key Vault).
- Design scalable cloud architectures and orchestrate pipelines across hybrid environments.
- Implement CI/CD pipelines for data workflows using GitHub Actions.
- Collaborate with business analysts and BI teams to enable Power BI dashboards backed by optimized Snowflake data models.
- Create semantic models and data marts to support self-service analytics and reporting.
- Develop Python scripts for automation, data processing, and custom integrations.
- Leverage Python-based frameworks (Pandas, PySpark, Airflow) to enhance pipeline capabilities.

Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data engineering, ETL development, and cloud data platforms.
- Strong proficiency in Snowflake, Azure Data Factory, and Python.
- Experience with CI/CD, data security, and performance optimization.
- Familiarity with BI tools (Power BI, Looker, etc.) and data modeling best practices.
- Excellent problem-solving skills and ability to work in a fast-paced environment.

Preferred qualifications:
- Knowledge of Airflow, PySpark, and data orchestration frameworks.
- Experience with real-time data ingestion and streaming architectures.
- Understanding of cost optimization in cloud environments.