Lead Data Engineer
Principal Financial
- Pune, Maharashtra
- Permanent
- Full-time
Responsibilities:
1) Architect enterprise-level Snowflake solutions; design and implement complex data warehousing solutions; optimize warehouse configurations and resource utilization.
2) Design, build, and operationalize large-scale enterprise data solutions and applications using AWS services such as S3, Glue, Lambda, SNS, EC2, IAM, and KMS, along with Snowflake features (Snowpipe, streams, tasks, stored procedures).
3) Design complex SQL-based data transformations with query performance and best practices in mind. Develop and maintain stored procedures and functions. Design efficient data models and schema structures.
4) Architect Python-based data processing solutions; design reusable Python libraries and frameworks.
5) Implement automated testing strategies; develop complex ETL/ELT processes using Python; integrate Python with Snowpark.
6) Collaborate with stakeholders to gather requirements, propose the best cloud solutions, and implement them effectively.
7) Follow established project execution processes, coding standards, and industry best practices.
8) Apply AWS, Snowflake, and data security principles.
Qualifications:
Must-have skills:
- Extensive experience with Snowflake native features: streams, tasks, stored procedures, Snowpipe, and data sharing.
- Hands-on experience with AWS services: S3, Glue, Glue Crawler, Lambda, SNS, EC2, IAM, and KMS.
- Strong hands-on Python skills and expertise in SQL.
- Proficiency with CI/CD pipelines in GitHub.
- Proven track record of leading teams of 2-5 members.
- Experience collaborating with business stakeholders.
Hands-on experience or familiarity with one or more of the following: databases, data visualization (Power BI), scripting, ETL (Informatica PowerCenter), and data mesh architectures.
Nice to have: Cloud certifications in AWS and/or Snowflake.