
Cloud Data Engineer
- Pune, Maharashtra
- Permanent
- Full-time
Responsibilities:
- Design, develop, and maintain automated data pipelines and ETL processes.
- Create and optimize data tables and views in Snowflake.
- Work with datasets across Azure, AWS, and various structured/unstructured formats.
- Ensure data security, privacy, and compliance with industry and organizational standards (e.g., BHP standards).
- Support and mentor junior team members by providing guidance on data modeling, data management, and data engineering best practices.
Requirements:
- Strong hands-on experience in data pipeline development, ETL scripting in Python, and handling data formats such as JSON and CSV.
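As a minimal sketch of the kind of ETL scripting this role calls for (the record layout and field names here are illustrative assumptions, not part of the posting):

```python
import csv
import io
import json

def json_to_csv(json_text, fieldnames):
    """Extract records from a JSON array and load them as CSV rows.

    A minimal extract-transform-load step: parse JSON, keep only the
    requested fields, and serialize the result as CSV text.
    """
    records = json.loads(json_text)
    out = io.StringIO()
    # extrasaction="ignore" silently drops fields not in `fieldnames`.
    writer = csv.DictWriter(out, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    for record in records:
        # Missing fields become empty strings rather than raising KeyError.
        writer.writerow({k: record.get(k, "") for k in fieldnames})
    return out.getvalue()

# Hypothetical source: two records, one with an extra field that is dropped.
raw = '[{"id": 1, "name": "a", "tmp": 9}, {"id": 2, "name": "b"}]'
print(json_to_csv(raw, ["id", "name"]))
```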
- Proficiency in AWS services such as:
  - AWS Glue
  - AWS Batch
  - AWS Step Functions
- Experience with Azure services, including:
  - Azure Data Factory
  - Azure Logic Apps
  - Azure Functions
  - Azure Blob Storage
- Solid understanding of:
  - Data management principles
  - Data structures & storage solutions
  - Data modeling techniques
- Strong programming skills in:
  - Python (with OOP concepts)
  - PowerShell
  - Bash scripting
- Advanced SQL skills, including writing and optimizing complex queries.
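A small, self-contained illustration of the aggregate-and-filter SQL this bullet describes, run against an in-memory SQLite database (the table, index, and data are hypothetical examples, not from the posting):

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 10.0), (2, 'acme', 25.0), (3, 'globex', 5.0);
    -- An index on the grouped column can let the engine avoid a full sort.
    CREATE INDEX idx_orders_customer ON orders(customer);
""")

# Aggregate per customer, then keep only totals above a threshold.
query = """
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING total > 8.0
    ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('acme', 35.0)]
```

SQLite is used here only because it ships with Python; the same GROUP BY/HAVING pattern applies in Snowflake or any other SQL engine named in this posting.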
- Working experience with Terraform and CI/CD pipelines for infrastructure and deployment automation.