ETL Data Engineer
The Techgalore
- Delhi
- Permanent
- Full-time
- PySpark
- AWS
- Redshift
- 4-6 years of professional technology experience, focused primarily on the following:
- 4+ years of experience developing data ingestion pipelines using PySpark (batch and streaming).
- 4+ years of experience with data-engineering services on AWS, e.g. Glue, Athena, DynamoDB, Kinesis, Kafka, Lambda, Redshift.
- 1+ years of experience working with Redshift.
- 2 years of experience developing and supporting ETL pipelines using enterprise-grade ETL tools such as Pentaho, Informatica, or Talend.
- Good knowledge of data modelling (design patterns and best practices).
- Experience with reporting technologies (e.g. Tableau, Power BI).