Data Engineer
SRKay Consulting Group
- Bangalore, Karnataka
- Permanent
- Full-time
Responsibilities:
- Develop and maintain ETL/ELT pipelines on GCP.
- Work with BigQuery datasets for transformation and analytics workloads.
- Implement batch and near real-time data ingestion pipelines.
- Collaborate with senior engineers and architects to implement best practices.
- Monitor data pipelines and ensure data quality and reliability.
- Support BI and analytics teams with curated datasets.
Required skills and experience:
- 2-4 years of experience in Data Engineering.
- Hands-on experience with Google Cloud Platform (GCP): BigQuery, Dataflow or Dataproc, and Cloud Storage.
- Strong proficiency in Python and SQL.
- Experience building ETL pipelines and working with structured/unstructured data.
- Familiarity with Airflow or Cloud Composer.
- Understanding of data warehousing concepts.
Good to have:
- Experience with Azure Data Factory (ADF).
- Knowledge of Spark or PySpark.
- Exposure to Git, CI/CD pipelines, and Agile workflows.
- Understanding of streaming concepts (Pub/Sub or Kafka).
Soft skills:
- Strong analytical and problem-solving mindset.
- Good communication and collaboration skills.
- Willingness to learn new cloud technologies.