
GCP Data Engineer
- Bangalore, Karnataka
- Permanent
- Full-time
Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines.
- Develop and optimize data models in GCP BigQuery.
- Work with Apache Spark / PySpark for large-scale data processing.
- Implement and manage data workflows using Airflow and dbt.
- Collaborate with analysts, data scientists, and business stakeholders to deliver high-quality data solutions.
- Ensure data quality, reliability, and compliance across platforms.
- Monitor and optimize performance of data pipelines and queries.
Requirements:
- Strong proficiency in SQL and Python.
- Hands-on experience with GCP BigQuery for data warehousing.
- Expertise in Apache Spark / PySpark for data transformations.
- Experience with dbt for data modeling and transformation.
- Knowledge of Apache Airflow for orchestration and scheduling.
- Solid understanding of data engineering best practices, performance optimization, and data governance.
- Experience working in agile, collaborative environments.