Senior Data Engineer
SRKay Consulting Group
- Bangalore, Karnataka
- Permanent
- Full-time
Responsibilities:
- Design and implement scalable batch and streaming data pipelines on GCP.
- Build and maintain data solutions using BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and related services.
- Architect robust data models for analytics, BI, and AI/ML use cases.
- Lead technical discussions and mentor junior data engineers.
- Optimize data processing performance, cost, and reliability.
- Implement CI/CD, DevOps, and Infrastructure-as-Code practices.
- Collaborate with data architects, analysts, and business stakeholders to deliver high-quality solutions.
- Ensure data governance, security, and compliance standards are followed.
Required qualifications:
- 6-9 years of experience in Data Engineering or Data Platform development.
- Strong hands-on experience with GCP services: BigQuery, Dataflow (Apache Beam), Dataproc/Spark, Pub/Sub, and Cloud Composer (Airflow).
- Advanced Python and SQL skills.
- Expertise in building ETL/ELT pipelines and data modeling.
- Experience with distributed processing frameworks (Spark preferred).
- Strong understanding of Data Lakehouse architecture.
Nice to have:
- Experience with Azure Data Factory (ADF) or the broader Azure data ecosystem.
- Knowledge of Terraform or other IaC tools.
- Exposure to ML/AI data pipelines.
- Experience with containerization (Docker/Kubernetes).
- Familiarity with BI tools such as Power BI or Looker.
Soft skills:
- Strong leadership and mentoring ability.
- Excellent communication and stakeholder management.
- Ability to work in fast-paced, Agile environments.