Job Description:
UST is looking for a skilled GCP Data Engineer with 5 to 10 years of experience to join our team and contribute to the development and maintenance of robust data pipelines and systems on the Google Cloud Platform (GCP), with a specific emphasis on leveraging PySpark for efficient data processing. As a Data Engineer, you will play a crucial role in ensuring the efficient flow, storage, and processing of data for our organization.

Responsibilities:
- Data Pipeline Development: Design, implement, and maintain scalable and efficient data pipelines on the GCP platform using tools such as Apache Beam, Dataflow, and PySpark.
- GCP Services Utilization: Leverage GCP services such as BigQuery, Cloud Storage, Dataflow, and others to build end-to-end data solutions.
- ETL Processing with PySpark: Develop and optimize ETL processes using PySpark to enable seamless data extraction, transformation, and loading.

Requirements:
- Experience: Proven experience as a Data Engineer with a focus on GCP data solutions, including hands-on experience with PySpark.
- Programming Skills: Proficiency in Python, with advanced experience in PySpark for efficient data processing.
- ETL Tools: Hands-on experience with GCP Dataflow, Apache Beam, or other ETL tools for data processing.

Skills:
data engineering, GCP data engineering, Python, ETL process, PySpark, data pipeline

About Company:
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.