About Albertsons Companies India:
Albertsons Companies is a leading food and drug retailer in the United States. As of February 22, 2025, the Company operated 2,270 retail stores with 1,728 in-store pharmacies, 405 associated fuel centers, 22 dedicated distribution centers and 19 manufacturing facilities. Albertsons Companies India is a vital extension of the Albertsons Companies Inc. workforce and important to the next phase in the company's technology journey to support millions of customers and lives every day.

Job Description

Position Title: Advance Engineer Software

Albertsons Media Collective is the retail media network arm of Albertsons Companies. It connects retail brands with shoppers by delivering personalized, digitally native, shopper-centric advertising experiences across both in-store and online channels. The platform leverages Albertsons' extensive first-party data and technology to help brands target and engage loyal shoppers and loyalty program members.

Albertsons Media Collective offers omnichannel advertising solutions, including native display, sponsored products, email, search, and app placements, as well as innovative offerings like Collective TV, a solution for planning, executing, and measuring video ad campaigns across streaming, digital video, and soon linear TV. The network is designed to drive sales, accelerate growth for partner brands, and deliver measurable results through data-driven insights and closed-loop measurement.

Position Purpose:
Data Engineering at Albertsons is inspired to build a best-in-class customer experience and revolutionize the food and drug retail industry. We are looking for people who are excited about re-imagining the grocery experience by harnessing the power of AI and digital technologies.

The Data Engineering team uses the Big Data paradigm to make data available for data scientists and a multitude of other business users to draw insights to delight our customers, improve store operations, optimize the supply chain, and proactively improve product lifecycles. You will enjoy working with one of the richest data sets in the world, cutting-edge technology, and the ability to see your insights turn into business impact on a regular basis. The candidate will have a background in computer science or a related technical field, experience working with large data sets, and a passion for enabling data-driven decision making. You are a self-starter, smart yet humble, with a bias for action.

Key Responsibilities:
Design, develop, and optimize BigQuery stored procedures and complex SQL queries.
Build and maintain robust ETL pipelines using Python and Google Cloud Dataflow.
Manage data ingestion and transformation workflows using Google Cloud Composer or Stonebranch.
Utilize Google Cloud Storage (GCS) for data staging and archival.
Implement and maintain scalable data models and data warehousing solutions.
Collaborate with cross-functional teams using Agile methodologies (Epics, Stories, Standups, Retrospectives).
Document data processes and flows using Confluence.
Track and manage tasks using Jira.
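To give a flavor of the responsibilities above, here is a minimal, hypothetical sketch of a transformation step that a Python ETL job (for example, a stage in a Dataflow pipeline) might apply to raw sales records before loading them into BigQuery. All field names and values are illustrative assumptions, not part of this role description.

```python
from datetime import datetime, timezone

def clean_sales_record(raw: dict) -> dict:
    """Normalize one raw sales record before warehouse loading.

    Hypothetical example of an ETL transform step; every field
    name here is an illustrative assumption.
    """
    return {
        "store_id": int(raw["store_id"]),
        "sku": raw["sku"].strip().upper(),
        # Amounts arrive as strings with a currency symbol; strip and cast.
        "amount_usd": round(float(raw["amount"].lstrip("$")), 2),
        # Normalize timestamps to UTC ISO-8601 for consistent querying.
        "sold_at": datetime.fromisoformat(raw["sold_at"])
                           .astimezone(timezone.utc)
                           .isoformat(),
    }

if __name__ == "__main__":
    row = clean_sales_record({
        "store_id": "42",
        "sku": " ab-123 ",
        "amount": "$19.990",
        "sold_at": "2025-02-22T10:00:00+05:30",
    })
    print(row)
```

In a real pipeline this kind of function would typically be wrapped in a Beam `Map` step or called per row by a Dataflow job, with the cleaned output written to a BigQuery table.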
Required Qualifications:
Advanced proficiency in SQL (BigQuery experience is highly preferred; mandatory for lead and manager roles).
Experience with Python (or another scripting language) and automation.
Hands-on expertise with GCP services: BigQuery, GCS, Dataflow, and Cloud Composer (mandatory for lead and manager roles).
Strong understanding of data modeling, ETL processes, and data warehousing concepts.
Experience with streaming systems such as Kafka.
Familiarity with version control systems like Git.
Proven experience working in Agile environments.
Nice to Have:
Experience with Power BI (at least one team member should have Power BI experience) or ThoughtSpot for data visualization and analytics.
Exposure to other cloud platforms, especially Microsoft Azure.
Awareness of Snowflake, Oracle, and DB2.
Expertise in AI, preferably in the context of data engineering.
Experience with Stonebranch for job scheduling.
Knowledge of CI/CD pipelines and DevOps practices.