- As a partner to product owners, the incumbent of this role is expected to help them articulate techno-functional requirements.
- Work with project managers on stakeholder definition and project planning (scope, schedule, and budget).
- Ensure data products are designed for scalability, supportability, and reusability.
- Lead the design and architecture of data products and solutions that meet business needs and align with the overall data strategy.
- Create complex enterprise datasets that adhere to enterprise technology and data protection standards.
- Deliver strategic infrastructure and data pipelines for optimal data extraction, transformation, and loading.
- Document solutions with architecture diagrams, dataflows, code comments, data lineage, entity relationship diagrams, and technical and business metadata.
- Design, engineer, and orchestrate scalable, supportable, and reusable datasets.
- Collaborate with senior business stakeholders, functional analysts, and data scientists to deliver robust data solutions that meet quality measures (availability, completeness, accuracy, etc.).
- Support continuous integration and continuous delivery, maintaining architectural runways for a series of products within a Value Chain.
- Implement data governance frameworks and tools to ensure data quality, privacy, and compliance.
- Develop and support advanced data solutions and tools.
- Participate in solution planning, incremental planning, product demos, and Inspect and Adapt events.
- Provide technical oversight and encourage security, quality, and automation.
- Assess technical capabilities across Value Streams to facilitate the selection and alignment of technical solutions within enterprise guardrails.

- BE in Computer Science, Electrical, Electronics, or any equivalent degree.
- Experience building data warehouses, data lakes, and data lakehouses.
- Experience architecting a data warehouse for at least one functional domain (e.g., Finance, Supply Chain).
- Experience with or knowledge of finance data concepts such as accounts payable, accounts receivable, general ledger, and financial close.
- Experience with or knowledge of Snowflake features and functionality.
- Expertise in complex SQL, Python scripting, and performance tuning.
- Understanding of Snowflake data engineering practices and dimensional modeling.
- Ability to optimize data models for performance and scalability.
- Experience with data security and data access controls in Snowflake.
- Expertise in setting up security frameworks, overall governance, and compliance (e.g., SOX).
- Advanced SQL skills for building queries that create various monitors in Snowflake.
- Strong expertise in setting up resource monitors for compute usage, security, etc. (see the sketch after this list).
- Demonstrated ability to analyze and interpret complex business processes and systems and how they relate to data requirements.
- Extensive experience applying best practices in data engineering and data visualization.
- Advanced experience creating interactive analytics solutions with technologies such as Power BI and Python.
- Extensive experience with cloud platforms such as Azure, including cloud-based data storage and processing technologies.
- Expertise in dimensional (star, snowflake, Data Vault) and transactional (third normal form) data modeling across OLTP, OLAP, NoSQL, and big data technologies.
- Familiarity with data frameworks and storage platforms such as Cloudera, Databricks, Dataiku, Snowflake, dbt, Coalesce, and data mesh.
- Experience developing and supporting data pipelines, including code, orchestration, quality, and observability.
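To illustrate the resource-monitor expertise listed above, here is a minimal sketch in Snowflake SQL; the monitor name, warehouse name, credit quota, and trigger thresholds are all hypothetical:

    -- Hypothetical monitor: caps monthly compute spend on one warehouse
    -- and suspends it once the quota is exhausted.
    CREATE RESOURCE MONITOR finance_wh_monitor
      WITH CREDIT_QUOTA = 100              -- hypothetical monthly credit budget
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 75 PERCENT DO NOTIFY     -- alert administrators early
               ON 100 PERCENT DO SUSPEND;  -- block new queries at the cap

    -- Attach the monitor to a (hypothetical) warehouse.
    ALTER WAREHOUSE finance_wh SET RESOURCE_MONITOR = finance_wh_monitor;

Note that in Snowflake, creating resource monitors is by default restricted to the ACCOUNTADMIN role, so governance of this kind sits alongside the security-framework responsibilities above.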
- Expert-level programming ability in multiple data manipulation languages (Python, Spark, SQL, PL/SQL).
- Intermediate ability to work with batch (ETL/ELT), on-demand (SQL), and streaming (messaging, queues) data integration methodologies (a minimal ELT sketch follows this list).
- Intermediate experience with DevOps and CI/CD principles and tools.
- Proficiency in Azure Data Factory.
- Experience with data governance frameworks and tools to ensure data quality, privacy, and compliance.
- Solid understanding of cybersecurity concepts such as encryption, hashing, and certificates.
- Strong analytical skills: critically evaluate data presented and mapped from multiple sources, reconcile data conflicts, decompose high-level data into details, and abstract low-level information into a more general understanding.
- Continually learn new modules, ETL tools, and programming techniques to deliver business value by enhancing self-knowledge and capability.
- Established as a key data leader at the enterprise level.
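As a concrete example of the batch ELT work referenced above, here is a minimal sketch of an incremental load in SQL; the staging and target tables and their columns are hypothetical:

    -- Hypothetical ELT step: merge newly staged rows into a target table.
    MERGE INTO analytics.dim_customer AS tgt
    USING staging.customer_updates AS src
       ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN UPDATE SET
         tgt.customer_name = src.customer_name,
         tgt.updated_at    = src.loaded_at
    WHEN NOT MATCHED THEN INSERT (customer_id, customer_name, updated_at)
         VALUES (src.customer_id, src.customer_name, src.loaded_at);

In an orchestrated pipeline, a statement like this would typically run as a single scheduled step (e.g., a Snowflake task or an Azure Data Factory activity), with row counts and failures logged to support the quality and observability expectations above.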