
Ab Initio Data Engineers
- Pune, Maharashtra
- Permanent
- Full-time
- Master of Science (preferably in Computer and Information Sciences or Business Information Technology) or an Engineering degree in a related field
- Minimum of 4 years' experience in the Data Management, Data Engineering & Data Governance space, with hands-on project experience using Ab Initio, PySpark, Databricks and SAS
- Should have worked on large data initiatives and have exposure to different ETL/data engineering tools
- Work with business stakeholders to gather and analyze business requirements, building a solid understanding of the Data Analytics and Data Governance domain
- Document, discuss and resolve business, data and reporting issues within the team, across functional teams, and with business stakeholders
- Should be able to work independently and propose solution designs
- Build optimized data processing and data governance solutions using the given toolset
- Collaborate with delivery leadership to deliver projects on time adhering to the quality standards
- Must have strong foundational Data Warehousing/Data Engineering skills, with exposure to different types of data architecture
- Strong conceptual understanding of Data Management & Data Governance principles
- Hands-on experience using:
  - Ab Initio, with strong skills in GDE, Express>It, Conduct>It and Metadata Hub
  - Databricks, with fluency in PySpark and Spark SQL
  - Multiple databases such as Oracle, SQL Server and Netezza, as well as cloud-hosted DWHs such as Snowflake, Redshift, Synapse and BigQuery
  - Azure services relevant to data engineering: ADF, Databricks, Synapse Analytics
- Experience working in an agile software delivery model is required.
- Prior data modelling experience is mandatory, preferably for DWH/data marts/Lakehouse
- Discuss and document data and analytics requirements with cross-functional teams and business stakeholders
- Analyze requirements and produce technical specifications, source-to-target mappings, and data models
- Manage changing priorities during the software development lifecycle (SDLC)
- Transform business/functional requirements into technical specifications
- Azure Certification relevant for Data Engineering/Analytics
- Experience and knowledge of one or more domains within Banking and Financial Services
- Exposure to tools such as Talend, Informatica and SAS for data processing
- Prior experience in converting Talend/Informatica/Mainframe based data pipelines to Ab Initio will be a big plus
- Data validation and testing using SQL or any tool-based testing methods
- Reporting/visualization tool experience - Power BI
- Exposure to Data Governance projects including Metadata Management, Data Dictionary, Data Glossary, Data Lineage and Data Quality aspects.