Informatica (ETL) Developer
Sutherland
- Hyderabad, Telangana
- Permanent
- Full-time
- Data Pipeline / ETL (40%): Designs and implements data stores, ETL data flows, and data pipelines that connect and prepare operational systems data for analytics and business intelligence (BI) systems.
- Support & Operations (10%): Manages production deployments, automation, monitoring, job control, and production support. Works with business users to test programs in the Development and Quality environments. Investigates issues using vendor support websites.
- Data Modeling / Designing Datasets (10%): Reviews and understands business requirements for assigned development tasks and applies standard data modeling and design techniques based on a detailed understanding of those requirements.
- Data Architecture and Technical Infrastructure (10%): Plans and drives the development of data engineering solutions, ensuring they balance functional and non-functional requirements. Monitors the application of data standards and architectures, including security and compliance.
- SDLC Methodology & Project Management (5%): Contributes to technical transitions between the development, testing, and production phases of the solution lifecycle, and facilitates change control, problem management, and communication processes.
- Data Governance and Data Quality (5%): Identifies and investigates data quality/integrity problems, determines their impact, and provides solutions.
- Metadata Management & Documentation (5%): Documents all processes and mappings related to data pipeline work and follows the development best practices adopted by the BIA team.
- End-User Support, Education and Enablement (5%): Contributes to training and data literacy initiatives within the team and the end-user community.
- Innovation, Continuous Improvement & Optimization (5%): Continuously improves and optimizes existing data engineering assets and processes.
- Partnership and Community Building (5%): Collaborates with other IT teams, the business community, data scientists, and other architects to meet business requirements. Interacts with DBAs on data designs optimized for data engineering solution performance.
- Experience: 4 to 7 years of proven experience in the development, maintenance, and enhancement of data pipelines (ETL/ELT) and related processes, with thorough knowledge of star/snowflake schemas.
- Developing complex SQL queries and performing SQL optimization (see the star-schema sketch after this list).
- Experience with other cloud platforms (e.g., AWS, Google Cloud) and multi-cloud environments.
- Development experience must cover the full life cycle, including business requirements gathering, data sourcing, testing/data reconciliation, and deployment, within a Business Intelligence/Data Warehousing architecture.
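
To make the star-schema and SQL expectations concrete, here is a minimal, self-contained sketch: a hypothetical fact_sales table rolled up across dim_date and dim_product dimensions, run against an in-memory SQLite database. Every table and column name is illustrative, not drawn from the posting.

```python
import sqlite3

# Hypothetical star schema: one fact table and two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INT, month INT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_key INT, product_key INT, amount REAL);

    INSERT INTO dim_date    VALUES (1, 2024, 1), (2, 2024, 2);
    INSERT INTO dim_product VALUES (10, 'Hardware'), (20, 'Software');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 20, 250.0), (2, 10, 75.0);
""")

# A typical BI rollup: join the fact table to its dimensions and aggregate.
query = """
    SELECT d.year, d.month, p.category, SUM(f.amount) AS total_sales
    FROM fact_sales AS f
    JOIN dim_date AS d    ON f.date_key = d.date_key
    JOIN dim_product AS p ON f.product_key = p.product_key
    GROUP BY d.year, d.month, p.category
    ORDER BY d.year, d.month, p.category
"""
for year, month, category, total in conn.execute(query):
    print(year, month, category, total)  # e.g. 2024 1 Hardware 100.0
```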
- Skills:
- Proficiency in Informatica PowerCenter / IDMC tools (4+ years).
- Informatica IDMC (Cloud Data Integration / Application Integration) (3+ years).
- Data pipeline development using cloud platforms.
- Snowflake data warehousing (2+ years).
- Salesforce (SFDC) integration (2+ years).
- Informatica Salesforce (SFDC) data connector configuration.
- Real-time and batch data integration between Salesforce and Snowflake, in both directions (a rough sketch follows this list).
- Data migration and ETL/ELT processes
- API-based integrations and data orchestration
- Strong SQL and data modeling skills
- Performance tuning and troubleshooting of data pipelines
- Certifications: Snowflake, Salesforce connector, and IDMC tool certifications are a plus.
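
As a rough illustration of the Salesforce-to-Snowflake batch integration listed above (and not the Informatica-based approach the role would actually use), the sketch below assumes the `simple-salesforce` and `snowflake-connector-python` packages; every credential, environment variable, object, and table name is a placeholder.

```python
"""Illustrative batch sync of Salesforce Accounts into a Snowflake staging table."""
import os

import snowflake.connector
from simple_salesforce import Salesforce

# Extract: pull records from Salesforce via SOQL (query_all handles paging).
sf = Salesforce(
    username=os.environ["SFDC_USERNAME"],
    password=os.environ["SFDC_PASSWORD"],
    security_token=os.environ["SFDC_TOKEN"],
)
records = sf.query_all("SELECT Id, Name, AnnualRevenue FROM Account")["records"]

# Transform: drop Salesforce metadata and shape rows for bulk loading.
rows = [(r["Id"], r["Name"], r["AnnualRevenue"]) for r in records]

# Load: bulk-insert into a hypothetical Snowflake staging table.
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    conn.cursor().executemany(
        "INSERT INTO stg_account (id, name, annual_revenue) VALUES (%s, %s, %s)",
        rows,
    )
finally:
    conn.close()
```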
- Understanding of Data Architecture
- Knowledge of ETL and data engineering standards and best practices for the design and development of data pipelines and data extract, transform and load processes
- Design, build, and test data products based on feeds from multiple systems, using a range of different storage technologies, access methods, or both.
- Knowledge of data warehousing concepts, including multi-dimensional models and ETL logic for maintaining star-schemas
- Good understanding of the concepts and principles of data modeling.
- Ability to produce, maintain and update relevant data models for specific needs.
- Ability to reverse-engineer data models from a live system (illustrated in the sketch after this list).
- SQL programming desirable (e.g., stored procedure development).
- Proficient in data analysis, defect identification, and resolution.
- Strong professional verbal and written communication skills.
- Ability to work with little supervision and adapt to changing priorities.
- Ability to analyze requirements and troubleshoot problems.
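
The reverse-engineering bullet above can be made concrete with a small sketch: recovering tables, columns, and foreign keys from a live database's catalog metadata rather than from documentation, shown here against an in-memory SQLite instance with illustrative table names.

```python
import sqlite3

# Stand-in for a "live system" to introspect; table names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        amount      REAL
    );
""")

# Recover the data model from catalog metadata.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
for table in tables:
    print(f"table {table}:")
    # PRAGMA table_info yields (cid, name, type, notnull, default, pk) per column.
    for _cid, name, ctype, notnull, _default, pk in conn.execute(
            f"PRAGMA table_info({table})"):
        flags = " PRIMARY KEY" if pk else (" NOT NULL" if notnull else "")
        print(f"  {name} {ctype}{flags}")
    # PRAGMA foreign_key_list yields (id, seq, ref_table, from_col, to_col, ...).
    for fk in conn.execute(f"PRAGMA foreign_key_list({table})"):
        print(f"  FK {fk[3]} -> {fk[2]}({fk[4]})")
```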
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Hybrid work model: In-office on Monday, Wednesday, and Friday.
- Working time: India shift, running until 11:30 PM IST.
- Work Location: Pune / Hyderabad