Informatica (ETL) Solution Architect
Sutherland
- Hyderabad, Telangana
- Permanent
- Full-time
- Data Architecture and Technical Infrastructure (40%): Plans and drives the development of data engineering solutions, ensuring they balance functional and non-functional requirements. Monitors the application of data standards and architectures, including security and compliance.
- Innovation, Continuous Improvement & Optimization (15%): Continuously improves and optimizes existing Data Engineering assets/processes.
- Data Pipeline/ETL (15%): Designs and implements data stores and ETL data flows and data pipelines to connect and prepare operational systems data for analytics and business intelligence (BI) systems.
- Data Modeling/Designing Datasets (10%): Reviews and understands business requirements for assigned development tasks and applies standard data modeling and design techniques based on a detailed understanding of those requirements.
- Support & Operations (5%): Manages production deployments and automation, monitoring, job control and production support. Works with business users to test programs in Development and Quality. Investigates issues using vendor support website(s).
- SDLC Methodology & Project Management (5%): Contributes to technical transitions between development, testing, and production phases of solutions' lifecycle, and the facilitation of the change control, problem management, and communication processes.
- Data Governance and Data Quality (5%): Identifies and investigates data quality/integrity problems, determines impact, and provides solutions.
- Metadata Management & Documentation (5%): Documents all processes and mappings related to data pipeline work and follows development best practices as adopted by the BIA team.
- End-User Support, Education and Enablement (5%): Contributes to training and data literacy initiatives within the team and the end-user community.
- Partnership and Community Building (5%): Collaborates with other IT teams, the business community, data scientists, and other architects to meet business requirements. Interacts with DBAs on data designs optimal for data engineering solutions.
- Experience: 9+ years of proven experience in the development, maintenance, and enhancement of data pipelines (ETL/ELT) and related processes, with thorough knowledge of star/snowflake schemas.
- Experience developing complex SQL queries and performing SQL optimization.
- Development experience must be full Life Cycle experience including business requirements gathering, data sourcing, testing/data reconciliation, and deployment within Business Intelligence/Data Warehousing Architecture.
- Skills:
- Strong experience in designing end-to-end data integration and cloud architecture solutions. (4+ years)
- Informatica IDMC (Cloud Data Integration / Application Integration) (5+ years)
- Strong expertise in Informatica IDMC (Cloud Data Integration and Application Integration) for building scalable integration frameworks. (6+ years)
- Proven experience in architecting data pipelines between cloud systems, Snowflake, and Salesforce (SFDC).
- Hands-on knowledge of Informatica Salesforce Connector configuration and integration design.
- Experience in real-time and batch data integration patterns and architecture.
- Strong understanding of data warehousing concepts, preferably with Snowflake.
- Ability to define solution architecture, data flow design, and integration standards across enterprise systems.
- Experience in stakeholder engagement, requirement gathering, and translating business needs into technical solutions.
- Strong knowledge of APIs, microservices, and cloud integration best practices.
- Ability to guide development teams and ensure alignment with architectural standards and governance.
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Hybrid work model: In-office on Monday, Wednesday, and Friday.
- Working Time: India shift timing, until 11:30 PM IST.
- Work Location: Pune / Hyderabad