
Data Engineer, AVP
- Bangalore, Karnataka
- Permanent
- Full-time
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for Industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive Hospitalization Insurance for you and your dependents
- Accident and Term life Insurance
- Complimentary health screening for employees aged 35 years and above
- 10-15 years of IT Development experience
- Established adapter/API-based connectivity to REST and SOAP web services over transports such as HTTP and JMS, using Web Services Listener, HTTP Client and Web Services SOAP Client components
- Designed technical/integration architectures, including development, runtime and operations architectures, automated business process models and cloud-based services.
- Strong experience in the testing and implementation phases of the software development life cycle.
- Good experience in the Telecom and Banking domains; a self-motivated, quick learner who is willing to adapt to new challenges and technologies
- Several years of experience in DWH, OLTP and ETL, specifically on Informatica, Oracle and SQL Server.
- Experience integrating Kafka and MQ for streaming data ingestion (a minimal consumer sketch follows this list, for illustration).
- Good hands-on experience in performance tuning and optimization.
- Experience with cloud technologies, with a focus on GCP.
- Experience working within Agile Scrum teams.
- Exposure to DevOps activities, build and deployment on the data engineering side
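For illustration only (not an additional requirement): a minimal sketch of the kind of Kafka-based streaming ingestion referenced above, assuming the kafka-python client; the topic name, broker address and message format are hypothetical.

```python
# Illustrative sketch only: consume JSON messages from a hypothetical topic.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "asset-trades",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    group_id="data-platform-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    record = message.value
    # In a real pipeline these records would be landed in staging tables
    # or the DWH; here the parsed payload is simply printed.
    print(message.topic, message.partition, message.offset, record)
```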
- Manage BAU/production stability by closing incident tickets and delivering stable, reliable and permanent remediations across Asset Management data platforms.
- Perform and participate in platform migration and shared services – Informatica IICS cloud-based setup and onboarding of applications
- Migrate on-premises PowerCenter workflows to IICS-based Informatica cloud (IDMC) orchestrations
- Help standardize database platforms so that data management can be applied at scale
- Ability to schedule jobs and to monitor and resolve issues related to shell scripts, Oracle procedure calls, file transfers, etc. (a small monitoring sketch follows this list of responsibilities, for illustration)
- Knowledge of integration with other infrastructure services using shell scripts, plugins and API-based connections.
- Document technical solutions and build the KOP for the L2 monitoring team.
- Knowledge of Linux, Unix and Windows environments.
- Exposure to automation and programming through UNIX shell scripting.
- Analytical skills to understand user requirements, procedures and problems in order to automate processing or improve performance.
- Work with business users/analysts to understand requirements and transform them into deliverables.
- Programming in Oracle PL/SQL.
- Perform unit/system testing and provide support to the Quality Assurance and Release Management teams.
- Ensure process adherence for development activities.
- Work in an Agile methodology where development teams may also support production support teams with L2/L3 activities.
- Flexibility to learn other relevant technologies and perform as per project needs.
- Good knowledge of, and experience working with, different automation tooling in IT operations.
- Very good communication & coordination skills.
- Very good team player.
- Very good analytical and problem-solving skills.
- Self-driven, committed, process-oriented and able to handle challenging situations.
- Open to taking up additional responsibilities as required by the organization.
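Again purely for illustration: a minimal Python sketch of the scheduled-job monitoring mentioned above, wrapping a shell-script run and an Oracle procedure call with basic success/failure checks. The script path, connection details, credentials and procedure name are all hypothetical, and the python-oracledb driver is assumed.

```python
# Illustrative sketch only: check a shell-script step and an Oracle
# procedure step of a scheduled job. All names below (script path,
# DSN, credentials, procedure) are hypothetical.
import subprocess
import sys

import oracledb  # pip install oracledb (python-oracledb driver assumed)


def run_shell_step(script: str) -> None:
    """Run a shell script and fail loudly on a non-zero exit code."""
    result = subprocess.run(["bash", script], capture_output=True, text=True)
    if result.returncode != 0:
        print(f"{script} failed (rc={result.returncode}): {result.stderr}", file=sys.stderr)
        raise SystemExit(result.returncode)
    print(f"{script} completed OK")


def run_oracle_step(dsn: str, user: str, password: str) -> None:
    """Call a (hypothetical) PL/SQL load procedure and commit."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.callproc("etl_owner.load_daily_positions")  # hypothetical procedure
        conn.commit()
    print("Oracle procedure completed OK")


if __name__ == "__main__":
    run_shell_step("/opt/etl/bin/transfer_files.sh")          # hypothetical script
    run_oracle_step("dbhost/ORCLPDB1", "etl_user", "secret")  # hypothetical connection
```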
- A fluent communicator, with experience at large banks and financial groups
- 10+ years of technology and domain experience handling Asset Management enterprise data platforms
- Has performed at least a couple of migrations of Oracle databases and hosting clusters, with sound knowledge of the RDBMS
- Good experience with SCD (types 1-4) and rank-based ETL topologies, and has extensively developed Informatica workflows
- Good skills in managing Oracle 12c, Exadata platforms, Data Guard and HPE clusters
- Good skills in Unix/Linux Bash and Python scripting
- Good knowledge of data repositories – NFS, DFS and cloud-based storage (SSD, PSD, RSD, GCB)
- Good knowledge of Asset Management back-office and front-office fund accounting and of the key asset flows supporting ledgers and financial AuM
- Good experience in TIBCO design, development and administration.
- Strong experience with iPaaS (CloudHub) and cloud hosting (AWS, GCP)
- Proficiency in various integration mechanisms such as point-to-point and pub/sub; capable of developing and incorporating integration solutions in IBM MQ Cloud Pak based environments
- Has worked extensively in a Scrum environment, with active involvement in daily meetings, Scrum ceremonies and reviews.
- Experience in developing and deploying microservices architectures using Spring Boot, Spring Batch, Spring Data or similar frameworks; strong knowledge of RESTful web services and related technologies such as JSON, Swagger and XML (a minimal REST client sketch appears after the product list below, for illustration)
- Experience in BW 5.X, TIBCO EMS, BW 6.X, Web Services and SOA, with the following suite of products:
- a) Rendezvous 64-bit 8.4.4
- b) TIBCO TRA Suite 5.10.0
- c) TIBCO BusinessWorks 5.13.0
- d) TIBCO ActiveMatrix Adapter 7.2.0 for Database
- e) TIBCO ActiveMatrix Adapter 7.0 for Files (SFTP)
- f) TIBCO BW Plugin SFTP 1.1.0 HF02
- g) TIBCO BW Plugin MQ 7.7.0
- h) IBM WebSphere MQ Client 8.0.0.5
- i) JRE dependency
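For illustration of the RESTful/JSON skills referenced above (not part of the product list): a minimal sketch of consuming and calling a RESTful JSON endpoint with Python's requests library; the URL and payload fields are hypothetical.

```python
# Illustrative sketch only: GET and POST against a hypothetical REST API.
import requests  # pip install requests

BASE_URL = "https://api.example.com/v1"  # hypothetical service

# Fetch a JSON resource and read a couple of (made-up) fields.
resp = requests.get(f"{BASE_URL}/funds/123", timeout=10)
resp.raise_for_status()
fund = resp.json()
print(fund.get("name"), fund.get("aum"))

# Post a JSON payload back to another (made-up) endpoint.
payload = {"fundId": 123, "ccy": "EUR", "nav": 101.25}  # hypothetical fields
resp = requests.post(f"{BASE_URL}/valuations", json=payload, timeout=10)
resp.raise_for_status()
print("created:", resp.json())
```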
- Experience with Informatica Intelligent Cloud Services (IICS), supporting integration configurations with iPaaS through connected apps, APIs, microservices and web services
- Experience migrating workflows/repositories from on-premises Informatica PowerCenter (9.x) to IICS on GCP or AWS
- Experience with shell scripting/Python, web services, XML, JSON and real-time data integration using IICS CAI (Cloud Application Integration) to build ETL pipelines, data factories, data flows and data replication
- Experienced in creating, monitoring, debugging and publishing processes in the Informatica Cloud Real Time (ICRT) process designer/console using service connectors such as REST, JDBC and File.
- Deliver hybrid, heterogeneous integrations of cloud and on-premises systems using Informatica Data Integration Services
- Hands-on creation of resources in catalog admin and onboarding of various types of data sources and relational databases (SQL Server, Oracle, PostgreSQL) via SQL, ODBC and JDBC APIs
- Experienced with SCD (types 1-6) mappings to perform ETL data transformations on IICS using Update Strategy, Lookup, Stored Procedure, Router, Filter, Sequence Generator, Joiner, Aggregator and Expression transformations (an illustrative Type 2 sketch follows this list)
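In the role itself the SCD mappings above are implemented in Informatica/IICS; the following pandas sketch is only a language-agnostic illustration of the underlying Type 2 idea (expire changed rows, insert new current versions), with made-up column names and data.

```python
# Illustrative sketch only: Slowly Changing Dimension Type 2 logic in pandas.
# Real implementations live in Informatica/IICS mappings.
import pandas as pd

HIGH_DATE = pd.Timestamp("2099-12-31")  # sentinel "open" end date

# Existing dimension table (hypothetical columns and data).
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "segment": ["retail", "private"],
    "valid_from": pd.to_datetime(["2023-01-01", "2023-01-01"]),
    "valid_to": [HIGH_DATE, HIGH_DATE],
    "is_current": [True, True],
})

# Incoming snapshot: customer 2 changed segment, customer 3 is new.
incoming = pd.DataFrame({
    "customer_id": [2, 3],
    "segment": ["institutional", "retail"],
})
load_date = pd.Timestamp("2024-06-01")

current = dim[dim["is_current"]]
merged = incoming.merge(current, on="customer_id", how="left", suffixes=("", "_old"))

changed = merged[merged["segment_old"].notna() & (merged["segment"] != merged["segment_old"])]
new_rows = merged[merged["segment_old"].isna()]

# 1) Expire the current versions of changed keys.
expire_mask = dim["is_current"] & dim["customer_id"].isin(changed["customer_id"])
dim.loc[expire_mask, "valid_to"] = load_date
dim.loc[expire_mask, "is_current"] = False

# 2) Insert new current versions for changed and brand-new keys.
inserts = pd.concat([changed, new_rows])[["customer_id", "segment"]].assign(
    valid_from=load_date, valid_to=HIGH_DATE, is_current=True
)
dim = pd.concat([dim, inserts], ignore_index=True)

print(dim.sort_values(["customer_id", "valid_from"]))
```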
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs