Software Engineering Lead Analyst - HIH - Evernorth
Cigna
- Hyderabad, Telangana
- Permanent
- Full-time
- Design and implement ETL processes to extract, transform, and load data from various sources.
- Conduct research and perform proof-of-concept (POC) activities.
- Continuously learn and master new technologies.
- Use vibe coding tools to improve development velocity.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize and maintain existing ETL workflows for improved efficiency and performance.
- Ensure data integrity and quality through rigorous testing and validation.
- Create and implement best practices to ensure successful delivery and support of a large data transformation layer.
- Collaborate with other team members to ensure timely and successful delivery of production ready applications.
- Prepare detailed technical documentation.
- 5-8 years of relevant ETL development experience.
- 4+ years of experience with Databricks.
- 4+ years of experience with Python.
- Hands-on experience with building data integration solutions.
- Experience with both NoSQL and relational database development required – e.g., Mongo, Postgres, Teradata, SQL Server.
- Experience with Data Warehouse, Data Lake, and Lakehouse technologies.
- Strong understanding of database concepts and experience with query optimization and performance tuning.
- Hands-on experience with scripting languages like Python.
- Hands-on experience with Databricks and Airflow monitoring.
- Excellent problem-solving skills and the ability to analyze complex issues and implement effective solutions.
- Experience with Denodo Platform administration including VDP server configuration, data source setup, view creation (base views, derived views, interface views), role-based access control (RBAC), and cache management.
- Hands-on experience with Denodo Solution Manager for environment promotion, deployment management, and multi-environment lifecycle (DEV → PROD) governance.
- Experience configuring and tuning Presto/MPP engine within Denodo, including memory settings, concurrent query limits, and query cost optimization.
- Familiarity with Denodo SSO integration using SAML/OAuth providers (e.g., Okta) and managing user/group mappings across environments.
- Experience with Denodo JDBC/ODBC connectivity to BI tools such as Tableau, including driver configuration and connection troubleshooting.
- Hands-on experience in automating code/artifact migration across environments (DEV → UAT → PROD) using CI/CD pipelines, scripting, or platform-native tools such as Denodo Solution Manager, Databricks Repos, or AWS CodePipeline.
- Python, Databricks, Cloud Experience (AWS/Azure)
- Denodo VDP, Presto/MPP, Data Virtualization
- Data Virtualization tools, Databricks Certifications, Cloud Certification (AWS Developer Associate or similar)
- Denodo Platform 8.0+ certification or equivalent hands-on admin experience.
- Experience deploying and managing Denodo on AWS EC2, including OS-level configuration, JVM tuning, and log management.