Software Engineering Lead Analyst - HIH - Evernorth
The Cigna Group
- Hyderabad, Telangana
- Permanent
- Full-time
- Snowflake Expertise: Design, develop, and optimize data pipelines using Snowflake, including building and maintaining multi-layered data models. Optimize queries, ensure performance, and manage the cost efficiency of Snowflake environments.
- Cloud Experience: Extensive experience working with AWS or other cloud platforms (such as Azure or Google Cloud) to design, deploy, and manage scalable data solutions. Proficient in utilizing cloud-native services for data storage, processing, and orchestration (e.g., S3, EMR, EC2, Redshift, Athena, Lambda, Glue). Familiar with setting up and managing cloud environments for optimal performance and cost-efficiency.
- Big Data Engineering: Lead the design and development of Big Data solutions using Apache Spark (Python/Scala), enabling the processing of petabytes of data in both real-time and batch modes. Work with both structured and unstructured data to build efficient data architectures.
- Data Orchestration & Integration: Develop and manage end-to-end ETL/ELT pipelines using Apache Airflow, Apache NiFi, DBT, or other equivalent orchestration tools. Automate data workflows, schedule tasks, and ensure the reliability of data pipelines.
- Data Modeling: Design and implement scalable and efficient data models for data storage and retrieval, ensuring data accessibility and optimization. Work on building and managing multi-layered data models that facilitate easy analysis and reporting.
- Real-Time Data Streaming: Work with Kafka or other streaming technologies to enable real-time data ingestion and processing. Build and optimize stream processing solutions to integrate event-driven data sources into the pipeline.
- Collaboration & Leadership: Work with cross-functional teams, including product and engineering, to develop and implement data solutions. Manage and coordinate with offshore development teams to deliver high-quality data engineering solutions on time and within budget.
- SQL & Database Management: Write complex SQL queries to transform and process data across relational and columnar databases. Design and maintain scalable database architectures for both operational and analytical workloads.
- Innovation & Optimization: Continuously seek ways to improve the performance, scalability, and reliability of the data platform. Drive innovation by researching and applying the latest tools and technologies in the data engineering space.
- 5-8 years of experience in Data Engineering, with a proven track record of designing and building data pipelines and architectures at scale.
- At least 4 years of hands-on experience with Snowflake, including data modeling, query optimization, and designing and implementing multi-layered data architectures.
- Experience with cloud services, preferably AWS (e.g., S3, EMR, EC2, Redshift, Athena, Lambda, Glue), for designing, deploying, and managing scalable data solutions.
- Strong experience in Big Data technologies like Spark (Python/Scala) for large-scale data processing.
- Proficiency in Data Orchestration and Integration tools, such as Apache Airflow, Apache NiFi, DBT, or equivalent.
- Expertise in working with relational (e.g., SQL Server, PostgreSQL) and columnar databases (e.g., AWS Redshift, Snowflake), with advanced SQL query writing skills.
- Hands-on experience with Kafka or equivalent streaming data systems to process real-time data.
- Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems.
- Proven ability to collaborate with offshore teams and coordinate across time zones to deliver successful projects.
- Experience with version control systems like Bitbucket or GitHub, and project management tools such as Jira, with familiarity in working within an Agile/Sprint model.
- Good to have: experience with Jenkins for continuous integration and deployment (CI/CD) pipelines and Terraform for infrastructure-as-code (IaC) automation.
- 5-8 years of experience
- College degree (Bachelor's) in a related technical/business area, or equivalent work experience.
- Exposure to AWS
- Healthcare experience
- Coaching of team members