Senior Data Engineer

FM India

  • Bangalore, Karnataka
  • Permanent
  • Full-time
  • 5 hours ago
  • Apply easily
Job Description

About us:
We are a highly successful 190-year-old, Fortune 500 commercial property insurance company of 6,000+ employees with a unique focus on science and risk engineering. Businesses worldwide trust our expertise to protect their assets, relying on our comprehensive risk assessments and robust, engineering-based insurance solutions to safeguard against fire, natural disasters, and other perils. Serving over a quarter of the Fortune 500 and major corporations globally, we deliver data-driven strategies that enhance resilience, ensure business continuity, and empower organizations to thrive.

FM India is a strategic location for driving our global operational efficiency. Our presence in India allows us to leverage the country's talented workforce and advance our capabilities to serve our clients better. We have diverse corporate functions emphasizing research, advanced technologies such as AI and analytics, risk engineering, finance, marketing, HR, and more, working together to provide innovative solutions and nurture lasting relationships, from co-workers to clients.

Role Title: Senior Data Engineer

Position Summary:
The Senior Data Engineer designs, builds, and supports high-quality, reliable, and scalable data solutions that enable analytics, reporting, and advanced data use cases across the organization. Senior Data Engineers operate with a high degree of independence, contribute to solution design decisions, and apply DataOps practices to improve reliability and efficiency. Individuals in this role typically lead small to moderate initiatives and serve as strong contributors on larger, more complex programs, without formal people management responsibility.

The role is responsible for analysis, data modeling, data collection, data integration, and preparation of data for consumption. This includes creating and managing data infrastructure, designing and implementing data pipelines, and verifying data.
Along with the team, the Senior Data Engineer is responsible for ensuring the highest standards of data quality, security, and compliance; displays personal accountability for successful outcomes; and supports quality efforts within the team. The role interfaces with colleagues and other stakeholders to evaluate defined business requirements and processes, uses available approved technologies, and implements methods to improve data reliability and quality, combining raw information from different sources to create consistent data sets. It requires strong DataOps capabilities and hands-on data engineering expertise. Those holding this position are typically assigned to lead small to medium scale projects and participate as part of a development team on larger projects.

Job Responsibilities:

Data Engineering & Solution Development:
  • Develop solid knowledge of structured and unstructured data sources within each product journey (Underwriting and Risk; Client Service, Sales and Marketing; Claims; Account and Location Engineering) as well as emerging data sources (purchased data sets, external data, etc.)
  • Apply data modeling techniques to create efficient, well-structured data sets that support analytics, reporting, and downstream consumption.
  • Implement ETL/ELT solutions using approved analytics and cloud platforms, including Fabric, Synapse Analytics, SQL Server, SSIS, Azure Data Factory, data lakes, and others as required.
  • Ensure tables and views are designed for data integrity, efficiency, performance, and ease of comprehension.
  • Partner with Data Analytics team members, product owners, developers, solution architects, business analysts, data engineers, data analysts, data scientists, and others to understand data and reporting needs.
  • Validate code through detailed and disciplined testing, and participate in peer code review to ensure solutions are accurate.
  • Ensure data solutions meet standards for accuracy, performance, scalability, and maintainability.
  • Design, develop, and maintain data pipelines, ETL/ELT processes, and storage solutions for structured and unstructured data moving into and out of Data Analytics data assets, across enterprise product domains (e.g., Underwriting, Risk, Claims, Client Services, Sales and Marketing).
Data Quality, Reliability & Operations:
  • Analyze and resolve data quality issues by identifying root causes and implementing corrective actions.
  • Support data profiling, cleansing, and transformation activities in partnership with analytics and business teams.
  • Monitor pipeline performance, storage capacity, and system reliability; recommend and implement optimizations as needed.
  • Provide production support, including timely remediation of issues using appropriate validation and deployment practices.
  • Consult with DBAs and team members on configuration and maintenance of warehouse infrastructure.
  • Understand and design data relationships between business and data subject areas
  • Follow standards for naming conventions, code documentation and code review
Collaboration & Delivery:
  • Partner with product owners, solution architects, business analysts, data analysts, data scientists, and engineers to understand data and reporting needs and translate them into technical solutions.
  • Support team members with data cleansing tasks
  • Conduct data profiling to identify data anomalies
  • Participate actively in hybrid agile delivery practices, including backlog refinement, task estimation, prioritization, and escalation of colliding priorities.
  • Assist team members with data preparation tasks
  • Support developers, data analysts and data scientists who need to interact with data
  • Provide clear and professional communication to users, management, and teammates
  • Provide ad hoc data extracts and analysis to respond to tactical business needs
Participate in effective execution of team priorities:
  • Identify work tasks and capture them in the team backlog
  • Organize known tasks, following provided prioritization
  • Escalate colliding priorities
  • Provide production support
  • Network with product teams to stay abreast of database changes, as well as business process changes that alter how data is interpreted.
Skill and Experience:
  • 3-5 years of experience required to perform essential job functions.
  • Strong SQL skills
  • Data modeling abilities
  • Relational third normal form (3NF) and dimensional (Kimball/Inmon) data warehouse theory
  • Design, build, maintain data lakes and data warehouses
  • ETL /ELT design and support
  • Programming languages and technologies: SQL, Python, Spark, Kafka, Azure cloud services, JSON
Must Have Skills:
  • SQL
  • PySpark / Spark
  • Fabric Platform
  • Data lakes, data warehouses
  • Ability to read and create data models
  • ETL/ELT Pipeline development experience
  • Ability to keep pace with rapidly evolving cloud services and AI technologies
  • Demonstrated commitment to data governance, quality, and observability
  • Preferred: Orchestration and automation knowledge for managing complex data workflows
  • Communication and collaboration skills
Good to Have Skills:
  • Relational (3NF) and dimensional (Kimball/Inmon) database theory
  • AI/LLM Integration experience
  • Azure Data Factory, Kafka, Synapse Analytics Platform
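For candidates wondering what "ETL/ELT pipeline development" means in practice, the following is a purely illustrative sketch (not part of the role description): extract rows from a raw source, transform them into consistent, typed records, and load them into a clean target table. Plain Python with sqlite3 stands in for an enterprise warehouse here, and all table and column names are hypothetical:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Minimal ETL sketch: extract raw rows, transform (normalize/cast),
    load into a clean target table. Returns the number of rows loaded."""
    cur = conn.cursor()

    # Extract: read raw, possibly messy source rows (hypothetical schema).
    cur.execute("CREATE TABLE IF NOT EXISTS raw_claims (id INTEGER, region TEXT, amount TEXT)")
    cur.executemany(
        "INSERT INTO raw_claims VALUES (?, ?, ?)",
        [(1, "  south ", "100.5"), (2, "NORTH", "250"), (3, None, "75")],
    )
    rows = cur.execute("SELECT id, region, amount FROM raw_claims").fetchall()

    # Transform: trim and upper-case regions, default missing values,
    # and cast the string amounts to numeric types.
    cleaned = [
        (rid, (region or "UNKNOWN").strip().upper(), float(amount))
        for rid, region, amount in rows
    ]

    # Load: write consistent, typed rows into the target table.
    cur.execute("CREATE TABLE IF NOT EXISTS claims (id INTEGER, region TEXT, amount REAL)")
    cur.executemany("INSERT INTO claims VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(run_etl(conn))
```

In the role itself, the same extract-transform-load pattern would be expressed on platforms such as PySpark, Fabric, or Azure Data Factory rather than hand-rolled SQL, but the quality concerns (normalization, typing, handling missing data) are the same.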
