
Staff Engineer - DataOps engineering, Python, API
- Chennai, Tamil Nadu
- Permanent
- Full-time
Responsibilities:
- Data Architecture: Function as steward and owner of ID&A’s data lake, including developing data pipelines, new database structures, and APIs as applicable
- Data Design: Translate logical data architectures into physical data designs, ensuring alignment with data modeling best practices and standards
- Data Process and Monitoring: Ensure proper data ingestion, validation, testing, and monitoring for ID&A data lake
- Data Migration: Design and support migration to a data model that is independent of the particular technologies or tools used for storing, processing, or accessing the data
- Data Integrity: Ensure the accuracy, completeness, and quality of data are maintained over time, regardless of changes to upstream or downstream systems
- Agile Methodologies: Lead an agile feature team and manage data assets per enterprise standards, guidelines, and policies
- Partner closely with business intelligence team to capture and define data requirements for new and enhanced data visualizations
- Work with product teams to prioritize new capabilities and data requirements for ongoing sprints and manage backlog
- Enterprise Thinking: Partner with data architects and engineers across Global Infrastructure and Asset Management teams to evolve the broader tech operations data architecture landscape
- Enforce SDLC standards, enterprise reuse, and architecture guidelines and participate in any required Architecture Design Reviews
- Act as point of contact for data-related inquiries and data access requests
- Contribute to decisions about tools, methods and approaches
- Innovation: Leverage the evolving technical landscape as needed, including AI, Big Data, Machine Learning and other technologies to deliver meaningful business insights
Qualifications:
- 10+ years of DataOps engineering experience in implementing pipeline orchestration, data quality monitoring, governance, security processes, and self-service data access
- Deep technical understanding of event-driven architectures, API-first design, cloud-native technologies, and front-end integration patterns
- Extensive experience managing databases, ETL/ELT pipelines, data lake architectures, and real-time processing
- Hands-on coding experience in Python
- Hands-on expertise with design and development across one or more database management systems (e.g. SQL Server, PostgreSQL, Oracle)
- Experience in designing and implementing data solutions with high resiliency, availability, and reliability
- Demonstrated experience leading cross-functional teams of Data Engineers, Data Architects, and Product-type roles
- Proven ability to translate business data requirements into complete, accurate, extensible, and flexible logical data models using data modeling tools
- Testing and Troubleshooting: Ability to test, troubleshoot, and debug data processes as needed
- Strong analytical skills, with a proven ability to understand and document business data requirements and experience with data visualization tools (e.g. Apptio BI, Power BI)
- Strong written and verbal communications, presentation skills, leadership, problem-solving and organizational skills
- Fluent in data risk, management, and compliance terminology and best practices
- Proven track record for managing large, complex ecosystems with multiple stakeholders
- Self-starter who is able to problem-solve effectively, organize and document processes, and prioritize features with limited guidance
- An enterprise mindset that connects the dots across various requirements and the broader operations/infrastructure data architecture landscape
- Excellent influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-team communication
- Understanding of complex software delivery including build, test, deployment, and operations; conversant in AI, Data Science, and Business Intelligence concepts and technology stack
- Experience working in technology business management, technology infrastructure or data visualization teams
- Experience in infrastructure products a plus; this may include network, compute, storage, database, midrange, mainframe, observability, and both public & private cloud
- Foundational Public Cloud (AWS, Google, Microsoft) certification; advanced Public Cloud certifications a plus
- Experience with design and coding across multiple platforms and languages a plus
- Bachelor’s Degree in computer science, computer engineering, data engineering, or a related field required; advanced degree preferred
Benefits:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities