
Lead I - Data Engineering
- Pune, Maharashtra
- Permanent
- Full-time
- Document and communicate milestones/stages for end-to-end delivery.
- Write code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.
- Adherence to schedules/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction in recurrence of known defects
- Quick turnaround on production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times).
- Average time to detect, respond to, and resolve pipeline failures or data issues.
- Number of data security incidents or compliance breaches.
- Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development.
- Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
- Configuration: Define and govern the configuration management plan, and ensure compliance within the team.
- Testing: Review the test plan, test strategy, and execution plans developed by the testing team.
- Provide clarifications and support to the testing team as needed.
- Complete relevant domain certifications to enhance expertise.
- Identify defect trends and take proactive measures to improve quality.
- Knowledge Management: Contribute to project-related documents, SharePoint, libraries, and client universities.
- Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team.
- Present design options to customers and conduct product demonstrations.
- Understand team members' aspirations and provide guidance and opportunities for growth.
- Ensure team engagement in projects and initiatives.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.
- Proficiency in SQL for analytics including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.
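As a small illustration of the SQL analytics proficiency listed above (windowing functions in particular), the sketch below ranks pipeline runs by duration within each pipeline. The table and column names are hypothetical, chosen only for the example; SQLite is used so the snippet is self-contained.

```python
import sqlite3

# Hypothetical data: run times (in seconds) of two data pipelines.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (pipeline TEXT, run_date TEXT, seconds REAL)")
conn.executemany(
    "INSERT INTO runs VALUES (?, ?, ?)",
    [
        ("ingest", "2024-01-01", 120.0),
        ("ingest", "2024-01-02", 95.0),
        ("transform", "2024-01-01", 300.0),
        ("transform", "2024-01-02", 280.0),
    ],
)

# RANK() OVER (PARTITION BY ...) ranks each run within its own pipeline,
# fastest first -- a typical analytics windowing pattern.
rows = conn.execute(
    """
    SELECT pipeline, run_date, seconds,
           RANK() OVER (PARTITION BY pipeline ORDER BY seconds) AS speed_rank
    FROM runs
    ORDER BY pipeline, speed_rank
    """
).fetchall()

for row in rows:
    print(row)
# → ('ingest', '2024-01-02', 95.0, 1)
#   ('ingest', '2024-01-01', 120.0, 2)
#   ('transform', '2024-01-02', 280.0, 1)
#   ('transform', '2024-01-01', 300.0, 2)
```

The same PARTITION BY/ORDER BY pattern underlies running totals, moving averages, and deduplication queries common in warehouse analytics.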