
Senior Data Architect
- Powai, Mumbai, Maharashtra
- Permanent
- Full-time
- Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources.
- Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
- Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis.
- Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis.
- Identifying and resolving data quality issues to ensure data accuracy, consistency, and completeness. Implementing and maintaining data governance and security measures to protect sensitive data.
- Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes.
- Ensuring the use of state-of-the-art methodologies to carry out the job in the most productive and effective way. This may include research activities to identify and introduce new technologies in the field of data acquisition and data analysis.
- Have a Bachelor's degree in Computer Science or a STEM major (Science, Technology, Engineering, and Math) with a minimum of 4 years of experience.
- Have proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.
- Have proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
- Have experience building complex jobs for SCD-type mappings using ETL tools such as Talend or Informatica.
- Have experience with data visualization and reporting tools (e.g., Tableau, Power BI).
- Have strong problem-solving and analytical skills, with the ability to handle complex data challenges, along with excellent communication and collaboration skills to work effectively in a team environment.
- Have experience in data modeling, data warehousing, and ETL principles.
- Have familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
- Have advanced knowledge of distributed computing and parallel processing, and experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink) (good to have).
- Have knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Have certification in relevant technologies or data engineering disciplines.
- Working flexible hours - flexing the times when you work during the day to help you fit everything in and work when you are most productive
- Contemporary work-life balance policies and wellbeing activities
- Comprehensive private medical care options
- Safety net of life insurance and disability programs
- Tailored financial programs
- Additional elected or voluntary benefits