
Staff, Data Architect
- Hyderabad, Telangana
- Permanent
- Full-time
- Design and implement scalable, secure, and high-performance data architectures to support diverse business needs.
- Develop and maintain data models, data dictionaries, and metadata repositories.
- Define and enforce data standards, policies, and procedures.
- Collaborate with stakeholders to maintain data strategies and roadmaps.
- Establish and maintain data governance frameworks to ensure data quality and compliance, and implement reusable data quality frameworks.
- Evaluate and recommend new data technologies and tools.
- Change how we think about, act on, and use our data by performing exploratory and quantitative analytics, data mining, and discovery.
- Ensure data security and privacy through appropriate access controls and encryption.
- Design ETL/ELT processes for data integration from various sources (a minimal orchestration sketch follows this list).
- Build software across our data platform, including event-driven data processing, storage, and serving through scalable, highly available APIs, using cutting-edge technologies.
- Optimize data storage and retrieval for efficient data access.
- Work closely with data analysts and business stakeholders to make data easily accessible and understandable.
- Design and implement cloud-based data solutions using platforms such as AWS, Azure, Snowflake, and Databricks.
- Optimize cloud data storage and processing for cost-effectiveness and performance.
- Stay up to date with the latest cloud data technologies and trends.
- Develop and enforce data engineering, security, and data quality standards through automation.
- Monitor and optimize data system performance.
- Troubleshoot and resolve data-related issues.
- Conduct performance tuning and capacity planning.
- Help us stay ahead of the curve by working closely with data engineers, stream processing specialists, API developers, our DevOps team, and analysts to design systems that can scale elastically.
- Work closely with business analysts and business users to understand data requirements.
- Communicate complex technical concepts to non-technical stakeholders.
- Provide technical leadership and mentorship to junior team members.
- Think of new ways to make our data platform more scalable, resilient, and reliable, then work across our team to put your ideas into action.
- Help build and maintain foundational data products such as Finance, Titles, Content Sales, Theatrical, and Consumer Products.
- Work closely with various other data engineering teams to roll out new capabilities.
- Build processes and tools to maintain machine learning pipelines in production.
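
As a rough illustration of the ETL/ELT and orchestration responsibilities above, here is a minimal sketch of a daily ELT pipeline using Apache Airflow's TaskFlow API (Airflow 2.4+ assumed). The DAG name, sample data, and target table are hypothetical placeholders, not a prescribed implementation.

```python
# Minimal ELT sketch using Airflow's TaskFlow API.
# All names (DAG, data, target table) are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
def daily_sales_elt():
    @task
    def extract():
        # In practice this would pull from an API, file drop, or CDC stream.
        return [{"order_id": 1, "amount": 125.50}, {"order_id": 2, "amount": 87.25}]

    @task
    def transform(rows):
        # Light cleanup/conformance before loading into the warehouse.
        return [{**r, "amount_usd": round(r["amount"], 2)} for r in rows]

    @task
    def load(rows):
        # A production task would hand off to a Snowflake or Databricks
        # provider hook instead of printing.
        print(f"Loading {len(rows)} rows into analytics.daily_sales")

    load(transform(extract()))


daily_sales_elt()
```
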
- Bachelor's degree in computer science, information systems, or information technology.
- 10+ years of experience building and scaling data platforms, data architecture, data modeling, and data warehousing.
- 3+ years of experience working as an architect designing enterprise data solutions.
- Strong experience with cloud data technologies, including Snowflake, AWS Data Services, Python scripting, Apache Airflow, and Databricks.
- Passion for working with data and deriving insights to answer business questions that drive actions and decision-making.
- Strong experience with SQL and data modeling tools and concepts.
- Experience with data virtualization and data mesh architectures.
- Experience with one or more automation and scheduling tools (e.g., Redwood, Airflow).
- Experience leveraging creative analytics techniques and outputs to tell a story that drives business decisions.
- Solid business acumen and critical problem-solving ability, with a capacity for strategic thinking.
- Comfort with ambiguity and the ability to manage multiple projects at the same time.
- Excellent communication, presentation, and customer relationship skills.
- Public speaking and presentation skills.
- Deep understanding of API connectivity and data streaming architecture (see the sketch below).
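
As a rough illustration of the event-driven processing and data streaming patterns referenced throughout this posting, below is a minimal consumer sketch assuming the kafka-python client; the topic name, broker address, and consumer group are illustrative assumptions, and a real pipeline would validate, enrich, and land each event rather than print it.

```python
# Minimal event-driven consumer sketch (kafka-python assumed).
# Topic, brokers, and group id are hypothetical placeholders.
import json

from kafka import KafkaConsumer


def run_consumer():
    consumer = KafkaConsumer(
        "orders",                              # hypothetical topic
        bootstrap_servers=["localhost:9092"],  # placeholder broker
        group_id="order-enrichment",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        event = message.value
        # Downstream: validate, enrich, and land the event in the warehouse,
        # or push it to an API/serving layer.
        print(f"partition={message.partition} offset={message.offset} event={event}")


if __name__ == "__main__":
    run_consumer()
```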