What this role entails
● Analyse database design and architecture requirements for distributed applications
● Develop, construct, test, and maintain existing and new data-driven architectures
● Align architecture with business requirements and provide the solutions that best fit the business problems
● Acquire and clean up data from multiple sources across the organization
● Design custom tools to collate the data
● Identify ways to improve data reliability, efficiency, and quality
● Develop automated tasks for data operations
● Deliver updates to stakeholders based on analytics
● Design and implement scalable data models and schemas in BigQuery
● Demonstrate a solid understanding of BigQuery architecture, including the organization of projects, datasets, tables, and workflows
● Handle large-scale datasets (1+ TB) with a focus on performance optimization and cost efficiency, using techniques such as partitioning, clustering, and materialized views
● Monitor BigQuery usage and costs, implementing optimization strategies
● Implement data lifecycle management and archival policies
● Optimize slot usage and query scheduling for cost efficiency
● Set up and maintain dataset-level and table-level security controls

What lands you in this role
● Four or more years as a hands-on technologist with a good mix of application development and database development
● Expert-level SQL and PostgreSQL skills, including stored procedures, functions, triggers, and views
● Strong understanding of database design and modelling
● Ability to write efficient database code without compromising data quality, privacy, or security
● Knowledge of database design principles, query optimization, index management, integrity checks, statistics, partitioning, and isolation levels
● Ability to participate in an agile software development life cycle, including providing testing guidelines for database-related changes
● Good understanding of a programming language, preferably Python, for custom scripts
● Understanding of NoSQL and in-memory databases is a value add
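The BigQuery cost-efficiency points above (partitioning and clustering) come down to choices made in the table's DDL. As a minimal illustrative sketch, the Python snippet below assembles such a statement; the `analytics.events` table, its columns, and the helper function are hypothetical examples, not part of the role description:

```python
def build_bq_ddl(table, partition_col, cluster_cols, columns):
    """Compose a BigQuery CREATE TABLE statement that partitions by a
    DATE column and clusters by frequently filtered columns, so queries
    prune partitions instead of scanning the whole table."""
    cols = ",\n  ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE `{table}` (\n  {cols}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical events table: partition on event_date, cluster on the
# columns most often used in WHERE filters.
ddl = build_bq_ddl(
    table="analytics.events",
    partition_col="event_date",
    cluster_cols=["user_id", "event_type"],
    columns=[("event_date", "DATE"), ("user_id", "STRING"),
             ("event_type", "STRING"), ("payload", "JSON")],
)
print(ddl)
```

Because on-demand BigQuery pricing is based on bytes scanned, partition pruning plus clustering on common filter columns is one of the most direct levers for the cost optimization this role calls for.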