Senior Data Engineer
Linarc
- Chennai, Tamil Nadu
- Permanent
- Full-time
Responsibilities:
- Architect and manage high-performance RDBMS systems (e.g., PostgreSQL, MySQL) with a deep focus on performance tuning, indexing, and partitioning.
- Design and optimize document databases (e.g., MongoDB, DynamoDB) for flexible and scalable data models.
- Implement and manage real-time databases (e.g., Firebase Realtime Database, Firestore) for event-driven or live-sync applications.
- Manage and tune in-memory and embedded databases (e.g., Redis, SQLite) for low-latency data access and offline sync scenarios.
- Integrate and optimize data warehouse solutions (e.g., Redshift, Snowflake, BigQuery) for analytics and reporting.
- Build scalable ETL/ELT pipelines to move and transform data across transactional and analytical systems.
- Implement and maintain Elasticsearch for fast, scalable search and log indexing.
- Collaborate with engineering teams to build and maintain data models optimized for analytics and operational use.
- Write complex SQL queries, stored procedures, and scripts to support reporting, data migration, and ad-hoc analysis.
- Work with BI and data lineage tools like Dataedo, dbt, or similar for documentation and governance.
- Define and enforce data architecture standards, best practices, and design guidelines.
- Tune database configurations for high-throughput and low-latency scenarios under different load profiles.
- Manage data access controls, backup/recovery strategies, and ensure data security on AWS (RDS, DynamoDB, S3, etc.).
Requirements:
- 10+ years of professional experience as a Data Engineer or Database Architect.
- 6+ years of hands-on experience in database design, optimization, and configuration.
- Deep knowledge of RDBMS performance tuning, query optimization, and system profiling.
- Strong experience with NoSQL, real-time, and in-memory databases (MongoDB, Firebase, Redis, SQLite).
- Hands-on experience with cloud-native data services (AWS RDS, Aurora, DynamoDB, Redshift).
- Strong proficiency in structured query design, data modeling, and analytics optimization.
- Experience with data documentation and lineage tools like Dataedo, dbt, or equivalent.
- Proficient with Elasticsearch cluster management, search optimization, and data ingestion.
- Solid foundation in data warehouse integration and performance-tuned ETL pipelines.
- Excellent understanding of data security, encryption, and access control in cloud environments.
- Familiarity with event-driven architecture, Kafka, or streaming systems.
- Experience with CI/CD for data pipelines, infrastructure-as-code (Terraform, CloudFormation).
- Programming or scripting experience (Python, Bash, etc.) for data automation and orchestration.
- Exposure to dashboarding tools (e.g., Power BI, Tableau) and building datasets for visualization.