
Senior Snowflake & Oracle GoldenGate Data Engineer
- Mumbai, Maharashtra
- Permanent
- Full-time
- Snowflake Data Modeling & Architecture: Design and implement scalable Snowflake data models using best practices such as the Data Vault methodology (a minimal hub-and-satellite sketch follows this list).
- Real-Time Data Replication & Ingestion: Utilize Oracle GoldenGate for Big Data to manage real-time data streaming and optimize Snowpipe for automated data ingestion (see the Snowpipe sketch after this list).
- Cloud Integration & Management: Work with AWS services (S3, EC2, Lambda) to integrate and manage Snowflake-based solutions.
- Data Sharing & Security: Implement Snowflake Secure Data Sharing and enforce security measures such as role-based access control (RBAC), data masking, and encryption (an RBAC and masking-policy sketch follows this list).
- CI/CD Implementation: Develop and manage CI/CD pipelines for Snowflake deployment and data transformation processes.
- Collaboration & Troubleshooting: Partner with cross-functional teams to address data-related challenges and optimize performance.
- Documentation & Best Practices: Maintain detailed documentation for data architecture, ETL processes, and Snowflake configurations.
- Performance Optimization: Continuously monitor and enhance the efficiency of Snowflake queries and data pipelines.
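
As a rough illustration of the Data Vault modeling mentioned above, the sketch below shows a minimal hub and satellite pair in Snowflake SQL. The `dv` schema, table names, and 32-byte hash keys are hypothetical placeholders under assumed conventions, not a prescribed design.

```sql
-- Minimal Data Vault sketch (hypothetical schema and table names).
CREATE SCHEMA IF NOT EXISTS dv;

-- Hub: one row per distinct business key.
CREATE TABLE IF NOT EXISTS dv.hub_customer (
    hub_customer_hk  BINARY(32)    NOT NULL,   -- hash of the business key
    customer_bk      STRING        NOT NULL,   -- source business key
    load_dts         TIMESTAMP_NTZ NOT NULL,
    record_source    STRING        NOT NULL
);

-- Satellite: descriptive attributes tracked over time.
CREATE TABLE IF NOT EXISTS dv.sat_customer_details (
    hub_customer_hk  BINARY(32)    NOT NULL,
    load_dts         TIMESTAMP_NTZ NOT NULL,
    hash_diff        BINARY(32)    NOT NULL,   -- change-detection hash
    customer_name    STRING,
    email            STRING,
    record_source    STRING        NOT NULL
);
```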
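For the Snowpipe responsibility, a minimal auto-ingest setup might look like the following. The S3 URL, storage integration, stage, pipe, and target table names are all assumed for illustration, and the CSV file format is only an example.

```sql
-- Snowpipe sketch: continuous ingestion from S3 (assumed bucket/integration names).
CREATE OR REPLACE STAGE raw_db.staging.orders_stage
  URL = 's3://example-bucket/orders/'            -- hypothetical bucket
  STORAGE_INTEGRATION = s3_int                   -- assumed pre-created integration
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

CREATE OR REPLACE PIPE raw_db.staging.orders_pipe
  AUTO_INGEST = TRUE                             -- fires on S3 event notifications
AS
  COPY INTO raw_db.staging.orders
  FROM @raw_db.staging.orders_stage;
```

The notification channel reported by `DESC PIPE raw_db.staging.orders_pipe` still has to be wired to the bucket's event notifications on the AWS side for auto-ingest to trigger.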
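For the data sharing and security bullet, a compact sketch of role-based access plus a column masking policy is shown below; the role, schema, table, and column names are hypothetical.

```sql
-- RBAC sketch: grant read access through a dedicated role (hypothetical names).
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE  ON DATABASE raw_db                     TO ROLE analyst_role;
GRANT USAGE  ON SCHEMA   raw_db.staging             TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA raw_db.staging TO ROLE analyst_role;

-- Column-level masking: only a privileged role sees the raw value.
CREATE OR REPLACE MASKING POLICY raw_db.staging.email_mask
  AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;

ALTER TABLE raw_db.staging.customers
  MODIFY COLUMN email SET MASKING POLICY raw_db.staging.email_mask;
```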
- At least 4 years of experience as a Data Engineer.
- Strong expertise in Snowflake architecture, data modeling, and query optimization.
- Proficiency in SQL for writing and optimizing complex queries (see the query-tuning sketch after this list).
- Hands-on experience with Oracle GoldenGate for Big Data for real-time data replication.
- Knowledge of Snowpipe for automated data ingestion.
- Familiarity with AWS cloud services (S3, EC2, Lambda, IAM) and their integration with Snowflake.
- Experience with CI/CD tools (e.g., Jenkins, GitLab) for automating workflows.
- Working knowledge of the Data Vault methodology as applied on Snowflake.
- Exposure to Databricks for data processing and analytics.
- Knowledge of Python or Scala for data engineering tasks.
- Familiarity with Terraform or CloudFormation for infrastructure as code (IaC).
- Experience in data governance and compliance best practices.
- Understanding of ML and AI integration with data pipelines.
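
As a small example of the query-tuning expectation above, the statements below show one common approach in Snowflake: clustering a large table on a frequent filter column and checking how well micro-partitions prune. The table and column names are assumptions, and clustering is only worthwhile for sufficiently large, frequently filtered tables.

```sql
-- Query-tuning sketch (hypothetical table/column): cluster on the common filter column.
ALTER TABLE raw_db.staging.orders CLUSTER BY (order_date);

-- Check clustering depth / pruning health for that key.
SELECT SYSTEM$CLUSTERING_INFORMATION('raw_db.staging.orders', '(order_date)');

-- Typical pruned query: restrict on the clustering key and avoid SELECT *.
SELECT order_id, customer_id, total_amount
FROM   raw_db.staging.orders
WHERE  order_date >= DATEADD(day, -7, CURRENT_DATE());
```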
- Can I confidently set up and manage real-time data replication using Oracle GoldenGate?
- Have I worked with Snowpipe to automate data ingestion processes?
- Am I proficient in SQL and capable of writing optimized queries in Snowflake?
- Do I have experience integrating Snowflake with AWS cloud services?
- Have I implemented CI/CD pipelines for Snowflake development?
- Can I troubleshoot performance issues in Snowflake and optimize queries effectively?
- Have I documented data engineering processes and best practices for team collaboration?