Data Engineer

Michael Page

  • Gurgaon, Haryana
  • Permanent
  • Full-time
  • 15 days ago
Qualifications

  • Bachelor's degree in Computer Engineering, Computer Science, or a related discipline; Master's degree preferred.
  • 3+ years of ETL design, development, and performance tuning using tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
  • 3+ years of experience setting up and operating data pipelines using Python or SQL.
  • 3+ years of advanced SQL programming: PL/SQL, T-SQL.
  • 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
  • 3+ years of extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses, and big data.
  • 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
  • 3+ years of experience defining and enabling data quality standards for auditing and monitoring.
  • Strong analytical abilities and intellectual curiosity.
  • In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
  • Understanding of REST and good API design.
  • Strong collaboration and teamwork skills; excellent written and verbal communication skills.
  • Self-starter, motivated, and able to work in a fast-paced development environment.
  • Agile experience highly desirable.
  • Proficiency with the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Preferred Skills:
  • Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
  • Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
  • Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
  • Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools.
  • Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
  • Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file storage (Blob Storage), and Python/Unix shell scripting.
  • ADF, Databricks, and Azure certifications are a plus.
Our client is an international professional services brand of firms, operating as partnerships under the brand. It is the second-largest professional services network in the world.

What's on offer:

  • Competitive compensation commensurate with role and skill set
  • Medical insurance coverage worth ₹10 lakh
  • Social benefits including PF & Gratuity
  • A fast-paced, growth-oriented environment with the associated challenges and rewards
  • Opportunity to grow and develop your own skills and create your future
