
Senior Data Engineer - ISV
- Chandigarh
- Permanent
- Full-time
- Collaborate with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data requirements and deliver effective solutions.
- Leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives.
- Expertise in data modeling, with proven experience in data warehouse and lakehouse design.
- Design and implement data models, warehouses, and databases using Microsoft Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services.
- Develop ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar tools to prepare data for analysis and reporting.
- Implement data quality checks and governance practices to ensure accuracy, consistency, and security of data assets.
- Monitor and optimize data pipelines and workflows for performance, scalability, and cost efficiency, using Microsoft Fabric for real-time analytics and AI-powered workloads.
- Strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, or similar analytics platforms.
- Experience with data integration and ETL tools like Azure Data Factory.
- Expertise in Microsoft Fabric or similar data platforms.
- In-depth knowledge of the Azure Cloud Platform, particularly in data warehousing and storage solutions.
- Good communication skills, with the ability to convey technical concepts to both technical and non-technical audiences.
- Ability to work independently and within a team environment.
- Microsoft certifications in data-related fields are preferred.
- DP-700 (Microsoft Certified: Fabric Data Engineer Associate) is a plus.
- 5-7 years of experience in data warehousing with on-premises or cloud technologies.
- Strong analytical abilities with a proven track record of resolving complex data challenges.
- Proficient in database management, SQL query optimization, and data mapping.
- Understanding of Excel, including formulas, filters, macros, pivots, and related operations.
- Extensive experience with Fabric components, including Lakehouse, OneLake, Data Pipelines, Real-Time Analytics, Power BI Integration, and Semantic Models.
- Proficiency in Python and SQL (including advanced SQL) for data transformation and debugging.
- Willingness to work flexible hours based on project requirements.
- Strong documentation skills for maintaining clear and structured records.
- Advanced SQL skills, including experience with complex queries, data modeling, and performance tuning.
- Hands-on experience implementing Medallion Architecture for data processing.
- Prior experience in a manufacturing environment is strongly preferred.
- Ability to quickly learn new business areas, software, and emerging technologies.
- Ability to handle sensitive and confidential information with discretion.
- Experience with Oracle, SAP, or other ERP systems is a plus.
- Willing to travel up to 20% as needed.
- Bachelor's degree or equivalent experience, with a focus on MIS, Computer Science, Engineering, or a related field.
- Good communication skills in English (spoken and written) to collaborate effectively with overseas teams.
- Agile certification is preferred.