Data Engineer
Job Connect India
- Bangalore, Karnataka
- Rs. 12,00,000 per year
- Permanent
- Full-time
Responsibilities
- Design, implement, and maintain scalable ETL and ELT pipelines for efficient data ingestion, transformation, and loading (a brief sketch follows this list).
- Develop data models and storage solutions that align with business requirements.
- Optimize database performance and data retrieval processes.
- Work with distributed computing frameworks such as Apache Spark, Hadoop, or Kafka to process large datasets efficiently.
- Deploy and manage data solutions on AWS, GCP, or Azure (e.g., Redshift, BigQuery, Snowflake).
- Implement data validation, quality checks, and monitoring to ensure data integrity and compliance with industry standards.
- Work closely with cross-functional teams to understand data needs and provide optimized solutions.
- Continuously improve and optimize existing data infrastructure for performance and cost-effectiveness.
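To make the responsibilities above concrete, here is a minimal sketch of the kind of pipeline this role would own: extract raw files, apply typed transformations and a data-quality gate, and load partitioned output. It uses PySpark, one of the frameworks named above; the bucket paths and the order_id, order_ts, and amount columns are hypothetical placeholders for illustration, not details from this posting.

```python
# A minimal sketch of an ETL job of the kind described above, using PySpark.
# All paths, column names, and thresholds are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV from a landing zone (hypothetical S3 path).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: cast types, derive a partition column, drop rows missing a key.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

# Data-quality gate: fail the run rather than load bad data downstream.
bad_rows = orders.filter(F.col("amount").isNull() | (F.col("amount") < 0)).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed the amount validation check")

# Load: write partitioned Parquet to the curated zone (hypothetical path).
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```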
Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related field.
- Programming Languages: Proficiency in Python, SQL, and Scala/Java.
- Big Data Technologies: Experience with Spark, Hadoop, Hive, or Kafka.
- Cloud Platforms: Hands-on experience with AWS (Glue, Redshift, S3, Lambda), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse Analytics).
- ETL & Data Warehousing: Strong knowledge of ETL processes, data warehousing concepts, and pipeline automation.
- Databases: Expertise in PostgreSQL, MySQL, NoSQL (MongoDB, Cassandra), and Data Lakes.
- DevOps & CI/CD: Familiarity with Docker, Kubernetes, Terraform, and CI/CD pipelines for data deployments.
Skills
- ETL (Extract, Transform, Load) - 4 years - Intermediate
- SQL - 4 years - Intermediate
- Database development - 4 years - Intermediate
- Python - 3 years - Intermediate
- Java (all versions) - 2 years - Intermediate
- Power BI - 3 years - Intermediate
- Tableau - 2 years - Intermediate
- AWS - 2 years - Intermediate
- Snowflake - 2 years - Intermediate