
Sr. Software Engineer - AWS + Python + PySpark Job
- Pune, Maharashtra
- Permanent
- Full-time
- Primary skill set: AWS services including Glue, PySpark, SQL, Databricks, Python
- Secondary skill set: any ETL tool, GitHub, DevOps (CI/CD)
- Experience: 3-4 years
- Degree in computer science, engineering, or a similar field
- Mandatory skill set: Python, PySpark, SQL, and AWS, with experience designing, developing, testing, and supporting data pipelines and applications.
- 3+ years working experience in data integration and pipeline development.
- 3+ years of experience with AWS Cloud data integration using a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda within the S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems; Databricks and Redshift experience is a major plus.
- 3+ years of experience using SQL in the development of data warehouse projects/applications (Oracle & SQL Server)
- Strong hands-on experience in Python development, especially with PySpark in an AWS Cloud environment
- Strong experience with SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch
- Experience with workflow management tools such as Airflow
- AWS cloud services: RDS, AWS Lambda, AWS Glue, AWS Athena, EMR (equivalent tools in the GCP stack will also suffice)
- Good to have: Snowflake, Palantir Foundry
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture