Data Engineer, Staff
Costco
- Hyderabad, Telangana
- Permanent
- Full-time
- We are excited to announce an opening for Data Engineer, Staff at Costco.
- Python, GCP, Airflow, Cloud Composer, Data Orchestration, Data Streaming, Pub/Sub, Kafka, Dataflow, Dataproc, Data Architecture, Data Warehouse (DWH), Data Lake
- 8+ years of experience
- Lead the design and implementation of enterprise data platforms: Architect and oversee the deployment of scalable, reliable, and secure data infrastructure for large organizations.
- Drive innovation and adoption of new technologies: Research and integrate cutting-edge tools and frameworks for data ingestion, processing, and governance.
- Mentor and guide junior and mid-level data engineers: Provide technical leadership, code reviews, and career development support.
- Collaborate with stakeholders across teams: Align data engineering initiatives with business objectives and ensure cross-functional alignment.
- Establish and enforce data engineering best practices: Define standards for pipeline architecture, data quality, security, and compliance across the organization.
- Present findings and recommendations to senior leadership: Communicate technical concepts and business impacts to executives and decision-makers.
- 5+ years of experience
- Expert-level proficiency in programming, automation, and orchestration: Mastery of Python and workflow orchestration tools.
- Deep understanding of data storage, processing, and governance: Advanced knowledge of data warehousing, Lakehouse architectures, and real-time streaming.
- Proven ability to build and deploy scalable data systems: Design and implement robust, production-grade data platforms on GCP.
- Experience with big data technologies: Use Dataflow, Dataproc, Pub/Sub, or similar for large-scale data processing.
- Strong security and compliance expertise: Implement and enforce security controls, encryption, audit logging, and compliance standards for data systems.
- Excellent communication and presentation skills: Articulate technical concepts and business value to diverse audiences.
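To make the orchestration requirement above concrete, here is a minimal, standard-library sketch of the core idea behind workflow orchestration tools like Airflow and Cloud Composer: tasks declared as a DAG and executed in dependency order. The pipeline and task names are hypothetical, and real orchestrators add scheduling, retries, and distributed execution on top of this.

```python
# Minimal DAG-runner sketch (not Airflow itself): execute tasks in
# dependency order, the core idea behind orchestration tools such as
# Airflow and Cloud Composer.
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract -> {transform, quality_check} -> load.
dag = {
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

run_log = []

def run(task: str) -> None:
    """Stand-in for real task logic (e.g. a Dataflow or BigQuery job)."""
    run_log.append(task)

# static_order() yields every task after all of its dependencies.
for task in TopologicalSorter(dag).static_order():
    run(task)

print(run_log)  # a valid order: 'extract' first, 'load' last
```

In a real Airflow DAG the same dependencies would be expressed between operators; the topological guarantee (upstream tasks complete before downstream ones start) is the same.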
- Python, orchestration tools (e.g. Airflow, Cloud Composer)
- Data architecture: data lakes, warehouses, streaming (e.g. Pub/Sub, Dataflow, Dataproc)
- Experience with GCP and production-grade data platform deployment
- Data security, compliance, and governance standards
- Data modeling: experience with different data modeling techniques (dimensional modeling, relational modeling); able to design scalable data models
- Expert-level SQL skills
- Deep understanding of BigQuery, including partitioning, clustering, and performance optimization
- Experience with Cloud Functions, Cloud Composer, and Cloud Run; able to write Dataflow Flex Templates
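As an illustration of the dimensional-modeling and SQL skills listed above, the sketch below builds a tiny star schema (one fact table joined to dimension tables) and runs a typical analytic aggregation. It uses the standard-library sqlite3 module purely for portability; the table and column names are hypothetical, and in practice this design would live in BigQuery with partitioning and clustering on the fact table.

```python
# Star-schema sketch: one fact table keyed to two dimension tables,
# using sqlite3 (stdlib) as a stand-in for a warehouse like BigQuery.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_store   (store_id   INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    product_id   INTEGER REFERENCES dim_product(product_id),
    store_id     INTEGER REFERENCES dim_store(store_id),
    amount_cents INTEGER
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO dim_store VALUES (?, ?)",
                [(10, "east"), (20, "west")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 10, 1000), (2, 2, 10, 2000), (3, 1, 20, 1000)])

# Typical analytic query: aggregate the fact table by a dimension attribute.
rows = cur.execute("""
    SELECT s.region, SUM(f.amount_cents)
    FROM fact_sales AS f
    JOIN dim_store AS s USING (store_id)
    GROUP BY s.region
    ORDER BY s.region
""").fetchall()
print(rows)  # [('east', 3000), ('west', 1000)]
```

The design choice being illustrated: measures live in a narrow fact table, descriptive attributes live in dimension tables, and analytic queries join and aggregate across them, which is what partitioning and clustering on the fact table then accelerate at warehouse scale.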