
Data Engineer - Project Delivery Analyst

Deloitte, Bellevue, WA (via LinkedIn)
Posted March 27, 2026
Requirements
  • 1+ year of experience building/enhancing data pipelines and curated datasets for analytics/downstream consumers.
  • 1+ year of hands-on experience with SQL and Python, including Snowflake and/or PySpark for transformations and scalable processing.
  • 1+ year of experience with cloud data engineering on AWS (preferred) or Azure/GCP, including orchestration/scheduling (e.g., Airflow/MWAA, Step Functions, Glue, ADF/Fabric Data Factory).
  • Understanding of ELT patterns and Lakehouse/warehouse concepts; familiarity with S3 file formats/partitioning (e.g., Parquet/Delta).
  • Working knowledge of DevOps practices (Git-based workflows, CI/CD) and exposure to Infrastructure-as-Code (Terraform/CloudFormation).
  • Understanding of data quality, basic observability, and metadata/governance fundamentals.
  • Limited immigration sponsorship may be available.
  • Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve.
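The requirements above center on ELT-style work: land raw data, then transform it with SQL and Python into curated datasets for analytics consumers. A minimal sketch of that pattern is below, using the stdlib sqlite3 module as a stand-in for a warehouse such as Snowflake; the table and column names are illustrative, not from the posting.

```python
# ELT sketch: Extract + Load raw events into a staging table, then
# Transform inside the "warehouse" (sqlite3 here, as a stand-in) into a
# curated, analytics-ready aggregate. All names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract + Load: raw events land as-is in a staging table.
cur.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL, event_date TEXT)")
cur.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [
        ("u1", 10.0, "2026-03-01"),
        ("u1", 5.5, "2026-03-02"),
        ("u2", 7.25, "2026-03-01"),
    ],
)

# Transform: the "T" of ELT happens in SQL, producing a curated table
# that downstream consumers query directly.
cur.execute("""
    CREATE TABLE curated_daily_spend AS
    SELECT user_id, event_date, SUM(amount) AS total_spend
    FROM raw_events
    GROUP BY user_id, event_date
""")

rows = cur.execute(
    "SELECT user_id, event_date, total_spend "
    "FROM curated_daily_spend ORDER BY user_id, event_date"
).fetchall()
print(rows)
```

In a production pipeline the same shape would typically be expressed as PySpark transformations writing partitioned Parquet to S3 (e.g. partitioned by `event_date`), with an orchestrator such as Airflow scheduling the load and transform steps.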
Preferred Skills
  • Agile delivery experience.
  • Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
  • Can operate independently or with minimum supervision.
  • Excellent written and verbal communication skills.
  • Ability to deliver technical demonstrations.
Education
  • (Not required) – Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience.