
Slalom Flex (Project Based) - Federal GCP Data Engineer

LinkedIn · Slalom · Dallas, TX
Mid-Senior level · Posted April 2, 2026
Requirements
  • U.S. citizenship
  • Ability to obtain and maintain a federal Public Trust clearance
  • 3+ years of experience in cloud-based data engineering
  • Strong hands-on expertise with Google BigQuery
  • Proficiency in Python for pipeline development, automation, and cloud integration
  • Experience building data pipelines in GCP, including BigQuery, Dataform, Airflow/Cloud Composer, Cloud Functions, or similar
  • Strong SQL skills, including data modeling and data quality testing
  • Experience with Git-based version control and CI/CD concepts
  • Familiarity with data governance, metadata management, and compliance considerations
  • Strong communication and stakeholder engagement skills
Preferred Skills
  • Experience supporting federal or regulated environments
  • Familiarity with Looker and downstream BI enablement
  • Understanding of ML workloads or data structures optimized for modeling
  • Experience with Agile/Scrum or SAFe
  • Knowledge of data quality frameworks and testing strategies
  • Exposure to GCP data governance tools such as Dataplex
  • Experience with serverless architectures (Cloud Functions, Cloud Run)
  • Familiarity with JavaScript for Dataform SQLX extensibility
  • Hands-on experience with orchestration tools (Airflow, Prefect, Dagster, Luigi)
  • Experience with dbt, Dataform, Databricks, or other analytics engineering tooling
  • Relevant GCP certifications