
Head of Data Engineering

Cadre AI · San Diego, CA (via LinkedIn)
Posted April 4, 2026
Requirements
  • 8+ years of professional experience in data engineering, data architecture, or a closely related technical role
  • 3+ years of hands-on experience with Snowflake as a primary data platform, including advanced features: Snowpark, Snowpipe, Tasks, Streams, and Dynamic Tables
  • 3+ years in a client-facing consulting, professional services, or agency environment—you know how to earn trust and deliver under pressure
  • Deep expertise in SQL, Python, and modern data transformation tools (dbt strongly preferred)
  • Strong experience with cloud platforms—AWS preferred; Azure and GCP also valued—including infrastructure-as-code tools like Terraform
  • Proven experience designing multi-tenant data architectures with robust access control, data isolation, and cost allocation
  • Experience with data pipeline orchestration tools such as Airflow, Dagster, Prefect, or Databricks Workflows
  • Demonstrated ability to lead technical teams and grow a practice or function from early stage
  • Exceptional communication skills—you can present to a C-suite executive and pair with a junior engineer in the same day
  • Strong understanding of data governance, data quality frameworks, and regulatory compliance including SOC 2, GDPR, and CCPA
Preferred Skills
  • Snowflake SnowPro Advanced certifications (Architect or Data Engineer)
  • Experience with Snowflake Cortex AI, Streamlit in Snowflake, and AI/ML data preparation workflows
  • Familiarity with complementary platforms: Databricks, Microsoft Fabric, Redshift, or BigQuery
  • Experience building data products or analytics-as-a-service offerings for external customers
  • Domain experience in mortgage/financial services, IoT, SaaS, or professional services—industries where the data complexity is real
  • Experience with real-time data pipelines using Kafka or Kinesis and streaming architectures
  • Track record of contributing to the data community through speaking, open-source contributions, or published content
  • Experience managing P&L responsibility or practice-level financial metrics in a consulting environment