
Sr. Engineering Manager/Coach, Data Team

Speridian Technologies · Sacramento, CA (via LinkedIn)
Mid-Senior level · Posted April 4, 2026
Priorities Not Met: what still needs stronger evidence
Requirements
  • Proven track record managing data engineering teams of 20+ members
  • Experience owning P&L or budget responsibility for data platforms or products
  • Demonstrated ability to connect data infrastructure to business outcomes and ROI
  • Experience building and operating production data platforms at scale
  • Strong background in modern data engineering practices and cloud data technologies
  • Demonstrated ability to make architectural decisions for data systems and pipelines
  • Experience with full data lifecycle from ingestion through consumption
  • Track record of developing data engineers and building strong data engineering cultures
  • Data Platforms: Snowflake, Databricks, BigQuery, Redshift, or similar
  • Data Processing: Apache Spark, Airflow, dbt, Kafka, streaming architectures
  • Cloud & Infrastructure: AWS/Azure/GCP data services and infrastructure as code
  • Data Modeling: Dimensional modeling, data vault, data mesh principles
  • Languages: SQL, Python, Scala, and data-specific programming paradigms
Business & Financial
  • Financial Management: Cloud data cost optimization, budget ownership, and ROI analysis
  • Business Metrics: Defining and tracking data platform KPIs and usage metrics
  • Value Communication: Articulating data investments in business terms
  • Resource Planning: Capacity planning for data workloads and storage
  • Vendor Management: Evaluating and managing data tools and platform services
Leadership
Education
  • (Not required) – Bachelor's degree in Computer Science, Engineering, or equivalent experience
Technical
  • (Not required) – Data Platforms: Snowflake, Databricks, BigQuery, Redshift, or similar
  • (Not required) – Data Processing: Apache Spark, Airflow, dbt, Kafka, streaming architectures
  • (Not required) – Cloud & Infrastructure: AWS/Azure/GCP data services and infrastructure as code
  • (Not required) – Data Modeling: Dimensional modeling, data vault, data mesh principles