Data Engineer (SME) CK22
McLean, VIRGINIA
Fuel
Posted March 14, 2026
Responsibilities
- Create and maintain a data pipeline architecture
- Assemble large, complex data sets that meet mission requirements
- Identify, design, and implement improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS, SQL, and NiFi technologies
- Build analytical tools to utilize the data pipeline, providing actionable insight into data performance including operational efficiency and customer acquisition
- Work with mission stakeholders and assist them with data-related technical issues
- Work with technology stakeholders to support their data infrastructure needs and assist with data-related technical issues
Requirements
- Ability to build and optimize data sets, 'big data' data pipelines and architectures
- Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
- Excellent analytic skills for working with unstructured datasets
- Ability to build processes that support data transformation, workload management, data structures, dependencies, and metadata
- Minimum of five years of experience with some of the following software and tools:
- Big data tools like Kafka, Spark and Hadoop
- Relational SQL and NoSQL databases, including PostgreSQL and Cassandra
- Workflow management and pipeline tools such as Airflow, Luigi and Azkaban
- AWS cloud services, including Redshift, RDS, EMR, and EC2
- Stream-processing systems such as Spark Streaming and Storm
- Object-oriented/object function scripting languages such as Scala, C++, Java, and Python
Preferred Skills
- Data workflow orchestration tools, including Pentaho or Apache NiFi
Education
- (Not required) – Bachelor’s degree in information systems, informatics, statistics, or computer science/software engineering
About Fuel
Fuel Consulting LLC helps clients create the path to their future. Clients get expertise that needs no ramp-up time to deliver the best practices, customized tools, and methodologies to help them achieve their vision and leave a legacy. If helping others and working with a team of exceptionally talented individuals excites you, then look no further than Fuel.