Entry level
Posted April 1, 2026
Responsibilities
- Work with team members to operationalize data pipelines and supporting cloud infrastructure
- Collaborate with external data producers and consumers to obtain and provide data through interfaces such as REST APIs and S3
- Provide day-to-day support of deploying Python-native data pipelines and performing data engineering tasks to enable data brokering and exchange capabilities
- Provide Tier 2/3 troubleshooting and incident resolution support for data pipelines in Production
Not Met Priorities
What still needs stronger evidence
Requirements
- Provide day-to-day support of deploying Python-native data pipelines and performing data engineering tasks to enable data brokering and exchange capabilities
- Provide Tier 2/3 troubleshooting and incident resolution support for data pipelines in Production
- Active TS/SCI with CI poly required
- 4-7 years of proven experience in data engineering, with expertise in designing, developing, and maintaining data ingestion, transformation, and loading pipelines and components
- Demonstrated experience in designing and deploying data pipelines leveraging AWS cloud infrastructure across multiple classification domains (e.g., IL5 to IL6+)
- Experience with Infrastructure-as-Code (IaC) tools, including Terraform, CloudFormation, or Ansible, to automate deployment of data pipeline cloud infrastructure
- Understanding of RMF security principles and hands-on experience implementing security controls for data pipelines in cloud environments
- Strong scripting and programming skills in languages such as Go, Python, and Bash
- Experience with data pipeline tools and technologies such as NiFi, Hadoop, HDFS, and Kafka
- Strong communication skills, with the ability to clearly convey complex technical concepts
Preferred Skills
- Experience implementing data pipelines in the Cloudera Data Platform environment is highly preferred
Education
- (Required) TS/SCI with CI Polygraph
***Active TS/SCI with CI Polygraph Required***

Red Arch Solutions is a proven and effective small business integrator and consultant, recognized as a leading provider of IT development to the Federal Government, and primarily focused within the Intelligence Community.

As a Data Engineer, you’ll work with team members to build, deploy, and operate data pipelines and data engineering components to enable data brokering and exchange capabilities across a diverse range of producers and consumers. In this role, you will be responsible for building robust, scalable data pipelines while working collaboratively with cross-functional teams to ensure seamless integration, optimized performance, and efficient delivery of solutions. If you think you can see yourself delivering our mission and pursuing our goals with us, then check out the job description below!

What you’ll do:
Work with team members to operationalize data pipelines and supporting cloud infrastructure
Collaborate with external data producers and consumers to obtain and provide data through interfaces such as REST APIs and S3
Provide day-to-day support of deploying Python-native data pipelines and performing data engineering tasks to enable data brokering and exchange capabilities
Provide Tier 2/3 troubleshooting and incident resolution support for data pipelines in Production

What you’ll need to succeed:
Active TS/SCI with CI poly required
4-7 years of proven experience in data engineering, with expertise in designing, developing, and maintaining data ingestion, transformation, and loading pipelines and components
Demonstrated experience in designing and deploying data pipelines leveraging AWS cloud infrastructure across multiple classification domains (e.g., IL5 to IL6+)
Experience with Infrastructure-as-Code (IaC) tools, including Terraform, CloudFormation, or Ansible, to automate deployment of data pipeline cloud infrastructure
Understanding of RMF security principles and hands-on experience implementing security controls for data pipelines in cloud environments
Strong scripting and programming skills in languages such as Go, Python, and Bash
Experience with data pipeline tools and technologies such as NiFi, Hadoop, HDFS, and Kafka
Experience implementing data pipelines in the Cloudera Data Platform environment is highly preferred
Strong communication skills, with the ability to clearly convey complex technical concepts
TS/SCI with Poly required