Mid-Senior level
Posted March 13, 2026
Responsibilities
- Designing, coding, and testing new data pipelines using Ab Initio
- Implementing ETL/ELT processes
- Writing, optimizing, and debugging complex SQL queries for data manipulation, aggregation, and reporting, particularly within Teradata and BigQuery
- Developing and managing processes to ingest large volumes of data into GCP's BigQuery
- Managing and monitoring GCP resources used for data processing and storage
- Optimizing cloud data workloads
Commitments
Location: Charlotte, NC (4 days onsite, 1 day remote weekly)
Duration: 12 months – likely to be extended based on project need/performance
W2 Contract
Requirements
- Must have 10+ years of overall IT experience
- 8+ years of Ab Initio experience (5 years minimum)
- 4+ years of GCP experience (2 years minimum)
- 4+ years of BigQuery experience (2 years minimum)
- Expertise with SQL/ETL
- 4+ years of Agile and JIRA experience
- Experience interacting with technical stakeholders
- Enterprise-level experience
- Excellent written and verbal communication skills
Preferred Skills
- Java experience highly desired
- Python experience highly desired
- Background in Banking/Financial Technology (Deposits, Payments, Cards domains, etc.)
- Skills: GCP, communication, cloud, cards, Agile, W2, ETL, SQL, Ab Initio, data, aggregation
We are currently looking for local candidates.