Teradata Developer Jobs Everywhere
(Found 23 Jobs)
Ab Initio Developer
Collabera
Hey Y'all!!
If you're looking for an Ab Initio Developer role, please contact me at apoorva.pisharoty@collabera.com
Position Title: Ab Initio Developer
Locations: Charlotte, NC; Hybrid
Durati...
Jul 15, 2025
Charlotte, NC

BI Application Developer III
Spectrum
TITLE: BI Application Developer III
LOCATION: Stamford, CT
JOB SUMMARY: Develop sustainable business intelligence reporting solutions to support business verticals and maximize utility of Charter's data...
Jun 12, 2025
Stamford, CT

ETL Developer
Apex Systems, Inc.
Job#: 2078395
Job Description:
12 Month Contract
Hybrid On Site, Dearborn, MI
$60-70/HR
Position Description: Data Software Engineer position will involve design, implementation, testing and launch of dat...
Jun 17, 2025
Dearborn, MI

Ab Initio Developer
Apex Systems, Inc.
Job#: 2073793
Job Description:
Apex Systems, a World-Class Technology Solutions Provider, is seeking applicants for the below position on behalf of our client. Please apply if interested and qualifi...
May 17, 2025
Minneapolis, MN

Commercial Lending Data Analyst
Apex Systems, Inc.
Job#: 2082581
Job Description:
Business Data Analyst - Commercial Lending
Hybrid On-Site 3 days/week in Charlotte, North Carolina
W2 Contract, 12-24 Months, paid for hours worked
W2 pay up to $60/hour
Su...
Jul 23, 2025
Charlotte, NC
Hey Y'all!!
If you're looking for an Ab Initio Developer role, please contact me at apoorva.pisharoty@collabera.com
Position Title: Ab Initio Developer
Locations: Charlotte, NC; Hybrid
Duration: 12 months - likely to be extended based on project need/performance
Must Haves:
- MUST have 10+ years of data engineering experience
- 8+ years of Ab Initio experience
- 8+ years of Teradata experience
- 4+ years of GCP experience
- 4+ years of experience with BigQuery
- Expertise with SQL/ETL
- 4+ years of Agile and JIRA experience
- Experience with technical stakeholder interactions
- Enterprise level experience
- EXCELLENT written and verbal communication skills
Day to Day:
- Designing, coding, and testing new data pipelines using Ab Initio
- Implementing ETL/ELT Processes
- Writing, optimizing, and debugging complex SQL queries for data manipulation, aggregation, and reporting, particularly for data within Teradata and BigQuery
- Data ingestion - develop and manage processes to ingest large volumes of data into GCP's BigQuery
- Manage and monitor GCP resources specifically used for data processing and storage
- Optimize Cloud Data Workloads
Desired Experience:
- Java experience highly desired
- Python experience highly desired
- Experience with Spark, Hadoop, MapR, Data Lake
- Background in Banking/Financial Technology - Deposits, Payments, Cards domain, etc.