
Ab Initio Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Ab Initio Developer in Dallas, TX, on a contract basis with an unspecified pay rate. It requires 5+ years in ETL development and expertise in Ab Initio, Teradata, Redshift, SQL, and Unix scripting for data processing and integration.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 4, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Atlanta, GA
🧠 - Skills detailed
#Lambda (AWS Lambda) #Scala #Batch #SQL (Structured Query Language) #DevOps #Kafka (Apache Kafka) #Python #Data Storage #Storage #Data Engineering #Business Analysis #Data Management #Data Migration #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #Documentation #ETL (Extract, Transform, Load) #Scripting #Data Integration #Metadata #Azure #Bash #Data Lineage #Oracle #SQL Server #Talend #Unix #SQL Queries #Databases #Snowflake #PostgreSQL #Monitoring #Data Processing #Redshift #GCP (Google Cloud Platform) #Data Governance #Hadoop #Ab Initio #Migration #Shell Scripting #Automation #Cloud #AWS (Amazon Web Services) #Big Data #Teradata #Informatica #Data Architecture
Role description

Job Description:

Job Title: Ab Initio Developer

Location: Dallas, TX

Job Type: Contract

Job Summary:

We are seeking an experienced Ab Initio Developer to design, develop, and optimize ETL workflows and data processing solutions. The ideal candidate will have strong expertise in Ab Initio, Teradata, Redshift, SQL, cloud platforms, and Unix scripting to handle large-scale data processing and enterprise data integration.

Key Responsibilities:

ETL Development & Data Processing:

   • Design, develop, and implement ETL workflows using Ab Initio for large-scale data processing.

   • Optimize and troubleshoot Ab Initio graphs, plans, and metadata management.

   • Ensure efficient data transformation, cleansing, and validation.
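The transform-cleanse-validate loop above is a standard ETL pattern regardless of tool. As an illustration only (a real implementation would live in Ab Initio graphs, and the column names here are hypothetical), a minimal Python sketch of that pattern:

```python
import csv
import io

def transform(row):
    """Cleanse and validate one record: trim whitespace, require an id, normalize amount."""
    row = {k: v.strip() for k, v in row.items()}
    if not row.get("id"):
        return None  # validation failure: drop records with no id
    row["amount"] = f"{float(row['amount']):.2f}"  # normalize to two decimals
    return row

def etl(source, target):
    """Extract CSV rows from `source`, transform them, and load valid ones into `target`."""
    reader = csv.DictReader(source)
    writer = csv.DictWriter(target, fieldnames=["id", "amount"])
    writer.writeheader()
    loaded = 0
    for row in reader:
        clean = transform(row)
        if clean is not None:
            writer.writerow(clean)
            loaded += 1
    return loaded

# Illustrative in-memory run: the middle record fails validation (empty id).
raw = io.StringIO("id,amount\n 1 , 10.5 \n, 3.0\n2,7\n")
out = io.StringIO()
print(etl(raw, out))  # 2 records loaded
```

In production the same shape applies at scale: extraction and loading target Teradata/Redshift rather than CSV, and the transform step carries the business rules.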

Database & Data Warehousing:

   • Work with Teradata, Redshift, and other relational databases to develop efficient SQL queries.

   • Design data models and warehouse solutions to support business needs.

   • Perform query tuning and performance optimization.
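Query tuning typically starts with reading the planner's output before and after adding an index on the filter column. A minimal sketch of that workflow, using Python's built-in sqlite3 as a stand-in for Teradata or Redshift (the table and index names are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
con.executemany("INSERT INTO sales (region, amount) VALUES (?, ?)",
                [("east", 10.0), ("west", 20.0), ("east", 5.0)])

query = "SELECT SUM(amount) FROM sales WHERE region = ?"

# Without an index the planner must scan the whole table.
before = con.execute("EXPLAIN QUERY PLAN " + query, ("east",)).fetchall()
print(before)  # plan shows a full table scan

# An index on the filter column lets the planner seek instead of scan.
con.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = con.execute("EXPLAIN QUERY PLAN " + query, ("east",)).fetchall()
print(after)  # plan now uses idx_sales_region

print(con.execute(query, ("east",)).fetchone()[0])  # 15.0
```

On warehouse platforms the same loop uses their own tooling (e.g. EXPLAIN on Teradata/Redshift) and their own access structures, but the before/after comparison of plans is the core of the tuning work.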

Cloud & Big Data Integration:

   • Develop cloud-based ETL pipelines on AWS, Azure, or Google Cloud.

   • Leverage Redshift, S3, or Snowflake for high-performance data storage and retrieval.

   • Implement data migration and integration strategies across on-premise and cloud platforms.

Automation & Scripting:

   • Develop Unix shell scripts to automate data workflows, scheduling, and monitoring.

   • Troubleshoot and optimize batch processes for high availability and scalability.
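The automation described above usually reduces to a wrapper that runs each batch step, logs the outcome, and retries on failure. A minimal sketch in Python (a Unix shell script would be an equally natural fit; the command and retry counts are illustrative):

```python
import logging
import subprocess
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

def run_step(cmd, retries=3, delay=1.0):
    """Run one batch step, logging each attempt and retrying on a nonzero exit code."""
    for attempt in range(1, retries + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            logging.info("step %s succeeded on attempt %d", cmd[0], attempt)
            return result.stdout
        logging.warning("step %s failed (rc=%d), attempt %d/%d",
                        cmd[0], result.returncode, attempt, retries)
        time.sleep(delay)
    raise RuntimeError(f"step {cmd[0]} failed after {retries} attempts")

# Illustrative pipeline step: in practice each command would be a real extract/load job.
print(run_step(["echo", "extract done"]).strip())
```

Scheduling (cron, Autosys, or Conduct>It plans) then invokes wrappers like this, and the log output feeds the monitoring side of the responsibility.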

Collaboration & Documentation:

   • Work with data architects, business analysts, and DevOps teams to gather requirements.

   • Document ETL designs, data flow diagrams, and best practices.

Required Qualifications:

Experience: 5+ years in ETL Development, Data Engineering, or Data Warehousing.

Technologies:

   • ETL Tools: Ab Initio (Preferred), Informatica, Talend.

   • Databases: Teradata, Redshift, Oracle, SQL Server, PostgreSQL.

   • Cloud Platforms: AWS (Redshift, S3, Lambda), Azure, GCP.

   • Scripting: Unix shell scripting, Python, or Bash.

   • Big Data (Optional): Spark, Hadoop, Kafka.

Skills:

   • Strong expertise in Ab Initio Co>Operating System, GDE, EME, and Conduct>It.

   • Experience with data lineage, metadata management, and data governance.

   • Ability to optimize and tune SQL queries for performance.