
ETL Developer (Ab Initio)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer (Ab Initio) with a contract length of "unknown," offering a pay rate of "unknown." Key skills include 3+ years of Ab Initio experience, SQL proficiency, and familiarity with Unix/Linux. A Bachelor's degree is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
March 28, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas-Fort Worth Metroplex
🧠 - Skills detailed
#Shell Scripting #Data Accuracy #Web Services #Tableau #Scripting #Documentation #Visualization #Metadata #Computer Science #Data Extraction #Hadoop #Compliance #Unix #Datasets #Teradata #Databases #Big Data #SQL Server #Microsoft Power BI #Data Modeling #AWS (Amazon Web Services) #DevOps #GCP (Google Cloud Platform) #Spark (Apache Spark) #SQL (Structured Query Language) #Data Governance #Data Integration #Oracle #Cloud #Security #Azure #Ab Initio #Agile #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #Snowflake #Linux #BI (Business Intelligence)
Role description

We are seeking an experienced Ab Initio Developer to design, develop, and maintain data integration solutions. The ideal candidate will possess expertise in the Ab Initio ETL tool and demonstrate strong skills in data modeling, data processing, and performance tuning to support business requirements.

Key Responsibilities

   • Design, develop, and optimize ETL workflows using Ab Initio Graphical Development Environment (GDE).

   • Build and maintain metadata-driven frameworks to streamline data processes.

   • Collaborate with cross-functional teams to gather requirements and develop solutions aligned with business needs.

   • Develop and maintain reusable components for data extraction, transformation, and loading.

   • Conduct performance tuning of ETL jobs to ensure efficient processing of large datasets.

   • Maintain and monitor ETL workflows in production, addressing issues and ensuring data accuracy.

   • Write and review technical documentation, including solution design and operations manuals.

   • Support data integration initiatives by leveraging APIs, web services, or other data exchange methods.

   • Adhere to data governance, security, and compliance standards.
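The production-monitoring and shell-scripting duties above often come down to wrapping a deployed job so failures are caught and logged. A minimal sketch is shown below; `run_etl_job` is a hypothetical stand-in for invoking a real deployed Ab Initio graph script, not part of this posting.

```shell
#!/bin/sh
# Minimal sketch of a production wrapper for a deployed ETL job.
# run_etl_job is a hypothetical stand-in for the real deployed graph script.
run_etl_job() {
    # In production this would invoke the deployed Ab Initio graph (.ksh).
    echo "rows loaded: 42"
    return 0
}

LOG=$(mktemp)
if run_etl_job >"$LOG" 2>&1; then
    echo "ETL job succeeded; log at $LOG"
else
    rc=$?
    echo "ETL job FAILED (exit $rc); log at $LOG" >&2
    exit "$rc"
fi
```

In practice the wrapper would also alert the on-call team and archive the log, but capturing the exit code and keeping the full output is the core of the monitoring responsibility.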

Required Skills and Qualifications

   • Bachelor’s degree in Computer Science, Information Technology, or a related field.

   • 3+ years of hands-on experience with Ab Initio ETL tools.

   • Strong knowledge of data integration, warehousing, and ETL concepts.

   • Proficiency in SQL and working with relational databases such as Oracle, SQL Server, or Teradata.

   • Familiarity with Unix/Linux environments and shell scripting.

   • Experience with big data technologies (e.g., Hadoop, Spark) is a plus.

   • Excellent problem-solving and analytical skills.

   • Strong written and verbal communication skills, with the ability to convey technical information to non-technical stakeholders.
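The SQL-proficiency and data-accuracy expectations above typically show up as reconciliation checks after a load. The sketch below uses flat-file record counts for portability; in a real environment the counts would come from `SELECT COUNT(*)` against Oracle, SQL Server, or Teradata, and the file names here are hypothetical.

```shell
#!/bin/sh
# Sketch: reconcile record counts between a source feed and the loaded target.
# The files are simulated stand-ins for real extracts (hypothetical names).
SRC_FILE=$(mktemp); TGT_FILE=$(mktemp)
printf 'a\nb\nc\n' > "$SRC_FILE"   # simulated source feed (3 records)
printf 'a\nb\nc\n' > "$TGT_FILE"   # simulated post-load extract

SRC=$(wc -l < "$SRC_FILE")
TGT=$(wc -l < "$TGT_FILE")

if [ "$SRC" -eq "$TGT" ]; then
    echo "reconciliation OK: $SRC records in both"
else
    echo "MISMATCH: source=$SRC target=$TGT" >&2
    exit 1
fi
rm -f "$SRC_FILE" "$TGT_FILE"
```

A mismatch here is the kind of data-accuracy issue the role is expected to catch before downstream consumers see it.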

Preferred Skills

   • Experience with cloud platforms such as AWS, Azure, or GCP.

   • Knowledge of Agile and DevOps methodologies.

   • Exposure to data visualization and BI tools like Tableau or Power BI.

   • Experience in integrating Ab Initio with tools like Kafka or Snowflake.