

Ab Initio Developer
Job Description:
Job Title: Ab Initio Developer
Location: Dallas, TX
Job Type: Contract
Job Summary:
We are seeking an experienced Ab Initio Developer to design, develop, and optimize ETL workflows and data processing solutions. The ideal candidate will have strong expertise in Ab Initio, Teradata, Redshift, SQL, cloud platforms, and Unix scripting to handle large-scale data processing and enterprise data integration.
Key Responsibilities:
ETL Development & Data Processing:
• Design, develop, and implement ETL workflows using Ab Initio for large-scale data processing.
• Optimize and troubleshoot Ab Initio graphs, plans, and metadata management.
• Ensure efficient data transformation, cleansing, and validation.
Database & Data Warehousing:
• Work with Teradata, Redshift, and other relational databases to develop efficient SQL queries.
• Design data models and warehouse solutions to support business needs.
• Perform query tuning and performance optimization.
Cloud & Big Data Integration:
• Develop cloud-based ETL pipelines on AWS, Azure, or Google Cloud.
• Leverage Redshift, S3, or Snowflake for high-performance data storage and retrieval.
• Implement data migration and integration strategies across on-premises and cloud platforms.
Automation & Scripting:
• Develop Unix shell scripts to automate data workflows, scheduling, and monitoring.
• Troubleshoot and optimize batch processes for high availability and scalability.
Collaboration & Documentation:
• Work with data architects, business analysts, and DevOps teams to gather requirements.
• Document ETL designs, data flow diagrams, and best practices.
Required Qualifications:
Experience: 5+ years in ETL Development, Data Engineering, or Data Warehousing.
Technologies:
• ETL Tools: Ab Initio (Preferred), Informatica, Talend.
• Databases: Teradata, Redshift, Oracle, SQL Server, PostgreSQL.
• Cloud Platforms: AWS (Redshift, S3, Lambda), Azure, GCP.
• Scripting: Unix shell scripting, Python, or Bash.
• Big Data (Optional): Spark, Hadoop, Kafka.
Skills:
• Strong expertise in Ab Initio Co>Operating System, GDE, EME, and Conduct>It.
• Experience with data lineage, metadata management, and data governance.
• Ability to optimize and tune SQL queries for performance.