

Data Engineer - Ab Initio
Start: 2 weeks from date of offer
Location: 100% Remote - Candidates must be based in one of the following states:
North Carolina, Alabama, Arizona, Arkansas, Colorado, Florida, Georgia, Idaho, Indiana, Iowa, Kansas, Kentucky, Louisiana, Maryland, Michigan, Mississippi, Missouri, Ohio, Oklahoma, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia, Wisconsin, and Wyoming
Contract to Perm: Ideally a 6-month contract-to-perm conversion
Hourly Pay for Contract: $50-65 per hour
Salary Expectations: $100-150K; preferred salary $120-125K
• W2 Contract Only
• Interview process: 3 interviews, including 1 virtual test
Data Engineer
The primary requirement is 3 years of Ab Initio experience as an engineer. This is a new position at a company with very low turnover. Candidates will have the opportunity to work across many different workstreams and departments. Additional client summary below:
• Define and extract data from multiple sources, integrate disparate data into a common data model, and load the data into a target database, application, or file using efficient programming processes
• Document and test moderately complex data systems that bring together data from disparate sources, making it available to data scientists and other users using scripting and/or programming languages
• Write and refine code to ensure performance and reliability of data extraction and processing
• Participate in requirements-gathering sessions with business and technical staff to distill technical requirements from business requests
• Develop SQL queries to extract data for analysis and model construction
• Own delivery of moderately sized data engineering projects
• Design and develop scalable, efficient data pipeline processes to handle data ingestion, cleansing, transformation, integration, and validation required to provide access to prepared data sets to analysts and data scientists
• Ensure performance and reliability of data processes
• Document and test data processes, including performing thorough data validation and verification
• Collaborate with cross-functional teams to resolve data quality and operational issues and ensure timely delivery of products
• Develop and implement scripts for database and data process maintenance, monitoring, and performance tuning
• Analyze and evaluate databases to identify and recommend improvements and optimizations
Hiring Requirements
• Bachelor’s degree and 3 years of experience with Data Integration, Data Warehouses, Operational Data Stores, Data Lakes, and Big Data platforms
• Direct experience with at least one ETL development language/technology, such as Ab Initio, DataStage, Informatica, Python, or R
• Advanced SQL knowledge and experience with database technologies such as Snowflake, DB2, AWS
• In lieu of a degree, 5 years of the experience stated above
Hiring Preferences
• Experience in healthcare or insurance
• Experience collaborating effectively with vendors and business partners for solution delivery