Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a 9-month W2 contract for a Data Engineer based in Chicago, IL and Ann Arbor, MI, with a pay rate listed as "TBD." It requires 8+ years of experience and proficiency in SQL, Python, and ETL processes; local candidates only.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date discovered
April 3, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Hybrid
πŸ“„ - Contract type
W2 Contractor
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Chicago, IL
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Computer Science #Oracle Cloud #Data Processing #Data Modeling #Agile #Python #Code Reviews #Security #Documentation #Oracle #Programming #Databases #Automation #Data Manipulation #SQL Queries #Data Pipeline #GIT #Migration #Version Control #SQL (Structured Query Language) #Database Performance #Snowflake #Data Migration #Database Security #PostgreSQL #Cloud #Data Engineering
Role description

Position: Data Engineer (hybrid, 3 days on-site)


Location: 311 S Wacker Dr #1600, Chicago, IL & Ann Arbor, MI.


Job type: 9-month contract (through Dec 2025)


Visa: Any visa status accepted


Local candidates only; please do not submit non-local profiles.


Experience level: 8+ years minimum


About the Team


This is an excellent opportunity to join the Index IT team, a delivery-focused IT group responsible for designing, developing, and supporting internal, client-facing, and public-facing distribution solutions.


If selected, you will work as part of a delivery-focused and talented software development team responsible for designing, developing, and supporting the index and product generation platforms.


You will use cutting-edge software development techniques and technologies, following industry best practices.


Key Responsibilities

   β€’ Design, develop, and maintain database structures and data pipelines in Snowflake and Oracle environments

   β€’ Write efficient SQL queries, stored procedures, and functions to support application requirements

   β€’ Create and optimize ETL processes for data migration between different database platforms

   β€’ Implement database security measures and access control protocols

   β€’ Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions

   β€’ Develop Python scripts for data processing, analysis, and automation

   β€’ Monitor database performance and recommend optimization strategies

   β€’ Participate in code reviews and implement best practices for database development

   β€’ Create and maintain comprehensive documentation for database structures and processes


Required Qualifications

   β€’ Bachelor’s degree in computer science, Information Technology, or related field

   β€’ 5+ years of experience with Software/database development.

   β€’ 3+ years of experience with Relational database development

   β€’ Python programming skills, particularly for data manipulation and analysis

   β€’ Proficiency in SQL query optimization and performance tuning

   β€’ Experience with ETL/ELT processes and data migration strategies

   β€’ Knowledge of database security best practices

   β€’ Strong analytical and problem-solving abilities


Preferred Qualifications

   β€’ Experience with PostgreSQL architecture and development

   β€’ Experience with Oracle databases or Oracle cloud infrastructure.

   β€’ Knowledge of data modeling and dimensional modeling concepts

   β€’ Familiarity with version control systems (Git)

   β€’ Experience with CI/CD pipelines for database changes

   β€’ Background in financial services or experience with financial data

   β€’ Understanding of agile development methodologies

   β€’ Certification in Snowflake, Oracle, or related technologies
