Data Engineer with Strong Airflow and Python

This role is for a Data Engineer with strong expertise in Python, Airflow, Snowflake, and DBT. It is a 40-hour-per-week contract in Chicago, paying $65-70/hr. Requires 8 years of Python and Airflow experience, technical leadership skills, and cloud-native application design.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Up to $560
🗓️ - Date discovered
January 27, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Chicago, IL 60603
🧠 - Skills detailed
#Automation #GitHub #Snowflake #Data Vault #Linux #Continuous Deployment #Cloud #Data Ingestion #Vault #Kafka (Apache Kafka) #Computer Science #Mathematics #Data Engineering #Complex Queries #SQL (Structured Query Language) #Migration #ETL (Extract, Transform, Load) #SQL Server #Microservices #Consul #Python #Data Architecture #Airflow #Agile #Jira #Data Pipeline #Deployment #dbt (data build tool) #Leadership #BitBucket #Programming
Role description

We are seeking a Data Engineer with extensive experience in Python, Airflow, Snowflake and DBT.
If you are interested and available, please send your latest resume.
Job Location: Chicago (Hybrid on-site 3 days a week)
Bill Rate: $65-70/hr DOE
Job Type: Contract (40 hours per week)
KEY SKILLS:
- Ability to design and develop (must be hands-on)
- Python (expert level): able to write their own scripts for dependency injection into Airflow (scheduling, workflows)
- Airflow: strong familiarity
- Snowflake: primary database
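To make the Python/Airflow skill concrete: the pattern described above is typically a config-driven script that declares pipeline tasks and their dependencies, which Airflow's scheduler then resolves into a run order. The sketch below illustrates only that dependency-resolution idea using the standard library (no live Airflow install); the task names are hypothetical, not from this posting.

```python
# Hypothetical sketch of a config-driven pipeline definition. A dict maps
# each task to the set of tasks it depends on, and a topological sort
# yields a valid execution order -- the core of what an Airflow DAG
# expresses. Task names here are illustrative only.
from graphlib import TopologicalSorter

PIPELINE = {
    "extract_trades": set(),
    "load_to_snowflake": {"extract_trades"},
    "dbt_transform": {"load_to_snowflake"},
    "data_quality_checks": {"dbt_transform"},
}

def execution_order(pipeline: dict[str, set[str]]) -> list[str]:
    """Return one valid run order in which every task's dependencies run first."""
    return list(TopologicalSorter(pipeline).static_order())

print(execution_order(PIPELINE))
# ['extract_trades', 'load_to_snowflake', 'dbt_transform', 'data_quality_checks']
```

In a real Airflow deployment the same structure would be expressed with operators and the `>>` dependency syntax inside a DAG file.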
Description:
The ideal contractor will be responsible for designing, developing, testing, and deploying software solutions for Hedge Fund Services.
Propose new designs and modify existing ones to continuously improve performance, functionality, and stability of the system.
Partner with business leaders and business unit partners to define priorities and deliver custom solutions to solve business problems or address business needs.
Must be competent to work at the highest technical level of all phases of system design and implementation.
Provide comprehensive consultation to Business Unit and IT management and staff at the highest technical level on all phases of the project development cycle.
Act as the principal designer for major systems and their subsystems, utilizing a thorough understanding of available technology, tools, and existing designs.
Design and develop high-performance programming language components used by trading applications.
Provide technical expertise to support and enhance core-trading applications.
Provide leadership and guidance to staff, fostering an environment that encourages employee participation, teamwork, and communication.
Seasoned multi-disciplinary expert with extensive technical and/or business knowledge and functional expertise.
Focus of the role is on executing the strategic direction of business function activities.
Carries out complex initiatives involving multiple disciplines and/or ambiguous issues.
Displays a balanced, cross-functional perspective, liaising with the business to improve efficiency, effectiveness, and productivity.
Experience Level: Senior
Qualifications:
A BS degree in Computer Science, Mathematics, Computer Engineering, or a related science curriculum is required.
Strong programming skills in Snowflake, Python, Airflow, DBT, and Linux.
Strong server-side programming experience with automation and backend support.
Experience with Snowflake.
Experience with agile project methodology and collaboration.
Excellent communication skills, analytical ability, strong judgment and management skills, and the ability to work effectively with client and IT management and staff required.
Strong skills in working with open-source technologies, database technologies, microservice architecture, cloud-native development, continuous build, continuous integration, and continuous deployment.
Ability to work effectively with end users to define requirements.
Leadership and organizational skills are required to determine the Business Unit's goals, resources needed, and to assess and develop the skills of staff.
Experience designing and building cloud-native applications using microservices architecture.
Hands-on experience with Kafka and its use in developing an event-driven architecture model.
Experience with Domain-Driven Design.
Experience with continuous integration and collaboration tools like JIRA, Bitbucket, GitHub, and Confluence.
Experience with building Data pipelines to Snowflake.
Specific Technical Responsibilities:
Overall (applies to all technology platforms listed below):
Provide production support for several data analytics solutions used every day
Ability to perform as a technical lead in addition to being a contributing developer
Code review of other team members' work
Create and enhance data architecture models
Ability to troubleshoot and identify root causes for a variety of production and data issues
Snowflake:
Data transformation (ETL)
Write Snowflake SQL including stored procedures and complex queries involving CTEs and temp tables
Help design data models for new data to be ingested
Snowflake SQL performance tuning
Help complete migration of existing SQL Server based Data Vault into Snowflake
Continue to support and work on future enhancements for Snowflake Data Vault
Data ingestion (familiarity with Python and Kafka connectors is a nice to have but not necessarily required)
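The Snowflake responsibilities above center on CTE-based SQL. As a self-contained illustration of that style of query, the hedged sketch below runs a WITH-clause aggregation against SQLite (stdlib `sqlite3`) purely so it can execute without a Snowflake account; simple CTE syntax is similar in Snowflake, though the dialects differ elsewhere. Table and column names are hypothetical.

```python
# Illustrative CTE-style query of the kind this role describes, run against
# an in-memory SQLite database as a stand-in for Snowflake. The schema and
# data are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (account TEXT, amount REAL);
    INSERT INTO trades VALUES ('A', 100.0), ('A', 250.0), ('B', 75.0);
""")

# The CTE aggregates per account; the outer query filters on the aggregate,
# which a plain WHERE clause on the base table could not do directly.
query = """
    WITH totals AS (
        SELECT account, SUM(amount) AS total
        FROM trades
        GROUP BY account
    )
    SELECT account, total FROM totals WHERE total > 100 ORDER BY account;
"""
print(conn.execute(query).fetchall())  # [('A', 350.0)]
```

In Snowflake the same shape extends naturally to temp tables (`CREATE TEMPORARY TABLE`) and stored procedures when the transformation spans multiple statements.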
Python/Linux
Nice to Have:
An MS degree is preferred. Experience with multi-threaded application design and development, including testing and deployment phases.
Job Type: Contract
Pay: $65.00 - $70.00 per hour
Expected hours: 40 per week
Schedule:

8-hour shift

Experience:

Python coding: 8 years (Required)
Airflow: 8 years (Required)
Snowflake: 7 years (Required)
Technical lead: 6 years (Required)

Work Location: Hybrid remote in Chicago, IL 60603