
Python Data Engineer

This is a remote contract position for a Python Data Engineer, paying $60-$75/hour. Key skills include Python, Pandas, NumPy, Apache Airflow, Docker, REST API development, and MinIO S3 storage. A strong data engineering background is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$600
🗓️ - Date discovered
February 21, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Cupertino, CA
🧠 - Skills detailed
#Data Manipulation #Redis #Kubernetes #Data Engineering #SQL (Structured Query Language) #API (Application Programming Interface) #SQL Queries #ETL (Extract, Transform, Load) #Unit Testing #Data Ingestion #Data Pipeline #Cloud #Pandas #SQLAlchemy #Apache Airflow #Python #Pytest #Scala #Minio #NumPy #Docker #REST API #Airflow #Flask #Deployment #Libraries #Storage #Data Processing #S3 (Amazon Simple Storage Service) #REST (Representational State Transfer)
Role description

Job Title: Python Data Engineer

Job Description:

We are seeking a highly skilled Python Data Engineer to join our team and contribute to the development of scalable data pipelines, APIs, and containerized applications. The ideal candidate will have a strong background in Python development, data processing, and cloud storage solutions.

Responsibilities:
• Implement and maintain unit tests using Python unit test frameworks (e.g., pytest, unittest).
• Develop and optimize data processing workflows using Pandas and NumPy for large-scale data manipulation.
• Build and manage Apache Airflow pipelines for automated data ingestion and transformation.
• Containerize applications using Docker/Kubernetes for scalable deployment.
• Design and implement RESTful APIs using Flask or similar frameworks.
• Develop and manage caching solutions with Redis for high-performance data retrieval.
• Apply composition patterns over inheritance for maintainable and scalable software architecture.
• Write optimized SQL queries and integrate them into Python applications using appropriate modules (e.g., SQLAlchemy, psycopg2).
• Work with MinIO S3 storage, handling Parquet files efficiently using Pandas and the MinIO Python SDK.
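The "composition patterns over inheritance" responsibility above can be sketched in a few lines. This is a minimal illustration only, with hypothetical class and field names, not code from the role itself: the pipeline is assembled from small, swappable parts rather than a subclass hierarchy.

```python
from dataclasses import dataclass

class CsvReader:
    """Parses CSV text into a list of row dicts."""
    def read(self, text: str) -> list[dict]:
        header, *rows = [line.split(",") for line in text.strip().splitlines()]
        return [dict(zip(header, row)) for row in rows]

class UpperCaser:
    """Uppercases every string value in each record."""
    def transform(self, records: list[dict]) -> list[dict]:
        return [{k: v.upper() for k, v in r.items()} for r in records]

@dataclass
class Pipeline:
    # The pipeline owns its parts (composition) instead of
    # inheriting read/transform behavior from a base class.
    reader: CsvReader
    transformer: UpperCaser

    def run(self, text: str) -> list[dict]:
        return self.transformer.transform(self.reader.read(text))

pipeline = Pipeline(CsvReader(), UpperCaser())
result = pipeline.run("name,city\nada,london")
# result == [{"name": "ADA", "city": "LONDON"}]
```

Swapping in a different reader or transformer only requires passing a different object to `Pipeline`, which is the maintainability win the posting alludes to.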

Requirements:
• Strong proficiency in Python with experience in data engineering and backend development.
• Hands-on experience with Pandas, NumPy, and other Python data processing libraries.
• Experience developing and orchestrating workflows using Apache Airflow.
• Proficiency in Docker and containerized application development.
• Strong understanding of REST API development using Flask or equivalent frameworks.
• Experience working with Redis as a caching mechanism.
• Familiarity with unit testing frameworks such as pytest or unittest.
• Understanding of software design patterns, particularly composition vs inheritance.
• Ability to write and optimize SQL queries and integrate them into Python applications.
• Experience working with MinIO S3 storage and handling Parquet files in Pandas.
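The unit-testing and Pandas requirements above might look like the following in practice. A hedged sketch with illustrative function and column names (not from the posting): a small DataFrame transform plus a pytest-style test for it.

```python
import pandas as pd

def add_total_column(df: pd.DataFrame) -> pd.DataFrame:
    """Returns a copy of df with a derived 'total' column."""
    out = df.copy()
    out["total"] = out["price"] * out["quantity"]
    return out

def test_add_total_column():
    df = pd.DataFrame({"price": [2.0, 3.0], "quantity": [5, 4]})
    result = add_total_column(df)
    # Derived column is computed row-wise.
    assert result["total"].tolist() == [10.0, 12.0]
    # The input frame is not mutated.
    assert "total" not in df.columns
```

Running `pytest` against a file containing this code discovers `test_add_total_column` automatically; the same test also runs under `unittest`-style runners with minor changes.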

Submit resume to jobs@OSIengineering.com

Location: Remote

Job Type: Contract

Pay Rate: $60-$75/hr (DOE)

Tony Do

408.550.2800 x115