ML & Database Engineer (No Third Parties Please)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ML & Database Engineer: 4 months, remote, paying $55-$60/hour for 20 hours/week. It requires advanced skills in AWS (SageMaker, RDS/PostgreSQL, Glue), Python, and SQL, along with experience in machine learning and ETL processes.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$480
🗓️ - Date discovered
April 15, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New York, United States
🧠 - Skills detailed
#RDS (Amazon Relational Database Service) #Database Administration #Monitoring #Data Integration #Database Monitoring #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Terraform #Cloud #Python #AWS Lambda #Database Schema #Lambda (AWS Lambda) #PostgreSQL #Amazon RDS (Amazon Relational Database Service) #ETL (Extract, Transform, Load) #ML (Machine Learning) #SQL (Structured Query Language) #Data Quality #Documentation #Amazon CloudWatch #Database Design #Batch #SageMaker #AWS Glue
Role description

4 months, remote, $55-$60/hour, Start ASAP, 20 hours per week

ML & Database Engineer

Required AWS Service Skills:

   • Amazon SageMaker (primary)

   • SageMaker Canvas

   • SageMaker Model Monitor

   • Amazon RDS/PostgreSQL (advanced)

   • AWS Glue (advanced)

   • Amazon S3

   • AWS Lambda

   • Amazon CloudWatch

Required Technical Skills:

   • Python (advanced)

   • Machine Learning

   • Feature Engineering

   • ETL Pipelines

   • SQL (advanced)

   • Database Administration

   • Database Design

   • Batch Processing

   • Data Validation

   • Model Training

   • Terraform

Required Soft Skills:

   • Analytical Thinking

   • Technical Documentation

   • Knowledge Transfer

   • Adaptability

   • Team Collaboration

Primary Responsibilities:

   • Design and implement database schemas and data models

   • Configure and optimize PostgreSQL database instances

   • Develop advanced ETL processes using AWS Glue (see the Glue job sketch after this list)

   • Implement a SageMaker Canvas environment for model exploration

   • Design and develop machine learning models for deer movement prediction

   • Create a validation framework with accuracy metrics

   • Implement SageMaker training pipeline for prediction models

   • Configure distributed training and spot instance optimization (see the spot-training sketch after this list)

   • Set up SageMaker batch transform jobs for nightly inference (see the batch transform sketch after this list)

   • Develop database monitoring, backup and recovery procedures

   • Implement data integration between application components and ML pipeline

   • Assist with performance optimization and query tuning

   • Develop data quality validation procedures (see the validation sketch after this list)

   • Implement model drift detection and retraining mechanism (see the Model Monitor sketch after this list)
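
To ground the Glue responsibility above, here is a minimal job sketch, assuming a raw GPS-ping table is already registered in the Glue Data Catalog; the database, table, column, and bucket names are illustrative, not from the posting.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source table registered in the Glue Data Catalog (names are placeholders).
raw = glue_context.create_dynamic_frame.from_catalog(
    database="wildlife_raw", table_name="gps_pings"
)

# Rename and retype raw columns into the schema the ML pipeline expects.
mapped = ApplyMapping.apply(
    frame=raw,
    mappings=[
        ("device_id", "string", "collar_id", "string"),
        ("ts", "string", "observed_at", "timestamp"),
        ("lat", "double", "latitude", "double"),
        ("lon", "double", "longitude", "double"),
    ],
)

# Land curated Parquet files for downstream feature engineering.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/gps_pings/"},
    format="parquet",
)

job.commit()
```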
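
For the training pipeline and spot-instance items, a minimal sketch with the SageMaker Python SDK, using the built-in XGBoost image as a stand-in for the actual movement model; the role ARN, buckets, instance types, and hyperparameters are all assumptions.

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
image = sagemaker.image_uris.retrieve("xgboost", session.boto_region_name, version="1.7-1")

estimator = Estimator(
    image_uri=image,
    role="arn:aws:iam::123456789012:role/ExampleSageMakerRole",  # placeholder role
    instance_count=2,                    # two instances -> distributed training
    instance_type="ml.m5.2xlarge",
    use_spot_instances=True,             # managed spot for cost optimization
    max_run=3600,                        # cap on billable training seconds
    max_wait=7200,                       # extra time to wait for spot capacity
    checkpoint_s3_uri="s3://example-bucket/checkpoints/",  # resume after interruption
    output_path="s3://example-bucket/models/",
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="reg:squarederror", num_round=200)

# Feature data produced by the ETL job (path is a placeholder).
estimator.fit(
    {"train": TrainingInput("s3://example-bucket/features/train/", content_type="text/csv")}
)
```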
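
The nightly inference item could look like the following, assuming the trained model is already registered in SageMaker; the model name, S3 paths, and instance type are placeholders, and a scheduler such as EventBridge would trigger the run each night.

```python
from sagemaker.transformer import Transformer

transformer = Transformer(
    model_name="deer-movement-model",   # placeholder for the registered model
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://example-bucket/predictions/nightly/",
    accept="text/csv",
)

# Score the day's feature file line by line; results land under output_path.
transformer.transform(
    data="s3://example-bucket/features/latest.csv",
    content_type="text/csv",
    split_type="Line",
)
transformer.wait()
```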
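
As one shape the data-quality procedures might take, a small validator for incoming GPS fixes; the field names match the Glue sketch above and the checks are assumptions.

```python
from datetime import datetime, timezone

def validate_ping(ping: dict) -> list[str]:
    """Return data-quality violations for one GPS fix (empty list = clean)."""
    problems = []
    if not ping.get("collar_id"):
        problems.append("missing collar_id")
    lat, lon = ping.get("latitude"), ping.get("longitude")
    if lat is None or not -90.0 <= lat <= 90.0:
        problems.append(f"latitude out of range: {lat}")
    if lon is None or not -180.0 <= lon <= 180.0:
        problems.append(f"longitude out of range: {lon}")
    observed = ping.get("observed_at")  # assumed to be a parsed, tz-aware datetime
    if observed is None or observed > datetime.now(timezone.utc):
        problems.append("observed_at missing or in the future")
    return problems

# Usage: quarantine rows with violations before the ETL job loads them.
clean = {"collar_id": "C-17", "latitude": 44.9, "longitude": -93.1,
         "observed_at": datetime(2025, 4, 1, tzinfo=timezone.utc)}
assert validate_ping(clean) == []
```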
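
Finally, for drift detection with SageMaker Model Monitor, a minimal sketch that baselines the training data and schedules daily comparisons; the role ARN, bucket paths, and endpoint name are assumptions, and a batch-transform input could stand in for the endpoint to match the nightly flow above. A violation report from the schedule can then trigger retraining, e.g. via an EventBridge rule.

```python
from sagemaker.model_monitor import CronExpressionGenerator, DefaultModelMonitor
from sagemaker.model_monitor.dataset_format import DatasetFormat

monitor = DefaultModelMonitor(
    role="arn:aws:iam::123456789012:role/ExampleSageMakerRole",  # placeholder role
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

# 1) Derive baseline statistics and constraints from the training data.
monitor.suggest_baseline(
    baseline_dataset="s3://example-bucket/features/train/train.csv",
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://example-bucket/monitoring/baseline/",
)

# 2) Compare incoming traffic against the baseline every day; constraint
#    violations in the reports signal drift and can kick off retraining.
monitor.create_monitoring_schedule(
    monitor_schedule_name="deer-movement-drift",
    endpoint_input="deer-movement-endpoint",  # placeholder endpoint name
    output_s3_uri="s3://example-bucket/monitoring/reports/",
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.daily(),
)
```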