Senior Data Scientist

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Scientist on an 8-month contract-to-hire, hybrid (3 days in office), with an unlisted pay rate. Key skills include Python, PySpark, ML algorithms, deep learning, and AWS SageMaker. AWS certifications are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 11, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Lambda (AWS Lambda) #PyTorch #TensorFlow #Spark (Apache Spark) #Data Engineering #Cloud #AWS (Amazon Web Services) #NLP (Natural Language Processing) #Data Cleaning #BERT #Python #Data Science #Deep Learning #Data Analysis #SageMaker #AWS SageMaker #ML (Machine Learning) #DevOps #NLU (Natural Language Understanding) #StepFunctions #AI (Artificial Intelligence) #PySpark #Deployment #Automation #Monitoring
Role description

Position Title: Data Scientist Sr.

Pittsburgh, PA; Strongsville, OH; Birmingham, AL; Farmers Branch, TX; Phoenix, AZ

Ability to work remotely: Hybrid (3 days in office)

Acceptable time zone(s): EST

Potential for Contract Extension: yes

8 Months to Hire

Contract-to-hire, W2

Roles and Responsibilities:

   • Collaborate with other data scientists, data engineers, and DevOps engineers to help build and deploy models using SageMaker in a hybrid environment

   • Coordinate the build-out and automation of the entire MLOps pipeline, including data and feature preparation, model (re)development, deployment, and ongoing monitoring of inference endpoints and model performance

   • Implement automated monitoring and alerting systems to detect and remediate potential issues proactively

   • Look for opportunities to optimize timelines, resource utilization, and resiliency of the end-to-end MLOps process

   • Collaborate on the development and integration of customized LLMs to enhance data analysis, natural language understanding, and generation tasks for agentic systems

   • Stay current with the latest developments; explore and experiment to push boundaries and contribute to the team's intellectual property
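The monitoring-and-alerting responsibility above can be sketched in miniature. This is an illustrative, hypothetical check only; the function names and thresholds are assumptions, not part of the posting:

```python
# Minimal sketch of automated model monitoring: compare live accuracy
# against an offline baseline and alert when it degrades past a threshold.
# All names and threshold values here are illustrative assumptions.

def accuracy(predictions, labels):
    """Fraction of predictions matching the true labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

def should_alert(live_acc, baseline_acc, max_drop=0.05):
    """Alert when live accuracy falls more than `max_drop` below baseline."""
    return (baseline_acc - live_acc) > max_drop

# Example: baseline model scored 0.92 offline; live traffic scores 0.80.
baseline = 0.92
live = accuracy([1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
                [1, 0, 1, 0, 0, 1, 1, 1, 1, 0])  # 8/10 correct -> 0.80
print(should_alert(live, baseline))  # True: drop exceeds 5 points
```

In production this comparison would typically run on a schedule against inference-endpoint logs (e.g., via the AWS services named in the skills list), with the alert routed to an on-call channel.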

Must Have Technical Skills:

   • Proficiency in Python and PySpark

   • Experience with statistical analysis, including data cleaning and augmentation

   • Strong grounding in ML algorithms and their suitability for varied use cases

   • Deep learning and NLP experience (TensorFlow/PyTorch, BERT/GPT-3, etc.)

   • AWS SageMaker and additional AWS services (Lambda, StepFunctions, etc.)

Flex Skills/Nice to Have: Fine-tuning LLMs, SageMaker Pipelines, Infrastructure as Code (IaC), CI/CD, model monitoring, Explainable AI (XAI)

Education/Certifications:

   • AWS Certified Machine Learning – Specialty

   • AWS Certified DevOps Engineer – Professional

   • Other Cloud Solution Provider (CSP) certifications in these areas will also be accepted

   • Additional Data Science- and LLM-focused certifications are a plus

Interview Process:

Screening interview: 30 minutes

Second round (more technical): 30-50 minutes