
Machine Learning Engineer

This role is for a Machine Learning Engineer with expertise in Unity Catalog and Databricks, offering a 3-month contract at an undisclosed pay rate. Remote work is allowed, but candidates must work PST hours. Proficiency in Java, Python, and Azure tools is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 11, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Datadog #Azure #Java #Python #Web Services #Databricks #ML (Machine Learning) #Spark (Apache Spark) #Data Pipeline #Containers #Azure Cosmos DB #Monitoring #Splunk #Scala #Data Engineering #ETL (Extract, Transform, Load) #Data Access
Role description

Job Description: Machine Learning Engineer, Unity Catalog

Location: Remote OK; must work PST hours

Possible 3-month CTH (contract-to-hire)

A photo ID is mandatory for all candidates, including citizens.

We are looking for an ML engineer with expertise in Unity Catalog and Feature Store in Databricks to help us build and maintain a solid foundation for our data and machine learning workflows. You will work on organizing data, managing access, and enabling machine learning models to operate efficiently in production.
The ML engineers will support three web-services applications; the tech stack is Java 11 on Azure, with AKS and APIM, so proficiency with Java is required.

Responsibilities:
• Set up and manage Unity Catalog in Databricks to organize and secure data access across teams
• Design and operationalize Feature Stores to support machine learning models in production
• Build efficient data pipelines to process and serve features to ML workflows
• Collaborate with teams using Databricks, Azure Cosmos DB, and other Azure tools to integrate data solutions
• Monitor and optimize the performance of pipelines and feature stores
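For context, the Unity Catalog work described above largely means organizing assets under its catalog.schema.table namespace and issuing grants to groups. A minimal sketch of the kind of SQL involved, with hypothetical catalog, schema, and group names (inside a Databricks workspace each statement would run via `spark.sql(...)`):

```python
# Hedged sketch: catalog/schema/group names below are made up for illustration;
# this is not the client's actual setup.
statements = [
    "CREATE CATALOG IF NOT EXISTS ml_platform",
    "CREATE SCHEMA IF NOT EXISTS ml_platform.features",
    # Unity Catalog grants target account groups, quoted with backticks:
    "GRANT USE CATALOG ON CATALOG ml_platform TO `ml-engineers`",
    "GRANT SELECT ON SCHEMA ml_platform.features TO `data-scientists`",
]
for s in statements:
    print(s)  # in Databricks: spark.sql(s)
```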
Requirements:
• 5 - Strong experience with Unity Catalog in Databricks for managing data assets and access control
• 4 - Hands-on experience working with Databricks Feature Store or similar solutions
• 2 - Knowledge of building and maintaining scalable ETL pipelines in Databricks
• 2 - Familiarity with Azure tools like Azure Cosmos DB and ACR
• 2 - Understanding of machine learning workflows and how feature stores fit into the pipeline
• 5 - Strong problem-solving skills and a collaborative mindset
• 3 - Proficiency in Python and Spark for data engineering tasks
• 3 - Experience with monitoring tools like Splunk or Datadog to ensure system reliability
• 2 - Familiarity with AKS for deploying and managing containers