MLOps Architect (Databricks Specialist)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an MLOps Architect (Databricks Specialist) with 12+ years of IT experience, focusing on data engineering and analytics. Contract length and pay rate are unspecified. Requires expertise in Databricks, Spark, and cloud platforms (AWS, Azure, or GCP), along with relevant certifications.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 15, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Pipeline #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Spark (Apache Spark) #PySpark #Cloud #Azure #Data Processing #Python #Deployment #Data Modeling #Migration #TensorFlow #StreamSets #Data Architecture #ML (Machine Learning) #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Scala #Apache Spark #Data Migration #Data Engineering #Spark SQL #PyTorch #Computer Science #Distributed Computing #Data Management #Databricks
Role description

Job Description:

   • 12+ years of IT experience, with a minimum of 10 years in data engineering, data platforms, and analytics.

   • Delivered projects with hands-on development experience on Databricks.

   • Working knowledge of any one cloud platform (AWS, Azure, or GCP).

   • Deep experience with distributed computing with Spark, including knowledge of Spark runtime internals.

   • Familiarity with CI/CD for production deployments.

   • Working knowledge of MLOps.

   • Current knowledge across the breadth of Databricks product and platform features.

   • Familiarity with optimization for performance and scalability.

   • Completed a professional data engineering certification and the required classes.

Minimum qualifications:

   • Educational Background: Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).

Technical Skills:

   • Expert-level proficiency in Spark Scala, Python, and PySpark.

   • In-depth knowledge of data architecture, including Spark Streaming, Spark Core, Spark SQL, and data modeling.

   • Hands-on experience with various data management technologies and tools, such as Kafka, StreamSets, and MapReduce.

   • Proficient in using advanced analytics and machine learning frameworks, including Apache Spark MLlib, TensorFlow, and PyTorch, to drive data insights and solutions.

Databricks-Specific Skills:

   • Extensive experience in data migration from on-premises to cloud environments and in implementing data solutions on Databricks across cloud platforms (AWS, Azure, GCP).

   • Skilled in designing and executing end-to-end data engineering solutions using Databricks, focusing on large-scale data processing and integration.

   • Proven hands-on experience with Databricks administration and operations, including notebooks, clusters, jobs, and data pipelines.

   • Experience integrating Databricks with other data tools and platforms to enhance overall data management and analytics capabilities.
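As a concrete flavor of the jobs-and-clusters administration work described above, here is a minimal sketch of a Databricks Jobs API 2.1-style job definition, built as a plain Python dict. All names, paths, and cluster settings are illustrative assumptions, not details from this posting:

```python
import json

# Hypothetical Jobs API 2.1-style payload: one notebook task running
# nightly on a new job cluster. Every value below is illustrative.
job_spec = {
    "name": "nightly-feature-pipeline",  # hypothetical job name
    "tasks": [
        {
            "task_key": "build_features",
            "notebook_task": {
                # hypothetical Repos path to the pipeline notebook
                "notebook_path": "/Repos/mlops/feature_pipeline",
            },
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",  # example runtime label
                "node_type_id": "i3.xlarge",          # example AWS node type
                "num_workers": 2,
            },
        }
    ],
    "schedule": {
        "quartz_cron_expression": "0 0 2 * * ?",  # 02:00 daily
        "timezone_id": "UTC",
    },
}

# Serialize for submission to the Jobs API (e.g. via the REST endpoint
# or the Databricks CLI).
payload = json.dumps(job_spec, indent=2)
print(payload)
```

Keeping job definitions as versioned payloads like this is one common way the CI/CD and MLOps requirements above come together in practice: the spec lives in source control and is deployed through the pipeline rather than edited by hand in the workspace.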

Certifications:

   • Certification in Databricks Engineering (Professional)

   • Microsoft Certified: Azure Data Engineer Associate

   • GCP: Google Cloud Certified Professional

   • AWS Certified Solutions Architect - Professional