Azure Databricks Lead/Architect (with AI Expertise) - W2 / 1099 Only

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Databricks Lead/Architect with AI expertise, offering a 12-month contract based in Bannockburn, Illinois. Compensation is on a W2 or 1099 basis; the rate is undisclosed. Key skills include Databricks, AI frameworks, cloud platforms, and data architecture. Remote work within North America is allowed.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 16, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
1099 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Chicago, IL
🧠 - Skills detailed
#DevOps #Cloud #Data Engineering #Azure Machine Learning #Apache Spark #Batch #Big Data #Synapse #ETL (Extract, Transform, Load) #Hadoop #Spark (Apache Spark) #Kafka (Apache Kafka) #ML (Machine Learning) #AWS (Amazon Web Services) #Data Pipeline #Databases #PyTorch #GCP (Google Cloud Platform) #Delta Lake #Data Lake #Security #AI (Artificial Intelligence) #Snowflake #Azure #Data Governance #Azure Databricks #TensorFlow #Data Science #Scala #Data Architecture #Data Processing #Databricks #Data Storage #Storage #Data Ingestion #Data Integration
Role description

Job-Role/Title: Azure Databricks Lead/Architect (with AI expertise)

Job-Location: Bannockburn, Illinois, United States.

Job-Type: Long-Term Contract Opportunity. The initial contract is for 12 months, with possible extensions.

Work Style: Can work remotely from within North America.

Key Deliverables:

   • Design and architect data solutions using Azure Databricks.

   • Develop and deploy AI and machine learning models using Databricks.

   • Integrate Databricks with other Azure services (e.g., Azure Storage, Azure Synapse).

   • Optimize data pipelines and workflows for performance and scalability.

   • Collaborate with data scientists and engineers to build AI-powered solutions.

Data Architecture Design and Implementation:

   • Design and implement cloud-based data architectures leveraging Databricks, Azure, and AWS or Google Cloud Platform.

   • Architect end-to-end data pipelines that integrate with Databricks and other cloud services.

   • Ensure that the data architecture supports both batch and real-time data processing and analytics.

   • Leverage Databricks Delta Lake for data storage, management, and processing.

Data Integration:

   • Integrate data from various sources, including operational databases, data lakes, and third-party APIs.

   • Work with teams to ensure seamless ETL/ELT processes within the Databricks environment.

   • Collaborate with data engineers to build data ingestion pipelines and automate data workflows.

Key Skillset for this Role

   • Hands-on experience with Databricks (Apache Spark, Delta Lake, etc.).

   • Strong understanding of data architecture and data engineering principles.

   • Solid experience working with big data technologies such as Hadoop, Spark, Kafka, or similar.

   • Expertise with Snowflake architecture is a big plus.

   • Experience with AI and machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn).

   • Experience with cloud platforms (Azure, AWS, or Google Cloud Platform) and cloud-native data tools.

   • Experience leading or managing teams and projects.

   • Experience with Azure services (e.g., Azure Storage, Azure Synapse, Azure Machine Learning).

   • Excellent knowledge of data governance and security best practices.

   • Hands-on experience with DevOps best practices and CI/CD pipelines.

We appreciate all applicants' interest in working with us; however, only those candidates shortlisted for the next steps in the hiring process will be contacted.

Thank you. Cloudingest!