Cloud Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Cloud Architect specializing in Azure Databricks, offered as a remote contract position. Key requirements include 8 years of experience in data engineering, strong SQL skills, and proficiency in Azure services and big data technologies. The pay rate is unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 16, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Cloud #PySpark #SQL (Structured Query Language) #Data Engineering #Leadership #Big Data #Azure Data Factory #BI (Business Intelligence) #ADF (Azure Data Factory) #Azure cloud #ETL (Extract, Transform, Load) #Synapse #Hadoop #Data Security #NoSQL #Spark (Apache Spark) #Data Pipeline #Compliance #ARM (Azure Resource Manager) #Infrastructure as Code (IaC) #Databases #Delta Lake #Data Lake #Security #Version Control #Microsoft Power BI #Azure #Azure Databricks #ADLS (Azure Data Lake Storage) #Azure ADLS (Azure Data Lake Storage) #Scala #Documentation #Data Processing #GIT #Databricks #Storage #Azure Resource Manager #Data Ingestion
Role description

Position: Azure Databricks SME

Remote

Duration: Contract

Job description

Responsibilities

   • Lead the architecture design and implementation of advanced analytics solutions using Azure Databricks and Microsoft Fabric. The ideal candidate will have a deep understanding of big data technologies, data engineering, and cloud computing, with a strong focus on Azure Databricks along with strong SQL skills

   • Work closely with business stakeholders and other IT teams to understand requirements and deliver effective solutions

   • Oversee the end-to-end implementation of data solutions, ensuring alignment with business requirements and best practices

   • Lead the development of data pipelines and ETL processes using Azure Databricks, PySpark, and other relevant tools (see the sketch after this list)

   • Integrate Azure Databricks with other Azure services (e.g., Azure Data Lake, Azure Synapse, Azure Data Factory) and on-premises systems

   • Provide technical leadership and mentorship to the data engineering team, fostering a culture of continuous learning and improvement

   • Ensure proper documentation of architecture, processes, and data flows while ensuring compliance with security and governance standards

   • Ensure best practices are followed in terms of code quality, data security, and scalability

   • Stay updated with the latest developments in Databricks and associated technologies to drive innovation
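
As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch of a Databricks ETL step that reads raw files from ADLS Gen2 and writes a curated Delta table. The storage paths, container names, and column names are hypothetical and would differ in any real engagement; the cluster is assumed to have Delta Lake enabled.

```python
# Minimal sketch of a Databricks notebook-style ETL step.
# All paths and column names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"              # hypothetical source
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders_delta/"  # hypothetical target

# Extract: read raw JSON files landed by an upstream process (e.g., Azure Data Factory)
orders = spark.read.json(raw_path)

# Transform: basic deduplication, typing, and filtering
orders_clean = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

# Load: write as a Delta table partitioned by date
(orders_clean.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(curated_path))
```

In practice, a notebook like this would typically be parameterized and orchestrated by Azure Data Factory or Databricks Workflows rather than run standalone.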

Essential Skills

   • Strong experience with Azure Databricks, including cluster management, notebook development, and Delta Lake (a Delta Lake upsert sketch follows this list)

   • Proficiency in big data technologies (e.g., Hadoop, Spark) and data processing frameworks (e.g., PySpark)

   • Deep understanding of Azure services such as Azure Data Lake, Azure Synapse, and Azure Data Factory

   • Experience with ETL/ELT processes, data warehousing, and building data lakes

   • Strong SQL skills and familiarity with NoSQL databases

   • Experience with CI/CD pipelines and version control systems such as Git

   • Knowledge of cloud security best practices
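
To make the Delta Lake expectation concrete, below is a minimal sketch of an incremental upsert using the Delta Lake MERGE API from PySpark, assuming a curated Delta table already exists at the target path. The paths and the join key are hypothetical.

```python
# Minimal sketch of a Delta Lake upsert (MERGE) on Databricks.
# Table paths and the order_id join key are hypothetical examples.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

target = DeltaTable.forPath(
    spark, "abfss://curated@examplelake.dfs.core.windows.net/orders_delta/")
updates = spark.read.format("delta").load(
    "abfss://staging@examplelake.dfs.core.windows.net/orders_updates/")

# Upsert: update matching orders, insert new ones
(target.alias("t")
    .merge(updates.alias("u"), "t.order_id = u.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```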

Soft Skills

   • Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders

   • Strong problem-solving skills and a proactive approach to identifying and resolving issues

   • Leadership skills with the ability to manage and mentor a team of data engineers

Nice-to-Have Skills

   • Power BI for dashboarding and reporting

   • Microsoft Fabric for analytics and integration tasks

   • Spark Streaming for processing real-time data streams

   • Familiarity with Azure Resource Manager (ARM) templates for infrastructure-as-code (IaC) practices

Experience

   • 8 years of demonstrated expertise in developing data ingestion and transformation pipelines using Databricks/Synapse notebooks and Azure Data Factory

   • Solid understanding and hands-on experience with Delta tables, Delta Lake, and Azure Data Lake Storage Gen2

   • Experience in efficiently using Auto Loader and Delta Live Tables for seamless data ingestion and transformation (an Auto Loader sketch follows this list)

   • Proficiency in building and optimizing query layers using Databricks SQL

   • Demonstrated experience integrating Databricks with Azure Synapse, ADLS Gen2, and Power BI for end-to-end analytics solutions

   • Prior experience in developing, optimizing, and deploying Power BI reports

   • Familiarity with modern CI/CD practices, especially in the context of Databricks and cloud-native solutions
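
As a concrete reference for the Auto Loader item above, the following is a minimal sketch of incremental file ingestion with Databricks Auto Loader into a bronze Delta table. The landing and bronze paths, file format, and trigger mode are assumptions for illustration only.

```python
# Minimal sketch of incremental ingestion with Databricks Auto Loader.
# Paths, schema location, and checkpoint location are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

landing_path = "abfss://raw@examplelake.dfs.core.windows.net/events/"           # hypothetical
bronze_path = "abfss://bronze@examplelake.dfs.core.windows.net/events_delta/"   # hypothetical

events = (spark.readStream
    .format("cloudFiles")                                   # Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", bronze_path + "_schema/")
    .load(landing_path))

(events.writeStream
    .format("delta")
    .option("checkpointLocation", bronze_path + "_checkpoint/")
    .outputMode("append")
    .trigger(availableNow=True)                              # process new files, then stop
    .start(bronze_path))
```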

Skills

Mandatory Skills: Azure Cloud Architecture, Cloud Architecture, Cloud Solution Architecture, Enterprise Cloud Architecture