Databricks Architect

This role is for a "Databricks Architect" with a contract length of over 6 months, offering a remote work location. Key skills include Azure cloud architecture, healthcare data modeling (HL7/FHIR), Python/PySpark/Scala, and data pipeline management. A bachelor's degree and 6 years of experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 19, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Azure cloud #Storage #Linux #Databricks #Data Science #Data Lake #Spark (Apache Spark) #Snowflake #Data Warehouse #Data Engineering #Programming #Scala #Data Analysis #Azure #SQL (Structured Query Language) #Agile #ADF (Azure Data Factory) #PySpark #Data Quality #Data Ingestion #Data Modeling #Project Management #Kafka (Apache Kafka) #Data Integration #Azure Data Factory #Data Processing #Airflow #DevOps #Azure Databricks #Batch #FHIR (Fast Healthcare Interoperability Resources) #Metadata #BI (Business Intelligence) #Microsoft Power BI #Cloud #Data Pipeline #Python #Security
Role description

Databricks Architect / Sr. Data Engineer - Fully Remote - Healthcare Background a Must

Full-time only (no C2C or C2H)

Overall Responsibility:

Responsible for developing, implementing, and operating stable, scalable, low-cost solutions to source data from client systems into the data lake, data warehouse, and end-user-facing BI applications. Responsible for the ingestion, transformation, and integration of data to provide a platform that supports data analysis and enrichment, and for making data operationally available for analysis. The Data Engineer will be a data pipeline builder and data wrangler who supports application developers, database architects, data analysts, and data scientists on data initiatives, and will ensure an optimal data delivery architecture that is consistent across our ongoing projects.

Essential Functions:

Build and maintain scalable, automated data pipelines. Support critical data pipelines with a highly scalable distributed architecture, including data ingestion (streaming, events, and batch), data integration, and data curation.
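
For illustration only, a minimal PySpark sketch of this kind of pipeline: a batch load of a daily extract plus a streaming ingest from Kafka into Delta tables. It assumes a Databricks/Delta environment, and all paths, topic, broker, and table names are hypothetical placeholders rather than details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

# Batch ingestion: load a daily extract from the landing zone into the lake.
batch_df = (
    spark.read.format("parquet")
    .load("abfss://landing@exampleaccount.dfs.core.windows.net/claims/2025-02-19/")
)
(batch_df
    .withColumn("ingested_at", F.current_timestamp())
    .write.format("delta")
    .mode("append")
    .saveAsTable("raw.claims"))

# Streaming ingestion: consume events from Kafka and append them continuously.
stream_df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "hl7-events")                  # placeholder topic
    .load()
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp"))

(stream_df.writeStream
    .format("delta")
    .option("checkpointLocation", "/checkpoints/hl7_events")
    .outputMode("append")
    .toTable("raw.hl7_events"))
```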

Deploy, automate, maintain, and manage Azure cloud-based production systems to ensure the availability, performance, scalability, and security of production systems.

Good architectural understanding to ensure customer success when building new solutions and migrating existing data applications to the Azure platform.

Conduct full technical discovery, identifying pain points, business and technical requirements, and "as is" and "to be" scenarios.

Design and arrangement of scalable, highly available, and fault-tolerant systems on the Azure platform.

Ownership of and responsibility for end-to-end design, development, testing, and release of key components.

Understand and implement best practices in the management of data, including master data, reference data, metadata, data quality, and lineage. Experience with code versioning tools and a command of configuration management concepts and tools, including DevOps.
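
As a sketch of the kind of data-quality gate implied here: basic null and duplicate-key checks run before a batch is promoted from the raw to the curated layer. Table and column names (raw.claims, claim_id, curated.claims) are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-sketch").getOrCreate()

claims = spark.table("raw.claims")

# Count records that would break downstream joins or aggregations.
null_keys = claims.filter(F.col("claim_id").isNull()).count()
dupe_keys = (
    claims.groupBy("claim_id").count().filter(F.col("count") > 1).count()
)

if null_keys or dupe_keys:
    # Fail fast so bad batches never reach downstream consumers.
    raise ValueError(
        f"Data-quality check failed: {null_keys} null keys, {dupe_keys} duplicated keys"
    )

claims.write.format("delta").mode("overwrite").saveAsTable("curated.claims")
```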

Other duties as assigned.

Experience:

Expert-level SQL knowledge and experience.

Experience with Azure cloud architecture and solutions

Experience with Azure Data Lake Store, Blob Storage, VMs, Data Factory, SQL Data Warehouse, Azure Databricks, HDInsight, etc.

Expert-level experience with Python/PySpark/Scala and object-oriented programming.

Experience with streaming integration and cloud-based data processing systems such as Kafka and Databricks.

Experience with healthcare data modeling standards, such as HL7 or FHIR, is mandatory.

Experience in processing healthcare data sets (medical records, claims, clinical data, etc.) is mandatory.
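
For illustration, a minimal sketch of flattening FHIR Patient resources (newline-delimited JSON) into a tabular form for analysis with PySpark. The source path, table name, and field selection are assumptions, not requirements from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fhir-flatten-sketch").getOrCreate()

# FHIR Patient resources exported as JSON; path is a placeholder.
patients = spark.read.json("abfss://raw@exampleaccount.dfs.core.windows.net/fhir/Patient/")

flat = patients.select(
    F.col("id").alias("patient_id"),
    F.col("gender"),
    F.col("birthDate").alias("birth_date"),
    # FHIR allows multiple names per patient; this sketch keeps only the first entry.
    F.col("name").getItem(0).getField("family").alias("family_name"),
    F.expr("concat_ws(' ', name[0].given)").alias("given_names"),
)

flat.write.format("delta").mode("overwrite").saveAsTable("curated.patient_dim")
```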

Hands-on knowledge of cloud-based data warehouse solutions like Azure & Snowflake.

Experience with data pipeline and workflow management tools: Azure Data Factory, Airflow, etc.

Experience with Oracle and Microsoft SQL Server database systems.
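
As a sketch of the workflow-orchestration tooling mentioned above, a minimal Airflow DAG that triggers an existing Databricks job nightly. It assumes Airflow 2.4+ with the apache-airflow-providers-databricks package installed; the job id, connection id, and schedule are hypothetical, and Azure Data Factory could fill the same role with its own triggers.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="nightly_claims_ingestion",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",  # run daily at 02:00
    catchup=False,
) as dag:
    # Trigger an existing Databricks job that runs the ingestion pipeline.
    run_ingestion = DatabricksRunNowOperator(
        task_id="run_databricks_ingestion",
        databricks_conn_id="databricks_default",
        job_id=12345,  # placeholder Databricks job id
    )
```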

Experience working within Agile methodologies.

Experience with Microsoft Windows and Linux virtual servers.

Moderate skill in Power BI.

Strong project management skills.

Strong problem-solving, decision-making and analytical skills.

Excellent interpersonal and organizational skills - a team player who can effectively partner with all levels of the company.

Detail oriented and organized.

Ability to handle numerous assignments simultaneously.

Ability to work independently and as part of a team.

Requirements

Minimum Education/Experience

Bachelor's degree (BA or BS) from an accredited college or university, plus a minimum of six (6) years of experience in the specific or related field.