

Databricks Engineer
Title: Databricks Engineer
Location: 100% remote
Duration: 12 months
Must have: Delta Lake, Databricks SQL, MLflow, data warehousing, SSIS, SQL Server, and strong experience with APIs; Databricks on AWS/Azure cloud infrastructure and functions; CI/CD on Databricks using tools such as Bitbucket, GitHub Actions, and the Databricks CLI; strong proficiency in Python and SQL; and excellent oral and written communication skills.
Requirements and Responsibilities:
• 7+ years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure and functions.
• 5+ years of experience with Power BI (PBI) and data warehousing, including performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Experience with AWS services (e.g., S3, Athena, Glue, Lambda) preferred.
• Deep understanding of data warehousing concepts (dimensional/star schema, SCD Type 2, Data Vault, denormalized models, One Big Table) and experience implementing highly performant data ingestion pipelines from multiple sources (a minimal SCD Type 2 sketch follows this list).
• Strong proficiency in Python and SQL.
• Deep understanding of Databricks platform features (Delta Lake, Databricks SQL, MLflow)
• Experience with CI/CD on Databricks using tools such as Bitbucket, GitHub Actions, and the Databricks CLI (an illustrative deploy step follows this list).
• Integrating end-to-end Databricks pipelines that move data from source systems to target data repositories, ensuring data quality and consistency are maintained at all times.
• Working within an Agile delivery/DevOps methodology to deliver proofs of concept and production implementations in iterative sprints.
• Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), MLflow.
• Basic working knowledge of API- or stream-based data extraction processes, such as the Salesforce API and Bulk API (an extraction sketch follows this list).
• Understanding of Data Management principles (quality, governance, security, privacy, life cycle management, cataloguing)
• Excellent problem-solving and analytical skills
• Able to work independently
• Excellent oral and written communication skills
• Nice to have: Databricks certifications and the AWS Certified Solutions Architect certification.
• Nice to have: experience building data pipelines from business applications such as Salesforce, Marketo, NetSuite, and Workday.
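
As a rough illustration of the Delta Lake and SCD Type 2 skills listed above, here is a minimal PySpark sketch. The table and column names (dim_customer, staging_customers, customer_id, email) are hypothetical, and a full SCD Type 2 flow would also insert the new version of each changed row.

```python
# Minimal SCD Type 2 close-out against a Delta dimension table.
# Table and column names are hypothetical; on Databricks, `spark` is predefined.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    MERGE INTO dim_customer AS tgt
    USING staging_customers AS src
      ON tgt.customer_id = src.customer_id AND tgt.is_current = true
    WHEN MATCHED AND tgt.email <> src.email THEN
      UPDATE SET tgt.is_current = false, tgt.end_date = current_date()
    WHEN NOT MATCHED THEN
      INSERT (customer_id, email, start_date, end_date, is_current)
      VALUES (src.customer_id, src.email, current_date(), NULL, true)
""")
# Note: a second INSERT (or the usual union-on-a-null-merge-key trick) is
# still needed to add the new version of the rows closed out above.
```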
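The CI/CD bullet above could translate into a deploy step like the following, assuming the legacy databricks-cli command set (the newer unified CLI and Databricks Asset Bundles use different commands); the notebook path and job ID are placeholders.

```python
# Deploy step a GitHub Actions job might run; paths and job ID are placeholders.
import subprocess

def run(cmd: list[str]) -> None:
    """Echo and run a CLI command, failing the pipeline on a non-zero exit."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Push the notebook source from the repo into the workspace.
run([
    "databricks", "workspace", "import",
    "etl/ingest_orders.py", "/Shared/etl/ingest_orders",
    "--language", "PYTHON", "--format", "SOURCE", "--overwrite",
])

# Trigger a smoke-test run of the job that wraps the notebook.
run(["databricks", "jobs", "run-now", "--job-id", "1234"])
```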
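For the Salesforce extraction bullet, a sketch using the third-party simple_salesforce package might look like this; the credentials and SOQL query are placeholders.

```python
# Bulk-extract Account records from Salesforce; credentials are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",
    password="***",
    security_token="***",
)

# A Bulk API query returns the matching records as a list of dicts, which
# could then be landed in cloud storage for a downstream Databricks ingest.
records = sf.bulk.Account.query("SELECT Id, Name, Industry FROM Account")
for rec in records[:5]:
    print(rec["Id"], rec["Name"])
```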
Required: Bachelor's degree in Computer Science or a related field
This role is open to W2 candidates or those seeking Corp-to-Corp employment. The salary range for this role is $106k-$115k; for Corp-to-Corp rates, please contact the recruiter. In addition to other benefits, Accion Labs offers a comprehensive benefits package, with Accion covering 65% of the medical, dental, and vision premiums for employees, their spouses, and dependent children enrolled in the Accion-provided plans.