Senior Big Data Engineer

This role is for a Senior Big Data Engineer with 5+ years of experience in big data technologies, particularly Databricks. Contract length and pay rate are unspecified. The position is on-site in New Jersey; only local candidates will be considered.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 14, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New Jersey, United States
🧠 - Skills detailed
#Databases #Databricks #VPC (Virtual Private Cloud) #AWS (Amazon Web Services) #PySpark #MLflow #Redshift #Airflow #IAM (Identity and Access Management) #RDS (Amazon Relational Database Service) #EC2 #AI (Artificial Intelligence) #Snowflake #Python #S3 (Amazon Simple Storage Service) #Retool #Data Pipeline #Apache Spark #Data Engineering #Automation #Terraform #Cloud #Spark (Apache Spark) #Kafka (Apache Kafka) #NoSQL #ML (Machine Learning) #Security #Big Data #Jenkins
Role description

Seeking a Senior Big Data Engineer with strong technical skills and experience working with and supporting multiple engineering teams.

• Must be able to attend an on-site interview in NJ

• Local candidates only

• Requirements:
• Must have heavy hands-on Databricks experience
• Experience building Big Data/ML/AI applications and optimizing data pipelines, architectures, and data sets
• 5+ years of technical experience with big data technologies, including the following software/tools:
• Experience with big data tools: Apache Spark, Databricks, Parquet/Delta, PySpark, SparkSQL, Spark Streaming, Kafka/Kinesis, S3, Glue
• Experience with Databricks using Unity Catalog and MLflow is a big plus
• Experience with relational/NoSQL databases or MPP databases such as Snowflake and Redshift
• Experience with data pipeline and workflow management tools: Step Functions, Airflow
• Experience with AWS cloud services: EC2, EMR, RDS, Redshift, IAM, Security Groups, VPC, etc.
• Experience developing in Python and notebooks/IDEs
• Experience with automation: Jenkins CI/CD, Terraform, CDK, Boto3
