Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 6+ month contract, offering $65.00 - $70.00/hr. Based onsite hybrid in Miramar or Dallas, it requires expertise in Databricks, AWS, Python, and data integration, along with a minimum of 5 years of experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date discovered
April 15, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Miramar, FL
🧠 - Skills detailed
#Monitoring #AWS (Amazon Web Services) #Programming #Terraform #Delta Lake #GitHub #Cloud #Informatica #dbt (data build tool) #Security #VPC (Virtual Private Cloud) #Datadog #Python #AWS Lambda #Lambda (AWS Lambda) #Jenkins #API (Application Programming Interface) #ETL (Extract, Transform, Load) #ML (Machine Learning) #SQL (Structured Query Language) #Infrastructure as Code (IaC) #AWS SageMaker #Prometheus #Data Framework #Documentation #Data Engineering #Database Systems #Fivetran #Logging #Amazon CloudWatch #Computer Science #Redshift #Databricks #Data Ingestion #SageMaker #AWS Glue
Role description

Title:  Senior Data Engineer

Location: Onsite Hybrid (Miramar or Dallas)

Duration: 6+ months

Compensation: $65.00 - $70.00/hr

Work Requirements: US Citizens, GC Holders, or those Authorized to Work in the U.S.

Senior Data Engineer

This role requires a highly skilled Senior Data Engineer to serve as a hands-on Databricks Platform Administrator, responsible for the management, optimization, and maintenance of Databricks environments on AWS. The ideal candidate will have extensive experience in data engineering, programming, and cloud-based integration platforms, ensuring seamless data flow and interoperability between the various systems and applications within our organization.

Primary Responsibilities:

   • Design and build large-scale application development projects and programs with a hands-on approach.

   • Ensure the technical validity of solutions and actively drive their implementation.

   • Develop and maintain detailed business and technical process documentation and training materials; build and code frameworks.

   • Review problem logs, identify recurring issues, implement long-term solutions, and automate them where possible.

   • Provide hands-on development, administration, design, and performance tuning.

Minimum Qualifications:

   • 5+ years of hands-on experience with a BS or MS in Computer Science or equivalent education and experience.

   • 3+ years of hands-on experience in framework development and building integration layers to solve complex business use cases, with a strong emphasis on Databricks and AWS.

Technical Skills:

   • Strong hands-on coding skills in Python.

   • Extensive hands-on experience with Databricks for developing integration layer solutions.

   • AWS Data Engineer or Machine Learning certification or equivalent hands-on experience with AWS Cloud services.

   • Proficiency in building data frameworks on AWS, including hands-on experience with tools like AWS Lambda, AWS Glue, AWS SageMaker, and AWS Redshift.

   • Hands-on experience with cloud-based data warehousing and transformation tools such as Delta Lake tables, dbt, and Fivetran.

   • Familiarity with machine learning and open-source machine learning ecosystems.

   • Hands-on experience with integration tools and frameworks such as Apache Camel and MuleSoft.

   • Solid understanding of API design principles, RESTful services, and message queuing technologies.

   • Familiarity with database systems and SQL.

   • Hands-on experience with Infrastructure as Code (IaC) tools like Terraform and AWS CloudFormation.

   • Proficiency in setting up and managing Databricks workspaces, including VPC management, security groups, and VPC peering.

   • Hands-on experience with CI/CD pipeline management using tools like AWS CodePipeline, Jenkins, or GitHub Actions.

   • Knowledge of monitoring and logging tools such as Amazon CloudWatch, Datadog, or Prometheus.

   • Hands-on experience with data ingestion and ETL processes using AWS Glue, Databricks Auto Loader, and Informatica.

Our benefits package includes:

   • Comprehensive medical benefits

   • Competitive pay

   • 401(k) Retirement plan

   • …and much more!

About INSPYR Solutions

Technology is our focus and quality is our commitment. As a national expert in delivering flexible technology and talent solutions, we strategically align industry and technical expertise with our clients’ business objectives and cultural needs. Our solutions are tailored to each client and include a wide variety of professional services, project, and talent solutions. By always striving for excellence and focusing on the human aspect of our business, we work seamlessly with our talent and clients to match the right solutions to the right opportunities. Learn more about us at inspyrsolutions.com.

INSPYR Solutions provides Equal Employment Opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics. In addition to federal law requirements, INSPYR Solutions complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities.