DevOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a 6-month, fully remote DevOps Engineer contract paying $55-65/hr. It requires 2+ years of experience in Data Engineering, Operations, or SRE roles, proficiency with Terraform, CI/CD, and Apache Airflow, and hands-on work with cloud platforms such as AWS or Google Cloud.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$440-$520 (derived from the $55-65/hr rate over an 8-hour day)
🗓️ - Date discovered
April 5, 2025
🕒 - Project duration
6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New York City Metropolitan Area
🧠 - Skills detailed
#Airflow #Spark (Apache Spark) #DevOps #Apache Spark #Computer Science #Databricks #Kubernetes #Grafana #Scala #Automation #HBase #Terraform #Batch #Infrastructure as Code (IaC) #GitHub #Snowflake #Apache Beam #AWS (Amazon Web Services) #Observability #Apache Airflow #Big Data #Data Pipeline #Monitoring #Cloud #Python #SQL (Structured Query Language) #Data Engineering #Deployment #R #Java
Role description

DevOps Engineer

New York, New York (100% remote)

6-month Contract

$55-65/hr

We are seeking a highly skilled DevOps Engineer to join a Fortune 50 Broadcast Media & Entertainment leader based in New York, New York. As the DevOps Engineer, you will be responsible for building and maintaining scalable, reliable, and efficient data infrastructure. You will work closely with data engineers, developers, and operations teams to keep CI/CD pipelines running smoothly, automate data workflows, and support robust monitoring and observability practices.

Minimum Qualifications:

   • 2+ years of relevant experience in Data Engineering, Operations, or SRE roles.

   • Experience with Terraform for infrastructure as code and automation.

   • Proven experience building and maintaining CI/CD pipelines using GitHub Actions, Concourse, or similar tools.

   • Hands-on experience with Apache Airflow for managing data workflows.

   • Proficiency in Python, Java, Scala, R, or SQL for automating data processes and workflows.

   • Familiarity with real-time and batch data pipelines, especially in the context of Big Data Engineering (a minimal batch example follows this list).

   • Practical experience building distributed, scalable, and highly available systems using Google Cloud or AWS.

   • Experience with Kubernetes, Apache Beam, Apache Spark, Snowflake, Databricks, or similar tools.

   • Strong understanding of SRE best practices for system observability, including tools like Grafana.

   • Bachelor's degree in Computer Science, Engineering, Physics, or a related quantitative field (or equivalent industry experience).

   • Excellent communication skills, with the ability to collaborate effectively across cross-functional teams.
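
For context on the batch-pipeline qualification above, here is a minimal sketch of a daily batch job in PySpark (Python being one of the languages the role calls for), assuming Spark 3.x. The bucket paths, column names, and aggregation logic are hypothetical placeholders, not the client's actual pipeline.

# A daily batch aggregation, sketched in PySpark (Spark 3.x assumed).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-batch-example").getOrCreate()

# Read one day's partition of raw events (path is a placeholder).
events = spark.read.parquet("s3://example-bucket/raw/events/dt=2025-04-05/")

# Count events per user -- a stand-in for real business logic.
daily_counts = events.groupBy("user_id").agg(F.count("*").alias("event_count"))

# Write the result for downstream consumers (path is a placeholder).
daily_counts.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/daily_counts/dt=2025-04-05/"
)

spark.stop()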

Responsibilities:

   • Implement and maintain infrastructure as code (IaC) using HashiCorp Terraform for scalable, reliable infrastructure management.

   • Develop and maintain CI/CD pipelines, utilizing modern tools like GitHub Actions or Concourse to ensure seamless code deployment and integration.

   • Work with Apache Airflow to design and manage graph-based (DAG) data workflows, automating and optimizing data pipelines for both real-time and batch processing (see the DAG sketch after this list).

   • Write clean, efficient, and reusable code in Python, Java, Scala, R, SQL, or similar languages to automate data processes, analysis, and workflows.

   • Design and implement scalable, distributed, and highly available systems, with hands-on experience in Google Cloud and/or AWS platforms.

   • Collaborate with teams to integrate Apache Spark, Apache Beam, Snowflake, Databricks, and other tools into the data pipeline architecture.

   • Apply SRE best practices to ensure the observability and reliability of data pipelines, using monitoring tools such as Grafana.

   • Develop and maintain monitoring and alerting systems for real-time data pipeline performance and operational health (see the instrumentation sketch after this list).

   • Continuously improve the automation, scalability, and efficiency of data engineering processes.

   • Collaborate with cross-functional teams to meet the operational and development needs of the business.
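
To make the Airflow responsibility concrete, here is a minimal DAG sketch using the TaskFlow API, assuming Airflow 2.4 or later. The DAG name, schedule, and extract/transform/load bodies are illustrative placeholders only.

# A minimal TaskFlow DAG (Airflow 2.4+ assumed); tasks are placeholders.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_batch_pipeline():
    @task
    def extract():
        # Placeholder: pull records from a source system.
        return [1, 2, 3]

    @task
    def transform(records):
        # Placeholder: apply business logic.
        return [r * 2 for r in records]

    @task
    def load(records):
        # Placeholder: write to a warehouse such as Snowflake or Databricks.
        print(f"loaded {len(records)} records")

    load(transform(extract()))

example_batch_pipeline()

Airflow infers the task dependencies (extract → transform → load) from the data passed between the decorated functions, which is what makes the workflow graph-based.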
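
Similarly, for the monitoring and alerting responsibility, here is a minimal instrumentation sketch using the prometheus_client library; Grafana would chart these metrics once Prometheus scrapes the exposed endpoint. The metric names, port, and simulated work loop are assumptions made for the sketch.

# Expose pipeline health metrics for Prometheus to scrape
# (port and metric names are placeholders).
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

RECORDS_PROCESSED = Counter(
    "pipeline_records_processed_total", "Records processed by the pipeline"
)
BATCH_DURATION = Gauge(
    "pipeline_batch_duration_seconds", "Duration of the last batch run"
)

if __name__ == "__main__":
    start_http_server(8000)  # serves /metrics on port 8000
    while True:
        start = time.monotonic()
        processed = random.randint(100, 1000)  # stand-in for real work
        RECORDS_PROCESSED.inc(processed)
        BATCH_DURATION.set(time.monotonic() - start)
        time.sleep(60)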

What’s in it for you?

   • Work with a globally recognized media streaming organization at the forefront of innovation.

   • Collaborate with high-level business professionals and technical teams, gaining valuable cross-functional experience.

   • Opportunity to accelerate your career in a fast-paced, evolving industry.