
Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a 6-month contract, offering a pay rate of "$X/hour." The position is remote, requiring 5+ years of IT experience, 3+ years as a Data Engineer, and proficiency in GCP technologies.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 1, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Nashville, TN
🧠 - Skills detailed
#Agile #Programming #AWS (Amazon Web Services) #BigQuery #Data Science #Python #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #SQL (Structured Query Language) #GCP (Google Cloud Platform) #MongoDB #Terraform #Dataflow #GIT #HBase #SQL Server #Azure #NoSQL #NLP (Natural Language Processing) #Deployment #Cloud #FHIR (Fast Healthcare Interoperability Resources) #Jenkins #Scala #GitHub #Hadoop #Logging #Mercurial #Teradata #Databases #Kafka (Apache Kafka) #MS SQL (Microsoft SQL Server) #RDBMS (Relational Database Management System) #Maven #Spark (Apache Spark) #Computer Science #Java #Data Analysis #Unix #JSON (JavaScript Object Notation) #Data Engineering #Linux #Oracle #Azure cloud
Role description

   • We are not in a position to sponsor candidates for employment for this position nor can we work with anyone in a corp-to-corp arrangement. W2 only! No exceptions!

Senior Data Engineer

The Senior Data Engineer is a key development resource responsible for designing, coding, testing, implementing, documenting, and maintaining NextGen solutions for GCP Cloud enterprise data initiatives. This role involves close collaboration with data teams, often within a matrixed environment as part of a larger project team. Given the rapidly evolving nature of GCP/Hadoop technology, staying informed about technological advancements and effectively applying new innovations is essential.

Key Responsibilities:

   • Analyze business requirements, perform design tasks, construct, test, and implement solutions with minimal supervision.

   • Build and support a GCP-based ecosystem for enterprise-wide analysis of structured, semi-structured, and unstructured data.

   • Integrate new data sources into GCP, transform and load data into databases, and manage data transfers between clusters (an illustrative sketch follows this list).

   • Develop a deep understanding of relevant product areas, codebases, and systems.

   • Demonstrate proficiency in data analysis, programming, and software engineering.

   • Collaborate with the Lead Architect and Product Owner to define, design, and build new features and improve existing products.

   • Produce high-quality code with good test coverage, using modern abstractions and frameworks.

   • Work independently, completing tasks on schedule by exercising strong judgment and problem-solving skills.

   • Collaborate closely with team members to execute development initiatives using Agile practices and principles.

   • Participate in change management, configuration, administration, and maintenance of deployment processes and systems.

   • Prioritize workload effectively to meet deadlines and work objectives in a rapidly changing business environment.

   • Work collaboratively with Data Scientists and business and IT leaders to understand their needs and use cases.

   • Engage in technical group discussions and adopt new technologies to improve development and operations.
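
A minimal, purely illustrative sketch of the kind of ingestion work described in the "integrate new data sources" item above: loading newline-delimited JSON files from Cloud Storage into BigQuery with the Python client library. The project, bucket, dataset, and table names are hypothetical and stand in for whatever the actual environment uses.

    from google.cloud import bigquery

    # Hypothetical project; in practice this comes from the deployment environment.
    client = bigquery.Client(project="example-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,                                           # infer the schema from the files
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,  # append to the existing table
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/landing/encounters/*.json",  # hypothetical source files
        "example-project.analytics.encounters",           # hypothetical destination table
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes

    table = client.get_table("example-project.analytics.encounters")
    print(f"Destination table now has {table.num_rows} rows.")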

Experience:

   • 5+ years of relevant IT work experience.

   • 3+ years of experience as a Data Engineer.

Qualifications:

   • Bachelor’s degree in Computer Science or a related discipline.

Required Skills:

   • Strong understanding of best practices and standards for GCP data process design and implementation.

   • 2+ years of hands-on experience with the GCP platform and familiarity with components such as the following (an illustrative sketch follows the Required Skills list):

      • Cloud Run, GKE, Cloud Functions

      • Spark Streaming, Kafka, Pub/Sub

      • Bigtable, Firestore, Cloud SQL, Cloud Spanner

      • HL7, FHIR, JSON, Avro, Parquet

      • Python, Java, Terraform

      • BigQuery, Dataflow, Data Fusion

      • Cloud Composer, DataProc, CI/CD, Cloud Logging

      • Vertex AI, NLP, GitHub

   • Ability to multitask and balance competing priorities.

   • Strong problem-solving skills and the ability to impose order in a fast-changing environment.

   • Excellent verbal, written, and interpersonal skills, with a desire to work in a highly matrixed, team-oriented environment.
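
As a purely illustrative sketch of the streaming components listed above (Pub/Sub, Dataflow, BigQuery), the Apache Beam pipeline below reads JSON messages from a Pub/Sub topic and appends them to a BigQuery table. The topic, table, schema, and field names are hypothetical; a real job would run on the Dataflow runner with project-specific options.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Streaming mode; a real deployment would also set the runner, project,
    # region, and temp_location options.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical topic; messages arrive as UTF-8 encoded JSON bytes.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/patient-events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Hypothetical destination table and schema.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.patient_events",
                schema="patient_id:STRING,event_type:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )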

Preferred Skills:

   • Experience in the healthcare domain and with patient data.

   • Experience with AWS or Azure cloud platforms.

   • Familiarity with hardware and operating systems such as Linux and UNIX, and with distributed, highly scalable processing environments.

   • Knowledge of databases including RDBMS (MS SQL Server, Teradata, Oracle) and NoSQL (HBase, Cassandra, MongoDB, in-memory, columnar, and other emerging technologies).

   • Experience with build systems (TFS, Maven, Ant), source control systems (Git, Mercurial), and continuous integration systems (Jenkins, Bamboo).

Certifications (a plus, but not required):

   • Google Cloud Professional Data Engineer

Other Requirements:

   • Prolonged sitting at a computer.

   • Ability to participate in after-hours support.