Data Engineer - GCP

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer - GCP in Alpharetta, GA, with a contract length of "unknown" and a pay rate of "unknown." Requires 5+ years in data solutions, expertise in GCP, Python, SQL, and strong data modeling skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
March 28, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Alpharetta, GA
🧠 - Skills detailed
#Data Engineering #Dataflow #Data Pipeline #Data Security #Monitoring #BigQuery #Logging #Visualization #Computer Science #Database Systems #Scala #Google Cloud Storage #NoSQL #Programming #Storage #SQL Server #Storytelling #Data Modeling #IAM (Identity and Access Management) #DevOps #GCP (Google Cloud Platform) #Spark (Apache Spark) #SQL (Structured Query Language) #Mathematics #Oracle #Security #YAML (YAML Ain't Markup Language) #Cloud #Data Processing #Azure #Data Architecture #Agile #Azure DevOps #Code Reviews #Data Design #Scrum #ETL (Extract, Transform, Load) #Terraform #ML (Machine Learning) #Python
Role description

Exciting contract opportunity for a Data Engineer in Alpharetta, GA. In this role you will work three days per week in the office (Monday, Tuesday, Thursday). As a Data Engineer, you will have the opportunity to design and execute vital projects such as re-platforming our data services in the cloud and on-premises and delivering real-time streaming capabilities to our business applications. You will bring a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management. Our client’s mission is to design and implement a data and analytics platform and infrastructure that enables a future-state analytics lifecycle: data monetization opportunities, data acquisition, analysis and feature engineering, model training, impact analysis, reporting, predictive and quantitative analysis, and monitoring.

What You Get to Do:

   • An ideal candidate is intellectually curious, has a solution-oriented attitude, and enjoys learning new tools and techniques.

   • You will have the opportunity to design and execute vital projects such as re-platforming our data services in the cloud and on-premises, and delivering real-time streaming capabilities to our business applications.

   • Bring a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management.

   • Hold accountability for the quality, usability, and performance of the solutions.

   • Lead design sessions and code reviews to elevate the quality of engineering across the organization.

   • Design and develop the data foundation on a cloud data platform using GCP tools and services, e.g. Pub/Sub, BigQuery, Cloud SQL, Bigtable, BigLake, Dataform, Dataflow, Datastream, Google Cloud Storage, Cloud Composer (DAGs), Cloud Run, REST APIs, Azure DevOps (ADO) Git repos, CI/CD pipelines, Secret Manager, Cloud IAM, and Terraform/YAML.

   • Build scalable ETL pipelines using Python (see the illustrative sketch after this list).

   • Perform multi-level data curation and modeling.

   • Data design and architecture.

   • Build and maintain complete CI/CD pipelines using Azure DevOps and Terraform/Terragrunt.

   • Increase the efficiency and speed of complicated data processing systems.

   • Collaborate with our Architecture group to recommend and ensure the optimal data architecture.

   • Analyze data gathered during tests to identify strengths and weaknesses of ML models.

   • Collaborate across all functional areas to translate complex business problems into optimal data modeling and analytical solutions that drive business value.

   • Lead the improvements and advancement of reporting and data capabilities across the company, including analytics skills, data literacy, visualization, and storytelling.

   • Develop a certified vs. self-service analytics framework for the organization.

   • Be highly skilled in RDBMS (Oracle, SQL Server), NoSQL databases, and messaging (publish/subscribe) systems.

   • Apply extensive Python coding skills, including a strong understanding of data modeling and data engineering.
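
To ground the GCP stack named above, here is a minimal, illustrative Python sketch of the kind of streaming ETL these bullets describe: pulling JSON messages from a Pub/Sub subscription and streaming them into a BigQuery table. This is an assumed example, not the client's pipeline; the project, subscription, and table names are placeholders.

```python
# Illustrative sketch only, not the client's actual pipeline.
# Streams JSON messages from a Pub/Sub subscription into BigQuery.
# All resource names are hypothetical placeholders.
import json
from concurrent import futures

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "example-project"               # placeholder
SUBSCRIPTION_ID = "orders-sub"               # placeholder
TABLE_ID = "example-project.sales.orders"    # placeholder

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def handle(message):
    """Decode one Pub/Sub message and stream it into BigQuery."""
    row = json.loads(message.data.decode("utf-8"))
    # insert_rows_json performs a streaming insert and returns a list
    # of per-row errors (empty on success).
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        message.nack()  # let Pub/Sub redeliver on failure
    else:
        message.ack()

streaming_pull = subscriber.subscribe(sub_path, callback=handle)
try:
    streaming_pull.result(timeout=60)  # run for 60 seconds in this demo
except futures.TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()  # block until shutdown completes
```

In practice this logic would more likely run inside Dataflow or Cloud Run with explicit schema management and dead-lettering, but the sketch shows the basic Pub/Sub-to-BigQuery flow the role centers on.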

What You Bring to the Table:

   • Bachelor’s degree in Computer Science, Engineering, Mathematics, Sciences, or a related field of study from an accredited college or university; a combination of experience and/or education will be considered.

   • Ideally 5+ years of experience developing data and analytics solutions, including approximately 4+ years in data modeling and architecture.

   • Expertise in programming languages including Python and SQL.

   • Familiarity with software development methodologies such as Agile or Scrum.

   • Critical thinking.

   • Leveraging cloud-native services for data processing and storage.

   • Storage – BigQuery, GCS, Cloud SQL, Bigtable, BigLake

   • Event processing – Pub/Sub, EventArc

   • Data pipeline and analytics – Dataflow, Dataform, Cloud Run, Cloud Run functions, Datastream, Cloud Scheduler, Workflows, Composer, Dataplex, Azure DevOps (ADO) Git repos, CI/CD pipelines, Terraform/YAML

   • Security – Secret Manager, Cloud IAM

   • Others – Artifact Registry, Cloud Logging, Cloud Monitoring

   • Work with distributed data processing frameworks like Spark (a brief PySpark sketch follows this list).

   • Strong knowledge of database systems and data modeling techniques.

   • Ability to adapt to evolving technologies and business requirements.

   • Ability to explain technical concepts to nontechnical business leaders.

   • Monitor system performance and troubleshoot issues.

   • Ensure data security.

   • Proficiency with the relevant technical skills, cloud tools, and technologies.
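
As a brief illustration of the Spark requirement, the assumed PySpark sketch below aggregates daily order totals from Parquet files on Google Cloud Storage. The bucket path and column names are hypothetical, and reading gs:// paths assumes the GCS connector is on the Spark classpath (as it is on Dataproc).

```python
# Illustrative PySpark sketch: aggregate daily order totals from
# Parquet files on GCS. Paths and column names are placeholders;
# gs:// access assumes the GCS connector (bundled on Dataproc).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-order-totals").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/orders/")  # placeholder

daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))  # assumed timestamp column
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),         # assumed amount column
    )
    .orderBy("order_date")
)

daily_totals.write.mode("overwrite").parquet(
    "gs://example-bucket/curated/daily_totals/"        # placeholder
)
spark.stop()
```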

Technical Skills:

   • Must Have

   • GCP (Google Cloud Platform)

   • ETL pipeline using Python

   • Expertise in programming languages including Python and SQL.

Got Extra to Bring?

   • GCP – Professional Data Engineer Certification

   • Documenting all steps in the development process

   • Manage the data collection process, providing interpretation and recommendations to management.