Contract Role: GCP Data Architect in Secaucus, NJ (100% Remote)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Architect with 8+ years of data engineering experience, focusing on data lake and ETL pipeline design. It is a long-term remote contract with a competitive pay rate, requiring strong skills in GCP, SQL, and Python.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 16, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Secaucus, NJ
🧠 - Skills detailed
#Cloud #SQL (Structured Query Language) #Data Engineering #Leadership #Project Management #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Strategy #dbt (data build tool) #Data Analysis #Kafka (Apache Kafka) #Data Pipeline #API (Application Programming Interface) #Java #Base #Data Lineage #BigQuery #Data Access #GCP (Google Cloud Platform) #Dataflow #Apache Beam #Data Lake #Version Control #Data Aggregation #Scala #Data Quality #Data Architecture #GIT #Computer Science #Data Integration #Data Warehouse #Python
Role description

GCP Data Architect

Secaucus, NJ (Remote)

Long Term Contract

   • 8+ years of experience in data engineering or L3-level support in data analytics

General Description

   • Experience designing and building data lake and data warehouse ETL pipelines.

   • Hands-on experience with GCP data and analytics services: Cloud Dataproc, Cloud Dataflow, Cloud Dataprep, Apache Beam, Cloud Composer, and BigQuery.

   • Data Pipeline Development: Design, develop, and maintain ETL/ELT pipelines to ensure seamless data flow from various sources to our data warehouse using dbt or GCP BigQuery.

   • Data Modelling: Implement and manage data models in BigQuery to support business analytics and reporting needs.

   • Version Control: Use Git to manage changes in data pipelines, schemas, and related code.

   • Solid coding skills in languages such as SQL, Python, or Java

   • 8+ years of experience as a Data Engineer or in a similar role
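The ETL/ELT pipeline work described above can be sketched in plain Python. The record shape, region names, and the in-memory "warehouse" stand-in below are illustrative assumptions, not part of the posting; a real pipeline would materialize the aggregate as a dbt model or load it via the BigQuery client.

```python
from dataclasses import dataclass
from typing import Iterable

# Hypothetical source record; the fields are illustrative, not from the posting.
@dataclass(frozen=True)
class OrderEvent:
    order_id: str
    region: str
    amount_usd: float

def extract(raw_rows: Iterable[dict]) -> list[OrderEvent]:
    """Extract: parse raw source rows into typed records, skipping malformed ones."""
    events = []
    for row in raw_rows:
        try:
            events.append(OrderEvent(row["order_id"], row["region"], float(row["amount_usd"])))
        except (KeyError, ValueError):
            continue  # a production pipeline would route these to a dead-letter table
    return events

def transform(events: list[OrderEvent]) -> dict[str, float]:
    """Transform: aggregate revenue per region (the kind of model a dbt layer would own)."""
    totals: dict[str, float] = {}
    for e in events:
        totals[e.region] = totals.get(e.region, 0.0) + e.amount_usd
    return totals

def load(totals: dict[str, float], warehouse: dict) -> None:
    """Load: write the aggregate into a stand-in 'warehouse' table.
    A real implementation would call the BigQuery client here instead."""
    warehouse["revenue_by_region"] = totals
```

Chaining `load(transform(extract(rows)), warehouse)` mirrors the extract/transform/load flow the role describes; in an ELT variant the raw rows would land in BigQuery first and the transform would run there as SQL.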

Technical Requirements

   • Identify, create, maintain, and support data models and data aggregation models based on business requirements, and implement ETL for those models accordingly.

   • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.

   • Implement, establish standards for, and maintain the company data lake (UDP) and regional platforms, streaming data from Apache Kafka topics.

   • Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.

   • Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.

   • Write unit/integration tests, contribute to the engineering wiki, and document work.

   • Perform the data analysis required to troubleshoot and assist in resolving data-related issues.

   • Work closely with a team of frontend and backend engineers, product managers, and analysts.

   • Design data integrations and a data quality framework.

   • Design and evaluate open-source and vendor tools for data lineage.

   • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.

   • Plan, create, and maintain data architectures while keeping them aligned with business requirements.

   • Work closely with business and application delivery teams on new digital projects and implementations to understand the solution and the implemented data structures, so that all related data engineering work, including reports and data enhancements, can be delivered as part of the project.

   • Develop reports, dashboards, and datasets for business users on request.

   • Support existing reports, dashboards, and datasets, and maintain the report/dashboard/dataset repository.
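The data-quality monitoring called for above can be illustrated with a minimal batch check. The field names and thresholds are hypothetical; in practice these checks would typically run as scheduled assertions against BigQuery tables or through a dedicated data-quality framework.

```python
# Minimal data-quality check sketch: flag low row counts and high null rates.
# Thresholds and field names are illustrative assumptions, not from the posting.
def check_batch(rows: list[dict], required_fields: list[str],
                min_rows: int = 1, max_null_rate: float = 0.05) -> list[str]:
    """Return human-readable data-quality violations (an empty list means the batch passes)."""
    violations = []
    if len(rows) < min_rows:
        violations.append(f"row count {len(rows)} below minimum {min_rows}")
        return violations  # no point computing rates on an undersized batch
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            violations.append(f"field '{field}' null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return violations
```

Surfacing violations as a list, rather than raising on the first failure, lets a monitoring job report every problem in one alert to the stakeholders who depend on the data.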

Soft Skills

   • Excellent communication and presentation skills, with the ability to explain complex technical concepts to non-technical stakeholders.

   • Strong problem-solving and critical thinking skills.

   • Exceptional project management and organizational abilities.

   • Team collaboration and leadership skills.

   • Demonstrate a general knowledge of market trends and competition.

   • Be a strong team player.

   • Client-focused approach with a commitment to delivering exceptional customer service.

Certifications (Good to have)

   • Google Professional Data Engineer

   • Google Professional Cloud Architect

Educational Qualifications

   • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.