GCP Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Architect; the contract length and pay rate are unspecified. It requires 10+ years of experience in data architecture, expertise in GCP services, strong programming skills, and familiarity with big data technologies.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 15, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#DevOps #Monitoring #Grafana #Data Pipeline #Airflow #Kafka (Apache Kafka) #Programming #Storage #Big Data #Cloud #Dataflow #Apache Kafka #Data Processing #Python #Data Storage #Docker #Data Architecture #Apache Airflow #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Scala #Data Quality #Java #Prometheus #Data Engineering #Logging #Data Governance #Kubernetes #Computer Science #BigQuery #Batch
Role description

Required Skills: GCP Data Architect

Job Description

Key Responsibilities:

   • Design and architect scalable telemetry data storage and analytics systems using GCP services.

   • Define and implement data architecture strategies for real-time and batch data processing.

   • Ensure optimal performance, scalability, and cost-efficiency of data storage and processing solutions.

   • Develop and enforce data governance policies and best practices.

   • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.

   • Provide architectural guidance and mentorship to data engineering teams.

   • Set up monitoring, alerting, and automated reporting systems to ensure data quality and system reliability.

   • Evaluate and recommend new technologies and tools to enhance data architecture.

Required Skills and Qualifications:

   • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

   • 10+ years of experience in data architecture, big data technologies, and analytics.

   • Strong expertise in Google Cloud Platform (GCP) services, including but not limited to BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions.

   • Proficiency in event streaming platforms such as Apache Kafka or Event Hub.

   • Experience with data pipeline orchestration tools like Apache Airflow or Google Cloud Composer.

   • Strong programming skills in languages such as Python, Java, or Scala.

   • Solid understanding of SQL and experience with database technologies.

   • Knowledge of monitoring and logging tools like Prometheus, Grafana, or Stackdriver.

   • Excellent problem-solving skills and the ability to work independently and as part of a team.

   • Strong communication skills and the ability to convey complex technical concepts to non-technical stakeholders.

Preferred Skills:

   • Familiarity with containerization and orchestration tools like Docker and Kubernetes.

   • Experience with CI/CD pipelines and DevOps practices.

   • Experience with relevant projects at a large retail company.