
Data Engineer

This role is for a Senior Data Engineer in Sunnyvale, CA (hybrid, 3 days/week); the contract length and pay rate are unknown. It requires 10+ years in data warehousing, 6+ years with Hadoop/Spark, and 4+ years in GCP.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 11, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Sunnyvale, CA
🧠 - Skills detailed
#Datasets #BigQuery #Jira #Spark (Apache Spark) #Agile #Data Engineering #Data Warehouse #Apache Kafka #Computer Science #Docker #Physical Data Model #Apache Spark #Data Pipeline #BitBucket #Programming #GCP (Google Cloud Platform) #Perl #Apache Airflow #Jenkins #Python #Data Lake #Apache Hive #Automation #Automated Testing #Data Processing #Kubernetes #Scrum #Java #Big Data #Scripting #Airflow #Kafka (Apache Kafka) #Scala #Project Management #RDBMS (Relational Database Management System) #Hadoop
Role description

Direct Client

Location: Sunnyvale, CA (Hybrid, 3 days/week)

Role: Senior Data Engineer

Job Summary:

We are seeking a Senior Data Engineer with expertise in designing and developing big data applications using open-source technologies. This role requires hands-on experience in GCP, data pipeline development, workflow automation, and distributed data processing platforms. The ideal candidate will also lead daily standups, mentor junior engineers, and act as the point of contact for assigned business domains.
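
As an illustration of the Airflow-based workflow automation this role calls for, here is a minimal sketch of a DAG that schedules a daily Spark job on GCP Dataproc. It assumes Airflow 2.x with the apache-airflow-providers-google package installed; the project ID, cluster name, region, and GCS paths are hypothetical placeholders, not details from this posting.

```python
# Minimal Airflow DAG sketch: run a daily PySpark job on GCP Dataproc.
# All identifiers below (project, cluster, bucket) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},    # hypothetical project
    "placement": {"cluster_name": "example-cluster"},  # hypothetical cluster
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/etl_job.py"},
}

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; use schedule_interval on older versions
    catchup=False,
) as dag:
    run_etl = DataprocSubmitJobOperator(
        task_id="run_spark_etl",
        job=PYSPARK_JOB,
        region="us-west1",            # hypothetical region
        project_id="example-project",
    )
```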

Key Responsibilities:
• Design and develop big data applications using the latest open-source technologies.
• Develop logical and physical data models for big data platforms.
• Automate workflows using Apache Airflow.
• Create data pipelines using Apache Hive, Apache Spark, and Apache Kafka (see the sketch after this list).
• Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
• Learn the business domain and technology infrastructure, sharing knowledge with team members.
• Mentor junior engineers and lead daily standups and design reviews.
• Groom and prioritize the backlog using JIRA.
• Act as the point of contact for assigned business domains.
• Code, debug, test, document, and communicate product, component, and feature development stages.
• Validate results with user representatives and integrate the overall solution.
• Optimize efficiency, cost, and quality while adhering to engineering standards.
• Contribute to domain relevance, project management, defect management, and knowledge sharing.
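
As referenced in the pipeline bullet above, here is a minimal PySpark sketch of the kind of Kafka-to-data-lake pipeline the role describes. It assumes a Spark build with the spark-sql-kafka connector on the classpath; the broker address, topic name, and GCS paths are hypothetical.

```python
# Minimal PySpark Structured Streaming sketch: read events from Kafka and
# write them to a date-partitioned Parquet data lake. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka_to_lake").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    # Kafka rows carry key/value as bytes; decode the value and keep the
    # source timestamp for partitioning.
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
    .withColumn("event_date", F.to_date("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "gs://example-bucket/lake/events")        # hypothetical path
    .option("checkpointLocation", "gs://example-bucket/chk")  # hypothetical path
    .partitionBy("event_date")
    .start()
)
query.awaitTermination()
```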

Required Qualifications:
• 10+ years of hands-on experience developing data warehouse solutions and data products.
• 6+ years of experience developing distributed data processing platforms with Hadoop, Hive or Spark, and Airflow or similar workflow orchestration solutions.
• 5+ years of experience in modeling and designing schemas for data lakes or RDBMS platforms.
• 4+ years of recent experience with GCP, including GCP Dataproc, GCS, and BigQuery.
• Experience building data pipelines in GCP.
• Proficiency in programming languages such as Python, Java, and Scala.
• Experience with scripting languages such as Perl and Shell.
• Proven experience processing and managing large datasets (multi-TB/PB scale).
• Exposure to test-driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Ability to manage multiple competing priorities with minimal supervision.
• Excellent verbal and written communication skills.
• Bachelor's degree in Computer Science or equivalent experience.

Preferred Qualifications:
• Experience with Gitflow, Kubernetes, Docker, and Atlassian products (BitBucket, JIRA, Confluence).
• Experience with continuous integration tools such as Bamboo, Jenkins, or TFS.
• Ability to mentor engineers and handle team-related challenges.
• Strong analytical and problem-solving skills.