

GCP Data Engineer
This role is for a "GCP Data Engineer" with a contract length of "unknown" and a pay rate of "unknown." Key skills include Python, advanced SQL, PySpark, Airflow, and Kafka. Experience in REST API and functional programming is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 8, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Batch #Kafka (Apache Kafka) #Regression #Python #Programming #REST (Representational State Transfer) #Unit Testing #REST API #API (Application Programming Interface) #SQL (Structured Query Language) #Teradata #Spark (Apache Spark) #Data Engineering #GCP (Google Cloud Platform) #PySpark #Airflow
Role description
Required/Notes:
• Expertise in Python, advanced SQL, PySpark (Spark Batch and Streaming), Airflow, Kafka.
• Exposure to REST API creation and usage.
• Experience in functional programming, test-first architecture (unit testing and regression testing), and OOP concepts (classes and objects).
• Exposure to DB2 and Teradata is a plus.