Data Engineer

This role is for a Data Engineer in Seattle, offering $70/hr on a W2/1099 contract. Requires 5+ years of experience, Google Cloud and Azure expertise, SQL, Python, and data pipeline development skills. Google Professional Data Engineer Certification preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
March 9, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Seattle, WA
🧠 - Skills detailed
#Migration #Data Science #Airflow #Python #Data Architecture #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Documentation #ETL (Extract, Transform, Load) #Compliance #Storage #Data Migration #Databricks #Azure #Data Pipeline #Data Analysis #Kafka (Apache Kafka) #ADF (Azure Data Factory) #Snowflake #Data Engineering #Big Data #Data Modeling #Cloud #Azure Databricks #DevOps #Data Storage #Data Quality #Scala
Role description

Role: Data Engineer

Location: Onsite / Seattle

Rate: $70 USD/hr

Type: W2/1099

Required Qualifications

   • Seeking an experienced Data Engineer with expertise in Google Cloud (GCP), Azure, Databricks, Snowflake, and data pipeline development.

   • Responsibilities include building and maintaining secure, high-performance data pipelines, supporting DevOps, and ensuring data compliance.

   • Must have experience in SQL, Python, CI/CD, data modeling (Star/Snowflake Schema), and tools like ADF, Airflow, and Kafka.
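As a loose illustration of the orchestration tooling named above, here is a minimal Airflow DAG sketch. The DAG id, task names, and extract/load helpers are hypothetical placeholders, not anything specified in this posting.

```python
# Minimal sketch of a daily extract/load pipeline, assuming Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_clickstream():
    # Hypothetical extract step: pull raw clickstream rows from the source system.
    pass


def load_to_warehouse():
    # Hypothetical load step: write cleaned rows into the warehouse (e.g. Snowflake).
    pass


with DAG(
    dag_id="clickstream_daily",       # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # `schedule` supersedes `schedule_interval` in Airflow 2.4+
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_clickstream)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load                   # run extract before load
```

In practice the same extract/load shape could be scheduled from ADF or fed from Kafka topics; this sketch only shows the Airflow variant.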

Preferred Qualifications

   • 5+ years in Data Engineering, Google Professional Data Engineer Certification, and experience with ClickStream data migration.

Responsibilities

   • Design, construct, install, and maintain large-scale processing systems and architecture.

   • Develop data set processes for data modeling, mining, and production.

   • Collect and process raw data from various sources.

   • Implement data quality and validation processes (a minimal validation sketch follows this list).

   • Transform and clean data for analysis and reporting.

   • Ensure data is stored securely and efficiently.

   • Collaborate with data architects and data scientists to define requirements.

   • Optimize data storage solutions for performance and scalability.

   • Monitor data pipeline performance and troubleshoot issues.

   • Create and maintain documentation for data workflows and structures.

   • Work with APIs to extract and interact with data from external systems.

   • Conduct data analysis to derive insights and recommendations.

   • Evaluate and implement new data technologies as appropriate.

   • Stay current on industry trends and best practices in data engineering.

   • Provide technical support to data users across the organization.
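
To make the data quality and validation responsibility above concrete, here is a small pandas sketch of the kind of checks involved; the column names and rules are hypothetical and would come from the actual dataset and its requirements.

```python
# Minimal data-quality sketch using pandas; column names are hypothetical.
import pandas as pd


def validate_page_views(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality failures for a page-view extract."""
    failures = []

    # Completeness: required columns must be present and non-null.
    for col in ("event_id", "user_id", "viewed_at"):
        if col not in df.columns:
            failures.append(f"missing column: {col}")
        elif df[col].isna().any():
            failures.append(f"null values in required column: {col}")

    # Uniqueness: event ids should not repeat.
    if "event_id" in df.columns and df["event_id"].duplicated().any():
        failures.append("duplicate event_id values")

    # Validity: timestamps should parse and not lie in the future.
    if "viewed_at" in df.columns:
        ts = pd.to_datetime(df["viewed_at"], errors="coerce", utc=True)
        if ts.isna().any():
            failures.append("unparseable viewed_at timestamps")
        elif (ts > pd.Timestamp.now(tz="UTC")).any():
            failures.append("viewed_at timestamps in the future")

    return failures
```

In a real pipeline, checks like these would typically run as a task between extract and load, failing the run or routing bad rows to quarantine when the returned list is non-empty.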

Skills: problem solving, SQL, Azure, Snowflake, data warehousing, Python, Databricks, CI/CD, Kafka, Airflow, Snowflake schema, star schema, ADF, Google Cloud (GCP), big data technologies, data modeling