Data Engineer
Role: Data Engineer
Location: Onsite / Seattle
Rate: 70 USD/hr
Type: W2/1099
Required Qualifications
• Proven experience as a Data Engineer with expertise in Google Cloud (GCP), Azure, Databricks, Snowflake, and data pipeline development.
• Strong proficiency in SQL and Python, with experience in CI/CD, data modeling (Star and Snowflake schemas), and tools such as ADF, Airflow, and Kafka.
• Ability to build and maintain secure, high-performance data pipelines, support DevOps practices, and ensure data compliance.

Preferred Qualifications
• 5+ years of experience in Data Engineering.
• Google Professional Data Engineer certification.
• Experience with ClickStream data migration.

Responsibilities
• Design, construct, install, and maintain large-scale processing systems and architecture.
• Develop data set processes for data modeling, mining, and production.
• Collect and process raw data from various sources.
• Implement data quality and validation processes.
• Transform and clean data for analysis and reporting.
• Ensure data is stored securely and efficiently.
• Collaborate with data architects and data scientists to define requirements.
• Optimize data storage solutions for performance and scalability.
• Monitor data pipeline performance and troubleshoot issues.
• Create and maintain documentation for data workflows and structures.
• Work with APIs to extract and interact with data from external systems.
• Conduct data analysis to derive insights and recommendations.
• Evaluate and implement new data technologies as appropriate.
• Stay current on industry trends and best practices in data engineering.
• Provide technical support to data users across the organization.
Skills: problem solving, SQL, Python, Azure, Google Cloud (GCP), Databricks, Snowflake, data warehousing, data modeling, Star and Snowflake schemas, CI/CD, Kafka, Airflow, ADF, big data technologies