Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Data Engineer contract position in Bellevue, WA; the pay rate is not disclosed. It requires 10+ years of experience, expertise in Snowflake and Python, and familiarity with AI. Strong communication skills and experience in data quality validation are essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 6, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Bellevue, WA
🧠 - Skills detailed
#Compliance #Documentation #AI (Artificial Intelligence) #Data Pipeline #Automation #ChatGPT #Data Processing #Scala #Project Management #Jira #Data Modeling #Data Engineering #ETL (Extract, Transform, Load) #Microsoft Power BI #Python #Agile #Kanban #BI (Business Intelligence) #REST (Representational State Transfer) #Scripting #Snowflake #REST API #Data Warehouse #Data Quality #ML (Machine Learning)
Role description

Job Role: Data Engineer with AI (Must Have)

Location: Bellevue, WA

Visa: Green Card / U.S. Citizen only

Experience: Minimum 10 years

Type: Contract

Summary:

Fiona Solutions Inc is looking for an experienced Data Engineer specializing in Snowflake and Python to enhance our reporting infrastructure and data pipelines. This role is vital to delivering accurate, timely, and insightful analytics, directly supporting strategic decision-making around Enterprise ChatGPT governance, usage, and business impact.

Role & Responsibilities:

·      Design, build, and maintain robust, scalable data pipelines using Snowflake.

·      Develop and execute rigorous data validation processes to ensure data quality and reliability.

·      Collaborate with reporting analysts and developers to finalize data structures, views, and architecture documentation.

·      Troubleshoot data quality issues, implement corrections, and continuously enhance the data pipeline's efficiency and reliability.

·      Support ad hoc data requests and analytics, providing rapid insights to inform business decisions.

·      Clearly document pipeline architecture, processes, and best practices for internal knowledge sharing and governance.

Skills:

·      Strong expertise in Snowflake for data warehouse design and pipeline management.

·      Proficiency in Python for data processing, ETL workflows, automation, and scripting.

·      Demonstrated experience with data modeling, schema optimization, and database views.

·      Excellent written and verbal communication skills for effective collaboration with cross-functional teams.

·      Ability to thrive in a fast-paced environment, smoothly adapting to shifting priorities and competing demands.

·      Proven experience with data quality validation, issue resolution, and documentation.

·      Minimum 3–5 years of relevant experience in data engineering, ETL development, or related fields.

·      Passionate about Generative AI.

·      Familiarity with Power BI, including dataset preparation and report integration.

·      Experience in working with REST APIs and integrating external data sources.

·      Familiarity with OpenAI Compliance APIs.

·      Experience working with Agile/Kanban methodologies.

·      Proficiency with project management tools like Jira and documentation platforms such as Confluence.

·      Prior experience in reporting, analytics, or AI/ML-focused teams.