
Government Data Engineer - Local NC

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Government Data Engineer in Raleigh, NC, lasting 6+ months at $50-55/hr. Requires 5+ years in data engineering, expertise in Python, SQL, ETL processes, and data quality assurance. Onsite work is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$440
🗓️ - Date discovered
March 29, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Garner, NC
🧠 - Skills detailed
#Data Access #Data Integration #Agile #Data Analysis #SQL (Structured Query Language) #S3 (Amazon Simple Storage Service) #Cybersecurity #DevOps #ETL (Extract, Transform, Load) #Quality Assurance #Snowflake #Data Engineering #BI (Business Intelligence) #Data Quality #Python #Monitoring #Security #Cloud
Role description

Title: Data Engineer

Location: Raleigh, NC

Duration: 6+ Months

Pay: $50-55/hr all-inclusive

   • The manager would like the candidate to be onsite regularly throughout the engagement. The amount of onsite time per week is negotiable, but some weekly onsite time will be required.

   • 5+ years' experience in data engineering, with an emphasis on data quality assurance and ETL processes.

Job Summary - Data Engineer:

We are seeking a skilled mid-level+ Data Engineer to join our team and focus on quality assurance, quality checking, and ETL processes. The successful candidate will be responsible for ensuring the integrity and accuracy of data transferred from a shared file transfer service to an S3 bucket and subsequently into and through our Snowflake data platform. This data will be utilized by downstream applications and reporting systems. These applications and the corresponding consumed data are critical to business process execution.

Key Responsibilities:

   • Quality Assurance & Quality Checking: Implement and maintain data quality checks to ensure the accuracy and reliability of data throughout the ETL process.

   • ETL Processes: Design, develop, and optimize ETL workflows to efficiently transfer data from file transfer services to S3 buckets and Snowflake.

   • Data Integration: Ensure seamless data integration into the data platform, enabling efficient consumption by downstream applications and reporting tools.

   • Data Quality Management: Address data quality challenges, including inconsistencies in source data that do not meet ingestion requirements, which can lead to load failures or data backouts.

   • Collaboration: Work closely with business owners, data analysts, business intelligence teams, and other stakeholders to understand data requirements and deliver high-quality data solutions.

   • Monitoring & Troubleshooting: Monitor pipelines, identify issues, and implement solutions to preserve data flow and integrity.
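
The quality-checking duties above center on validating source rows before they reach the Snowflake stage, so malformed data cannot cause load failures or backouts. The sketch below is illustrative only: the column names and rules are assumptions, not taken from the role description.

```python
import csv
import io

# Assumed schema for illustration; a real pipeline would derive this
# from the target table definition.
REQUIRED_COLUMNS = {"id", "amount", "updated_at"}

def validate_rows(csv_text):
    """Split incoming CSV rows into (valid, rejected) before loading,
    so bad source data is quarantined instead of failing the load."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        # Structural problem: reject the whole file up front.
        raise ValueError(f"source file missing columns: {sorted(missing)}")

    valid, rejected = [], []
    for row in reader:
        if not row["id"].strip():
            rejected.append((row, "empty id"))
            continue
        try:
            float(row["amount"])  # must parse as numeric for ingestion
        except ValueError:
            rejected.append((row, "non-numeric amount"))
            continue
        valid.append(row)
    return valid, rejected
```

In a pipeline like the one described, the valid rows would proceed to the S3 stage and a Snowflake COPY, while rejected rows are logged for follow-up with the data's business owners.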

Qualifications (Knowledge/Skills/Abilities):

   • Demonstrated mid-level+ experience in data engineering, with an emphasis on data quality assurance and ETL processes.

   • Expertise in Python, PyPI, and SQL.

   • Expert analytical and problem-solving skills.

   • Demonstrated strong understanding of cybersecurity principles as they relate to code development, DevOps, and data access.

   • Understanding of fundamental public-cloud capabilities.

   • Proven capacity to comprehend business needs and convert them into technical requirements.

   • Demonstrated excellence in communication and collaboration abilities.

   • Proven capacity to define success, deliver, and operate in an agile setting.