Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Montpelier, Vermont, with a contract length of "unknown" and a pay rate of "unknown." Requires 3-5 years of AWS data lake experience, proficiency in ELT pipelines, and knowledge of data governance. On-site work only.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 6, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Montpelier, VT
🧠 - Skills detailed
#Data Analysis #Apache Airflow #Data Management #Data Science #Data Pipeline #Azure #Scala #S3 (Amazon Simple Storage Service) #Airflow #Data Ingestion #Data Lake #Athena #Documentation #Redshift #AWS Glue #Data Architecture #Data Engineering #Data Lineage #Schema Design #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Data Quality #Data Governance #Compliance #ADF (Azure Data Factory) #Azure Data Factory #Metadata
Role description
Position: Data Engineer
Location: Montpelier, Vermont

Role Overview:
Data Engineers will be responsible for designing and implementing scalable data ingestion workflows to support enterprise data lake solutions. These workflows will ingest data from a range of sources, including government operations, business processes, workforce, and health domains. Domain expertise in any of these areas is preferred.

Key Responsibilities:
• Design and implement robust ELT (Extract, Load, Transform) pipelines using AWS services such as S3, Glue, Athena, and Redshift.
• Define and manage data models and schemas for the bronze and silver layers of a medallion architecture.
• Implement data quality checks, data lineage tracking, and metadata management to ensure compliance with governance and regulatory requirements.
• Automate workflows using tools like AWS Glue, AWS Step Functions, or Apache Airflow.
• Maintain comprehensive documentation for data lake architecture, data models, and data pipeline workflows.
• Collaborate closely with data analysts, data scientists, technical teams, and business stakeholders.
• Provide ongoing support for a limited number of existing Azure Data Factory workloads.
• Adhere to all platform requirements, standards, and procedures as defined by the State.
• Participate in team standups, project-specific meetings, and collaborative engagements with other vendors and stakeholders.

Preferred Qualifications:
• 3–5 years of experience building and maintaining enterprise-scale data lake solutions on the AWS platform.
• Hands-on experience with tools and technologies such as S3, Glue, Athena, Redshift, Step Functions, Apache Airflow, and Azure Data Factory.
• Strong understanding of data architecture principles, medallion architecture, and schema design.
• Familiarity with data governance best practices and compliance standards.
• Excellent communication and collaboration skills across technical and non-technical teams.
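For candidates unfamiliar with the bronze/silver layers mentioned above, the following is a minimal local sketch of that flow. In the actual role this would run against S3, Glue, and Athena; here plain Python dicts stand in for object storage, and the record schema, field names, and validation rules are hypothetical, purely for illustration.

```python
import json
from datetime import datetime, timezone

def ingest_to_bronze(raw_records):
    """Bronze layer: land raw data as-is, adding only ingestion metadata."""
    ts = datetime.now(timezone.utc).isoformat()
    return [{"payload": r, "_ingested_at": ts} for r in raw_records]

def promote_to_silver(bronze_records):
    """Silver layer: parse, validate against a schema, and deduplicate."""
    seen, silver = set(), []
    for rec in bronze_records:
        row = json.loads(rec["payload"])
        # Enforce a hypothetical schema: 'id' and 'name' are required.
        if "id" not in row or "name" not in row:
            continue  # reject rows that fail validation
        if row["id"] in seen:
            continue  # deduplicate on the primary key
        seen.add(row["id"])
        silver.append({"id": int(row["id"]), "name": row["name"].strip()})
    return silver

raw = ['{"id": 1, "name": " Alice "}', '{"id": 1, "name": "Alice"}',
       '{"name": "no-id"}', '{"id": 2, "name": "Bob"}']
bronze = ingest_to_bronze(raw)
silver = promote_to_silver(bronze)
print(silver)  # [{'id': 1, 'name': 'Alice'}, {'id': 2, 'name': 'Bob'}]
```

The key design point is that bronze preserves the source data untouched (supporting lineage and reprocessing), while quality checks and schema enforcement happen only at the silver promotion step.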
Additional Requirements:
• All work must be performed within the United States.
• The Agency of Digital Services is the only entity authorized to initiate work under this contract.
• The contractor must provide constructive feedback to support the continued development and enhancement of the platform.
• Participation in collaborative team activities is required.