
Data Engineer

This role is for an AWS Data Engineer in Cincinnati, OH / Minneapolis, MN, requiring expertise in AWS, Snowflake, and database design. Key responsibilities include data integration, data quality and governance, and automation. Contract length and pay rate are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 20, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Minneapolis, MN
🧠 - Skills detailed
#AWS (Amazon Web Services) #Oracle #Data Accuracy #Data Pipeline #Data Processing #Automation #GDPR (General Data Protection Regulation) #NoSQL #Snowflake #Database Design #Databases #Data Quality #SQL (Structured Query Language) #Compliance #Data Governance #Data Engineering #Data Integration
Role description

Title: AWS Data Engineer

Location: Cincinnati, OH / Minneapolis, MN

AWS and Snowflake experience is a must; experience working with an Oracle database is a nice to have. A brief, illustrative pipeline sketch follows the responsibility list below.
• Data Integration and Management:
• Develop and maintain data pipelines to ingest, process, and store financial data from various sources (internal and external).
• Integrate structured and unstructured data from multiple systems such as accounting software, trading platforms, financial databases, and market feeds.
• Database Design and Optimization:
• Build and optimize relational and non-relational databases (e.g., SQL, NoSQL) to store financial data.
• Ensure efficient querying and retrieval of financial data to support reporting and analytics.
• Data Quality and Governance:
• Implement data quality checks to ensure data accuracy, consistency, and completeness.
• Establish and enforce data governance policies to ensure compliance with financial regulations and standards (e.g., GDPR, SOX).
• Automation and Optimization:
• Automate repetitive data processing tasks to improve operational efficiency.
• Continuously monitor and improve the performance of data pipelines and systems to handle large volumes of financial data.
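For context, the responsibilities above map onto a fairly standard extract, validate, load pattern on AWS and Snowflake. The sketch below is a minimal illustration only and is not part of the posting: it assumes boto3, pandas, and the snowflake-connector-python package, and every bucket, table, column, and credential name (finance-extracts, TRADES_RAW, trade_id, and so on) is a hypothetical placeholder.

```python
# Minimal, illustrative ingest -> validate -> load sketch (not from the posting).
# Assumes boto3, pandas, and snowflake-connector-python; all names are placeholders.
import io

import boto3
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def extract_from_s3(bucket: str, key: str) -> pd.DataFrame:
    """Pull a CSV extract (e.g. a market-feed drop) from S3 into a DataFrame."""
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Basic data-quality checks: completeness and uniqueness of the key column."""
    if df["trade_id"].isna().any():
        raise ValueError("Data quality check failed: missing trade_id values")
    if df.duplicated(subset="trade_id").any():
        raise ValueError("Data quality check failed: duplicate trade_id values")
    return df


def load_to_snowflake(df: pd.DataFrame) -> None:
    """Append the validated frame to a Snowflake staging table."""
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical connection details
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="FINANCE",
        schema="STAGING",
    )
    try:
        write_pandas(conn, df, table_name="TRADES_RAW")
    finally:
        conn.close()


if __name__ == "__main__":
    frame = validate(extract_from_s3("finance-extracts", "trades/2025-02-20.csv"))
    load_to_snowflake(frame)
```

In practice, a pipeline like this would typically run under an orchestrator and fail the load (rather than silently continue) when a quality check does not pass, which is the behavior the validation step above sketches.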