
Senior Data Engineer - Contract (W2) - Ann Arbor, MI - 100% Remote

This is a 100% remote Senior Data Engineer contract (W2) role based in Ann Arbor, MI; the pay rate is not listed. It requires 4+ years of experience; proficiency in Python, SQL, and AWS; and expertise in data architecture and big data processing.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 11, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Datasets #Data Analysis #Spark (Apache Spark) #Data Architecture #Spark SQL #Data Engineering #ETL (Extract, Transform, Load) #Data Modeling #AWS (Amazon Web Services) #ML (Machine Learning) #Programming #Python #Data Lake #Data Integration #Data Management #Data Processing #Cloud #SQL (Structured Query Language) #Database Systems #Database Design #AWS Machine Learning #Compliance #Big Data #Kafka (Apache Kafka) #Scala
Role description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Innovative IT Solutions Inc, is seeking the following. Apply via Dice today!

Responsibilities

Data modeling: Understanding datasets (platform and business systems, from a business-functionality point of view) and deciding what belongs in the data lake; deciding on and implementing a framework for data integration.

Data acquisition and transformation: Acquiring datasets that align with business needs.

Data management implementation: Building, testing, and maintaining pipeline architectures for all datasets in the data lake.

Data analysis: Creating data analysis tools and identifying patterns in historical data.

Algorithm development: Developing algorithms to collect and transform data into actionable information.

Software engineering and cloud computing: Deploying and managing scalable data solutions in AWS.

Machine learning: Working with ML engineers to create the required datasets, feature stores, and prompts; creating new features from existing data.

Experience And Skills

Programming languages such as Python, Clojure, SQL, and Spark SQL

Database systems and database design

Big data processing frameworks (Kafka, EKS, Delta Live Tables)

AWS Cloud computing platforms

Bachelor's degree required; Master's preferred.

4+ years of relevant experience

Demonstrated experience with enterprise data architecture and data analytics, including technology evaluations, solution

Data best practices; compliance and regulatory knowledge