Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect in Chicago, IL, offering an 18-month contract-to-hire with a negotiable W2 pay rate. Candidates should have 8+ years of experience in data architecture and expertise in SQL, Python, and cloud platforms (AWS, Azure, or Google Cloud).
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 1, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Chicago, IL
🧠 - Skills detailed
#AWS (Amazon Web Services) #Data Architecture #Python #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #SQL (Structured Query Language) #Informatica #Talend #Terraform #Ansible #SQL Server #Data Governance #Snowflake #Azure #NoSQL #Cloud #BI (Business Intelligence) #Infrastructure as Code (IaC) #Compliance #Scala #Hadoop #SQL Queries #Metadata #Data Lineage #Security #MySQL #Data Extraction #ML (Machine Learning) #Databases #Kafka (Apache Kafka) #Leadership #Airflow #DevOps #Spark (Apache Spark) #Data Management #Data Modeling #dbt (data build tool) #PostgreSQL #Data Processing #Apache Airflow #Data Profiling #Big Data #Migration #Data Engineering #Automation #Oracle #Data Quality #Scripting #Data Migration #Data Catalog
Role description

Role: Data Architect

Location: Chicago, IL (Downtown)

Term: 18-Month Contract to Hire

Compensation: W2, negotiable (cannot work Corp-to-Corp)

Summary

We are seeking a Data Architect with expertise in data modeling, data profiling, Python scripting, and SQL query development to design and optimize our data architecture. The ideal candidate will play a key role in defining and implementing scalable data solutions that support business intelligence, analytics, and operational data needs.

Key Responsibilities

   • Design and implement data models (conceptual, logical, and physical) to support business and analytical needs.

   • Perform data profiling and analysis to ensure data quality, consistency, and integrity.

   • Develop and optimize SQL queries for data extraction, transformation, and reporting.

   • Create Python scripts for data processing, automation, and integration with other data services.

   • Collaborate with cross-functional teams, including Data Engineers, Analysts, and Business Stakeholders, to align data architecture with business objectives.

   • Define and enforce data governance, security, and compliance best practices.

   • Support data migration and integration initiatives across different databases and platforms.

   • Work with modern cloud-based data platforms (AWS, Azure, or Google Cloud) to design scalable data solutions.

   • Provide technical leadership and mentorship to junior data professionals.

Qualifications

   • 8+ years of experience in data architecture, data modeling, and data engineering roles.

   • Strong experience in designing relational and non-relational databases (e.g., SQL Server, PostgreSQL, Oracle, MySQL, NoSQL databases).

   • Deep understanding of data modeling techniques (e.g., star schema, snowflake schema, normalized forms).

   • Proficiency in SQL development, query optimization, and performance tuning.

   • Experience with Python for data processing, scripting, and automation.

   • Knowledge of ETL/ELT processes and tools such as Apache Airflow, Talend, Informatica, or dbt.

   • Familiarity with big data technologies (e.g., Spark, Hadoop) is a plus.

   • Strong problem-solving skills and ability to communicate technical concepts effectively to both technical and non-technical stakeholders.

Preferred Qualifications

   • Experience in data cataloging, metadata management, and data lineage tracking.

   • Familiarity with machine learning pipelines and AI-driven analytics.

   • Knowledge of DevOps and Infrastructure as Code (Terraform, Ansible) for data infrastructure management.

   • Experience with Kafka or other event-driven architecture for real-time data processing.