
Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Architect position lasting 6-12 months, offered remotely at an undisclosed pay rate. Key skills include expertise in Matillion, Snowflake, data integration, ETL processes, and cloud technologies. Certifications in relevant cloud technologies are a plus.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 2, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New York City Metropolitan Area
🧠 - Skills detailed
#AWS (Amazon Web Services) #Business Analysis #Oracle #GDPR (General Data Protection Regulation) #Teradata #Compliance #dbt (data build tool) #Strategy #Data Integration #Databricks #Python #ETL (Extract, Transform, Load) #Alation #Metadata #Big Data #Snowflake #Kafka (Apache Kafka) #Visualization #Data Privacy #Data Lineage #Azure #XML (eXtensible Markup Language) #Data Governance #Synapse #Data Quality #SQL (Structured Query Language) #Data Modeling #Normalization #Cloud #Data Architecture #Data Security #Microsoft Power BI #Public Cloud #Spark (Apache Spark) #Hadoop #Data Integrity #Data Science #Data Ingestion #Data Warehouse #Data Profiling #Data Management #Qlik #Data Engineering #Scala #JSON (JavaScript Object Notation) #Security #Data Strategy #Redshift #Tableau #Leadership #BI (Business Intelligence)
Role description

Position: Data Architect

Location: Remote

Duration: 6-12 months

Matillion and Snowflake experience required.

Job Overview:

In this Data Architect role, you will focus on designing and governing data architecture to facilitate data-driven decision making. Your responsibilities will include crafting and overseeing the implementation of efficient data ingestion strategies, ensuring robust ETL processes, and maintaining data quality and accessibility. You will leverage your expertise in Snowflake data warehousing, SQL optimization, and data integration technologies to drive the data strategy forward, with a primary emphasis on design and governance rather than hands-on implementation.

You will also have the opportunity to deliver data-driven organizational transformation, drawing on both on-premises and cloud data and applying the best practices and methods in this field.

Your responsibilities will include:

   • Take initiative and ownership of work streams and client engagements.

   • Design and architect to appropriately safeguard data security, data integrity and data privacy at the conceptual and logical level.

   • Own the definition and implementation of the Data Architecture Roadmap to support the client’s strategic and operational roadmap.

   • Design and implement data integration solutions to consolidate data from various sources into the client’s analytical platforms, and develop and manage data ingestion pipelines using modern technologies for timely and accurate data availability.

   • Lead the development of efficient, scalable, and maintainable ETL processes and frameworks to extract, transform, and load data into the data artefacts.

   • Collaborate with business analysts and stakeholders to create and maintain source-to-target mappings, documenting data lineage and transformation logic to ensure accurate data requirements.

   • Implement best practices for data modeling, security, and performance optimization in data warehouse systems (cloud and on-premises) to design and manage data warehousing solutions.

   • Collaborate with data engineers, data scientists, and business stakeholders to provide comprehensive data solutions while offering technical leadership and mentorship to junior team members.

   • Provide technical expertise during client meetings, helping to address questions and concerns related to data architecture and management.

Key skills required:

   • Excellent problem-solving skills, attention to detail, and strong communication skills (written and spoken).

   • Proven technical expertise in Data Architecture, Data Management, or a similar role.

   • Experience in system-level architecture and conducting dimensional modeling.

   • Extensive experience in designing performant data warehouse schemas, e.g., star and snowflake schemas, with a deep understanding of the tradeoffs related to normalization/denormalization.

   • Extensive experience working with cloud or on-premises data warehousing systems like Snowflake, Synapse, BigQuery, Redshift, Teradata, DB2, Oracle, etc.

   • Experience managing structured, semi-structured, and unstructured data in formats such as tabular, Parquet, Avro, JSON, XML, audio, video, etc.

   • Experience with a major public cloud, preferably AWS or Azure, and its data services.

   • Knowledge of big data technologies such as Hadoop, Spark, Flink, Kafka, Databricks, Synapse, etc.

   • Proficient in SQL and experienced in query planning and optimization.

   • Strong background in data integration, ETL development, and data ingestion technologies like dbt, SQL Mesh, Spark, Python, Matillion, etc.

   • Experience with data profiling, source-to-target mappings, and data quality management with tools like OpenMetadata, DataHub, Great Expectations, Atlan, Alation, etc.

   • Familiarity with testing methodologies and implementation processes.

   • Familiarity with data governance and compliance standards like GDPR, HIPAA, etc.

   • Experience with at least one major data visualization/BI platform such as Tableau, Power BI, Qlik, Cognos, etc., is a plus.

   • Certifications in cloud technologies and data engineering (e.g., AWS, Databricks, Snowflake, Kafka) are a plus.