
Senior Data Architect

This role is for a Senior Data Architect with 14+ years of tech experience, including 12+ years in Data Engineering and 7+ in Data & Analytics architecture. Key skills include Data Lakes, SQL, NoSQL, and big data tools like Kafka. Contract length and pay rate are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 8, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Exton, PA
🧠 - Skills detailed
#Data Integration #Data Engineering #Data Modeling #Data Ingestion #QlikView #Scala #Visualization #Tableau #Data Architecture #MDM (Master Data Management) #Data Pipeline #Talend #Big Data #Data Management #Data Lake #Kafka (Apache Kafka) #Cloud #SQL (Structured Query Language) #Oracle #Data Lineage #Data Warehouse #Data Governance #NoSQL #Databases #Agile #SAP #Qlik #Workday
Role description

Sr Data Architect
• 14+ years of overall technology experience required
• 12+ years of Data Engineering, Data Modeling, Data Warehousing, Master Data Management, Reference Data Management, Data Lineage, Data Governance, and Metadata Management experience required
• 7+ years of experience defining Data & Analytics architecture and implementing multiple large technology projects
• 5+ years of experience working with Agile teams preferred
• Expertise in designing, validating, and implementing multiple projects across hybrid infrastructure (on-cloud to on-premise and vice versa)
• Experience in seamless integration of enterprise data models with the data models of packaged solutions (e.g., Oracle Apps, Workday, SAP, ServiceNow)
• Expertise in setting up Data Lakes and analytical environments
• Expertise with Data Engineering tools such as Talend, BODS
• Expertise with relational SQL and NoSQL databases
• Extensive experience building data pipelines, data ingestion, data integration, data preparation, and traditional data warehouses and data marts
• Experience with visualization tools such as QlikView, Qlik Sense, Tableau, etc.
• Experience in message queuing, stream processing, and highly scalable ‘big data’ data stores
• Experience with big data tools such as Kafka

Data Warehouse, Data Lake, Data Modelling (traditional and modern), and Data Management practices are mandatory.