
Sr. Data Engineer

This role is for a Sr. Data Engineer on a 6-month contract-to-hire in Long Beach, CA, paying $65-75/hr. Key skills include Azure, ELT with Coalesce, Snowflake, and 2+ years of healthcare industry experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$600 (estimated from the quoted $65-75/hr)
🗓️ - Date discovered
February 21, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Long Beach, CA
🧠 - Skills detailed
#ADF (Azure Data Factory) #Data Warehouse #Apache NiFi #Data Analysis #dbt (data build tool) #Talend #Data Engineering #NiFi (Apache NiFi) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Azure #Strategy #Java #Redshift #Programming #Data Security #Data Pipeline #Database Design #Cloud #Informatica #Apache Spark #Agile #GCP (Google Cloud Platform) #Migration #Data Modeling #Python #Security #Databricks #Data Governance #Streamlit #Datasets #Scala #Spark (Apache Spark) #Compliance #Snowflake #Apache Kafka #Data Processing #Databases #Kafka (Apache Kafka) #Vault #Data Vault #Data Quality #Data Management #AWS (Amazon Web Services)
Role description

Position: Sr. Data Engineer (Azure, ELT w/ Coalesce, Snowflake, Healthcare industry)

Work Model: Hybrid (3 days onsite / 2 remote) in Long Beach, CA

Location: Long Beach, CA

Duration: 6-month contract-to-hire (CTH), converting to permanent full-time

Hourly Rate during the 6-month contract period: $65-$75/hr

Salary Conversion at the 6-month mark: $135k-$145k + excellent benefits program

Must Haves:
• Azure
• ELT w/ Coalesce
• Snowflake
• Healthcare industry experience

What You Will Do:
• Assemble large, complex data sets that meet functional and non-functional business requirements
• Design and build efficient ETL pipelines for data processing and integration
• Create and maintain data warehouse architectures for storing and querying large datasets
• Design new and enhance existing data pipelines, repositories, and models for structured and semi-structured data, ensuring optimal design, integrity, and accuracy throughout
• Analyze and design solutions, determining the coding and integration activities required
• Perform the data analysis needed to troubleshoot and help resolve data-related issues
• Implement processes and systems to monitor data quality, ensuring production data is accurate and available for the key stakeholders and business processes that depend on it
• Monitor and manage the performance of data systems and troubleshoot issues
• Develop documentation for process design, mock-ups, system specifications, and test plans, following existing standards and methodologies
• Develop process flows, context diagrams, and data flow diagrams to document processes
• Ensure developed processes comply with data security and privacy regulations
• Work under limited supervision of the Enterprise Data Architecture & Engineering Manager
• Collaborate with other data engineers and analysts to support data-driven decision-making
• Work closely with all business units and engineering teams to develop the strategy for long-term data platform architecture
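For candidates gauging fit, the data-quality monitoring duty above can be sketched in plain Python. This is an illustrative sketch only, not the client's actual tooling; the `check_quality` function, field names, and thresholds are all hypothetical:

```python
import statistics

def check_quality(rows, required_fields, numeric_field, max_null_ratio=0.0):
    """Compute simple batch-level quality metrics before data is published.

    Hypothetical example: flags a batch whose share of incomplete records
    exceeds max_null_ratio, so downstream stakeholders never see bad data.
    """
    nulls = sum(1 for r in rows if any(r.get(f) is None for f in required_fields))
    null_ratio = nulls / len(rows) if rows else 0.0
    values = [r[numeric_field] for r in rows if r.get(numeric_field) is not None]
    return {
        "row_count": len(rows),
        "null_ratio": null_ratio,
        "mean": statistics.mean(values) if values else None,
        "passed": null_ratio <= max_null_ratio,
    }

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": 14.0},
    {"id": 3, "amount": None},  # incomplete record should fail the check
]
report = check_quality(batch, required_fields=["id", "amount"], numeric_field="amount")
```

In production this kind of check would typically run inside the ELT tool (Coalesce or dbt tests) rather than hand-rolled Python, but the logic is the same.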

What Gets You The Job:
• 4+ years of related professional experience
• 2+ years of healthcare experience.
• 4+ years of experience with SQL/query languages
• 2+ years of experience in Snowflake is required
• Experience with cloud migrations is a strong plus
• Strong understanding of relational and non-relational databases
• Knowledge of data modeling, database design, and data governance
• Fluent in structured, unstructured data management and modern transformation techniques
• Proven experience in designing data pipelines and working with data warehouse technologies like Snowflake, Databricks, SQL, Redshift
• Experience designing, building, and maintaining data processing systems
• Proficiency in data processing tools like Coalesce, dbt, ADF, Apache NiFi, Talend, or Informatica
• Proficiency in programming languages such as Python, Java, or Scala
• Experience with cloud platforms (e.g. AWS, Google Cloud Platform, Azure).
• Familiarity with Data Vault 2.0 modeling
• Familiarity with creating data apps using Streamlit
• Familiarity with data pipeline frameworks and data processing platforms (e.g. Apache Kafka, Apache Spark).
• Experience with Agile Software Development methodologies
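As a point of reference for the Data Vault 2.0 familiarity listed above: hubs in Data Vault 2.0 key on a deterministic hash of the normalized business key, so the same source key always maps to the same surrogate. A minimal sketch in Python (illustrative only; the function name and MD5 choice are assumptions, not the client's standard):

```python
import hashlib

def hub_hash_key(*business_keys, delimiter="||"):
    """Data Vault 2.0-style hash key: trim, upper-case, join, then hash.

    Normalizing before hashing makes the key stable across source feeds
    that differ only in casing or incidental whitespace.
    """
    normalized = delimiter.join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same business key yields the same hash key despite formatting noise.
k1 = hub_hash_key(" member-123 ")
k2 = hub_hash_key("MEMBER-123")
```

Teams sometimes substitute SHA-256 for MD5 where collision resistance matters; the normalization-then-hash pattern is the part that defines the technique.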

Please send your resume to Dave Lim, Senior Technical Recruiter, for immediate consideration.

Irvine Technology Corporation (ITC) is a leading provider of technology and staffing solutions for IT, Security, Engineering, and Interactive Design disciplines, servicing clients from startups to enterprises nationally. We pride ourselves on our ability to introduce you to our intimate network of business and technology leaders, bringing you opportunity coupled with personal growth and professional development. Join us. Let us catapult your career!

Irvine Technology Corporation provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Irvine Technology Corporation complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities.