Data Engineer

This role is for a Data Engineer with 10+ years of experience, offering a remote contract at $50.00 per hour. Key skills include Snowflake, AWS S3, complex SQL queries, and data warehousing. Requires 4 years with Snowflake and SCD techniques.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
400
🗓️ - Date discovered
January 16, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Remote
🧠 - Skills detailed
#Redshift #EC2 #SnowSQL #Slowly Changing Dimensions #JSON (JavaScript Object Notation) #SnowPipe #Data Ingestion #AWS (Amazon Web Services) #Snowflake #AWS EC2 (Amazon Elastic Compute Cloud) #Complex Queries #Lambda (AWS Lambda) #Data Warehouse #Cloud #Python #SQL (Structured Query Language) #S3 (Amazon Simple Storage Service) #Data Engineering #AWS S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load)
Role description

Job Title: Data Engineer
Experience: 10+ yrs.
Type of Work: Remote
Type of Visa: Independent visa
Must-Have
· Snowflake Cloud Data Warehouse
· AWS S3
· SQL (complex queries, stored procedures)
· Data Warehousing and ETL Concepts
· Slowly Changing Dimensions (SCD Types 1, 2, 3)
· SnowSQL and SnowPipe
· Performance Tuning (Query Profiler, Caching, Scaling)
· Data Handling (JSON, ORC, PARQUET, CSV)
· Redshift (Good to Have)
· AWS EC2 and Lambda (Good to Have)
· Python (Good to Have)
Required Skills:

Experience with the Snowflake cloud data warehouse and AWS S3.
Strong hands-on SQL skills: stored procedures and writing complex queries.
Good knowledge of data warehousing and related concepts; hands-on experience with data ingestion from different sources.
Candidate must be strong in SQL.
Must have 4 years of relevant experience working with the Snowflake cloud data warehouse and exposure to AWS S3.
Experience implementing Slowly Changing Dimension (SCD) techniques, Types 1, 2, and 3 (see the SCD Type 2 sketch below).
Experience working with AWS S3 and the Snowflake cloud data warehouse.
Good exposure to Snowflake cloud architecture, SnowSQL, and Snowpipe for continuous data ingestion (see the ingestion sketch below).
Handling large and complex data sets such as JSON, ORC, Parquet, and CSV files from sources such as AWS S3.
Hands-on experience bulk loading data into and unloading data out of Snowflake tables.
Experience writing complex SQL scripts using statistical aggregate functions and analytic functions to support ETL in the Snowflake cloud data warehouse (see the window-function sketch below).
Experience with performance tuning of the Snowflake data warehouse using the Query Profiler, caching, and virtual warehouse scaling (see the tuning sketch below).
Knowledge of Redshift (good to have).
Knowledge of AWS EC2 and Lambda (good to have).
Python (good to have).
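
For the SCD requirement, the most commonly asked-about variant is Type 2 history tracking. Below is a minimal Snowflake SQL sketch of one common pattern, a MERGE that expires changed rows followed by an insert of new versions; the dim_customer and stg_customer tables and their columns are illustrative placeholders, not part of this posting.

```sql
-- Minimal SCD Type 2 sketch (hypothetical dim_customer / stg_customer tables).
-- Step 1: expire the current row for customers whose tracked attributes changed.
MERGE INTO dim_customer d
USING stg_customer s
   ON d.customer_id = s.customer_id
  AND d.is_current = TRUE
WHEN MATCHED AND (d.email IS DISTINCT FROM s.email
               OR d.segment IS DISTINCT FROM s.segment) THEN
  UPDATE SET is_current   = FALSE,
             effective_to = CURRENT_TIMESTAMP();

-- Step 2: insert a fresh current version for new customers and for the rows
-- just expired above (neither group has a current row at this point).
INSERT INTO dim_customer
  (customer_id, email, segment, effective_from, effective_to, is_current)
SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.is_current = TRUE
WHERE d.customer_id IS NULL;
```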
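For the Snowpipe and bulk-loading items, here is a minimal sketch of loading JSON files from an S3 stage and wiring up continuous ingestion; the stage, storage integration, table, and pipe names are assumptions for illustration only.

```sql
-- Sketch of bulk loading JSON from S3 and continuous ingestion via Snowpipe.
-- Stage, integration, table, and pipe names are placeholders.
CREATE OR REPLACE STAGE raw_events_stage
  URL = 's3://example-bucket/events/'
  STORAGE_INTEGRATION = s3_int          -- assumes an existing storage integration
  FILE_FORMAT = (TYPE = 'JSON');

CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- One-off bulk load of the files currently in the stage.
COPY INTO raw_events
  FROM @raw_events_stage
  PATTERN = '.*[.]json';

-- Snowpipe: auto-ingest new files as they arrive in the bucket
-- (requires S3 event notifications pointed at the pipe's notification queue).
CREATE OR REPLACE PIPE raw_events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events
  FROM @raw_events_stage;

-- Unloading goes the other way: COPY INTO @raw_events_stage FROM <table>.
```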
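For the analytic-SQL item, a short sketch of aggregate and window functions of the kind used to support ETL; the orders table and its columns are hypothetical.

```sql
-- Sketch of analytic (window) and aggregate functions supporting ETL;
-- the orders table and its columns are hypothetical.
SELECT
    customer_id,
    order_date,
    order_total,
    SUM(order_total) OVER (PARTITION BY customer_id ORDER BY order_date)      AS running_spend,
    AVG(order_total) OVER (PARTITION BY customer_id)                          AS avg_order_value,
    ROW_NUMBER()     OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS recency_rank
FROM orders
QUALIFY recency_rank <= 3;   -- keep each customer's three most recent orders
```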
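For the tuning item, a sketch of one possible workflow: review recent query timings (alongside the Query Profile in the Snowflake UI), then resize or scale out a virtual warehouse. The etl_wh warehouse name is a placeholder, and the multi-cluster settings assume an edition that supports them.

```sql
-- Sketch of a tuning pass: look at recent query timings, then scale the warehouse.
SELECT query_id,
       query_text,
       warehouse_name,
       total_elapsed_time / 1000 AS elapsed_s,   -- column is in milliseconds
       bytes_scanned
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER BY total_elapsed_time DESC
LIMIT 20;

-- Scale up for heavy ETL windows; add clusters for concurrency.
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE';
ALTER WAREHOUSE etl_wh SET MIN_CLUSTER_COUNT = 1, MAX_CLUSTER_COUNT = 3;
```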

Job Type: Contract
Pay: From $50.00 per hour
Benefits:

401(k)
Dental insurance
Health insurance

Application Question(s):

May I know your current visa status?

Experience:

Data Engineer: 10 years (Preferred)
Snowflake Cloud: 6 years (Preferred)
Data warehouse: 6 years (Preferred)
Performance Tuning (Query Profiler, Caching, Scaling): 1 year (Preferred)

Work Location: Remote