Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a 6-month remote contract for a Senior Data Engineer, offering a pay rate of "$X/hour." Key skills required include Python, GCP (BigQuery, Dataflow), and SQL. A minimum of 5 years of Data Engineering experience is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Greater Minneapolis-St. Paul Area
🧠 - Skills detailed
#ETL #Dataflow #BigData #Python #Security #BigQuery #Datasets #DataAnalysis #GCP #DataManipulation #Visualization #SQLQueries #DataPipeline #SQL #DataIntegration #DataIntegrity #Consulting #Cloud #DataEngineering #Storage
Role description
Nimbus Consulting is seeking a Senior Data Engineer with expertise in Reporting & Data Services, Python, Google Cloud Platform (GCP) (BigQuery, Dataflow), and SQL for Big Data. The ideal candidate will have hands-on experience developing and managing data pipelines, processing large datasets, and optimizing reporting and data services solutions. This is an excellent opportunity for professionals who enjoy working with cutting-edge cloud technologies to drive data innovation.

Key Responsibilities:

   • Design, develop, and optimize data pipelines using Python and GCP (BigQuery, Dataflow) to process large datasets efficiently.

   • Work with Big Data tools and technologies (such as SQL, BigQuery, and Dataflow) to manage and analyze massive datasets.

   • Build and maintain reporting and data services solutions that provide business insights.

   • Collaborate with cross-functional teams to ensure seamless data integration and reporting solutions.

   • Ensure high-performance data retrieval and storage to support efficient reporting systems.

   • Write and maintain complex SQL queries for data analysis and transformation.

   • Monitor and optimize the performance of data services and pipelines in a cloud environment.

   • Develop and execute test plans, troubleshoot issues, and ensure high-quality data integrity.

Required Skills & Qualifications:

   • 5+ years of experience in Data Engineering, with strong proficiency in Python for data manipulation and processing.

   • Solid experience with Google Cloud Platform (GCP), particularly BigQuery and Dataflow.

   • Extensive knowledge of Big Data technologies and SQL, especially for working with large datasets.

   • Hands-on experience in building and optimizing data pipelines and reporting services.

   • Familiarity with ETL processes and tools in the cloud environment.

   • Ability to work with and process data from diverse sources and ensure efficient data integration.

   • Strong problem-solving skills and ability to optimize performance in a cloud-based data environment.

   • Excellent communication and collaboration skills.

Preferred Skills:

   • Experience with data visualization and reporting tools is a plus.

   • Knowledge of cloud infrastructure best practices and security.