
Data Modeler

This role is for a Data Modeler in Charlotte, NC, on an 18-month contract at $76.38/hour. Requires 5+ years in Python, SQL, Alteryx, and Tableau, along with strong data modeling skills and banking/finance domain experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
608
🗓️ - Date discovered
February 20, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Agile #Mathematics #"ETL (Extract, Transform, Load)" #Distributed Computing #Spark (Apache Spark) #Data Wrangling #Monitoring #Lambda (AWS Lambda) #PySpark #Python #Spark SQL #Apache Spark #Visualization #Programming #Data Engineering #Data Science #SQL Queries #Debugging #SQL (Structured Query Language) #Compliance #Database Management #Data Modeling #Data Pipeline #Tableau #Project Management #Alteryx
Role description

Position: Data Engineer/Modeler (Python and Model Operations)

Location: Charlotte, NC

Length: 18-month contract

Pay Rate: $76.38/hour

Interview Process: 2 rounds

Start Date: ASAP

Required (hard) skills/experience
• 5+ years of Python programming experience, preferably with Apache Spark or distributed computing experience (Spark, MapReduce, DataFrame, Spark SQL), and a Master's or PhD in an analytical field (e.g., Economics, Mathematics, Engineering)
• Strong troubleshooting and debugging skills in Python
• Strong data modeling experience
• 4+ years SQL
• Alteryx experience
• Lambda experience
• Experience writing reports in Tableau

Desired skills/experience

• Master's or PhD in an analytical field
• Knowledge of Banking and Finance domain and/or experience working with model developers
• Experience with Agile development and/or test-driven development.

JD:

The candidate will be responsible for maintaining data pipelines and models that transform raw data into actionable insights related to operational risk loss. You will leverage your expertise in Python, SQL, Alteryx, and Tableau to develop and optimize data workflows, support analytics projects, and create impactful visualizations that drive decision-making. Forecast Operations team members use their skills in data wrangling, process engineering, database management, and software development lifecycle project management to ensure that the processes that create forecasts are well controlled. These team members leverage quantitative methods to conduct ongoing monitoring of model performance, and they also possess capabilities in data science and data visualization techniques and tools.

Responsibilities:
• Write Python and/or PySpark code to automate production processes for several risk and loss measurement statistical models. Examples of model execution production processes are error attribution, scenario shock, sensitivity, result publication, and reporting.
• Work closely with application users, model developers, technology, and other business partners to understand and document requirements of production processes to be implemented.
• Work with technology teams to integrate Python solutions into the existing in-house generic platform for process execution.
• Ensure that software is developed to meet functional, non-functional, and compliance requirements.
• Write complex SQL queries to validate production results, integrate results with existing downstream applications, and produce reports in the format expected by end users.
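As an illustration of the last responsibility, here is a minimal sketch of a SQL validation check. It uses an in-memory SQLite database with invented table names, column names, and tolerance; the posting does not specify the real platform or schema, so everything below is hypothetical:

```python
import sqlite3

# Hypothetical schema for the sketch: one table of freshly produced model
# results and one table of expected (benchmark) results per scenario.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE model_results (scenario TEXT, loss REAL);
    CREATE TABLE expected_results (scenario TEXT, loss REAL);
    INSERT INTO model_results VALUES ('baseline', 100.0), ('shock', 140.0);
    INSERT INTO expected_results VALUES ('baseline', 100.0), ('shock', 137.0);
""")

# Flag scenarios where production output drifts more than 1% from expectation.
breaches = conn.execute("""
    SELECT m.scenario, m.loss, e.loss
    FROM model_results m
    JOIN expected_results e USING (scenario)
    WHERE ABS(m.loss - e.loss) / e.loss > 0.01
""").fetchall()

for scenario, actual, expected in breaches:
    print(f"{scenario}: production {actual} vs expected {expected}")
```

In practice a check like this would run as part of the production pipeline before results are published downstream, with the tolerance and tables agreed with model developers.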