Database Administrator - ETL, Python, SQL

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Database Administrator in Tallahassee, Florida, on a 6-month contract at $55-$60/hour. Requires 5 years of Snowflake experience, 3 years with ETL tools (e.g., Matillion, Talend), Python, SQL, and a Bachelor's degree in a related field.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$440 - $480
🗓️ - Date discovered
April 15, 2025
🕒 - Project duration
6 months
🏝️ - Location type
On-site
📄 - Contract type
Contract
🔒 - Security clearance
Unknown
📍 - Location detailed
Tallahassee, FL
🧠 - Skills detailed
#Indexing #Data Science #Talend #Data Integration #Schema Design #Data Pipeline #Data Integrity #DBA (Database Administrator) #Data Warehouse #Informatica #Security #Snowflake #Azure #Databases #Matillion #Python #Consulting #Oracle #Business Analysis #Data Modeling #Automation #Migration #Clustering #Data Architecture #Azure Data Factory #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #MS SQL (Microsoft SQL Server) #Data Quality #Physical Data Model #Data Migration #Unit Testing #Documentation #Logging #Data Governance #Data Mart #Computer Science #ADF (Azure Data Factory)
Role description

Job Summary:

Scope of Work

Required Consultant Experience Provided by Contractor Shall Include:

   • 5 years of experience with Snowflake data warehouse.

   • 3 years of experience with no/low-code ETL tools/platforms (e.g., Matillion, Talend, Informatica).

   • 3 years of experience performing ETL processes using code (e.g., Python, SQL).

   • Strong experience with data integration, data quality, and data governance.

   • Strong experience with relational databases, SQL, and data modeling.

   • Strong experience collaborating with business analysts, developers, and non-technical end users on data warehousing efforts.

Preferred Experience:

   • Excellent skills in communicating technical information to non-technical audiences.

   • Excellent troubleshooting skills.

   • Excellent interpersonal, communication, and collaboration skills.

   • Strong analytical and problem-solving skills.

   • Detail-oriented, with the ability to switch between tasks, self-direct, and prioritize work.

   • Experience researching and investigating problems to develop viable solutions, report findings, and provide clear and effective recommendations.

   • Experience working in a project-oriented environment.

   • Experience developing functional and technical specifications, design flow, and system blueprints.

   • Experience with Microsoft Office 365.

   • Ability to plan, organize and document IT solutions.

   • Ability to plan, organize and coordinate IT work assignments with other technical resources.

   • Ability to work independently and as part of a team.

Required Duties and Responsibilities of Consultant shall include but are not limited to:

The customer is seeking a Database Analyst to create tables, migrate data, configure ETL pipelines, and create backup loads.

The Database Analyst position will provide consulting services in support of the creation of a Data Warehouse for the Property Tax Oversight Program.

Writing ETL Pipelines

   • ETL Tool Selection:

   • Present alternatives, with the advantages and disadvantages of each, to aid in selecting a no/low-code ETL platform/tool.

   • Design and Develop ETL Pipelines:

   • Create, optimize, and maintain ETL pipelines to migrate data from various sources (Oracle, MS SQL, flat files) to the target Snowflake data warehouse.

   • Utilize a combination of coding (Python, SQL, etc.) and no/low-code tools (e.g., Azure Data Factory, Talend, Informatica) to develop ETL processes that meet data integration needs; an illustrative sketch follows below.
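For illustration only, a minimal sketch of such a pipeline in Python, assuming the oracledb and snowflake-connector-python packages; the connection details, the PTO_DW database, and all table names are placeholders, not anything specified by this posting:

```python
import oracledb
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Placeholder connection details -- real values would come from a secrets store.
ORACLE_DSN = "dbhost:1521/ptosrc"
SF_PARAMS = dict(user="etl_user", password="...", account="example_account",
                 warehouse="ETL_WH", database="PTO_DW", schema="STAGING")

def extract_from_oracle(query: str) -> pd.DataFrame:
    """Pull source rows from Oracle into a DataFrame."""
    with oracledb.connect(user="etl_user", password="...", dsn=ORACLE_DSN) as conn:
        return pd.read_sql(query, conn)

def load_to_snowflake(df: pd.DataFrame, table: str) -> None:
    """Bulk-load a DataFrame into an existing Snowflake staging table."""
    with snowflake.connector.connect(**SF_PARAMS) as conn:
        write_pandas(conn, df, table)  # assumes the table already exists

if __name__ == "__main__":
    parcels = extract_from_oracle(
        "SELECT parcel_id, cnty_cd, assd_val FROM parcels")
    load_to_snowflake(parcels, "STG_PARCELS")
```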

   • Data Transformation and Validation:

   • Apply transformations to cleanse, filter, join, and aggregate data from the source systems.

   • Ensure data quality by implementing validation rules to verify data correctness and completeness before loading into Snowflake (see the example check below).
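One way such validation rules might look in Python with pandas; the column names and rules are illustrative assumptions:

```python
import pandas as pd

def validate_parcels(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple correctness and completeness checks before loading."""
    errors = []
    # Completeness: the natural key must be present on every row.
    if df["parcel_id"].isna().any():
        errors.append("null parcel_id values found")
    # Correctness: assessed values must be non-negative.
    if (df["assd_val"] < 0).any():
        errors.append("negative assessed values found")
    # Uniqueness: parcel_id is expected to be a natural key.
    if df["parcel_id"].duplicated().any():
        errors.append("duplicate parcel_id values found")
    if errors:
        raise ValueError("; ".join(errors))
    return df
```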

   • Source-to-Target Mapping:

   • Create detailed source-to-target mappings to ensure accurate transformation from source systems (Oracle, MS SQL) into Snowflake.

   • Ensure the solution provides data quality, consistency, and integrity throughout the ETL process; a sketch of a mapping expressed as code appears below.
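A mapping can be expressed as data so the code and the mapping document stay in step. A sketch under assumed column names:

```python
import pandas as pd

# Illustrative source-to-target mapping for one table; column names are
# assumptions and would come from the documented mapping specification.
PARCEL_MAPPING = {
    # source column (Oracle) -> target column (Snowflake) + optional transform
    "PARCEL_ID": {"target": "PARCEL_ID",      "transform": str.strip},
    "CNTY_CD":   {"target": "COUNTY_CODE",    "transform": None},
    "ASSD_VAL":  {"target": "ASSESSED_VALUE", "transform": float},
}

def apply_mapping(df: pd.DataFrame, mapping: dict) -> pd.DataFrame:
    """Select, rename, and convert source columns per the mapping."""
    out = df[list(mapping)].rename(
        columns={src: spec["target"] for src, spec in mapping.items()}
    )
    for spec in mapping.values():
        if spec["transform"] is not None:
            out[spec["target"]] = out[spec["target"]].map(spec["transform"])
    return out
```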

   • Automation and Scheduling:

   • Work with other resources to automate the ETL processes to run at scheduled intervals based on specific needs (see the scheduling sketch below).

   • Monitor and optimize ETL jobs to ensure they run efficiently and on time.
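If scheduling is kept inside Snowflake rather than an external orchestrator, a TASK is one option. A hypothetical sketch, in which the LOAD_PARCELS procedure and all other names are assumptions:

```python
import snowflake.connector

# Run the load nightly at 2 AM Eastern via a Snowflake TASK.
CREATE_TASK = """
CREATE OR REPLACE TASK NIGHTLY_PARCEL_LOAD
  WAREHOUSE = ETL_WH
  SCHEDULE = 'USING CRON 0 2 * * * America/New_York'
AS
  CALL LOAD_PARCELS()
"""

with snowflake.connector.connect(account="example_account", user="etl_user",
                                 password="...") as conn:
    cur = conn.cursor()
    cur.execute(CREATE_TASK)
    # Tasks are created suspended and must be resumed explicitly.
    cur.execute("ALTER TASK NIGHTLY_PARCEL_LOAD RESUME")
```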

   • Error Handling and Troubleshooting:

   • Implement error detection, logging, and alerting mechanisms within the ETL pipelines.

   • Troubleshoot ETL failures, identify root causes, and make necessary adjustments to ensure pipeline stability; an illustrative retry-and-logging pattern appears below.
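A common pattern is to wrap each pipeline step in a retry loop that logs every failure before the run is allowed to fail hard. A sketch:

```python
import logging
import time

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_with_retries(step, name, attempts=3, delay_seconds=60):
    """Run one pipeline step, logging each failure and retrying before raising."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            log.exception("step %r failed (attempt %d/%d)", name, attempt, attempts)
            if attempt == attempts:
                # Final failure: re-raise so the scheduler marks the run failed
                # and the alerting channel (email, Teams, etc.) is notified --
                # the channel itself is outside this sketch.
                raise
            time.sleep(delay_seconds)

# Usage: run_with_retries(lambda: load_to_snowflake(parcels, "STG_PARCELS"),
#                         "load_parcels")
```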

   • Performance Optimization:

   • Tune ETL processes for performance improvements, minimizing resource consumption and ensuring fast data migration.

   • Testing:

   • Develop test cases for ETL processes and data pipelines to ensure they meet business requirements and are free of errors.

   • Perform unit testing of ETL pipelines (an example test appears below).
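An example unit test for the mapping transform sketched earlier, using pytest; the etl.transforms module path is a placeholder for wherever the transform actually lives:

```python
import pandas as pd
import pytest

from etl.transforms import PARCEL_MAPPING, apply_mapping  # placeholder module

def test_apply_mapping_renames_and_converts():
    src = pd.DataFrame({"PARCEL_ID": [" 001 "], "CNTY_CD": ["LEO"],
                        "ASSD_VAL": ["125000"]})
    out = apply_mapping(src, PARCEL_MAPPING)
    assert list(out.columns) == ["PARCEL_ID", "COUNTY_CODE", "ASSESSED_VALUE"]
    assert out.loc[0, "PARCEL_ID"] == "001"          # whitespace stripped
    assert out.loc[0, "ASSESSED_VALUE"] == 125000.0  # cast to float

def test_apply_mapping_rejects_missing_columns():
    # A source extract missing mapped columns should fail loudly, not silently.
    with pytest.raises(KeyError):
        apply_mapping(pd.DataFrame({"UNRELATED": [1]}), PARCEL_MAPPING)
```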

Creating and Maintaining Snowflake Tables

   • Table Design and Creation:

   • Design, create, and optimize Snowflake tables to store transformed data, ensuring proper indexing, clustering, and partitioning strategies.

   • Work with business stakeholders to understand data requirements and create tables that meet the business needs (e.g., fact and dimension tables, data marts); illustrative DDL appears below.
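Illustrative DDL for one fact/dimension pair, with assumed database, schema, and column names:

```python
# Note that Snowflake has no user-defined indexes or partitions:
# micro-partitioning is automatic, and a CLUSTER BY key is the main
# lever for partition pruning on large tables.
DIM_COUNTY_DDL = """
CREATE TABLE IF NOT EXISTS PTO_DW.MART.DIM_COUNTY (
    COUNTY_KEY   INTEGER IDENTITY,
    COUNTY_CODE  VARCHAR(3)   NOT NULL,
    COUNTY_NAME  VARCHAR(100)
)
"""

FACT_ASSESSMENT_DDL = """
CREATE TABLE IF NOT EXISTS PTO_DW.MART.FACT_ASSESSMENT (
    PARCEL_ID       VARCHAR(30)  NOT NULL,
    COUNTY_KEY      INTEGER      NOT NULL,
    TAX_YEAR        SMALLINT     NOT NULL,
    ASSESSED_VALUE  NUMBER(14, 2)
)
CLUSTER BY (TAX_YEAR, COUNTY_KEY)
"""
```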

   • Schema Management:

   • Maintain and manage Snowflake schema to ensure efficient querying and reporting.

   • Monitor schema changes and manage data migrations across environments (development, staging, production).

   • Data Modeling:

   • Develop and maintain logical and physical data models in Snowflake, ensuring they align with business requirements and reporting needs.

   • Support dimensional data modeling and star/snowflake schema design to facilitate reporting and analysis.

   • Data Governance and Security:

   • Implement data governance standards and policies to ensure data integrity and security.

   • Define roles and access controls within Snowflake to restrict access to sensitive data (illustrated below).
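A hypothetical example of role-based access control in Snowflake, granting a reporting role read-only access to the mart schema while staging data stays restricted; all names are assumptions:

```python
import snowflake.connector

GRANT_STATEMENTS = [
    "CREATE ROLE IF NOT EXISTS PTO_REPORTING",
    "GRANT USAGE ON DATABASE PTO_DW TO ROLE PTO_REPORTING",
    "GRANT USAGE ON SCHEMA PTO_DW.MART TO ROLE PTO_REPORTING",
    "GRANT SELECT ON ALL TABLES IN SCHEMA PTO_DW.MART TO ROLE PTO_REPORTING",
    # Attach the new role to the role hierarchy; staging gets no grants here.
    "GRANT ROLE PTO_REPORTING TO ROLE SYSADMIN",
]

with snowflake.connector.connect(account="example_account", user="admin_user",
                                 password="...", role="SECURITYADMIN") as conn:
    cur = conn.cursor()
    for stmt in GRANT_STATEMENTS:
        cur.execute(stmt)
```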

   • Data Quality and Consistency:

   • Implement data quality checks within Snowflake to ensure that the data loaded into tables meets the necessary standards for reporting and analytics.

   • Perform regular data audits to check for discrepancies or inconsistencies; an example audit query appears below.
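An example of what a recurring audit query might check, under the table and column names assumed above:

```python
# Surface county/year groups with null assessed values or duplicate
# parcel rows; thresholds and names are illustrative assumptions.
AUDIT_SQL = """
SELECT COUNTY_KEY,
       TAX_YEAR,
       COUNT(*)                         AS ROW_COUNT,
       COUNT_IF(ASSESSED_VALUE IS NULL) AS NULL_ASSESSED_VALUES,
       COUNT(DISTINCT PARCEL_ID)        AS DISTINCT_PARCELS
FROM PTO_DW.MART.FACT_ASSESSMENT
GROUP BY COUNTY_KEY, TAX_YEAR
HAVING COUNT_IF(ASSESSED_VALUE IS NULL) > 0
    OR COUNT(*) <> COUNT(DISTINCT PARCEL_ID)
"""
```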

Collaboration / Documentation

   • Collaboration:

   • Work closely with business stakeholders, business analysts, the data architect, and IT teams to gather requirements and provide technical solutions.

   • Provide training, support, and documentation to end users and technical teams to enable long-term maintenance and support of the solutions.

   • Regularly interface with and provide status updates to other project resources including project manager, business analysts, data warehouse architect, and key stakeholders.

   • Work diligently to ensure completion of project goals by the end of the project.

   • Documentation:

   • Document ETL processes, Snowflake schema designs, and data pipelines for future reference and team knowledge sharing.

   • Maintain clear documentation on data definitions, transformation rules, and data quality standards.

IF YOU ARE NOT WILLING TO RELOCATE TO TALLAHASSEE, FLORIDA, DO NOT APPLY FOR THIS POSITION.

Remote work is not available for this position, no exceptions. This is a 40-hour/week role based in the Tallahassee, FL area, and all work will be completed in person at the Tallahassee office.

Education/Certifications:

   • Bachelor's degree in Computer Science, Information Technology, Data Science, or related field. Master's degree preferred.

location: Tallahassee, Florida

job type: Contract

salary: $55 - $60 per hour

work hours: 8am to 5pm

education: Bachelors
