
Principal Power BI Developer

This role is for a "Principal Power BI Developer" with an unknown contract length, offering a pay rate of "$X per hour." Key skills include Power BI development, data engineering, SQL, and ETL processes. A Bachelor's degree and 5-7 years of experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
640
🗓️ - Date discovered
February 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Modeling #Data Engineering #DAX #Data Processing #Scala #Batch #Data Architecture #Microsoft Power BI #BI (Business Intelligence) #SQL (Structured Query Language) #Data Pipeline #ETL (Extract, Transform, Load) #Monitoring #Data Management #Computer Science #Visualization #Lean #Data Quality
Role description

Summary

We are seeking a highly skilled Senior Power BI Developer with a strong background in data engineering to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining robust Power BI solutions that provide actionable insights to stakeholders across the organization while also optimizing the user experience. The role also draws on data engineering expertise to streamline data pipelines, optimize data flows, and ensure data is available and reliable for analysis.

Qualifications
• Bachelor’s Degree in Business, Information Technology, Computer Science, or related field; or the equivalent combination of education and experience
• 5-7 years of experience
• Proficient in Power BI development, including data modeling and DAX calculations
• Experience with data analytics and visualization tools such as Power BI
• Experience with SQL and writing data-driven reports
• Knowledge of data warehousing theory/practice, data management practices, data models, and relationships between data elements
• Well-developed skills in Extract, Transform, and Load (ETL) processes (a minimal sketch follows this list)
• Keen to learn, develop and master new skills
• Keen attention to detail, particularly when taking direction, gathering requirements, listening to customers, documenting issues, and solving problems
• Energetic and self-motivated, with a willingness to learn and an openness to change
• Ability to work in a fast-paced, changing environment, engage with all levels of the organization, and cope with rapidly changing information
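
As a rough, non-authoritative illustration of the ETL skills listed above, the sketch below shows one possible extract-transform-load step in Python. It assumes pandas and SQLAlchemy are available; the database, table names, and columns (sales_raw, sales_monthly_summary, order_id, amount, and so on) are hypothetical and not taken from the role description.

```python
# Minimal ETL sketch (illustrative only): extract rows from a source SQL table,
# apply a simple transformation, and load the result into a reporting table.
# SQLite is used so the example is self-contained; the source table is assumed to exist.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///reporting.db")  # placeholder connection

# Extract: pull raw sales rows with a parameterized SQL query.
raw = pd.read_sql(
    text("SELECT order_id, order_date, region, amount FROM sales_raw WHERE order_date >= :start"),
    engine,
    params={"start": "2025-01-01"},
)

# Transform: clean types and aggregate to the grain the report needs.
raw["order_date"] = pd.to_datetime(raw["order_date"])
monthly = (
    raw.assign(order_month=raw["order_date"].dt.to_period("M").dt.to_timestamp())
       .groupby(["order_month", "region"], as_index=False)["amount"]
       .sum()
)

# Load: write the aggregate into a table Power BI can import or query directly.
monthly.to_sql("sales_monthly_summary", engine, if_exists="replace", index=False)
```

In practice a step like this would run on a schedule and feed a Power BI dataset via import or DirectQuery; the point here is only the extract/transform/load shape, not a specific stack.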

Responsibilities
• Design, develop, and implement interactive Power BI dashboards and reports that meet business requirements and provide actionable insights.
• Work closely with business stakeholders to understand their analytical needs and translate requirements into effective data visualizations.
• Optimize Power BI data models for performance and scalability, utilizing best practices for data modeling and DAX calculations.
• Develop and maintain enterprise reporting software in a collaborative team environment
• Drive technical improvements throughout the team and the organization
• Comprehend and implement medium to complex software architectures
• Initiate and/or participate in established change management practices and processes
• Document clean, functional, and testable requirements for Power BI reporting
• Communicate clearly and effectively
• Work independently as well as in a collaborative environment
• Adhere to lean principles and standard processes to ensure continuous improvement
• Build and maintain effective working relationships with business and technical leaders
• Collaborate with data architects and engineers to design and implement scalable data pipelines that support both batch and real-time data processing.
• Ensure data quality and integrity by implementing data validation checks, error handling mechanisms, and monitoring processes.
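
As a minimal sketch of the data-quality responsibility above, the following Python function shows one way to combine validation checks, error handling, and logging for monitoring. The schema, rules, and thresholds are assumptions made for illustration, not part of the role description.

```python
# Illustrative data-validation step for a pipeline (assumed schema and rules).
# Fails fast on integrity problems and logs a summary for monitoring.
import logging
import pandas as pd

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline.validation")

REQUIRED_COLUMNS = {"order_id", "order_date", "region", "amount"}

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Run basic quality checks; raise on hard failures, log soft warnings."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {sorted(missing)}")

    # Hard check: primary key must be unique and non-null.
    if df["order_id"].isna().any() or df["order_id"].duplicated().any():
        raise ValueError("order_id must be unique and non-null")

    # Soft check: negative amounts are suspicious but not fatal; log for monitoring.
    negative = int((df["amount"] < 0).sum())
    if negative:
        log.warning("Found %d rows with negative amount", negative)

    log.info("Validation passed for %d rows", len(df))
    return df
```

A real pipeline would route these results into whatever monitoring the team already uses; the sketch only shows the check, raise, and log pattern the bullet describes.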