
Business Intelligence Solutions Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Business Intelligence Solutions Engineer with a contract length of over 6 months, offering a competitive pay rate. Key skills include cloud platforms, big data technologies, data validation, and proficiency in SQL and Python. A bachelor's degree is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 2, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#AWS (Amazon Web Services) #Data Analysis #Compliance #Python #Batch #Data Pipeline #Storage #Big Data #Snowflake #Kafka (Apache Kafka) #BigQuery #Apache Spark #Azure #Data Governance #Quality Assurance #Data Quality #SQL (Structured Query Language) #Cloud #Data Architecture #Spark (Apache Spark) #Hadoop #GCP (Google Cloud Platform) #Data Science #Data Warehouse #Datasets #Base #BI (Business Intelligence) #Computer Science #Scala #Data Engineering #Data Access #Security #ETL (Extract, Transform, Load) #Redshift #Data Processing
Role description

Position Summary:

Blu Omega is looking for a Data and Business Intelligence (BI) Solutions Engineer to design, develop, and maintain robust data pipelines and architectures that support seamless data flow and empower decision-making. The ideal candidate will collaborate with analysts and data scientists, ensure data quality, and optimize systems for evolving organizational needs. This role requires expertise in managing large datasets, cloud platforms, and big data technologies while adhering to best practices in security, compliance, and data engineering.

Responsibilities:

The essential functions include, but are not limited to, the following:

   • Design, develop, and maintain scalable data pipelines to enable seamless data flow into data warehouses.

   • Ensure the reliability and efficiency of data pipelines for real-time, batch, and ad hoc data processing.

   • Partner with data analysts and scientists to understand and meet data requirements for analysis and modeling.

   • Translate organization needs into technical data solutions.

   • Build and optimize data architectures for efficient storage and transfer of structured and unstructured data.

   • Implement solutions that enhance data accessibility and performance.

   • Perform data validation and cleansing to ensure accuracy, consistency, and reliability.

   • Establish processes to identify and resolve data discrepancies.

   • Monitor data pipelines and infrastructure for performance and availability.

   • Diagnose and resolve issues promptly to minimize disruption.

   • Adhere to data engineering best practices, including security, compliance, and governance standards.

   • Ensure data systems comply with organizational and government regulatory requirements.

   • Document data processes, workflows, and system designs to maintain a clear knowledge base.

   • Provide regular updates and reports to stakeholders on system performance and improvements.

   • Proactively enhance workflows, infrastructure, and technologies to meet evolving organization needs.

   • Stay up-to-date with advancements in data engineering and big data technologies.

   • Manage large datasets using cloud-based platforms (e.g., AWS, Azure, GCP) and big data tools (e.g., Hadoop, Spark).

   • Optimize costs and performance for cloud-based data solutions.
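
As a hedged illustration of the data validation and cleansing responsibilities above, here is a minimal Python sketch of the kind of check a pipeline might apply before loading records; the field names and rules are assumptions for illustration, not part of the posting:

```python
# Minimal data-validation sketch: separate clean records from discrepancies
# before they flow into a data warehouse. Field names ("id", "amount") and
# rules are illustrative assumptions.

def validate_rows(rows, required=("id", "amount")):
    """Split rows into clean records and discrepancies for later resolution."""
    clean, discrepancies = [], []
    for row in rows:
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            discrepancies.append((row, f"missing fields: {missing}"))
            continue
        try:
            row["amount"] = float(row["amount"])  # normalize the numeric field
        except (TypeError, ValueError):
            discrepancies.append((row, "non-numeric amount"))
            continue
        clean.append(row)
    return clean, discrepancies

clean, bad = validate_rows([
    {"id": "1", "amount": "10.5"},   # valid
    {"id": "", "amount": "7"},       # missing id
    {"id": "3", "amount": "oops"},   # non-numeric amount
])
```

In practice this logic would run inside an ETL framework or as SQL constraints, with the discrepancy list feeding the issue-resolution process the role describes.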

Minimum Qualifications

   • Eight (8) years of experience in Data Analysis, Business Intelligence, or a related role on programs and contracts of similar scope, type, and complexity is required.

   • Takes initiative; detail-oriented; a problem solver; willing to ask questions; a clear communicator

   • Proven experience in designing and maintaining data pipelines and architectures

   • Hands-on experience with cloud-based platforms and big data technologies

   • Strong background in data validation, cleansing, and quality assurance

   • Proficiency in data processing tools and languages (e.g., SQL, Python, Apache Spark, or Kafka)

   • Experience with ETL tools and processes

   • Strong understanding of data warehousing concepts and platforms (e.g., Snowflake, Redshift, BigQuery)

   • Familiarity with data governance, security, and compliance standards

   • Excellent problem-solving and troubleshooting skills

   • High emotional intelligence and sensitivity

   • Excellent communication, interpersonal, and mentoring skills

   • Able to analyze, think quickly and resolve conflict

   • Able to adapt to a changing environment

   • Skilled at supporting multiple clients at a time

   • Self-motivated and able to maintain a professional, positive, “can-do” attitude while staying focused amid distractions

   • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field