Power BI Integration

Import Haltian IoT Parquet data into Azure and visualize it in Microsoft Power BI

This integration copies Parquet files from the Haltian IoT Data API S3 bucket into Microsoft Azure and connects them to Power BI for analytics and dashboards. The infrastructure is fully automated with Terraform.

Overview

Haltian IoT sensors collect workplace data — occupancy, environment, movement — and export it as Parquet files via the Data API. This integration brings that data into Microsoft Power BI so your team can build dashboards, reports, and analytics without any custom development.

What it does

An automated pipeline copies sensor data from Haltian’s cloud into your Azure environment on a configurable schedule (default: every 15 minutes). Power BI connects directly to the data for real-time dashboards.
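The schedule is an Azure Functions NCRONTAB expression with six fields (second, minute, hour, day, month, day-of-week). As an illustration, the default 15-minute cadence could be expressed as follows — the exact Terraform variable that carries the schedule is listed in the Configuration Reference:

```
0 */15 * * * *    # fires at minute 0, 15, 30, and 45 of every hour
```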

Setup at a glance

| Step | What |
| --- | --- |
| 1 | Haltian enables Data API access and provides S3 credentials |
| 2 | Deploy Azure infrastructure (automated with Terraform) |
| 3 | Deploy the copy function (automated with Terraform) |
| 4 | Connect Power BI and build dashboards |

Architecture

A timer-triggered Azure Function App copies new Parquet files from S3 on a configurable schedule. You choose one of two Azure destinations.

```mermaid
%%{init: {'theme': 'base', 'themeVariables': {'primaryColor': '#F6FAFA', 'primaryTextColor': '#143633', 'primaryBorderColor': '#143633', 'lineColor': '#143633', 'secondaryColor': '#C7FDE6', 'tertiaryColor': '#73F9C1', 'clusterBkg': '#ffffff', 'clusterBorder': '#143633', 'edgeLabelBackground': '#ffffff'}}}%%
flowchart TB
    subgraph South["Haltian IoT"]
        S3["fa:fa-database S3 Bucket<br/>Parquet Files"]
    end

    subgraph Middle["Customer Azure"]
        FUNC["fa:fa-cogs Azure Function App<br/>Python · Timer Trigger"]

        subgraph Dest["Destination (choose one)"]
            ONELAKE["fa:fa-layer-group Microsoft Fabric<br/>OneLake Lakehouse"]
            STORAGE["fa:fa-hdd Azure Storage Account<br/>Blob Storage"]
        end
    end

    subgraph North["Analytics"]
        PBI["fa:fa-chart-bar Power BI<br/>Dashboards & Reports"]
    end

    S3 -->|"List & Get Objects"| FUNC
    FUNC -->|"DFS API"| ONELAKE
    FUNC -->|"Blob Service"| STORAGE
    ONELAKE --> PBI
    STORAGE --> PBI
```

Destination Options

| Option | Best For |
| --- | --- |
| Microsoft Fabric OneLake | Enterprise analytics with built-in Spark, SQL endpoint |
| Azure Storage Account | Simple file landing, custom pipelines |

See Microsoft Fabric pricing and Azure Storage pricing for current costs.

Source Files

All Terraform modules and the Azure Function source code are available as downloadable files. Each deployment guide below includes a download table with direct links to every file:

| Module | Files | Download From |
| --- | --- | --- |
| infra/onelake | 8 .tf files | Deploy OneLake |
| infra/storageaccount | 5 .tf files | Deploy Storage Account |
| azure-function/terraform | 5 .tf files | Deploy Function App |
| azure-function (Python) | 10 files | Deploy Function App |

Prerequisites

  • Terraform ≥ 1.5.0 — install guide
  • Azure CLI (az) — installed and authenticated (az login)
  • Azure subscription with Contributor permissions
  • S3 credentials from Haltian (access key + secret, or bring-your-own IAM role)
  • Python 3.10+ (optional, for local Function App development)

Additional prerequisites depend on your chosen destination — see the deployment guides below.

Deployment Guides

Follow these guides in order to deploy the complete integration:

| Step | Guide | Description |
| --- | --- | --- |
| 1 | Deploy OneLake Infrastructure | Fabric Capacity, Workspace, Lakehouse, AAD app |
| 1 | *or* Deploy Storage Account | Azure Storage with blob containers |
| 2 | Deploy Function App | Timer-triggered S3→Azure copy function |
| 3 | Connect Power BI | Load data into Power BI Desktop and create reports |

S3 Access Credentials

Haltian provides two options for S3 bucket access:

| Method | Setup | Best For |
| --- | --- | --- |
| Access Key + Secret | Haltian provides credentials directly | Quick setup |
| Bring-your-own IAM Role | You provide an ARN; Haltian configures bucket policy | Enterprise security policies |

Data Flow

Once deployed, the Azure Function runs on a configurable schedule (default: every 15 minutes):

  1. List Objects — queries S3 for Parquet files within the configured time range
  2. Download — retrieves each file from S3
  3. Authenticate — uses MSAL client credentials (OneLake) or connection string (Storage Account)
  4. Upload — writes to OneLake via DFS API or to Blob Storage
  5. Log — records activity in Application Insights
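Step 1's time-range filter can be sketched as pure functions — the five-minute overlap and all names here are assumptions for illustration, not the deployed function's actual logic:

```python
from datetime import datetime, timedelta, timezone


def window_start(now: datetime, interval_min: int = 15, overlap_min: int = 5) -> datetime:
    """Start of the S3 listing window: one schedule interval plus a small
    overlap, so files landing near a run boundary are not missed."""
    return now - timedelta(minutes=interval_min + overlap_min)


def is_new_parquet(key: str, last_modified: datetime, start: datetime) -> bool:
    """Keep only Parquet objects modified on or after the window start."""
    return key.endswith(".parquet") and last_modified >= start


# Example: at 12:00 UTC a run would list objects modified since 11:40
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
start = window_start(now)
```

An overlap means some files appear in two consecutive runs; letting the upload step overwrite existing blobs keeps the copy idempotent.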

Reference


Deploy OneLake Infrastructure

Provision Microsoft Fabric Capacity, Workspace, Lakehouse, and Azure AD application for OneLake access using Terraform or Azure Portal.

Deploy Storage Account

Provision an Azure Storage Account as a data landing zone for Haltian IoT Parquet files using Terraform or Azure Portal.

Deploy Function App

Deploy the Azure Function App that copies Parquet files from the Haltian IoT S3 bucket to OneLake or Azure Storage Account.

Connect Power BI

Load Haltian IoT Parquet data into Power BI Desktop from OneLake or Azure Blob Storage and create visualizations.

Configuration Reference

Complete Terraform variable reference for all Power BI integration modules — OneLake, Storage Account, and Function App.

Troubleshooting

Common issues and solutions for the Power BI integration — Azure Function, OneLake, Storage Account, and Power BI connectivity.