Deploy Storage Account

Provision an Azure Storage Account as a data landing zone for Haltian IoT Parquet files, using Terraform or the Azure Portal.

This guide deploys an Azure Storage Account as the destination for Parquet files from the Haltian IoT Data API. This is a simpler and more cost-effective alternative to OneLake. After completing this guide, proceed to Deploy Function App.

Download Source Files

Download all Terraform files for this module:

| File | Description |
| --- | --- |
| main.tf | Storage account, containers, access policies |
| variables.tf | Input variables |
| outputs.tf | Module outputs |
| output-function-config-template.tf | Auto-generate Function App tfvars |
| providers.tf | Provider requirements |

Place all files in a directory (e.g., infra/storageaccount/) before running Terraform.

What Gets Created

The infra/storageaccount Terraform module provisions:

| Resource | Purpose |
| --- | --- |
| Resource Group | Container for Azure resources (optional) |
| Storage Account | ADLS Gen2-compatible blob storage with TLS 1.2 |
| Blob Container | Private container for incoming Parquet files |
%%{init: {'theme': 'base', 'themeVariables': {'primaryColor': '#F6FAFA', 'primaryTextColor': '#143633', 'primaryBorderColor': '#143633', 'lineColor': '#143633', 'secondaryColor': '#C7FDE6', 'tertiaryColor': '#73F9C1', 'clusterBkg': '#ffffff', 'clusterBorder': '#143633', 'edgeLabelBackground': '#ffffff'}}}%%
flowchart TB
    subgraph Azure["Azure Subscription"]
        RG["fa:fa-folder Resource Group"]

        subgraph Storage["Storage Account"]
            SA["fa:fa-hdd Standard LRS<br/>TLS 1.2+ · Private Access"]
            Container["fa:fa-box Blob Container<br/>incoming"]
        end
    end

    subgraph Access["Access Methods"]
        ConnStr["Connection String"]
        MSI["Managed Identity (RBAC)"]
    end

    RG --> SA
    SA --> Container
    SA --> ConnStr
    SA --> MSI
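The heart of the module can be sketched in Terraform as follows. This is an illustrative fragment, not the module's actual code — resource names, the resource group reference, and hard-coded values stand in for what main.tf and variables.tf define:

```hcl
# Sketch only — assumes the azurerm provider (v3.x attribute names).
resource "azurerm_storage_account" "landing" {
  name                     = "s3storagedata" # must be globally unique
  resource_group_name      = azurerm_resource_group.this.name
  location                 = "westeurope"
  account_tier             = "Standard"
  account_replication_type = "LRS"
  min_tls_version          = "TLS1_2"
  is_hns_enabled           = true # ADLS Gen2 hierarchical namespace
}

resource "azurerm_storage_container" "incoming" {
  name                  = "incoming"
  storage_account_name  = azurerm_storage_account.landing.name
  container_access_type = "private"
}
```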

Prerequisites

  • Terraform ≥ 1.5.0
  • Azure CLI authenticated (az login)
  • Subscription Contributor or Storage Account Contributor role

No Fabric capacity or Global Admin required — this path is significantly simpler than OneLake.
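Before deploying, you can check that your Terraform binary meets the ≥ 1.5.0 requirement with a simple `sort -V` comparison. A minimal sketch — `version_ok` is an illustrative helper name, not part of the module:

```shell
# Succeeds when the given version string is >= the 1.5.0 minimum,
# using sort -V (version sort) to order the two versions.
version_ok() {
  [ "$(printf '%s\n%s\n' "1.5.0" "$1" | sort -V | head -n1)" = "1.5.0" ]
}

version_ok "1.7.5" && echo "ok"         # → ok
version_ok "1.4.6" || echo "too old"    # → too old
```

In practice you would feed it the version reported by `terraform version`.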

Deployment with Terraform

Step 1: Gather Azure Information

# Get your subscription ID
az account show --query id -o tsv

# Get your tenant ID
az account show --query tenantId -o tsv

Step 2: Configure terraform.tfvars

Navigate to the infra/storageaccount directory and create terraform.tfvars:

# Azure Subscription Settings
subscription_id        = "YOUR_SUBSCRIPTION_ID"
fabric_tenant_id       = "YOUR_TENANT_ID"

# Resource Group
resource_group_name    = "rg-storage-demo"
create_resource_group  = true
location               = "westeurope"

# Naming prefix
prefix                 = "s3storage"

# Storage Account Configuration
storage_use_existing              = false
upload_storage_container_name     = "incoming"
upload_subpath                    = "upload/"

# Optional: Grant Power BI Desktop users read access
# blob_readers_email = ["analyst@yourorganization.com"]
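The two IDs from Step 1 can also be templated into a separate `*.auto.tfvars` file, which Terraform loads automatically alongside terraform.tfvars. A sketch — the IDs below are placeholders for the values returned by `az account show`, and the filename is just an example:

```shell
# Sketch: write the subscription/tenant IDs to ids.auto.tfvars so they
# stay out of the main terraform.tfvars. Placeholder values shown.
SUB_ID="00000000-0000-0000-0000-000000000000"
TENANT_ID="00000000-0000-0000-0000-000000000000"

cat > ids.auto.tfvars <<EOF
subscription_id  = "${SUB_ID}"
fabric_tenant_id = "${TENANT_ID}"
EOF

grep -c '=' ids.auto.tfvars    # → 2
```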

Step 3: Deploy

cd infra/storageaccount

terraform init
terraform plan
terraform apply

Deployment takes approximately 2–5 minutes.

Step 4: Capture Outputs for Function App

After deployment, generate a ready-to-use terraform.tfvars for the Function App module:

terraform output function_app_template_tfvars \
  | grep -v EOT \
  > ../../azure-function/terraform/terraform.tfvars

Then edit the generated file to add your S3 credentials.
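The `grep -v EOT` filter is there because Terraform prints multi-line string outputs wrapped in heredoc markers (`<<EOT` … `EOT`); the filter drops the wrapper lines and keeps the payload. A minimal illustration:

```shell
# Terraform renders multi-line outputs between <<EOT ... EOT markers;
# grep -v EOT removes those wrapper lines and passes the content through.
printf '<<EOT\nsubscription_id = "abc"\nEOT\n' | grep -v EOT
# → subscription_id = "abc"
```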

Key Outputs

# View all outputs
terraform output

# Get storage details
terraform output storage_account_name
terraform output storage_account_url
terraform output storage_container_name

# Get connection string (sensitive)
terraform output -raw storage_connection_string

Deployment via Azure Portal

1. Create Resource Group

  1. In Azure Portal, click + Create a resource → search Resource Group
  2. Configure:
    • Resource group: rg-storage-demo
    • Region: West Europe
  3. Click Review + create → Create

2. Create Storage Account

  1. Click + Create a resource → search Storage account
  2. Configure:
    • Resource group: rg-storage-demo
    • Storage account name: a globally unique name (e.g., s3storagedata)
    • Region: West Europe
    • Performance: Standard
    • Redundancy: Locally-redundant storage (LRS)
  3. On the Advanced tab, keep defaults (TLS 1.2)
  4. Click Review + create → Create

3. Create Container

  1. In your storage account, click Containers under Data storage
  2. Click + Container
  3. Name: incoming, access level: Private
  4. Click Create

4. Note the Connection Details

  1. Connection String: Go to Access keys → click Show → copy Connection string
  2. Storage URL: From Overview, copy the Blob service endpoint URL
  3. Container name: incoming

You’ll need these values when deploying the Function App.
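The connection string you copied is a semicolon-separated list of key=value pairs. A quick way to sanity-check which account it belongs to — the AccountKey below is a dummy placeholder, never a real credential:

```shell
# A connection string is a ;-separated list of key=value pairs.
# The AccountKey here is a dummy placeholder, not a real secret.
CONN='DefaultEndpointsProtocol=https;AccountName=s3storagedata;AccountKey=PLACEHOLDER;EndpointSuffix=core.windows.net'

# Extract the account name to confirm you copied the right account:
echo "$CONN" | tr ';' '\n' | grep '^AccountName=' | cut -d= -f2
# → s3storagedata
```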

Storage Account Naming Rules

Azure Storage Account names must be:

  • Globally unique across all of Azure
  • 3–24 characters long
  • Lowercase letters and numbers only (no hyphens)

The Terraform module automatically derives a compliant name from the prefix.
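That sanitization can be sketched as follows. The logic is assumed for illustration — the module's own implementation may differ, and this sketch does not enforce the 3-character minimum:

```shell
# Sketch of storage-account-name sanitization: lowercase everything,
# drop anything that is not a-z or 0-9, truncate to the 24-char maximum.
sanitize() {
  printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | tr -cd 'a-z0-9' | cut -c1-24
}

sanitize "S3-Storage_Demo"    # → s3storagedemo
```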

Cost Estimates

Azure Storage is extremely cost-effective:

| Component | Approximate Cost |
| --- | --- |
| Storage (100 GB, Hot tier) | ~€2/month |
| Transactions (1M operations) | ~€0.40/month |
| Total | ~€3–5/month |

Compare this to the OneLake path, which starts at ~€240/month for Fabric capacity. A Storage Account is ideal for straightforward file landing and Power BI connectivity.

Next Steps

Deploy Function App to set up automated S3-to-Storage data transfer