Deploy Storage Account
This guide deploys an Azure Storage Account as the destination for Parquet files from the Haltian IoT Data API. This is a simpler and more cost-effective alternative to OneLake. After completing this guide, proceed to Deploy Function App.
Download Source Files
Download all Terraform files for this module:
| File | Description |
|---|---|
| main.tf | Storage account, containers, access policies |
| variables.tf | Input variables |
| outputs.tf | Module outputs |
| output-function-config-template.tf | Auto-generate Function App tfvars |
| providers.tf | Provider requirements |
Place all files in a directory (e.g., infra/storageaccount/) before running Terraform.
What Gets Created
The infra/storageaccount Terraform module provisions:
| Resource | Purpose |
|---|---|
| Resource Group | Container for Azure resources (optional) |
| Storage Account | ADLS Gen2-compatible blob storage with TLS 1.2 |
| Blob Container | Private container for incoming Parquet files |
```mermaid
%%{init: {'theme': 'base', 'themeVariables': {'primaryColor': '#F6FAFA', 'primaryTextColor': '#143633', 'primaryBorderColor': '#143633', 'lineColor': '#143633', 'secondaryColor': '#C7FDE6', 'tertiaryColor': '#73F9C1', 'clusterBkg': '#ffffff', 'clusterBorder': '#143633', 'edgeLabelBackground': '#ffffff'}}}%%
flowchart TB
    subgraph Azure["Azure Subscription"]
        RG["fa:fa-folder Resource Group"]
        subgraph Storage["Storage Account"]
            SA["fa:fa-hdd Standard LRS<br/>TLS 1.2+ · Private Access"]
            Container["fa:fa-box Blob Container<br/>incoming"]
        end
    end
    subgraph Access["Access Methods"]
        ConnStr["Connection String"]
        MSI["Managed Identity (RBAC)"]
    end
    RG --> SA
    SA --> Container
    SA --> ConnStr
    SA --> MSI
```
Prerequisites
- Terraform ≥ 1.5.0
- Azure CLI authenticated (`az login`)
- Subscription Contributor or Storage Account Contributor role
No Fabric capacity or Global Admin required — this path is significantly simpler than OneLake.
Deployment with Terraform
Step 1: Gather Azure Information
```bash
# Get your subscription ID
az account show --query id -o tsv

# Get your tenant ID
az account show --query tenantId -o tsv
```
Step 2: Configure terraform.tfvars
Navigate to the infra/storageaccount directory and create terraform.tfvars:
```hcl
# Azure Subscription Settings
subscription_id  = "YOUR_SUBSCRIPTION_ID"
fabric_tenant_id = "YOUR_TENANT_ID"

# Resource Group
resource_group_name   = "rg-storage-demo"
create_resource_group = true
location              = "westeurope"

# Naming prefix
prefix = "s3storage"

# Storage Account Configuration
storage_use_existing          = false
upload_storage_container_name = "incoming"
upload_subpath                = "upload/"

# Optional: Grant Power BI Desktop users read access
# blob_readers_email = ["analyst@yourorganization.com"]
```
Use blob_readers_email to grant specific users Storage Blob Data Reader access. This is required for Power BI Desktop to connect to the storage account.
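Conceptually, each address in `blob_readers_email` maps to one RBAC role assignment on the storage account. A sketch of the equivalent Terraform is shown below; the resource and data source names here are illustrative assumptions, not the module's actual identifiers:

```hcl
# Illustrative only — the module's actual resource names may differ.
# Look up each reader's Entra ID object by email address.
data "azuread_user" "readers" {
  for_each            = toset(var.blob_readers_email)
  user_principal_name = each.value
}

# Grant each reader the Storage Blob Data Reader role on the account.
resource "azurerm_role_assignment" "blob_readers" {
  for_each             = data.azuread_user.readers
  scope                = azurerm_storage_account.this.id
  role_definition_name = "Storage Blob Data Reader"
  principal_id         = each.value.object_id
}
```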
Step 3: Deploy
```bash
cd infra/storageaccount
terraform init
terraform plan
terraform apply
```
Deployment takes approximately 2–5 minutes.
Step 4: Capture Outputs for Function App
After deployment, generate a ready-to-use terraform.tfvars for the Function App module:
```bash
terraform output function_app_template_tfvars \
  | grep -v EOT \
  > ../../azure-function/terraform/terraform.tfvars
```
Then edit the generated file to add your S3 credentials.
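Terraform renders multi-line outputs as heredocs bracketed by `EOT` markers, which is why the pipeline above filters with `grep -v EOT`. A quick simulation of that filtering on a stand-in heredoc (the value shown is a placeholder, not a real output):

```bash
# Simulate a terraform heredoc output and strip the EOT marker lines
sample='<<EOT
storage_connection_string = "REPLACE_ME"
EOT'
printf '%s\n' "$sample" | grep -v EOT
# → storage_connection_string = "REPLACE_ME"
```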
Key Outputs
```bash
# View all outputs
terraform output

# Get storage details
terraform output storage_account_name
terraform output storage_account_url
terraform output storage_container_name

# Get connection string (sensitive)
terraform output -raw storage_connection_string
```
Deployment via Azure Portal
1. Create Resource Group
- In the Azure Portal, click + Create a resource → search for Resource group
- Configure:
  - Resource group: `rg-storage-demo`
  - Region: West Europe
- Click Review + create → Create
2. Create Storage Account
- Click + Create a resource → search for Storage account
- Configure:
  - Resource group: `rg-storage-demo`
  - Storage account name: a globally unique name (e.g., `s3storagedata`)
  - Region: West Europe
  - Performance: Standard
  - Redundancy: Locally-redundant storage (LRS)
- On the Advanced tab, keep the defaults (minimum TLS version 1.2)
- Click Review + create → Create
3. Create Container
- In your storage account, click Containers under Data storage
- Click + Container
- Name: `incoming`, access level: Private
- Click Create
4. Note the Connection Details
- Connection String: Go to Access keys → click Show → copy Connection string
- Storage URL: From Overview, copy the Blob service endpoint URL
- Container name: `incoming`
You’ll need these values when deploying the Function App.
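If you deployed via the Portal rather than Terraform, no template file is generated, so you will enter these values into the Function App configuration by hand. A hypothetical sketch of that `terraform.tfvars` fragment follows; the variable names are assumptions, so confirm them against the Function App module's `variables.tf`:

```hcl
# Variable names below are illustrative — confirm against variables.tf.
# AccountKey is redacted here; paste the full connection string from Access keys.
storage_connection_string     = "DefaultEndpointsProtocol=https;AccountName=s3storagedata;AccountKey=...;EndpointSuffix=core.windows.net"
upload_storage_container_name = "incoming"
```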
Storage Account Naming Rules
Azure Storage Account names must be:
- Globally unique across all of Azure
- 3–24 characters long
- Lowercase letters and numbers only (no hyphens)
The Terraform module automatically sanitizes names from the prefix.
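The sanitization can be approximated in shell: lowercase everything, drop any character that is not a letter or digit, and truncate to 24 characters. This is a sketch of the rules above, not the module's exact logic:

```bash
# Approximate the naming rules: lowercase, alphanumeric only, max 24 chars
prefix="S3-Storage_Demo"
name=$(printf '%s' "$prefix" | tr '[:upper:]' '[:lower:]' | tr -cd 'a-z0-9' | cut -c1-24)
echo "$name"   # → s3storagedemo
```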
Cost Estimates
Azure Storage is extremely cost-effective:
| Component | Approximate Cost |
|---|---|
| Storage (100 GB, Hot tier) | ~€2/month |
| Transactions (1M operations) | ~€0.40/month |
| Total | ~€3–5/month |
Compare this to the OneLake path which starts at ~€240/month for Fabric capacity. Storage Account is ideal for straightforward file landing and Power BI connectivity.
Next Steps
→ Deploy Function App to set up automated S3-to-Storage data transfer