Troubleshooting

Common issues and solutions for the Power BI integration — Azure Function, OneLake, Storage Account, and Power BI connectivity.

Azure Function App

Function not triggering on schedule

Symptoms: Function doesn’t execute automatically.

Solutions:

  1. Verify the Function App is started (not stopped): Azure Portal → Function App → Overview → Start
  2. Check COPY_PARQUET_SCHEDULE is set correctly in Application Settings
  3. Verify CRON syntax includes the seconds field (6 parts, not 5): 0 */15 * * * *
  4. Check Application Insights for startup errors
  5. Restart the Function App:
    az functionapp restart --name <function-app-name> --resource-group <rg-name>
    
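The 5-field vs 6-field mistake in step 3 can be caught locally with a quick field count (a minimal sketch; the schedule string is the example value from above):

```shell
# Count the fields in an NCRONTAB expression. Azure Functions timer
# triggers use 6 fields (seconds first), not the classic 5-field cron.
SCHEDULE="0 */15 * * * *"
FIELDS=$(echo "$SCHEDULE" | wc -w)
if [ "$FIELDS" -eq 6 ]; then
  echo "OK: $FIELDS fields (NCRONTAB with seconds)"
else
  echo "ERROR: $FIELDS fields -- Azure Functions expects 6"
fi
```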

Function App fails to start

Symptoms: The app deploys, but it doesn't execute or its functions don't appear in the portal.

Solutions:

  1. Check logs: Function App → Log Stream
  2. Verify requirements.txt dependencies can be installed
  3. Verify host.json and function_app.py syntax
  4. The SCM_DO_BUILD_DURING_DEPLOYMENT setting should be true (set by Terraform)
  5. If ZIP deploy hangs, deploy manually:
    az functionapp deployment source config-zip \
      --resource-group <rg-name> \
      --name <function-app-name> \
      --src functionapp.zip
    

S3 access denied (403)

Symptoms: Function logs show AWS authorization errors.

Solutions:

  1. Verify S3_ACCESS_KEY_ID and S3_SECRET_ACCESS_KEY are correct
  2. Ensure the IAM policy allows s3:ListBucket and s3:GetObject
  3. Check S3 bucket name and region match the configuration
  4. Contact Haltian to verify credentials are active
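Before digging into the IAM policy, it can help to confirm the two credential settings are actually present and non-empty (a sketch; the `require_env` helper and the sample values are illustrative, not part of the integration):

```shell
# Fail fast if a required setting is missing or empty.
require_env() {
  if [ -z "$(eval echo "\$$1")" ]; then
    echo "MISSING: $1"; return 1
  fi
  echo "set: $1"
}

S3_ACCESS_KEY_ID="AKIAEXAMPLE"        # placeholder for illustration
S3_SECRET_ACCESS_KEY="example-secret" # placeholder for illustration
require_env S3_ACCESS_KEY_ID
require_env S3_SECRET_ACCESS_KEY
# Next step (needs the AWS CLI and real values):
#   aws s3 ls s3://<bucket-name> --region <region>
```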

Out of memory or timeout

Symptoms: Function fails with timeout or memory errors.

Solutions:

  1. Reduce measurements_time_range_days to fetch fewer files per run
  2. Increase timeout in host.json (max 10 minutes for Consumption plan):
    { "functionTimeout": "00:10:00" }
    
  3. Consider upgrading to Premium Plan (EP1+) for more resources

OneLake Issues

AADSTS500011: Resource not found

Symptoms: Token request fails with “The resource principal named {resource} was not found in the tenant.”

Cause: Using incorrect resource/audience values in token requests. This is the most common OneLake authentication error.

Solutions:

  1. Verify the custom OneLake app was created:
    terraform output onelake_app_client_id
    
  2. Ensure you’re using the correct client ID and secret from the infra/onelake module
  3. The token should be requested for scope: https://onelake.dfs.fabric.microsoft.com/.default
  4. Wait 5–10 minutes for app registration to propagate
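For reference, the client-credentials token request looks roughly like this (all IDs are placeholders; the detail that matters for AADSTS500011 is the scope value):

```shell
TENANT_ID="<tenant-id>"         # placeholders -- replace before running
CLIENT_ID="<client-id>"
CLIENT_SECRET="<client-secret>"
SCOPE="https://onelake.dfs.fabric.microsoft.com/.default"
TOKEN_URL="https://login.microsoftonline.com/${TENANT_ID}/oauth2/v2.0/token"

# Request a token for the OneLake audience; a wrong scope here (e.g. a
# Graph or Storage audience) is what produces AADSTS500011.
curl -s --max-time 10 -X POST "$TOKEN_URL" \
  -d "grant_type=client_credentials" \
  -d "client_id=${CLIENT_ID}" \
  -d "client_secret=${CLIENT_SECRET}" \
  -d "scope=${SCOPE}" || echo "token request failed (placeholders in use?)"
```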

OneLake upload fails (403 Forbidden)

Symptoms: Function authenticates but can’t write to Lakehouse.

Solutions:

  1. Get the Function App’s managed identity principal ID:
    terraform output function_app_identity_principal_id
    
  2. In Fabric Portal, go to your workspace → Settings → Manage access
  3. Add the principal ID with Contributor role
  4. Wait 5–15 minutes for permissions to propagate
  5. Verify Fabric capacity is running (not paused)

Fabric workspace not found

Symptoms: Cannot access workspace after deployment.

Solutions:

  1. Verify workspace exists:
    terraform output fabric_workspace_id
    
  2. Check Fabric Portal directly
  3. Ensure the Fabric capacity is running (not paused)
  4. Wait 5–10 minutes for propagation after creation

Insufficient privileges for directory roles

Symptoms: terraform apply fails on role assignment.

Solutions:

  1. Set assign_directory_roles = false and assign_graph_permissions = false in terraform.tfvars
  2. Deploy with Terraform
  3. Have a Global Admin manually grant consent: Azure AD → App registrations → Your app → API permissions → Grant admin consent

Storage Account Issues

Storage account name not available

Symptoms: “The storage account name is already taken.”

Solutions:

  1. Choose a different, globally unique name
  2. Set upload_storage_account_name explicitly in terraform.tfvars
  3. Or use an existing storage account: storage_use_existing = true
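A name that fails the format rules will also report as unavailable, so it is worth checking the format locally first (a sketch; the candidate name is a placeholder, and global uniqueness still has to be verified, e.g. with `az storage account check-name`):

```shell
# Storage account names: 3-24 characters, lowercase letters and digits only.
NAME="mystorageacct01"   # candidate name (placeholder)
if echo "$NAME" | grep -Eq '^[a-z0-9]{3,24}$'; then
  echo "format OK: $NAME"
else
  echo "invalid format: $NAME"
fi
```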

Access denied when uploading (403)

Symptoms: Function can’t write to storage account.

Solutions:

  1. Using connection string: Verify STORAGE_CONNECTION_STRING is correct in app settings
  2. Using managed identity: Assign the “Storage Blob Data Contributor” role:
    az role assignment create \
      --role "Storage Blob Data Contributor" \
      --assignee $(cd azure-function/terraform && terraform output -raw function_app_identity_principal_id) \
      --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>
    
  3. Wait 5–10 minutes for RBAC propagation

Container not found

Symptoms: Application reports container doesn’t exist.

Solutions:

  1. Verify:
    terraform output storage_container_name
    
  2. Check in Azure Portal: Storage Account → Containers
  3. Ensure the container name in Function App settings matches

Power BI Issues

Can’t connect to Storage Account from Power BI Desktop

Symptoms: Power BI shows authentication or access errors.

Solutions:

  1. Ensure you’re signed in to Azure in Power BI Desktop (top-right corner)
  2. Verify your user has Storage Blob Data Reader role on the storage account
  3. In Terraform, use the blob_readers_email variable to grant access:
    blob_readers_email = ["your-email@company.com"]
    
  4. If using firewall rules, ensure your IP is allowed

Can’t connect to Lakehouse from Power BI Desktop

Symptoms: Lakehouse doesn’t appear in Power BI data sources.

Solutions:

  1. Verify you have access to the Fabric workspace
  2. Ensure the Fabric capacity is running (not paused)
  3. Try using Get Data → Microsoft Fabric → Lakehouses instead of OneLake data hub
  4. Check you’re signed in with the correct Azure account

No data appears after loading

Symptoms: Tables are empty in Power BI.

Solutions:

  1. Verify the Azure Function has executed successfully (check Application Insights)
  2. For Storage Account: check files exist in the container using Azure Portal or CLI
  3. For OneLake: check files exist in the Lakehouse Files section
  4. In Power Query, verify the Text.Contains filter matches your measurement type names
  5. Click Refresh in Power BI Desktop to reload data

Power Query errors on Parquet files

Symptoms: “Expression.Error” or data type conversion failures.

Solutions:

  1. Check column names match your actual Parquet schema
  2. Verify the storage account name in Power Query M code is correct
  3. Ensure Parquet files are not corrupted — download one manually and verify
  4. If column types differ, adjust the Table.TransformColumnTypes step
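For step 4, the type-conversion step in the Advanced Editor looks roughly like this (column names and types here are illustrative; match them to your actual Parquet schema):

```
= Table.TransformColumnTypes(Source, {
    {"timestamp", type datetimezone},
    {"deviceId", type text},
    {"value", type number}
})
```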

Terraform Issues

Provider initialization fails

Symptoms: terraform init fails to download providers.

Solutions:

  1. Verify internet connectivity
  2. Check Terraform version: terraform -version (must be ≥ 1.5.0)
  3. Clear the provider cache and re-initialize: rm -rf .terraform && terraform init

State lock errors

Symptoms: Terraform can’t acquire state lock.

Solutions:

  1. If a previous run was interrupted: terraform force-unlock <lock-id>
  2. Ensure no other Terraform process is running against the same state
  3. Consider using a remote backend for team environments
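For step 3, a remote backend for team use can be declared like this (a sketch; the resource group, storage account, and container names are placeholders that must already exist):

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "tfstate-rg"       # placeholder
    storage_account_name = "tfstatestorage"   # placeholder
    container_name       = "tfstate"          # placeholder
    key                  = "powerbi-integration.tfstate"
  }
}
```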

Destroy fails with dependency errors

Symptoms: terraform destroy fails with resource dependency issues.

Solutions:

  1. Always destroy Function App first, then infrastructure:
    cd azure-function/terraform && terraform destroy
    cd ../../infra/onelake && terraform destroy        # or infra/storageaccount
    
  2. If individual resources fail, use targeted destroy:
    terraform destroy -target=azurerm_linux_function_app.func
    

Verification Commands

Use these commands to quickly check the health of your deployment:

# Check Function App status
FUNC=$(cd azure-function/terraform && terraform output -raw function_app_name)
RG=$(cd azure-function/terraform && terraform output -raw resource_group_name)
az functionapp show --name $FUNC --resource-group $RG --query state -o tsv

# View recent function executions
az monitor app-insights query \
  --app $(cd azure-function/terraform && terraform output -raw application_insights_app_id) \
  --analytics-query "traces | where timestamp > ago(1h) | order by timestamp desc | take 10" \
  --output table

# List files in Storage Account
az storage blob list \
  --account-name $(cd infra/storageaccount && terraform output -raw storage_account_name) \
  --container-name incoming \
  --auth-mode login \
  --output table

# Check OneLake files (via Fabric Portal link)
cd infra/onelake
WORKSPACE=$(terraform output -raw fabric_workspace_id)
LAKEHOUSE=$(terraform output -raw fabric_lakehouse_id)
echo "https://app.fabric.microsoft.com/groups/$WORKSPACE/lakehouses/$LAKEHOUSE"

Enable Debug Logging

For detailed troubleshooting, enable DEBUG level logging:

Via Terraform:

# In azure-function/terraform/terraform.tfvars
log_level = "DEBUG"

Then terraform apply.

Via Azure Portal:

  1. Go to Function App → Configuration
  2. Add/update setting: LOG_LEVEL = DEBUG
  3. Save and restart
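The same change can be made from the CLI (a sketch with placeholder names):

```shell
APP="<function-app-name>"   # placeholders -- replace before running
RG="<rg-name>"
SETTING="LOG_LEVEL=DEBUG"

az functionapp config appsettings set \
  --name "$APP" --resource-group "$RG" --settings "$SETTING" \
  && az functionapp restart --name "$APP" --resource-group "$RG" \
  || echo "az call failed (placeholders in use?)"
```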

Getting Help

If you’re unable to resolve an issue:

  1. Check Application Insights logs for detailed error messages
  2. Enable DEBUG logging and reproduce the issue
  3. Contact Haltian support with the error details and your deployment configuration