Friday, April 24, 2026

Connecting Azure Blob Storage to Business Central

Setting Up External File Storage in Business Central - No Code Required (Part 2: Connecting Azure Blob Storage to Business Central)

With Business Central version 28, Microsoft introduced a powerful capability to store attachments and files in external storage systems using a fully out-of-the-box, no-code approach.

One of the key limitations of Business Central is that storing large volumes of files directly in the database increases database size, impacts performance, and raises cost. External file storage solves this by offloading files to scalable cloud storage such as Azure Blob Storage, while keeping everything fully integrated and configuration-based, with no development required.

In the previous part, we created an Azure Storage Account, Blob Container, and configured a Shared Access Signature (SAS). In this part, we will connect that storage to Business Central and complete the setup using standard configuration only.

Prerequisites

Before starting, ensure:

  • An Azure Storage Account is created
  • A Blob Container is available
  • The container is accessible using a Shared Access Signature (SAS)

Step 1: Open External File Accounts in Business Central

In Business Central:

  • Search for External File Accounts
  • Click on Add File Account

This opens the assisted setup wizard.

Click Next.



Step 2: Select Storage Type

  • Choose Blob Storage
  • Click Next


Step 3: Configure Storage Details

Fill in the required fields:

  • Account Name: A logical name for identification
  • Azure Storage Account Name: Your Azure storage account name
  • Authorization Type: Select Shared Access Signature (SAS)
  • Secret: Paste the SAS token from Azure
  • Container Name: Enter your blob container name

Click Next.



Step 4: Confirmation

If all details are correct, you will see a congratulations message confirming the connection.



At this point, Business Central is linked to your Azure Blob Container as an external file account.

Step 5: Assign File Scenario

Now we define where this storage will be used:

  • Search for Assign File Scenarios
  • Open the page
  • Select the configured file account
  • Click Assign Scenarios

Choose:

  • Documents Attachments - External Storage

Confirm the selection.



Warning Note

When enabling external file storage, Business Central displays a warning that:

  • Files will no longer be stored in the database
  • Azure now becomes responsible for file storage and backups
  • Retention and governance must be managed externally

This is an expected behavior when moving to external storage.



Step 6: Enable Scenario

  • After you continue, the External Storage Setup page opens
  • Click Enabled
  • Optionally select a Root Folder (or let the system create one automatically)

Now attachments will be stored in Azure Blob Storage instead of the Business Central database.

Step 7: Test the Setup

To verify:

  • Open a document such as a Purchase Order
  • Add an attachment
  • Upload a file; in my case I uploaded Cronus.jpg


Then go to Azure Portal:

  • Open your Blob Container
  • You will see the uploaded file stored there


Cost Benefit Analysis

To understand the real-world advantage of this setup, consider the following scenario:

  • 1 TB of file storage
  • Around 5,000 file operations per day

Azure Blob Storage

  • Storage cost: approx. $18.43/month
  • Operations cost: approx. $0.04/month

Total: approx. $18 to $20 per month

Business Central Database Storage

Business Central database storage is significantly more expensive because it is tied to SQL capacity pricing.

  • Approximate cost: $40 to $60 per GB per year
  • For 1 TB: approx. $4,000+ per month
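The totals follow from straightforward arithmetic (the per-GB unit prices below are approximate list prices and vary by region, access tier, and agreement):

```
Azure Blob (Hot, LRS):   1,024 GB × $0.018/GB/month ≈ $18.43/month
BC database capacity:    1,024 GB × $50/GB/year ÷ 12 ≈ $4,267/month
```

Even with conservative assumptions, that is a difference of more than two orders of magnitude for pure file storage.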

Conclusion

With the setup completed across both parts, we now have a fully working no-code, out-of-the-box external file storage solution in Business Central using Azure Blob Storage.

This approach allows Business Central to store attachments outside the database, improving performance, reducing storage growth, and enabling better scalability without any custom development.

At the same time, Azure Blob Storage provides a highly cost-effective solution, even at large scale, making it a practical and modern alternative to traditional database-based file storage.

Together, this setup forms a clean, scalable, and cost-efficient architecture for managing files in Business Central.

Creating Azure Storage Account and Blob Container

Setting Up External File Storage in Business Central - No Code Required (Part 1: Creating Azure Storage Account and Blob Container)

This blog is part of a series where we’ll explore how to use External File Accounts in Microsoft Dynamics 365 Business Central to store files outside the database using standard, out-of-the-box features and without writing any code.

One of the first requirements for this setup is to have an Azure Storage Account and a Blob Container ready. In this post, we’ll walk through how to create them step by step.

Prerequisites

Make sure you have an active Azure subscription before getting started.

Step 1: Sign in to Azure Portal

Log in to the Azure Portal and search for Storage accounts using the search bar.

Step 2: Create a Storage Account

Click on + Create to begin.



Under the Basics tab, provide the following details:

  • Subscription: Select your Azure subscription
  • Resource Group: Choose an existing one or create a new one
  • Storage Account Name: Enter a globally unique name
  • Region: Select the region closest to your users or services
  • Performance: Choose between Standard (common) or Premium
  • Redundancy: Select based on your needs:
    • LRS (Locally Redundant Storage): cost-effective (used here)
    • ZRS (Zone-Redundant Storage): higher availability
    • GRS / RA-GRS / GZRS: for geo-redundancy and maximum durability




You can explore additional tabs like Networking, Data protection, and Encryption if you want more control over security and access. For now, you can proceed with default settings.

Click Review + Create, verify the details, and then click Create.

Step 3: Deployment Completion

Once the deployment is complete, you’ll see a confirmation message. Click Go to Resource to open your storage account.



Step 4: Create a Container

Inside the storage account:

  • In the left pane, under Data storage, click Containers
  • Click + Add Container
  • Enter a meaningful Name
  • Click Create





Step 5: Verify Container

Your new container will appear in the list and is now ready to store files.

Step 6: Generate a Shared Access Signature (SAS) for the Container

To allow Business Central to access the blob container securely, you need to generate a Shared Access Signature (SAS).

Azure provides multiple access methods such as:

  • Azure Active Directory (Entra ID)
  • Role-Based Access Control (RBAC)
  • Access Keys
  • Shared Access Signatures (SAS)

In this setup, we will use a container-level SAS, which gives controlled and time-limited access to a specific container. Account-level SAS options are available as well, but container-level access is more granular.

Generate SAS from the Container

  • Open your Storage Account in the Azure Portal
  • Go to Data storage, then Containers
  • Select your newly created container
  • In the left pane, under Settings, click on Shared access tokens



Configure Access

Set the required options:

  • Permissions: Select based on your needs (commonly Read, Write, and Delete)
  • Start and expiry date/time: Define how long the access should be valid
  • Allowed protocols: Select HTTPS only, as recommended


Generate token and Save

Click on Generate SAS token and URL.

You will see:

  • Blob SAS token
  • Blob SAS URL

Store these values securely (for example, in a key vault or another secure location). These will be used later when connecting the container to Business Central.
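For reference, a container-level Blob SAS URL has the following general shape (all values below are placeholders, not real credentials):

```
https://<storageaccount>.blob.core.windows.net/<container>?sp=rwd&st=<start-time>&se=<expiry-time>&spr=https&sv=<service-version>&sr=c&sig=<signature>
```

Everything after the `?` is the SAS token itself: `sp` reflects the permissions (read/write/delete), `st` and `se` the start and expiry times, `spr=https` the allowed protocol, and `sr=c` marks it as a container-level signature.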

Conclusion

You have successfully created an Azure Storage Account and a Blob Container, an essential first step toward using external file storage with Business Central.

In the next part, we’ll look at how to connect this container to Business Central and start storing attachments outside the database using standard features, with no custom development required.

Sunday, March 12, 2023

New functionality: Control Database Isolation level on individual reads on a record instance

 

Hi,

We are familiar with the Record.LockTable method, which we use to explicitly lock a table against conflicting write transactions.

By default, Business Central automatically determines the isolation level used when querying the database. AL developers can now explicitly control the database isolation level on individual reads on a record instance.

With Business Central v22, a new ReadIsolation property has been introduced on the record data type. Its syntax is:

Rec.ReadIsolation := IsolationLevel::<enum value>;

IsolationLevel can have the following values: Default, ReadCommitted, ReadUncommitted, RepeatableRead, and UpdLock.

Example:

To understand the concept, we use two simple procedures. One procedure (ModifyName) gets customer 10000 and modifies its Name, while the other (ShowName) gets the same customer and shows the Name. Add these procedures as actions on any page; I added them to the Customer Card using a page extension. We will call them in different sessions, both concurrently and one after the other, to cover the different scenarios. The Sleep call is used to delay the commit.

Procedure 1:

procedure ModifyName()
    var
        Customer: Record Customer;
    begin
        Customer.Get('10000');
        Customer.Name := 'Modified Name';
        Customer.Modify();
        Sleep(45000); // keep the transaction open (uncommitted) for 45 seconds
    end;

Procedure 2:

procedure ShowName()
    var
        Customer: Record Customer;
    begin
        // change this isolation level for each scenario below
        Customer.ReadIsolation := IsolationLevel::Default;
        Customer.Get('10000');
        Message(Customer.Name);
    end;
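As mentioned, I exposed these two procedures as actions on the Customer Card through a page extension. A minimal sketch follows; the object number, names, and captions are my own choices, and the procedure names follow the prose of this post (ModifyName and ShowName):

```
pageextension 50100 "Customer Card Ext" extends "Customer Card"
{
    actions
    {
        addlast(processing)
        {
            action(ModifyNameAction)
            {
                Caption = 'Modify Name';
                ApplicationArea = All;

                trigger OnAction()
                begin
                    ModifyName(); // the modify-and-sleep procedure from this post
                end;
            }
            action(ShowNameAction)
            {
                Caption = 'Show Name';
                ApplicationArea = All;

                trigger OnAction()
                begin
                    ShowName(); // the read-and-message procedure from this post
                end;
            }
        }
    }

    // Place the ModifyName and ShowName procedures shown above here.
}
```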

       

We will go through the Isolation Levels one by one.

Default: 

It follows the table's isolation level for reads; this is the same behavior as not setting ReadIsolation at all.

Open Business Central and run ModifyName, which puts the session to sleep for 45 seconds after Customer.Modify. The record is now modified but not yet committed.

Open another Business Central session in an incognito window and run the ShowName procedure. You will see that ShowName shows the modified name even though it has not yet been committed to the database. This is an example of a dirty read.

Result: With Default, the system can read uncommitted data.


ReadCommitted: 

Allows reading only committed data.

Change the isolation level to ReadCommitted in the code. Open Business Central and change customer 10000's name back to the original name. Run ModifyName, which puts the session to sleep for 45 seconds after Customer.Modify. The record is modified but not yet committed.

Open another Business Central session in an incognito window and run the ShowName procedure. After a few seconds you will get the error "We cannot save your changes…".

Result: With ReadCommitted, a read can be performed only on fully committed data; if the record is in the middle of another transaction, the error above is thrown.


ReadUncommitted: 

Allows the record to read data that has been modified by other transactions but not yet committed (also called a dirty read).

Its behavior is the same as we saw with Default; you can run the example the same way.


RepeatableRead: 

Reads only committed data (as with ReadCommitted), but it also locks the record until the current transaction completes, meaning no write transaction can be performed on that record in the meantime.

To see this, make a small change to Procedure 2: replace Message(Customer.Name) with Customer.Modify(); and change the ReadIsolation value to RepeatableRead.
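With those changes, Procedure 2 becomes:

```
procedure ShowName()
    var
        Customer: Record Customer;
    begin
        Customer.ReadIsolation := IsolationLevel::RepeatableRead;
        Customer.Get('10000');
        Customer.Modify(); // attempt a write against the record held by the other session
    end;
```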

Open Business Central, change the customer name back to the original name, and refresh.

Run procedure 1 (ModifyName), which modifies the name and sleeps.

Open Business Central in another incognito window and run procedure 2. After a few seconds you will get the error "We cannot save your changes…".



Result: RepeatableRead reads only committed data and holds a lock until the transaction is completed.


UpdLock: 

Ensures that reads stay consistent for the life of the current transaction. Until the transaction completes, the record cannot read uncommitted data, and other transactions using the same isolation level cannot read the data this record has read.

The behavior is therefore the same as RepeatableRead, with one addition: it also blocks reads of the same data by other transactions using the same isolation level. So it reads only committed data, locks the record against write transactions, and blocks same-isolation-level reads.
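As a sketch, the same two-session test can be run with UpdLock; while the first session sleeps, a second session running the same procedure waits on the read itself:

```
procedure ReadWithUpdLock()
    var
        Customer: Record Customer;
    begin
        Customer.ReadIsolation := IsolationLevel::UpdLock;
        Customer.Get('10000'); // takes an update lock on the row
        Sleep(45000);          // a concurrent UpdLock read of the same row now blocks
    end;
```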


For any feedback or if you find any mistakes/issues feel free to send email at raibilalhaider@yahoo.com


Wednesday, November 25, 2020

Setting up Item's price based on Unit of Measure in NAV

 Question:

What if we have an item that we sell in two different units of measure? What do we need to set up so that the system automatically picks the unit price for the unit of measure entered on the sales line?

Pre-Requisite:

Required units should present in our Unit of Measure table.



Steps:

1. Go to the Sales Price Worksheet and add the item on two different lines, one for each Unit of Measure, along with the corresponding Unit Price.

2. Select the action Implement Price Change



3. To verify, create a Sales Order and add two lines for the same item but with different units of measure. The Unit Price on each line should match the price set up for that unit of measure.