Friday, May 1, 2026

Mastering Job Queues: A Refresher on Configuration and Orchestration

If you are already working with Business Central, you are likely familiar with Job Queues. They are the backbone of automation, running scheduled tasks in the background. But while many developers and consultants know how to create a basic Job Queue entry, there are critical configuration fields and advanced patterns that often go overlooked.

This blog serves as a refresher on essential settings like Job Queue Category, Priority, and Parameter String, while discussing error handling techniques and orchestration strategies that can elevate your automation from simple scheduled tasks to intelligent, self-healing workflows.

1. Essential Job Queue Configurations

Before you build complex workflows, master these settings to keep your server stable and performant.

Job Queue Category

Think of the Job Queue Category as a dedicated load balancing lane.

When you have heavy, resource-intensive tasks, such as high-volume General Ledger postings or complex inventory recalculations, grouping them into a single, defined category provides two major benefits:

  • Deadlock prevention: You prevent multiple heavy processes from overwhelming the server at the same time, which significantly reduces the risk of database deadlocks and system timeouts.
  • Resource Management: Categorization allows you to effectively partition your workload. This ensures that high-impact background tasks do not starve the rest of your system of the essential resources needed to keep the user interface responsive.
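As a sketch (the category code 'POSTING' and the codeunit name below are placeholders), a heavy task can be pinned to a dedicated category when it is scheduled from AL:

```al
procedure ScheduleHeavyPosting()
var
    JobQueueEntry: Record "Job Queue Entry";
begin
    JobQueueEntry.Init();
    JobQueueEntry.Validate("Object Type to Run", JobQueueEntry."Object Type to Run"::Codeunit);
    JobQueueEntry.Validate("Object ID to Run", Codeunit::"GL Batch Posting"); // placeholder codeunit
    JobQueueEntry.Validate("Job Queue Category Code", 'POSTING'); // category must exist in the Job Queue Category List
    Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
end;
```

Because entries in the same category are processed one at a time, this is what stops two heavy posting runs from colliding.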


Priority

The Priority field determines the execution order of jobs within a specific category. You can assign a value of Low, Medium, or High to reflect the urgency of the task.

When multiple jobs are ready to run simultaneously, the Job Queue Processor prioritizes those marked as High before processing Medium or Low tasks. This ensures that mission-critical operations, such as automated bank statement imports or essential system integrations, are executed ahead of non-urgent background processes, maintaining consistent performance for your most vital workflows.

Parameter String

The Parameter String allows you to pass data into your code without hardcoding values, making your extensions flexible and environment-aware.

Instead of embedding static values such as API URLs, folder paths, or specific file names directly into your AL source code, you store them in the Parameter String. This enables you to reuse the same code unit across different environments (e.g., Sandbox and Production) by simply updating the parameter value.
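For illustration (the codeunit number, name, and endpoint below are made up), a job codeunit reads the value from the Job Queue Entry record passed into its OnRun trigger:

```al
codeunit 50100 "Customer Sync Job"
{
    TableNo = "Job Queue Entry";

    trigger OnRun()
    var
        ApiUrl: Text;
    begin
        // Rec is the Job Queue Entry that triggered this run;
        // the endpoint comes from configuration, not from code
        ApiUrl := Rec."Parameter String";
        // ... call the external service using ApiUrl ...
    end;
}
```

The same codeunit can then run against Sandbox or Production just by editing the Parameter String on the Job Queue Entry card.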



Maximum No. of Attempts to Run

This setting serves as your primary defense against transient failures. Because Business Central often interacts with external web services or APIs, momentary network instability can occasionally cause a task to fail.

By setting the Maximum No. of Attempts, you instruct the system to automatically retry a failed task a specified number of times. This acts as a self-healing mechanism, allowing the system to resolve minor, self-correcting issues without human intervention. The task will only be marked with a status of Error once all configured retry attempts have been exhausted.

2. Robust Error Handling

When code fails, you don't want the whole queue to stop. To manage background tasks effectively, you can implement one of the following error-handling strategies. Each offers a different level of control over the Job Queue's execution status:

The [TryFunction] Pattern

Perfect for risky tasks, such as calling an external web service where failure is a real possibility. By marking a procedure as a TryFunction, the system catches any errors that occur within that logic and returns a boolean value instead of crashing the entire Job Queue entry.
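A minimal sketch of the pattern (LogError is a hypothetical helper you would implement yourself, e.g. writing to a custom log table):

```al
[TryFunction]
local procedure TryCallService(Url: Text)
var
    Client: HttpClient;
    Response: HttpResponseMessage;
begin
    // Any runtime error raised in here is caught by the caller
    if not Client.Get(Url, Response) then
        Error('Request to %1 failed.', Url);
end;

local procedure CallServiceSafely(Url: Text)
begin
    if not TryCallService(Url) then
        LogError(GetLastErrorText()); // hypothetical helper: log and continue
end;
```

Keep in mind that a TryFunction catches the error but does not roll back database writes made before the failure; when you need rollback, use the wrapper pattern instead.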



The Wrapper Pattern (Codeunit.Run)

Use this when you need to ensure that a failed database transaction doesn't corrupt your data or stop your background process: wrap your logic inside a Codeunit.Run call. If the code inside the inner codeunit crashes, the database rolls back only that specific transaction, keeping the main Job Queue entry alive. This pattern effectively isolates failures, allowing you to log the error, skip the problematic record, and move on to the next item in your queue.
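A sketch of the loop (the staging table "Import Line" and worker codeunit "Process Import Line" are placeholders; the worker's TableNo property must match the record passed to Run):

```al
procedure ProcessPendingLines()
var
    ImportLine: Record "Import Line"; // placeholder staging table
    ProcessImportLine: Codeunit "Process Import Line"; // placeholder worker with TableNo = "Import Line"
begin
    if ImportLine.FindSet(true) then
        repeat
            Commit(); // close the open transaction; Codeunit.Run is not allowed inside one
            if not ProcessImportLine.Run(ImportLine) then begin
                // The worker's writes were rolled back; record the error and move on
                ImportLine."Error Message" := CopyStr(GetLastErrorText(), 1, MaxStrLen(ImportLine."Error Message"));
                ImportLine.Modify();
            end;
        until ImportLine.Next() = 0;
end;
```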


3. Orchestration: Building a Better Pipeline

Orchestration in Job Queues is about coordinating separate jobs so they behave like a connected workflow rather than isolated tasks. In Microsoft Dynamics 365 Business Central, this is typically achieved by controlling data readiness and processing conditions. Each step updates the state of the data it works on, and subsequent jobs are designed to process only what is ready. This ensures that one step completes fully before the next begins, creating a reliable and traceable flow without requiring direct links between jobs.

Orchestration Strategies

Linear Pipeline: Each job performs one stage of the lifecycle. By updating a status field in a central table, you create a chain reaction where the completion of one task acts as the "green light" for the next. Below is an illustration:




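The status-driven chain can be sketched as follows (the table "Import Staging" and its Status option values are assumptions for illustration):

```al
// Stage 2 of the pipeline: picks up only what stage 1 marked as Imported
procedure ValidateImportedRecords()
var
    ImportStaging: Record "Import Staging"; // placeholder table with a Status field
begin
    ImportStaging.SetRange(Status, ImportStaging.Status::Imported);
    if ImportStaging.FindSet(true) then
        repeat
            // ... apply business-rule checks here ...
            ImportStaging.Status := ImportStaging.Status::Validated; // green light for stage 3
            ImportStaging.Modify();
        until ImportStaging.Next() = 0;
end;
```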
Event-Driven: Instead of relying on a timer that polls every 5 minutes, use an event-driven approach. By triggering the next step programmatically the millisecond the previous one finishes, you eliminate "dead time" between tasks.
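One way to sketch this (the "Validate Stage" codeunit is a placeholder) is to enqueue the next stage the moment the current one finishes:

```al
local procedure OnImportCompleted()
var
    JobQueueEntry: Record "Job Queue Entry";
begin
    // Enqueue the next stage immediately instead of waiting for a polling timer
    JobQueueEntry.Init();
    JobQueueEntry.Validate("Object Type to Run", JobQueueEntry."Object Type to Run"::Codeunit);
    JobQueueEntry.Validate("Object ID to Run", Codeunit::"Validate Stage"); // placeholder
    JobQueueEntry."Earliest Start Date/Time" := CurrentDateTime();
    Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
end;
```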

Batch or Parallel: When dealing with thousands of records, you can use a Dispatcher to break your data into smaller, manageable chunks. By scheduling multiple instances of the same worker code unit to run at the same time, you can process data in parallel, significantly increasing throughput.
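A dispatcher sketch (the "Chunk Worker" codeunit is a placeholder; each worker would read its chunk number from the Parameter String and filter its own slice of the data):

```al
procedure DispatchWorkers(TotalChunks: Integer)
var
    JobQueueEntry: Record "Job Queue Entry";
    ChunkNo: Integer;
begin
    for ChunkNo := 1 to TotalChunks do begin
        Clear(JobQueueEntry);
        JobQueueEntry.Init();
        JobQueueEntry.Validate("Object Type to Run", JobQueueEntry."Object Type to Run"::Codeunit);
        JobQueueEntry.Validate("Object ID to Run", Codeunit::"Chunk Worker"); // placeholder worker
        JobQueueEntry."Parameter String" := Format(ChunkNo); // worker filters its own slice
        Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
    end;
end;
```

For true parallelism, leave the Job Queue Category Code empty or spread the workers across categories, since entries sharing a category run sequentially.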

By decoupling your integration processes into distinct, sequential stages, such as Ingest, Validate, and Commit, you transition from fragile, single-run integrations to a resilient, enterprise-grade architecture.

This approach ensures that external data is verified before updating your core business records, provides a clear audit trail for every transaction, and allows you to safely reprocess specific failures without the risk of data duplication. Ultimately, orchestration shifts your integration strategy from reactive troubleshooting to a controlled, predictable data lifecycle.

Practical Scenario:

This diagram illustrates a 4-stage sequential job queue pipeline designed for automated data integration in Business Central.

The process begins when a scheduled trigger activates Job 1 (Import), which fetches data from an external source with high priority and automatic retry capability. Once imported, the data flows sequentially through Job 2 (Validate) to verify business rules, then Job 3 (Transform) to apply formatting and calculations, and finally Job 4 (Post) to create the actual records in Business Central.

Jobs with external dependencies (Import and Post) are configured with 3 automatic retry attempts to handle temporary network issues, while validation and transformation jobs fail immediately to the error log when issues are detected. 

All failed jobs that exhaust their retry attempts trigger an administrator alert, ensuring no data loss goes unnoticed. The colored flow highlights the critical path in green through purple, with error handling branches in orange and red, demonstrating how the system isolates failures while maintaining overall pipeline integrity.

Conclusion:

In this blog, we covered the essential Job Queue configurations including Job Queue Category, Priority, Parameter String, and Maximum No. of Attempts to Run, which form the foundation of reliable background processing. We explored robust error handling strategies using TryFunction and Wrapper patterns to isolate failures and keep your automation resilient. 

Finally, we examined orchestration strategies (Linear Pipeline, Event-Driven, and Batch/Parallel) that coordinate separate jobs into cohesive workflows, transforming isolated tasks into intelligent, self-healing automation pipelines.


Friday, April 24, 2026

Connecting Azure Blob Storage to Business Central

Setting Up External File Storage in Business Central - No Code Required (Part 2: Connecting Azure Blob Storage to Business Central)

With Business Central version 28, Microsoft introduced a powerful capability to store attachments and files in external storage systems using a fully out-of-the-box, no-code approach.

One of the key limitations of Business Central is that storing large volumes of files directly in the database increases size, impacts performance, and raises cost. External file storage solves this by offloading files to scalable cloud storage like Azure Blob Storage, while still keeping everything fully integrated and configuration-based, with no development required.

In the previous part, we created an Azure Storage Account, Blob Container, and configured a Shared Access Signature (SAS). In this part, we will connect that storage to Business Central and complete the setup using standard configuration only.

Prerequisites

Before starting, ensure:

  • An Azure Storage Account is created
  • A Blob Container is available
  • The container is accessible using a Shared Access Signature (SAS)

Step 1: Open External File Accounts in Business Central

In Business Central:

  • Search for External File Accounts
  • Click on Add File Account

This opens the assisted setup wizard.

Click Next.



Step 2: Select Storage Type

  • Choose Blob Storage
  • Click Next


Step 3: Configure Storage Details

Fill in the required fields:

  • Account Name: A logical name for identification
  • Azure Storage Account Name: Your Azure storage account name
  • Authorization Type: Select Shared Access Signature (SAS)
  • Secret: Paste the SAS token from Azure
  • Container Name: Enter your blob container name

Click Next.



Step 4: Confirmation

If all details are correct, you will see a congratulations message confirming the connection.



At this point, Business Central is now linked to your Azure Blob Container as an external file account.

Step 5: Assign File Scenario

Now we define where this storage will be used:

  • Search for Assign File Scenarios
  • Open the page
  • Select the configured file account
  • Click Assign Scenarios

Choose:

  • Documents Attachments - External Storage

Confirm the selection.



Warning Note

When enabling external file storage, Business Central displays a warning that:

  • Files will no longer be stored in the database
  • Azure now becomes responsible for file storage and backups
  • Retention and governance must be managed externally

This is an expected behavior when moving to external storage.



Step 6: Enable Scenario

  • After you continue, the External Storage Setup page will open.
  • Click Enabled
  • Optionally select a Root Folder (or let the system create one automatically)

Now attachments will be stored in Azure Blob Storage instead of the Business Central database.

Step 7: Test the Setup

To verify:

  • Open a document such as a Purchase Order
  • Add an attachment
  • Upload the file (in my case, I uploaded Cronus.jpg)


Then go to Azure Portal:

  • Open your Blob Container
  • You will see the uploaded file stored there


Cost Benefit Analysis

To understand the real-world advantage of this setup, consider the following scenario:

  • 1 TB of file storage
  • Around 5,000 file operations per day

Azure Blob Storage

  • Storage cost: approx. $18.43/month
  • Operations cost: approx. $0.04/month

Total: approx. $18 to $20 per month

Business Central Database Storage

Business Central database storage is significantly more expensive because it is tied to SQL capacity pricing.

  • Approximate cost: $40 to $60 per GB per year
  • For 1 TB (about 1,000 GB): roughly $40,000 to $60,000 per year, i.e.

    Approx. $4,000+ per month

Conclusion

With the setup completed across both parts, we now have a fully working no-code, out-of-the-box external file storage solution in Business Central using Azure Blob Storage.

This approach allows Business Central to store attachments outside the database, improving performance, reducing storage growth, and enabling better scalability without any custom development.

At the same time, Azure Blob Storage provides a highly cost-effective solution, even at large scale, making it a practical and modern alternative to traditional database-based file storage.

Together, this setup forms a clean, scalable, and cost-efficient architecture for managing files in Business Central.

Creating Azure Storage Account and Blob Container

Setting Up External File Storage in Business Central - No Code Required (Part 1: Creating Azure Storage Account and Blob Container)

This blog is part of a series where we’ll explore how to use External File Accounts in Microsoft Dynamics 365 Business Central to store files outside the database using standard, out-of-the-box features and without writing any code.

One of the first requirements for this setup is to have an Azure Storage Account and a Blob Container ready. In this post, we’ll walk through how to create them step by step.

Prerequisites

Make sure you have an active Azure subscription before getting started.

Step 1: Sign in to Azure Portal

Log in to the Azure Portal and search for Storage accounts using the search bar.

Step 2: Create a Storage Account

Click on + Create to begin.



Under the Basics tab, provide the following details:

  • Subscription: Select your Azure subscription
  • Resource Group:  Choose an existing one or create a new one
  • Storage Account Name: Enter a globally unique name
  • Region:  Select the region closest to your users or services
  • Performance:  Choose between Standard (common) or Premium
  • Redundancy: Select based on your needs:
    • LRS (Locally Redundant Storage): cost-effective (used here)
    • ZRS (Zone-Redundant Storage): higher availability
    • GRS / RA-GRS / GZRS: for geo-redundancy and maximum durability




You can explore additional tabs like Networking, Data protection, and Encryption if you want more control over security and access. For now, you can proceed with default settings.

Click Review + Create, verify the details, and then click Create.

Step 3: Deployment Completion

Once the deployment is complete, you’ll see a confirmation message. Click Go to Resource to open your storage account.



Step 4: Create a Container

Inside the storage account:

  • In the left pane, under Data storage, click Containers
  • Click + Add Container
  • Enter a meaningful Name
  • Click Create





Step 5: Verify Container

Your new container will appear in the list and is now ready to store files.

Step 6: Generate a Shared Access Signature (SAS) for the Container

To allow Business Central to access the blob container securely, you need to generate a Shared Access Signature (SAS).

Azure provides multiple access methods such as:

  • Azure Active Directory (Entra ID)
  • Role-Based Access Control (RBAC)
  • Access Keys
  • Shared Access Signatures (SAS)

In this setup, we will use a container-level SAS, which gives controlled and time-limited access to a specific container. Account-level access options are also available, but container-level access is more granular.

Generate SAS from the Container

  • Open your Storage Account in the Azure Portal
  • Go to Data storage then Containers
  • Select your newly created container
  • In the left pane, under Settings, click on Shared access tokens



Configure Access

Set the required options:

  • Permissions: Select based on your needs, commonly Read, Write, and Delete
  • Start and expiry date/time: Define how long the access should be valid
  • Allowed protocols: Select HTTPS only, as recommended


Generate token and Save

Click on Generate SAS token and URL.

You will see:

  • Blob SAS token
  • Blob SAS URL

Store these values securely (for example, in a key vault or another secure location). These will be used later when connecting the container to Business Central.

Conclusion

You have successfully created an Azure Storage Account and a Blob Container, an essential first step toward using external file storage with Business Central.

In the next part, we’ll look at how to connect this container to Business Central and start storing attachments outside the database using standard features, with no custom development required.

Sunday, March 12, 2023

New functionality: Control Database Isolation level on individual reads on a record instance

 

Hi,

We are familiar with the Record.LockTable property that we use to explicitly lock the table against write transactions that can conflict.

By default, Business Central automatically determines the isolation level used when querying the database. AL developers can now explicitly control the database isolation level on individual reads on a record instance.

With Business Central v.22, a new ReadIsolation property has been introduced on the record data type. Its syntax is:

Rec.ReadIsolation := IsolationLevel::<enum value>

IsolationLevel can have following values:

Default, ReadCommitted, ReadUncommitted, RepeatableRead, and UpdLock.

Example:

To understand the concept, we use two simple procedures. The first, ModifyName, gets customer 10000 and modifies the Name; the second, ShowName, gets the same customer and shows the Name. Add these procedures as actions on any page; I added them on the Customer Card using a page extension. We will call them concurrently, and one after the other, to cover different scenarios in different sessions. The Sleep function is used to delay the commit.

Procedure 1:

procedure ModifyName()
    var
        Customer: Record Customer;
    begin
        Customer.Get('10000');
        Customer.Name := 'Modified Name';
        Customer.Modify();
        Sleep(45000); // delay the end of the transaction so the change stays uncommitted
    end;

Procedure 2:

procedure ShowName()
    var
        Customer: Record Customer;
    begin
        Customer.ReadIsolation := IsolationLevel::Default; // change this value for each scenario below
        Customer.Get('10000');
        Message(Customer.Name);
    end;

       

We will go through the Isolation Levels one by one.

Default: 

It follows the table isolation level for reads; same behavior as not setting Isolation Level.

Open Business Central and run the ModifyName procedure, which puts the system to sleep for 45 seconds after Customer.Modify(). So the record is modified but not yet committed.

Open another Business Central session in an incognito window and run the ShowName procedure. You will see that ShowName shows the modified name even though it has not yet been committed to the database. This is an example of a dirty read.

Result: If Default is used, the system can read uncommitted data.


ReadCommitted: 

Allows reading only committed data.

Change the IsolationLevel to ReadCommitted in the code. Open Business Central and change the name of customer 10000 back to the original name. Run the ModifyName procedure, which puts the system to sleep for 45 seconds after Customer.Modify(), so the record is modified but not yet committed.

Open another Business Central session in an incognito window and run the ShowName procedure. After a few seconds, you will get the error "We cannot save your changes…".



Result: If ReadCommitted is used, the read operation can be performed only on a fully committed record; if the record is in the middle of a transaction, an error is thrown as above.


ReadUncommitted: 

Allows the record to read data that has been modified by other transactions but not yet committed (also called dirty reads).

Its behaviour is the same as we saw with Default. You can run the example in the same way.


RepeatableRead: 

Reads only committed data (as with ReadCommitted), but it also locks the record until the current transaction is completed, meaning no write transaction can be performed on that record.

To understand it a bit more, make a small change to Procedure 2: replace Message(Customer.Name) with Customer.Modify();. Change the IsolationLevel to RepeatableRead.

Open Business Central, change the customer name back to the original name, and refresh.

Run procedure 1 (ModifyName), which modifies the name and then sleeps.

Open Business Central in another incognito window and run procedure 2. After a few seconds, you will get the error "We cannot save your changes…".



Result: RepeatableRead reads only committed data and holds a lock until the transaction is completed.


UpdLock: 

Ensures that reads stay consistent for the life of the current transaction. Until the transaction completes, the record cannot read uncommitted data, and other transactions with the same isolation level cannot read data that was read by this record.

Same behaviour as RepeatableRead, but additionally it does not allow other transactions with the same isolation level to read the data it has read. So it reads only committed data, locks the record against write transactions, and blocks reads from transactions with the same isolation level.
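A typical use is a read-then-update on a shared counter, where two sessions must not read the same value (the "Sequence Setup" table and "Last No." field below are placeholders for illustration):

```al
procedure GetNextSequenceNo(): Integer
var
    SequenceSetup: Record "Sequence Setup"; // placeholder single-row setup table
begin
    // UpdLock: read and lock in one step, so a concurrent session
    // cannot read the same "Last No." before we update it
    SequenceSetup.ReadIsolation := IsolationLevel::UpdLock;
    SequenceSetup.Get();
    SequenceSetup."Last No." += 1;
    SequenceSetup.Modify();
    exit(SequenceSetup."Last No.");
end;
```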


For any feedback or if you find any mistakes/issues feel free to send email at raibilalhaider@yahoo.com


Wednesday, November 25, 2020

Setting up Item's price based on Unit of Measure in NAV

 Question:

What if we have an item that we sell in two different units of measure? What do we set up so that the system automatically picks the unit price that matches the unit of measure entered on the sales line?

Pre-Requisite:

The required units must be present in the Unit of Measure table and assigned to the item as Item Units of Measure.



Steps:

1. Go to the Sales Price Worksheet and enter the item on two different lines, one for each Unit of Measure, along with the corresponding Unit Price.

2. Select the action Implement Price Change



3. To verify, create a Sales Order and add two lines for the same item but with different units of measure. Each line picks up the unit price that matches its unit of measure.