Optimization for handling large record sets and a new limit of 20,000 records per update (v.23.2/v.24, breaking change)

As database sizes grow, so does the need to process large sets of records simultaneously.

This increasing demand calls for new approaches to data management and processing. Traditional methods are no longer sufficient, so we are developing advanced optimization techniques and introducing new processing limits to maintain efficiency and system performance.

By addressing these challenges, we aim to improve overall system functionality and prevent potential performance issues that could arise from processing large volumes of records simultaneously.

For this reason, we are implementing a series of changes, starting with (but not limited to) the following key areas:

  • Optimization of the validations and procedures during record-saving

This includes reviewing and rebuilding specific processes to address particular issues (e.g., the product update is optimized in version 23.2) and rewriting business rules for better efficiency and speed (various developments across versions 23.2, 24, and future versions).

  • A new type of generation procedure that can create multiple smaller documents when necessary

Working with large databases also introduces new challenges for document generation procedures, especially those whose business logic handles vast amounts of data.

For example, generating Accounting Vouchers from the Accounting Operations document can produce tens of thousands of rows, depending on the properties of the accounts. This not only affects document creation but also subsequent processing of these documents (opening, editing, saving).

Version 24.2 will include a new generation procedure that not only leverages modern system calculations for maximum speed, but can also break the creation into smaller parts: multiple documents with up to 10,000 records each. This number suits the business process in question, but other processes and future generations might require creating multiple documents with different row counts (e.g., 100 or 1,000).
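The splitting described above can be sketched as a simple chunking step. This is a hypothetical illustration, not the product's actual generation API; only the 10,000-row ceiling comes from the text above, and the function name and parameter are invented:

```python
def chunk_records(records, max_per_document=10_000):
    """Split a large record set into chunks, one per generated document.

    Hypothetical sketch: the real generation procedure is internal to the
    product; only the 10,000-row ceiling comes from the release note.
    """
    for start in range(0, len(records), max_per_document):
        yield records[start:start + max_per_document]


# Example: 25,000 generated rows become three documents
# (10,000 + 10,000 + 5,000 rows).
rows = list(range(25_000))
documents = list(chunk_records(rows))
print([len(d) for d in documents])  # [10000, 10000, 5000]
```

As the note says, other processes might pass a different ceiling (e.g., 100 or 1,000) instead of the default.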

  • New limits (breaking change: v.23.2, or v.24 if the client has not used v.23.2 prior to updating to v.24):

A new validation prohibits saving or deleting more than 20,000 records simultaneously (version 23.2 SP4).

Our investigations into system dropouts often reveal excessive usage, such as a real attempt to update 66,000 products at once. We understand the need and will continue to improve these processes, but we must also impose limits; otherwise, we leave the door wide open for anyone to cause performance issues at any moment.

After various tests, we determined that 20,000 records is a reasonable limit that is large enough to be useful but does not cause data processing problems or performance issues for the entire database and all its users.
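As a rough sketch, such a pre-save guard might look like the following. The function and exception names are illustrative assumptions, not the product's actual API; only the 20,000-record limit comes from the release note:

```python
MAX_RECORDS_PER_OPERATION = 20_000  # limit introduced in v.23.2 SP4


class RecordLimitExceededError(Exception):
    """Raised when a single save or delete touches too many records."""


def validate_record_count(record_count: int) -> None:
    """Hypothetical pre-save validation mirroring the documented limit."""
    if record_count > MAX_RECORDS_PER_OPERATION:
        raise RecordLimitExceededError(
            f"Cannot save or delete {record_count} records at once; "
            f"the limit is {MAX_RECORDS_PER_OPERATION}. "
            "Split the operation into smaller batches."
        )


validate_record_count(15_000)    # within the limit, passes silently
# validate_record_count(66_000)  # would raise RecordLimitExceededError
```

A client hitting the limit (as in the 66,000-product case above) would need to split the update into batches of at most 20,000 records.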

  • Ability to cancel large record saves through the system interface

We have added a new "Cancel" button to the progress bar of the save operation. This allows users to stop the operation if they detect an error or if it takes an unusually long time. They can stop the saving, make corrections, and restart the process, rather than waiting for it to complete.

The improved information window displays the current stage of the save process, since cancellation is possible only if the operation has not yet reached the "Commit database transaction" stage.
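The stage-gated rule can be sketched as follows. The stage names other than "Commit database transaction" are invented for illustration, as is the whole function; the only behavior taken from the text is that a cancellation request is honored before, but not after, the commit stage begins:

```python
import threading

COMMIT_STAGE = "Commit database transaction"
# Illustrative stage list; only the commit stage name appears in the text.
STAGES = ["Validate records", "Run business rules", COMMIT_STAGE]


class SaveCanceled(Exception):
    """Raised when the user cancels the save before the commit stage."""


def run_save(cancel_event: threading.Event, stages=STAGES):
    """Run the save stages, honoring cancellation only before commit."""
    completed = []
    for stage in stages:
        # Once the commit stage begins, a pending cancellation is ignored.
        if cancel_event.is_set() and stage != COMMIT_STAGE:
            raise SaveCanceled(f"Save canceled before stage: {stage!r}")
        completed.append(stage)  # placeholder for the real stage work
    return completed


# A cancel requested mid-save stops the operation before it commits:
cancel = threading.Event()
cancel.set()
try:
    run_save(cancel)
except SaveCanceled as exc:
    print(exc)  # Save canceled before stage: 'Validate records'
```

The key design point mirrored here is that cancellation never rolls back a commit in progress: it only prevents work from reaching the commit stage.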


This is just part of our strategy for handling large data sets and performance optimization for what we call "big databases."

We will continue to refine our strategy and methods based on real cases and client needs.

We will also continue to perform near-constant monitoring of our server processes and include new query optimizations in each version to enhance big data processing and overall system performance.

