
Performance in TradeDesign Software

The following text is not a complete tutorial on performance optimization, but it shows some of the issues that have consistently led to unnecessary use of resources in the past.

There are several areas in which resources can be wasted:

  • during the runtime of the program
  • during development
  • in the user guidance

Thus, the following questions have to be considered:

  • What causes unnecessarily long development periods?
  • What do programmers have to pay particular attention to, so that the users are guided quickly and easily through the transaction?
  • What do programmers have to pay particular attention to during development, so that program tasks are performed as fast as possible without wasting too much computer and network resources?

Runtime / Idle Time

The time a program needs before it can process user entries, after startup or after previous user entries, depends on various factors.

It is important which commands, and how many of them, have to be executed at all. Another factor is how long each command takes to be processed.

Quick commands that have to be executed very often can slow down the system just as much as individual commands that need a longer processing time.

Furthermore, the working environment plays an important role. For example, commands that are not noticeably slow in test environments with small amounts of data can be considerably slower in production environments with large data volumes. The same is true for different network environments.

Redundant Parts of Source

Parts of the source that are not required still use resources, as additional data has to be loaded, especially when starting transactions.

The following consume time and resources:

  • Instantiated panels
  • Additionally linked modules
  • Unused or redundant parts of source

Unused modules can be found via the Crossreference (GENXRF). As this task is time-consuming, it should be performed periodically after longer development phases.

If a module with many fields and/or Inits/Defaults is linked in multiple places only because an individual (minor) rule makes this necessary, it may be sensible to define this rule as 'static' instead and to link the module only once. Before such a conversion, however, the other dependencies should be borne in mind. The drawback is that the list of general rules becomes longer and processing at this point becomes minimally slower.

Which commands can potentially use a lot of resources?

The following items can cause an increased use of resources:

Grid
  Modifying the number of elements by using
  - GridInsertRow
  - GridDeleteRow
  - Clear of grids
  - DbReadSet
  causes a recalculation of the flow. This leads to a re-determination of the dependencies of all fields and rules in the whole transaction.

GetLine
  Internally, each call transfers the whole block into a stream and then accesses the selected line. This function becomes expensive if the block holds more than a few lines or if lines of the same field are accessed multiple times.

Loops
  If loops have a huge number of cycles, it is very important to recognize invariant parts and to move them out of the loop. Intermediate results can also be stored in a Register and reused multiple times afterwards, instead of being re-determined again and again (see the sketch after this table).

StreamSearch
  Searches that are repeatedly executed on huge streams that are not sorted with "keep-sorted" are expensive.

Accessing INI files
  - A single access does not use many resources.
  - The entries have to be searched case-insensitively; the search time depends on the size of the INI file.
  - Usually, the INI files always have to be reloaded from scratch.

Database accesses
  - Access to a record via its INR takes almost no search time.
  - "ToUpper" via "SqlFldUpper" prevents database indices from being used in most cases, so a full table scan is executed. With smaller tables this is acceptable, but many resources are used when bigger tables are processed.
  - The resource load depends on the data environment. In most cases these 'resource consumers' cannot be seen in development and test environments.

Directory search
  Depending on the size of the directory and the network used, directory searches can take very long.

Recalc
  All defaults that depend on the object are executed immediately (synchronously), not after the current Basic process has completed. If several Recalcs are executed in one rule, multiple identical defaultings might occur.
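
The Loops entry can be illustrated with a minimal sketch of hoisting invariant work out of a loop body. It is written in the style of the cache example below; the while/wend loop syntax and the helper GetBasePriceDemo are assumptions for illustration only, not verified TradeDesign commands:

# Illustrative sketch only; loop syntax and helper name are assumptions.
# The invariant base price is determined once, before the loop,
# instead of being re-determined in every cycle.
$BasPri = GetBasePriceDemo( $ConObj, $ConInr )   # hypothetical helper, executed once

$Sum = 0
$i = 1
while $i <= $Max                                 # assumed classic-Basic loop syntax
  # only the variant part of the work remains inside the loop
  $Sum = $Sum + $BasPri
  $i = $i + 1
wend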

How does the usage of Cache help?

When processing default rules, the results of complex or resource-intensive logic should be cached (and also in other cases where complex logic is evaluated multiple times and the base data has not been modified).

  • Reasonable and unique cache names should be used (typically including the name of the rule the cache is used in)
  • All data on which the logic depends must be included in the cache name (not forgetting the entity, for example)
  • For modules, it may be necessary to determine and include “ObjTyp” and “ObjInr” separately in the cache name
  • It is always good practice to include the entity in the cache name

Example solution for cache usage:

$Txt = ""

# Try to read a previously determined result from the cache;
# the cache name includes the rule name, the entity and all input data.
CacheRead( $Txt, "PTSPTA_RolTxt", \SYSMOD\ETY\EXTKEY, $ConObj, $ConInr, $Rol )

if Errorcode = tdCacheNotFound then
  # Cache miss: determine $Txt and store it for subsequent calls
  $Txt = PtsModGetSpcTxtXyzDemo( ArgCon, $Rol )

  CacheWrite( $Txt, "PTSPTA_RolTxt", \SYSMOD\ETY\EXTKEY, $ConObj, $ConInr, $Rol )

else

  # Pass on any other error raised by CacheRead
  reraise

endif

What effects do logics have in which area?

Init

  • Once only
  • But always on startup
  • Extends the startup time

Default

  • Frequently
  • Also on startup

Defaults are always executed when a field changes on which the field to be defaulted depends. Thus, defaults should contain as little logic and as few time-consuming commands as possible. For example, instead of performing a resource-consuming database access in the default, the selection can be done in the Init and the default can then use the cache, as sketched below.
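
A hedged outline of that split, reusing the CacheRead/CacheWrite pattern of the example above; the cache name "MYRULE_Lst" and the exact DbReadSet usage are illustrative assumptions, not actual sources:

# In the Init of the field (executed once at startup) - illustrative:
DbReadSet( $Lst, ... )                                 # expensive selection, done only once (assumed usage)
CacheWrite( $Lst, "MYRULE_Lst", \SYSMOD\ETY\EXTKEY )

# In the Default of the field (executed on every dependency change) - illustrative:
CacheRead( $Lst, "MYRULE_Lst", \SYSMOD\ETY\EXTKEY )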

Event

  • Only if Event is triggered
  • No dependency
  • The developer has to take care of the call

Self-Administering Logics - On Demand

  • Distributes the execution times
  • e.g. in the Enter event of the panel
  • e.g. in the event that raises a popup panel
  • e.g. a check at the beginning of the rule whether the data has already been determined, so that it is determined only when needed (see the sketch below)
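
A minimal sketch of that on-demand check, using only constructs shown in the cache example above; the flag variable $DetDone and the determination step are illustrative assumptions:

# Illustrative on-demand pattern: perform the expensive determination
# only the first time the data is actually needed.
if $DetDone = "" then
  # ... expensive determination of the data goes here ...
  $DetDone = "X"   # remember that the determination has already been performed
endif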

Development Times / Development Cycles

Double sources
  What slows down the system:
  - Double maintenance is required
  - The risk increases of forgetting individual places when changing the software
  Can be prevented by:
  - Defining functions that are used multiple times in an adequate place, so that they can be reused

Vague definition of variables
  What slows down the system:
  - Searching for errors, e.g. for Dump 160
  Can be prevented by:
  - Defining a variable in the header of the rule (i.e. always set) instead of inside IFs (i.e. only set sometimes)

Uncommented sources
  What slows down the system:
  - The source code must always be re-read and comprehended
  - The risk increases of 'destroying' sources through too many modifications, as the meaning of a particular place in the sources is not interpreted correctly
  Can be prevented by:
  - Describing the task and function of a rule briefly in the header of the rule; no repetition of the source texts
  - Inserting comments between the commands wherever the reason why something is done cannot easily be seen by reading the sources

Re-inventing something that already exists in the sources
  What slows down the system:
  - Development and testing have to be done again
  - Double sources are created
  Can be prevented by:
  - Searching whether such a function is already available within the system. This can be done via:
    - the 'Find' function in the Module Explorer
    - the Crossreference within the transaction
    - the Crossreference in all transactions

User Guidance

Field distribution on the panels
  What slows down the system:
  - Fields that belong together from a banking point of view are not grouped together
  - The sequence of accessing the fields does not comply with the banking requirements
  Can be prevented by:
  - A sensible arrangement on the panel
  - Correct integration into the tab folders

Labels
  What slows down the system:
  - Unclear descriptions
  Can be prevented by:
  - Correct banking descriptions

Helptexts
  What slows down the system:
  - Queries / entry errors due to unclear field descriptions
  Can be prevented by:
  - Defining helptexts for panel fields

Hints
  What slows down the system:
  - Queries / entry errors due to unclear descriptions
  Can be prevented by:
  - Entering hints



How can I find out what slows the System down?

Profiling: Trace and GENPRF, or the Profiling panel in the 'Watch' window.
Note: The output of profiling data slows the system down to varying degrees.

  • Shows which rules are processed and how many times
  • Allows searching for very expensive rules
  • Distinguishes between gross and net time
  • Helps to identify data dependencies / configuration dependencies


Frequent causes for slowing down the system:

  • Settlement handling / Account Defaulting via Defaults
  • Grid handling
  • Expensive SQLs
  • Multi-defaults
  • Database search accesses in Defaults (not via INR) without using the cache

There is a page describing how to handle performance complaints of customers and how to improve the overall performance: see Performance Analysis.


Performance Differences Fat-Client <-> Client-Server

  • String handling on servers is often not very efficient (depends on the C++ compiler)
  • Communication with the client is slower and takes time
  • CPU cycles are mostly slower and are not always available without restriction
  • INI files need to be read entirely on each access

Using powerful commands instead of coding own loops

As TD-Basic code is compiled to a p-code which is interpreted in the final context, it is beneficial to use powerful built-in commands instead of implementing own loops or logic.

For example, use tokenize to split a line into its elements instead of using multiple pos and mid commands, as sketched below.
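
A hedged illustration of the difference; the exact tokenize, pos and mid signatures are assumptions for illustration, not verified TradeDesign commands:

# Hand-coded splitting: every pos/mid call is a separately interpreted step
# (assumed signatures, repeated once per element):
#   $P   = pos( $Line, "," )
#   $Elm = mid( $Line, 1, $P - 1 )
#   ...

# One powerful command performs the whole split in a single step (assumed signature):
tokenize( $Elm, $Line, "," )   # target, source line, separator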
