OPC data transfer via transfer object in 500 ms
Systems
Host
Operating system: Windows 11
RAM: 32 GB
Processor: Intel i7-8700
Virtual machines
OPC Router
Operating system: Windows Server 2022
RAM: dynamic, up to 16 GB
Virtual processors: 8
OPC Router: 5.3.5008.157 (inraySDK version: 3.32.6002.24)
Test server
Operating system: Windows Server 2025
RAM: 8 GB
Virtual processors: 6
OPC server: IoT Edge OPC UA PLC in Docker
Test setup
In each connection, a batch read transfer object or an OPC Data Access transfer object was used to read 10 data points. The data was sent to MQTT and also written to an InfluxDB database.
Separate tests were carried out with data change triggers, time triggers and cron triggers. Performance was determined by comparing the expected number of executions with the actual number of executions.
The deviations in the tables were rounded to the nearest five.
In this test, we deliberately pushed the OPC Router and the OPC server to and beyond their performance limits. Data rates of up to 20,000 tags per second were achieved.
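The evaluation described above can be sketched in plain Python (hypothetical helpers, not OPC Router code; the function names are our own): the expected execution count follows from the test window and the trigger interval, and deviations are rounded to the nearest five as in the tables below.

```python
# Hypothetical helpers mirroring the evaluation described above
# (not OPC Router code).

def expected_executions(window_s: float, interval_s: float) -> int:
    """Trigger firings expected in an ideal run over the test window."""
    return int(window_s / interval_s)

def deviation(expected: int, actual: int) -> int:
    """Executions missed against the expectation; 0 means the target was met."""
    return expected - actual

def round_to_nearest_five(x: int) -> int:
    """Round a deviation to the nearest multiple of five, as in the tables."""
    return 5 * round(x / 5)

# Example matching the cron tables: a 1-second trigger over 500 seconds.
exp = expected_executions(500, 1.0)               # 500 expected executions
dev = round_to_nearest_five(deviation(exp, 492))  # reported as 10
```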
Results when using the cron trigger
With the batch read transfer object
Connections | Data points | Trigger interval | Expected executions | Deviation | Plug-in count |
---|---|---|---|---|---|
2000 | 20000 | 1 second | 500 | 0 - 10 | 2 |
1000 | 10000 | 1 second | 500 | TBD | 1 |
With the OPC Data Access transfer object
Connections | Data points | Trigger interval | Expected executions | Deviation | Plug-in count |
---|---|---|---|---|---|
1000 | 10000 | 1 second | 500 | 0 - 5 | 1 |
Results when using the data change trigger
With the batch read transfer object
If the data change interval was shorter than 500 ms, the sampling rate in the plug-in was adjusted accordingly.
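One plausible reading of that adjustment, sketched in plain Python (the 500 ms default and the min rule are assumptions, not confirmed OPC Router behavior):

```python
def adjusted_sampling_ms(change_interval_ms: float, default_ms: float = 500.0) -> float:
    """Sample at least as fast as the data changes so no change is missed;
    otherwise keep the assumed 500 ms default."""
    return min(default_ms, change_interval_ms)

adjusted_sampling_ms(250.0)   # 250.0: faster changes force faster sampling
adjusted_sampling_ms(1000.0)  # 500.0: the default already keeps up
```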
Connections | Data points (total) | Data change interval | Expected executions | Deviation | Plug-in count |
---|---|---|---|---|---|
100 | 1000 | 500 ms | 500 | 0 | 1 |
250 | 2500 | 500 ms | 500 | 0 | 1 |
500 | 5000 | 500 ms | 500 | 0 - 20 | 1 |
1000 | 10000 | 500 ms | 500 | 55 - 80 | 1 |
50 | 500 | 250 ms | 500 | TBD | 1 |
100 | 1000 | 50 ms | 500 | TBD | 1 |
250 | 2500 | 50 ms | 500 | TBD | 1 |
500 | 5000 | 50 ms | 500 | TBD | 1 |
With the OPC Data Access transfer object
Connections | Data points read | Data change interval | Expected executions | Deviation | Plug-in count |
---|---|---|---|---|---|
100 | 1000 | 500 ms | 500 | 0 | 1 |
250 | 2500 | 500 ms | 500 | 5 - 30 | 1 |
500 | 5000 | 500 ms | 500 | 25 - 40 | 1 |
1000 | 10000 | 500 ms | 500 | 20 - 60 | 1 |
50 | 500 | 250 ms | 500 | TBD | 1 |
100 | 1000 | 50 ms | 500 | TBD | 1 |
250 | 2500 | 50 ms | 500 | TBD | 1 |
500 | 5000 | 50 ms | 500 | TBD | 1 |
Results when using the time trigger
Note that the time trigger interval is the time waited between executions. Because the execution time itself is not included in this calculation, the time trigger inherently introduces a difference: the expected number of executions can only be approximated, not reached exactly.
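The effect can be illustrated with a short sketch (plain Python, not OPC Router code; the 35 ms execution time is an assumed figure): because the wait interval starts only after each execution finishes, every cycle takes interval plus execution time, and fewer executions fit into the test window.

```python
def achievable_executions(window_s: float, interval_s: float, exec_s: float) -> int:
    """Executions that fit when the wait interval starts after each execution."""
    return int(window_s / (interval_s + exec_s))

window = 250.0                                      # 250 s test window
expected = int(window / 0.5)                        # 500, ignoring execution time
actual = achievable_executions(window, 0.5, 0.035)  # 467 with ~35 ms per execution
print(expected - actual)                            # deviation of 33
```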
With the batch read transfer object
Connections | Data points | Trigger interval | Expected executions | Deviation | Plug-in count |
---|---|---|---|---|---|
100 | 1000 | 500 ms | 500 | TBD | 1 |
250 | 2500 | 500 ms | 500 | 30 - 35 | 1 |
500 | 5000 | 500 ms | 500 | 30 - 35 | 1 |
1000 | 10000 | 500 ms | 500 | 50 - 60 | 1 |
50 | 500 | 250 ms | 500 | TBD | 1 |
100 | 1000 | 250 ms | 500 | TBD | 1 |
250 | 2500 | 250 ms | 500 | TBD | 1 |
500 | 5000 | 250 ms | 500 | TBD | 1 |
With the OPC Data Access transfer object
Connections | Data points | Trigger interval | Expected executions | Deviation | Plug-in count |
---|---|---|---|---|---|
100 | 1000 | 500 ms | 500 | 0 | 1 |
250 | 2500 | 500 ms | 500 | 30 - 35 | 1 |
500 | 5000 | 500 ms | 500 | 30 | 1 |
1000 | 10000 | 500 ms | 500 | 60 - 70 | 1 |
50 | 500 | 250 ms | 500 | TBD | 1 |
100 | 1000 | 250 ms | 500 | TBD | 1 |
250 | 2500 | 250 ms | 500 | TBD | 1 |
500 | 5000 | 250 ms | 500 | TBD | 1 |