Load testing has been attempted using k6 (from k6.io). Output is written to an InfluxDB database, and visualizations have been configured in Grafana. The tests can be viewed at https://dev.azure.com/TCOPP/_git/EGIS?path=%2FARM%2Floadtest&version=GBfeatures%2Floadtest%2Fk6, specifically egisloadtest.js. See the readme.md in that folder for details on how to run them.
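For readers unfamiliar with k6: a test is a JavaScript file executed by the k6 binary. The sketch below shows the general shape only; it is not the contents of egisloadtest.js, and the endpoint, load profile, and InfluxDB address are placeholders (the readme.md is authoritative for the real invocation).

```javascript
import http from 'k6/http';
import { check, sleep } from 'k6';

// Placeholder load profile; egisloadtest.js defines its own.
export const options = { vus: 10, duration: '1m' };

export default function () {
  // Placeholder endpoint; the real script targets the eGIS services.
  const res = http.get('https://sbxweb.tcgis.ca/portal/home');
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
```

Results flow to the Grafana dashboards via k6's InfluxDB output, along the lines of (database name and host assumed here):

```
k6 run --out influxdb=http://localhost:8086/k6 egisloadtest.js
```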
Sandbox
Testing has been performed on the SBX instance, consisting of an Azure VM for each of the Web and Portal, GIS Server, and Data Store roles. Note that SBX is behind a firewall maintained by the cloud team, and our load-test VM is not whitelisted to access SBX at the time of writing. All tests were performed on the TC network. The results follow:
...
These results look much the same despite the improved hardware; no conclusions will be drawn until the test can be repeated. To clear up this confusion, the tests were repeated on the TC network in the early morning. The results were more promising:
...
Here we see SBX handle a 200-user load with acceptable response times.
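For context, a 200-user load is expressed in k6 as a staged ramp of virtual users. The sketch below shows the idea; the ramp durations and the p95 threshold are illustrative values, not the ones used in egisloadtest.js.

```javascript
export const options = {
  stages: [
    { duration: '2m', target: 200 }, // ramp up to 200 VUs
    { duration: '5m', target: 200 }, // hold the 200-user load
    { duration: '1m', target: 0 },   // ramp down
  ],
  thresholds: {
    // Fail the run if 95th-percentile response time exceeds the budget;
    // the 1500 ms figure here is a placeholder for "acceptable".
    http_req_duration: ['p(95)<1500'],
  },
};
```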
Dev
Testing has been performed on the DEV instance, consisting of an Azure VM for each of the Web and Portal, GIS Server, and Data Store roles.
...
The layer in question is located here: https://sbxweb.tcgis.ca/portal/home/item.html?id=459b8810542d4cc7a4e24aa65d741f3a
...
This layer is the result of an analysis that has been saved as a hosted layer. In our initial look it stood out as being noticeably slow. The reason for this is twofold. First, flood data consists of complex polygons: many vertices forming curved boundaries, multipart geometries, and holes. Second, the layer is viewable at a national scale even though the polygon features are too small to see at that scale.
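That complexity can be tallied from the service's own JSON. As a rough sketch (runnable with Node 18+ as an ES module; the FeatureServer URL below is hypothetical, so substitute the layer's actual REST endpoint from the item page):

```javascript
// Count rings (including holes) and vertices in a polygon layer's features.
// Hypothetical endpoint; substitute the flood layer's real FeatureServer URL.
const layerUrl =
  'https://sbxweb.tcgis.ca/server/rest/services/Hosted/Flood/FeatureServer/0';

const res = await fetch(
  `${layerUrl}/query?where=1%3D1&returnGeometry=true&outFields=&f=json`
);
const { features } = await res.json();

let rings = 0;
let vertices = 0;
for (const f of features) {
  for (const ring of f.geometry.rings) { // each ring is an array of [x, y] points
    rings += 1;
    vertices += ring.length;
  }
}
console.log(`${features.length} features, ${rings} rings, ${vertices} vertices`);
```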
As a demonstration, this layer was dissolved using the eGIS Analysis Tools to join all polygons that share the same property ID and date. The result was a layer that contained fewer than half the features of the original (201 versus 450) yet looked the same.
...
When the two layers were subjected to the performance tests above, the dissolved layer was noticeably faster.
...
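The comparison itself can be scripted in k6 by hitting both layers' query endpoints and tagging each request, so that http_req_duration can be split per layer in the Grafana dashboards. A sketch, with hypothetical service URLs standing in for the two hosted layers:

```javascript
import http from 'k6/http';
import { sleep } from 'k6';

export const options = { vus: 50, duration: '2m' };

// Hypothetical endpoints; substitute the real FeatureServer URLs
// for the original and dissolved hosted layers.
const ORIGINAL =
  'https://sbxweb.tcgis.ca/server/rest/services/Hosted/Flood/FeatureServer/0/query?where=1%3D1&f=json';
const DISSOLVED =
  'https://sbxweb.tcgis.ca/server/rest/services/Hosted/Flood_Dissolved/FeatureServer/0/query?where=1%3D1&f=json';

export default function () {
  // The "layer" tag lets Grafana break out response times per layer.
  http.get(ORIGINAL, { tags: { layer: 'original' } });
  http.get(DISSOLVED, { tags: { layer: 'dissolved' } });
  sleep(1);
}
```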
This was an exercise to show that the decision of how to represent a data set can greatly affect the performance of its service. In this instance, would using the simpler polygons of the Real Property data source have been any less true a representation of the flood data?
Suggestions
- Data that can be cached should be cached.
- Data that can’t be cached should be stored and displayed in the fewest and simplest features possible.
- Best practices for publishing and symbolizing data must be followed regardless of our hardware.
- Create map caches where possible.
- Re-project all data to WGS84 Web Mercator.
- Avoid all re-projection on the fly (a quick check is sketched below).
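Whether a hosted layer is already stored in Web Mercator can be confirmed from its REST metadata. A quick sketch (Node 18+ as an ES module; the layer URL is again hypothetical):

```javascript
// Check a layer's stored spatial reference; wkid 3857 is WGS84 Web Mercator.
// Hypothetical endpoint; substitute the layer's real FeatureServer URL.
const layerUrl =
  'https://sbxweb.tcgis.ca/server/rest/services/Hosted/Flood/FeatureServer/0';

const info = await (await fetch(`${layerUrl}?f=json`)).json();
const sr = info.extent?.spatialReference ?? {};
const wkid = sr.latestWkid ?? sr.wkid;
console.log(
  wkid === 3857
    ? 'Stored in Web Mercator; no re-projection needed for web maps'
    : `Stored in wkid ${wkid}; web-map requests will re-project on the fly`
);
```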
Debate?
- Test performance on all layers?
- Web hooks, test new layers?
...