I have encountered situations where data pulled from an external source is quite large and causes poor performance when viewing the related logs. For example, an API call returns ~2000 results that are parsed and used to update records in a SQL table. The whole process finishes in 400 ms, but the logs can take up to 2 minutes to expand the workdata.
Suggestion
When a single service returns a large array of data, could a limit be implemented in the treeview (e.g. the first 20 nodes), with an option to "show all 2000 child nodes", so that the logs can be viewed more quickly?
This would make it easier to scan the logs when I am looking for something that shares the same workdata as the service with the large array, while still keeping a small sample of the array quickly viewable for comparison. If I need the full list, I would either download the full log or click to show all, and at that point the long load time would be expected.
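To illustrate the idea, here is a minimal sketch of the suggested behavior. All names here (`TreeNode`, `truncateChildren`) are hypothetical, not part of any existing API: the render layer would cap the children shown under a node and append a placeholder entry that expands to the full list only on demand.

```typescript
// Hypothetical node shape for the log treeview.
interface TreeNode {
  label: string;
  // Marks the synthetic "show all" entry that loads the rest on click.
  expandAll?: boolean;
}

// Return at most `limit` children, plus a "show all N child nodes"
// placeholder when the array was truncated.
function truncateChildren(children: TreeNode[], limit = 20): TreeNode[] {
  if (children.length <= limit) {
    return children;
  }
  const shown = children.slice(0, limit);
  shown.push({
    label: `show all ${children.length} child nodes`,
    expandAll: true,
  });
  return shown;
}
```

With ~2000 results, the viewer would render only 21 entries up front (20 samples plus the placeholder), so the common case of comparing workdata stays fast, and the expensive full expansion happens only when explicitly requested.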
Loading Large Logs Suggestion
Re: Loading Large Logs Suggestion
Issue has been resolved as of DesignTime Revision 2296.