Loading Large Logs Suggestion
Posted: April 4th, 2022, 7:26 pm
I have encountered situations where data pulled from an external source is quite large and causes poor performance when viewing the related logs. For example, an API call returns ~2000 results that are parsed and used to update records in a SQL table. The whole process finishes in 400ms, but the logs can take up to 2 minutes to expand the workdata.
Suggestion
When a single service returns a large array of data, could a limit be implemented (e.g. the first 20 nodes) in the treeview, with an option to "show all 2000 child nodes", so that the logs can be viewed more quickly?
This would make it easier to scan the logs when I am looking for something that shares workdata with the service that returned the large array, while still keeping a small sample of that array visible for quick comparison. If I need the full list, I would either download the full log or click to show all, and at that point the long load time would be expected.
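To illustrate what I have in mind, here is a rough sketch of the truncation logic. Everything here is hypothetical (the TreeNode shape, PREVIEW_COUNT, and childrenToRender are just names I made up for the example), not a reference to any actual internals:

```typescript
// Assumed default preview size; the real value could be configurable.
const PREVIEW_COUNT = 20;

interface TreeNode {
  label: string;
  children?: TreeNode[];
}

// Returns the child nodes to actually render for one level of the tree.
// When the child list is large, only the first PREVIEW_COUNT entries are
// materialized, plus a placeholder entry the user can click to load the rest.
function childrenToRender(node: TreeNode, showAll: boolean): TreeNode[] {
  const kids = node.children ?? [];
  if (showAll || kids.length <= PREVIEW_COUNT) {
    return kids;
  }
  const preview = kids.slice(0, PREVIEW_COUNT);
  preview.push({
    // Clicking this placeholder would re-render the level with showAll = true.
    label: `… show all ${kids.length} child nodes`,
  });
  return preview;
}
```

The key point is that the expensive rendering only happens on demand, so the common case (glancing at a few entries) stays fast.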