Is there an efficient way to read a file line by line, but starting from the end?
My use case is grabbing just the most recent entries from the Lucee log files to show
a current server status page in the Log Analyzer plugin, with a refresh option that shows
what has happened since the status page was last loaded.
Is cflooping over the file the most memory-efficient approach, or does log4j etc. have
something to offer here?
I don’t believe this is possible without getting your hands “dirty”.
Now, a way to approach the problem: depending on the file size, you may place the contents on ram:// and do your work there. This will be very fast. If the file is large, however, your code may consume all your RAM, and you don’t want that. In that case, you can dump the file contents into a temp SQL table and just reverse the order with a simple query.
Hope this helps, or at least gives a first direction.
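For very large files, another JVM-level option (callable from Lucee via a small Java helper) is to seek to the end of the file with RandomAccessFile and scan backwards in fixed-size chunks, so only the last N lines are ever held in memory. This is a minimal sketch, not a production tail implementation: it assumes mostly single-byte characters, since a chunk boundary can in principle split a multi-byte UTF-8 character, and it skips blank lines.

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.util.ArrayDeque;
import java.util.Deque;

public class ReverseTail {
    // Read the last `maxLines` lines of a file without loading the whole file:
    // seek to the end and scan backwards in fixed-size chunks.
    public static Deque<String> tail(String path, int maxLines) throws IOException {
        Deque<String> lines = new ArrayDeque<>();
        try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
            long pos = raf.length();
            int chunkSize = 8192;
            StringBuilder pending = new StringBuilder();
            while (pos > 0 && lines.size() < maxLines) {
                int toRead = (int) Math.min(chunkSize, pos);
                pos -= toRead;
                raf.seek(pos);
                byte[] buf = new byte[toRead];
                raf.readFully(buf);
                // prepend the chunk; a line split across chunks stays in `pending`
                pending.insert(0, new String(buf, StandardCharsets.UTF_8));
                int nl;
                while ((nl = pending.lastIndexOf("\n")) >= 0 && lines.size() < maxLines) {
                    String line = pending.substring(nl + 1);
                    pending.delete(nl, pending.length());
                    if (!line.isEmpty()) lines.addFirst(line); // skip blank lines
                }
            }
            // whatever is left is the first line of the file (no newline before it)
            if (lines.size() < maxLines && pending.length() > 0) {
                lines.addFirst(pending.toString());
            }
        }
        return lines;
    }
}
```

Because it stops as soon as `maxLines` lines are collected, a request for the last 50 entries of a multi-gigabyte log reads only a few kilobytes from disk.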
That “trust” relies on the assertion that the user kept the default layout in the log settings, and did not use a custom pattern or layout class.
If that’s the case (which I would argue it is not, based on the above), then you can use the built-in functions listFirst(logline, ",", true, 5) and listRest(logline, ",", true, 6) to separate the “metadata” from the “message”.
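In plain Java the same idea is a split with a limit: splitting on at most the first five commas leaves any commas inside the message itself untouched. Note the field layout implied here (five metadata fields, then the message) is an assumption taken from the listFirst/listRest calls above, not a guaranteed Lucee log format:

```java
public class LogLineSplit {
    // Mirror of the CFML listFirst(..., 5) / listRest(..., 6) idea:
    // split on at most the first five commas, so commas inside the
    // message field survive intact.
    public static String[] split(String logline) {
        // elements 0-4 = metadata fields, element 5 = the whole remaining message
        return logline.split(",", 6);
    }
}
```

For example, a line whose message contains a comma still comes back as exactly six parts, with the message whole.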
Left on my todo list is some graphic design love, and then, using the tail function,
showing a summary page which tails all the current logs and shows what’s happened
recently, with filters by severity / log type, and a polling update which just grabs any
changes since the page was displayed / last poll.
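The “grab only what changed since the last poll” part can be done without re-reading the whole file by remembering the byte offset after each read and seeking straight to it on the next poll. This is a sketch under the assumption that the log is append-only between polls; a file that shrank (e.g. log rotation) is treated as new and read from the start:

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class LogPoller {
    private long offset; // byte position just past the last read

    // Return only the bytes appended since the previous poll.
    public String poll(String path) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
            long len = raf.length();
            if (len < offset) offset = 0;  // file shrank: assume rotation, start over
            if (len == offset) return "";  // nothing new since last poll
            raf.seek(offset);
            byte[] buf = new byte[(int) (len - offset)];
            raf.readFully(buf);
            offset = len;
            return new String(buf, StandardCharsets.UTF_8);
        }
    }
}
```

Each browser poll then costs one seek plus a read of only the new bytes, so a short polling interval stays cheap even on large logs.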