
I have a ./program which outputs lots of logging messages, and I have very limited storage space on my VPS.

I would like my log file to contain only the latest N lines at all times (or rather, whenever I terminate my program with ^C or it crashes...)

A. I do not want the following:

  1. "Redirect the output to a (log-)file and then use tail to keep only the last N lines."

    Well, the log file would still take up precious space until I run tail on it, which defeats the purpose... I could set up a cron job to do it periodically if I have no other choice (see the sketch after this list), but I'd like to explore the possibilities first.

  2. "Use logrotate."

    logrotate seems to be the proper solution to the problem, but it's too much of a hassle; I want something simpler, preferably something I can do with pipes and redirections.
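
(For completeness, this is the kind of cron job A.1 would use — a sketch; the path and line count are placeholders. Note that if ./program keeps the file open, it will continue writing to the old inode after the mv:)

    # prune the log to its last 1000 lines once a day (path/count are placeholders)
    12 0 * * * tail -n 1000 /path/to/logfile > /path/to/logfile.tmp && mv /path/to/logfile.tmp /path/to/logfile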

B. I have tried the following:

(I replaced ./program with seq 1000000 for testing)

  1. seq 1000000 | tee /dev/fd/2 | tail -n 10 > logfile

    Works perfectly fine when it terminates on its own, but when I interrupt it myself with ^C, the logfile is empty (whereas I expect it to contain the last 10 lines that tee printed on the screen). A workaround sketch follows these attempts.

  2. mkfifo fifo; tail fifo -n 10 > logfile & seq 1000000 | tee fifo

    Works perfectly fine when it terminates on its own, but when I interrupt it myself with ^C, the logfile is not empty; however, it is missing the latest log entries that were printed on the screen:


$ tail fifo -n 10 > logfile & seq 1000000 | tee fifo
[1] 2616
1
2
3
⋮
480178
480179
480180
480181
480182
480183
^C
[1]+  Done                    tail fifo -n 10 > logfile
$ cat logfile
479297
479298
479299
479300
479301
479302
479303
479304
479305

Here you can see that the latest entries on screen are in the 480,000s, whereas the latest entry in the logfile is 479,305... meaning I'm missing the 878 most recent lines! I think this has something to do with buffering, but I am not sure.
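
One workaround I'm considering for attempt B.1 (a sketch, assuming bash): signal dispositions set to "ignore" survive exec, so if the subshell running tail ignores SIGINT before exec'ing it, ^C kills only seq and tee; tail then sees EOF on the broken pipe and still writes its last 10 lines:

    # tail ignores SIGINT, so ^C kills only seq and tee;
    # tail then reads EOF and writes the last 10 lines it saw
    seq 1000000 | tee /dev/fd/2 | (trap '' INT; exec tail -n 10 > logfile)

Whatever tee had read but not yet written when it died is still lost, though, so the last few lines on screen may still be missing from the logfile.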

Can someone show me how to do this using only shell and (preferably standard) Linux utilities? Thanks!

  • Personally I use a cron job that runs every day at 12:12:12 am to purge log files... since they are pruned to size daily, they never get big. That cron job also generates some HTML to a file that gets included in a personal status page (a mix of PHP and HTML), which lets me quickly keep an eye on things.
    – Tyson
    Commented Jul 29, 2018 at 18:08

1 Answer


The simplest solution for your case is probably a circular log, which has a fixed size.
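
If you want to stay in plain shell instead, something along these lines may work (a rough sketch, assuming bash; keep-last.sh is a hypothetical name, and rewriting the file on every line is only cheap for small N):

    #!/bin/bash
    # keep-last.sh — keep the last N lines of stdin in FILE at all times
    # usage: ./program 2>&1 | ./keep-last.sh logfile 10
    file=$1 n=$2
    buf=()
    while IFS= read -r line; do
        printf '%s\n' "$line"                       # still echo to the screen
        buf+=("$line")
        ((${#buf[@]} > n)) && buf=("${buf[@]:1}")   # drop the oldest line
        printf '%s\n' "${buf[@]}" > "$file"         # file always holds the last N lines
    done

Since the file is rewritten after every line, it survives ^C and crashes, at the cost of extra writes.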

If you are on Linux, you can try the kernel module emlog.

The emlog kernel module implements a simple character device driver. The driver acts like a named pipe that has a finite, circular buffer. The size of the buffer is easily configurable. As more data is written into the buffer, the oldest data is discarded. A process that reads from an emlog device will first read the existing buffer, then see new text as it's written, similar to monitoring a log file using "tail -f". (Non-blocking reads are also supported, if a process needs to get the current contents of the log without blocking to wait for new data.)
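
A usage sketch (the numbers are assumptions: per emlog's documentation the minor device number sets the buffer size in KB, and the major number should be checked in /proc/devices after loading the module):

    insmod emlog.ko
    mknod /var/log/program.log c 248 16       # 16 KB circular buffer (major 248 is illustrative)
    ./program > /var/log/program.log 2>&1 &   # write into the circular buffer
    cat /var/log/program.log                  # read it back, tail -f style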

On BSD systems see CLOG(8)
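
For example (flags recalled from the clog(8) man page — treat this as a sketch and verify locally; writing into a clog file is normally done by a syslogd that understands the clog format):

    clog -i -s 65536 /var/log/program.log   # initialize a 64 KB circular log file
    clog -f /var/log/program.log            # follow it, like tail -f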

