I have a ./program which outputs lots of logging messages, and I have very limited storage space on my VPS. I would like my log-file to contain only the latest N lines at all times (or rather, at the moment I terminate my program with ^C, or when it crashes...).
A. I do not want the following:

"Redirect the output to a (log-)file and then use tail to keep only the last N lines."
Well, the log-file would still take up precious space until I run tail on it, which makes it pointless... I can set up a cronjob to do it periodically if I have no other choice, but I'd like to explore the possibilities first.

"Use logrotate."
logrotate seems to be the proper solution to the problem, but it's too much of a hassle and I want something simpler, preferably something I can do with pipes and redirections.
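For reference, the cronjob fallback mentioned above would amount to something like this (a sketch; the path and N are placeholders, and the tail-to-temp-file-then-mv dance is my own guess at avoiding truncating the log mid-read):

```shell
#!/bin/sh
# Sketch of the cronjob fallback: periodically keep only the last N lines.
# Caveat: mv replaces the file, so a program that keeps the log open would
# continue writing to the old (now unlinked) inode.
N=10
LOG=./logfile
seq 100 > "$LOG"            # stand-in for a log that has grown too large
tail -n "$N" "$LOG" > "$LOG.tmp" && mv "$LOG.tmp" "$LOG"
wc -l < "$LOG"              # the log now holds only the last N lines
```

The crontab entry would then just run that tail-and-mv line every few minutes, which is exactly the periodic cleanup I'd rather avoid.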
B. I have tried the following (with ./program replaced by seq 1000000 for testing):
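A note on the tee /dev/fd/2 in the first attempt, in case it looks odd: /dev/fd/2 is the process's own stderr, so tee writes one copy of the stream there (normally the terminal) while its stdout continues down the pipeline. A minimal demonstration:

```shell
# tee duplicates stdin to each file argument and to stdout; /dev/fd/2 is
# the process's own stderr, so one copy escapes the pipeline to the screen.
echo hello | tee /dev/fd/2 > /dev/null   # "hello" still appears, via stderr
```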
seq 1000000 | tee /dev/fd/2 | tail -n 10 > logfile
Works perfectly fine when the pipeline terminates on its own, but when I interrupt it with ^C, the logfile is empty (whereas I expect it to contain the last 10 lines that tee printed on the screen).

mkfifo fifo; tail fifo -n 10 > logfile & seq 1000000 | tee fifo
Works perfectly fine when the pipeline terminates on its own, but when I interrupt it with ^C, the logfile is not empty; however, it is missing the latest log entries that were printed on the screen:
$ tail fifo -n 10 > logfile & seq 1000000 | tee fifo
[1] 2616
1
2
3
⋮
480178
480179
480180
480181
480182
480183
^C
[1]+ Done tail fifo -n 10 > logfile
$ cat logfile
479297
479298
479299
479300
479301
479302
479303
479304
479305
Here you can see that the latest entries on the screen are in the 480-thousands, whereas the latest entry in the logfile is 479305... meaning I am missing the 878 latest lines (480183 - 479305 = 878)! I think this has something to do with buffering, but I am not sure.
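The empty logfile in the first attempt I think I can explain myself: tail -n 10 reading from a pipe cannot print anything until it sees EOF (before that it cannot know which lines will turn out to be the last 10), so ^C kills it before it has written a single byte. A quick check (timeout here simulates tail being killed mid-stream, roughly what ^C does):

```shell
#!/bin/sh
# tail -n 10 on a pipe holds everything back until EOF:
seq 100 | tail -n 10 | head -n 1       # prints 91: output appears only at EOF

# If tail is killed before EOF, it has produced nothing at all;
# sleep keeps the pipe open so tail never sees EOF before timeout kills it.
( seq 100; sleep 2 ) | timeout 0.5 tail -n 10 > out
wc -c < out                             # prints 0: the file is empty
rm -f out
```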
Can someone show me how to do this using only the shell and (preferably standard) Linux utilities? Thanks!