
Usually when I'm searching the logs, it's for something that happened recently. I end up tab-completing the file names at the bash prompt.

I finally wrote a bash alias that uses date arithmetic to build the file names. Something like the following:

grep <test> `date '+localhost_access_log.%C%y-%2m-%2d.txt' -d "-3 days"`
grep <test> `date '+localhost_access_log.%C%y-%2m-%2d.txt' -d "-2 days"`
grep <test> `date '+localhost_access_log.%C%y-%2m-%2d.txt' -d "-1 days"`
grep <test> `date '+localhost_access_log.%C%y-%2m-%2d.txt' -d "-0 days"`

Is there any way to optimize this? Ideally I would like to create multiple aliases to grep through the most recent three files, like localhost_access_log.2019-06-24.txt or api-2019-06-24-1.log, and I don't like the duplicated date command code. Essentially, I'm looking for help writing a bash function that, given a pattern, returns the four most recent file names (including today's log file) in a form that can be passed to a grep command.
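One possible shape for such a function, as a sketch: the name recent_logs is hypothetical, and GNU date's -d option is assumed. It prints one name per day for the last N days, oldest first, so the list can be fed straight to grep:

```shell
# Hypothetical helper: print the N most recent daily log file names
# built from a strftime-style pattern (GNU date's -d option assumed).
recent_logs() {
  local pattern=$1 n=${2:-4} i
  for ((i = n - 1; i >= 0; i--)); do
    date -d "-$i days" "+$pattern"
  done
}

# Usage (file name pattern taken from the question):
#   grep 400 $(recent_logs 'localhost_access_log.%Y-%m-%d.txt')
```

This keeps the date arithmetic in one place; note that unquoted command substitution still word-splits, which is fine here only because the generated names contain no spaces.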

2 Answers


If you don't insist on bash, you can do this quite easily in zsh using its glob qualifiers:

grep <test> *(ND.om[1,4])

will grep through the four most recent files in the current directory.

Explanation:

  • N - turn on "null glob" (silently substitute nothing if the glob pattern doesn't match)
  • D - turn on "GLOB_DOTS" (also match "hidden files")
  • . - match plain files
  • om - order by modification time
  • [1,4] - array-style subscript: keep only the first four matches

A pure bash solution could be something like

grep test $(ls -1rt | tail -n 4)

(But this will fail on funky filenames, such as ones with embedded spaces.)

find . -maxdepth 1 -type f -name "pattern" -mtime -1 -exec grep test {} +

will grep through files in the current directory whose names match pattern and which have been modified within the last day (this could be more than four files, but it could also be zero). It handles funky filenames, though.
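If you want exactly the four most recent files and safety with spaces in names, one bash sketch (GNU find's -printf is assumed; the sample files created below are purely for illustration) is to sort by modification time yourself and read the names into an array:

```shell
# Demo setup: a few dated log files in a temp dir (illustration only).
tmp=$(mktemp -d) && cd "$tmp"
for d in 21 22 23 24 25; do
  printf 'GET /x 400\n' > "localhost_access_log.2019-06-$d.txt"
  touch -d "2019-06-$d" "localhost_access_log.2019-06-$d.txt"
done

# GNU find prints "epoch-seconds path"; sort newest first, keep four,
# then strip the sort key. mapfile keeps names with spaces intact.
mapfile -t recent < <(
  find . -maxdepth 1 -type f -name 'localhost_access_log.*.txt' \
       -printf '%T@ %p\n' | sort -rn | head -n 4 | cut -d' ' -f2-
)
grep -H 400 -- "${recent[@]}"
```

Names containing newlines would still break this; for fully arbitrary names, the find ... -exec grep ... {} + form above remains the robust choice.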

  • From bash (or any other shell): zsh -c 'grep -e "$1" *(ND.om[1,4])' zsh "<test>"
    – Kusalananda
    Commented Jun 24, 2019 at 21:46
  • zsh isn't installed by default on ubuntu boxes - or at least it's not installed on the servers I checked. These being production boxes, I'll have to provide justification and will probably be laughed at for this request
    – user871199
    Commented Jun 24, 2019 at 21:50
  • I'd say increased productivity is a good justification. :) There are many other zsh features that increase productivity. Why not give it a try? zsh has no esoteric dependencies, and installing it won't break anything; it just makes the lives of people who know how to use it easier. Commented Jun 24, 2019 at 21:54

Here is a simple bash-based solution. It works by using ls to list the N most recent files matching a given wildcard, on the assumption that each log file is for a single day:

$ grep test $(ls -tr localhost_access_log.*.txt | tail -4)

If that counts as an unwise case of parsing the output of ls, then use this instead:

$ grep test $(find . -name 'localhost_access_log.*.txt' -mtime -3)

In general, the point is to avoid getting tangled up in the date arithmetic. Just run grep on the N most recent files. Again, this assumes that your logs rotate once per day, as your question suggests.
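That find-based approach can be wrapped in a small function; a sketch, where the name greprecent is an assumption and GNU find is assumed. Because it selects by modification time rather than by file name, it also catches size-split files like api-2019-06-24-1.log:

```shell
# Hypothetical wrapper: grep all files matching a glob that were
# modified less than N days ago (GNU find; -mtime -N means "< N days").
# -exec ... {} + avoids word-splitting problems with odd file names.
greprecent() {
  local pat=$1 days=$2
  shift 2
  find . -maxdepth 1 -type f -name "$pat" -mtime "-$days" \
       -exec grep -H "$@" {} +
}

# Usage: grep for "400" in api logs from the last 4 days:
#   greprecent 'api-*.log' 4 400
```

Any extra arguments after the day count are passed through to grep, so options like -i or -c work unchanged.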

  • L, the reason why I went for dates is the second log file name example I gave. There will be multiple files with that date, like api-2019-06-24-1.log or api-2019-06-24-25.log, as some log files are split based on size. I have to look for certain dates, not just the latest files
    – user871199
    Commented Jun 24, 2019 at 22:26
  • @user871199 Then the find version of the command may be the most useful to you. That will catch all log files <= N days old.
    – Jim L.
    Commented Jun 24, 2019 at 22:33
  • It will still fail on filenames with embedded spaces. Better use find ... -exec grep test {} +. Commented Jun 25, 2019 at 6:23
  • Granted, but from the OP it is known that they don't: date '+localhost_access_log.%C%y-%2m-%2d.txt'
    – Jim L.
    Commented Jun 25, 2019 at 13:21
