14

I want to monitor the log file of my application. However, the application doesn't run locally; it runs on a SaaS platform and the log is exposed over HTTP and WebDAV. So, an equivalent of tail -f that works on URLs would do a great job for me.

P.S. If you know of any other tools that can monitor remote files over HTTP, that would also be of help. Thanks

3
  • 1
    Is it shown as plain text on the remote server or as HTML?
    – terdon
    Commented Dec 3, 2012 at 12:46
  • Plain text with a specific format: [timestamp] Error_name ..... which I then intend to filter through grep
    – munch
    Commented Dec 3, 2012 at 12:54
  • You can use wget -N http://somewhere/something, which will download the file only if it's newer than the one you downloaded before, or use wget -O - http://somewhere/something to redirect the file to stdout (see the sketch after these comments).
    – week
    Commented Dec 3, 2012 at 13:08
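
A minimal sketch of the polling idea from the last comment, assuming the server sends Last-Modified headers (which -N needs for its timestamping check); the URL is the placeholder from the comment:

# Poll every 5 seconds; -N re-downloads only when the remote copy is
# reported as newer, -q suppresses wget's progress output.
while :; do
    wget -N -q http://somewhere/something
    sleep 5
done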

4 Answers

15

There may be a specific tool for this, but you can also do it using wget. Open a terminal and run this command:

while :; do 
    sleep 2
    wget -c -O log.txt -o /dev/null http://yoursite.com/log
done

This will download the logfile every two seconds and save it into log.txt, appending whatever is new to what is already there (-c means continue a partial download, so each run fetches only the bytes added since the last one, and -O sets the output file). The -o option redirects wget's log messages to /dev/null.

So, now you have a local copy of log.txt and can run tail -f on it:

tail -f log.txt 
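
If you want a single copy-paste invocation, the fetch loop can run in the background while tail follows the growing local copy; a sketch using the same placeholder URL:

# Background fetch loop plus foreground tail; kill %1 when done watching.
while :; do sleep 2; wget -c -O log.txt -o /dev/null http://yoursite.com/log; done &
tail -f log.txt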
7
  • I found out that I could use davfs2 to integrate with the WebDAV interface and then use the file like a regular file. This is what I really expected. But your solution is simpler and actually works
    – munch
    Commented Dec 3, 2012 at 16:05
  • I found that everything was being saved in the "log" file, not "log.txt". In my case this works: wget -ca -O log.txt -o /dev/null yoursite.com/log
    – yatsek
    Commented Feb 18, 2014 at 10:49
  • @munch davfs2 doesn't work that well. In my case I found that tail -f doesn't pick up file changes unless some other process actively asks the server for directory updates (a plain ls seems to be enough). The problem is that tail -f relies on inotify, and inotify doesn't seem to work over davfs2.
    – jesjimher
    Commented Jan 9, 2018 at 11:04
  • @jesjimher tail does not depend on inotify. It simply reads the file, seeks back and reads again (see the sketch after these comments). If it doesn't work well with davfs, that will be down to how davfs itself works. Presumably, it only updates its information when something is actively reading the directory, and since tail keeps the file open, that doesn't trigger it. Or something along those lines.
    – terdon
    Commented Jan 9, 2018 at 12:59
  • 1
    @KhaledAbuShqear yes, the \ are not needed here at all. In my defense, this was written quite a few years ago :)
    – terdon
    Commented Jun 12, 2020 at 16:33
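
For the curious, a rough illustration of the read/seek polling behaviour described in the comment above, not how tail is actually implemented; this sketch assumes GNU stat and a local file path in $f:

# Remember how much has been printed; whenever the file grows, print
# only the new bytes from that offset onward.
last=0
while sleep 1; do
    size=$(stat -c %s "$f")
    if [ "$size" -gt "$last" ]; then
        tail -c +"$((last + 1))" "$f"   # bytes after offset $last
        last=$size
    fi
done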
5

curl with its range option, in combination with watch, can be used to achieve this:

RANGES

HTTP 1.1 introduced byte-ranges. Using this, a client can request to get only one or more subparts of a specified document. Curl supports this with the -r flag.

watch -n <interval> 'curl -s -r -<bytes> <url>'

For example

watch -n 30 'curl -s -r -2000 http://yoursite.com/log'

This will retrieve the last 2000 bytes of the log every 30 seconds.

Note: for self-signed HTTPS certificates, use curl's --insecure option.
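
Since the question ultimately wants to grep the entries, the ranged fetch can be filtered inside the same watch command; a sketch assuming the [timestamp] Error_name format mentioned in the comments:

watch -n 30 'curl -s -r -2000 http://yoursite.com/log | grep Error_name'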

5

I answered the same question over here with a complete shell script that takes the URL as its argument and tail -f's it. Here's a copy of that answer verbatim:


This will do it:

#!/bin/bash

file=$(mktemp)
trap 'rm "$file"' EXIT

# Repeatedly ask the server for everything past the bytes we already
# have, and append it to the temporary file.
(while true; do
    # shellcheck disable=SC2094
    curl --fail -r "$(stat -c %s "$file")"- "$1" >> "$file"
done) &
pid=$!
trap 'kill "$pid"; rm "$file"' EXIT

tail -f "$file"

It's not very friendly to the web server. You could replace the true with sleep 1 to be less resource intensive.

Like tail -f, you need to press ^C when you are done watching, even after the output has finished.
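
For reference, usage would look like this, assuming the script is saved under the hypothetical name url-tail.sh:

chmod +x url-tail.sh
./url-tail.sh http://yoursite.com/log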

0

I have created a PowerShell script which

  1. Gets the content from the given URL every 20 seconds
  2. Gets only a specific amount of data using the "Range" HTTP request header.
while ($true) {
    # Ask for only the last 1000 bytes of the file via a negative Range header.
    $request = [System.Net.WebRequest]::Create("https://raw.githubusercontent.com/fascynacja/blog-demos/master/gwt-marquee/pom.xml")
    $request.AddRange(-1000)
    $response = $request.GetResponse()

    # Read the response body into a string, then release the connection.
    $stream = $response.GetResponseStream()
    $reader = New-Object System.IO.StreamReader($stream)
    $content = $reader.ReadToEnd()
    $reader.Close()
    $stream.Close()
    $response.Close()

    Write-Output $content

    Start-Sleep -Seconds 20
}

You can adjust the range and the interval to your own needs. Also, if needed, you can easily add color patterns for specific search terms.
