422

I have a log file being written by another process which I want to watch for changes. Each time a change occurs I'd like to read the new data in to do some processing on it.

What's the best way to do this? I was hoping there'd be some sort of hook from the PyWin32 library. I've found the win32file.FindNextChangeNotification function but have no idea how to ask it to watch a specific file.

If anyone's done anything like this I'd be really grateful to hear how...

[Edit] I should have mentioned that I was after a solution that doesn't require polling.

[Edit] Curses! It seems this doesn't work over a mapped network drive. I'm guessing Windows doesn't 'hear' any updates to the file the way it does on a local disk.

3
  • 1
    on Linux one could use strace monitoring write calls for this
    – test30
    Commented Mar 30, 2016 at 9:28
  • @simao's answer uses python-watchdog. Python-Watchdog has great documentation; its "QuickStart" page provides a minimal code example that watches the current working directory. Commented May 1, 2017 at 14:36
  • 1
    See Tim Golden's Python Stuff: Watch a Directory for Changes for an example of using the win32file.FindNextChangeNotification function.
    – martineau
    Commented Apr 5, 2022 at 16:54

28 Answers

332

Did you try using Watchdog?

Python API library and shell utilities to monitor file system events.

Directory monitoring made easy with

  • A cross-platform API.
  • A shell tool to run commands in response to directory changes.

Get started quickly with a simple example in Quickstart...
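
For illustration, here is a minimal sketch along the lines of the Quickstart (untested here; the log name and directory are placeholders - adapt them to your file):

import time

from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class LogChangeHandler(FileSystemEventHandler):
    def on_modified(self, event):
        # Fires for changes anywhere in the watched directory, so filter to our file
        if not event.is_directory and event.src_path.endswith('my.log'):
            print('Log changed, read the new data here')

observer = Observer()
observer.schedule(LogChangeHandler(), path='.', recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()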

11
  • 69
    Installable with easy_install? Check. Free license? Check. Solves the problem on the big platforms? Check. I endorse this answer. Only note: the example on their project page doesn't work out of the box. Use the one on their github instead.
    – Inaimathi
    Commented Oct 22, 2012 at 20:25
  • 9
    We use watchdog. We may switch to QFileSystemWatcher. Just a fair warning: watchdog is good but far from perfect on all platforms (at this time). Each OS has its idiosyncrasies. So, unless you are dedicated to making it perfect you will be pulling your hair out. If you are just looking to watch 10 files or so, I'd poll. OS disk caching is very mature and Watchdog involves polling APIs anyhow. It's mainly for watching huge folder structures IMHO. Commented Oct 15, 2013 at 23:29
  • 4
    My one gripe with watchdog is that it has many dependencies. Fewer than PyQt, of course, but it doesn't look and feel like the minimal, best-practice, does-one-job-and-does-it-right solution.
    – AndreasT
    Commented Jan 15, 2014 at 12:36
  • 1
    Is @denfromufa correct here? Does watchdog really lock files, so they can't be edited concurrently to watchdog watching them? I can hardly believe that, it would be completely useless. Commented Dec 9, 2015 at 23:24
  • 1
    @MichelMüller I just checked this example (see link below) and it works! not sure what was wrong before, but this answer does not provide any example. stackoverflow.com/a/18599427/2230844
    – den.run.ai
    Commented Dec 10, 2015 at 0:24
137

If polling is good enough for you, I'd just watch if the "modified time" file stat changes. To read it:

os.stat(filename).st_mtime

(Also note that the Windows native change event solution does not work in all circumstances, e.g. on network drives.)

import os

class Monkey(object):
    def __init__(self):
        self._cached_stamp = 0
        self.filename = '/path/to/file'

    def ook(self):
        stamp = os.stat(self.filename).st_mtime
        if stamp != self._cached_stamp:
            self._cached_stamp = stamp
            # File has changed, so do something...
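
To drive the class above on an interval, a minimal (untested) loop could look like this, calling ook() once a second until interrupted:

import time

monkey = Monkey()
while True:
    try:
        time.sleep(1)
        monkey.ook()   # checks st_mtime and reacts if it changed
    except KeyboardInterrupt:
        print('\nDone')
        break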
7
  • 3
    How can you do this on an interval?
    – dopatraman
    Commented May 4, 2016 at 15:16
  • 7
    @dopatraman Here is how you can do this on an interval ` import sys import time pub = Monkey() while True: try: time.sleep(1) pub.watch() except KeyboardInterrupt: print('\nDone') break except: print(f'Unhandled error: {sys.exc_info()[0]}') ` Commented Apr 25, 2017 at 19:22
  • 1
    Great simple solution! I added a check to keep it from reporting the file changed on first run: if self._cached_stamp is not None.
    – Noumenon
    Commented Feb 27, 2020 at 16:28
  • 1
    there's no watch method as @VladBezden wrote. is code missing?
    – Lei Yang
    Commented Jan 22, 2021 at 8:15
  • watch is defined as ook in the class above. Write Vlad's example, then add the class above, changing ook into watch.
    – nnsense
    Commented Sep 1, 2022 at 16:25
60

If you want a multiplatform solution, then check QFileSystemWatcher. Here is some example code (not sanitized):

import sys

from PyQt4 import QtCore

@QtCore.pyqtSlot(str)
def directory_changed(path):
    print('Directory Changed!!!')

@QtCore.pyqtSlot(str)
def file_changed(path):
    print('File Changed!!!')

# A running Qt event loop is needed for the watcher to emit its signals
app = QtCore.QCoreApplication(sys.argv)

fs_watcher = QtCore.QFileSystemWatcher(['/path/to/files_1', '/path/to/files_2', '/path/to/files_3'])

fs_watcher.connect(fs_watcher, QtCore.SIGNAL('directoryChanged(QString)'), directory_changed)
fs_watcher.connect(fs_watcher, QtCore.SIGNAL('fileChanged(QString)'), file_changed)

app.exec_()
8
  • 7
    I think that this is quite possibly the best answer of the bunch given that the others either a) rely on Win32's FileSystemWatcher object and cannot be ported or b) poll for the file (which is bad for performance and will not scale). It's a pity Python doesn't have this facility built in, as PyQt is a huge dependency if all you're using is the QFileSystemWatcher class. Commented Oct 13, 2011 at 10:07
  • 4
    I like this solution. I wanted to point out that you'll need a QApplication instance for it to work, I added "app = QtGui.QApplication(sys.argv)" right under the imports and then "app.exec_()" after the signal connections.
    – spencewah
    Commented May 2, 2012 at 22:36
  • Just testing this on a Linux box, I'm seeing that the directory_changed method is being called, but not file_changed.
    – Ken Kinder
    Commented Nov 22, 2012 at 19:08
  • 1
    why not use PySide for that instead of PyQt for such a small use. Commented Sep 16, 2015 at 15:22
  • 3
    This is no longer supported on PyQt5 Commented Feb 11, 2020 at 14:08
35

This won't work on Windows (maybe with Cygwin?), but for Unix users you can use the "fcntl" system call. Here is an example in Python. It's mostly the same code if you need to write it in C (same function names).

import time
import fcntl
import os
import signal

FNAME = "/HOME/TOTO/FILETOWATCH"

def handler(signum, frame):
    print "File %s modified" % (FNAME,)

signal.signal(signal.SIGIO, handler)
fd = os.open(FNAME,  os.O_RDONLY)
fcntl.fcntl(fd, fcntl.F_SETSIG, 0)
fcntl.fcntl(fd, fcntl.F_NOTIFY,
            fcntl.DN_MODIFY | fcntl.DN_CREATE | fcntl.DN_MULTISHOT)

while True:
    time.sleep(10000)
5
  • 5
    Works like a charm with Linux kernel 2.6.31 on an ext4 file system (on Ubuntu 10.04), though only for directories - it raises an IOError "not a directory" if I use it with a file. Commented Apr 30, 2010 at 0:44
  • 1
    GREAT! Same for me - it works for a directory only and watches the files in that directory. But it won't work for modified files in subdirectories, so it looks like you need to walk through the subdirectories and watch all of them. (Or is there a better way to do this?)
    – lfagundes
    Commented Nov 8, 2010 at 10:15
  • do I have to keep the directory open for this to work? Commented Dec 5, 2022 at 9:15
  • This does not work in macos.
    – scravy
    Commented May 31, 2023 at 20:55
  • fcntl is deprecated in favor of inotify on linux systems.
    – scravy
    Commented May 31, 2023 at 20:55
21

Check out pyinotify.

inotify replaces dnotify (from an earlier answer) on newer Linux kernels and allows file-level rather than directory-level monitoring.
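
For illustration, a rough pyinotify sketch (untested; the path is a placeholder):

import pyinotify

class Handler(pyinotify.ProcessEvent):
    def process_IN_MODIFY(self, event):
        print('%s was modified' % event.pathname)

wm = pyinotify.WatchManager()
notifier = pyinotify.Notifier(wm, Handler())
wm.add_watch('/path/to/file.log', pyinotify.IN_MODIFY)
notifier.loop()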

2
  • 7
    Not to put a damper on this answer, but after reading this article, I would say that it may not be as glamourous a solution as thought. serpentine.com/blog/2008/01/04/why-you-should-not-use-pyinotify Commented Nov 22, 2014 at 4:53
  • 1
    pyinotify has a lot of disadvantages, ranging from a very unpythonic code base to memory consumption. Better to look for other options.
    – NightOwl19
    Commented Aug 22, 2018 at 9:31
17

For watching a single file with polling, and minimal dependencies, here is a fully fleshed-out example based on Deestan's answer (above):

import os
import sys 
import time

class Watcher(object):
    running = True
    refresh_delay_secs = 1

    # Constructor
    def __init__(self, watch_file, call_func_on_change=None, *args, **kwargs):
        self._cached_stamp = 0
        self.filename = watch_file
        self.call_func_on_change = call_func_on_change
        self.args = args
        self.kwargs = kwargs

    # Look for changes
    def look(self):
        stamp = os.stat(self.filename).st_mtime
        if stamp != self._cached_stamp:
            self._cached_stamp = stamp
            # File has changed, so do something...
            print('File changed')
            if self.call_func_on_change is not None:
                self.call_func_on_change(*self.args, **self.kwargs)

    # Keep watching in a loop        
    def watch(self):
        while self.running: 
            try: 
                # Look for changes
                time.sleep(self.refresh_delay_secs) 
                self.look() 
            except KeyboardInterrupt: 
                print('\nDone') 
                break 
            except FileNotFoundError:
                # Action on file not found
                pass
            except: 
                print('Unhandled error: %s' % sys.exc_info()[0])

# Call this function each time a change happens
def custom_action(text):
    print(text)

watch_file = 'my_file.txt'

# watcher = Watcher(watch_file)  # simple
watcher = Watcher(watch_file, custom_action, text='yes, changed')  # also call custom action function
watcher.watch()  # start the watch going
6
  • 2
    You could make watch_file and _cached_stamp into lists, and iterate across them in a for loop. Doesn't really scale well to large numbers of files though
    – 4Oh4
    Commented Mar 5, 2018 at 15:56
  • Doesn't this trigger the action every time it's run? _cached_stamp is set to 0 and then compared to os.stat(self.filename).st_mtime. _cached_stamp should be set to os.stat(self.filename).st_mtime in the constructor, no? Commented Aug 16, 2019 at 22:08
  • 1
    call_func_on_change() will be triggered on the first run of look(), but then _cached_stamp is updated, so it won't be triggered again until the value of os.stat(self.filename).st_mtime changes.
    – 4Oh4
    Commented Aug 18, 2019 at 13:20
  • 2
    You could set the value of _cached_stamp in the constructor if you didn't want call_func_on_change() to be called on first run
    – 4Oh4
    Commented Aug 18, 2019 at 13:21
  • I've used your script to call some function on file change. My function doesn't take any arguments unlike yours. I thought that to make it work I need to remove *args, **kwargs It looked that (I put only lines with changes): self.call_func_on_change(self) def custom_action(): watcher = Watcher(watch_file, custom_action()) But this did not work. Action was only called during first iteration: File changed yes, changed File changed File changed File changed It started working when I kept *args and call it: watcher = Watcher(watch_file, custom_action) I struggle to wonder why?
    – zwornik
    Commented Apr 1, 2020 at 14:10
14

Well, after a bit of hacking on Tim Golden's script, I have the following, which seems to work quite well:

import os

import win32file
import win32con

path_to_watch = "." # look at the current directory
file_to_watch = "test.txt" # look for changes to a file called test.txt

def ProcessNewData( newData ):
    print "Text added: %s"%newData

# Set up the bits we'll need for output
ACTIONS = {
  1 : "Created",
  2 : "Deleted",
  3 : "Updated",
  4 : "Renamed from something",
  5 : "Renamed to something"
}
FILE_LIST_DIRECTORY = 0x0001
hDir = win32file.CreateFile (
  path_to_watch,
  FILE_LIST_DIRECTORY,
  win32con.FILE_SHARE_READ | win32con.FILE_SHARE_WRITE,
  None,
  win32con.OPEN_EXISTING,
  win32con.FILE_FLAG_BACKUP_SEMANTICS,
  None
)

# Open the file we're interested in
a = open(file_to_watch, "r")

# Throw away any existing log data
a.read()

# Wait for new data and call ProcessNewData for each new chunk that's written
while 1:
  # Wait for a change to occur
  results = win32file.ReadDirectoryChangesW (
    hDir,
    1024,
    False,
    win32con.FILE_NOTIFY_CHANGE_LAST_WRITE,
    None,
    None
  )

  # For each change, check to see if it's updating the file we're interested in
  for action, file in results:
    full_filename = os.path.join (path_to_watch, file)
    #print file, ACTIONS.get (action, "Unknown")
    if file == file_to_watch:
        newText = a.read()
        if newText != "":
            ProcessNewData( newText )

It could probably do with a load more error checking, but for simply watching a log file and doing some processing on it before spitting it out to the screen, this works well.

Thanks everyone for your input - great stuff!

9

Check my answer to a similar question. You could try the same loop in Python. This page suggests:

import time

while 1:
    where = file.tell()
    line = file.readline()
    if not line:
        time.sleep(1)
        file.seek(where)
    else:
        print line, # already has newline

Also see the question tail() a file with Python.

2
  • 1
    You can use sys.stdout.write(line). Your code doesn't work if the file is truncated. Python has the builtin function file().
    – jfs
    Commented Oct 8, 2008 at 13:04
  • I've posted a modified version of your code. You may incorporate it in your answer if it works for you.
    – jfs
    Commented Oct 8, 2008 at 13:15
9

This is another modification of Tim Golden's script that runs on Unix-like systems and adds a simple watcher for file modification by using a dict (file => mtime).

usage: whateverName.py path_to_dir_to_watch

#!/usr/bin/env python

import os, sys, time

def files_to_timestamp(path):
    files = [os.path.join(path, f) for f in os.listdir(path)]
    return dict ([(f, os.path.getmtime(f)) for f in files])

if __name__ == "__main__":

    path_to_watch = sys.argv[1]
    print('Watching {}..'.format(path_to_watch))

    before = files_to_timestamp(path_to_watch)

    while 1:
        time.sleep (2)
        after = files_to_timestamp(path_to_watch)

        added = [f for f in after.keys() if not f in before.keys()]
        removed = [f for f in before.keys() if not f in after.keys()]
        modified = []

        for f in before.keys():
            if not f in removed:
                if os.path.getmtime(f) != before.get(f):
                    modified.append(f)

        if added: print('Added: {}'.format(', '.join(added)))
        if removed: print('Removed: {}'.format(', '.join(removed)))
        if modified: print('Modified: {}'.format(', '.join(modified)))

        before = after
2
  • Updated to support python3
    – ronedg
    Commented Nov 4, 2019 at 17:53
  • 1
    Brilliant, thanks! Stuck that in a Thread with an Event to exit, et voilá. I replaced the listdir with the newer scandir which is supposedly faster and the result also has convenient stat() method for getting the mod time. Commented Jun 6, 2021 at 4:19
8

Here is a simplified version of Kender's code that appears to do the same trick and does not read in the entire file:

# Check file for new data.

import time

f = open(r'c:\temp\test.txt', 'r')

while True:

    line = f.readline()
    if not line:
        time.sleep(1)
        print 'Nothing New'
    else:
        print 'Call Function: ', line
0
8

Simplest solution for me is using watchdog's tool watchmedo

Using watchmedo from https://pypi.python.org/pypi/watchdog, I now have a process that looks for SQL files in a directory and executes them if necessary.

watchmedo shell-command \
--patterns="*.sql" \
--recursive \
--command='~/Desktop/load_files_into_mysql_database.sh' \
.
7

Well, since you are using Python, you can just open a file and keep reading lines from it.

f = open('file.log')

If the line read is not empty, you process it.

line = f.readline()
if line:
    # Do what you want with the line

What you may be missing is that it is OK to keep calling readline at EOF. It will just keep returning an empty string in this case. And when something is appended to the log file, the reading will continue from where it stopped, as you need.

If you are looking for a solution that uses events, or a particular library, please specify this in your question. Otherwise, I think this solution is just fine.
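
Putting the two snippets together, a compact (untested) sketch of the idea - the file name is a placeholder and process_line is a hypothetical handler:

import time

def process_line(line):
    print(line, end='')   # stand-in for whatever processing you need

with open('file.log') as f:
    while True:
        line = f.readline()
        if line:
            process_line(line)
        else:
            time.sleep(0.5)   # readline() returned '' at EOF, so wait and retry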

0
4

As you can see in Tim Golden's article, pointed out by Horst Gutmann, WIN32 is relatively complex and watches directories, not a single file.

I'd like to suggest you look into IronPython, which is a .NET python implementation. With IronPython you can use all the .NET functionality - including

System.IO.FileSystemWatcher

which handles single files with a simple Event interface.
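
A rough IronPython sketch of that approach (untested; the directory, file name, and handler are placeholders for illustration):

from System.IO import FileSystemWatcher, NotifyFilters

def on_changed(sender, event):
    print('Changed: %s' % event.FullPath)

watcher = FileSystemWatcher(r'C:\logs')
watcher.Filter = 'server.log'                  # narrow the watch to one file
watcher.NotifyFilter = NotifyFilters.LastWrite
watcher.Changed += on_changed                  # .NET event, Python handler
watcher.EnableRaisingEvents = True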

1
  • @Ciasto because then you have to have Iron Python available rather than a basic Python installation.
    – Jon Cage
    Commented Sep 17, 2015 at 13:05
3

This is an example of checking a file for changes. It may not be the best way of doing it, but it sure is a short way.

It's a handy tool for restarting an application when changes have been made to the source. I made this when playing with pygame so I could see effects take place immediately after saving the file.

When used in pygame, make sure the stuff in the 'while' loop is placed in your game loop (a.k.a. update or whatever). Otherwise your application will get stuck in an infinite loop and you will not see your game updating.

import os
import sys  # needed by restart_program() below

file_size_stored = os.stat('neuron.py').st_size

while True:
    try:
        file_size_current = os.stat('neuron.py').st_size
        if file_size_stored != file_size_current:
            restart_program()
    except:
        pass

In case you wanted the restart code which I found on the web. Here it is. (Not relevant to the question, though it could come in handy)

def restart_program(): #restart application
    python = sys.executable
    os.execl(python, python, * sys.argv)

Have fun making electrons do what you want them to do.

1
  • 1
    Seems like using .st_mtime instead of .st_size would be more reliable and an equally short way of doing this, although the OP has indicated that he didn't want to do it via polling.
    – martineau
    Commented Jun 29, 2015 at 16:40
3

Seems that no one has posted fswatch. It is a cross-platform file system watcher. Just install it, run it and follow the prompts.

I've used it with python and golang programs and it just works.
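
If you want to consume its output from Python, one (untested) sketch, assuming the fswatch CLI is on your PATH and using its default output of one affected path per line:

import subprocess

proc = subprocess.Popen(['fswatch', '/path/to/watch'],
                        stdout=subprocess.PIPE, text=True)
for line in proc.stdout:
    changed_path = line.strip()
    print('Changed:', changed_path)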

2
import os
import threading

import win32con
import win32file

ACTIONS = {
  1 : "Created",
  2 : "Deleted",
  3 : "Updated",
  4 : "Renamed from something",
  5 : "Renamed to something"
}
FILE_LIST_DIRECTORY = 0x0001

class myThread (threading.Thread):
    def __init__(self, threadID, fileName, directory, origin):
        threading.Thread.__init__(self)
        self.threadID = threadID
        self.fileName = fileName
        self.daemon = True
        self.dir = directory
        self.originalFile = origin
    def run(self):
        startMonitor(self.fileName, self.dir, self.originalFile)

def startMonitor(fileMonitoring,dirPath,originalFile):
    hDir = win32file.CreateFile (
        dirPath,
        FILE_LIST_DIRECTORY,
        win32con.FILE_SHARE_READ | win32con.FILE_SHARE_WRITE,
        None,
        win32con.OPEN_EXISTING,
        win32con.FILE_FLAG_BACKUP_SEMANTICS,
        None
    )
    # Wait for new data and call ProcessNewData for each new chunk that's
    # written
    while 1:
        # Wait for a change to occur
        results = win32file.ReadDirectoryChangesW (
            hDir,
            1024,
            False,
            win32con.FILE_NOTIFY_CHANGE_LAST_WRITE,
            None,
            None
        )
        # For each change, check to see if it's updating the file we're
        # interested in
        for action, file_M in results:
            full_filename = os.path.join (dirPath, file_M)
            #print file, ACTIONS.get (action, "Unknown")
            if len(full_filename) == len(fileMonitoring) and action == 3:
                #copy to main file
                ...
2

Since I have it installed globally, my favorite approach is to use nodemon. If your source code is in src, and your entry point is src/app.py, then it's as easy as:

nodemon -w 'src/**' -e py,html --exec python src/app.py

... where -e py,html lets you control what file types to watch for changes.

2

Just to put this out there since no one mentioned it: there's a Python module in the Standard Library named filecmp which has this cmp() function that compares two files.

Just make sure you don't do from filecmp import cmp, so that you don't shadow the built-in cmp() function in Python 2.x. That's okay in Python 3.x, though, since there's no such built-in cmp() function anymore.

Anyway, this is what its use looks like:

import filecmp
filecmp.cmp(path_to_file_1, path_to_file_2, shallow=True)

The shallow argument defaults to True. If its value is True, only the metadata of the files is compared; if it is False, the contents of the files are compared.

Maybe this information will be useful to someone.
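
If you wanted to turn it into a crude change detector, one (untested) sketch is to keep a snapshot copy and compare against it; the file names here are placeholders:

import filecmp
import shutil
import time

shutil.copy2('watched.log', 'watched.snapshot')   # reference copy
while True:
    time.sleep(1)
    # shallow=False compares actual contents, not just os.stat() metadata
    if not filecmp.cmp('watched.log', 'watched.snapshot', shallow=False):
        print('watched.log changed')
        shutil.copy2('watched.log', 'watched.snapshot')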

1

Here's an example geared toward watching input files that write no more than one line per second but usually a lot less. The goal is to append the last line (most recent write) to the specified output file. I've copied this from one of my projects and just deleted all the irrelevant lines. You'll have to fill in or change the missing symbols.

from PyQt5.QtCore import QFileSystemWatcher, QSettings, QThread
from PyQt5.QtWidgets import QMainWindow
from ui_main_window import Ui_MainWindow   # Qt Creator gen'd 

class MainWindow(QMainWindow, Ui_MainWindow):
    def __init__(self, parent=None):
        QMainWindow.__init__(self, parent)
        Ui_MainWindow.__init__(self)
        self._fileWatcher = QFileSystemWatcher()
        self._fileWatcher.fileChanged.connect(self.fileChanged)

    def fileChanged(self, filepath):
        QThread.msleep(300)    # Reqd on some machines, give chance for write to complete
        # ^^ About to test this, may need more sophisticated solution
        with open(filepath) as file:
            lastLine = list(file)[-1]
        destPath = self._filemap[filepath]['dest file']
        with open(destPath, 'a') as out_file:               # a= append
            out_file.writelines([lastLine])

Of course, the encompassing QMainWindow class is not strictly required, i.e. you can use QFileSystemWatcher alone.

1

watchfiles (https://github.com/samuelcolvin/watchfiles) is a Python API and CLI that uses the Notify (https://github.com/notify-rs/notify) library written in Rust.

The Rust implementation currently (2022-10-09) supports:

  • Linux / Android: inotify
  • macOS: FSEvents or kqueue, see features
  • Windows: ReadDirectoryChangesW
  • FreeBSD / NetBSD / OpenBSD / DragonflyBSD: kqueue
  • All platforms: polling

Binaries available on PyPI (https://pypi.org/project/watchfiles/) and conda-forge (https://github.com/conda-forge/watchfiles-feedstock).
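
A minimal sketch of the Python API (the path is a placeholder): watch() blocks and yields a set of (change, path) tuples for each batch of events.

from watchfiles import watch

for changes in watch('/path/to/dir_or_file'):
    for change, path in changes:
        print(change, path)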

0

You can also use a simple library called repyt; here is an example:

repyt ./app.py
0

Related to @4Oh4's solution; a small change to watch a list of files:

import os
import sys
import time

class Watcher(object):
    running = True
    refresh_delay_secs = 1

    # Constructor
    def __init__(self, watch_files, call_func_on_change=None, *args, **kwargs):
        self._cached_stamp = 0
        self._cached_stamp_files = {}
        self.filenames = watch_files
        self.call_func_on_change = call_func_on_change
        self.args = args
        self.kwargs = kwargs

    # Look for changes
    def look(self):
        for file in self.filenames:
            stamp = os.stat(file).st_mtime
            if not file in self._cached_stamp_files:
                self._cached_stamp_files[file] = 0
            if stamp != self._cached_stamp_files[file]:
                self._cached_stamp_files[file] = stamp
                # File has changed, so do something...
                with open(file, 'r') as file_to_read:
                    value = file_to_read.read()
                print("value from file", value)
                if self.call_func_on_change is not None:
                    self.call_func_on_change(*self.args, **self.kwargs)

    # Keep watching in a loop
    def watch(self):
        while self.running:
            try:
                # Look for changes
                time.sleep(self.refresh_delay_secs)
                self.look()
            except KeyboardInterrupt:
                print('\nDone')
                break
            except FileNotFoundError:
                # Action on file not found
                pass
            except Exception as e:
                print(e)
                print('Unhandled error: %s' % sys.exc_info()[0])

# Call this function each time a change happens
def custom_action(text):
    print(text)
    # pass

watch_files = ['/Users/mexekanez/my_file.txt', '/Users/mexekanez/my_file1.txt']

# watcher = Watcher(watch_file)  # simple



if __name__ == "__main__":
    watcher = Watcher(watch_files, custom_action, text='yes, changed')  # also call custom action function
    watcher.watch()  # start the watch going
0

The best and simplest solution is to use pygtail: https://pypi.python.org/pypi/pygtail

from pygtail import Pygtail
import sys

while True:
    for line in Pygtail("some.log"):
        sys.stdout.write(line)
0
import inotify.adapters
from datetime import datetime


LOG_FILE='/var/log/mysql/server_audit.log'


def main():
    start_time = datetime.now()
    while True:
        i = inotify.adapters.Inotify()
        i.add_watch(LOG_FILE)
        for event in i.event_gen(yield_nones=False):
            break
        del i

        with open(LOG_FILE, 'r') as f:
            for line in f:
                entry = line.split(',')
                entry_time = datetime.strptime(entry[0],
                                               '%Y%m%d %H:%M:%S')
                if entry_time > start_time:
                    start_time = entry_time
                    print(entry)


if __name__ == '__main__':
    main()
-1

If you're using windows, create this POLL.CMD file

@echo off
:top
xcopy /m /y %1 %2 | find /v "File(s) copied"
timeout /T 1 > nul
goto :top

then you can type "poll dir1 dir2" and it will copy all the files from dir1 to dir2 and check for updates once per second.

The "find" is optional, just to make the console less noisy.

This is not recursive. Maybe you could make it recursive using /e on the xcopy.

-1

The easiest solution would be to get two snapshots of the same file after an interval and compare them. You could try something like this:

    while True:
        # Capturing the two instances models.py after certain interval of time
        print("Looking for changes in " + app_name.capitalize() + " models.py\nPress 'CTRL + C' to stop the program")
        with open(app_name.capitalize() + '/filename', 'r+') as app_models_file:
            filename_content = app_models_file.read()
        time.sleep(5)
        with open(app_name.capitalize() + '/filename', 'r+') as app_models_file_1:
            filename_content_1 = app_models_file_1.read()
        # Comparing models.py after certain interval of time
        if filename_content == filename_content_1:
            pass
        else:
            print("You made a change in " + app_name.capitalize() + " filename.\n")
            cmd = str(input("Do something with the file?(y/n):"))
            if cmd == 'y':
                pass  # Do Something
            elif cmd == 'n':
                pass  # pass or do something
            else:
                print("Invalid input")
-5

I don't know of any Windows-specific function. You could try getting the MD5 hash of the file every second/minute/hour (depending on how fast you need it) and comparing it to the last hash. When it differs, you know the file has been changed, and you can read out the newest lines.
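
A short (untested) sketch of that idea, with the file name and interval as placeholders:

import hashlib
import time

def md5_of(path):
    with open(path, 'rb') as f:
        return hashlib.md5(f.read()).hexdigest()

last = md5_of('my.log')
while True:
    time.sleep(1)
    current = md5_of('my.log')
    if current != last:
        last = current
        print('my.log changed; read out the newest lines here')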

0
-6

I'd try something like this.

    try:
            f = open(filePath)
    except IOError:
            print "No such file: %s" % filePath
            raw_input("Press Enter to close window")
    try:
            lines = f.readlines()
            while True:
                    line = f.readline()
                    try:
                            if not line:
                                    time.sleep(1)
                            else:
                                    functionThatAnalisesTheLine(line)
                    except Exception, e:
                            # handle the exception somehow (for example, log the trace) and raise the same exception again
                            raw_input("Press Enter to close window")
                            raise e
    finally:
            f.close()

The loop checks whether there are any new lines since the last time the file was read - if there are, they're read and passed to the functionThatAnalisesTheLine function. If not, the script waits 1 second and retries.

2
  • 4
    -1: Opening the file and reading lines isn't a great idea when the files could be 100's of MB big. You'd have to run it for each and every file too which would be bad when you want to watch 1000's of files.
    – Jon Cage
    Commented Aug 4, 2009 at 8:48
  • 2
    Really? Opening the file for changes? Commented Sep 15, 2014 at 21:05
