
I have directory a and directory b. Directory a has new files and folders copied to it periodically. I would like to monitor folder a for those new files and automatically copy them to folder b. Unfortunately, some organizational scripts that I have previously set up in directory b preclude me from using rsync for this purpose, since the folder structure at the destination would most likely differ too greatly between rsync runs.

Is there any kind of alternative setup that I can use?

2 Answers


Another way of doing this would be to use inotify:

  1. Install the inotify-tools package

     sudo apt-get install inotify-tools
    
  2. Write a little script that uses inotifywatch to check your folder for changes and copies any new files to the target directory:

     #!/usr/bin/env bash
    
     ## The target and source can contain spaces as 
     ## long as they are quoted. 
     target="/path/to/target dir"
     source="/path to/source/dir"
    
     while true; do 
    
       ## Watch for new files, the grep will return true if a file has
       ## been copied, modified or created.
       inotifywatch -e modify -e create -e moved_to -t 1 "$source" 2>/dev/null |
          grep total && 
    
       ## The -u option to cp causes it to only copy files 
       ## that are newer in $source than in $target. Any files
       ## not present in $target will be copied.
       cp -vu "$source"/* "$target"/
     done
    
  3. Save that script in your $PATH and make it executable, e.g.:

     chmod 744 /usr/bin/watch_dir.sh
    
  4. To have it run every time your machine reboots, create a crontab (crontab -e, as described in @MariusMatutiae's answer) and add this line to it:

     @reboot /usr/bin/watch_dir.sh 
    

Now, every time you reboot, the directory will automatically be watched and new files copied from source to target.
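The `-u` flag that the `cp` step relies on can be exercised on its own. This sketch (using throwaway `mktemp` directories rather than the paths from the script above) shows that only files missing from, or newer in, the source are transferred:

```shell
# Demonstrate the -u (update) semantics used by the watcher script:
# cp -u copies a file only when it is missing from, or newer than,
# its counterpart in the target directory.
src=$(mktemp -d)
dst=$(mktemp -d)

printf 'old\n' > "$src/a.txt"
cp "$src/a.txt" "$dst/a.txt"       # a.txt now present in both trees

printf 'new\n' > "$src/b.txt"      # b.txt exists only in the source

cp -vu "$src"/* "$dst"/            # copies b.txt; unchanged a.txt is skipped
ls "$dst"
```

The same `-u` semantics are what keep the one-second loop above from recopying unchanged files on every pass.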

  • Instead of cp you may use rsync -rc "$source/" "$target/" --delete to basically clone the folder every time. (Commented Aug 29, 2019)
  • Very useful, I was able to nearly drag and drop for my purposes. Thanks!
    – PaulIsLoud (Commented Feb 19, 2021)

You can set up a crontab job to run every N minutes. The job will search for files modified less than N minutes ago, and copy them to their new destination.

For instance if you want to run the file /home/my_name/bin/custom every 10 minutes, you edit your crontab file by means of the command

 crontab -e

and add the following line at the end:

 */10 * * * * /home/my_name/bin/custom

The file custom, made executable by

 chmod 755 custom

could be something like this:

 #!/bin/sh

 cd /directory/to/be/monitored
 find . -type f -mmin -10 -exec sh -c '
   base=${1##*/}
   scp "$1" "me@remotemachine:/target/directory/$base"
 ' sh {} \;

This command recursively searches the monitored directory for files modified less than ten minutes ago (-mmin -10) and copies them to their new destination with scp. It puts all files into the same directory /target/directory, irrespective of their origin. Passing each pathname to the inner shell as a positional parameter ($1), rather than embedding {} inside the quoted script, keeps file names with spaces or special characters safe. You must have set up passwordless login for this to work, of course.
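As a quick local check of the same flattening pattern, `cp` can stand in for `scp` (the remote host and `/target/directory` above are placeholders), and throwaway `mktemp` directories stand in for the real source and target:

```shell
# Local sketch: flatten-copy files modified in the last 10 minutes.
# cp replaces scp so no remote host is needed; $1 is each found file.
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/sub dir"                    # a subdirectory with a space
printf 'one\n' > "$src/top.txt"
printf 'two\n' > "$src/sub dir/nested.txt"

find "$src" -type f -mmin -10 -exec sh -c '
  base=${1##*/}                            # strip the directory part
  cp "$1" "$2/$base"
' sh {} "$dst" \;

ls "$dst"                                  # both files, flattened
```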

If instead you wish to retain the directory structure (i.e., not pile everything up in the same directory), modify the above as follows:

 find . -type f -mmin -10 -exec sh -c '
   dirpath=${1%/*}
   ssh me@remotemachine mkdir -p "/target/directory/$dirpath"
   scp "$1" "me@remotemachine:/target/directory/$1"
 ' sh {} \;

There is no error checking here, modify as you see fit.
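The structure-preserving variant can likewise be sketched locally, with `mkdir -p` and `cp` standing in for the remote `ssh ... mkdir` and `scp` (the paths here are throwaway `mktemp` directories):

```shell
# Local sketch: copy recent files while recreating their directory layout.
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/a/b"
printf 'data\n' > "$src/a/b/file.txt"
printf 'top\n'  > "$src/top.txt"

cd "$src"
find . -type f -mmin -10 -exec sh -c '
  dir=$(dirname "$1")                # e.g. ./a/b, or . for top-level files
  mkdir -p "$2/$dir"                 # recreate the path under the target
  cp "$1" "$2/$1"
' sh {} "$dst" \;

find "$dst" -type f                  # mirrors the source layout
```

Using `dirname` (rather than `${1%/*}`) also handles top-level files cleanly, since it yields `.` instead of the file name itself when there is no subdirectory.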
