You can set up a crontab job to run every N minutes. The job will search for files modified less than N minutes ago, and copy them to their new destination.
For instance, if you want to run the file /home/my_name/bin/custom every 10 minutes, edit your crontab file by means of the command
crontab -e
and add the following line at the end:
*/10 * * * * /home/my_name/bin/custom
The file custom, made executable by
chmod 755 custom
could be something like this:
#!/bin/sh
cd /directory/to/be/monitored
find . -type f -mmin -10 -exec sh -c ' file=$1; base=${file##*/}; \
scp "$file" me@remotemachine:/target/directory/"$base" ' sh {} \;
This command searches the monitored directory recursively for files modified less than ten minutes ago (-mmin -10) and copies them to their new destination. It puts all files into the single directory /target/directory, irrespective of where they came from. You must have set up passwordless (key-based) login to remotemachine for this to work, of course.
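Setting up passwordless login is a one-time step. Roughly (the user name and host here are just the ones used above; adjust to taste):

```shell
# Generate a key pair on the local machine (one-time; accept the defaults):
ssh-keygen -t ed25519
# Install the public key on the remote machine (you will be asked for
# the password one last time):
ssh-copy-id me@remotemachine
# Test: this should now log in and run without prompting.
ssh me@remotemachine true
```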
If instead you wish to retain the directory structure (i.e., not pile everything up in the same directory), modify the above as follows:
find . -type f -mmin -10 -exec sh -c ' file=$1; dirpath=${file%/*}; \
ssh me@remotemachine mkdir -p "/target/directory/$dirpath"; \
scp "$file" "me@remotemachine:/target/directory/$file" ' sh {} \;
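The two parameter expansions doing the work here are ${file##*/}, which strips everything up to and including the last slash (leaving the basename), and ${file%/*}, which strips the last slash and everything after it (leaving the directory part). A quick check at the shell, with a made-up path:

```shell
# Sample path, as find would print it (with a leading ./):
file=./sub/dir/report.txt
base=${file##*/}      # remove longest match of */ from the front
dirpath=${file%/*}    # remove shortest match of /* from the end
echo "$base"          # prints: report.txt
echo "$dirpath"       # prints: ./sub/dir
```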
There is no error checking here; modify as you see fit.
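As one sketch of what minimal error checking could look like — using cp to a local stand-in directory instead of scp so the example is self-contained, with made-up paths under /tmp:

```shell
#!/bin/sh
# Sketch only: local stand-in paths, cp in place of scp.
SRC=/tmp/monitored          # stand-in for /directory/to/be/monitored
DEST=/tmp/target/directory  # stand-in for me@remotemachine:/target/directory
LOG=/tmp/copy_errors.log

mkdir -p "$SRC" "$DEST" || exit 1
echo "sample" > "$SRC/sample.txt"   # a freshly modified file for find to pick up

cd "$SRC" || exit 1
# Log each failed copy instead of failing silently:
find . -type f -mmin -10 -exec sh -c '
  file=$1; dest=$2; log=$3
  cp "$file" "$dest/${file##*/}" || echo "copy failed: $file" >> "$log"
' sh {} "$DEST" "$LOG" \;
```

In the real script you would replace cp with the scp call, and might also want to mail yourself the log, or have the script exit nonzero so cron reports the failure.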