I want to write the paths of all files modified in my Documents directory in the last 24 hours into a file. After that I want to take each line of that file, which should be the path of a document, and copy that document to another directory. But I have a problem.
This is what I use to copy the path and name of the documents into a file:
find ./Documents -mtime -1 -type f > ./Documents/renc.txt
This gives me, with their paths, all the documents created or modified in the last 24 hours.
But then, when I try to assign a value to a variable b so that I can copy the file afterwards, it fails:

b= `sed -n "3p" ~/Documents/renc.txt`
bash: ~/Documents/test.odt: No such file or directory
If that is not clear: what I want is to get ~/Documents/test.odt out of the file renc.txt and into the variable b, and then run another command:
cp $b ~/Others
Of course there will be several documents, so afterwards I will add a loop to process every line of the file.
Thank you for your help.
If what you’re really asking is, “How can I copy my documents to a different directory every day, as a backup?” then you can use rsync. This tool will consider every file in the source tree, but will only copy the files that are newer in the source tree:
rsync -av Documents/ /path/to/backupDocuments/
If you’re asking, “How can I copy my documents to a remote server every day, as a backup?” then you can use rsync running over ssh. This version will transfer only the changed parts of the files that are newer in the source tree:
rsync -av Documents/ [email protected]:backupDocuments/
If you really just want to find documents created or modified in the last 24 hours and copy them somewhere (bearing in mind that files older than that may be skipped if you don’t run your script precisely every 24 hours), then this may help. Along the way, it creates a list of the copied files in /tmp/copied_files.list:
find Documents -depth -mtime -1 -fprint /tmp/copied_files.list -print0 | pax -0 -d -rw /path/to/backupDocuments/
All things considered, I’d recommend rsync as the better option.
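As for the error in the question itself: a shell assignment must have no space around the =. Writing b= `sed …` sets b to empty only for the duration of the next command and then tries to *execute* the output of sed as a command, which is exactly the “No such file or directory” message you saw. A minimal sketch of the fix and of the loop over every line (using throwaway mktemp paths here rather than your real ~/Documents and ~/Others):

```shell
#!/bin/sh
# Illustrative setup: temporary stand-ins for ~/Documents, ~/Others, and renc.txt.
src=$(mktemp -d)
dest=$(mktemp -d)
list=$(mktemp)

printf 'hello' > "$src/test.odt"          # a freshly modified document
find "$src" -mtime -1 -type f > "$list"   # same find as in the question

# Correct assignment: no space after '=', and $(...) instead of backticks.
b=$(sed -n "1p" "$list")
cp "$b" "$dest"

# Looping over every line of the list instead of picking one line at a time:
while IFS= read -r f; do
    cp "$f" "$dest"
done < "$list"
```

The quotes around "$b" and "$f" keep paths with spaces intact, and IFS= read -r preserves leading whitespace and backslashes. Note that any line-by-line approach breaks on file names containing newlines, which is one more reason to prefer rsync.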