Revision 3 as of 2020-06-27 11:22:04
Notes related to management of personal photo albums
Store photos in the directory structure /yyyy/yyyymmdd/<photos>.jpg
- The problems are the following:
- keeping multiple backups
- syncing them reliably
- detecting corrupt photos before syncing
- merging photo changes
- detecting duplicates, e.g. after a directory rename
Using hashdeep to detect corruption
- hashdeep creates MD5 and SHA-256 checksums and saves them; the saved list can later be used to verify that files have not changed.
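For reference, hashdeep's default output is a short header followed by one CSV line per file, with the size as the first field — which is what the size-based greps further down rely on. Roughly (digests and paths here are illustrative placeholders):

```
%%%% HASHDEEP-1.0
%%%% size,md5,sha256,filename
## Invoked from: /home/user
## $ hashdeep -rl Photos
48213,aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa,bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb,2019/20190101/IMG_0001.jpg
```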
Create hashes (-r recursive, -l relative file paths)
{{{ hashdeep -rl Photos/{19*,20*,jokes} | tee $(date +"./Photos/hashdeep/%Y%m%d-hashdeep.hash.txt") }}}
Audit hashes (-a audit mode, -k load known hashes); here $1 is the name of the hash file to audit against
{{{ hashdeep -rl -a -k ./Photos/hashdeep/$1 Photos/{19*,20*,jokes} | tee $(date +"./Photos/hashdeep/%Y%m%d-hashdeep.audit.txt") }}}
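Since the hash files are prefixed with %Y%m%d, plain lexical sort also orders them chronologically, so the newest one can be picked automatically instead of passing its name as $1. A minimal sketch (latest_hashfile is a hypothetical helper, not part of hashdeep):

```shell
# Hypothetical helper: pick the newest hash file in a directory by name.
# Filenames start with %Y%m%d, so lexical sort is also chronological.
latest_hashfile() {
    ls "$1"/*-hashdeep.hash.txt 2>/dev/null | sort | tail -n 1
}
```

It could then be used as, for example, hashdeep -rl -a -k "$(latest_hashfile ./Photos/hashdeep)" Photos/{19*,20*,jokes}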
- with the hash file name in $hashfile (export hashfile=hashfile.txt)
get all duplicates
{{{ sort $hashfile | uniq -w100 -D }}}
get 0 byte files
{{{ grep "^0," $hashfile }}}
get small files (100-999 bytes)
{{{ grep "^...," $hashfile }}}
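These queries work because the size is the first CSV field and the digests come before the filename, so uniq -w100 compares only the size and hash columns and ignores the path. A self-contained sketch on a toy hash file (digests below are made-up placeholders, not real hashes):

```shell
# Toy hash file in hashdeep's CSV shape: size,md5,sha256,filename
hashfile=$(mktemp)
cat > "$hashfile" <<'EOF'
4096,aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa,bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb,2019/20190101/a.jpg
4096,aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa,bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb,2019/20190315/a_copy.jpg
0,cccccccccccccccccccccccccccccccc,dddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddd,2020/20200101/empty.jpg
512,eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee,ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff,2020/20200102/thumb.jpg
EOF
sort "$hashfile" | uniq -w100 -D   # prints both 4096-byte duplicate entries
grep "^0," "$hashfile"             # prints the zero-byte entry
grep "^...," "$hashfile"           # prints the 512-byte entry (3-digit size)
```

Note that {{{ grep "^...," }}} matches exactly three characters before the first comma, i.e. sizes of 100-999 bytes; a wider range would need something like grep -E "^.{2,3}," instead.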