Notes on managing personal photo albums

  • Store photos in the directory structure /yyyy/yyyymmdd/<photos>.jpg (see the filing sketch after this list).

  • The problems to solve are the following:
    1. keeping multiple backups
    2. syncing them reliably
      1. detecting corrupt photos before syncing
      2. merging photo changes
      3. detecting duplicates, e.g. after a directory rename
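
A minimal sketch of filing incoming photos into that layout, assuming GNU date and a hypothetical incoming/ staging directory (neither is named on this page); it uses the file's modification time rather than EXIF data:

    # File photos into Photos/yyyy/yyyymmdd/ by modification time.
    for f in incoming/*.jpg; do
      d=$(date -r "$f" +"%Y/%Y%m%d")   # e.g. 2020/20200627
      mkdir -p "Photos/$d"
      mv -n "$f" "Photos/$d/"          # -n: never overwrite an existing file
    done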

Using hashdeep to detect corruption

  • hashdeep computes MD5 and SHA-256 checksums and saves them; the saved hash file can be used later to verify that the files have not changed.
  • Create hashes (-r recursive, -l relative file paths):

    hashdeep -rl  Photos/{19*,20*,jokes} | tee $(date +"./Photos/hashdeep/%Y%m%d-hashdeep.hash.txt")
  • Audit hashes (-a audit mode, -k load known hashes from a file); $1 is the hash file name, passed in by a wrapper script (see the sketch after this list):

    hashdeep -rl -a -k ./Photos/hashdeep/$1 Photos/{19*,20*,jokes} | tee $(date +"./Photos/hashdeep/%Y%m%d-hashdeep.audit.txt")
  • With the hash file in $hashfile (export hashfile=hashfile.txt); the hash file layout these recipes rely on is described after this list:
    • get all duplicates

      cat $hashfile | sort | uniq -w100 -D 
    • get 0-byte files

      cat $hashfile | grep "^0,"
    • get small files (10-999 bytes)

      cat $hashfile | grep -E "^[0-9]{2,3},"
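
These recipes rely on the layout of the hashdeep file: after a short header, each line is size,md5,sha256,filename, so the leading grep patterns match on the size field, and uniq -w100 compares the size and hash fields while ignoring the filename (duplicates necessarily share a size, so the fields of candidate lines align). An illustrative sample (hashes and paths are made up):

    %%%% HASHDEEP-1.0
    %%%% size,md5,sha256,filename
    ## Invoked from: /home/user
    ## $ hashdeep -rl Photos/2019
    ##
    102400,0123456789abcdef0123456789abcdef,0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef,Photos/2019/20190101/img001.jpg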

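The $1 in the audit command above suggests it is run from a small wrapper script. A minimal sketch of such a pair of scripts (the script names and Usage line are assumptions, not from this page):

    #!/bin/bash
    # photo-hash.sh -- create a dated hash file for Photos/
    hashdeep -rl Photos/{19*,20*,jokes} \
      | tee "$(date +"./Photos/hashdeep/%Y%m%d-hashdeep.hash.txt")"

    #!/bin/bash
    # photo-audit.sh -- audit Photos/ against a saved hash file
    # Usage: ./photo-audit.sh 20200627-hashdeep.hash.txt
    hashdeep -rl -a -k "./Photos/hashdeep/$1" Photos/{19*,20*,jokes} \
      | tee "$(date +"./Photos/hashdeep/%Y%m%d-hashdeep.audit.txt")"

In audit mode hashdeep prints a summary such as "hashdeep: Audit passed" or "hashdeep: Audit failed"; adding -v gives per-file detail.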