What is the best duplicate file finder that preserves my source of truth?

Having been so meticulous about taking backups, I've perhaps not been as careful about where I stored them, so I now have loads of duplicate files in various places. I've tried various tools (fdupes, Czkawka, etc.), but none seems to do what I want. I need a tool I can point at one folder (and its subfolders) as the source of truth, have it look for anything else, anywhere else, that's a duplicate, and give me the option to move or delete it. Seems simple enough, but I have found nothing that does this. Does anyone know of anything?

  • speculatrix@alien.top · 1 year ago

    Write a simple script which iterates over the files and generates a hash list, with the hash in the first column.

    find . -type f -exec md5sum {} ; >> /tmp/foo

    Repeat for the backup files, writing their hashes to a second list (say, /tmp/bar).

    Then make a third file by concatenating the two, sort it, and run "uniq -d -w 32" so that only the 32-character hash is compared (the paths differ, so comparing whole lines would miss the duplicates). The output will tell you the duplicated files.
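
    A minimal sketch of that concatenate/sort/uniq step, assuming GNU coreutils; /tmp/foo holds the source-of-truth hashes and /tmp/bar is just a placeholder name for the backup hash list:

    sort /tmp/foo /tmp/bar > /tmp/all
    uniq -w 32 -D /tmp/all    # print every line whose 32-character hash occurs more than once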

    You can take the output of uniq and de-duplicate.
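
    To finish the job while keeping the source-of-truth copies untouched, here is a rough sketch; it assumes the two lists above, GNU xargs, and default md5sum output (32-character hash, two spaces, then the path starting at column 35):

    awk 'NR==FNR { canon[$1] = 1; next }
         ($1 in canon) { print substr($0, 35) }' /tmp/foo /tmp/bar > /tmp/dupes
    # review /tmp/dupes, then remove the redundant backup copies:
    xargs -d '\n' rm -v -- < /tmp/dupes

    Because only the second list is scanned for matches, nothing under the source-of-truth tree ever ends up in /tmp/dupes.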

    • jerwong@alien.top · 1 year ago

      I think you need a \ in front of the ; so the shell doesn't consume it before find sees it.

      i.e.: find . -type f -exec md5sum {} \; >> /tmp/foo
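
      If the escaping gets annoying, the + terminator should also work here; find then passes the filenames to md5sum in batches, which is faster as well:

      find . -type f -exec md5sum {} + >> /tmp/foo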