As I like to make backups and snapshots by simply copying everything within a folder to some new place, I often end up with a lot of duplicate files that take up precious space.
So I decided to create a small script that takes care of this problem. If you want, you can download it at https://gitlab.com/poellmann-daniel/php-duplicate-finder.
It works by using hash functions to determine which files are exactly identical and then prompts you to decide whether you want to delete the duplicates. It can also do this automatically, but be careful, as that could delete the copies you'd like to keep.
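The core idea is simple: hash every file's contents and group files that share a hash. Here is a rough Python sketch of that approach (the actual tool is written in PHP, and the function names below are my own, not from the script):

```python
import hashlib
from collections import defaultdict
from pathlib import Path


def file_hash(path: Path, algorithm: str = "sha256") -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.new(algorithm)
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def find_duplicates(root: Path) -> list[list[Path]]:
    """Group all files under root by content hash.

    Any group with two or more entries contains exact duplicates.
    """
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in root.rglob("*"):
        if path.is_file():
            groups[file_hash(path)].append(path)
    return [paths for paths in groups.values() if len(paths) > 1]
```

A common refinement is to group files by size first and only hash files whose sizes collide, which saves a lot of reading on large trees.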
At the moment it can decide which files should be deleted based on two criteria: the length of the filename and the last-modified date.
It also supports multiple hash functions (don't worry if you don't trust MD5, you can use SHA-512 :D) and can also just calculate how much space would be freed by removing the duplicates.
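To illustrate those two features, here is a hedged Python sketch of how one might pick a "keeper" per duplicate group and total up the reclaimable space (again, this is my own illustration of the idea, not code from the PHP tool):

```python
from pathlib import Path


def pick_keeper(paths: list[Path], by: str = "name_length") -> Path:
    """Pick the file to keep from a group of identical files.

    'name_length' keeps the shortest filename (originals tend to have
    shorter names than copies); 'mtime' keeps the oldest file.
    """
    if by == "name_length":
        return min(paths, key=lambda p: len(p.name))
    if by == "mtime":
        return min(paths, key=lambda p: p.stat().st_mtime)
    raise ValueError(f"unknown criterion: {by}")


def freeable_space(duplicate_groups: list[list[Path]]) -> int:
    """Bytes reclaimed if each group were reduced to a single copy.

    Files in a group are identical, so every file has the same size.
    """
    return sum(
        paths[0].stat().st_size * (len(paths) - 1)
        for paths in duplicate_groups
    )
```

The "shortest filename" heuristic works well for copies named like `photo (copy).jpg`, but it is still a heuristic, which is why the interactive prompt is the safer mode.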
For a full guide on this tool, see the README.md in the git repository https://gitlab.com/poellmann-daniel/php-duplicate-finder.
Tell me what you think about it!