finding duplicates

Have a suggestion for "Everything"? Please post it here.
Post Reply
Posts: 5
Joined: Mon Aug 18, 2014 12:33 pm

finding duplicates

Post by klarah » Mon Aug 18, 2014 6:35 pm

maybe a function like this: ... =10&t=4408

based on md5 checksum would be nice...
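The core of such a checksum-based check is just hashing each file's contents; a minimal Python sketch (the function name is mine, not anything in Everything):

```python
import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Return the MD5 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()
```

Reading in chunks keeps memory flat even for very large files.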


Posts: 2851
Joined: Thu Sep 03, 2009 6:48 pm

Re: finding duplicates

Post by therube » Mon Aug 18, 2014 10:54 pm

You can always drag a set of files into a "hasher".
(Might not necessarily be "clean", but it's doable.)




Or if the hasher has, or can have, a context menu, that might be better yet.


(Maybe there's a 15 item limit that HashMyFiles can accept?)

Posts: 2851
Joined: Thu Sep 03, 2009 6:48 pm

Re: finding duplicates

Post by therube » Tue Aug 19, 2014 2:43 am


Note what he says:
Explorer Context Menu

HashMyFiles can also be used directly from Windows Explorer. In order to enable this feature, go to the Options menu, and choose the 'Enable Explorer Context Menu' option. After you enable this feature, you can right-click on any file or folder on Windows Explorer, and choose the 'HashMyFiles' item from the menu.
If you run the HashMyFiles option for a folder, it'll display the hashes for all files in the selected folder.
If you run the HashMyFiles option for a single file, it'll display only the hashes for that file.

Notice: Static menu items of Explorer do not support multiple file selection. If you want to get the hash of multiple files from Explorer window, use Copy & Explorer Paste, or drag the files into the HashMyFiles window.
In particular, Notice: Static menu items of Explorer do not support multiple file selection.

Some time ago (& today too) I reported to him that that is not exactly correct: it does work with (seemingly) < 16 items, though you may see one file duplicated, or not all of the selected files listed - so you do need to be aware of this.

He mentions copy/paste or dragging, but that is less convenient.

Posts: 12
Joined: Sun Apr 17, 2011 4:00 pm

Re: finding duplicates / MD5

Post by rgbigel » Fri Aug 22, 2014 1:17 pm

First, thanks for providing Version 686.

Regarding this discussion: there are lots of "hashers" around (Beyond Compare, FreeCommander,...)

But what is really needed is a way to get the hashes loaded into the Everything Database and make them visible+sortable as a column!

Based on identity of the MD5 hashes, a duplicate function that ignores names would be very easy to implement (in fact, you could just look at the items in subsequent rows after sorting)

Obviously, updating the hash (MD5) will always be slow (at least compared to reading the MFT). But updating these could be a background task that does not run too often. It would be perfect if it was triggered to run when a file is changed, though.

As basic functionality, Beyond Compare can do this MD5 hashing, but only on selected directories (in a flattened view, including the subdirectories). That's what I have to do for the moment...
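The sort-then-look-at-adjacent-rows idea above can be sketched in a few lines of Python (hypothetical helper names; this is not part of Everything):

```python
import hashlib

def md5_of(path):
    """MD5 hex digest of a file's contents, read in 1 MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def duplicate_groups(paths):
    """Sort (hash, path) rows; equal hashes land in adjacent rows."""
    rows = sorted((md5_of(p), p) for p in paths)
    groups, i = [], 0
    while i < len(rows):
        j = i
        # extend j while the next row carries the same hash
        while j + 1 < len(rows) and rows[j + 1][0] == rows[i][0]:
            j += 1
        if j > i:  # at least two files with the same hash
            groups.append([p for _, p in rows[i : j + 1]])
        i = j + 1
    return groups
```

This is exactly the "items in subsequent rows after sorting" trick: once rows are sorted by hash, every duplicate set is a contiguous run.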

Posts: 20
Joined: Wed May 15, 2013 12:45 am

Re: finding duplicates

Post by nothing » Thu Aug 28, 2014 3:04 am

might be what you need:
Doesn't Everything do that already?
dupe: searches for duplicated file\folder names. You can add a size: filter to limit the duplicates displayed by name and size:

dupe:.gif size:>1mb

will show only duplicated gif images (matched by name; the name must contain "gif") that are bigger than 1 MB.

dupe: wfn:1.gif size:>100kb

will display duplicated files (for Everything, duplicates means same name) whose whole name is 1.gif and that are bigger than 100 KB.

Posts: 1
Joined: Thu Apr 06, 2017 9:21 pm

Re: finding duplicates

Post by ljcorsa » Sat Feb 24, 2018 9:15 pm

Bumping an old thread.

I would love to see an MD5sum stored with each file, and displayable through the existing GUI, as a way of locating copied/renamed files, especially across drives and directories.

I could retire my messy Bash/Excel mashup!
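For what it's worth, that kind of mashup can usually be replaced by a short script; a sketch in Python (illustrative function name) that groups files under a root by size first — cheap, since size comes from the file system — and only hashes the candidate sizes:

```python
import hashlib
import os
from collections import defaultdict

def find_copies(root):
    """Map MD5 -> list of paths, for files under root that share a size."""
    by_size = defaultdict(list)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            by_size[os.path.getsize(path)].append(path)
    by_hash = defaultdict(list)
    for paths in by_size.values():
        if len(paths) < 2:  # a unique size can't have a duplicate
            continue
        for path in paths:
            with open(path, "rb") as f:
                by_hash[hashlib.md5(f.read()).hexdigest()].append(path)
    return {h: ps for h, ps in by_hash.items() if len(ps) > 1}
```

The size pre-filter means most files are never read at all, which matters when scanning whole drives.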

Posts: 235
Joined: Thu Oct 27, 2016 7:19 pm

Re: finding duplicates

Post by ovg » Sun Feb 25, 2018 4:26 am

Please, no! Only if optional. There are a bunch of cli/gui utilities for this task.

Posts: 2851
Joined: Thu Sep 03, 2009 6:48 pm

Re: finding duplicates

Post by therube » Tue Feb 27, 2018 1:08 pm

(Separate & untested, Fsum Frontend.)

Some duplicate file finders compute & store hashes.
Some file renamers will compute a hash & rename the file to include it.
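The rename-to-include-hash approach such tools take might look roughly like this Python sketch (the naming scheme is illustrative; real renamers vary):

```python
import hashlib
import os

def rename_with_hash(path):
    """Rename file.ext to file.<md5>.ext and return the new path."""
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    stem, ext = os.path.splitext(path)
    new_path = f"{stem}.{digest}{ext}"
    os.rename(path, new_path)
    return new_path
```

Once the hash is in the file name, Everything can search and sort on it like any other text.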

Posts: 2682
Joined: Wed May 24, 2017 9:22 pm

Re: finding duplicates

Post by NotNull » Sat Mar 03, 2018 1:05 pm

No need for that on a modern Windows system:

Code: Select all

C:\temp>certutil -hashfile c:\Tools\Everything\Everything.exe MD5

MD5 hash of c:\Tools\Everything\Everything.exe:
CertUtil: -hashfile command completed successfully.
Also available: MD2 MD4 MD5 SHA1 SHA256 SHA384 SHA512

Post Reply