[SOLVED] High CPU usage on empty search

Discussion related to "Everything" 1.5 Alpha.
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

[SOLVED] High CPU usage on empty search

Post by Herkules97 »

NotNull wrote: Sat Feb 07, 2026 10:14 pm
Herkules97 wrote: Sat Feb 07, 2026 4:08 pm 180 million
That must be the "highscore" so far (IIRC, the previous "record" was 65 million files/folders).
Without extra properties and content indexed, Everything uses roughly 100 bytes per file/folder, which in your case translates to 18 GB of RAM usage!!

Just curious: How do regular searches perform?
It takes about 22GB private bytes, not counting the index journal's RAM usage. This includes the 4 extra properties.
Depending on the search I may want to use max-threads:4 as void said, because searches can eat up 80% CPU and cause slowdowns elsewhere, like my music player halting playback.

As processes go to sleep (not necessarily, or at all, because of the pagefile), they do take some time to start up. 22GB might take up to 30 seconds. Idk why, but I did a test just now to check and it was maybe 1GB per second or less to go from 1.8GB PB (Private Bytes) to 22GB PB. That's very slow; usually it's more like 4GB per second. Maybe this time it was pulling from the pagefile. Most of the pagefile usage should be from the Tixati I have, as that takes 44GB private bytes... I used to use an 80GB pagefile, which meant a bit less than half of it was used just for that Tixati. Now my pagefile is 128GB just so I don't run out of RAM, as I've come close in the past with only 80GB. I also have 128GB in RAM sticks, so 256GB total these days. Maybe it was a combination of CPU usage and pulling from the pagefile. I am certain it has been much faster other times; this felt unusually slow, or maybe I just don't normally think about it.
As a disclaimer, I am no expert on how RAM and the pagefile work, so I could be completely wrong. I'm not trying to confidently state that I know how it works; it's just what I observe.

If it has loaded into memory and I'm using max-threads:4, one recent search took 4 seconds. This was just to find all folders of x. If I add other things like size limits it might be faster. With a size limit of 10GB the same search took 2.5 seconds or so. It's possible it's slower than usual because I am indexing extra properties on a new instance, and desktop lag is higher as I've not restarted foobar2000 for a day and a half (it affects video games, but Idk about the EBV interface).
Without max-threads:4 (which can mean the instance eating up most of the CPU and thus potentially halting the audio from the music player, making it feel like the system is about to crash), it was less than a second. Everything else stayed the same, including that instance indexing properties.
Also, narrowing from the full list to a size limit above 10GB was less than a second. Closing the window and then searching again took about 2.5s again. Which sort is active also affects it, I think. Size is, I think, one of the fastest and is the default on all windows I open with it.
Pre-emptively sorting by path-ascending caused it to take 10 seconds instead of 4. All sorts of things affect the speed. No fast sorts are enabled; maybe enabling them could help speed things up? But I have used it for at least a year and have not thought "This is too slow, enable fast sort", so I will keep them disabled.

Loading all of the available files with * and max-threads:4 still causes it to go up to 80% CPU usage; I guess some stages that require loading data ignore the rule, and max-threads only applies to the actual searching of the data. It took 1 min 48 s to read it all; RAM spiked to 29.4GB private bytes and went down to 24.9GB when done, when the file list became visible.
Current file count is 179,071,708. Unsurprising, as I deleted at least around 500K from somewhere some days ago, so it's no longer 180mil.
The total file size (not selecting all for a byte-level count) shows as 335TB in the status bar. That's Windows-type display, so Idk what it would be without the /1024. Maybe 360TB? I don't know the math to get from 335 back to the decimal figure. Maybe I once knew. 335x1.024? 343? That doesn't seem right. 360TB is my guess, but it might be more or less.
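For what it's worth, here is a quick back-of-the-envelope check of that conversion (a sketch only, assuming the status bar figure is in binary units, i.e. 335 TiB, as the Windows-type display implies):

Code: Select all

    size_tib = 335                          # status bar value, Windows-type (binary) units
    size_tb = size_tib * 1024**4 / 1000**4  # convert TiB to decimal TB
    print(round(size_tb, 1))                # ~368.3

So the decimal figure would be roughly 368TB, a little above the 360TB guess.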
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: Beta

Post by void »

Searching will be much faster if you sort by Name. (memory reads are more linear)
Searching might be a little faster if you enable match diacritics and match case, but it's probably not worth the loss in functionality.
Enabling Match Path will hurt search performance.
Loading all of the available files with * and max-threads:4 still causes it to go up to 80% CPU usage; I guess some stages that require loading data ignore the rule, and max-threads only applies to the actual searching of the data. It took 1 min 48 s to read it all; RAM spiked to 29.4GB private bytes and went down to 24.9GB when done, when the file list became visible.
This sounds more like Everything loading its database back into memory after it was paged to disk.
Searching for * shouldn't cause any CPU activity, it will match all files so no search is performed.
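As a rough illustration of that point (a conceptual model only, not Everything's actual code): with no term to evaluate, the result set is simply the whole index, so any CPU or RAM spike at that moment has to come from something else, such as paging the database back in or sorting the results.

Code: Select all

    # Conceptual sketch of the behaviour described above (an assumption, not Everything's code).
    def run_search(index, term):
        if term in ("", "*"):
            return index  # nothing to test per entry, so matching costs no CPU
        # a simplified per-entry match; this is the part that would cost CPU
        return [name for name in index if term.lower() in name.lower()]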
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: Beta

Post by Herkules97 »

void wrote: Sun Feb 08, 2026 9:51 am This sounds more like Everything loading its database back into memory after it was paged to disk.
Searching for * shouldn't cause any CPU activity, it will match all files so no search is performed.
I suppose that's possible, so the full memory of the db is 29.4GB OR 24.9GB? The only time it would then fully load would be when trying to find all files.
DB size was 12.7GB at the time.

Ignore most of the below; I don't think it's relevant.
Using a search that finds any files with a missing size on disk (and thus missing other extra properties):
  • !size-on-disk:0b !size-on-disk:=>1b !folder: sort:date-accessed-descending !"\System Volume Information" max-threads:4 !"C:\Program Files\WindowsApps\Deleted" !"C:\Windows\servicing\Packages" !"C:\Windows\System32\config\systemprofile\AppData" !"C:\Windows\System32\LogFiles\WMI\RtBackup"

It went up to 24.2GB. It then went down to 23.7GB when it was done and results were visible, but that may have been where it was at from the start.


It shouldn't have had much to load otherwise, I had already used my search that finds any files missing properties and that loads all 22GB.
I also don't know if the pagefile is really involved every time when it has gone to sleep, because I've had no pagefile at all in previous Windows installs and programs still go to sleep, with larger processes taking longer to awaken. This is visible in the difference between private bytes and working set in Process Explorer: PB typically remains what it would be when actually using the process, while working set is where I can see it go down to 1.8GB when not in use for a while.
I tried reading about it, but if I ever found the answer, I've since forgotten. My guess has been that memory sticks get hot and memory going to sleep helps cool them, but that might be entirely incorrect.
I am sure I did find an answer, but as always I forget. Probably from some old Microsoft Learn article I had to use archive.org to read.
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: Beta

Post by void »

I suppose that's possible, so the full memory of the db is 29.4GB OR 24.9GB?
The full size of the database in memory can be found under Tools -> Debug -> Statistics -> Total size.

The extra RAM usage is likely a copy of the index for the current results.

If the search is blank, there shouldn't be a copy of the index for the current results.

Are you indexing any file lists under Tools -> Options -> File Lists?
-If so, please try disabling Tools -> Options -> Advanced -> filelist_hide_reconstructed_folders
When filelist_hide_reconstructed_folders is enabled, Everything will do a search to hide reconstructed folders in file lists.

Are you omitting any results? Is Index -> Enable Result Omissions enabled?
-If so, this would also slightly hurt search performance with so many files.

-Could you please send your search OPs when the search is empty:
  • Set the Search to:
    a
  • From the Tools menu, under the Debug submenu, click Start Debug Logging...
  • Clear the search.
  • Wait for all results to show.
  • From the Tools menu, under the Debug submenu, click Stop Debug Logging...
    ---This will open your %TEMP%\Everything Debug Log.txt in notepad.
  • Please look for the following lines:

    Code: Select all

    2026-02-09 09:49:40.072: search '' filter '' sort 5 ascending 0
    2026-02-09 09:49:40.073: FOLDER TERM START 000000000030de20 M 000000000030de20 N 000000000030dcf0
    2026-02-09 09:49:40.073: FILE TERM START 000000000030de20 M 000000000030de20 N 000000000030dcf0
    2026-02-09 09:49:40.073: found 8122614 files with 0 threads in 0.000001 seconds
    2026-02-09 09:49:40.073: found 1970118 folders with 0 threads in 0.000001 seconds
    
  • What is shown for you?


There are ways to lock Everything into physical RAM, let me know if you are interested.
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: Beta

Post by Herkules97 »

void wrote: Sun Feb 08, 2026 11:29 pm The full size of the database in memory can be found under Tools -> Debug -> Statistics -> Total size.

The extra RAM usage is likely a copy of the index for the current results.
The total size is 20GB, index size 2GB so 22GB total.

I may have made confusing statements? I don't think I am looking for any solutions? I wouldn't know for what issue. That Monolith is a bit slow? Eh, that's whatever; as I said, I've never thought it was too slow. If I for some reason want to read the entire file list, I'll just wait the minute or two it takes.
All other tasks take a lot less than that, usually 30 seconds or less.

I only added that in to share some stats on how such a large index behaves; the question was about regular searches.
If I really wanted to speed things up, Idk if fast sort would help or make it slower, as I think you said at some point fast sort would probably be worse than not having it on, or maybe I am mixing up memories. Anyway, buying more RAM could be another option. But I don't want to spend the money for both the 128GB I already have and then the additional capacity per stick needed to get to 192GB. Maybe if I had a motherboard that supports 384GB or more it would be worth splurging. But AM5 motherboards so far seem to go up to only 256GB, if even that, as those may have been Threadripper motherboards, I don't remember. Mine officially supports 192GB, or maybe it was 196GB. Ignoring RAM prices nowadays, but... I just checked and I paid around 400 USD for 128GB. The website no longer sells the exact versions of those sticks, but what look like similar ones go for 2800 USD for 128GB. Teehee


I could do the things you wrote if you want to know anyway?
The 29.4GB comes from using max-threads:4 * to see the entire list of 179mil files/folders, then when the results are visible it goes down to 24.9GB from the starting 22.7GB or whatever it was.
The extra 2GB might be from it having rendered the list? It goes back to 22GB on a restart, but remains at 24.9GB if not.
There are ways to lock Everything into physical RAM, let me know if you are interested.
You could share anyway if you want?
I doubt I'd have a use for it, but maybe I could do it just for funsies.
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: Beta

Post by void »

I may have made confusing statements? I don't think I am looking for any solutions? I wouldn't know for what issue. That Monolith is a bit slow? Eh, that's whatever; as I said, I've never thought it was too slow. If I for some reason want to read the entire file list, I'll just wait the minute or two it takes.
All other tasks take a lot less than that, usually 30 seconds or less.
It shouldn't be slow for an empty search, so I see it as an issue.


You could share anyway if you want?
To lock Everything into physical RAM: (not recommended)
  • In Everything 1.5, from the Tools menu, click Options.
  • Click the Advanced tab on the left.
  • To the right of Show settings containing, search for:
    working
  • Select: min_working_set_size
  • Set the value to: 2x the normal Everything RAM usage. In your case:
    64424509440
  • Select: max_working_set_size
  • Set the value to: 2x the normal Everything RAM usage. In your case:
    64424509440
  • To the right of Show settings containing, search for:
    virtual
  • Select: virtual_lock
  • Set the value to:
    true
  • Click OK.
  • Exit Everything (File -> Exit)
  • Restart Everything.
This will lock up to 60GB of RAM for Everything.
The system will be unable to use this RAM for anything else.
Please make sure you have plenty of RAM available if you choose to enable virtual_lock.
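For reference, here is how that value works out, assuming the working set settings are given in bytes and "2x the normal Everything RAM usage" is rounded to 60 GiB (roughly twice the ~30GB peak seen above):

Code: Select all

    peak_usage_gib = 30                  # roughly the observed peak private bytes
    target_gib = 2 * peak_usage_gib      # "2x the normal Everything RAM usage"
    value_bytes = target_gib * 1024**3   # 60 GiB expressed in bytes
    print(value_bytes)                   # 64424509440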

virtual_lock
NotNull
Posts: 5948
Joined: Wed May 24, 2017 9:22 pm

Re: Beta

Post by NotNull »

@Herkules97:
Thank you for doing some performance tests!
Now I have some idea how Everything behaves with this many files. Appreciated!

Herkules97 wrote: Sun Feb 08, 2026 2:23 am [...]Tixati I have as that takes 44GB private bytes[...]
That is quite a bit of RAM for (what I assume is) a background task.
I expect Everything to be a lot more responsive when Tixati isn't running.

You didn't ask for it, but I think there are ways to let Tixati use less RAM. That might improve your overall system performance. Let me know if you're interested...
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: Beta

Post by Herkules97 »

void wrote: Mon Feb 09, 2026 4:49 am It shouldn't be slow for an empty search, so I see it as an issue.
By empty search you mean "max-threads:4 *" or just * ?
It's not meant to take extra RAM to load and then display all those files? So if I search * and it goes up beyond 22GB it's a bug?

Here is a recording, from start to finish, of Monolith sleeping and beginning a search for * (I skipped the max-threads:4 just to make it "pure"), if you want to see it in action
VOID edit: removed video link

I'll try out the things you said later and report back, I suppose... I want to save Monolith first, and for that I need more space, which will delay it a bit...
Last edited by void on Tue Feb 10, 2026 10:59 pm, edited 1 time in total.
Reason: VOID edit: removed video link
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: Beta

Post by Herkules97 »

NotNull wrote: Mon Feb 09, 2026 1:55 pm You didn't ask for it, but I think there are ways to let Tixati use less RAM. Might improve your overall system performance. Let me know when interested...
I don't know what you mean by improve system performance, but I'm curious what you mean by using less RAM.
I wouldn't know if it sounds like a good idea without knowing what the idea is ;)
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: Beta

Post by Herkules97 »

void wrote: Sun Feb 08, 2026 11:29 pm Are you indexing any file lists under Tools -> Options -> File Lists?
-If so, please try disabling Tools -> Options -> Advanced -> filelist_hide_reconstructed_folders
When filelist_hide_reconstructed_folders is enabled, Everything will do a search to hide reconstructed folders in file lists.
I am indexing one file list. I disabled filelist_hide_reconstructed_folders; it did nothing.
Are you omitting any results? Is Index -> Enable Result Omissions enabled?
-If so, this would also slightly hurt search performance with so many files.
I am not using this feature currently and never have.
-Could you please send your search OPs when the search is empty:
  • Set the Search to:
    a
  • From the Tools menu, under the Debug submenu, click Start Debug Logging...
  • Clear the search.
  • Wait for all results to show.
  • From the Tools menu, under the Debug submenu, click Stop Debug Logging...
    ---This will open your %TEMP%\Everything Debug Log.txt in notepad.
  • Please look for the following lines:

    Code: Select all

    2026-02-09 09:49:40.072: search '' filter '' sort 5 ascending 0
    2026-02-09 09:49:40.073: FOLDER TERM START 000000000030de20 M 000000000030de20 N 000000000030dcf0
    2026-02-09 09:49:40.073: FILE TERM START 000000000030de20 M 000000000030de20 N 000000000030dcf0
    2026-02-09 09:49:40.073: found 8122614 files with 0 threads in 0.000001 seconds
    2026-02-09 09:49:40.073: found 1970118 folders with 0 threads in 0.000001 seconds
    
  • What is shown for you?
I decided to index the device with missing data, so it took longer to get to this.
Still, a day later it had done all 4.4mil files, so that was nice.
Sorted by size, as you might see below. Idk what it means myself, but sort 2 might be size, if the numbers go by how fast each sort is. I will try with name if that's the fastest. This went up to 30.9GB, which is funny because I already did this before in the same session and it was 29.9GB.
Idk if it adds more RAM for each * search it does, but as usual restarting purges all that.
  • 2026-02-10 16:38:41.237: search '*' filter '' sort 2 ascending 0
    2026-02-10 16:38:41.238: FOLDER TERM START 0000000000bee6c0 M 0000000000bee6c0 N 0000000000bee470
    2026-02-10 16:38:41.238: FILE TERM START 0000000000bee6c0 M 0000000000bee6c0 N 0000000000bee470
    2026-02-10 16:38:41.241: found 130135976 files with 0 threads in 0.000001 seconds
    2026-02-10 16:38:41.241: found 48992540 folders with 0 threads in 0.000000 seconds
    2026-02-10 16:38:41.241: SET SORT 0
    2026-02-10 16:38:41.241: set sort 2 ascending 0 is valid 1 case 0
    2026-02-10 16:38:41.241: fill with name arrays
...
  • 2026-02-10 16:40:03.403: sort complete 00007ff6a8cbee54, valid 1
    2026-02-10 16:40:03.403: finished sort, time taken 82.163983 seconds
    2026-02-10 16:40:03.403: total size 375375852756968, calculated in 0.000000 seconds
    2026-02-10 16:40:03.404: updated selection in 0.000000 seconds
    2026-02-10 16:40:03.404: ready
Sorted by name ascending it was instant and took no extra RAM, though after the previous search the process is taking about 1.4GB more Private Bytes long-term until a restart, so I could restart it and do a pure search... But I take it it will be instant and require no more RAM; the current RAM after a restart is about 24.2GB.
  • 2026-02-10 16:44:58.260: search '*' filter '' sort 0 ascending 1
    2026-02-10 16:44:58.261: FOLDER TERM START 0000000000bee670 M 0000000000bee670 N 0000000000bee420
    2026-02-10 16:44:58.261: FILE TERM START 0000000000bee670 M 0000000000bee670 N 0000000000bee420
    2026-02-10 16:44:58.261: found 48992553 folders with 0 threads in 0.000000 seconds
    2026-02-10 16:44:58.261: found 130136102 files with 0 threads in 0.000000 seconds
    2026-02-10 16:44:58.261: SET SORT 0
    2026-02-10 16:44:58.261: set sort 0 ascending 1 is valid 1 case 0
    2026-02-10 16:44:58.261: already sorted
    2026-02-10 16:44:58.261: finished sort, time taken 0.000017 seconds
    2026-02-10 16:44:58.261: total size 375387191306949, calculated in 0.000000 seconds
    2026-02-10 16:44:58.261: update index C:
    2026-02-10 16:44:58.261: updated selection in 0.000000 seconds
    2026-02-10 16:44:58.261: USN CREATE Search History-Monolith 1.5.0.1390a.csv.tmp
    2026-02-10 16:44:58.261: USN DATA_EXTEND CREATE Search History-Monolith 1.5.0.1390a.csv.tmp
    2026-02-10 16:44:58.261: ready
I think when you were questioning why it's happening, you had the name sort in mind.
However, 99% of my searches are by size descending, as that's often more useful than name ascending/descending.

Would fast sort for size and maybe path and other basic ones help alleviate the issue of high CPU usage and private bytes increase on anything but name sorting?
I don't think I would do it, but I am curious if it would. Maybe one day I will decide to do it but that would be a major task considering it would have to handle around 170mil files/folders. About 7.7mil are on looks-to-be-dead HDDs so they may never be indexed fully.
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: High CPU usage on empty search

Post by void »

Moved from beta


By empty search you mean "max-threads:4 *" or just * ?
They are both the same search.
They shouldn't have any CPU usage.


I am indexing one file list, disabled reconstruct, did nothing
It would have helped a little.


2026-02-10 16:38:41.238: FOLDER TERM START 0000000000bee6c0 M 0000000000bee6c0 N 0000000000bee470
2026-02-10 16:38:41.238: FILE TERM START 0000000000bee6c0 M 0000000000bee6c0 N 0000000000bee470
2026-02-10 16:38:41.241: found 130135976 files with 0 threads in 0.000001 seconds
2026-02-10 16:38:41.241: found 48992540 folders with 0 threads in 0.000000 seconds
The search is instant.
There's no CPU usage.


2026-02-10 16:40:03.403: finished sort, time taken 82.163983 seconds
It's the sort by Size that takes forever.

I highly recommend enabling fast size sort.
(Tools -> Options -> Indexes -> Fast Size Sort)
The instant sort will outweigh the slight extra RAM usage (8MB per 1 million files).
There's no copy of the results either, so if you were already sorting all the results, you will not see any extra RAM usage.
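As a rough check of what that costs at this index size (a sketch using the 8MB per 1 million files figure above; the exact per-property overhead is an assumption about how it scales):

Code: Select all

    entries = 179_071_708                                  # current file/folder count
    mb_per_million = 8                                      # fast sort cost quoted above
    per_property_gb = entries / 1_000_000 * mb_per_million / 1000
    print(round(per_property_gb, 2))                        # ~1.43 GB per fast-sorted property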



Another note: Does your mobo support quad channel memory? If you are currently only using dual channel memory, using quad channel memory would double RAM performance for Everything. Can't imagine another 2 sticks of 128 GB of RAM being cheap...
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: High CPU usage on empty search

Post by Herkules97 »

void wrote: Tue Feb 10, 2026 11:05 pm It's the sort by Size that takes forever.

I highly recommend enabling fast size sort.
(Tools -> Options -> Indexes -> Fast Size Sort)
The instant sort will outweigh the slight extra RAM usage (8MB per 1 million files)
There's no copy of the results either, so if you sort all the results, you will not see any extra RAM usage.
So fast sort would alleviate the CPU usage given that it would be instant?
If so, does it work for all properties that have it enabled? Or are some going to be better than others, like the basics - size, path and such versus extras - size on disk, artist and such.
Another note: Does your mobo support quad channel memory? If you are only using dual channel memory, using quad channel memory would double RAM performance for Everything. Can't imagine another 2 sticks of 128 GB of RAM being cheap...
No; 128GB is 2800 USD (the exact sticks are no longer sold on the same website, so Idk what it would be now, but similar ones are 2800) when it used to be 400.
It's "consumer"-level too, so probably no quad channel. It might be supported by the mobo, but that would require me to buy new RAM sticks, and quad-channel ones at that.
I doubt it's cheaper than 2800 USD when it's faster and still only 128GB. At the base I'd have to spend 2800 USD to get the same RAM capacity I already have, and then whatever more to get higher-capacity sticks.
Maybe when it's at most 2x as expensive, I will get a platform that goes up to 384GB. I thought doubling from 64GB to 128GB was enough, but I found a way. 128GB to 384GB would surely serve any new process I might want to run. I know some processes I want to run, but they're CPU-bound and take very little RAM. Maybe instead of more RAM I'll stick with 128GB real, 128GB pagefile, and splurge on a much better CPU instead.
But that might be in 5+ years.

I haven't found that speed is an issue; this "empty search" is just a check. Usable searches include other things that pull down the time it takes to process. These searches are typically 15 seconds or shorter, maybe much shorter if the process is already awake.

One de-duplication search I did recently and re-used for testing took about 18 seconds from being asleep. I wasn't counting properly, so for all I know I was counting faster than once a second, but it would probably only come down to 15-16 seconds if I had stared at the clock. When awake, the same search took 7 seconds while staring at the clock. This is with max-threads:4, which seems to be around 23% CPU usage, whatever that means.

If someone really wants to see "real prices" (other than for the sticks I bought), you can look here. For reference, the page below was 2001 SEK at the time of buying, so 4002 SEK total for 128GB. I say 400 USD, but converted it's probably 390 or less.
The exact page I bought from is in Swedish, but is https://www.proshop.se/RAM/Corsair-Veng ... rt/3036147
Dual channel as expected with non-server type hardware.
The 2800 USD one is https://www.proshop.se/RAM/Corsair-Veng ... aa/3170567
The search I used to find similar stuff https://www.proshop.se/?s=+Corsair+Vengeance+DDR5-5200
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: High CPU usage on empty search

Post by void »

So fast sort would alleviate the CPU usage given that it would be instant?
Yes, when the search is empty or * and you sort by Size.
There will be no noticeable CPU usage.


If so, does it work for all properties that have it enabled?
Yes.
As an example, enable fast sort for length and sort by length; there will be no CPU usage when the search is empty or *


Or are some going to be better than others, like the basics - size, path and such versus extras - size on disk, artist and such.
Fast sorts are all the same.
Fast sort is an index of all files and folders.
Doesn't matter if the file or folder has the property indexed or not.
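A minimal conceptual sketch of what "an index of all files and folders" means here (a model of the idea only, not Everything's actual data structure): the order for that property is computed once and kept up to date, so showing all results in that order is just a read, with no per-search sort.

Code: Select all

    # Conceptual model: a fast sort keeps a presorted list of entry IDs for one property,
    # so sorting "all results" means returning that list instead of re-sorting millions of entries.
    entries = {1: 5_000, 2: 120, 3: 98_304}            # id -> size in bytes (toy data)
    fast_sort_size = sorted(entries, key=entries.get)  # built once, maintained on index updates

    def all_results_sorted_by_size():
        return fast_sort_size                          # no per-search sorting work

    print(all_results_sorted_by_size())                # [2, 1, 3]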
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: High CPU usage on empty search

Post by Herkules97 »

void wrote: Wed Feb 11, 2026 10:40 am Yes, when the search is empty or * and you sort by Size.
There will be no noticeable CPU usage.
I see. I would add [SOLVED] to the title, but since you split the topic, only you and maybe the greens (NotNull, therube) can do it.
I'd try, but I don't want to change anything if I'm not sure.

Also, while I can still edit without a new reply...
It took 45min 46s to add fast sorts for file size, date created, date modified, date accessed, size on disk and allocation size for 179mil files/folders.
The instance went from around 20.7GB to 28.6GB. I think the 22GB I've written recently included the index journal; without it, it was 20GB, which lines up with a 02-08 copy of the stats page.
This surely increases the time it takes to awaken, but it looks like sorting might be much faster now.
I think I used fast sorts some years ago and found that they didn't do much; I will use Monolith with fast sorts for maybe a month or more to determine whether it was worth it.

Adding fast sorts caused heavy CPU usage (because of things like sorting filenames; this is not exclusive to adding fast sorts and is what happens when rebuilding the index): 92% in total, and I guess it didn't want to go further. Monolith itself took up to 66%, and other processes brought it up to 92%, like my 2 Tixatis taking 6-8% each at all times.
max-threads:4 only applies to searches, so would it be safe to disable "CPU affinity" cores in Process Explorer if I want to limit all the processing Monolith does and make sure it eats less CPU, to avoid the feeling that moving around or doing other stuff could cause a system crash from overstimulation?
This might not be how it works. Dying Light The Beast caused a video scheduler error and a subsequent BSOD when I changed OBS from game capture to display capture while the desktop was lagging hard and the game was running. Idk if the lag caused it, or TAA. Battlefield 6 with TAA on also caused severe desktop lag; disabling it fixed it, IIRC. DLTB forces TAA, so I couldn't check without TAA. Maybe it has something to do with TAA in certain games and OBS. I've never tried to push it with, say, EBV doing other stuff while it is using 70+% CPU, as I don't want to invite system crashes.

At the least it could help against foobar2000 potentially halting music sometimes. It didn't do it this time, but oddly the CPU never went beyond 92%, so that may have had something to do with it. When Monolith can go up to 72% is when I've had interruptions, I think.
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: High CPU usage on empty search

Post by void »

It took 45min 46s to add fast sorts for file size, date created, date modified, date accessed, size on disk and allocation size for 179mil files/folders.
The instance went from around 20.7GB to 28.6GB
I wouldn't bother with date created or date accessed unless you really need the instant sort for these.
In your case, you are looking at about 2GB for each property with fast sort enabled.


max-threads:4 only applies to searches
Correct.
Please try setting Tools -> Options -> Advanced -> max_threads
Sorting will be limited to this number of threads.
I recommend setting Advanced -> max_threads over setting CPU affinity.
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: High CPU usage on empty search

Post by Herkules97 »

void wrote: Thu Feb 12, 2026 12:04 am Correct.
Please try setting Tools -> Options -> Advanced -> max_threads
Sorting will be limited to this number of threads.
I recommend setting Advanced -> max_threads over setting CPU affinity.
Eh screw it, I'll find out and make a separate post or just live with it if it sometimes happens with certain tasks.
If for example it still goes above 50% during rebuilding I can just take the hit. I've done so up to this point and nothing of note has happened.
I am not rebuilding index anytime soon, but this would apply to the whole process?
If every step could take only 26% CPU or so, like max-threads:4 does for some stages of searching, that would be sweet. It would also mean I no longer have to use max-threads:4 in searches.
I'll try it pre-emptively, but I can't confirm it for rebuilding, as I don't want to rebuild just for that.

It's why I asked about doing it via Process Explorer, as that should hopefully ensure the entire process is limited in all of its activities.
I just don't know if that route can lead to problems, if programs aren't designed to be limited externally or whatever it may be, and whether hitting a limit makes a program crap itself instead of just taking things slower.

Hmm, I didn't check the options before replying; maybe max threads can apply to all activities...
I will try putting all of them at 4.
content_max_threads=4
index_max_threads=4
max_threads=4
memcpy_max_threads=4
search_max_threads=4
I don't know what search would trigger over 50% CPU usage in a normal situation; I put them all to 4 and did a few searches, and the max was 32% CPU usage. I suppose, like with fast sort, I will find out over time if it works as I'd hope.
Maybe the rebuilding phase where you can't search is the only time none of the max-threads settings apply. I could try with a temporary copy of some instance... but eh.
Last edited by Herkules97 on Thu Feb 12, 2026 5:25 am, edited 1 time in total.
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: [SOLVED] High CPU usage on empty search

Post by void »

I am not rebuilding index anytime soon, but this would apply to the whole process?
max_threads applies to the whole process.
Changing max_threads does not trigger a rebuild.

You don't have to include
max-threads:4
in your search if you set Advanced -> max_threads.
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: [SOLVED] High CPU usage on empty search

Post by Herkules97 »

void wrote: Thu Feb 12, 2026 5:17 am
I am not rebuilding index anytime soon, but this would apply to the whole process?
max_threads applies to the whole process.
Changing max_threads does not trigger a rebuild.

You don't have to include
max-threads:4
in your search if you set Advanced -> max_threads.
Nice. Yeah, I was in the middle of editing my post to discard most of it, as one can now see. You were too fast :)
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: [SOLVED] High CPU usage on empty search

Post by void »

A small note: Everything 1.5.0.1405a adds an advanced sort_max_threads setting for more control over sorting performance.

Sorting performance was greatly increased in 1401.
Everything can easily max out all CPUs for a long time when it is sorting.
sort_max_threads can reduce the number of threads used to sort.
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: [SOLVED] High CPU usage on empty search

Post by Herkules97 »

void wrote: Thu Feb 26, 2026 8:28 am A small note: Everything 1.5.0.1405a adds an advanced sort_max_threads setting for more control over sorting performance.
I'm just curious... What time is it for you? Because the Everything.exe from the archive was dated 6 or so hours into the future when I extracted it.
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: [SOLVED] High CPU usage on empty search

Post by void »

I built it at about 3pm on 2026-02-26
(4 hours ago from this post)

I am in Australian Central Daylight Time.
So I'm guessing the time zone information is lost when extracting?
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: [SOLVED] High CPU usage on empty search

Post by Herkules97 »

void wrote: Thu Feb 26, 2026 8:32 am I built it at about 3pm on 2026-02-26
(4 hours ago from this post)

I am in Australian Central Daylight Time.
So im guessing the time zone information is lost when extracting?
Hah, 4 hours ago.
I extracted it 1h30min ago, which I guess means it got a time 8h30min into the future from when you made it.
I have no idea how archives work.
The timestamp on the archive itself (downloaded with IDM and its server-time setting enabled) is 06, which is about 3h30min ago, so that lines up.
It's only the .exe inside that has a very odd date modified.
It's not on extraction; it's in the archive itself. The extracted file just has the same time(s) as in the archive.
I checked 1404a and 1380a and those .exes are also around 9 hours into the future. Never noticed it before :D.


IIRC, when I've made archives during DST, the times have been stuck. So I think timezones aren't saved in archives.

Ah...I should've checked time pages, according to https://time.is/ACDT it is indeed meant to be 8h30min in the future.
Australian Central Daylight Time is 9 hours and 30 minutes ahead of the time in Sweden when Sweden is on standard time, and 8 hours and 30 minutes ahead of the time in Sweden when Sweden is on daylight saving time.
Though my math might be wrong, I think it's meant to be 9h30min into the future. DST apparently starts in a month.
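For what it's worth, the offset can be checked directly (a sketch assuming Adelaide for ACDT and that the zoneinfo database is available; on 2026-02-26 Sweden is still on standard time, so the gap is 9h30min):

Code: Select all

    from datetime import datetime
    from zoneinfo import ZoneInfo

    built = datetime(2026, 2, 26, 15, 0, tzinfo=ZoneInfo("Australia/Adelaide"))  # ~3pm ACDT, UTC+10:30
    print(built.astimezone(ZoneInfo("Europe/Stockholm")))  # 2026-02-26 05:30:00+01:00, i.e. 9h30min behind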
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: [SOLVED] High CPU usage on empty search

Post by void »

I should probably switch to 7zip which stores timezone info..

..added to my TODO list.
w64bit
Posts: 336
Joined: Wed Jan 09, 2013 9:06 am

Re: [SOLVED] High CPU usage on empty search

Post by w64bit »

void wrote: Thu Feb 26, 2026 8:28 am A small note: Everything 1.5.0.1405a adds an advanced sort_max_threads setting for more control over sorting performance.
In the INI file there is max_threads and no sort_max_threads.
It seems to be a renaming issue.
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: [SOLVED] High CPU usage on empty search

Post by Herkules97 »

w64bit wrote: Thu Feb 26, 2026 11:17 am
void wrote: Thu Feb 26, 2026 8:28 am A small note: Everything 1.5.0.1405a adds an advanced sort_max_threads setting for more control over sorting performance.
In INI file there is max_threads and no sort_max_threads.
It seems to be a renaming issue.
Are you sure?
Album version https://imgur.com/a/E4no4ij

Did you update to 1405a? It's not 1404a that has the new addition.
w64bit
Posts: 336
Joined: Wed Jan 09, 2013 9:06 am

Re: [SOLVED] High CPU usage on empty search

Post by w64bit »

I was comparing fixes (1404 with 1405) and I reported based on 1404.
Sorry.
w64bit
Posts: 336
Joined: Wed Jan 09, 2013 9:06 am

Re: [SOLVED] High CPU usage on empty search

Post by w64bit »

And because we now have max threads specific for:
- search
- sort
- content
- memcpy
- index
for what activity is "max_threads"?
Herkules97
Posts: 201
Joined: Tue Oct 08, 2019 6:42 am

Re: [SOLVED] High CPU usage on empty search

Post by Herkules97 »

w64bit wrote: Thu Feb 26, 2026 11:54 am And because we have now max threads specific for:
- search
- sort
- content
- memcpy
- index
for what activity is "max_threads"?
Idk if you expect an answer soon; he might have gone to sleep recently.

Maybe there are activities that remain un-separated, or it applies to all of them. Or it's like bones that a species no longer uses and he has to code it out.
Alternatively, it's a lazy setting? If you don't want to customise them all individually, you can just set that one?
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: [SOLVED] High CPU usage on empty search

Post by void »

for what activity is "max_threads"?
max_threads applies to all methods (search, sort, content, memcpy and index).
w64bit
Posts: 336
Joined: Wed Jan 09, 2013 9:06 am

Re: [SOLVED] High CPU usage on empty search

Post by w64bit »

max_threads applies to all methods (search, sort, content, memcpy and index).
When the values for them are 0?

If I set
max_threads=4
sort_max_threads=8
this means that sort_max_threads will use 4?
void
Developer
Posts: 19568
Joined: Fri Oct 16, 2009 11:31 pm

Re: [SOLVED] High CPU usage on empty search

Post by void »

When the values for them are 0?
0 = Use unlimited threads


If I set
max_threads=4
sort_max_threads=8
this means that sort_max_threads will use 4?
Correct.
The lowest value is used.
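A minimal sketch of that rule as described above (an assumption based on these answers: 0 means unlimited, and the lowest non-zero limit wins; the function is just for illustration):

Code: Select all

    import os

    def effective_threads(global_limit, specific_limit):
        # 0 = use unlimited threads; otherwise the lowest value is used
        limits = [v for v in (global_limit, specific_limit) if v > 0]
        return min(limits) if limits else (os.cpu_count() or 1)

    print(effective_threads(4, 8))  # max_threads=4, sort_max_threads=8 -> sorting uses 4 threads
    print(effective_threads(0, 0))  # both unlimited -> all available CPUs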