NotNull wrote: Sat Feb 07, 2026 10:14 pm
> That must be the "highscore" so far (IIRC, the previous "record" was 65 million files/folders).
> Without extra properties and content indexed, Everything uses roughly 100 bytes per file/folder, which in your case translates to 18 GB RAM usage!!
> Just curious: How do regular searches perform?

It takes about 22GB private bytes, not counting the index journal's RAM usage. This includes the 4 extra properties.
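For context, that 18 GB figure is just the per-entry estimate multiplied out. A quick sketch (the 180 million count is rounded from my totals further down):

```python
# Rough back-of-the-envelope for NotNull's estimate: ~100 bytes of RAM
# per indexed file/folder, with no extra properties or content indexing.
entries = 180_000_000      # ~180 million files/folders (rounded)
bytes_per_entry = 100      # NotNull's rough per-entry figure
ram_gb = entries * bytes_per_entry / 1000**3
print(ram_gb)  # 18.0 (GB)
```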
Depending on the search, I may want to use max-threads:4 as Void suggested, because searches can eat up 80% CPU and cause slowdowns elsewhere, like my music player halting playback.
As processes go to sleep (not necessarily, or at all, because of the pagefile), they take some time to wake up; 22GB might take up to 30 seconds. I just ran a test to check, and it was maybe 1GB per second or less to go from 1.8GB PB (Private Bytes) to 22GB PB. That's very slow; usually it's more like 4GB per second, so maybe this time it was pulling from the pagefile. Most of the pagefile should be taken up by Tixati, which uses 44GB private bytes. I used to have an 80GB pagefile, meaning a bit less than half of it was used just for Tixati; now my pagefile is 128GB so I don't run out of RAM, as I've come close in the past with only 80GB. I also have 128GB of RAM sticks, so 256GB total these days. Maybe it was a combination of CPU usage and pulling from the pagefile; I'm certain it has been much faster at other times. This felt unusually slow, or I just don't usually think about it.
As a disclaimer, I'm no expert on how RAM and the pagefile work, so I could be completely wrong. I'm not trying to confidently state how it works; this is just what I observe.
IF it has already loaded into memory, and using max-threads:4, one recent search took 4 seconds. That was just finding all folders matching x. Adding other filters like size limits might make it faster: with a size limit of 10GB the same search took about 2.5 seconds. It's possible it's slower than usual because I'm indexing extra properties on a new instance, and desktop lag is higher since I haven't restarted foobar2000 for a day and a half (that affects video games, but Idk about the EBV interface).
Without max-threads:4 (which can mean the instance eating up most of the CPU, potentially halting the music player's audio and making the system feel like it's about to crash), it was less than a second. Everything else stayed the same, including that instance indexing properties.
Also, narrowing from the full list down to a size limit above 10GB took less than a second. Closing the window and then searching again took about 2.5s again. The active sort also seems to affect it; I think size is one of the fastest sorts, and it's the default on all windows I open.
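For reference, the kind of query I mean looks something like this, combining the thread cap with a folder-only match and a size filter (`foo` is just a placeholder name):

```
max-threads:4 folder: size:>10gb foo
```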
Pre-emptively sorting by path (ascending) made it take 10 seconds instead of 4, so all sorts of things affect the speed. No fast sorts are enabled; enabling them could help speed things up, but I've used it for at least a year and have never thought "this is too slow, enable fast sort", so I'll keep them disabled.
Loading all available files with * and max-threads:4 still pushes CPU usage up to 80%; I guess some stages that load data ignore the limit, and max-threads only applies to the actual searching. It took 1 min 48 s to read everything; RAM spiked to 29.4GB private bytes and dropped to 24.9GB once the file list became visible.
Current file count is 179,071,708. Unsurprising, since I deleted at least around 500K files from somewhere a few days ago, so it's no longer 180 million.
The total file size (without selecting all for a byte-level count) shows as 335TB in the status bar. That's the Windows-style display, meaning the bytes were divided by 1024 four times, so each step adds a factor of 1.024: the decimal figure is 335 × 1.024^4 ≈ 368TB, not 335 × 1.024 = 343. My guess of 360TB was in the ballpark, but a bit low.
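To double-check that conversion, a quick sketch, assuming the status bar (like Windows Explorer) uses binary units, so its "335TB" is really 335 TiB:

```python
# Convert a Windows-style "TB" reading (actually TiB, i.e. divided by
# 1024 four times) back into decimal terabytes.
def tib_to_tb(tib: float) -> float:
    return tib * (1024 ** 4) / (1000 ** 4)

print(round(tib_to_tb(335), 1))  # 368.3 decimal TB
```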
