Experimenting with remote queries using the ETP protocol, I've noticed that the size reported for a FOLDER is extremely large.
In my case the c:\Windows\Panter\ folder contains 18446744073709551615 bytes, which is approx. 16 exbibytes!!
It seems the value I got, 18446744073709551615,
is (unsigned __int64)-1. I assume you used this value to initialise an unknown folder size
(to differentiate it from a folder size of 0 bytes), but then forgot to fill in the real folder size. Or something like that, no?
/* Number of seconds between the beginning of the Windows epoch
 * (Jan. 1, 1601) and the Unix epoch (Jan. 1, 1970).
 */
#define DELTA_EPOCH_IN_SEC 11644473600ULL

static time_t FILETIME_to_time_t (UINT64 ft)
{
  ft /= 10000000;            /* from 100 nano-sec periods to sec */
  ft -= DELTA_EPOCH_IN_SEC;  /* from Windows epoch to Unix epoch */
  return ((time_t)ft);
}
...
UINT64 ft;

if (sscanf(ctx->rx_ptr, "DATE_MODIFIED %I64u", &ft) == 1) {
  ctx->mtime = FILETIME_to_time_t (ft);
  printf ("mtime: %.24s\n", ctime(&ctx->mtime));
}
But the printed time I get is off by 2 hours compared to what your GUI is showing me.
Please advise.
Everything uses SystemTimeToTzSpecificLocalTime to adjust file times to the current local time (taking daylight saving time into account), which is why you might be seeing the time difference.