Discussion:
What is a good file and directory management program?
g***@aol.com
2023-10-06 15:19:13 UTC
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.

https://gfretwell.com/ftp/garage.jpg
Paul
2023-10-06 15:37:10 UTC
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
That's a well-ordered garage. I see no reason to panic :-)

I like your ladder on the ceiling. My father and I had a huge
ladder we used to store that way (basement ceiling). It took the
two of us to carry it and get it into place. It hung from some
heavy wire hangers, so part of the fun was fitting one end of the ladder
into a U-shaped hanger, then getting the other end into its hanger.
I used to paint windows on the top of that ladder, about 2.5 storeys up.
It's a bastard of a job working up there, and trying to scrape and prep
a window you can barely reach. The air is thinner up there.

Seriously, if you have time for deduping a file tree, you have
too much time on your hands :-) Only "file system vandalism"
(purposeful spewing of files into a tree), needs a dedup/cleanup.
Casual use, the files will be just fine where they are. For
example, I regularly clean up the "cache2" area, on software where
I haven't changed the cache to be RAM based instead.

So a keyword for your Google search, might be "dedup".

https://www.auslogics.com/en/software/duplicate-file-finder/
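
(For the curious: most of these tools do the same basic thing - group
files by size first, then hash only the candidates, since files of
different sizes can't be identical. A rough Python sketch of the idea;
my own illustration, not how Auslogics actually works:)

import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    # Pass 1: bucket every file by size; only same-size files
    # can possibly be duplicates.
    by_size = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                by_size[os.path.getsize(path)].append(path)
            except OSError:
                pass  # locked/vanished file; skip it
    # Pass 2: hash only the buckets with more than one candidate.
    by_hash = defaultdict(list)
    for paths in by_size.values():
        if len(paths) < 2:
            continue
        for path in paths:
            h = hashlib.sha256()
            try:
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        h.update(chunk)
            except OSError:
                continue
            by_hash[h.hexdigest()].append(path)
    return [group for group in by_hash.values() if len(group) > 1]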

*******

This is an older program, written at a university. There are
Linux versions, kdirstat and qdirstat or so, which use the same
display method. This is how you identify "space hogs".

https://www.majorgeeks.com/files/details/sequoiaview.html

(Original site)

https://www.win.tue.nl/sequoiaview/

You can right-click a tile, and select "Explore" to have File Explorer
display the item. For example, a file like C:\hiberfil.sys can be
rather large, and that should give you a big square to practice on.
Naturally, you won't be deleting C:\hiberfil.sys :-)


That's one of the problems with deduping and cleaning up: the
danger of overdoing it and erasing an only copy.

Paul
J. P. Gilliver
2023-10-06 17:34:39 UTC
In message <ufp9j6$1jvls$***@dont-email.me> at Fri, 6 Oct 2023 11:37:10,
Paul <***@needed.invalid> writes
[]
Post by Paul
That's a well-ordered garage. I see no reason to panic :-)
[]
Post by Paul
I used to paint windows on the top of that ladder, about 2.5 storeys up.
I was wondering why a ladder would have windows. Then ...
[]
Post by Paul
Seriously, if you have time for deduping a file tree, you have
too much time on your hands :-) Only "file system vandalism"
(purposeful spewing of files into a tree), needs a dedup/cleanup.
Agreed. Though sometimes it seems MS is one of the worst for that.
Post by Paul
Casual use, the files will be just fine where they are. For
example, I regularly clean up the "cache2" area, on software where
I haven't changed the cache to be RAM based instead.
So a keyword for your Google search, might be "dedup".
https://www.auslogics.com/en/software/duplicate-file-finder/
My, that doesn't half "phone home" a lot! But looks good.
Post by Paul
*******
This is an older program, written at a university. There are
Linux versions, kdirstat and qdirstat or so, which use the same
display method. This is how you identify "space hogs".
https://www.majorgeeks.com/files/details/sequoiaview.html
(Original site)
https://www.win.tue.nl/sequoiaview/
I think WinDirStat is derived from it.
[]
Post by Paul
That's one of the problems with deduping and cleaning up: the
danger of overdoing it and erasing an only copy.
Especially when you don't know what they do.
Post by Paul
Paul
John
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

aibohphobia, n., The fear of palindromes.
J. P. Gilliver
2023-10-06 16:33:35 UTC
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
Nice (-:! Actually, you _might_ be letting the side down - near bottom
left, I can see something that _might_ still be a car.

I don't think you'll find anything that does _all_ of the above in one
application - in fact I don't know anything that'll _sort_ folders by
size, rather than just _show_ them. For the latter, very little beats
WinDirStat (https://windirstat.net/ - there's a sample screenshot) for
detail in a well-presented manner, but it _is_ a little slow to run
(took just under 2 minutes to do both partitions on here). For a faster
and also rather interesting display of folder sizes, try Steffen
Gerlach's "scanner" http://steffengerlach.de/freeware/ - it's a sort of
hierarchical pie-chart (can also be added to the right-click menu).
Quite old, but nothing wrong with that (may be one of the reasons it
runs reasonably fast).
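
(If it's just the sorting you're after, a short script can manage that
much. A rough Python sketch - it sums each top-level folder under a
root and lists them biggest first; the C:\ root is only an example,
and it's nothing like as polished as the tools above:)

import os

def folder_size(path):
    # Total bytes of everything underneath path.
    total = 0
    for dirpath, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # locked/system file; skip it
    return total

root = "C:\\"  # example only; point it wherever you like
sizes = {e.path: folder_size(e.path)
         for e in os.scandir(root) if e.is_dir()}
for path, size in sorted(sizes.items(), key=lambda kv: kv[1],
                         reverse=True):
    print(f"{size / 2**30:8.2f} GiB  {path}")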

For finding duplicate files, it depends where you've put them, what sort
they are, and so on. Although it's far from its primary purpose, I find
Everything from voidtools often quite useful in that respect, especially
if you're not sure where the duplicates may be - especially as it can
sort by size or creation date as well as the default of filename. For a
more serious tool, try David Taylor's - again old - FindDuplicates:
https://www.satsignal.eu/software/disk.html (he also has a usage
pie-chart there, but IMO its display is less intuitive than Steffen's).

For images specifically, ideally you want something that knows about
images and is able to compare assorted files: the best I've come
across - though unfortunately the company seems to have disappeared -
is Duplicate Image Finder from Runningman software, which can compare
images in different file formats and resolutions, and if I remember
rightly rotations and reflections - you set a percentage equality (can
be 100%), and it shows you them side-by-side, telling you where they
are (and other things about them such as size and format), and you
choose which (or neither) to delete. I think there are other such
utilities about. (If someone knows a freeware one that does that -
allow different formats and resolutions - please share! [W7 and older
only, of course.])
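
(The usual trick behind that sort of tool is a "perceptual hash":
shrink each image to a tiny greyscale thumbnail and compare the bit
patterns, so format and resolution stop mattering. A rough Python
sketch using the Pillow library - purely illustrative, certainly not
Runningman's actual method, and it doesn't handle the
rotations/reflections theirs did:)

from PIL import Image  # the Pillow imaging library

def average_hash(path, size=8):
    # Shrink to size x size greyscale; one bit per pixel,
    # set if the pixel is brighter than the mean.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def percent_match(h1, h2, nbits=64):
    # 100 means the thumbnails agree bit-for-bit.
    return 100 * (nbits - bin(h1 ^ h2).count("1")) / nbits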

[Probably something similar for audio and video files would be good too,
but harder to create. (Ditto document, e. g. Word, PDF, and so on.)]
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"This situation absolutely requires a really futile and stoopid gesture be done
on somebody's part." "We're just the guys to do it." Eric "Otter" Stratton (Tim
Matheson) and John "Bluto" Blutarsky (John Belushi) - N. L's Animal House
(1978)
Char Jackson
2023-10-07 00:42:40 UTC
Post by J. P. Gilliver
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
I don't think you'll find anything that does _all_ of the above in one
application - in fact I don't know anything that'll _sort_ folders by
size, rather than just _show_ them.
Treesize Free, from https://www.jam-software.com/treesize_free
Post by J. P. Gilliver
For the latter, very little beats WinDirStat (https://windirstat.net/
I honestly think that program is little more than eye candy. Other than that, I
found it to be completely useless.
Post by J. P. Gilliver
For finding duplicate files,
I use Duplicate Cleaner Free, from
https://www.digitalvolcano.co.uk/duplicatecleaner.html
J. P. Gilliver
2023-10-07 01:29:30 UTC
In message <***@4ax.com> at Fri, 6 Oct
2023 19:42:40, Char Jackson <***@none.invalid> writes
[]
Post by Char Jackson
Treesize Free, from https://www.jam-software.com/treesize_free
Post by J. P. Gilliver
For the latter, very little beats WinDirStat (https://windirstat.net/
I honestly think that program is little more than eye candy. Other than that, I
found it to be completely useless.
Despite your knocking WinDirStat (I include its predecessor that someone
mentioned - had a tribal name, I forget it), I endorse your
recommendation of Treesize. I did think of suggesting it. It has some
advantages over WinDirStat - certainly the detail is more visible. It
just (by its nature) can't show as many lines, certainly not a whole
disc. (Equally, WDS's display can't show as much _detail_.)
Post by Char Jackson
Post by J. P. Gilliver
For finding duplicate files,
I use Duplicate Cleaner Free, from
https://www.digitalvolcano.co.uk/duplicatecleaner.html
Haven't tried that one; the one someone mentioned - the Auslogics one -
I've now tried, and (apart from the amount of phoning home it does,
which AVG has, I hope, now let me sit on) am quite impressed with.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"In the _car_-park? What are you doing there?" "Parking cars, what else does
one do in a car-park?" (First series, fit the fifth.)
Char Jackson
2023-10-07 05:23:08 UTC
Post by J. P. Gilliver
[]
Post by Char Jackson
Treesize Free, from https://www.jam-software.com/treesize_free
Post by J. P. Gilliver
For the latter, very little beats WinDirStat (https://windirstat.net/
I honestly think that program is little more than eye candy. Other than that, I
found it to be completely useless.
Despite your knocking WinDirStat (I include its predecessor that someone
mentioned - had a tribal name, I forget it), I endorse your
recommendation of Treesize. I did think of suggesting it. It has some
advantages over WinDirStat - certainly the detail is more visible. It
just (by its nature) can't show as many lines, certainly not a whole
disc. (Equally, WDS's display can't show as much _detail_.)
I'm not sure what you mean. I use Treesize Free to display a whole disk more
often than not. It's integrated into my right-click context menu, so it can be
started in any folder, but it can also be started in a disk's root folder and
thus show that entire disk.
Post by J. P. Gilliver
Post by Char Jackson
Post by J. P. Gilliver
For finding duplicate files,
I use Duplicate Cleaner Free, from
https://www.digitalvolcano.co.uk/duplicatecleaner.html
Haven't tried that one; the one someone mentioned - the Auslogics one -
I've now tried, and (apart from the amount of phoning home it does,
which AVG has, I hope, now let me sit on) am quite impressed with.
Duplicate Cleaner (the paid version) claims to be able to find duplicate photos,
including different formats, different aspect ratios, cropped, rotated,
close-enough, etc. It also claims to do the same with videos and music files. I
didn't suggest it for those tasks because those are paid features and you wanted
freeware. I only use it to find duplicate files, especially duplicate files that
have different names, and that part is free.
g***@aol.com
2023-10-07 07:01:58 UTC
Post by J. P. Gilliver
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
Nice (-:! Actually, you _might_ be letting the side down - near bottom
left, I can see something that _might_ still be a car.
Golf Cart
I'm surprised nobody found the Harley.
That is an old picture. I have new junk now but (hurricane) Ian helped
me decide to get rid of a lot of it. (4 feet of salt water)
Post by J. P. Gilliver
I don't think you'll find anything that does _all_ of the above in one
application - in fact I don't know anything that'll _sort_ folders by
size, rather than just _show_ them. For the latter, very little beats
WinDirStat (https://windirstat.net/ - there's a sample screenshot) for
detail in a well-presented manner, but it _is_ a little slow to run
(took just under 2 minutes to do both partitions on here). For a faster
and also rather interesting display of folder sizes, try Steffen
Gerlach's "scanner" http://steffengerlach.de/freeware/ - it's a sort of
hierarchical pie-chart (can also be added to the right-click menu).
Quite old, but nothing wrong with that (may be one of the reasons it
runs reasonably fast).
For finding duplicate files, it depends where you've put them, what sort
they are, and so on. Although it's far from its primary purpose, I find
Everything from voidtools often quite useful in that respect, especially
if you're not sure where the duplicates may be - especially as it can
sort by size or creation date as well as the default of filename. For a
more serious FindDuplicates, David Taylor's - again old - one of that
name: https://www.satsignal.eu/software/disk.html (he also has there a
usage piechart, but IMO its display is less intuitive than Steffen's).
Thanks, I will look into those. My problem is I have a disk with close
to 500GB on it and I am not sure where the hogs are. There are a bunch
of folders.
Post by J. P. Gilliver
For images specifically, ideally, you want something that knows about
images, and is able to compare assorted files: the best I've come
across, but which unfortunately the company seems to have disappeared,
is Duplicate Image Finder from Runningman software, which can compare
images in different file formats, resolutions, and if I remember
rotations and reflections - you set a percentage equality (can be 100%),
and it shows you them side-by side telling you where they are (and other
things about them such as size and format) and you choose which (or
neither) to delete. I think there are other such utilities about. (If
someone knows a freeware one that does that - allow different formats
and resolutions - please share! [W7 and older only, of course.])
[Probably something similar for audio and video files would be good too,
but harder to create. (Ditto document, e. g. Word, PDF, and so on.)]
Pictures are probably where most of the duplicate files are but this
is my legacy drive that started collecting junk in the DOS/W3.1 days
and I keep copying it to a bigger drive in a new system (about 10-12
machines). I know I have copies of copies of stuff out there.
Thanks guys. This will keep me busy for a while.
J. P. Gilliver
2023-10-07 15:54:21 UTC
[]
Post by g***@aol.com
Post by J. P. Gilliver
WinDirStat (https://windirstat.net/ - there's a sample screenshot) for
[]
Post by g***@aol.com
Post by J. P. Gilliver
Gerlach's "scanner" http://steffengerlach.de/freeware/ - it's a sort of
[]
Post by g***@aol.com
Post by J. P. Gilliver
more serious FindDuplicates, David Taylor's - again old - one of that
name: https://www.satsignal.eu/software/disk.html (he also has there a
usage piechart, but IMO its display is less intuitive than Steffen's).
Thanks, I will look into those. My problem is I have a disk with close
to 500GB on it and I am not sure where the hogs are. There are a bunch
of folders.
[]
Oh, for finding where the hogs are, WinDirStat, TreeSize, or Scanner
will all do the job - I think of the three, _just_ for finding where the
hogs are, I'd use scanner (I find its interface lets me climb up and
down - or in and out, if you prefer - quite nicely). For finding whether
the hogs are hogs because of lots of duplicates, I used to say David
Taylor's one (and that's probably still worth a look), but I was quite
impressed with the Auslogics one someone suggested (took just under 2
minutes first time on here; I think it went quicker when there were
fewer duplicates, i. e. when I tried it again after deleting some). If
reducing hoggery is your main wish, David Taylor's one compares the
biggest files first (and I think can be paused or stopped at any point).

You say pictures are probably where most of the duplicates are; you of
course may be right (you know your own system), but assuming you've also
got a fair number of video files, the picture duplicates probably aren't
the major hog. (Though it's satisfying to delete them anyway.)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

If it jams - force it. If it breaks, it needed replacing anyway.
g***@aol.com
2023-10-07 19:49:18 UTC
Post by J. P. Gilliver
[]
Post by g***@aol.com
Post by J. P. Gilliver
WinDirStat (https://windirstat.net/ - there's a sample screenshot) for
[]
Post by g***@aol.com
Post by J. P. Gilliver
Gerlach's "scanner" http://steffengerlach.de/freeware/ - it's a sort of
[]
Post by g***@aol.com
Post by J. P. Gilliver
more serious FindDuplicates, David Taylor's - again old - one of that
name: https://www.satsignal.eu/software/disk.html (he also has there a
usage piechart, but IMO its display is less intuitive than Steffen's).
Thanks, I will look into those. My problem is I have a disk with close
to 500GB on it and I am not sure where the hogs are. There are a bunch
of folders.
[]
Oh, for finding where the hogs are, WinDirStat, TreeSize, or Scanner
will all do the job - I think of the three, _just_ for finding where the
hogs are, I'd use scanner (I find its interface lets me climb up and
down - or in and out, if you prefer - quite nicely). For finding whether
the hogs are hogs because of lots of duplicates, I used to say David
Taylor's one (and that's probably still worth a look), but I was quite
impressed with the Auslogics one someone suggested (took just under 2
minutes first time on here; I think it went quicker when there were
fewer duplicates, i. e. when I tried it again after deleting some). If
reducing hoggery is your main wish, David Taylor's one compares the
biggest files first (and I think can be paused or stopped at any point).
You say pictures are probably where most of the duplicates are; you of
course may be right (you know your own system), but assuming you've also
got a fair number of video files, the picture duplicates probably aren't
the major hog. (Though it's satisfying to delete them anyway.)
Scanner seems to do what I need in analyzing my drives. It turns out
pictures are the biggest hog. I am not sure I am going to address that
right now (30 gig). The problem is I have all the raw, right-off-the-camera
pictures with date file names, then edited copies scattered
around. I like keeping the raw pictures if they are usable since I may
want to go back and recover something I cropped out, not thinking I
cared. Sometimes adding context is a good thing.
Thanks again everyone. You saved me a lot of time weeding through the
chaff online.
Boris
2023-10-08 02:02:24 UTC
Post by J. P. Gilliver
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
Nice (-:! Actually, you _might_ be letting the side down - near bottom
left, I can see something that _might_ still be a car.
I don't think you'll find anything that does _all_ of the above in one
application - in fact I don't know anything that'll _sort_ folders by
size, rather than just _show_ them. For the latter, very little beats
WinDirStat (https://windirstat.net/ - there's a sample screenshot) for
detail in a well-presented manner, but it _is_ a little slow to run
(took just under 2 minutes to do both partitions on here). For a faster
and also rather interesting display of folder sizes, try Steffen
Gerlach's "scanner" http://steffengerlach.de/freeware/ - it's a sort of
hierarchical pie-chart (can also be added to the right-click menu).
Quite old, but nothing wrong with that (may be one of the reasons it
runs reasonably fast).
For finding duplicate files, it depends where you've put them, what sort
they are, and so on. Although it's far from its primary purpose, I find
Everything from voidtools often quite useful in that respect, especially
if you're not sure where the duplicates may be - especially as it can
sort by size or creation date as well as the default of filename. For a
more serious FindDuplicates, David Taylor's - again old - one of that
name: https://www.satsignal.eu/software/disk.html (he also has there a
usage piechart, but IMO its display is less intuitive than Steffen's).
For images specifically, ideally, you want something that knows about
images, and is able to compare assorted files: the best I've come
across, but which unfortunately the company seems to have disappeared,
is Duplicate Image Finder from Runningman software, which can compare
images in different file formats, resolutions, and if I remember
rotations and reflections - you set a percentage equality (can be 100%),
and it shows you them side-by side telling you where they are (and other
things about them such as size and format) and you choose which (or
neither) to delete. I think there are other such utilities about. (If
someone knows a freeware one that does that - allow different formats
and resolutions - please share! [W7 and older only, of course.])
I've used this one for years. No adware. Fast. Free. I've used it on
Windows 7, 10, and 11.
For picture files, I like that when you hover over the file name, you can
see a preview.

http://www.malich.org/duplicate_searcher
Post by J. P. Gilliver
[Probably something similar for audio and video files would be good too,
but harder to create. (Ditto document, e. g. Word, PDF, and so on.)]
J. P. Gilliver
2023-10-08 15:13:55 UTC
[]
Post by Boris
Post by J. P. Gilliver
For images specifically, ideally, you want something that knows about
images, and is able to compare assorted files: the best I've come
across, but which unfortunately the company seems to have disappeared,
is Duplicate Image Finder from Runningman software, which can compare
images in different file formats, resolutions, and if I remember
rotations and reflections - you set a percentage equality (can be 100%),
and it shows you them side-by side telling you where they are (and other
things about them such as size and format) and you choose which (or
neither) to delete. I think there are other such utilities about. (If
someone knows a freeware one that does that - allow different formats
and resolutions - please share! [W7 and older only, of course.])
I've used this one for years. No adware. Fast. Free. I've used it on
Windows 7, 10, and 11.
For picture files, I like that when you hover over the file name, you can
see a preview.
http://www.malich.org/duplicate_searcher
[]
Thanks for the tip; I like that one for several reasons:
1. It definitely does byte-by-byte comparison. (OK, that may make it
take longer, but it only took 2¾ minutes here - only a few seconds after
deleting some.)
2. It can make hard links - so in effect deleting a duplicate's data
while keeping both names (see the sketch below).
3. "Identify audio, video and JPG files that contain identical media
content but have different metadata information." This last one has
found lots for me, mostly in my genealogy area. I use the comment field
in JPG files (I, C in IrfanView; other image utilities that can access
the comment field are available), and I've obviously downloaded the same
image (e. g. a census page) but saved it with different comments. (Or
sometimes, the hardware or software that does the original scan adds a
line in the comment field, and if the image is got from different
sources who used different hardware/software, that line may be different.)

It _doesn't_ compare images with different
formats/resolutions/rotations, but as a DFF (duplicate file finder),
it looks very good.
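
(For the curious, the hard-link trick in 2 above is simple enough to
sketch: confirm the two files really are byte-identical, then replace
the duplicate with a second name for the same data, so the space is
freed but both paths still open. A rough Python illustration - my
guess at the general idea, not Duplicate Searcher's actual code; note
hard links only work within one volume:)

import filecmp
import os

def link_duplicate(keep, dup):
    # Full byte-by-byte comparison first; refuse to touch
    # files that merely look alike.
    if not filecmp.cmp(keep, dup, shallow=False):
        raise ValueError("files differ; refusing to link")
    tmp = dup + ".linktmp"
    os.link(keep, tmp)    # second name for keep's data
    os.replace(tmp, dup)  # swap it in over the duplicate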

For those interested/wary: it needs .net, at least 4.7.1. If you run it
and you haven't got it, it offers to get it, and if you say yes, it takes you
to a (Microsoft) page that (I think) offers the latest version of 4 -
4.8.1 I think it was - and lets you download its installer; when you try
to run that, it tells you (or did me) that that's not compatible with
your system. Fortunately the Microsoft page has "older versions" at the
bottom, including 4.7.2. The DFF page also has links to .net versions 6
and 7 (and says it runs faster with 7).
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

I'm too lazy to have a bigger ego. - James May, RT 2016/1/23-29
Ken Blake
2023-10-07 14:51:07 UTC
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
My favorite is Directory Opus. It's not free, but I think it comes
with a free trial.
mick
2023-10-07 19:03:27 UTC
Post by Ken Blake
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
My favorite is Directory Opus. It's not free, but I think it comes
with a free trial.
+1
The first program I always install on every computer I have owned.
Just using it day in and day out you subconsciously keep everything on
all drives organised without a second thought. Unlike File Explorer,
which if I do have to use on someone else's computer I really struggle
with because it is usually never set up correctly, e.g., file extensions
turned off. How on earth do people manage files with the same name but
a different extension? As one person once said, "I keep clicking on
them until it looks like the right one".
--
mick
s|b
2023-10-08 09:08:20 UTC
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
FreeCommander can do that (and more).

<https://freecommander.com/en/summary/>
--
s|b
j***@astraweb.com
2023-12-30 21:52:54 UTC
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
If you seriously cozy up with ZTREE you will find that it does things you never thought of and is
capable of doing most all those things in a batch mode. Being an XTREE user from 30 years ago I have
even cloned a hard drive with XTREE a long time ago (~25 years) and supposedly ZTREE could do the same
thing. When I got Win 7 over a decade ago, one of the first applications I got was ZTREE for 64 bit.
However, I don't know of a way to use it to detect duplicates -- or never looked....

(Whether or not they have fixed this I don't know, but its major drawback is not using large data
packets on COPY. That probably would not come into play with you cleaning up a drive.)
J. P. Gilliver
2023-12-30 23:07:58 UTC
(That was a while ago!)
Post by j***@astraweb.com
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
If you seriously cozy up with ZTREE you will find that it does things
[]
Post by j***@astraweb.com
However, I don't know of a way to use it to detect duplicates -- or never looked....
I'm a fan of Xtree (which still works under 7 32 bit! [Though it has a
tendency to rail a core, even when apparently not doing anything]).
Though I don't know if it can find duplicates.

Two duplicate finders I like are:
David Taylor's FindDup: https://www.satsignal.eu/software/disk.html
the AusLogics one:
https://www.auslogics.com/en/software/duplicate-file-finder/

For images only, ideally you want one that compares images of different
sizes, formats, and orientations, and lets you set a match threshold; I
liked Duplicate Image Finder (unfortunate name as it's impossible to
search for!) from Runningman Software, but unfortunately they've
disappeared. I don't know if there are similar ones for sound and video
files.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

The early worm gets the bird.
Daniel65
2023-12-31 12:10:51 UTC
Post by J. P. Gilliver
(That was a while ago!)
Post by j***@astraweb.com
Post by g***@aol.com
I am looking for something better than windows explorer for analyzing
directories/folders. Something that will sort folders by size, find
duplicate files and help me clean up a PC that looks like my garage.
https://gfretwell.com/ftp/garage.jpg
If you seriously cozy up with ZTREE you will find that it does things
[]
Post by j***@astraweb.com
However, I don't know of a way to use it to detect duplicates -- or never looked....
I'm a fan of Xtree (which still works under 7 32 bit! [Though it has a
tendency to rail a core, even when apparently not doing anything]).
Though I don't know if it can find duplicates.
I loved XTree and XTreeGold .... Was that mid-late-80's?? Loved it!
--
Daniel
J. P. Gilliver
2023-12-31 12:26:54 UTC
[]
Post by Daniel65
Post by J. P. Gilliver
I'm a fan of Xtree (which still works under 7 32 bit! [Though it has a
tendency to rail a core, even when apparently not doing anything]).
Though I don't know if it can find duplicates.
I loved XTree and XTreeGold .... Was that mid-late-80's?? Loved it!
(It's Gold I have.) Sounds about right. It was such a good precursor -
in DOS days - that, whenever you saw a photo of a scene where there was
a PC screen incidentally in the picture, it was very often the blue of
Xtree rather than the black of a DOS screen!
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

... the pleasure of the mind is an amazing thing. My life has been driven by
the satisfaction of curiosity. - Jeremy Paxman (being interviewed by Anne
Widdecombe), Radio Times, 2-8 July 2011.
g***@aol.com
2023-12-31 15:28:23 UTC
On Sun, 31 Dec 2023 12:26:54 +0000, "J. P. Gilliver"
Post by J. P. Gilliver
[]
Post by Daniel65
Post by J. P. Gilliver
I'm a fan of Xtree (which still works under 7 32 bit! [Though it has a
tendency to rail a core, even when apparently not doing anything]).
Though I don't know if it can find duplicates.
I loved XTree and XTreeGold .... Was that mid-late-80's?? Loved it!
(It's Gold I have.) Sounds about right. It was such a good precursor -
in DOS days - that, whenever you saw a photo of a scene where there was
a PC screen incidentally in the picture, it was very often the blue of
Xtree rather than the black of a DOS screen!
I will check out Ztree.
BTW my DOS screens were always a different color. I leaned toward
yellow text on blue.
J. P. Gilliver
2023-12-31 19:21:52 UTC
Post by g***@aol.com
On Sun, 31 Dec 2023 12:26:54 +0000, "J. P. Gilliver"
[]
Post by g***@aol.com
Post by J. P. Gilliver
(It's Gold I have.) Sounds about right. It was such a good precursor -
in DOS days - that, whenever you saw a photo of a scene where there was
a PC screen incidentally in the picture, it was very often the blue of
Xtree rather than the black of a DOS screen!
I will check out Ztree.
BTW my DOS screens were always a different color. I leaned toward
yellow text on blue.
Xtree had yellow or white (depending on what you were doing) on blue.
(Though I think you could choose all the colours for yourself if you
wanted.) IIRR the DOS version of WordPerfect (which was in most people's
opinion better than the pre-Windows Microsoft equivalent) also used
white on blue. (I can't remember what the pre-Windows Microsoft word
processor was even called, or even if they had one.)

I remember someone posting a hack - well, telling you how - to change
the colours of the BSOD; not that it was something one wanted to see
anyway, but at least you could choose the colours! I think that was back
in the '95/8/Me days.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

Can a blue man sing the whites?
Daniel65
2024-01-01 09:41:49 UTC
Post by J. P. Gilliver
Post by g***@aol.com
On Sun, 31 Dec 2023 12:26:54 +0000, "J. P. Gilliver"
[]
Post by g***@aol.com
Post by J. P. Gilliver
(It's Gold I have.) Sounds about right. It was such a good
precursor - in DOS days - that, whenever you saw a photo of a
scene where there was a PC screen incidentally in the picture, it
was very often the blue of Xtree rather than the black of a DOS
screen!
I will check out Ztree. BTW my DOS screens were always a different
color. I leaned toward yellow text on blue.
Xtree had yellow or white (depending on what you were doing) on blue.
(Though I think you could choose all the colours for yourself if you
wanted.) IIRR the DOS version of WordPerfect (which was in most
people's opinion better than the pre-Windows Microsoft equivalent)
also used white on blue. (I can't remember what the pre-Windows
Microsoft word processor was even called, or even if they had one.)
Wasn't MS Word stand-alone i.e. could be run on any DOS directly?? Or
was that issued on/with MS-DOS?
--
Daniel
J. P. Gilliver
2024-01-01 12:16:15 UTC
[]
Post by Daniel65
Post by J. P. Gilliver
Xtree had yellow or white (depending on what you were doing) on blue.
(Though I think you could choose all the colours for yourself if you
wanted.) IIRR the DOS version of WordPerfect (which was in most
people's opinion better than the pre-Windows Microsoft equivalent)
also used white on blue. (I can't remember what the pre-Windows
Microsoft word processor was even called, or even if they had one.)
Wasn't MS Word stand-alone i.e. could be run on any DOS directly?? Or
was that issued on/with MS-DOS?
I'm pretty sure there was no word processor included with DOS (unless you
count EDIT, which was basically a text editor, I think partly intended
for editing batch files, CONFIG.SYS, and the like, rather than actual
text). DOS, after all, came on a handful of floppies! (3, for one
version.)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"If even one person" arguments allow the perfect to become the enemy of the
good, and thus they tend to cause more harm than good.
- Jimmy Akins quoted by Scott Adams, 2015-5-5
g***@aol.com
2024-01-01 16:48:42 UTC
Post by J. P. Gilliver
[]
Post by Daniel65
Post by J. P. Gilliver
Xtree had yellow or white (depending on what you were doing) on blue.
(Though I think you could choose all the colours for yourself if you
wanted.) IIRR the DOS version of WordPerfect (which was in most
people's opinion better than the pre-Windows Microsoft equivalent)
also used white on blue. (I can't remember what the pre-Windows
Microsoft word processor was even called, or even if they had one.)
Wasn't MS Word stand-alone i.e. could be run on any DOS directly?? Or
was that issued on/with MS-DOS?
I'm pretty sure there was no word processor included with DOS (unless you
count EDIT, which was basically a text editor, I think partly intended
for editing batch files, CONFIG.SYS, and the like, rather than actual
text). DOS, after all, came on a handful of floppies! (3, for one
version.)
Edit didn't show up until DOS 5.0; 4.0 was a miserable release, and
5.0 fixed most of its bugs.
Before that the only editor inside DOS was EDLIN, and that was
miserable to use if you were actually trying to write something.
OTOH DOS 1.0, shipped with the original 5150 PC, had an add-on called
"EasyWriter" that was fairly user friendly. I may still have a copy
here somewhere. It wasn't really part of DOS although you could add it
to your DOS diskette. (no hard drives in those days)
Daniel65
2024-01-02 10:06:02 UTC
***@aol.com wrote on 2/1/24 3:48 am:

<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.

*HUGE* !!
--
Daniel
Frank Slootweg
2024-01-02 15:30:28 UTC
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all, or
at least two of them.
Post by Daniel65
*HUGE* !!
Mine was even *HUGER*, 40MB. Way too much, so I divided it into two
partitions! [1]

OTOH, I started with hard drives (drum drives) of 340KB (yes, *K*B),
costing $23,500 (in 1969 dollars). Try to beat *that*! :-)

2757A etc.
<http://www.hpmuseum.net/display_item.php?hw=548>

[1] Not quite the reason. Real reason: 32MB maximum partition size and I
didn't want to use software to overcome that limit.
Daniel65
2024-01-03 09:15:16 UTC
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever
DOS installed on a 10MB Hard drive ..... on which I then ran
"Double Space" (was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all,
or at least two of them.
Post by Daniel65
*HUGE* !!
Mine was even *HUGER*, 40MB. Way too much, so I divided it into two
partitions! [1]
OTOH, I started with hard drives (drum drives) of 340KB (yes, *K*B),
costing $23,500 (in 1969 dollars). Try to beat *that*! :-)
2757A etc. <http://www.hpmuseum.net/display_item.php?hw=548>
[1] Not quite the reason. Real reason: 32MB maximum partition size
and I didn't want to use software to overcome that limit.
"costing $23,500 (in 1969 dollars)"!!!! WOW!! Frank, can you lend me a
buck or two?? ;-P
--
Daniel
Paul
2024-01-03 14:35:08 UTC
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever
DOS installed on a 10MB Hard drive ..... on which I then ran
"Double Space" (was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all,
or at least two of them.
Post by Daniel65
*HUGE* !!
Mine was even *HUGER*, 40MB. Way too much, so I divided it into two partitions! [1]
OTOH, I started with hard drives (drum drives) of 340KB (yes, *K*B), costing $23,500 (in 1969 dollars). Try to beat *that*! :-)
2757A etc. <http://www.hpmuseum.net/display_item.php?hw=548>
[1] Not quite the reason. Real reason: 32MB maximum partition size
and I didn't want to use software to overcome that limit.
"costing $23,500 (in 1969 dollars)"!!!! WOW!! Frank, can you lend me a buck or two?? ;-P
That's probably at work.

We had a mainframe hard drive, the size of a Maytag washer, connected to
our 128KB PC design. That had 30x the storage of an ST412, which came
along three years later. The interface connection used differential ribbon cables
without outer shields. If I believe Wiki, the storage rates back then
were on the order of 1.2MB/sec. We had no HDTune, so it wasn't like I could
sit around all day measuring it :-) Anything faster than a floppy diskette
was considered "high tech" back then. You did not complain about this stuff.
Progress was progress.

The "hard drive" used 230V power and had a 2 horsepower motor for the spindle.
It used 10-platter disc packs in cake boxes. Load a pack, unscrew the lid, close
the washing machine cover. Push the "Purge" button, and about ten minutes later,
the drive was ready, then you'd push a button to bring it online. I don't
think we booted off that. Might have booted off a... floppy. The most
common component you could count on was our 8" floppy drive(s).

Some of our peripherals were relatively expensive. We also had a mainframe
tape drive connected.

*******

We worked on 14", 8", and 5.25" FH storage. The 14" storage was the
first to deliver results, and having a large disk provided a place
for the software team to stage their development tree. I tested a few
8" hard drives in the lab, but they weren't total winners, and
at some point, I guess someone saw the advert for the ST412 and
that doomed the 8" lab project. I think maybe three 8" drives escaped
into release machines. And then when the ST412s showed up, they
were "finally personal computers". My desk drawer had 110 of the
8" floppy diskettes stored in it, and the ST412 came at just the right time.

We didn't have limits where I worked. Oh, I kid :-) Our 300MB storage
was one partition. You called into it with an LBA, the controller
(our design) converted to CHS-type commands for the drive. The signal
came off the drive as a serial stream, and the controller did SIPO
(serial in, parallel out). We had our own file systems.

You needed a hex editor and a sense of humor to work there.

Like, one of the jobs I was given, I was placed in a foreign lab, with a mix
of big-endian and little-endian computers, and my job was to
put "byte-swaps" in the right places, so they could all read the
network packets properly. Must have taken me a week to get that
shitty mess running. Hex editor for the win. And a sense of humor.
For some reason, one of the machines needed *two* byte swaps. But
you don't argue with computers. If it says it wants it that way, you
just do it, and move on. I don't really know how that works today,
when, say, you connect some Apple computer to a PC. It's a good thing
someone figured that out.
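
(For anyone who never had the pleasure: a "byte swap" just reverses
the byte order of a word before you interpret it, which is all that
separates a big-endian machine's reading from a little-endian one's.
A toy Python illustration - nothing to do with our actual controller
code, obviously:)

def swap16(word):
    # 0x1234 -> 0x3412: exchange the two bytes of a 16-bit word.
    return ((word & 0xFF) << 8) | (word >> 8)

raw = b"\x12\x34"                       # same two bytes on the wire
big = int.from_bytes(raw, "big")        # 0x1234 to a big-endian box
little = int.from_bytes(raw, "little")  # 0x3412 to a little-endian box
assert swap16(big) == little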

Paul
Frank Slootweg
2024-01-03 16:46:56 UTC
Post by Paul
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever
DOS installed on a 10MB Hard drive ..... on which I then ran
"Double Space" (was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all,
or at least two of them.
Post by Daniel65
*HUGE* !!
Mine was even *HUGER*, 40MB. Way too much, so I divided it into two partitions! [1]
OTOH, I started with hard drives (drum drives) of 340KB (yes, *K*B), costing $23,500 (in 1969 dollars). Try to beat *that*! :-)
2757A etc. <http://www.hpmuseum.net/display_item.php?hw=548>
[1] Not quite the reason. Real reason: 32MB maximum partition size
and I didn't want to use software to overcome that limit.
"costing $23,500 (in 1969 dollars)"!!!! WOW!! Frank, can you lend me a buck or two?? ;-P
That's probably at work.
Yes, we sold/supported the things (and the systems).
Post by Paul
We had a mainframe hard drive, the size of a Maytag washer, connected to
our 128KB PC design. That had 30x the storage of an ST412, which came
along three years later. The interface connection used diff ribbon cables
without outer shields. If I believe Wiki, the storage rates back then
were on the order of 1.2MB/sec. We had no HDTune, so it wasn't like I could
sit around all day measuring it :-) Anything faster than a floppy diskette
was considered "high tech" back then. You did not complain about this stuff.
Progress was progress.
The "hard drive" used 230V power and had a 2 horsepower motor for the
spindle. It used 10-platter disc packs in cake boxes. Load a pack,
unscrew the lid, close the washing machine cover. Push the "Purge"
button, and about ten minutes later, the drive was ready, then you'd
push a button to bring it online.
Probably something similar to 'our':

'HP 7920/7925 Disc Drives'
<http://www.hpmuseum.net/display_item.php?hw=272>

and

'HP 13394A/13356A Disc Packs for 7920/7925'
<http://www.hpmuseum.net/display_item.php?hw=417>

[...]
Daniel65
2024-01-03 23:58:05 UTC
Post by Frank Slootweg
Post by Paul
Post by Daniel65
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had
whatever DOS installed on a 10MB Hard drive ..... on which I
then ran "Double Space" (was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them
all, or at least two of them.
Post by Daniel65
*HUGE* !!
Mine was even *HUGER*, 40MB. Way too much, so I divided it into two partitions! [1]
OTOH, I started with hard drives (drum drives) of 340KB (yes,
*K*B), costing $23,500 (in 1969 dollars). Try to beat *that*!
:-)
2757A etc. <http://www.hpmuseum.net/display_item.php?hw=548>
[1] Not quite the reason. Real reason: 32MB maximum partition
size and I didn't want to use software to overcome that limit.
"costing $23,500 (in 1969 dollars)"!!!! WOW!! Frank, can you lend
me a buck or two?? ;-P
That's probably at work.
Yes, we sold/supported the things (and the systems).
Ah!! So no loan, then! Bummer ;-(
--
Daniel
Spalls Hurgenson
2024-01-03 13:21:01 UTC
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all, or
at least two of them.
DoubleSpace was also an option. It's what DriveSpace was initially
known as when it released with MS-DOS 6.0. After Microsoft lost a
court case showing that they had infringed upon Stacker's patent,
DoubleSpace was first removed, and then later replaced with
DriveSpace.

There were a number of alternatives during that era, as the increase in
software sizes briefly exceeded the speed at which hard-drives were
growing in size. I personally recall DiskDoubler (although that was
for Mac) and DoubleDisk (no relation), but it seemed like there were
dozens being advertised in the magazines of the time. Given the
(legitimate) concerns about data corruption - even amongst the leading
brands - most of them slipped under the waves almost as quickly.

I've memories of using Stacker. At the time, it didn't have a way to
compress your files in place, so I first had to copy all my files OFF
the hard-drive to make enough room for its virtual disk file. But I
didn't trust the software enough to give it my whole drive, so I split
the difference and left half the drive uncompressed. The uncompressed
drive got my 'important' files, while I used the compressed drive for
games and other nonsense.

The performance hit wasn't severe, but noticeable on my PC. Eventually
I replaced the kludge with a second hard-drive and felt much more
secure about everything.
Frank Slootweg
2024-01-03 18:16:15 UTC
Post by Spalls Hurgenson
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all, or
at least two of them.
DoubleSpace was also an option. It's what DriveSpace was initially
known as when it released with MS-DOS 6.0. After Microsoft lost a
court case showing that they had infringed upon Stacker's patent,
DoubleSpace was first removed, and then later replaced with
DriveSpace.
Thanks for the update/correction! Now you mention it, I remember the
controversy, the Stac court case, etc.. The events are clearly explained
in Wikipedia, especially the transition from MS-DOS 6.2 (DoubleSpace),
to MS-DOS 6.21 (without DoubleSpace) to MS-DOS 6.22 (DriveSpace).

'DriveSpace'
<https://en.wikipedia.org/wiki/DriveSpace>

'Later versions'
<https://en.wikipedia.org/wiki/DriveSpace#Later_versions>

So Daniel was correct after all. Sorry about that, Daniel.
Post by Spalls Hurgenson
There were a number of alternatives during that era, as the increase in
software sizes briefly exceeded the speed at which hard-drives were
growing in size. I personally recall DiskDoubler (although that was
for Mac) and DoubleDisk (no relation),
The Wikipedia article says that Microsoft's DoubleSpace was based on
technology it licensed from the DoubleDisk developer Vertisoft.
Post by Spalls Hurgenson
but it seemed like there were
dozens being advertised in the magazines of the time. Given the
(legitimate) concerns about data corruption - even amongst the leading
brands - most of them slipped under the waves almost as quickly.
I've memories of using Stacker. At the time, it didn't have a way to
compress your files in place, so I first had to copy all my files OFF
the hard-drive to make enough room for its virtual disk file. But I
didn't trust the software enough to give it my whole drive, so I split
the difference and left half the drive uncompressed. The uncompressed
drive got my 'important' files, while I used the compressed drive for
games and other nonsense.
I did the same, two partitions with important/non-important stuff.
Post by Spalls Hurgenson
The performance hit wasn't severe, but noticeable on my PC. Eventually
I replaced the kludge with a second hard-drive and felt much more
secure about everything.
While I never had any data corruption problems, it was just too much
of a hassle and I just stopped using whatever disk compression software
I was using (probably SuperStor or Stacker, because I replaced MS-DOS
6.X with DR DOS (6.0?)).
Daniel65
2024-01-04 00:06:46 UTC
Post by Frank Slootweg
On 2 Jan 2024 15:30:28 GMT, Frank Slootweg
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had
whatever DOS installed on a 10MB Hard drive ..... on which I
then ran "Double Space" (was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them
all, or at least two of them.
DoubleSpace was also an option. It's what DriveSpace was initially
known as when it released with MS-DOS 6.0. After Microsoft lost a
court case showing that they had infringed upon Stacker's patent,
DoubleSpace was first removed, and then later replaced with
DriveSpace.
Thanks for the update/correction! Now you mention it, I remember the
controversy, the Stac court case, etc.. The events are clearly
explained in Wikipedia, especially the transition from MS-DOS 6.2
(DoubleSpace), to MS-DOS 6.21 (without DoubleSpace) to MS-DOS 6.22
(DriveSpace).
'DriveSpace' <https://en.wikipedia.org/wiki/DriveSpace>
'Later versions'
<https://en.wikipedia.org/wiki/DriveSpace#Later_versions>
So Daniel was correct after all. Sorry about that, Daniel.
Hey, I'll take that! ;-) ...... (then I'll run away and hide cause I
thought I was talking DOS 4 or 5 era!! ;-( )
--
Daniel
Spalls Hurgenson
2024-01-04 13:08:21 UTC
Post by Frank Slootweg
Post by Spalls Hurgenson
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all, or
at least two of them.
DoubleSpace was also an option. It's what DriveSpace was initially
known as when it released with MS-DOS 6.0. After Microsoft lost a
court case showing that they had infringed upon Stacker's patent,
DoubleSpace was first removed, and then later replaced with
DriveSpace.
Thanks for the update/correction! Now you mention it, I remember the
controversy, the Stac court case, etc.. The events are clearly explained
in Wikipedia, especially the transition from MS-DOS 6.2 (DoubleSpace),
to MS-DOS 6.21 (without DoubleSpace) to MS-DOS 6.22 (DriveSpace).
'DriveSpace'
<https://en.wikipedia.org/wiki/DriveSpace>
'Later versions'
<https://en.wikipedia.org/wiki/DriveSpace#Later_versions>
So Daniel was correct after all. Sorry about that, Daniel.
Post by Spalls Hurgenson
There were a number of alternatives during that era, as the increase in
software sizes briefly exceeded the speed at which hard-drives were
growing in size. I personally recall DiskDoubler (although that was
for Mac) and DoubleDisk (no relation),
The Wikipedia article says that Microsoft's DoubleSpace was based on
technology it licensed from the DoubleDisk developer Vertisoft.
Post by Spalls Hurgenson
but it seemed like there were
dozens being advertised in the magazines of the time. Given the
(legitimate) concerns about data corruption - even amongst the leading
brands - most of them slipped under the waves almost as quickly.
I've memories of using Stacker. At the time, it didn't have a way to
compress your files in place, so I first had to copy all my files OFF
the hard-drive to make enough room for its virtual disk file. But I
didn't trust the software enough to give it my whole drive, so I split
the difference and left half the drive uncompressed. The uncompressed
drive got my 'important' files, while I used the compressed drive for
games and other nonsense.
I did the same, two partitions with important/non-important stuff.
Post by Spalls Hurgenson
The performance hit wasn't severe, but noticeable on my PC. Eventually
I replaced the kludge with a second hard-drive and felt much more
secure about everything.
While I never had any data corruption problems, it was just too much
of a hassle and I just stopped using whatever disk compression software
I was using (probably SuperStor or Stacker, because I replaced MS-DOS
6.X with DR DOS (6.0?)).
Like many DOS users, one of my earliest experiences with the OS was
the inevitable, "hey, what if I delete these useless files in c:\?"
experience. Stacker (and its competitors) also put a 'useless' file in
C:\. While, by the time I was experimenting with disk-compression, I
was far too savvy to 'just delete some files' like I had years earlier,
it always made me nervous having an entire hard-drive's worth of data
reliant on the existence of that one single file.

Even worse because that file was hosted on a FAT-16 (just FAT, then)
filesystem. "What if that Stacker file got crosslinked with some other
file?" my brain screamed at me.

It just all seemed too risky, too tenuous to depend upon. (In fairness,
later versions of the technology started to take these problems into
consideration, but the earliest revisions were rough). A second
hard-drive, though expensive, just made more sense to me.

It was a 500MB Maxtor, IIRC. That drive stayed with me for a long
time. When I finally discarded it - after long service and an even
longer time in the storage bin - it was with a bit of sadness.
J. P. Gilliver
2024-01-04 16:44:18 UTC
In message <***@4ax.com> at Thu, 4 Jan
2024 08:08:21, Spalls Hurgenson <***@gmail.com> writes
[]
Post by Spalls Hurgenson
Like many DOS users, one of my earliest experiences with the OS was
the inevitable, "hey, what if I delete these useless files in c:\?"
experience. Stacker (and its competitors) also put a 'useless' file in
C:\. While, by the time I was experimenting with disk-compression, I
was far too savvy to 'just delete some files' like I had years earlier,
it always made me nervous having an entire hard-drive's worth of data
reliant on the existence of that one single file.
[]
Many of us still have a similar situation, but have forgotten about it:
many of our news and, in particular, mail clients store their newsbase
and mailbase in one single file. I've never felt completely at ease with
that, but just hope for the best (and nowadays back up my Turnpike
directory regularly), and on the whole have never, that I remember, been
let down by Turnpike (though have occasionally had to "rebuild
database"). I have a vague memory of some rogue email corrupting my
system at work - that'd have been Outlook or Outlook Express.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

Ask not for whom the bell tolls; let the machine get it
g***@aol.com
2024-01-05 02:17:25 UTC
On Thu, 04 Jan 2024 08:08:21 -0500, Spalls Hurgenson
Post by Spalls Hurgenson
Post by Frank Slootweg
Post by Spalls Hurgenson
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all, or
at least two of them.
DoubleSpace was also an option. It's what DriveSpace was initially
known as when it released with MS-DOS 6.0. After Microsoft lost a
court case showing that they had infringed upon Stacker's patent,
DoubleSpace was first removed, and then later replaced with
DriveSpace.
Thanks for the update/correction! Now you mention it, I remember the
controversy, the Stac court case, etc.. The events are clearly explained
in Wikipedia, especially the transition from MS-DOS 6.2 (DoubleSpace),
to MS-DOS 6.21 (without DoubleSpace) to MS-DOS 6.22 (DriveSpace).
'DriveSpace'
<https://en.wikipedia.org/wiki/DriveSpace>
'Later versions'
<https://en.wikipedia.org/wiki/DriveSpace#Later_versions>
So Daniel was correct after all. Sorry about that, Daniel.
Post by Spalls Hurgenson
There were a number of alternatives during that era, as increase in
software sizes briefly exceeded the speed at which hard-drives were
growing in size. I personally recall DiskDoubler (although that was
for Mac) and DoubleDisk (no relation),
The Wikipedia article says that Microsoft's DoubleSpace was based on
technology it licensed from the DoubleDisk developer Vertisoft.
Post by Spalls Hurgenson
but it seemed like there were
dozens being advertised in the magazines of the time. Given the
(legitimate) concerns about data corruption - even amongst the leading
brands - most of them slipped under the waves almost as quickly.
I've memories of using Stacker. At the time, it didn't have a way to
compress your files in place, so I first had to copy all my files OFF
the hard-drive to make enough room for its virtual disk file. But I
didn't trust the software enough to give it my whole drive, so I split
the difference and left half the drive uncompressed. The uncompressed
drive got my 'important' files, while I used the compressed drive for
games and other nonsense.
I did the same, two partitions with important/non-important stuff.
Post by Spalls Hurgenson
The performance hit wasn't severe, but noticeable on my PC. Eventually
I replaced the kludge with a second hard-drive and felt much more
secure about everything.
While I never had any data corruption problems, it was just too much
of a hassle and I just stopped using whatever disk compression software
I was using (probably SuperStor or Stacker, because I replaced MS-DOS
6.X with DR DOS (6.0?)).
Like many DOS users, one of my earliest experiences with the OS was
the inevitable, "hey, what if I delete these useless files in c:\?"
experience. Stacker (and its competitors) also put a 'useless' file in
C:\. While, by the time I was experimenting with disk-compression, I
was far too savvy to 'just delete some files' like I had years earlier,
it always made me nervous having an entire hard-drive's worth of data
reliant on the existence of that one single file.
Even worse because that file was hosted on a FAT-16 (just FAT, then)
filesystem. "What if that Stacker file got crosslinked with some other
file?" my brain screamed at me.
It all just seemed too risky, too tenuous to depend upon. (In fairness,
later versions of the technology started to take these problems into
consideration, but the earliest revisions were rough). A second
hard-drive, though expensive, just made more sense to me.
It was a 500MB Maxtor, IIRC. That drive stayed with me for a long
time. When I finally discarded it - after long service and an even
longer time in the storage bin - it was with a bit of sadness.
I never trusted compression enough to use it but I was a CE. If
everything worked right every time, I wouldn't have a very interesting
day.
Paul
2024-01-05 12:40:25 UTC
Permalink
Post by g***@aol.com
I never trusted compression enough to use it but I was a CE. If
everything worked right every time, I wouldn't have a very interesting
day.
Compression requires good RAM and good storage.

We haven't had good RAM for all that long.

We need good RAM, because we don't have the brains to have ECC DIMMs.

Here, let me show you my setting.

$ fsutil behavior query disableCompression
DisableCompression = 1 (Compression is DISABLED)

And that applies to user-land folder compression too. it is a
setting without granularity, a machine-wide setting.

wmic ComputerSystem get TotalPhysicalMemory

wmic OS get FreePhysicalMemory

Which will give more puzzles than you have time for :-)
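
The "set" side of that switch, and the per-folder tool, go something
like this - from memory, so verify before trusting it, and use an
elevated prompt (the folder path is only an example):

$ fsutil behavior set disableCompression 1    <=== 1 disables, 0 enables; may want a reboot
$ compact /c /s:"C:\SomeFolder" /i            <=== compress a folder tree, carry on past errors
$ compact /u /s:"C:\SomeFolder" /i            <=== and uncompress it again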

Paul
Mark Lloyd
2024-01-05 19:50:26 UTC
Permalink
[snip]
Post by g***@aol.com
I never trusted compression enough to use it but I was a CE. If
everything worked right every time, I wouldn't have a very interesting
day.
I didn't use it regularly, but I did try it. I would usually get a
compression ratio of 1.1 to 1.2 rather than the 2 they claimed. Maybe
you could get 2 if you had nothing but text files on that disk. IIRC,
the same thing with MNP (modem) compression.
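
The modern equivalent is easy to check, for anyone curious: NTFS
compression will report the ratio it actually achieved. Something like
this (the path is only an example):

compact /q /s:"D:\SomeFolder"

IIRC the summary at the end gives an overall "x.x to 1" figure, so
claims like that are simple to test against your own data.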
--
Mark Lloyd
http://notstupid.us/

"Me: Dr Shrödinger, will my cat live?

Doc: Well, yes and no...."
g***@aol.com
2024-01-03 19:11:54 UTC
Permalink
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all, or
at least two of them.
Post by Daniel65
*HUGE* !!
Mine was even *HUGER*, 40MB. Way too much, so I divided it into two
partitions! [1]
OTOH, I started with hard drives (drum drives) of 340KB (yes, *K*B),
costing $23,500 (in 1969 dollars). Try to beat *that*! :-)
2757A etc.
<http://www.hpmuseum.net/display_item.php?hw=548>
Yup I was an IBM brat so our 1311 started at 2 meg per pack in the
early 60s and that went to 7 meg with the 2311 (same basic box, same
pack but higher bit density and SLT logic). Things started going up
from there.
Post by Frank Slootweg
[1] Not quite the reason. Real reason: 32MB maximum partition size and I
didn't want to use software to overcome that limit.
There was an IBM internal use only BIOS for the PC/AT that added large
drive support and a few other things. One of my buddies with a ROM
burner was burning them and handing them out to hardware geeks like
me.
I think it was DOS 4 that bumped up that limit with FAT16B. I never
bit on that one but I ran big drives with no problem on 5.0 and 6.3
Once I got SCSI cards in my PCs it seemed the sky was the limit on
available drives (at least a gig anyway and that seemed like more than
anyone would ever need.)
Pretty soon I had several 9404 "shoeboxes" with 857 "Redwings" in them
that I used like big diskettes.
Frank Slootweg
2024-01-03 19:53:31 UTC
Permalink
Post by g***@aol.com
Post by Frank Slootweg
Post by Daniel65
<Snip>
Post by g***@aol.com
(no hard drives in those days)
O.K., must have been later in the DOS series where I had whatever DOS
installed on a 10MB Hard drive ..... on which I then ran "Double Space"
(was it??) to give myself 20MB.
It was DriveSpace (or SuperStor or Stacker). IIRC, I used them all, or
at least two of them.
Post by Daniel65
*HUGE* !!
Mine was even *HUGER*, 40MB. Way too much, so I divided it into two
partitions! [1]
OTOH, I started with hard drives (drum drives) of 340KB (yes, *K*B),
costing $23,500 (in 1969 dollars). Try to beat *that*! :-)
2757A etc.
<http://www.hpmuseum.net/display_item.php?hw=548>
Yup I was an IBM brat so our 1311 started at 2 meg per pack in the
early 60s and that went to 7 meg with the 2311 (same basic box, same
pack but higher bit density and SLT logic). Things started going up
from there.
Are you sure about that timing (assuming with "meg" you mean a
megabyte, not a megabit)?

As my reference shows, in 1968 the HP 2775A indeed had 1,572,864
(16-bit) words capacity, so ~3MB, but that was *late* 60s.
Post by g***@aol.com
Post by Frank Slootweg
[1] Not quite the reason. Real reason: 32MB maximum partition size and I
didn't want to use software to overcome that limit.
There was an IBM internal use only BIOS for the PC/AT that added large
drive support and a few other things. One of my buddies with a ROM
burner was burning them and handing them out to hardware geeks like
me.
My HP QS/16 (16MHz 386DX, 40MB disk) came with software (a driver?) to
go beyond 32MB, but as said, I didn't use it.
Post by g***@aol.com
I think it was DOS 4 that bumped up that limit with FAT16B. I never
bit on that one but I ran big drives with no problem on 5.0 and 6.3
Yes, my later systems (actually left-over *from* my son :-)), had
170MB and 1GB disks.
Post by g***@aol.com
Once I got SCSI cards in my PCs it seemed the sky was the limit on
available drives (at least a gig anyway and that seemed like more than
anyone would ever need.)
Indeed. More than enough!

In the end, in 2006, I copied the files from all three disks (40MB,
170MB and 1GB) to a *single* CD-R. Only 519 of the 650MB were used! :-)
Post by g***@aol.com
Pretty soon I had several 9404 "shoeboxes" with 857 "Redwings" in them
that I used like big diskettes.
g***@aol.com
2024-01-05 02:08:21 UTC
Permalink
Post by Frank Slootweg
Post by g***@aol.com
Yup I was an IBM brat so our 1311 started at 2 meg per pack in the
early 60s and that went to 7 meg with the 2311 (same basic box, same
pack but higher bit density and SLT logic). Things started going up
from there.
Are you sure about that timing (assuming with "meg" you mean a
megabyte, not a megabit)?
As my reference shows, in 1968 the HP 2775A indeed had 1,572,864
(16-bit) words capacity, so ~3MB, but that was *late* 60s.
The 1311 was early 60s (1962), typically on a 1401 system and each
pack held 2 million 7 bit characters.
Post by Frank Slootweg
Post by g***@aol.com
Post by Frank Slootweg
[1] Not quite the reason. Real reason: 32MB maximum partition size and I
didn't want to use software to overcome that limit.
There was an IBM internal use only BIOS for the PC/AT that added large
drive support and a few other things. One of my buddies with a ROM
burner was burning them and handing them out to hardware geeks like
me.
My HP QS/16 (16MHz 386DX, 40MB disk) came with software (a driver?) to
go beyond 32MB, but as said, I didn't use it.
I was running an ST 4096 80m on a PC/AT with DOS 3.3 but I broke it up
into 3 partitions, an architecture I still run
C: system files.
D: common data files (storage for program data)
E: everything else.
That makes your backups, from XP on, much simpler
Image a relatively small system drive and simply copy all the rest of
the files on your data drives with a sync program.
The only one that really needs to be an image is the system files
because of the registry and other program sensitive data. I keep that
as small as possible.
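
For the sync step, plain robocopy will do nowadays. A sketch, with
made-up drive letters - and beware that /MIR mirrors deletions too, so
point it carefully:

robocopy D:\ K:\Backup\D /MIR /R:1 /W:1 /XJ /LOG:K:\Backup\d-sync.log
robocopy E:\ K:\Backup\E /MIR /R:1 /W:1 /XJ /LOG:K:\Backup\e-sync.log

/XJ keeps it out of junction points, and /R:1 /W:1 stop it retrying
forever on locked files.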
J. P. Gilliver
2024-01-05 03:57:25 UTC
Permalink
In message <***@4ax.com> at Thu, 4 Jan
2024 21:08:21, ***@aol.com writes
[]
Post by g***@aol.com
into 3 partitions, an architecture I still run
C: system files.
D: common data files (storage for program data)
E: everything else.
That makes your backups, from XP on, much simpler
Image a relatively small system drive and simply copy all the rest of
the files on your data drives with a sync program.
The only one that really needs to be an image is the system files
because of the registry and other program sensitive data. I keep that
as small as possible.
Much the same here, except I just run C: (OS and installed software) and
D: (all data, including downloaded installers for the software). I image
C: (with Macrium), and copy D: (with FreeFileSync currently). I thought
about trying to keep OS and other software separate, but couldn't see
any advantage, and software gets its teeth into the registry, system
files etc., so I just couldn't see any point (and saw horrendous
difficulties doing so). My C: is 50G (19.6G free).
Whatever works for you!
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

the plural of 'anecdote' is not 'evidence'. Professor Edzard Ernst, prudential
magazine, AUTUMN 2006, p. 13.
Daniel65
2024-01-05 08:51:35 UTC
Permalink
Post by J. P. Gilliver
C: system files.
D: common data files (storage for program data)
E: everything else.
That makes your backups, from XP on, much simpler. Image a
relatively small system drive and simply copy all the rest of the
files on your data drives with a sync program. The only one that
really needs to be an image is the system files because of the
registry and other program sensitive data. I keep that as small as
possible.
Much the same here, except I just run C: (OS and installed software)
and D: (all data, including downloaded installers for the software).
I image C: (with Macrium), and copy D: (with FreeFileSync currently).
I thought about trying to keep OS and other software separate, but
couldn't see any advantage, and software gets its teeth into the
registry, system files etc., so I just couldn't see any point (and
saw horrendous difficulties doing so). My C: is 50G (19.6G free).
Whatever works for you!
For me ....

C:\ Win7 OS files only (if I can help)
E:\ Other Executables (Word process, etc)
F:\ Data files, Letters, Games, SeaMonkey Profile files, etc.

Of course D:\ is the DVD R/RW drive.
--
Daniel
J. P. Gilliver
2024-01-05 11:56:55 UTC
Permalink
[]
Post by Daniel65
For me ....
C:\ Win7 OS files only (if I can help)
E:\ Other Executables (Word process, etc)
F:\ Data files, Letters, Games, SeaMonkey Profile files, etc.
For backup, do you image C: and E:, and sync F:?
Why (and how) do you keep C: and E: separate?
Post by Daniel65
Of course D:\ is the DVD R/RW drive.
Mine appears as E:.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"If even one person" arguments allow the perfect to become the enemy of the
good, and thus they tend to cause more harm than good.
- Jimmy Akins quoted by Scott Adams, 2015-5-5
Daniel65
2024-01-05 12:52:46 UTC
Permalink
Post by J. P. Gilliver
[]
Post by Daniel65
For me ....
C:\    Win7 OS files only (if I can help)
E:\    Other Executables (Word process, etc)
F:\    Data files, Letters, Games, SeaMonkey Profile files, etc.
For backup, do you image C: and E:, and sync F:?
Why (and how) do you keep C: and E: separate?
Post by Daniel65
Of course D:\ is the DVD R/RW drive.
Mine appears as E:.
To be honest, it's been that long since I did a back-up that I don't recall.
By-the-by, I think I last backed up using one of my Linux installations.
--
Daniel
g***@aol.com
2024-01-06 05:16:12 UTC
Permalink
On Fri, 5 Jan 2024 19:51:35 +1100, Daniel65
Post by Daniel65
For me ....
C:\ Win7 OS files only (if I can help)
E:\ Other Executables (Word process, etc)
F:\ Data files, Letters, Games, SeaMonkey Profile files, etc.
Of course D:\ is the DVD R/RW drive.
Maybe just a relic of the SCSI days but my optical drive is always
close to the top of the box. D: is my data default.
E: is a FAT partition with DOS and W/3 files on it.
(Boot a thumb drive or diskette into DOS and that is C:)
F: is media stuff.
K: is a mirrored set of drives that gets backup files like my C:
image. and things I sic my sync program on.
g***@aol.com
2024-01-06 05:10:15 UTC
Permalink
Post by J. P. Gilliver
[]
Post by g***@aol.com
into 3 partitions, an architecture I still run
C: system files.
D: common data files (storage for program data)
E: everything else.
That makes your backups, from XP on, much simpler
Image a relatively small system drive and simply copy all the rest of
the files on your data drives with a sync program.
The only one that really needs to be an image is the system files
because of the registry and other program sensitive data. I keep that
as small as possible.
Much the same here, except I just run C: (OS and installed software) and
D: (all data, including downloaded installers for the software). I image
C: (with Macrium), and copy D: (with FreeFileSync currently). I thought
about trying to keep OS and other software separate, but couldn't see
any advantage, and software gets its teeth into the registry, system
files etc., so I just couldn't see any point (and saw horrendous
difficulties doing so). My C: is 50G (19.6G free).
Whatever works for you!
Maybe it is just the 32MB days but I still like a lot of "drives". It
is a way to keep things separated.
Char Jackson
2024-01-07 04:15:29 UTC
Permalink
Post by g***@aol.com
Maybe it is just the 32MB days but I still like a lot of "drives". It
is a way to keep things separated.
I approach storage from a different perspective. Most of my drives are pooled
into a single 40TB volume on my primary PC and a single 60TB volume on a
networked PC. I just picked up a pair of 18TB drives, so I'll be expanding my
little 40TB pool to 76TB (which becomes 70TB after formatting). I use folders
for separation, rather than partitions. Folders are much more flexible.
J. P. Gilliver
2024-01-07 05:04:51 UTC
Permalink
In message <***@4ax.com> at Sat, 6 Jan
2024 22:15:29, Char Jackson <***@none.invalid> writes
[]
Post by Char Jackson
I approach storage from a different perspective. Most of my drives are pooled
into a single 40TB volume on my primary PC and a single 60TB volume on a
networked PC. I just picked up a pair of 18TB drives, so I'll be expanding my
little 40TB pool to 76TB (which becomes 70TB after formatting). I use folders
for separation, rather than partitions. Folders are much more flexible.
I agree - I don't subdivide beyond C: and D:, using folders on D:. (And
ones of my own naming, not the Microsoft names.) I do keep C: as a
separate (only 50G, and that far from full) entity, containing the OS
and software, along with all their necessary files, settings, and
registry, and I image that, so that in the event of HD failure (or, in
theory, ransomware), I can restore from the image (restore [and image]
software kept on a mini-CD) and be back running (with all software set
up as it was) within less than half an hour of installing the new drive;
D: I just copy (with a sync. utility to reduce the time needed).
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

I don't have an agree that our language torture is a quality add
- soldiersailor on Gransnet, 2018-3-8
Daniel65
2024-01-07 10:06:27 UTC
Permalink
Post by J. P. Gilliver
Post by Char Jackson
I approach storage from a different perspective. Most of my drives
are pooled into a single 40TB volume on my primary PC and a single
60TB volume on a networked PC. I just picked up a pair of 18TB
drives, so I'll be expanding my little 40TB pool to 76TB (which
becomes 70TB after formatting). I use folders for separation,
rather than partitions. Folders are much more flexible.
I agree - I don't subdivide beyond C: and D:, using folders on D:.
(And ones of my own naming, not the Microsoft names.) I do keep C: as
a separate (only 50G, and that far from full) entity, containing the
OS and software, along with all their necessary files, settings, and
registry, and I image that, so that in the event of HD failure (or,
in theory, ransomware), I can restore from the image (restore [and
image] software kept on a mini-CD) and be back running (with all
software set up as it was) within less than half an hour of
installing the new drive; D: I just copy (with a sync. utility to
reduce the time needed).
Bloody hell!! Am I underdone?? Or just more economical!! ;-P

One 500GB (yes, GB) spinning rust drive, divided into 9 partitions,
three for Win7 and six for various Linux usages!
--
Daniel
J. P. Gilliver
2024-01-07 11:07:22 UTC
Permalink
Post by Daniel65
Post by J. P. Gilliver
Post by Char Jackson
I approach storage from a different perspective. Most of my drives
are pooled into a single 40TB volume on my primary PC and a single
60TB volume on a networked PC. I just picked up a pair of 18TB
drives, so I'll be expanding my little 40TB pool to 76TB (which
becomes 70TB after formatting). I use folders for separation,
rather than partitions. Folders are much more flexible.
I agree - I don't subdivide beyond C: and D:, using folders on D:.
(And ones of my own naming, not the Microsoft names.) I do keep C: as
a separate (only 50G, and that far from full) entity, containing the
OS and software, along with all their necessary files, settings, and
registry, and I image that, so that in the event of HD failure (or,
in theory, ransomware), I can restore from the image (restore [and
image] software kept on a mini-CD) and be back running (with all
software set up as it was) within less than half an hour of
installing the new drive; D: I just copy (with a sync. utility to
reduce the time needed).
Bloody hell!! Am I underdone?? Or just more economical!! ;-P
One 500GB (yes, GB) spinning rust drive, divided into 9 partitions,
three for Win7 and six for various Linux usages!
I too am on a 500G spinner; just the C: above (19.3 of 50.0G free) and
D: (294 of 425G free; of the 121G used, 73.6 is video of one sort or
another (only 7 films, plus lots of music videos - rest misc., though
15.7 genealogy). So I would have room for a few other OSs if I should
choose to (presumably all with access to the same data). My backup drive
is a "1T", so at present keeps about 3 backups (actually 3 of C and 2 of
D).

But I don't consider those who have much more disc (real or SSD) space
to be profligate - each to his own. For example, my films are from 2.x
to 5.x G, mostly SD; I can see that someone who likes films (or TV
series and similar), especially if they have a big enough TV to justify
HD or 4K, could easily eat up the TBs, like Char above with his 76.
We're all individual. (All together now: "we're all individual!")
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

Q. How much is 2 + 2?
A. Thank you so much for asking your question.
Are you still having this problem? I'll be delighted to help you. Please
restate the problem twice and include your Windows version along with
all error logs.
- Mayayana in alt.windows7.general, 2018-11-1
Ken Blake
2024-01-07 12:56:18 UTC
Permalink
Post by Char Jackson
I approach storage from a different perspective. Most of my drives are pooled
into a single 40TB volume on my primary PC and a single 60TB volume on a
networked PC. I just picked up a pair of 18TB drives, so I'll be expanding my
little 40TB pool to 76TB (which becomes 70TB after formatting). I use folders
for separation, rather than partitions. Folders are much more flexible.
Yes, folders are much more flexible, and I do the same (although I
have much less storage than you do).
Frank Slootweg
2024-01-05 12:34:59 UTC
Permalink
***@aol.com wrote:
[...]
Post by g***@aol.com
I was running an ST 4096 80m on a PC/AT with DOS 3.3 but I broke it up
into 3 partitions, an architecture I still run
C: system files.
D: common data files (storage for program data)
E: everything else.
That makes your backups, from XP on, much simpler
Image a relatively small system drive and simply copy all the rest of
the files on your data drives with a sync program.
The only one that really needs to be an image is the system files
because of the registry and other program sensitive data. I keep that
as small as possible.
I should do something similar, but the 'problem' is that when I get a
new system (laptop), the 'installation' (read: initialization) procedure
of the Windows (now Windows 11) on the disk [1] does not give the
choice/chance to partition the disk.

After this setting up of the basic Windows system, I'm normally quite
busy to get all the old stuff working again on the new system, so
getting partitioning software and partitioning the disk is left on the
back burner, until it's too late/bothersome to do it after the fact.

On the bright side: while imaging the 'whole' disk (currently 216GB used
of 1TB) is time consuming - so I do a differential only once a month - I
have never needed to restore an image. (I only once used an image to
recover some file(s) and/or 'database'. I forgot the details.)

Only if I have a catastrophic [2] disk failure might I have to
restore an old image and might/will have to redo some re-installation,
re-configuration, re-updating, etc.. (My data files are backed up
separately, the most important stuff daily (or twice daily), the rest
weekly.)

This approach has been working for a bit over two decades, so I think
I'm safe! :-) <knocks on wood>

[1] Now SSD.

[2] For a non-catastrophic failure, for example a boot failure, I can
image the disk offline, just in case. I can then try to fix the failure and
if I make things worse doing so, I can restore the image I just made and
start over.
g***@aol.com
2024-01-06 05:31:38 UTC
Permalink
Post by Frank Slootweg
[...]
I should do something similar, but the 'problem' is that when I get a
new system (laptop), the 'installation' (read: initialization) procedure
of the Windows (now Windows 11) on the disk [1] does not give the
choice/chance to partition the disk.
I haven't played with 11 much but doesn't it have "Disk Management"
that lets you juggle partitions? You should be able to shrink C: and
add another simple volume in that vacated space
Frank Slootweg
2024-01-06 14:45:38 UTC
Permalink
Post by g***@aol.com
I haven't played with 11 much but doesn't it have "Disk Management"
that lets you juggle partitions? You should be able to shrink C: and
add another simple volume in that vacated space
Yes, you can do it after the fact, i.e. after the initial Windows
'installation'/initialization, but not before/during that procedure.

After the fact, you can indeed do it with the stock Disk Management
(which does have a 'Shrink Volume...' function) or with third-party
partitioning software, which is probably more flexible and user-friendly.

Maybe one day I'll do it, or maybe not! :-)
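
(For the record, the after-the-fact job is only a handful of diskpart
lines - roughly as below, sizes in MB. A sketch from memory, so image
the disk first:

diskpart
DISKPART> select volume C
DISKPART> shrink desired=102400
DISKPART> create partition primary
DISKPART> format fs=ntfs quick label=Data
DISKPART> assign letter=D

That takes ~100GB off the end of C: and turns it into a new volume -
adjust the letter if D: is already taken.)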
J. P. Gilliver
2024-01-06 15:23:47 UTC
Permalink
[]
Post by Frank Slootweg
Post by g***@aol.com
Post by Frank Slootweg
I should do something similar, but the 'problem' is that when I get a
new system (laptop), the 'installation' (read: initialization) procedure
of the Windows (now Windows 11) on the disk [1] does not give the
choice/chance to partition the disk.
I haven't played with 11 much but doesn't it have "Disk Management"
that lets you juggle partitions? You should be able to shrink C: and
add another simple volume in that vacated space
Yes, you can do it after the fact, i.e. after the initial Windows
'installation'/initialization, but not before/during that procedure.
After the fact, you can indeed do it with the stock Disk Management
(which does have a 'Shrink Volume...' function) or with third-party
partitioning software, which is probably more flexible and user-friendly.
Maybe one day I'll do it, or maybe not! :-)
Yes, some feature for handling partitions has been built into the OS
since 7. In the 7 version, it was unable to shrink C: below a point
because of "immovable files", which were usually about half way up the
original size of C:, though (a) repeating it after a reboot _sometimes_
allowed further reduction and (b) third-party software (e. g. EaseUS)
did not have this problem. I don't know if this limitation remained in
the in-OS partition utility in future versions.
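
The quickest way to see where the built-in tool will give up, without
committing to anything, is diskpart's query - a sketch, from memory:

diskpart
DISKPART> select volume C
DISKPART> shrink querymax

If the figure is absurdly small, the Application event log (source
"Defrag") usually names the last unmovable file the analysis ran into -
or so I believe; I haven't verified it on every version.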
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

I admire him for the constancy of his curiosity, his effortless sense of
authority and his ability to deliver good science without gimmicks.
- Michael Palin on Sir David Attenborough, RT 2016/5/7-13
Paul
2024-01-07 10:28:14 UTC
Permalink
[]
Yes, some feature for handling partitions has been built into the OS since 7.
In the 7 version, it was unable to shrink C: below a point because of
"immovable files", which were usually about half way up the original
size of C:, though (a) repeating it after a reboot _sometimes_ allowed
further reduction and (b) third-party software (e. g. EaseUS) did not
have this problem. I don't know if this limitation remained in the
in-OS partition utility in future versions.
The NTFS file system places some sort of metadata at the 50% point on the disk.

In Linux, something similar used to place certain structures at the
1/3 and 2/3 points of a partition.

These represent "value statements". The positions selected, have
something to do with file system performance. Someone believed,
long ago, that putting a certain structure at the 50% mark, gave
the best average performance.

You will notice in the SSD era, this no longer matters, and the value
statements mean nothing now.

Thus, the behavior of the utility is incorrect for SSDs, and
remains a value statement for HDD. If your seek time is zero,
then nobody cares where a structure is stored.

Just about any utility worth its salt, does not give a fig newton
for the 50% rule. The metadata gets moved, as required, at the
time a partition manipulation is carried out. I can use GParted
on Linux, and squeeze and squish all I want, and no "rules" are
applied to what I do.

The same goes for defragmenters like Raxco PerfectDisk.

Microsoft did not write the original defragmenter API. A third
party wrote it, and Microsoft bought it, and put it in the OS.
This is the feature that, say, JKDefrag would call, to move
a 64KB block or the like. The defragmenter API is "power-safe",
which means there is a reduced probability of destruction
if the power goes off, while you were using the defragmenter API.
It should not be using any cache path, neither System Read Cache
nor System Write Cache, are allowed to be used (if they did,
defragmentation would "fly like the wind").

The defragmenter API is busy, during a "disk shrink". Open the
Optimization dialog and open Disk Management, at the same time,
so both windows are open. Now, make a request to shrink a partition in DM.
What you should see, in the corner of your eye, is the
Optimize window wakes up and "something is going on". This is the
selective movement of materials. When DM shrinks a partition,
it calls Optimize to do the deed (move the materials out of the way).
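
You can poke that machinery by hand, too. The stock defragger grew a
free-space consolidation switch at some point - flags from memory,
run from an elevated prompt:

$ defrag C: /A /V      <=== analysis only, verbose report
$ defrag C: /X         <=== consolidate the free space

which is more or less the preparatory work a Shrink wants done.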

But even so, there are a number of things that certain OS utilities
will not request movement. Whereas the defrag API is technically
capable of doing a lot of things. There is a disconnect between
what Microsoft deems moveable, versus what the defrag API can
actually do.

Now, you have some idea, what ingredients fit into the "why won't
my partition shrink further in Disk Management" thing. It's not
a technical issue. It may be a value statement, such as
"put a drink coaster under that drink, you cretin".
We must have doilies and throw cushions, and a hassock you
cannot put your feet on. The hassock sits immobile in the
livingroom, serving no worldly purpose :-)

You see this thing ? Don't put your feet on it, or mom will be pissed.
Well, that's like the metadata at the 50% point. It's to remain
in the livingroom, even if it never demonstrates a function.

https://i.etsystatic.com/5275497/r/il/8cc62b/3822959334/il_794xN.3822959334_33s3.jpg

The purpose of a hassock, as it turns out, it's a "land mine". If
an intruder enters the living room, they trip over the hassock
and are "dispatched" by the fall. That's why the hassock must
remain at 50%.

Paul
J. P. Gilliver
2024-01-07 11:36:34 UTC
Permalink
Post by Paul
[]
Post by g***@aol.com
  I should do something similar, but the 'problem' is that when I get a
new system (laptop), the 'installation' (read: initialization) procedure
of the Windows (now Windows 11) on the disk [1] does not give the
choice/chance to partition the disk.
I haven't played with 11 much but doesn't it have "Disk Management"
that lets you juggle partitions?  You should be able to shrink C: and
add another simple volume in that vacated space
 Yes, you can do it after the fact, i.e. after the initial Windows
'installation'/initialization, but not before/during that procedure.
 After the fact, you can indeed do it with the stock Disk Management
(which does have a 'Shrink Volume...' function) or with third-party
partitioning software, which is probably more flexible and user-friendly.
 Maybe one day I'll do it, or maybe not! :-)
Yes, some feature for handling partitions has been built into the OS since 7.
In the 7 version, it was unable to shrink C: below a point because of
"immovable files", which were usually about half way up the original
size of C:, though (a) repeating it after a reboot _sometimes_ allowed
further reduction and (b) third-party software (e. g. EaseUS) did not
have this problem. I don't know if this limitation remained in the
in-OS partition utility in future versions.
The NTFS file system, places some sort of metadata at the 50% point on the disk.
In Linux, something there, used to place certain structures at the
1/3rd and 2/3rds points on their partitions.
These represent "value statements". The positions selected, have
something to do with file system performance. Someone believed,
long ago, that putting a certain structure at the 50% mark, gave
the best average performance.
I can see that, following certain perceptions, and with a disc with
files fairly well distributed (not necessarily fragmented), it could be
a perception that, on average, putting something that's accessed a lot
at the half-way point means the head will have less far to move from any
random position than if that something was near the start. I can't think
what the 1/3 and 2/3 justification is - maybe some combination of the
above with discs being faster near the start.

Such assumptions do assume very specific things about the way the data
on the disc is used - or, perhaps, assume very specifically that it is
random, which for any given user probably isn't quite the case.
Post by Paul
You will notice in the SSD era, this no longer matters, and the value
statements mean nothing now.
Indeed.
Post by Paul
Thus, the behavior of the utility is incorrect for SSDs, and
remains a value statement for HDD. If your seek time is zero,
then nobody cares where a structure is stored.
Just about any utility worth its salt, does not give a fig newton
(I've not come across that expression with the "newton" part in it
before.)
Post by Paul
for the 50% rule. The metadata gets moved, as required, at the
time a partition manipulation is carried out. I can use GParted
on Linux, and squeeze and squish all I want, and no "rules" are
applied to what I do.
Same with I think any of the third-party ones available to a 7 user (I
currently have EaseUS, though I think I only used it near the start of
my usage of this drive): within the practical limits, of course, of not
being able to squish below the size of the files currently on the
partition. (I don't know if it insists on a teeny bit extra for C: so
you don't leave yourself with an unusable system; I haven't tried, not
having any reason to.) They usually require a reboot (in practice two,
though the process is automated; I think the in-between case is not
normal Windows, so that they _can_ move the unmovable).
Post by Paul
The same goes for defragmenters like Raxco PerfectDisk.
Microsoft did not write the original defragmenter API. A third
party wrote it, and Microsoft bought it, and put it in the OS.
This is the feature that, say, JKDefrag would call, to move
a 64KB block or the like. The defragmenter API is "power-safe",
which means there is a reduced probability of destruction
if the power goes off, while you were using the defragmenter API.
It should not be using any cache path, neither System Read Cache
nor System Write Cache, are allowed to be used (if they did,
defragmentation would "fly like the wind").
In the same way that it's often far quicker to move (which is of course
really a copy then delete) all to another disc and then back, than to do
a defrag.
Post by Paul
The defragmenter API is busy, during a "disk shrink". Open the
Optimization dialog and open Disk Management, at the same time,
so both windows are open. Now, make a request to shrink a partition in DM.
What you should see, in the corner of your eye, is the
Optimize window wakes up and "something is going on". This is the
selective movement of materials. When DM shrinks a partition,
it calls Optimize to do the deed (move the materials out of the way).
I'm not going to play with my partition sizes anyway (I'm quite content
to take your word for the above), but where/what is the "Optimization
dialog" in Windows 7? Sounds like it might be an interesting thing.
Post by Paul
But even so, there are a number of things that certain OS utilities
will not request movement. Whereas the defrag API is technically
capable of doing a lot of things. There is a disconnect between
what Microsoft deems moveable, versus what the defrag API can
actually do.
Yes, I thought the "unmovable files" were system files or such. I
suppose what is "metadata" and what "system files" is just semantics.
Post by Paul
Now, you have some idea, what ingredients fit into the "why won't
my partition shrink further in Disk Management" thing. It's not
a technical issue. It may be a value statement, such as
"put a drink coaster under that drink, you cretin".
We must have doilies and throw cushions, and a hassock you
cannot put your feet on. The hassock sits immobile in the
livingroom, serving no worldly purpose :-)
You see this thing ? Don't put your feet on it, or mom will be pissed.
Well, that's like the metadata at the 50% point. It's to remain
in the livingroom, even if it never demonstrates a function.
https://i.etsystatic.com/5275497/r/il/8cc62b/3822959334/il_794xN.3822959334_33s3.jpg
(I have inherited something similar; mine is rectangular - about 2:1 -
whereas yours, hard to tell from the pic., looks as if it might be
square. I don't remember any direction not to sit on it [I don't think
anyone ever put their feet on it so that didn't come up].)
Post by Paul
The purpose of a hassock, as it turns out, it's a "land mine". If
an intruder enters the living room, they trip over the hassock
and are "dispatched" by the fall. That's why the hassock must
remain at 50%.
Paul
Whereas in the computing sense, it's just what you called a "value
statement", not a trap.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

Q. How much is 2 + 2?
A. Thank you so much for asking your question.
Are you still having this problem? I'll be delighted to help you. Please
restate the problem twice and include your Windows version along with
all error logs.
- Mayayana in alt.windows7.general, 2018-11-1
Paul
2024-01-07 17:38:20 UTC
Permalink
Post by g***@aol.com
I haven't played with 11 much but doesn't it have "Disk Management"
that lets you juggle partitions? You should be able to shrink C: and
add another simple volume in that vacated space
It does not "juggle".

The functional set of Disk Management is not intended to displace
business partner Partition Management products. That is why the feature
set is intentionally incomplete.

You can shrink and expand partitions, but this is *not* sufficient
materials for "any possible manipulation". You cannot change the
origin of a partition. That's missing.

With the limitation of "shrink to 50%", you won't get very far
before realizing you are in need of better tools.

Open the Optimize (defrag) window.
Open Disk Management, and propose a Shrink to the system.
Watch as the Optimize window "wakes up" and does stuff.

*******

Microsoft does leave "landmines" in their OS.

C:\Windows\System32\MBR2GPT.EXE
C:\Windows\System32\convert.exe

Compare that to some of what Linux gives you (gparted).

Programs like that, these are things that you should
backup first before testing. The MBR2GPT is way way too
ambitious a program, to be releasing that on an unsuspecting
public.
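
If you must test it, it does at least have a dry-run mode. Something
like this - the disk number is an example, and have a full image
standing by:

$ mbr2gpt /validate /disk:0 /allowFullOS    <=== checks only, changes nothing
$ mbr2gpt /convert /disk:0 /allowFullOS     <=== the ambitious part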

*******

I wake up today and discover this.

https://en.wikipedia.org/wiki/Dalton_%28unit%29

https://en.wikipedia.org/wiki/2019_redefinition_of_the_SI_base_units

The bastards have changed Avogadro's number! That's a number
that is burned into my core memory. 6.02252 x 10^23.
And now, that's the wrong number. "This number is no longer
in service".

At least now I know who to blame for those small cans of Coke. Science.

*******

The thing blocking Shrink may be this file. After four tries
at this, this is the closest I've got to an answer. When the OS
shuts down, it writes stuff just past this, which helps "hide"
who is responsible. There is also a command to delete the journal,
but the system did not create a new one when I tested that a long
time ago.

File 240654
\$Extend\$UsnJrnl
$DATA $J (nonresident)
logical sectors 38934488-39003863 (0x25217d8-0x25326d7)
logical sectors 40801328-40801599 (0x26e9430-0x26e953f)
logical sectors 40800048-40801055 (0x26e8f30-0x26e931f)
logical sectors 95654272-95655295 (0x5b39180-0x5b3957f) <=== up high, on my sample partition

There are some Linux operations (gparted), which invalidate
the USN change journal (because the Linux driver does not know
how to update it, and zapping it is the next best thing). It may
actually serve two purposes, in the sense that it could allow
the Shrink on Linux to be smaller than the Shrink on Windows :-)

That's just a guess at a blocker, because there's a lot of "noise"
in the traces I collect.
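
For anyone who wants to test the journal theory on their own machine,
the relevant commands are roughly these. The delete is the drastic
part - indexing and some backup tools lean on the journal - so image
first:

$ fsutil usn queryjournal C:                         <=== is there one, and how big
$ fsutil usn deletejournal /d C:                     <=== remove it (see caveat above)
$ fsutil usn createjournal m=33554432 a=4194304 C:   <=== put one back by hand; sizes are examples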

Paul

Daniel65
2024-01-04 00:17:32 UTC
Permalink
<Snip>
Post by g***@aol.com
Post by Frank Slootweg
[1] Not quite the reason. Real reason: 32MB maximum partition size and I
didn't want to use software to overcome that limit.
There was an IBM internal use only BIOS for the PC/AT that added large
drive support and a few other things. One of my buddies with a ROM
burner was burning them and handing them out to hardware geeks like
me.
I think it was DOS 4 that bumped up that limit with FAT16B. I never
bit on that one but I ran big drives with no problem on 5.0 and 6.3
Once I got SCSI cards in my PCs it seemed the sky was the limit on
available drives (at least a gig anyway and that seemed like more than
anyone would ever need.)
WOW!! A Gig!! WOW!! I can remember going to a Museum exhibition in
1993/4 with one of my Brothers-in-law (now deceased) to see a 1GB hard
drive .... which came complete with two ARMED guards (well, that's what
they told us, anyway. :-P )!!
--
Daniel
g***@aol.com
2024-01-05 02:13:44 UTC
Permalink
On Thu, 4 Jan 2024 11:17:32 +1100, Daniel65
Post by Daniel65
<Snip>
Post by g***@aol.com
Post by Frank Slootweg
[1] Not quite the reason. Real reason: 32MB maximum partition size and I
didn't want to use software to overcome that limit.
There was an IBM internal use only BIOS for the PC/AT that added large
drive support and a few other things. One of my buddies with a ROM
burner was burning them and handing them out to hardware geeks like
me.
I think it was DOS 4 that bumped up that limit with FAT16B. I never
bit on that one but I ran big drives with no problem on 5.0 and 6.3
Once I got SCSI cards in my PCs it seemed the sky was the limit on
available drives (at least a gig anyway and that seemed like more than
anyone would ever need.)
WOW!! A Gig!! WOW!! I can remember going to a Museum exhibition in
1993/4 with one of my Brothers-in-law (now deceased) to see a 1GB hard
drive .... which came complete with two ARMED guards (well, that's what
they told us, anyway. :-P )!!
The 857m Redwings I was running were already getting long in the tooth
by 1994 so I can't imagine a gig had that much value unless it was
full of military secrets or something.
We were shipping 1g Corsairs in 1991.
Mark Lloyd
2024-01-04 19:48:19 UTC
Permalink
[snip]
Post by g***@aol.com
There was an IBM internal use only BIOS for the PC/AT that added large
drive support and a few other things. One of my buddies with a ROM
burner was burning them and handing them out to hardware geeks like
me.
I think it was DOS 4 that bumped up that limit with FAT16B. I never
bit on that one but I ran big drives with no problem on 5.0 and 6.3
Once I got SCSI cards in my PCs it seemed the sky was the limit on
available drives (at least a gig anyway and that seemed like more than
anyone would ever need.)
The motherboard I bought in 1990 came with DR-DOS 3.41, which supported
FAT16b.
--
Mark Lloyd
http://notstupid.us/

"This indictment of Christianity I will write on all walls, wherever
there are walls--I have letters to make even the blind see." [Nietzsche]
g***@aol.com
2024-01-05 02:24:50 UTC
Permalink
Post by Mark Lloyd
[snip]
The motherboard I bought in 1990 came with DR-DOS 3.41, which supported
FAT16b.
In the early 90s (90-96), the whole idea of "buying" a system board, or
any other PC part, was foreign to me. I ran the parts room and we had a
part number for shrinkwrapped PC DOS at the current version. ;)
We had the PCTOOLS forum on VM with just about any little snippet of
PC code you could think of, including IBM internal tools that never
made it to the public.
That was when PC hardware hacking was fun for me.
Most of it did end up being business related tho.
At home I was running stuff that was obsolete at work.
g***@aol.com
2024-01-01 16:42:02 UTC
Permalink
On Sun, 31 Dec 2023 19:21:52 +0000, "J. P. Gilliver"
Post by J. P. Gilliver
Post by g***@aol.com
On Sun, 31 Dec 2023 12:26:54 +0000, "J. P. Gilliver"
[]
Post by g***@aol.com
Post by J. P. Gilliver
(It's Gold I have.) Sounds about right. It was such a good precursor -
in DOS days - that, whenever you saw a photo of a scene where there was
a PC screen incidentally in the picture, it was very often the blue of
Xtree rather than the black of a DOS screen!
I will check out Ztree.
BTW my DOS screens were always a different color. I leaned toward
yellow text on blue.
Xtree had yellow or white (depending on what you were doing) on blue.
(Though I think you could choose all the colours for yourself if you
wanted.) IIRR the DOS version of WordPerfect (which was in most people's
opinion better than the pre-Windows Microsoft equivalent) also used
white on blue. (I can't remember what the pre-windows Microsoft word
processor was even called, or even if they had one.)
I remember someone posting a hack - well, telling you how - to change
the colours of the BSOD; not that it was something one wanted to see
anyway, but at least you could choose the colours! I think that was back
in the '95/8/Me days.
Prior to Windows I was using "Wordproof", an IBM internal product with
spell checking, grade level scoring and a number of other features.
Later I switched to "CE3", another IBM internal program that can do
text tricks Windows still can't do unless you create an XLS file and
use Excel.
This all runs in DOS.
IBM had so many DOS tools that I resisted Windows or OS2 until I was
forced into it by software that would only run under Windows.
Daniel65
2024-01-02 10:10:28 UTC
Permalink
***@aol.com wrote on 2/1/24 3:42 am:

<Snip>
Post by g***@aol.com
This all runs in DOS.
IBM had so many DOS tools that I resisted Windows or OS2 until I was
forced into it by software that would only run under Windows.
Likewise, I used DOS (up to Ver 6.0) until Win98 came out .... then I
installed Win95!!
--
Daniel