Discussion:
Macrium Reflect (free) on Win 7
Keith
9 years ago
I run version 6.1.1225 on a 64 bit desktop; a full backup of 148 GB takes
about 90 mins.
With the same version of MR on my Vaio 32 bit laptop, a full backup of
78 GB takes around 4 hours: about 90 mins for the first 99%, then another
150 mins for the last 1%, at around 32 kb/s. What could be causing this
slowness? Both machines are connected by network cable and the backups are
not run concurrently.
The backup device is a Western Digital My Cloud (3 TB) and the router a
Netgear D6200.
Paul
9 years ago
Post by Keith
I run version 6.1.1225 on a 64 bit desktop; a full backup of 148 GB
takes about 90 mins. Same version of MR on my Vaio 32 bit laptop a full
backup of 78 GB takes around 4 hours -about 90 mins for 99% then another
150 mins for the last 1%, at around 32 kbs. What could be causing this
slowness? Both machines are connected by network cable and backups are
not run concurrently.
Backup device is a Western Digital My Cloud, 3TB and the router a
Netgear D6200.
Is the Vaio plugged in ? I would want to make sure
the device stays in a running state, and doesn't try
to spin-down the hard drive.
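
If you want to rule out spin-down from the command line, something
like this (run from an elevated command prompt; the same setting is
under Power Options > Hard disk > "Turn off hard disk after") should
keep the disk from ever timing out while on AC power:

   powercfg -change -disk-timeout-ac 0

Running the same command with the original number of minutes puts
it back afterwards.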

When is the last time you ran CHKDSK on the volume
on the Vaio you were backing up ? And if you run
CHKDSK, is that, also, extremely slow ? That
could be evidence of some sort of file system issue.
If a hard drive has to retry read operations, it
can take 15 seconds per sector to resolve them.
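
For reference, a plain read-only pass is enough to spot obvious
trouble, while /r adds a surface scan and takes much longer (on the
system drive it gets scheduled for the next reboot):

   chkdsk C:
   chkdsk C: /f /r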

You could also use a program like the free version
of HDTune, and check the SMART stats in the Health Tab,
as well as run the bad block scan in the tab next to it.
(Note that, Win10 seems to interfere with SMART, but
you're on Win7 so this should be fully functional.)

http://www.hdtune.com/files/hdtune_255.exe

Macrium uses VSS, and I don't know if the shadow copy
needs storage space on the volume or not. It could be
some issue related to that.
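
One quick thing to check, from an elevated command prompt, is how
much shadow copy storage is allocated on that volume:

   vssadmin list shadowstorage
   vssadmin list shadows

If the storage area is very small, the snapshot can get dropped
partway through a long backup, so it's worth a look.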

As an experiment, you can:

1) Boot the laptop with the Macrium emergency CD.
2) Do the backup from there. I do backups using
the CD, as well as running them from the hard
drive OS.
3) See if the behavior is consistent. If it is,
then there is more of a chance the disk itself
is the issue. Either a structural problem with
the disk, or an issue with the file system.
If the CD backup runs faster, then maybe the
issue is related to VSS and the shadow copy on
the hard drive OS.

You say in your problem description, the Vaio runs
a 32 bit operating system. While the backup is
running (from the hard drive OS), open Task Manager
and check RAM usage. Sluggish performance can sometimes
happen if the system is out of RAM (a program leaks
memory), and the system is "scavenging" memory to
try to feed the aggressive program.
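
Besides eyeballing Task Manager, a quick way to see whether any one
program is hogging memory while the backup runs is:

   tasklist /fi "MEMUSAGE gt 200000"

which lists every process using more than roughly 200 MB (the
filter value is in KB).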

As a general performance issue, things like Windows
Defender and Search Indexer can interfere with
things a user wants to do. You can turn off Real Time
protection on Windows Defender, as a temporary workaround
for the first problem. Dealing with Search Indexer is
a lot harder - I've had problems where the Search Indexer
rails on one CPU core, cutting down my performance, and
even if you modify the restart options in the Service
definition for the Search Indexer, it still manages to
restart itself. You have to whack it repeatedly with
the Service Stop button to keep it from interfering.
While I don't see this as being key to your problem,
I sometimes resort to this approach in an attempt
to double the performance of what I'm doing on
the computer.
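
If you do want the indexer out of the way for the duration of a
backup, stopping and disabling the service (it's called WSearch on
Win7) should stick better than hitting Stop over and over:

   net stop WSearch
   sc config WSearch start= disabled

Afterwards, "sc config WSearch start= delayed-auto" followed by
"net start WSearch" puts it back the way Win7 ships it.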

And there is actually another performance tip for
Macrium. It has an actual Preferences panel in it,
and one of the options in there, is to "enable
file system caching". That's the normal Windows
system file cache. My Macrium 6 install defaulted
to that being turned off - turning it back on
made a local disk-to-disk backup run faster.
Went from 60MB/sec to over 100MB/sec, because
the $MFT gets updated less often. But this
isn't the root cause of your 32Kbit/sec behavior.

Paul
Keith
9 years ago
Hello Paul,

Sorry for delay in responding - been on vacation.
My responses, enclosed in << >>, are below.
...
<<Ran CHKDSK this a.m. with Auto Fix selected; it didn't seem slow but
showed 47 reparse records.>>
You could also use a program like the free version
of HDTune, and check the SMART stats in the Health Tab,
as well as run the bad block scan in the tab next to it.
(Note that, Win10 seems to interfere with SMART, but
you're on Win7 so this should be fully functional.)
http://www.hdtune.com/files/hdtune_255.exe
<<Ran hdtune (free); Benchmark (ran OK, stats below), Health (nothing shown,
see snapshot below) and Error scan (one bad block three blocks from the end).>>
HD Tune: Hitachi HTS547550A9E384 Benchmark

Transfer Rate Minimum : 13.5 MB/sec
Transfer Rate Maximum : 95.3 MB/sec
Transfer Rate Average : 58.7 MB/sec
Access Time : 22.5 ms
Burst Rate : 114.0 MB/sec
CPU Usage : 57.2%
________________________________________________
HD Tune: Hitachi HTS547550A9E384 Health

ID    Current    Worst    Threshold    Data    Status

Power On Time : n/a
Health Status : n/a
...
<<Will try the emergency boot (from USB) later.>>
You say in your problem description, the Vaio runs
a 32 bit operating system. While the backup is
running (from the hard drive OS), open Task Manager
and check RAM usage. Sluggish performance can sometimes
happen if the system is out of RAM (a program leaks
memory), and the system is "scavenging" memory to
try to feed the aggressive program.
<<RAM used, in the latest backup, was 1.9 GB (4 GB total RAM on this machine).>>
...
<<The latest backup, 87 GB, was done with Real Time protection in
Windows Defender turned off, indexing for all partitions also turned
off, and the "enable file system caching" option turned on; it took
3 hours.>>
Paul
I'm currently running the HDTune Error Scan. Will then run CHKDSK and fix
any errors and, finally, boot from the emergency USB and see how things go.

Keith
Keith
9 years ago
Hello Paul,

<<Here is the Error Scan report:>>
HD Tune: Hitachi HTS547550A9E384 Error Scan

Scanned data : 476749 MB
Damaged Blocks : 0.0 %
Elapsed Time : 169:50
Keith
Paul
9 years ago
Post by Keith
Hello Paul,
<<
HD Tune: Hitachi HTS547550A9E384 Error Scan
Scanned data : 476749 MB
Damaged Blocks : 0.0 %
Elapsed Time : 169:50
Keith
This is as close as I could get to a reference benchmark for your
drive model (link below). Due to the wavy-gravy nature of this plot,
it's hard to say whether your drive falls in line with the curve here
exactly, or not. It's in the right ballpark.

Your drive might be three-head, this drive could be
the four-head version. Disks use two heads per platter
(top and bottom). A three-head drive is just a four-head
drive, with one head being ignored during read/write. But
it still flies along, and helps balance forces on either
side of the platter.

Reference plot (a similar Hitachi drive): http://dyski.cdrinfo.pl/benchmark/hdtune/hdtune-1516-107204-943aLDRGqpf1O.png

Notice the seek dots are all over the place. I occasionally
have a couple seek dots off the beaten path, so they
don't have to be perfect. But if it's "snowing" off
the main axis, that spells some sort of trouble.
Like, uneven performance in day to day usage.

One possibility for your "slow" portion, is
perhaps the envelope for the partition is
bigger than it should be. The 32Kbit/sec section
could be one timeout after another, while seeking
to places that don't exist. But Macrium would
stop immediately if that were the case.

Partitions have two size parameters. There is
the physical size (in Windows 7, likely rounded
to some number of 1048576 byte "megabytes"). But
inside the physical partition, the virtual information
declares some number of clusters make up the file system.
The two sizes do not have to be equal. If there is
a mishap during a Windows Disk Management partition
resize, there have been cases where the physical
size was 1TB, while the virtual size was 500GB. Which
means half of the partition is completely
inaccessible. The backup would dutifully record
that, without an issue, and reproduce it given
a chance. All the tools are happy if the
situation arises. Only the user is unhappy.

Linux does this sort of thing on purpose. The physical
and virtual are handled as two separate steps in
GParted. Whereas Windows tries not to expose such
details to the user.

If increasing the size of a partition, you increase
Physical first, then increase size of Virtual. If
decreasing the size of a partition, you decrease
the Virtual size first, then adjust the Physical (update
partition table) right after that. Not that any of
this is relevant. I just wanted to point out one
failure mode is for Physical to be quite a bit
larger than Virtual.

The other way around wouldn't work. If Virtual was bigger
than Physical, the partition would corrupt as you
were filling it with data. But what would happen
if backing up ? Would the backup software
try to seek past the end of the partition ?
Dunno.
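
If you want to compare the two sizes on the Vaio, a rough check
(assuming C: is NTFS) is to multiply the cluster figures fsutil
reports and compare the result with what diskpart shows for the
partition:

   fsutil fsinfo ntfsinfo C:     (note Total Clusters and
                                  Bytes Per Cluster)
   diskpart
     list disk
     select disk 0
     list partition

Total Clusters times Bytes Per Cluster is the virtual (file system)
size; the partition entry from diskpart is the physical size. The
two should agree to within a megabyte or so.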

*******
You've already done the bad block scan. It shows
zero percent bad.

Perhaps this is one of those cases, where you
run the Macrium backup again, then run ProcMon
and collect a trace. Save out the trace, then
examine all the "Readfile" calls. Check the
addresses on the Readfile calls. They could be
relative to the start of the partition. If
some of those addresses are disproportionate
(outside the partition), then that might account
for bad behavior.
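
If scrolling the whole trace is too painful, one rough shortcut
(assuming you save the trace as CSV from ProcMon's File > Save
dialog, Logfile.CSV here) is to pull out just those lines first:

   findstr /c:"ReadFile" Logfile.CSV > readfile.txt

That leaves a much smaller file containing only the ReadFile rows,
with the offsets in the Detail column.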

I had another idea, which is to resize C: a little
bit. Shrink it down by 10GB. Then run another backup
and time it. The purpose of this, is to give Windows 7
a chance to examine the partition and perhaps put things
right.
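
The shrink itself can be done from Disk Management (right-click C:,
Shrink Volume), or roughly like this in diskpart:

   diskpart
     select volume C
     shrink desired=10240

where desired is in megabytes, so 10240 is about 10 GB.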

But the thing is, Macrium works best if the partition
table stays constant over a set of backups. While
you can resize on a restore now, it's a bit disconcerting
to have the tool complaining it cannot restore the MBR
because the partitions are different sizes. The danger of
modifying the MBR, is the possibility of running into
trouble on a restore (if restoring a 2 year old backup
say).

Did CHKDSK approve of your partition ?

I've studied a Macrium backup from end to end
with Sysinternals Procmon, and the trace was
around 9GB in size (20 minutes worth). You'll need
a 64 bit OS (so that the 64 bit version of ProcMon
runs automatically) in order to collect
traces that big. Then, convert the trace to another
format, for post-analysis. While you can certainly
scroll through the trace, you may want some other
way to check it out. And we know text editors on
Windows suck, and there aren't a lot of good
choices there (from Microsoft itself). My best
tool now for examining files (not good for this
purpose), is the HxD hex editor. Finally, I can
edit a 30GB file with a hex editor, and it actually
runs at a decent speed. Now, if I could only
find a text editor that works that well.

When the backup is running, the clusters should
be backed up in sequential order, with "gaps"
where nothing is stored. You know the trouble
happens at the end of the backup, so maybe you'll
only have to scroll through the last 100,000 lines
on the screen :-)

The last really good text editor I had was
BBEdit Lite on the Mac (they don't make a PC
version). Which for its time and situation,
was fast. The text editors I've used since then,
are embarrassingly bad.

Paul
Char Jackson
9 years ago
Post by Paul
And we know text editors on
Windows suck, and there aren't a lot of good
choices there (from Microsoft itself). My best
tool now for examining files (not good for this
purpose), is the HxD hex editor. Finally, I can
edit a 30GB file with a hex editor, and it actually
runs at a decent speed. Now, if I could only
find a text editor that works that well.
Your requirements may very well be different from mine, but for me TextPad
and Notepad++ are both excellent text editors for the Windows platform. Of
the two, Notepad++ is my current choice. I have it running about 99% of the
time that the computer is running. That's how important it is for me.
--
Char Jackson
Paul
9 years ago
...
If I open a 100MB file, does it take five minutes
or five seconds ? :-)

I'm more used to making a meal here, while my text file opens.

Paul
Char Jackson
9 years ago
...
My biggest text file is only about 20MB and Notepad++ opened it in less than
a second. So I concatenated some smaller files until I had one that weighed
in at 110MB. Opening that one took approximately two seconds.
--
Char Jackson
Sjouke Burry
9 years ago
...
Notepad++: 10 sec for a 50 MByte star database. (XP Pro SP3, 2.6 GHz Celeron)