
raid1 bootable disks



I didn't check my raid1 last May, at the time of a hot discussion on this list 
about bootable disks with raid1. I was busy putting mpqc to work. Now that it 
works beautifully - and I understand I can invest in it - I am beginning to be 
concerned with security issues. Not least because yesterday my window 
manager jwm hung during a four-day mpqc run, while mpqc was still 
running. It is wrong to start X and window managers for computation sessions, 
but I made that mistake (I just wanted to check the cpu temp, and I never 
properly learned how to switch between consoles with [Alt]+F2 without a 
window manager).

With Debian amd64, two SATA disks, and ext3 filesystems throughout, GRUB lives 
on its own /boot partition.
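
Before acting on the notes below I suppose I should first check the array 
itself. Assuming /boot sits on an md device such as /dev/md0 - a name I am 
guessing here, to be replaced with the real one - the members and their state 
can be listed with:

cat /proc/mdstat              # summary of all md arrays and their member partitions
mdadm --detail /dev/md0       # detailed state of the array assumed to hold /boot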

I have some notes from last May and I wonder where to turn my attention 
(and action) now:

Alexander Sieck
To enable booting from all disks in the RAID, you just need
to run 'grub-install /dev/xxy' where xxy is, e.g., hdb or sdb,
depending on your system, after you have installed grub within d-i.
Or, as written in the Software-RAID HOWTO, call grub and type:
grub> device (hd0) /dev/xxy
grub> root (hd0,0)
grub> setup (hd0)
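
If I read that correctly, then for my second disk, and assuming /boot is the 
first partition on each disk (that is my guess; the (hd0,0) below must point 
at the real /boot partition), the whole session would look something like:

grub                            # start the GRUB legacy shell as root
grub> device (hd0) /dev/sdb     # map (hd0) to the second disk for this session
grub> root (hd0,0)              # the partition holding /boot/grub on that disk
grub> setup (hd0)               # write stage1 into that disk's MBR
grub> quit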

Len Sorensen
I run grub-install /dev/sda and /dev/sdb, and both disks are perfectly
bootable.  I have tested this.
AND
grub-install seems to have issues with separate boot partitions.  If you
create a symlink inside /boot called boot pointing to itself, then
grub-install is fine with it.
cd /boot; ln -s . boot
With that there is no problem with grub on raid1 /boot.
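
Put together, and assuming my disks really are /dev/sda and /dev/sdb with 
/boot mounted as a separate partition (my layout, not necessarily anyone 
else's), that would be something like:

cd /boot && ln -s . boot        # work around grub-install's trouble with a separate /boot
grub-install /dev/sda           # install GRUB into the MBR of the first disk
grub-install /dev/sdb           # and into the second, so either disk can boot alone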

Goswin
You have to trick grub-install. Say /boot is on /dev/md0, a raid1 of
/dev/sda1 and /dev/sdb1. Then you can put /dev/sda1 into device.map,
run grub-install, repeat for sdb1. You might have to change fstab and/or
mtab as well. Can't remember if that is truly needed.
But grub-install also has some special code for raids. Why it doesn't
work or isn't good enough should be someone's TODO. Maybe we could make
that an NM job
HOWEVER
By setting up each device in
device.map in turn you get grub on each of them.
AND
If a disk fails the system should keep running. That's the point of
raid after all. So when you do reboot it will usually be to replace the
disk. So I'm not much concerned about whether the system is still bootable
with a broken disk still connected. It is likely the BIOS won't like that
at all no matter what the MBR looks like.
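
If I follow, then with /boot on /dev/md0 mirrored over /dev/sda1 and 
/dev/sdb1 (Goswin's example, which I expect matches my box), the per-disk 
trick would be roughly this - I write the whole-disk names into device.map, 
and whether the partition is wanted there instead, as Goswin writes, I 
cannot say:

# first pass: map (hd0) to the first member, then install
echo "(hd0) /dev/sda" > /boot/grub/device.map
grub-install /dev/sda
# second pass: remap (hd0) to the second member and install again
echo "(hd0) /dev/sdb" > /boot/grub/device.map
grub-install /dev/sdb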

I hope I have not misrepresented the suggestions by cutting parts of the 
e-mails.

Thanks a lot for your attention

francesco pietra




