New Hardware + old disks not recognized



In brief: moved all the 3.5" disks from an old system to a new one,
and now I can't boot into buster.  In the initrd environment no disks
appear in /dev; the disks are all connected through an LSI Host Bus
Adapter card (only on the new system).  I can boot into Ubuntu on the
new system, and from there can see and use all the disks.


More details:
My old system was experiencing a lot of problems, so I got a new
one.  It came with Ubuntu 18.04 installed on an NVMe SSD and boots
into it fine.  I took all my 3.5" disks from the old
system and put them in the new one.  From within Ubuntu all the disks
are recognized and things seem fine.

I have not been able to boot into my old system (that is, the buster
installation on the old disks) since moving the disks to the new
system.  I boot using grub and initrd after picking the disk to boot
from in the BIOS; the root file system is an encrypted volume in an
LVM volume group.  /boot is on a separate, unencrypted partition.
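
For context, the stack looks schematically like this (device and LV
names are illustrative; vgbarley is the real volume group name):

  sdX                       one of the 3.5" disks, now behind the LSI HBA
  |- sdX1                   /boot (separate, unencrypted)
  `- sdX2                   LVM PV in volume group vgbarley
     `- <root lv>           logical volume holding the LUKS container
        `- <root lv>_crypt  decrypted root filesystem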

My current guess is that the key problem is that none of my disks are
recognized.  Using break=mount on the kernel command line, I
interrupted the initramfs and found /dev contained no sd* or nvme*
entries and no disks/ directory.  All are present under Ubuntu.
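
For concreteness, the check at the (initramfs) prompt was along
these lines (busybox built-ins only; /proc/partitions is a further
sanity check I'd suggest rather than output I saved):

  ls /dev/sd* /dev/nvme*   # nothing found; no /dev/disks/ either
  cat /proc/partitions     # should list real disks if the kernel sees any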

My leading suspect for why the hard disks aren't recognized is that
they are all attached through
02:00.0 Serial Attached SCSI controller: LSI Logic / Symbios Logic
SAS2116 PCI-Express Fusion-MPT SAS-2 [Meteor] (rev 02)
whereas before they were vanilla SATA (although at least one used a
SATA expansion card).  I am not using the card for RAID and was told
it didn't even support RAID.  The box says it's an LSI SAS 9201-16i
Host Bus Adapter.
The purpose of the card was just to permit connections to more drives,
but it certainly sounds like a different technology (I had no SAS
and no SCSI hardware before, though even plain SATA disks were
already handled through the kernel's SCSI layer, showing up as sd*).
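
For what it's worth, from the Ubuntu side the driver binding for the
card can be checked with lspci; given the module list further down I
would expect something like (output abbreviated):

  $ lspci -k -s 02:00.0
  02:00.0 Serial Attached SCSI controller: LSI Logic / Symbios Logic ...
          Kernel driver in use: mpt3sas
          Kernel modules: mpt3sas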

AFAIK the SSD does not go through the LSI HBA, but that's irrelevant
to booting into buster anyway.

Note the drive with /boot is attached through the LSI as well, and
the break clearly lands me inside its initrd, so the very start of
the bootstrap process can read that disk.  (Grub, though, reads disks
through the firmware/option ROM rather than a kernel driver, so that
doesn't prove the kernel can see the card.)

Ubuntu doesn't indicate it's running any proprietary drivers for
anything but video.  I do see an nvme driver loaded, as well as
scsi_transport_sas (used by mpt3sas) and some raid modules.  None of
the modules have LSI in the name (case-insensitive comparison).
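
One thing I intend to verify from Ubuntu is whether buster's initrd
contains mpt3sas at all.  Something like this should tell
(lsinitramfs ships with initramfs-tools; the mount point and image
name below are illustrative, adjust to whatever is actually there):

  sudo mount /dev/sdX1 /mnt                 # buster's /boot partition
  lsinitramfs /mnt/initrd.img-4.19.0-*-amd64 | grep -i mpt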

The symptoms when I don't break into the init scripts are somewhat
variable, but usually I get the messages
  Volume Group vgbarley not available
  Can not process volume group
repeated many times.  The system is unresponsive, but if it sits there
for a couple of minutes it says
  encrypted source ... for root does not exist
and drops into busybox.  Once in busybox, the problem noted above is
evident: no disks.
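
From that busybox shell, a quick test (assuming the module even made
it into the initrd) would be:

  modprobe mpt3sas   # then give it a moment
  ls /dev/sd*        # disks should appear if the driver was the only gap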

People on the net with similar symptoms (failure to find an LVM
volume group) seem to have solved it by ensuring that vgchange -ay
gets executed.  But with no disks, vgchange has nothing to work
with.
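
If the initrd turns out to lack mpt3sas, my rough plan (a sketch
only; device and LV names are illustrative, and it assumes the
encrypted root can be opened from Ubuntu) is to chroot in and
rebuild the initramfs with the module forced in:

  # from Ubuntu: activate the VG, open the LUKS LV, mount everything
  sudo vgchange -ay vgbarley
  sudo cryptsetup open /dev/vgbarley/rootlv rootlv_crypt  # LV name illustrative
  sudo mount /dev/mapper/rootlv_crypt /mnt
  sudo mount /dev/sdX1 /mnt/boot                          # buster's /boot
  for d in dev proc sys; do sudo mount --bind /$d /mnt/$d; done
  sudo chroot /mnt
  echo mpt3sas >> /etc/initramfs-tools/modules
  update-initramfs -u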

Any ideas?
Ross

