Thursday, June 16, 2016

Replacing Drives in an Iomega/LenovoEMC NAS

A few years ago, I came across a situation at work where a drive had failed in an old Iomega ix4-200r NAS. Replacement drives for this unit were no longer available from the manufacturer, so I tried to swap out the bad drive for a new one of the same size that we already had on hand, but the NAS refused to recognize it. In their support forums, an Iomega tech said that it wasn't possible to mix and match drive models, so I would need to replace all 4 drives (something they obviously did not support).

I decided to try a full drive replacement. I had known for a long time that these NAS appliances were basically just Linux boxes, and once I logged into the unit, I discovered it was even using the standard Linux software RAID system. So, I acquired 4 brand new drives which were the same size as the old ones but a different model, and successfully replaced every drive in the unit.

Important Notes:
  • This procedure is intended to be used only when the NAS has suffered a drive failure and will no longer boot.
  • These recovery steps will only work if you have at least one old drive from the NAS available.
  • If more than one drive has failed and the NAS is using RAID 5, then the stored data will be lost, though the operating system can still be recovered.
  • The replacement drives do not have to be the same model as the original drives, but all of the replacement drives need to be the same model as each other.
  • This procedure was successfully used to restore an Iomega ix4-200r to working condition after the boot drive failed. These steps should also work on any Iomega/LenovoEMC NAS of a similar generation, but this has not been tested.

What You'll Need:
  • Monitor and keyboard
  • Replacement drive(s)
  • Physical access to the NAS
  • Administrator password for the NAS

How to Do It:
  1. Connect a keyboard and monitor to the NAS, because you will need to run commands once it is booted.
  2. Boot the NAS with the old HDD in bay 1 and the other bays empty. Log in as root.

    The default root password is soho if the unit is using factory defaults; otherwise it is soho with your admin password appended (e.g., sohoabcd9876 if the admin password is abcd9876).
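    In shell terms, the rule above is simple string concatenation. A quick sketch using the example password from the text (not a real credential, and not an Iomega tool):

```shell
# Derive the root password from the admin password, per the rule above.
# "abcd9876" is the example admin password from the text.
admin_pw="abcd9876"
root_pw="soho${admin_pw}"   # with factory defaults (no admin password), it is just "soho"
echo "$root_pw"             # prints: sohoabcd9876
```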

  3. Once the NAS is booted, insert a new drive into bay 2. After a few seconds, the RAID driver will begin to mirror the first partition from disk 1 onto disk 2.
  4. Monitor the status of the RAID operation by running the command mdadm -v -D /dev/md0 every 30 seconds or so.
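    Rather than rerunning mdadm by hand, you can also read /proc/mdstat, which reports rebuild progress in a single line (and, if the unit's shell includes watch, something like watch -n 30 cat /proc/mdstat saves the retyping). A sketch of pulling the percentage out of that line; the sample output below is illustrative, not captured from an ix4-200r:

```shell
# On the NAS you would read the live file, e.g.: grep recovery /proc/mdstat
# Here a sample mdstat progress line stands in for illustration.
sample='[=>...........]  recovery =  7.5% (73728/983040) finish=2.1min speed=7168K/sec'
pct=$(printf '%s\n' "$sample" | grep -o '[0-9.]*%')
echo "rebuild at $pct"   # prints: rebuild at 7.5%
```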
  5. Once the RAID mirror of disk 2 is complete, insert a new drive into bay 3 and repeat Step 4.
  6. Once the RAID mirror of disk 3 is complete, insert a new drive into bay 4 and repeat Step 4.
  7. Once the RAID mirror of disk 4 is complete, remove disk 1 (the old drive) and insert a new drive into bay 1, then repeat Step 4.

    *** DO NOT REBOOT THE NAS! ***

    At this point, all four of the new drives are installed and the OS partition has been mirrored to all of them. However, the MBR of the old drive has not been mirrored to the new drives, so the GRUB bootloader is not installed in the MBR of the new boot drive. To solve this, you have to install GRUB into the MBR of disk 1, and preferably into the MBR of each drive, so that any other drive could be used for booting should the first drive fail. You can do this from the GRUB command shell.

  8. Run /boot/ginstall/grub

    In the GRUB command shell, run the following commands in order:

    device (hd0) /dev/sdb
    root (hd0,0)
    setup (hd0)
    device (hd0) /dev/sdc
    root (hd0,0)
    setup (hd0)
    device (hd0) /dev/sdd
    root (hd0,0)
    setup (hd0)
    device (hd0) /dev/sde
    root (hd0,0)
    setup (hd0)

    At this point, the drives are designated /dev/sdb, /dev/sdc, /dev/sdd, and /dev/sde because /dev/sda was the old drive, and that drive was removed. When you reboot, the drives will once again be sda, sdb, etc., but this does not affect the RAID or anything else as far as I can tell.
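    The twelve GRUB commands above follow an obvious pattern, so they can also be generated with a small shell loop and pasted into the GRUB shell. Legacy GRUB also has a --batch mode for non-interactive input, though whether this particular build supports it is an assumption I haven't verified:

```shell
# Emit the device/root/setup triplet for each of the four drives.
for d in sdb sdc sdd sde; do
  printf 'device (hd0) /dev/%s\nroot (hd0,0)\nsetup (hd0)\n' "$d"
done
# If this build of legacy GRUB supports batch mode, the loop's output
# could be piped straight into it:  ... | grub --batch
```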

  9. Reboot the NAS

    After the reboot, the NAS should come back up and function like new. These instructions only restore the OS partition; you may need to use the web interface to rebuild the data storage volume, in which case the NAS will be unusable for a few hours while the volume is rebuilt.
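Once the unit is back up, it is worth confirming the OS mirror is intact before walking away. In /proc/mdstat, a healthy four-member RAID 1 set shows "[4/4] [UUUU]". A sketch of checking for that marker; the sample status line is illustrative, not taken from an actual ix4-200r:

```shell
# On the NAS itself you would test the live file:
#   grep -q '\[4/4\]' /proc/mdstat && echo "mirror healthy"
# A sample status line stands in here for illustration.
status='2097088 blocks [4/4] [UUUU]'
case "$status" in
  *'[4/4]'*) echo "mirror healthy"  ;;
  *)         echo "mirror degraded" ;;
esac
# prints: mirror healthy
```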