Channel: Active questions tagged ubuntu - Stack Overflow

Why is the UEFI installation of Ubuntu 24.04 LTS behaving differently for mdadm during installation? [closed]


I am trying to set up mdadm RAID1 for the OS drive on Ubuntu 24.04 LTS, and I am not sure what is going on with UEFI: the installer is adding mount points to one of the drives even though I want no partitions on it, so that I can create the RAID1 across the two drives.

Please see the screenshot below.

[screenshot of the installer's storage configuration]

So I went ahead and set it up anyway, and this is how it looks:

root@ubuntu:~# lsblk
NAME        MAJ:MIN RM  SIZE RO TYPE  MOUNTPOINTS
sr0          11:0    1   16M  0 rom
xvda        202:0    0   16G  0 disk
├─xvda1     202:1    0  763M  0 part  /boot/efi
└─xvda2     202:2    0 15.3G  0 part
  └─md0       9:0    0 15.2G  0 raid1
    └─md0p1 259:0    0 15.2G  0 part  /
xvdb        202:16   0   16G  0 disk
├─xvdb1     202:17   0  763M  0 part
└─xvdb2     202:18   0 15.3G  0 part
  └─md0       9:0    0 15.2G  0 raid1
    └─md0p1 259:0    0 15.2G  0 part  /
xvdc        202:32   0   32G  0 disk
xvde        202:64   0   32G  0 disk
xvdf        202:80   0   32G  0 disk
xvdg        202:96   0   32G  0 disk

root@ubuntu:~# mdadm -D /dev/md0
/dev/md0:
           Version : 1.2
     Creation Time : Fri Sep 13 07:18:25 2024
        Raid Level : raid1
        Array Size : 15984640 (15.24 GiB 16.37 GB)
     Used Dev Size : 15984640 (15.24 GiB 16.37 GB)
      Raid Devices : 2
     Total Devices : 2
       Persistence : Superblock is persistent
       Update Time : Fri Sep 13 07:50:20 2024
             State : clean
    Active Devices : 2
   Working Devices : 2
    Failed Devices : 0
     Spare Devices : 0
Consistency Policy : resync
              Name : ubuntu-server:0
              UUID : f3d46673:304110ea:067c80bd:d0415d2b
            Events : 87

    Number   Major   Minor   RaidDevice State
       0     202        2        0      active sync   /dev/xvda2
       1     202       18        1      active sync   /dev/xvdb2

root@ubuntu:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           194M  1.1M  193M   1% /run
efivarfs        1.0G  1.5M 1023M   1% /sys/firmware/efi/efivars
/dev/md0p1       15G  4.4G  9.8G  31% /
tmpfs           970M     0  970M   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/xvda1      762M  6.2M  756M   1% /boot/efi
tmpfs           194M   12K  194M   1% /run/user/1000

As you can see, the EFI boot partition (/boot/efi) is still mounted from the first partition of the first disk, /dev/xvda1, and is not part of the RAID1 array.
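For reference, one way to confirm which ESP the firmware boot entry actually points at is to list the NVRAM entries and compare partition UUIDs (this assumes the efibootmgr package is installed; entry names and UUIDs will differ per system):

efibootmgr -v                  # verbose listing shows the partition GUID each Boot#### entry references
blkid /dev/xvda1 /dev/xvdb1    # compare the PARTUUIDs of the two EFI partitions against that listing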

Is this not problematic? I think I will just avoid UEFI boot firmware altogether.

Is this expected behavior, and will it cause any issues later on? In setting up RAID1 I want both drives to be exactly the same, and right now the EFI partition will not let that happen.

For example, if the drive holding /boot/efi is the one that fails, I hope this won't be a problem when relying on the protection of RAID1, but it seems set up for failure from the beginning.
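To frame the question: a common manual workaround (not something the installer did here) is to keep a second ESP on the other disk in sync and register it with the firmware. A rough sketch, assuming /dev/xvdb1 can be formatted as FAT32 and using /mnt/efi2 as an arbitrary temporary mount point:

# assumes /dev/xvdb1 is free to be (re)formatted; adjust device names to your layout
mkfs.vfat -F 32 /dev/xvdb1          # format the second disk's ESP as FAT32
mkdir -p /mnt/efi2
mount /dev/xvdb1 /mnt/efi2
rsync -a /boot/efi/ /mnt/efi2/      # copy the current ESP contents to the second ESP
# register a second NVRAM boot entry pointing at the copy (loader path is Ubuntu's standard shim)
efibootmgr --create --disk /dev/xvdb --part 1 \
    --label "ubuntu (second disk)" --loader '\EFI\ubuntu\shimx64.efi'
umount /mnt/efi2

The copy would go stale whenever shim or GRUB is updated, so it needs re-syncing after such updates, which is part of why I am asking whether the installer's behavior is expected.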

Please share your comments.

