Fedora WS. I want to keep using an older version of the Oobabooga Textgen WebUI, but the 6.6 kernel build of the Nvidia open-source module with the x86 UEFI shim breaks compatibility with one of the Python *Blas packages.
Now I need to figure out how to build the kernel module for a distrobox (podman/docker) container. Is that even possible when I can't enroll my own UEFI Secure Boot keys in the factory bootloader?
I’m not 100% positive this machine has the right hardware MUX for running the Nvidia GPU independently of the Intel integrated GPU (it’s a laptop), but I really wish I could run the 3080 Ti completely independent of the display. I only use it for AI.
The documentation for everything Nvidia on Fedora is a mess of outdated info, and in my experience the information Nvidia itself provides is all wrong and downright malicious.
Has anyone gone through this recently, or can anyone help me fast-track it?
I’m not 100% sure I understand your setup, but it shouldn’t be possible to add a kernel module from inside a container. The container uses the host’s kernel and doesn’t have a kernel of its own.
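You can see this directly. A minimal sketch (the container name `textgen` is just a placeholder, swap in yours): the kernel release printed inside the container is the host’s.

```python
# Run this on the host: the kernel release it reports for "this environment"
# and the one reported from inside the distrobox container should be identical,
# because the container shares the host kernel.
import platform
import subprocess

def compare_kernels(container: str = "textgen") -> None:
    # Kernel release as seen by whatever environment runs this script.
    print("this environment:", platform.release())

    # Ask the container directly through distrobox for comparison.
    try:
        result = subprocess.run(
            ["distrobox", "enter", container, "--", "uname", "-r"],
            capture_output=True, text=True, check=False,
        )
        print("inside container:", result.stdout.strip() or result.stderr.strip())
    except FileNotFoundError:
        print("distrobox not found on PATH")

if __name__ == "__main__":
    compare_kernels()
```

So the only thing you can swap inside the container is the userspace side (CUDA libraries, Python packages), never the kernel module itself.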
I’m running Textgen in a Fedora WS 38 distrobox container, on an old version of Textgen from a month or more back. When I let the Fedora kernel update and rebuild, it broke the old version of Textgen. I tried the latest version of Textgen, and it works with the updated 6.6 kernel in a new container, but the changes the project has made have ruined it for me and I have no interest in continuing to run their version. I want to keep the old one. I rolled back the host kernel to 6.5 and the old Textgen works fine.
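For anyone hitting the same thing, a rough way to check whether the container’s driver userspace still matches the kernel module the host actually loaded (the usual way a host kernel/driver update breaks an otherwise untouched CUDA container). A sketch, assuming `nvidia-smi` is on PATH inside the container:

```python
# Compare the version of the loaded nvidia kernel module (shared with the host)
# against the driver version reported by the userspace stack the container uses.
import subprocess
from pathlib import Path

def loaded_module_version() -> str:
    # /sys/module/nvidia/version holds the version of the currently loaded module.
    path = Path("/sys/module/nvidia/version")
    return path.read_text().strip() if path.exists() else "nvidia module not loaded"

def userspace_driver_version() -> str:
    # Driver version as reported by the userspace libraries via nvidia-smi.
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=False,
    )
    return (result.stdout or result.stderr).strip()

if __name__ == "__main__":
    kmod = loaded_module_version()
    user = userspace_driver_version()
    print("kernel module :", kmod)
    print("userspace libs:", user)
    if kmod != user:
        print("mismatch: the container's driver userspace no longer matches the host's module")
```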
I swear I saw something about a runtime-loaded build of the Nvidia kernel module, but IIRC that requires enrolling UEFI keys so I can self-sign it.
I already tried loading my own keys into the bootloader, but they get rejected at the last step and the TPM chip overrides them. I have never tried booting into the UEFI environment with KeyTool; I’m afraid that will end up being the only way. I was hoping someone here might know the path a little better than my current fog-of-war state.
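Before fighting with KeyTool, it’s worth confirming what state Secure Boot is actually in, since module signing only matters if it’s enforcing. A minimal sketch, assuming `mokutil` is installed on the host:

```python
# Report whether Secure Boot is enforcing; the Nvidia module only needs to be
# signed (and the signing key enrolled as a MOK) when this says "enabled".
import subprocess

def secure_boot_state() -> str:
    # mokutil --sb-state prints "SecureBoot enabled" or "SecureBoot disabled".
    try:
        result = subprocess.run(
            ["mokutil", "--sb-state"],
            capture_output=True, text=True, check=False,
        )
        return (result.stdout or result.stderr).strip()
    except FileNotFoundError:
        return "mokutil not found on PATH"

if __name__ == "__main__":
    print(secure_boot_state())
```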
(I have a bunch of custom scripts and mods that the Textgen project broke in one of their major changes.)