also at beehaw

  • 2 Posts
  • 29 Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • So I’m no expert at running local LLMs, but I did download one (the 7B Vicuña model recommended by the LocalLLM subreddit wiki) and tried my hand at training a LoRA on some structured data I have.

    Based on my experience, the VRAM available to you is going to be way more of a bottleneck than PCIe speeds.

    I could barely hold a 7B model in 10 GB of VRAM on my 3080, so 8 GB might be impossible or at least very tight. IMO, to get good results with local models you really need large amounts of VRAM and to be using 13B or larger models.

    Additionally, when you’re training a LoRA, the model and the training data both get loaded into VRAM. My training dataset wasn’t very large, and even so I kept running into VRAM constraints during training.

    In the end I concluded that, in its current state, running a local LLM is an interesting exercise, but it’s only really great on enthusiast-level hardware with loads of VRAM (4090s, etc.).
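
    To make the VRAM point concrete, here’s a minimal sketch of the kind of setup I’m describing: loading a 7B model with 4-bit quantization and attaching a LoRA adapter with transformers + peft. The model name and hyperparameters below are placeholders, not exactly what I ran:

        # Sketch only: placeholder model name and hyperparameters, not a tested recipe.
        import torch
        from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
        from peft import LoraConfig, get_peft_model

        model_name = "lmsys/vicuna-7b-v1.5"  # any 7B causal LM works here

        # 4-bit quantization is what makes a 7B base model fit on ~8-10 GB cards at all.
        bnb_config = BitsAndBytesConfig(
            load_in_4bit=True,
            bnb_4bit_compute_dtype=torch.float16,
        )

        tokenizer = AutoTokenizer.from_pretrained(model_name)
        model = AutoModelForCausalLM.from_pretrained(
            model_name,
            quantization_config=bnb_config,
            device_map="auto",
        )

        # LoRA only trains small adapter matrices, but their activations, gradients,
        # and optimizer state still have to share VRAM with the base model.
        lora_config = LoraConfig(
            r=8,
            lora_alpha=16,
            target_modules=["q_proj", "v_proj"],
            task_type="CAUSAL_LM",
        )
        model = get_peft_model(model, lora_config)
        model.print_trainable_parameters()

    Even with the base weights quantized like this, the training-time activations and optimizer state need their own VRAM on top, which is where I kept hitting the ceiling.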

  • That’s helpful; this sounds like a Docker or qBittorrent issue then. qBittorrent’s default download location is /downloads, but you’d need to make sure to point it at the container-side path of the volume mapping you’re setting up in Docker.

    My relevant qBittorrent compose volume mapping is as follows:

        volumes:
          - /volume1/shared/torrents:/data/torrents
    

    Personally, I don’t separate my torrent downloads by type; I use incoming & completed folders. Here’s how I set up my qBittorrent config:

    Original Value → New Value
    Session\DefaultSavePath=/downloads/ → Session\DefaultSavePath=/data/torrents/1_completed/
    Session\TempPath=/downloads/incomplete/ → Session\TempPath=/data/torrents/2_incoming/
    Downloads\SavePath=/downloads/ → Downloads\SavePath=/data/torrents/1_completed/
    Downloads\TempPath=/downloads/incomplete/ → Downloads\TempPath=/data/torrents/2_incoming/
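
    For reference, that volume line sits inside a service definition roughly like the sketch below. I’m assuming the linuxserver.io qBittorrent image here, and the /config path, IDs, timezone, and port are just example values to adapt to your own setup:

        services:
          qbittorrent:
            image: lscr.io/linuxserver/qbittorrent:latest   # assuming the linuxserver.io image
            environment:
              - PUID=1000            # example user/group IDs
              - PGID=1000
              - TZ=Etc/UTC
              - WEBUI_PORT=8080
            volumes:
              - /volume1/shared/docker/qbittorrent:/config   # example config path
              - /volume1/shared/torrents:/data/torrents      # the mapping from above
            ports:
              - 8080:8080
            restart: unless-stopped

    The important part is that the save paths in the config above live under the container-side path of that mapping (/data/torrents) rather than the default /downloads.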



  • I really can’t tell - does it work on top of Markdown flat files or not? Based on the mention of an exporter, I’d guess not.

    Part of the reason I moved from TiddlyWiki to Obsidian was to get my writing into plain text files, so anything that doesn’t interface with the OS file system is off the table for me from the get-go. (I care about this partly so that I’m not locked into a specific app: I can use VS Code to browse and edit as needed, build a static site from my files, etc.)