• boonhet@lemm.ee · +69/-9 · edited 12 hours ago

    AI AI AI AI

    Yawn

    Wake me up if they figure out how to make this cheap enough to put in a normal person’s server.

    • Zip2@feddit.uk · +98/-11 · 12 hours ago

      > normal person’s server.

      I’m pretty sure I speak for the majority of normal people when I say we don’t have servers.

      • Rose@slrpnk.net · +35 · 10 hours ago

        Yeah, when you’re a technology enthusiast, it’s easy to forget that your average user doesn’t have a home server - perhaps they just have a NAS or two.

        (Kidding aside, I wish more people had NAS boxes. It’s pretty disheartening to help someone find old media and they show you a giant box of USB sticks and hard drives. On a good day. I do have a USB floppy drive and a DVD drive, just in case.)

        • KnightontheSun@lemmy.world · +11 · 9 hours ago

          Hello, fellow home labber! I have a home-built Xpenology box, a Proxmox server with a dozen VMs, a Hackintosh, and a workstation with 44 cores running Linux. Oh, and a USB floppy drive. We are out here.

          I also like long walks in Oblivion.

          • MrPistachios@lemmy.today · +5 · 8 hours ago

            Man, Oblivion walks are the best, until a crazy woman comes at you trying to steal your soul with a fancy sword.

        • pivot_root@lemmy.world · +3 · 7 hours ago

          > It’s pretty disheartening to help someone find old media and they show you a giant box of USB sticks and hard drives.

          Equally disheartening is knowing that both of those have a shelf life. Old USB flash drives are more durable than the TLC/QLC cells we use today, but 15 years sitting unpowered in a box doesn’t leave very good prospects.
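          If you do end up rescuing a box like that, one cheap safeguard is recording checksums as you copy files off, so a later read can prove whether anything has silently rotted. A minimal Python sketch; the manifest filename and root-directory argument are placeholders:

          ```python
          # Walk a directory tree and write "checksum  path" lines, one per file.
          # Verify later with: sha256sum -c manifest.sha256
          import hashlib
          import os
          import sys

          def sha256_of(path, chunk_size=1 << 20):
              """Hash a file in 1 MiB chunks so large files don't need to fit in RAM."""
              digest = hashlib.sha256()
              with open(path, "rb") as f:
                  while block := f.read(chunk_size):
                      digest.update(block)
              return digest.hexdigest()

          root = sys.argv[1] if len(sys.argv) > 1 else "."
          with open("manifest.sha256", "w") as manifest:  # placeholder name
              for dirpath, _, filenames in os.walk(root):
                  for name in filenames:
                      path = os.path.join(dirpath, name)
                      manifest.write(f"{sha256_of(path)}  {path}\n")
          ```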

      • notabot@lemm.ee · +13 · 11 hours ago

        You… you don’t? Surely there’s some mistake, have you checked down the back of your cupboard? Sometimes they fall down there. Where else do you keep your internet?

        Apologies, I’m tired and that made more sense in my head.

    • gravitas_deficiency@sh.itjust.works · +4/-2 · 7 hours ago

      You can get a Coral TPU for 40 bucks or so.

      You can get an AMD APU with an NN-inference-optimized tile for under $200.

      Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.

      What price point are you trying to hit?
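      For scale, the Coral route really is plug-and-play. A minimal sketch of image classification through the pycoral helper APIs; the model, image, and label filenames are placeholders for anything compiled for the Edge TPU:

      ```python
      # Classify one image on a Coral Edge TPU using pycoral.
      from PIL import Image
      from pycoral.adapters import classify, common
      from pycoral.utils.dataset import read_label_file
      from pycoral.utils.edgetpu import make_interpreter

      # Any *_edgetpu.tflite model works; these filenames are placeholders.
      interpreter = make_interpreter("mobilenet_v2_edgetpu.tflite")
      interpreter.allocate_tensors()

      # Resize the input to whatever the model expects, then run inference.
      image = Image.open("photo.jpg").resize(common.input_size(interpreter))
      common.set_input(interpreter, image)
      interpreter.invoke()

      labels = read_label_file("imagenet_labels.txt")
      for klass in classify.get_classes(interpreter, top_k=3):
          print(labels.get(klass.id, klass.id), f"{klass.score:.4f}")
      ```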

      • boonhet@lemm.ee · +4 · 6 hours ago

        > What price point are you trying to hit?

        With regard to AI? None, tbh.

        With this super fast storage I have other cool ideas, but I don’t think I can get enough bandwidth to saturate it.

        • barsoap@lemm.ee · +1 · 4 hours ago

          > With regard to AI? None, tbh.

          TBH, that might be enough. Stuff like SDXL runs on 4 GB cards (the trick is using ComfyUI; think 5-10 s/it), and reportedly smaller LLMs do too (haven’t tried, not interested). The reason I’m eyeing a 9070 XT isn’t AI, it’s finally upgrading my GPU, but it would still be a massive fucking boost for AI workloads.
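          For reference, roughly the same low-VRAM trick outside ComfyUI, sketched with Hugging Face diffusers (the model ID is the public SDXL base; offloading keeps only the active submodule on the GPU):

          ```python
          # SDXL on a small GPU: fp16 weights plus CPU offloading.
          import torch
          from diffusers import StableDiffusionXLPipeline

          pipe = StableDiffusionXLPipeline.from_pretrained(
              "stabilityai/stable-diffusion-xl-base-1.0",
              torch_dtype=torch.float16,
              variant="fp16",
          )
          # Stream submodules to the GPU on demand; for ~4 GB cards the
          # heavier enable_sequential_cpu_offload() trades speed for memory.
          pipe.enable_model_cpu_offload()
          pipe.enable_vae_tiling()  # decode latents in tiles to cap VAE memory

          image = pipe("a lighthouse at dusk, oil painting",
                       num_inference_steps=30).images[0]
          image.save("out.png")
          ```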

        • gravitas_deficiency@sh.itjust.works · +1/-1 · 6 hours ago

          You’re willing to pay $none to have hardware ML support for local training and inference?

          Well, I’ll just say that you’re gonna get what you pay for.

          • bassomitron@lemmy.world · +3 · 5 hours ago

            No, I think they’re saying they’re not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.