There has been a noticeable shift toward AI integration over the last few months on other operating systems like Android, iOS, and Windows.

What are your thoughts on how, or whether, similar integration will take place within Linux?

  • nottheengineer@feddit.de · 1 year ago

    LLMs are big, so you either need a powerful PC to run them or use cloud services. Linux users tend to not be fans of either, so it’ll probably take a while before anything big happens.

    Besides, for the things where an LLM actually makes sense (like a copilot-style code generator), there are already implementations.
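
    For example, querying a locally hosted model only takes a few lines of Python. This is a rough sketch, not a recommendation: it assumes an Ollama server is already running on localhost with a code-oriented model pulled, and the model name and prompt are just placeholders.

    ```python
    # Rough sketch: ask a locally running LLM a question over HTTP.
    # Assumes an Ollama server on localhost:11434 and that a model
    # such as "codellama" has already been pulled; adjust to taste.
    import requests

    def ask_local_llm(prompt: str, model: str = "codellama") -> str:
        """Send a prompt to the local Ollama API and return the generated text."""
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        print(ask_local_llm("Write a Bash one-liner that lists the 10 largest files in /var."))
    ```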

    • waspentalive@kbin.social · 1 year ago

      I am a Debian user, and I can’t really say I am not a fan of “Big”. I have a laptop as my production machine, but I also have as big a file server as I can afford. I would not want an AI that is part of my OS unless it is local. I do use ChatGPT and Stable Diffusion, but only for non-critical functions.
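
      Running Stable Diffusion locally, for instance, is already practical. The snippet below is a rough sketch, assuming the Hugging Face diffusers and torch packages and a CUDA-capable GPU; the model id is only an example and is downloaded on first use.

      ```python
      # Rough sketch: run Stable Diffusion entirely on local hardware.
      # Assumes the "diffusers" and "torch" packages and a CUDA GPU;
      # the model id below is just an example checkpoint.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",
          torch_dtype=torch.float16,
      )
      pipe = pipe.to("cuda")

      # Generate one image and save it next to the script.
      image = pipe("a penguin reading a manual, digital art").images[0]
      image.save("penguin.png")
      ```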