Wondering about services to test on either a 16gb ram “AI Capable” arm64 board or on a laptop with modern rtx. Only looking for open source options, but curious to hear what people say. Cheers!

  • ikidd@lemmy.world · 4 points · 4 days ago

    LM Studio is pretty much the standard. I think it’s open source except for the UI. Even if you don’t end up using it long-term, it’s great for getting used to a lot of the models.

    Otherwise there’s OpenWebUI, which I’d imagine would work as a Docker Compose setup, as I think there are ARM images for both OWU and ollama.
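A minimal sketch of what that compose file could look like, assuming the official `ollama/ollama` and `ghcr.io/open-webui/open-webui` images (both publish arm64 variants); ports, volume names, and model storage paths are illustrative choices, not requirements:

```yaml
# docker-compose.yml — OpenWebUI fronting a local ollama instance
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # persist downloaded models
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # UI served on http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
```

Then `docker compose up -d` and pull a model from the UI or with `docker compose exec ollama ollama pull <model>`.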

    • L_Acacia@lemmy.ml · 1 point · 27 minutes ago

      Well, they are fully closed source except for the open source project they are a wrapper around. The open source part is llama.cpp.

      • ikidd@lemmy.world · 1 point · 10 minutes ago

        Fair enough, but it’s damn handy and simple to use. And I don’t know how to do speculative decoding with ollama, which massively speeds up the models for me.
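For anyone curious, speculative decoding can be driven from llama.cpp directly: you pair the main model with a small same-family draft model that proposes tokens for the big one to verify. A hedged sketch, assuming a recent llama.cpp build (flag names have changed across versions, so check `llama-server --help`); the model file names are placeholders:

```shell
# -m:  main model, -md: small draft model from the same family
# --draft-max: cap on tokens drafted per verification step
# -ngl: layers to offload to the GPU
llama-server \
  -m  models/main-q4_k_m.gguf \
  -md models/draft-q4_k_m.gguf \
  --draft-max 16 \
  -ngl 99
```

The speedup depends on how often the draft model's guesses are accepted, so a draft model with the same tokenizer and similar training data matters more than its raw size.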