  • I’m still trying out combinations of hardware and models, but even my old Intel 8500T CPU generates at around reading speed with a stock version of Meta’s Llama 3.2 3B (maybe the one you tried), with mostly good output: fine for rewriting content, answering questions about uploaded document stores, etc. (A quick speed check is sketched below.)

    There are thousands of models tuned for different purposes, so a key question is what you want to use yours for. If you have a specific use in mind (e.g., writing SQL), you can usually find a much more efficient model for that task.
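
    If you want to sanity-check the “reading speed” claim on your own hardware, here is a minimal sketch against Ollama’s local REST API (assuming the default port 11434 and that llama3.2:3b has already been pulled; the prompt is just an example):

    ```python
    # Minimal sketch: measure generation speed from a local Ollama instance.
    # Assumes Ollama is listening on its default port and llama3.2:3b has
    # already been pulled ("ollama pull llama3.2:3b").
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3.2:3b",
            "prompt": "Rewrite this in plainer English: the cat sat on the mat.",
            "stream": False,  # return one JSON object instead of a token stream
        },
        timeout=300,
    )
    data = resp.json()

    # Ollama reports eval_count (tokens generated) and eval_duration (nanoseconds).
    tokens = data["eval_count"]
    seconds = data["eval_duration"] / 1e9
    print(data["response"])
    print(f"{tokens} tokens in {seconds:.1f}s = {tokens / seconds:.1f} tok/s")
    ```

    For reference, reading speed works out to roughly 4–6 tokens per second, so anything above that feels responsive enough for chat.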

  • I run Ollama with Open WebUI at home.

    A) The containers they run in can’t access the Internet by default; they only get access when we turn on web search or download new models. Ollama and Open WebUI are fairly popular projects, and I haven’t seen any evidence of nefarious activity so far.

    B) They build a profile, by design, of me and the family members who use them. We can add sensitive documents for the models to draw on.

    C) What they know is restricted to what we type and the documents we provide (roughly as in the sketch below).
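
    To make (C) concrete: Open WebUI handles documents with a proper retrieval pipeline, but the underlying idea is just putting your own text into the model’s context, roughly like this sketch against Ollama’s local chat API (the file name and question are made-up placeholders):

    ```python
    # Rough sketch of point C: the model only sees what we hand it.
    # Open WebUI uses a real retrieval pipeline; this simply pastes a
    # local document into the prompt. The path and question are placeholders.
    from pathlib import Path

    import requests

    document = Path("family_notes.txt").read_text()  # hypothetical local file

    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama3.2:3b",
            "messages": [
                {
                    "role": "system",
                    "content": "Answer using only this document:\n\n" + document,
                },
                {"role": "user", "content": "When is the next dentist appointment?"},
            ],
            "stream": False,
        },
        timeout=300,
    )
    print(resp.json()["message"]["content"])
    ```

    Nothing leaves the machine unless we explicitly enable a feature that needs the network, like web search or pulling a new model.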