So great, thanks!
Shivaraj B H has marked this topic as resolved.
Oh crap. `open-webui` 0.3.4 just landed in nixpkgs-unstable.
@Shivaraj B H Please re-run the `flake.lock` generation in `example/llm` so you'll get the newest open-webui release.
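For reference, regenerating a subdirectory flake's lock file is just a matter of running `nix flake update` from inside that directory (assuming a reasonably recent Nix with flakes enabled):

```shell
# Regenerate the lock file for the example/llm subflake so it picks up
# the latest open-webui release from nixpkgs-unstable.
cd example/llm
nix flake update   # rewrites flake.lock, pinning all inputs to their newest revisions
```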
@Pol Dellaiera we have automatic flake updates every week. It should get updated in 2 days (Sunday).
Even for subdirectories?!
Yes
Shivaraj B H said:
Pol Dellaiera we have automatic flake updates every week. It should get updated in 2 days (Sunday).
It suffers from this particular UX issue: https://github.com/DeterminateSystems/update-flake-lock/issues/82
Oh DetSys...
Srid said:
I propose this then. Have two examples:
- `examples/ollama-ui`: basically our current example, but with all of Pol Dellaiera's suggestions (no custom dataDir; no pre-loaded models)
- `examples/llm`: a new example that `import`s the previous example (shared service) and sets only two things — `dataDir` and `models` (here, we can use a good model without size restriction)
For the video demo (on X and for docs), we can use the latter example.
On second thought, I don't think we should do this. I'll open a PR.
What do you have in mind?
https://github.com/juspay/services-flake/pull/228
@Srid Why not have the `ollama1` data dir at the top-level in `~/.services-flake` instead of `~/.services-flake/llm`? I ask this because we follow the top-level convention even in the `data` dir, which is by default in `$PWD`.
Shivaraj B H said:
Srid Why not have the `ollama1` data dir at the top-level in `~/.services-flake` instead of `~/.services-flake/llm`? I ask this because we follow the top-level convention even in the `data` dir, which is by default in `$PWD`.
That makes sense. So `~/.services-flake/llm/data/...`?
Yes.
No, I meant at the top-level itself: `~/.services-flake/<service>`, in this case `~/.services-flake/ollama1`.
Because we consider `llm` a self-contained (services-flake) "app", it makes sense for it to have its own data directory.
(I'm back on Zulip again)
As I explained, I left because I didn't want to spread myself across too many communication platforms, and because I'm really used to Zulip's UI.
So I thought I had deleted my account... but apparently I only managed to disable it, I don't know how.
Anyway, thanks for re-enabling it.
Pol Dellaiera said:
This is the PHI3 model here. It's small and I like it. It's the default LLM on my open-webui instance.
This is very impressive.
I don't know what you guys think.
But I'll check your messages tomorrow. Time to go to bed. Cheers.
One idea is to have the flake app take the model name(s) via env vars, overriding what’s in flake config.
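One way such an override could be wired up (purely illustrative: `OLLAMA_MODELS` is a made-up variable name, and `builtins.getEnv` only returns a value under `--impure` evaluation):

```nix
# Hypothetical sketch: let an env var override the models list from flake config.
{ lib, ... }:
let
  envModels = builtins.getEnv "OLLAMA_MODELS"; # "" when unset (or in pure eval)
in {
  services.ollama."ollama1".models =
    if envModels == "" then [ "phi3" ]         # fall back to the flake config default
    else lib.splitString "," envModels;        # e.g. OLLAMA_MODELS="phi3,llama3"
}
```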
I just tried out `phi3`; it is quite nice and is only 2.4 GB. Should we switch the `example/llm` to use this instead of the 9 GB model?
You can make that call.
Last updated: Oct 12 2024 at 21:45 UTC