I just wanted to get some opinions on two different ways `example/llm` can run:
Try out example1: `nix run "github:juspay/services-flake?dir=example/llm" --override-input services-flake github:juspay/services-flake`
Try out example2: `nix run "github:drupol/services-flake/fix-open-webui-example?dir=example/llm" --refresh --override-input services-flake github:juspay/services-flake`
PR for example2: https://github.com/juspay/services-flake/pull/227
cc @Andreas who has tried this out before.
Perhaps we should invite the PR author (@drupol) here.
I'm fine with pre-loading it. It is just an example, but it is a unique one: we are also using it for a demo (video).
yeah sure, why not invite Drupol!
I've been debugging Ollama on Docker anyway for the last 2 days, because the dual-GPU stack I run has issues of an unclear nature.
I also like `dataDir` to remain `$HOME/.services-flake/ollama`, because then I can always revisit the "app" by re-running `nix run ...` and play with the model locally. In a way, this example also demonstrates that you can use services-flake not only to run services in a development project, but also as an end-user "app" (where there's no concept of a flake project directory under `$HOME`, since `nix run ...` doesn't require one).
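For context, the kind of service definition being discussed looks roughly like this, inside the example's process-compose module. This is a minimal sketch; the instance name `ollama1` and the exact option names are assumptions based on services-flake's ollama service:

```nix
{
  # Sketch: an ollama instance whose state lives outside any project
  # directory, so a later `nix run ...` finds the downloaded models again.
  services.ollama."ollama1" = {
    enable = true;
    dataDir = "$HOME/.services-flake/ollama";
  };
}
```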
Hey hi!
I'm @drupol on Github.
Hey @Pol Dellaiera and welcome!
I joined to discuss this: https://github.com/juspay/services-flake/pull/227#issuecomment-2168096533
@Srid Hey I'm here :)
Welcome :-)
(I posted about the formatter in #nix)
How about finding a compromise?
services-flake is great because it can save data in the same user directory where the flake.nix belongs. Maybe we should establish some kind of `./data` directory that is always there (equivalent to the `./state` directory in direnv) and add a comment explaining that the user is free to change that directory?
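A sketch of that compromise, under the same assumptions as the sketch above: state lives visibly next to `flake.nix`, with a comment inviting users to change it:

```nix
{
  services.ollama."ollama1" = {
    enable = true;
    # State lives visibly next to flake.nix (cf. direnv's ./state).
    # Users are free to change this, e.g. to "$HOME/.services-flake/ollama".
    dataDir = "./data/ollama";
  };
}
```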
Hiding the Ollama LLMs under `$HOME/.services-flake/` is definitely too intrusive for me already.
How are you using `example/llm` for your use case? As an actual app seeing everyday use?
No, I use the NixOS service on my side, but I have plenty of colleagues I wish I could show the demo to. But Nix is already quite magical for a lot of people, and I think it's a good idea to make things clear by not hiding where the files are going to be saved by default.
Pol Dellaiera said:
Nix is already quite magical for a lot of people, and I think it's a good idea to make things clear by not hiding where the files are going to be saved by default.
Okay, fair enough. I'm happy with using the default `dataDir` then.
Cool, thanks! :) Do you want me to add a comment in the flake for `dataDir`?
Instead of removing it, you could just comment it out: `# dataDir = ...`
yeah I was about to do that.
Also, I still like pre-loading models, so restore `models = [ "llama2-uncensored" ];` as well.
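Combining these two messages, the example's service definition would then look roughly like this (again a sketch, with the same assumed option and instance names as above):

```nix
{
  services.ollama."ollama1" = {
    enable = true;
    # Kept as a hint rather than removed; uncomment to persist models
    # under $HOME instead of the default data directory:
    # dataDir = "$HOME/.services-flake/ollama";

    # Pre-load a model so the demo works out of the box.
    models = [ "llama2-uncensored" ];
  };
}
```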
Sad. I would prefer to start with no LLM so users are free to use whatever they want.
And starting with `llama2-uncensored` is... meh.
I mainly want the `nix run ...` invocation for this example to work out of the box, with as little manual poking as possible. I have no preference on the specific model (smaller is generally better).
I mean, nowadays we could go with llama3... or which one would you prefer?
IMHO, I would not force users to download 4+ GB of models on their behalf. I would leave it empty.
How about a middle ground? Use `tinydolphin` as the default pre-loaded model, so the user can quickly verify that they are able to prompt and get a response. After which they are free to pull whatever model they want from the admin panel?
How big is `tinydolphin`?
637 MB it seems
which means it will run on almost any GPU that is supported, which is nice
so, `tinydolphin`?
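If that middle ground lands, it would be a one-line change relative to the sketch above (`tinydolphin` being the model name as published on the Ollama registry):

```nix
# Sketch: small default model, just enough to verify prompting works.
services.ollama."ollama1".models = [ "tinydolphin" ]; # ~637 MB download
```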
I propose this then. Have two examples:
- `examples/ollama-ui`: basically our current example, but with all of @Pol Dellaiera's suggestions (no custom `dataDir`; no pre-loaded models)
- `examples/llm`: a new example that `import`s the previous example (shared service) and sets only two things -- `dataDir` and `models` (here, we can use a good model without size restrictions)

For the video demo (on X and for docs), we can use the latter example. The latter can also serve as a demonstration of sharing services (rough sketch below).
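A rough sketch of what that second example could look like. This assumes the base example exports its process-compose config as a reusable module; the `processComposeModules.default` output name, the `ollama1` instance name, and the input paths are all illustrative, not the actual repository layout:

```nix
# examples/ollama-preloaded/flake.nix (sketch)
{
  inputs = {
    ollama-ui.url = "github:juspay/services-flake?dir=example/ollama-ui";
    nixpkgs.follows = "ollama-ui/nixpkgs";
    flake-parts.follows = "ollama-ui/flake-parts";
    process-compose-flake.follows = "ollama-ui/process-compose-flake";
  };
  outputs = inputs:
    inputs.flake-parts.lib.mkFlake { inherit inputs; } {
      systems = [ "x86_64-linux" "aarch64-linux" "x86_64-darwin" "aarch64-darwin" ];
      imports = [ inputs.process-compose-flake.flakeModule ];
      perSystem = { ... }: {
        process-compose."default" = {
          # Reuse everything from the base example (shared service)...
          imports = [ inputs.ollama-ui.processComposeModules.default ];
          # ...and set only the two things that differ.
          services.ollama."ollama1" = {
            dataDir = "$HOME/.services-flake/ollama";
            models = [ "llama2-uncensored" ];
          };
        };
      };
    };
}
```

Note that `$HOME` here is just a literal string at evaluation time; it would be expanded at runtime by the service, which is also why no impure evaluation is involved.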
As you wish guys... this is your project :)
Srid said:
For the video demo (on X and for docs), we can use the latter example.
And in the future, we can tell people "if you want to see what services-flake can do, run `nix run ...`" (pointing to this latter example).
Perhaps choose a slightly more specific name for `examples/llm`... idk what you guys typically go for; `examples/ollama-preloaded` maybe?
Yea, good idea
By the way, how do you manage to not use `--impure` and still write to the current directory? What's the magic trick?
Cool, I will merge @Pol Dellaiera's PR once `models` is removed, to demonstrate only `open-webui`. I will rename `example/llm` to `example/open-webui` in a different PR, and then add `example/ollama-preloaded` in another one. Sounds good?
Done.
Pol Dellaiera said:
By the way, how do you manage to not use `--impure` and still write to the current directory? What's the magic trick?
`nix run` doesn't require pure evaluation mode.