Stream: services-flake

Topic: Require opinion on `example/llm`


view this post on Zulip Shivaraj B H (Jun 14 2024 at 11:41):

I just wanted to get some opinions on two different ways example/llm can run (a rough config sketch of both variants follows at the end of this post):

  1. The example will pre-load the models and start Open WebUI. This ensures the user can immediately prompt the model once the WebUI launches. But this has a one-time wait period for pulling the model.
  2. The example will immediately start Open WebUI and the user can load whatever model they want from the “Admin Panel”.

Try out example1: nix run "github:juspay/services-flake?dir=example/llm" --override-input services-flake github:juspay/services-flake

Try out example2: nix run "github:drupol/services-flake/fix-open-webui-example?dir=example/llm" --refresh --override-input services-flake github:juspay/services-flake

PR for example2: https://github.com/juspay/services-flake/pull/227
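
For reference, a rough sketch of what the two variants might look like in the example's services-flake module. This is only a hedged illustration built from the options mentioned in this thread (models, dataDir); instance names like "ollama1" and "open-webui1" are illustrative, and the real example/llm may differ:

    # Variant 1: pre-load a model (one-time pull before the WebUI is usable).
    services.ollama."ollama1" = {
      enable = true;
      models = [ "llama2-uncensored" ];  # fetched once, then ready to prompt immediately
    };
    services.open-webui."open-webui1".enable = true;

    # Variant 2: no models pre-loaded; pull whatever you want from the Admin Panel.
    services.ollama."ollama1".enable = true;
    services.open-webui."open-webui1".enable = true;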

view this post on Zulip Srid (Jun 14 2024 at 13:52):

cc @Andreas who has tried this out before.

Perhaps we should invite the PR author (drupol) here.

view this post on Zulip Srid (Jun 14 2024 at 13:53):

I'm fine with pre-loading it. It is just an example, but it is a unique one: we are also using it for a demo (video).

view this post on Zulip Andreas (Jun 14 2024 at 13:54):

yeah sure, why not invite Drupol!

I've been debugging ollama on Docker anyway for the last 2 days, because the dual-GPU stack I run has issues of an unclear nature.

view this post on Zulip Srid (Jun 14 2024 at 13:54):

I also like dataDir to remain $HOME/.services-flake/ollama, because then I can always revisit the "app" by re-running nix run ... and play with the model locally. In a way, this example also demonstrates that you can use services-flake not only to run services in a development project, but also as an end-user "app" (where there's no concept of a flake project directory under $HOME, since nix run ... doesn't require it).
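
A hedged sketch of that override, using the attribute path discussed in this thread (the instance name is illustrative):

    # Keep state under $HOME so a later `nix run ...` from any directory
    # finds the same models and chats again.
    services.ollama."ollama1".dataDir = "$HOME/.services-flake/ollama";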

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 13:56):

Hey hi!

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 13:57):

I'm @drupol on Github.

view this post on Zulip Andreas (Jun 14 2024 at 13:57):

Hey @Pol Dellaiera and welcome!

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 13:57):

I joined to discuss this: https://github.com/juspay/services-flake/pull/227#issuecomment-2168096533

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 13:58):

@Srid Hey I'm here :)

view this post on Zulip Srid (Jun 14 2024 at 13:59):

Welcome :-)

view this post on Zulip Srid (Jun 14 2024 at 13:59):

(I posted about the formatter in #nix )

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:05):

How about finding a compromise?

services-flake is great because it can save data in the same user directory where the flake.nix belongs. Maybe we should establish some kind of ./data directory that is always there (equivalent to the ./state directory in direnv) and add a comment explaining that the user is free to change that directory?

Hiding the ollama LLMs under $HOME/.services-flake/ is definitely too intrusive for me already.

view this post on Zulip Srid (Jun 14 2024 at 14:06):

How are you using example/llm for your use case? As an actual app seeing everyday use?

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:07):

No, I use the NixOS service on my side, but I have plenty of colleagues I wish I could show the demo to. Nix is already quite magical for a lot of people, and I think it's a good idea to make things clear by not hiding where the files are going to be saved by default.

view this post on Zulip Srid (Jun 14 2024 at 14:08):

Pol Dellaiera said:

Nix is already quite magical for a lot of people and I think it's a good idea to make things clear by not hiding where the files are going to be saved by default.

Okay, fair enough. I'm happy with using default dataDir then.

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:09):

Cool, thanks! :) Do you want me to add a comment in the flake for dataDir?

view this post on Zulip Srid (Jun 14 2024 at 14:10):

Instead of removing it, you could just comment out # dataDir = ...

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:10):

yeah I was about to do that.

view this post on Zulip Srid (Jun 14 2024 at 14:10):

Also, I still like pre-loading models, so restore models = [ "llama2-uncensored" ]; as well.
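
Putting those two suggestions together, the relevant part of the example would presumably look something like this (sketch only; instance name illustrative):

    services.ollama."ollama1" = {
      enable = true;
      # Data stays in the working directory by default; uncomment to move it under $HOME:
      # dataDir = "$HOME/.services-flake/ollama";
      models = [ "llama2-uncensored" ];
    };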

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:11):

Sad. I would prefer to start with no LLM so users are free to use whatever they want.

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:11):

And starting with llama2-uncensored is... meh.

view this post on Zulip Srid (Jun 14 2024 at 14:12):

I mainly want the nix run ... invocation for this example to work 'out of the box' (with little manual poking) as much as possible. I have no preference on the specific model (smaller is generally better)

view this post on Zulip Andreas (Jun 14 2024 at 14:13):

I mean nowadays we could go llama3 ... or which one would you prefer?

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:13):

IMHO, I would not force users to download 4+ GB of models on their behalf. I would leave it empty.

view this post on Zulip Shivaraj B H (Jun 14 2024 at 14:14):

How about a middle ground? Use tinydolphin as the default pre-loaded model, so the user can quickly verify that they are able to prompt and get a response. After that, they are free to pull whatever model they want from the Admin Panel.
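
In config terms, that middle ground would presumably amount to just (sketch):

    models = [ "tinydolphin" ];  # small model, only there to verify prompting works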

view this post on Zulip Srid (Jun 14 2024 at 14:15):

How big is tinydolphin?

view this post on Zulip Andreas (Jun 14 2024 at 14:15):

637 MB it seems

view this post on Zulip Andreas (Jun 14 2024 at 14:16):

which means it will run on almost any GPU that is supported, which is nice

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:18):

so, tinydolphin?

view this post on Zulip Srid (Jun 14 2024 at 14:18):

I propose this then,

Have two examples,

For video demo (on X and for docs), we can use the latter example.

view this post on Zulip Srid (Jun 14 2024 at 14:19):

The latter can also serve as a demonstration of sharing services.

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:20):

As you wish guys... this is your project :)

view this post on Zulip Srid (Jun 14 2024 at 14:20):

Srid said:

For video demo (on X and for docs), we can use the latter example.

And in future, we can tell people "if you want to see what services-flake can do, run nix run ..." (pointing to this latter example)

view this post on Zulip Andreas (Jun 14 2024 at 14:21):

perhaps choose a slightly more specific name for examples/llm... idk what you guys typically go for, examples/ollama-preloaded maybe?

view this post on Zulip Srid (Jun 14 2024 at 14:21):

Yea, good idea

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:21):

By the way, how do you manage to not use --impure and write in the current directory? What's the magic trick?

view this post on Zulip Shivaraj B H (Jun 14 2024 at 14:26):

Cool, I will merge @Pol Dellaiera's PR once models is removed, so that it demonstrates only open-webui. I will rename example/llm to example/open-webui in a different PR, and then add example/ollama-preloaded in another one.

Sounds good?
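
If that plan lands, the examples tree would presumably end up roughly as follows (a naming sketch based on the messages above, not the actual repository layout):

    example/
      open-webui/         # today's example/llm, renamed; no models pre-loaded
      ollama-preloaded/   # pre-loads a small model (e.g. tinydolphin)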

view this post on Zulip Pol Dellaiera (Jun 14 2024 at 14:27):

Done.

view this post on Zulip Shivaraj B H (Jun 14 2024 at 14:30):

Pol Dellaiera said:

By the way, how do you manage to not use --impure and write in the current directory? What's the magic trick?

nix run doesn’t require pure evaluation mode

