Stream: nixify-llm

Topic: sc - or how to AI editor


view this post on Zulip David Arnold (Feb 20 2024 at 17:31):

https://github.com/efugier/smartcat

I hope this brings the power into my new helix setup.

The key to making this work seamlessly is a good default prompt that tells the model to behave like a CLI tool and not write any unwanted text like markdown formatting or explanations.

view this post on Zulip David Arnold (Feb 20 2024 at 17:36):

@Srid just for my understanding.

The config has:

[openai]  # each supported api has their own config section with api and url
api_key = "<your_api_key>"
default_model = "gpt-4"
url = "https://api.openai.com/v1/chat/completions"

[mistral]
api_key_command = "pass mistral/api_key"  # you can use a command to grab the key
default_model = "mistral-medium"
url = "https://api.mistral.ai/v1/chat/completions"

Does this mean I could run ollama locally and use it as a backend?

view this post on Zulip Srid (Feb 20 2024 at 17:39):

ollama provides an API, but I don't know if it is sufficient to replace openai/mistral API etc.

https://github.com/ollama/ollama/blob/main/docs/api.md
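
Note: ollama also exposes an OpenAI-compatible chat endpoint (http://localhost:11434/v1/chat/completions), so a config section along the following lines might work. This is an untested sketch: the section name, the model, and whether smartcat accepts an arbitrary api section and ignores the api_key are assumptions to verify.

[ollama]  # hypothetical section, following the same pattern as above
api_key = "unused"  # ollama typically ignores the key, but the field may still be required
default_model = "mistral"  # any model you have pulled locally
url = "http://localhost:11434/v1/chat/completions"  # ollama's OpenAI-compatible endpoint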

view this post on Zulip Srid (Feb 20 2024 at 17:41):

A new helix topic in #nix or #offtopic maybe?

view this post on Zulip Tim DeHerrera (Feb 20 2024 at 17:45):

https://nixos.zulipchat.com/#narrow/stream/420166-offtopic/topic/helix/near/422470027

view this post on Zulip Tim DeHerrera (Feb 20 2024 at 17:46):

@David Arnold I DMed you this as well, but since you opened this up I might as well drop this here:
https://github.com/morph-labs/rift

view this post on Zulip David Arnold (Feb 20 2024 at 17:50):

Tim DeHerrera wrote:

David Arnold I DMed you this as well, but since you opened this up I might as well drop this here:
https://github.com/morph-labs/rift

This sort of seems like the other extreme, making the editor's LSP interface the upstream: https://github.com/morph-labs/rift/tree/main/rift-engine

Maybe a good approach is to start out with something like smartcat for now and then transition to the "Rift Code Engine" as it matures and gains adoption.

view this post on Zulip David Arnold (Feb 20 2024 at 17:59):

Once you've got that working from helix, you just have to select some text you want chatgpt to interact with, press the pipe | key, and write something like sc -r -c "write tests for that function", and the output of the model will be appended right after your selection in the current buffer. The openai api is a bit slow but apart from that I'm super satisfied with it!

view this post on Zulip Tim DeHerrera (Feb 20 2024 at 18:00):

For now, it hasn't been a major inconvenience to just ask an LLM in the browser to generate some code. I haven't even tried a more involved code assistant like co-pilot yet though, so maybe I just don't know what I'm missing :sweat_smile:

view this post on Zulip Shivaraj B H (Feb 20 2024 at 18:06):

I like to use co-pilot for assistance, but it’s annoying when it completes things on its own; that’s a bit distracting for me. I want something like this twitter post I shared earlier: https://x.com/victortaelin/status/1753593250340856179?s=46

i.e., auto-complete code when I ask the model to (avoiding the round-trip to the browser), and if it can also learn from my code base as I am working on it, that would be awesome.

view this post on Zulip David Arnold (Feb 20 2024 at 18:06):

I think the Rift Code Engine on one hand (a proper LSP "extension" for AI support) and the linux-y cat style on the other are probably the two ends of the spectrum.

I have seen some tutorials where you just write a code comment and then the AI does what you ask it for in the context of the current buffer.

That's sort of the immediacy that I think is worth a lot when living your day in a code editor.

But short of using VSCode and having access to the entire plugin system, it increasingly looks like smartcat is a good choice (combined with any OpenAI-compatible backend of one's liking via ollama et al).

I just wonder if I'm building the right mental model towards the "how to use AI in the editor" question?

view this post on Zulip David Arnold (Feb 20 2024 at 18:08):

Shivaraj B H wrote:

I like to use co-pilot for assistance, but it’s annoying when it completes things on its own, that’s a bit distracting for me.

Thanks for this data point!! :handshake: Exactly what those of us oldies who haven't gotten our hands dirty yet need to know :smile:

view this post on Zulip Tim DeHerrera (Feb 20 2024 at 18:08):

I think LSP integration would be an improvement, mostly because LSPs tend to be aware of the entirety of the code base; it was my understanding that current code assistants are only aware of open files.

view this post on Zulip Tim DeHerrera (Feb 20 2024 at 18:08):

Maybe it should be a proper LSP API though, instead of a standalone server.

view this post on Zulip David Arnold (Feb 20 2024 at 18:09):

because LSPs tend to be aware of the entirety of the code base; it was my understanding that current code assistants are only aware of open files

Yeah that's a good point. smartcat literally only knows about the selection, so it probably isn't all that powerful.

view this post on Zulip David Arnold (Feb 20 2024 at 18:10):

I wonder what @Shivaraj B H would say about smartcat: would it fit your desired use case better or worse (since it lacks broader context on the code base/buffer)? I'll just listen to your experience and leapfrog onto it :smile:

view this post on Zulip David Arnold (Feb 20 2024 at 18:12):

Wait!

-c, --context <CONTEXT>
glob pattern to give the matched files' content as context

That seems promising (and maybe just famously "good enough")?

view this post on Zulip Shivaraj B H (Feb 20 2024 at 18:12):

I will give smartcat a try and let you know my experience

view this post on Zulip David Arnold (Feb 20 2024 at 18:14):

I want to as well, but I'm still struggling with my key bindings in helix :smile:
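
For the record, a minimal sketch of what such a binding might look like in helix's config.toml, reusing the sc invocation from above (the key choice is arbitrary and this is untested):

[keys.select]
# in select mode, Alt-p pipes the current selection through sc, same as pressing | and typing the command by hand
"A-p" = ":pipe sc -r -c 'write tests for that function'"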

view this post on Zulip David Arnold (Feb 20 2024 at 18:18):

Here's another helix-specific one: https://github.com/leona/helix-gpt. It seems that it could also be combined with any OpenAI-compatible backend.

view this post on Zulip Tim DeHerrera (Feb 20 2024 at 18:20):

Oh nice, I haven't seen that one before, might try to wire it up

view this post on Zulip Shivaraj B H (Mar 23 2024 at 19:55):

I was caught up with other things, but I got around to trying out sc yesterday. Here’s what I think:

view this post on Zulip Shivaraj B H (Mar 23 2024 at 20:04):

I have already started on the last point that I mentioned; here’s some progress: https://nixos.zulipchat.com/#narrow/stream/426237-nixify-llm/topic/ollama/near/429110544

view this post on Zulip Shivaraj B H (Mar 23 2024 at 20:45):

Also, I have written a derivation to build smartcat, dumping it here:

{ lib, fetchFromGitHub, rustPlatform }:

rustPlatform.buildRustPackage rec {
  pname = "smartcat";
  version = "0.6.1";

  src = fetchFromGitHub {
    owner = "efugier";
    repo = pname;
    rev = version;
    hash = "sha256-na/Yt5B3nJ0OIeJKVHeoZc+V1OUyimp7PqY7SGARc5s=";
  };

  # the upstream tag apparently ships without a Cargo.lock; this patch (attached below) adds one
  cargoPatches = [
    ../patches/smartcat/add-Cargo.lock.patch
  ];

  cargoHash = "sha256-ifUHWPBidLXX5f2JfIw9TdyV+pVcRVWT1LmHyLHTVds=";

  meta = with lib; {
    description = ''
      Putting a brain behind `cat`.
      Integrating language models in the Unix commands ecosystem through text streams.
    '';
    homepage = "https://github.com/efugier/smartcat";
    license = licenses.asl20;
    maintainers = [ ];
  };
}

view this post on Zulip Shivaraj B H (Mar 23 2024 at 20:46):

You will also need this patch file: add-Cargo.lock.patch


Last updated: Nov 13 2024 at 11:45 UTC