Gemini support #30
base: master
Conversation
src/eca/llm_api.clj (Outdated)

```
@@ -34,6 +35,7 @@
        (string/join "\n" (subvec lines start end)))
      content)))

;; NOTE: this is not going to scale ;-)
```
yeah, I think we can easily create a util that receives the provider and does this check automagically; I was waiting for more cases to understand the usage
Yeah, I figured that up to this point there was no need for more provider abstractions, but I had to leave that comment after adding all the possible configs for Gemini. Perhaps a provider protocol would make sense? It would also simplify the dispatch code: half of the args are needed only once (auth, URLs) and could be pre-computed on a record instance just once.
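A rough sketch of what such a protocol could look like (all names below are hypothetical illustrations, not part of the actual eca codebase):

```clojure
;; Hypothetical sketch of the provider protocol idea discussed above.
;; Protocol/record names and fields are illustrative assumptions.
(defprotocol LLMProvider
  (completion! [this params] "Run a completion request against this provider.")
  (auth-headers [this] "Headers derived from pre-computed auth state."))

;; Auth and URL are resolved once, at construction time, so the
;; per-request functions don't have to re-derive them on every call.
(defrecord GoogleProvider [api-url api-key]
  LLMProvider
  (completion! [_ params]
    ;; would dispatch to the Google-specific implementation here
    {:url api-url :params params})
  (auth-headers [_]
    {"Authorization" (str "Bearer " api-key)}))
```

The dispatch `case` in `llm-api` would then collapse into constructing the right record once and calling `completion!` on it.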
src/eca/llm_api.clj (Outdated)

```
@@ -168,6 +210,7 @@
provider-fn (case (:api provider-config)
              "openai" llm-providers.openai/completion!
              "anthropic" llm-providers.anthropic/completion!
              "gemini" llm-providers.google/completion!
```
👍🏻
actually, I'll change that to "google" - it will follow the established pattern of using company names rather than product names (OpenAI vs. ChatGPT, Anthropic vs. Claude)
src/eca/llm_api.clj (Outdated)

```
;; NOTE: no :api-url here, because Google provider figures it out
;; because it depends on how we're authenticated
```
that's new to me, interesting - I wonder if there is any case where a user would want to customize the URL, but it's not a problem right now
I did some research (well... Perplexity did most of it ;-)), and outside of running the code within GCP there's no real support for providing custom API URLs, apart from custom telemetry endpoints. I'd say we add a TODO and wait for someone to request it.
So far it's looking good, LMK if you wanna discuss any TODO. BTW, we can start with supporting basic chat features and leave MCP/reasoning and other features to the next PRs - what do you think?
I think that makes sense - with all of the plumbing out of the way, chat was the next thing I was going to tackle
src/eca/llm_api.clj (Outdated)

```
  [:api-key-found "o4-mini"])
(when-let [ollama-model (first (filter #(string/starts-with? % config/ollama-model-prefix) (keys (:models db))))]
  [:ollama-running ollama-model])
(when (google-any-auth? config)
```
maybe this check should come before ollama?
yeah, good call - I just updated the order: external APIs first, then ollama, then the default
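The updated ordering could look roughly like this (a sketch only: `google-any-auth?`, `config/ollama-model-prefix`, and the `"o4-mini"` default come from the diff above, while the wrapping function and the Gemini model name are illustrative assumptions):

```clojure
;; Sketch of the check order described above: external API auth first,
;; then a running ollama model, then the default. The `or` short-circuits
;; on the first non-nil branch, which is what makes the ordering matter.
(defn default-model [db config]
  (or (when (google-any-auth? config)
        [:google-auth-found "gemini-2.5-pro"]) ;; model name is a guess
      (when-let [ollama-model (->> (keys (:models db))
                                   (filter #(string/starts-with? % config/ollama-model-prefix))
                                   first)]
        [:ollama-running ollama-model])
      [:api-key-found "o4-mini"]))
```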
Picks up from https://github.com/editor-code-assistant/eca/pull/17/files
Warning
Early days: this PR aims to provide basic functionality. Other features like web search and reasoning support will be implemented as follow-up PRs.
Authentication
TBD