Merge pull request #22 from jasonhp/main
feat: add new llm provider: Novita AI
evilsocket authored Oct 29, 2024
2 parents bf48414 + ad52234 commit cc78cf9
Showing 4 changed files with 63 additions and 2 deletions.
11 changes: 10 additions & 1 deletion README.md
@@ -34,7 +34,7 @@ While Nerve was inspired by other projects such as Autogen and Rigging, its main

## LLM Support

-Nerve features integrations for any model accessible via the [ollama](https://github.com/ollama/ollama), [groq](https://groq.com), [OpenAI](https://openai.com/index/openai-api/), [Fireworks](https://fireworks.ai/) and [Huggingface](https://huggingface.co/blog/tgi-messages-api#using-inference-endpoints-with-openai-client-libraries) APIs.
+Nerve features integrations for any model accessible via the [ollama](https://github.com/ollama/ollama), [groq](https://groq.com), [OpenAI](https://openai.com/index/openai-api/), [Fireworks](https://fireworks.ai/), [Huggingface](https://huggingface.co/blog/tgi-messages-api#using-inference-endpoints-with-openai-client-libraries) and [NovitaAI](https://novita.ai/model-api/product/llm-api) APIs.

**The tool will automatically detect if the selected model natively supports function calling. If not, it will provide a compatibility layer that empowers older models to perform function calling anyway.**

@@ -72,6 +72,15 @@ Refer to [this document](https://huggingface.co/blog/tgi-messages-api#using-infe
HF_API_TOKEN=your-api-key nerve -G "hf://[email protected]" ...
```

+For **Novita**:
+
+```sh
+NOVITA_API_KEY=your-api-key nerve -G "novita://meta-llama/llama-3.1-70b-instruct" ...
+```
+
+You can manage your API keys [here](https://novita.ai/settings#key-management) and browse the available models [here](https://novita.ai/model-api/product/llm-api).


## Example

Let's take a look at the `examples/ssh_agent` example tasklet (a "tasklet" is a YAML file describing a task and the instructions):
3 changes: 2 additions & 1 deletion nerve-core/Cargo.toml
@@ -47,10 +47,11 @@ serde_json = "1.0.120"
clap = { version = "4.5.6", features = ["derive"] }

[features]
default = ["ollama", "groq", "openai", "fireworks", "hf"]
default = ["ollama", "groq", "openai", "fireworks", "hf", "novita"]

ollama = ["dep:ollama-rs"]
groq = ["dep:groq-api-rs", "dep:duration-string"]
openai = ["dep:openai_api_rust"]
fireworks = ["dep:openai_api_rust"]
hf = ["dep:openai_api_rust"]
+novita = ["dep:openai_api_rust"]
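
Since each backend sits behind its own Cargo feature, downstream builds can opt out of the defaults and compile only the providers they need. A minimal sketch of such an invocation (the `-p nerve-core` package selection is an assumption about the workspace layout):

```sh
# Compile nerve-core with only the ollama and novita backends enabled
# (feature names taken from the [features] table above).
cargo build -p nerve-core --no-default-features --features "ollama,novita"
```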
10 changes: 10 additions & 0 deletions nerve-core/src/agent/generator/mod.rs
@@ -15,6 +15,8 @@ mod fireworks;
mod groq;
#[cfg(feature = "hf")]
mod huggingface;
+#[cfg(feature = "novita")]
+mod novita;
#[cfg(feature = "ollama")]
mod ollama;
#[cfg(feature = "openai")]
@@ -159,6 +161,7 @@ macro_rules! factory_body {
                $model_name,
                $context_window,
            )?)),
+            #[cfg(feature = "hf")]
            "hf" => Ok(Box::new(huggingface::HuggingfaceMessageClient::new(
                $url,
                $port,
@@ -172,6 +175,13 @@
                $model_name,
                $context_window,
            )?)),
+            #[cfg(feature = "novita")]
+            "novita" => Ok(Box::new(novita::NovitaClient::new(
+                $url,
+                $port,
+                $model_name,
+                $context_window,
+            )?)),
            _ => Err(anyhow!("generator '{}' not supported yet", $name)),
        }
    };
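
The `factory_body!` macro dispatches on the scheme prefix of the generator specifier, so `-G "novita://…"` reaches the new arm. A minimal sketch, assuming only the `scheme://model` format shown in the README (this is not the actual nerve parser):

```rust
// Split a generator specifier such as
// "novita://meta-llama/llama-3.1-70b-instruct" into the scheme that picks a
// factory_body! arm and the model name handed to the client constructor.
fn split_generator(spec: &str) -> Option<(&str, &str)> {
    spec.split_once("://")
}

fn main() {
    let (scheme, model) =
        split_generator("novita://meta-llama/llama-3.1-70b-instruct").unwrap();
    assert_eq!(scheme, "novita");
    assert_eq!(model, "meta-llama/llama-3.1-70b-instruct");
}
```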
41 changes: 41 additions & 0 deletions nerve-core/src/agent/generator/novita.rs
@@ -0,0 +1,41 @@
use anyhow::Result;
use async_trait::async_trait;

use crate::agent::{state::SharedState, Invocation};

use super::{openai::OpenAIClient, Client, ChatOptions};

pub struct NovitaClient {
    client: OpenAIClient,
}

#[async_trait]
impl Client for NovitaClient {
    fn new(_: &str, _: u16, model_name: &str, _: u32) -> anyhow::Result<Self>
    where
        Self: Sized,
    {
        // Novita exposes an OpenAI-compatible API, so this client wraps the
        // generic OpenAIClient with Novita's key variable and base URL.
        let client = OpenAIClient::custom(
            model_name,
            "NOVITA_API_KEY",
            "https://api.novita.ai/v3/openai/",
        )?;

        Ok(Self { client })
    }

    // Chat requests are delegated verbatim to the wrapped client.
    async fn chat(
        &self,
        state: SharedState,
        options: &ChatOptions,
    ) -> anyhow::Result<(String, Vec<Invocation>)> {
        self.client.chat(state, options).await
    }
}

#[async_trait]
impl mini_rag::Embedder for NovitaClient {
    // Embeddings are likewise served through the same OpenAI-compatible API.
    async fn embed(&self, text: &str) -> Result<mini_rag::Embeddings> {
        self.client.embed(text).await
    }
}
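
This delegation pattern makes further OpenAI-compatible providers cheap to add: wrap `OpenAIClient::custom` with a provider-specific key variable and base URL, then register it as this diff does (a feature in Cargo.toml, a `mod` declaration, and a `factory_body!` arm). A hypothetical module sketch following the same pattern — the `AcmeClient` name, `ACME_API_KEY` variable, and URL are illustrative, not part of the repository:

```rust
use anyhow::Result;
use async_trait::async_trait;

use crate::agent::{state::SharedState, Invocation};

use super::{openai::OpenAIClient, Client, ChatOptions};

pub struct AcmeClient {
    client: OpenAIClient,
}

#[async_trait]
impl Client for AcmeClient {
    fn new(_: &str, _: u16, model_name: &str, _: u32) -> Result<Self>
    where
        Self: Sized,
    {
        // Hypothetical endpoint; any OpenAI-compatible base URL works here.
        let client = OpenAIClient::custom(
            model_name,
            "ACME_API_KEY",
            "https://api.acme.example/v1/",
        )?;
        Ok(Self { client })
    }

    async fn chat(
        &self,
        state: SharedState,
        options: &ChatOptions,
    ) -> Result<(String, Vec<Invocation>)> {
        self.client.chat(state, options).await
    }
}
```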
