feat: support multi agent for ts #300

Merged

merged 32 commits into main from feat/support-multi-agent-for-ts on Sep 26, 2024

Commits
413593b
feat: update question to ask creating multiagent in express
thucpn Sep 18, 2024
622b84b
feat: add express simple multiagent
thucpn Sep 18, 2024
f464b40
fix: import from agent
thucpn Sep 18, 2024
0ebcb9f
Create yellow-jokes-protect.md
marcusschiesser Sep 19, 2024
f43f00a
create workflow with example agents
thucpn Sep 19, 2024
6c05872
remove unused files
thucpn Sep 19, 2024
2c7a538
update doc
thucpn Sep 19, 2024
5daf519
feat: streaming event
thucpn Sep 19, 2024
b875618
fix: streaming final result
thucpn Sep 19, 2024
b030a3d
fix: pipe final streaming result
thucpn Sep 19, 2024
33ce593
feat: functional calling agent
thucpn Sep 20, 2024
de5ba29
fix: let default max attempt 2
thucpn Sep 20, 2024
aff4f0c
fix lint
thucpn Sep 20, 2024
c4041e2
refactor: move workflow folder to src
thucpn Sep 20, 2024
f659721
refactor: share settings file for ts templates
thucpn Sep 20, 2024
54d74f8
fix: move settings.ts to setting folder
thucpn Sep 20, 2024
d69cd42
refactor: move workflow to components
thucpn Sep 20, 2024
054ee5b
Update templates/components/multiagent/typescript/workflow/index.ts
marcusschiesser Sep 23, 2024
7297edf
create ts multi agent from streaming template
thucpn Sep 23, 2024
3ebc3ec
remove copy express template
thucpn Sep 23, 2024
8cfabc5
enhance streaming and add handle tool call step
thucpn Sep 23, 2024
305296b
update changeset
thucpn Sep 23, 2024
ea3bbcf
refactor: code review
thucpn Sep 25, 2024
325c7ca
fix: coderabbit comment
thucpn Sep 25, 2024
45f7529
enable multiagent ts test
thucpn Sep 25, 2024
234b15e
fix: e2e apptype for nextjs
thucpn Sep 25, 2024
32c3d89
refactor: use context write event instead of append data annotation d…
thucpn Sep 25, 2024
7079b68
fix streaming
marcusschiesser Sep 25, 2024
6ecd5f8
Merge branch 'main' into feat/support-multi-agent-for-ts
marcusschiesser Sep 26, 2024
0679c37
fix: writer is just streaming
marcusschiesser Sep 26, 2024
fa45102
fix: clearly separate streaming events and content and use workflowEv…
marcusschiesser Sep 26, 2024
2fb502e
fix: add correct tool calls for tool messages
marcusschiesser Sep 26, 2024
5 changes: 5 additions & 0 deletions .changeset/yellow-jokes-protect.md
@@ -0,0 +1,5 @@
---
"create-llama": patch
---

Add multi-agent template for Express
15 changes: 14 additions & 1 deletion helpers/typescript.ts
@@ -33,7 +33,11 @@ export const installTSTemplate = async ({
   * Copy the template files to the target directory.
   */
  console.log("\nInitializing project with template:", template, "\n");
- const type = template === "multiagent" ? "streaming" : template; // use nextjs streaming template for multiagent
+ // default to the nextjs streaming template; Express gets the dedicated multiagent template
+ let type = "streaming";
+ if (template === "multiagent" && framework === "express") {
+   type = "multiagent";
+ }
  const templatePath = path.join(templatesDir, "types", type, framework);
  const copySource = ["**"];

@@ -145,6 +149,15 @@ export const installTSTemplate = async ({
    cwd: path.join(compPath, "engines", "typescript", engine),
  });

+ // copy settings file to engine folder
+ const settingPath = path.join(
+   compPath,
+   "engines",
+   "typescript",
+   "settings.ts",
+ );
+ await copy("settings.ts", enginePath, { cwd: settingPath });
+
  /**
   * Copy the selected UI files to the target directory and reference it.
   */
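For a quick trace of the template-type resolution above (hypothetical inputs; `templatesDir` is the CLI's bundled `templates` directory):

```ts
// template = "multiagent", framework = "express"
//   -> type = "multiagent" -> templates/types/multiagent/express
// any other template/framework combination
//   -> type = "streaming"  -> templates/types/streaming/<framework>
const templatePath = path.join(templatesDir, "types", type, framework);
```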
9 changes: 4 additions & 5 deletions questions.ts
@@ -410,10 +410,7 @@ export const askQuestions = async (
    return; // early return - no further questions needed for llamapack projects
  }

- if (program.template === "multiagent") {
-   // TODO: multi-agents currently only supports FastAPI
-   program.framework = preferences.framework = "fastapi";
- } else if (program.template === "extractor") {
+ if (program.template === "extractor") {
    // Extractor template only supports FastAPI, empty data sources, and llamacloud
    // So we just use example file for extractor template, this allows user to choose vector database later
    program.dataSources = [EXAMPLE_FILE];
@@ -424,7 +421,9 @@ export const askQuestions = async (
    program.framework = getPrefOrDefault("framework");
  } else {
    const choices = [
-     { title: "NextJS", value: "nextjs" },
+     ...(program.template === "multiagent"
+       ? []
+       : [{ title: "NextJS", value: "nextjs" }]), // NextJS is not supported for multiagent yet
      { title: "Express", value: "express" },
      { title: "FastAPI (Python)", value: "fastapi" },
    ];
103 changes: 103 additions & 0 deletions templates/types/multiagent/express/README-template.md
@@ -0,0 +1,103 @@
This is a [LlamaIndex](https://www.llamaindex.ai/) project using [Express](https://expressjs.com/) bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama).

## Getting Started

First, install the dependencies:

```
npm install
```

Second, generate the embeddings of the documents in the `./data` directory (if this folder exists - otherwise, skip this step):

```
npm run generate
```

Third, run the development server:

```
npm run dev
```

The example provides two different API endpoints:

1. `/api/chat` - a streaming chat endpoint (found in `src/controllers/chat.controller.ts`)
2. `/api/chat/request` - a non-streaming chat endpoint (found in `src/controllers/chat-request.controller.ts`)

You can test the streaming endpoint with the following curl request:

```
curl --location 'localhost:8000/api/chat' \
--header 'Content-Type: application/json' \
--data '{ "messages": [{ "role": "user", "content": "Hello" }] }'
```

And for the non-streaming endpoint run:

```
curl --location 'localhost:8000/api/chat/request' \
--header 'Content-Type: application/json' \
--data '{ "messages": [{ "role": "user", "content": "Hello" }] }'
```

You can start editing the API by modifying `src/controllers/chat.controller.ts` or `src/controllers/chat-request.controller.ts`. The endpoint auto-updates as you save the file.
You can delete the endpoint that you're not using.

## Production

First, build the project:

```
npm run build
```

You can then run the production server:

```
NODE_ENV=production npm run start
```

> Note that the `NODE_ENV` environment variable is set to `production`. In production, CORS is disabled for all origins unless the `PROD_CORS_ORIGIN` environment variable is set.
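To allow a specific origin in production, set the `PROD_CORS_ORIGIN` environment variable that `index.ts` reads (the domain below is a placeholder):

```
# .env
PROD_CORS_ORIGIN=https://your-frontend.example.com
```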

## Using Docker

1. Build an image for the Express API:

```
docker build -t <your_backend_image_name> .
```

2. Generate embeddings:

Parse the data and generate the vector embeddings if the `./data` folder exists - otherwise, skip this step:

```
# mount .env to pass ENV variables and configuration from your file system,
# and ./cache so the vector database is stored on your file system
docker run --rm \
  -v $(pwd)/.env:/app/.env \
  -v $(pwd)/config:/app/config \
  -v $(pwd)/data:/app/data \
  -v $(pwd)/cache:/app/cache \
  <your_backend_image_name> \
  npm run generate
```

3. Start the API:

```
# mount .env to pass ENV variables and configuration from your file system,
# and ./cache so the vector database is stored on your file system
docker run \
  -v $(pwd)/.env:/app/.env \
  -v $(pwd)/config:/app/config \
  -v $(pwd)/cache:/app/cache \
  -p 8000:8000 \
  <your_backend_image_name>
```

## Learn More

To learn more about LlamaIndex, take a look at the following resources:

- [LlamaIndex Documentation](https://docs.llamaindex.ai) - learn about LlamaIndex (Python features).
- [LlamaIndexTS Documentation](https://ts.llamaindex.ai) - learn about LlamaIndex (TypeScript features).

You can check out [the LlamaIndexTS GitHub repository](https://github.com/run-llama/LlamaIndexTS) - your feedback and contributions are welcome!
10 changes: 10 additions & 0 deletions templates/types/multiagent/express/eslintrc.json
@@ -0,0 +1,10 @@
{
"extends": ["eslint:recommended", "prettier"],
"rules": {
"max-params": ["error", 4],
"prefer-const": "error"
},
"parserOptions": {
"sourceType": "module"
}
}
5 changes: 5 additions & 0 deletions templates/types/multiagent/express/gitignore
@@ -0,0 +1,5 @@
# local env files
.env
node_modules/

output/
46 changes: 46 additions & 0 deletions templates/types/multiagent/express/index.ts
@@ -0,0 +1,46 @@
/* eslint-disable turbo/no-undeclared-env-vars */
import cors from "cors";
import "dotenv/config";
import express, { Express, Request, Response } from "express";
import { initObservability } from "./src/observability";
import chatRouter from "./src/routes/chat.route";

const app: Express = express();
const port = parseInt(process.env.PORT || "8000");

const env = process.env["NODE_ENV"];
const isDevelopment = !env || env === "development";
const prodCorsOrigin = process.env["PROD_CORS_ORIGIN"];

initObservability();

app.use(express.json({ limit: "50mb" }));

if (isDevelopment) {
console.warn("Running in development mode - allowing CORS for all origins");
app.use(cors());
} else if (prodCorsOrigin) {
console.log(
`Running in production mode - allowing CORS for domain: ${prodCorsOrigin}`,
);
const corsOptions = {
origin: prodCorsOrigin, // Restrict to production domain
};
app.use(cors(corsOptions));
} else {
console.warn("Production CORS origin not set, defaulting to no CORS.");
}

app.use("/api/files/data", express.static("data"));
app.use("/api/files/output", express.static("output"));
app.use(express.text());

app.get("/", (req: Request, res: Response) => {
res.send("LlamaIndex Express Server");
});

app.use("/api/chat", chatRouter);

app.listen(port, () => {
console.log(`⚡️[server]: Server is running at http://localhost:${port}`);
});
1 change: 1 addition & 0 deletions templates/types/multiagent/express/npmrc
@@ -0,0 +1 @@
node-linker=hoisted
45 changes: 45 additions & 0 deletions templates/types/multiagent/express/package.json
@@ -0,0 +1,45 @@
{
"name": "llama-index-express-multiagent",
"version": "1.0.0",
"exports": "./index.js",
"types": "./index.d.ts",
"type": "module",
"engines": {
"node": ">=18"
},
"scripts": {
"format": "prettier --ignore-unknown --cache --check .",
"format:write": "prettier --ignore-unknown --write .",
"build": "tsup index.ts --format esm --dts",
"start": "node dist/index.js",
"dev": "concurrently \"tsup index.ts --format esm --dts --watch\" \"nodemon --watch dist/index.js\""
},
"dependencies": {
"ai": "3.3.38",
"cors": "^2.8.5",
"dotenv": "^16.3.1",
"duck-duck-scrape": "^2.2.5",
"express": "^4.18.2",
"llamaindex": "0.6.2",
"pdf2json": "3.0.5",
"ajv": "^8.12.0",
"@e2b/code-interpreter": "^0.0.5",
"got": "^14.4.1",
"@apidevtools/swagger-parser": "^10.1.0",
"formdata-node": "^6.0.3"
},
"devDependencies": {
"@types/cors": "^2.8.16",
"@types/express": "^4.17.21",
"@types/node": "^20.9.5",
"concurrently": "^8.2.2",
"eslint": "^8.54.0",
"eslint-config-prettier": "^8.10.0",
"nodemon": "^3.0.1",
"prettier": "^3.2.5",
"prettier-plugin-organize-imports": "^3.2.4",
"tsx": "^4.7.2",
"tsup": "8.1.0",
"typescript": "^5.3.2"
}
}
3 changes: 3 additions & 0 deletions templates/types/multiagent/express/prettier.config.cjs
@@ -0,0 +1,3 @@
module.exports = {
plugins: ["prettier-plugin-organize-imports"],
};
31 changes: 31 additions & 0 deletions templates/types/multiagent/express/src/controllers/chat-config.controller.ts
@@ -0,0 +1,31 @@
import { Request, Response } from "express";
import { LLamaCloudFileService } from "llamaindex";

export const chatConfig = async (_req: Request, res: Response) => {
let starterQuestions = undefined;
if (
process.env.CONVERSATION_STARTERS &&
process.env.CONVERSATION_STARTERS.trim()
) {
starterQuestions = process.env.CONVERSATION_STARTERS.trim().split("\n");
}
return res.status(200).json({
starterQuestions,
});
};

export const chatLlamaCloudConfig = async (_req: Request, res: Response) => {
if (!process.env.LLAMA_CLOUD_API_KEY) {
return res.status(500).json({
error: "env variable LLAMA_CLOUD_API_KEY is required to use LlamaCloud",
});
}
const config = {
projects: await LLamaCloudFileService.getAllProjectsWithPipelines(),
pipeline: {
pipeline: process.env.LLAMA_CLOUD_INDEX_NAME,
project: process.env.LLAMA_CLOUD_PROJECT_NAME,
},
};
return res.status(200).json(config);
};
34 changes: 34 additions & 0 deletions templates/types/multiagent/express/src/controllers/chat.controller.ts
@@ -0,0 +1,34 @@
import { Message, StreamData, streamToResponse } from "ai";
import { Request, Response } from "express";
import { ChatMessage } from "llamaindex";
import { createWorkflow } from "../workflow";
import { toDataStream } from "../workflow/stream";
import { createStreamTimeout } from "./llamaindex/streaming/events";

export const chat = async (req: Request, res: Response) => {
const vercelStreamData = new StreamData();
const streamTimeout = createStreamTimeout(vercelStreamData);
try {
const { messages }: { messages: Message[] } = req.body;
const userMessage = messages.pop();
if (!messages || !userMessage || userMessage.role !== "user") {
return res.status(400).json({
error:
"messages are required in the request body and the last message must be from the user",
});
}

const chatHistory = messages as ChatMessage[];
const agent = await createWorkflow(chatHistory, vercelStreamData);
agent.run(userMessage.content);
const stream = toDataStream(agent.streamEvents(), vercelStreamData);
return streamToResponse(stream, res, {}, vercelStreamData);
} catch (error) {
console.error("[LlamaIndex]", error);
return res.status(500).json({
detail: (error as Error).message,
});
} finally {
clearTimeout(streamTimeout);
}
};
1 change: 1 addition & 0 deletions templates/types/multiagent/express/src/observability/index.ts
@@ -0,0 +1 @@
export const initObservability = () => {};
12 changes: 12 additions & 0 deletions templates/types/multiagent/express/src/routes/chat.route.ts
@@ -0,0 +1,12 @@
import express, { Router } from "express";
import { chatConfig } from "../controllers/chat-config.controller";
import { chat } from "../controllers/chat.controller";
import { initSettings } from "../controllers/engine/settings";

const llmRouter: Router = express.Router();

initSettings();
llmRouter.route("/").post(chat);
llmRouter.route("/config").get(chatConfig);

export default llmRouter;
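Together with the config controller above, this router exposes `GET /api/chat/config`. A usage sketch (assuming `CONVERSATION_STARTERS` in `.env` contains two newline-separated example questions):

```
curl --location 'localhost:8000/api/chat/config'
# → {"starterQuestions":["What is LlamaIndex?","How do agents work?"]}
```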