
Commit ce01bb5

Commit message: docs
1 parent 3ef953e commit ce01bb5

File tree

10 files changed (+280 / -86 lines)

‎docs/docs/contributing/index.md‎

Lines changed: 5 additions & 5 deletions
```diff
@@ -46,7 +46,7 @@ changeset publish
 
 ## Add a monorepo package to the main convostack package
 
-All `@convostack/*` packages live in the `packages/` directory and follow the `convostack-package-name` folder naming
+All `convostack/*` packages live in the `packages/` directory and follow the `convostack-package-name` folder naming
 convention. To add a new package, reference one of the many existing packages.
 
 Just because a package has been defined in the `packages/` directory does NOT mean that it will automatically available
@@ -55,14 +55,14 @@ in the main `convostack` NPM package.
 In order to add a package to the main `convostack` package, you must:
 
 1. Add a new folder with the following naming convention to the `packages/convostack/src` folder. For example, for a
-   package named `@convostack/example-subpackage`, you would create the
+   package named `convostack/example-subpackage`, you would create the
    directory `packages/convostack/src/example-subpackage` and a corresponding `index.ts`
    file `packages/convostack/src/example-subpackage/index.ts` that would serve to re-export your original package from
    within the parent `convostack` package. The `index.ts` file should only contain one
-   line: `export * from '@convostack/example-subpackage';`
+   line: `export * from 'convostack/example-subpackage';`
 2. Add your package to the dependencies of `convostack`. Using the example from above, you would
-   add `"@convostack/example-subpackage": "*"` to `packages/convostack/package.json`'s `dependencies`. Please note that
-   for all of this to work for end users, you must publicly publish the `@convostack/example-subpackage` package to NPM,
+   add `"convostack/example-subpackage": "*"` to `packages/convostack/package.json`'s `dependencies`. Please note that
+   for all of this to work for end users, you must publicly publish the `convostack/example-subpackage` package to NPM,
 since `convostack` does not actually bundle these dependencies internally.
 3. To ensure that the entrypoints for imports are properly generated, you must also add your package to
 the `entrypoints` defined in the `packages/convostack/scripts/create-entrypoints.js` file. Using the example from
```
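
For reference, the re-export file from step 1, using the hypothetical `example-subpackage` name that the docs use as their running example, is a single line (a minimal sketch of exactly what the diff above describes):

```typescript
// packages/convostack/src/example-subpackage/index.ts
// Re-exports the standalone package so it is importable from the parent `convostack` package.
export * from 'convostack/example-subpackage';
```

The matching step 2 entry in `packages/convostack/package.json` is the `"convostack/example-subpackage": "*"` dependency shown in the diff.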

‎docs/docs/getting-started/index.md‎

Lines changed: 188 additions & 0 deletions
New file; full contents below:
---
id: "index"
title: "Quickstart"
sidebar_label: "Quickstart"
sidebar_position: 0.1
---

# Quickstart Guide

This tutorial walks you through quickly connecting AI agents to the ConvoStack chatbot playground.

We will use **Langchain** to create the AI agents and **ConvoStack** to connect those agents to a production-ready chatbot playground.

![ConvoStack Playground](../../static/img/dev-playground.png)

## Installation

```bash
npm install convostack langchain
```
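
The examples below use OpenAI models through Langchain; as an assumption about your local setup, an OpenAI API key is expected to be available in the environment before you run an agent:

```bash
# Assumed setup: Langchain's OpenAI wrappers read the API key from the OPENAI_API_KEY environment variable.
export OPENAI_API_KEY="your-openai-api-key"
```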
## Example 1: OpenAI Agent

In this example, we connect an OpenAI [LLM](https://js.langchain.com/docs/modules/models/llms/) to the chatbot playground.

```typescript
import { playground } from "convostack/playground";
import { OpenAI } from "langchain/llms/openai";

playground({
  async reply(context: IAgentContext): Promise<IAgentResponse> {
    // `humanMessage` is the content of each message the user sends via the chatbot playground.
    let humanMessage = context.getHumanMessage().content;
    // `agent` is the OpenAI agent we want to use to respond to each `humanMessage`.
    const agent = new OpenAI();
    // `call` is a simple string-in, string-out method for interacting with the OpenAI agent.
    const resp = await agent.call(humanMessage);
    // `resp` is the agent's generated response to the user's `humanMessage`.
    return {
      content: resp,
      contentType: "markdown",
    };
  },
});
```

**See the code above in action:**

![ConvoStack Quickstart Example 1](../../static/img/ex1.png)

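To try the snippet above locally, one option is to save it to a file, for example `agent.ts` (a hypothetical name used here for illustration), and start it with a TypeScript runner:

```bash
# Hypothetical file name; any TypeScript runner (ts-node, tsx, or a compile-and-run step) works.
npx ts-node agent.ts
```
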
## Example 2: LLM Chain

In this example, we construct an [LLMChain](https://js.langchain.com/docs/modules/chains/llm_chain) that takes a human message from the chatbot playground, formats it with a [PromptTemplate](https://js.langchain.com/docs/modules/prompts/prompt_templates/), and then passes the formatted prompt to an OpenAI agent.

The generated response of the agent will be streamed to the user via the chatbot playground.

```typescript
import { playground } from "convostack/playground";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  SystemMessagePromptTemplate,
} from "langchain/prompts";
import { LLMChain } from "langchain/chains";
import { ChatOpenAI } from "langchain/chat_models/openai";

playground({
  async reply(context: IAgentContext): Promise<IAgentResponse> {
    // `humanMessage` is the content of each message the user sends via the chatbot playground.
    let humanMessage = context.getHumanMessage().content;
    // We can now construct an LLMChain from a ChatPromptTemplate and a chat model.
    const chat = new ChatOpenAI({ streaming: true, temperature: 0 });
    // Pre-prompt the agent to be a language translator.
    const chatPrompt = ChatPromptTemplate.fromPromptMessages([
      SystemMessagePromptTemplate.fromTemplate(
        "You are a helpful assistant that translates {input_language} to {output_language}."
      ),
      HumanMessagePromptTemplate.fromTemplate("{text}"),
    ]);
    const chain = new LLMChain({
      prompt: chatPrompt,
      llm: chat,
    });

    // `resp` is the response of the OpenAI LLM chain translating `humanMessage` from English to French.
    const resp = await chain.call({
      input_language: "English",
      output_language: "French",
      text: humanMessage,
    });

    return {
      content: resp.text,
      contentType: "markdown",
    };
  },
});
```

**See the code above in action:**

![ConvoStack Quickstart Example 2](../../static/img/ex2.png)

## Example 3: LLM Chain With History

In this example, we connect an OpenAI [LLM](https://js.langchain.com/docs/modules/models/llms/) that remembers previous conversational back-and-forths using [Buffer Memory](https://js.langchain.com/docs/modules/memory/examples/buffer_memory) and `ConvoStackLangchainChatMessageHistory`.

The generated response of the agent will be streamed to the user via the chatbot playground.

```typescript
import { playground } from "convostack/playground";
import { ConvoStackLangchainChatMessageHistory } from "convostack/langchain-memory";
import { ChatOpenAI } from "langchain/chat_models/openai";
import {
  SystemMessagePromptTemplate,
  HumanMessagePromptTemplate,
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "langchain/prompts";
import { ConversationChain } from "langchain/chains";
import { BufferMemory } from "langchain/memory";

playground({
  async reply(
    context: IAgentContext,
    callbacks?: IAgentCallbacks
  ): Promise<IAgentResponse> {
    // `humanMessage` is the content of each message the user sends via the chatbot playground.
    let humanMessage = context.getHumanMessage().content;

    // Create a new OpenAI agent, with streaming
    const chat = new ChatOpenAI({
      modelName: "gpt-3.5-turbo",
      temperature: 0,
      streaming: true,
      callbacks: [
        {
          handleLLMNewToken(token: string) {
            // Stream tokens to ConvoStack
            callbacks?.onMessagePart({
              contentChunk: token,
            });
          },
        },
      ],
    });

    // Set up your prompts (note the placeholder for {history})
    const chatPrompt = ChatPromptTemplate.fromPromptMessages([
      SystemMessagePromptTemplate.fromTemplate(
        "The following is a friendly conversation between a human and an AI."
      ),
      new MessagesPlaceholder("history"),
      HumanMessagePromptTemplate.fromTemplate("{input}"),
    ]);

    // Set up the chain with a BufferMemory that pulls from the ConvoStack conversation history
    const chain = new ConversationChain({
      memory: new BufferMemory({
        // Use the ConvoStackLangchainChatMessageHistory class to prepare a Langchain-compatible version of the history
        chatHistory: new ConvoStackLangchainChatMessageHistory({
          // Pass the current conversation's message history for loading
          history: context.getHistory(),
        }),
        returnMessages: true,
        memoryKey: "history",
      }),
      prompt: chatPrompt,
      llm: chat,
    });

    // `resp` is the response of the OpenAI LLM chain to `humanMessage`, which was entered in the ConvoStack playground.
    const resp = await chain.call({
      input: humanMessage,
    });

    // Send the final response to ConvoStack
    return {
      content: resp.response,
      contentType: "markdown",
    };
  },
});
```

**See the code above in action:**

![ConvoStack Quickstart Example 3](../../static/img/ex3.png)

‎docs/docs/getting-started/quickstart-react-express-playground.md‎

Lines changed: 4 additions & 2 deletions
```diff
@@ -1,6 +1,8 @@
-# Quickstart Guide
+---
+sidebar_position: 0.2
+---
 
-The ConvoStack Playground monorepo is the fastest way to start exploring ConvoStack. If you like learning by example, then this is the quickstart for you!
+# Playground Repository
 
 To check out the playground without setting it up yourself, click [here](https://playground.convostack.ai) for a live
 demo!
```

‎docs/sidebars.js‎

Lines changed: 73 additions & 73 deletions
The hunk below removes and re-adds the entire `tutorialSidebar` block; apart from one line, the removed and re-added versions are identical, so the rewrite appears to be a formatting or indentation change. The substantive edit is the Getting Started link id:

```diff
@@ -13,79 +13,79 @@
 
 /** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
 const sidebars = {
   // By default, Docusaurus generates a sidebar from the docs folder structure
   tutorialSidebar: [
     "README",
     "the-basics",
     {
       type: "category",
       label: "Getting Started",
       link: {
         type: "doc",
-        id: "getting-started/quickstart-react-express-playground",
+        id: "getting-started/index",
       },
       items: [{ type: "autogenerated", dirName: "getting-started" }],
     },
     {
       type: "category",
       label: "Frontend",
       link: {
         type: "doc",
         id: "frontend/index",
       },
       items: [{ type: "autogenerated", dirName: "frontend" }],
     },
     {
       type: "category",
       label: "Backend",
       link: {
         type: "doc",
         id: "backend/index",
       },
       items: [{ type: "autogenerated", dirName: "backend" }],
     },
     {
       type: "category",
       label: "Production",
       link: {
         type: "doc",
         id: "production/index",
       },
       items: [{ type: "autogenerated", dirName: "production" }],
     },
     {
       type: "category",
       label: "TS/JS API Reference",
       link: {
         type: "doc",
         id: "ts-js-api/index",
       },
       items: [{ type: "autogenerated", dirName: "ts-js-api" }],
     },
     {
       type: "category",
       label: "GraphQL API Examples",
       items: [{ type: "autogenerated", dirName: "graphql-api-examples" }],
     },
     {
       type: "category",
       label: "GraphQL API Reference",
       link: {
         type: "doc",
         id: "graphql-api/index",
       },
       items: [{ type: "autogenerated", dirName: "graphql-api" }],
     },
     {
       type: "category",
       label: "Contributing",
       link: {
         type: "doc",
         id: "contributing/index",
       },
       items: [{ type: "autogenerated", dirName: "contributing" }],
     },
   ],
 };
 
 module.exports = sidebars;
```

‎docs/static/img/dev-playground.png‎

107 KB

‎docs/static/img/ex1.png‎

141 KB

‎docs/static/img/ex2.png‎

124 KB

‎docs/static/img/ex3.png‎

172 KB

‎examples/be-example-express-sqlite/CHANGELOG.md‎

Lines changed: 2 additions & 2 deletions
```diff
@@ -399,7 +399,7 @@
 - Update package metadata
 - Updated dependencies
 - convostack@0.0.4
-- @convostack/agent@0.0.4
+- convostack/agent@0.0.4
 
 ## 0.0.1
 
@@ -408,4 +408,4 @@
 - Update build and packaging strategy
 - Updated dependencies
 - convostack@0.0.3
-- @convostack/agent@0.0.3
+- convostack/agent@0.0.3
```

0 commit comments
