Commit 8b22628

Browse files
updated docs
1 parent ce01bb5 commit 8b22628

File tree

4 files changed

+130
-72
lines changed


‎docs/docs/frontend/fe-components.md‎

Lines changed: 11 additions & 0 deletions
@@ -6,6 +6,17 @@ sidebar_position: 0.2

 All ConvoStack components are originally styled in TailwindCSS and exported, so they don't interfere with the styling of your own website. This allows seamless integration with existing components on your site. There are four React components that make up our component library:

+To ensure ConvoStack frontend components import properly, add the following properties to your tsconfig file:
+
+```json
+{
+  ...
+  "moduleResolution": "node",
+  "esModuleInterop": true
+  ...
+}
+```
+
 ## ConvoStackWrapper

 The `ConvoStackWrapper` component serves as the entry point for integrating our chatbot widget into your site as it provides a shared Redux Toolkit store and React Query Client Provider for all ConvoStack components you choose to add within your application.
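The two properties in the tsconfig snippet above control how the compiler finds and imports modules. Here is a minimal sketch (not part of the commit) of the import styles involved, using only Node's built-in `path` module; `resolveEntry` is a hypothetical helper for illustration:

```typescript
// "moduleResolution": "node" makes the compiler resolve imports the way
// Node.js does (walking up through node_modules), which is how packages
// such as `convostack` are located.
// The namespace import form below compiles with or without esModuleInterop:
import * as path from "path";

// With "esModuleInterop": true you could instead use the default-import
// form that many package examples (including ConvoStack's) rely on:
//   import dotenv from "dotenv";

// Hypothetical helper, just to exercise the imported module:
export function resolveEntry(dir: string, file: string): string {
  return path.join(dir, file);
}
```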

‎docs/docs/getting-started/index.md‎

Lines changed: 57 additions & 10 deletions
@@ -13,26 +13,47 @@ We will be using **Langchain** for creating the AI agents and **ConvoStack** for

 ![ConvoStack Playground](../../static/img/dev-playground.png)

+## Clone Quickstart Repo
+
+To find all of the examples below in an already-created repository with the necessary dependencies, simply clone the ConvoStack Quickstart repo and follow the README to get started immediately.
+
+[Clone here!]
+
+If you would rather complete the walkthrough in your own existing TypeScript project, follow the steps below:
+
 ## Installation

 ```bash
-npm install convostack langchain
+npm install convostack langchain@0.0.67 dotenv
 ```
+
+After installing these dependencies, create a `.ts` file. For this example, we will create one called `index.ts`.
+
+Because we are using OpenAI for our AI agents below, create a `.env` file and set:
+
+```bash
+OPENAI_API_KEY=your_api_key
+```
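For intuition about what `dotenv.config()` does with that `.env` file, here is a minimal illustrative sketch. The real `dotenv` package also handles quoting, comments inside values, and multiline values; `parseEnv` and `loadEnv` are hypothetical helpers, not part of ConvoStack or dotenv:

```typescript
// Sketch of dotenv's core behavior: read KEY=value lines and copy them
// into process.env so code like `new OpenAI()` can pick up the API key.
export function parseEnv(contents: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of contents.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // skip blanks/comments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // ignore malformed lines
    vars[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return vars;
}

export function loadEnv(contents: string): void {
  for (const [key, value] of Object.entries(parseEnv(contents))) {
    // Like dotenv, never overwrite variables that are already set.
    if (process.env[key] === undefined) process.env[key] = value;
  }
}
```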
 ## Example 1: OpenAI Agent

 In this example, we are connecting an OpenAI [LLM](https://js.langchain.com/docs/modules/models/llms/) to the chatbot playground.

 ```typescript
+import * as dotenv from "dotenv";
+// Configures the OpenAI API key
+dotenv.config();
+
 import { playground } from "convostack/playground";
+import { IAgentContext, IAgentResponse } from "convostack/agent";
 import { OpenAI } from "langchain/llms/openai";

 playground({
-  reply(context: IAgentContext): Promise<IAgentResponse> {
+  async reply(context: IAgentContext): Promise<IAgentResponse> {
     // `humanMessage` is the content of each message the user sends via the chatbot playground.
     let humanMessage = context.getHumanMessage().content;
     // `agent` is the OpenAI agent we want to use to respond to each `humanMessage`
-    const agent = new OpenAI();
+    const agent = new OpenAI({ modelName: "gpt-3.5-turbo" });
     // `call` is a simple string-in, string-out method for interacting with the OpenAI agent.
     const resp = await agent.call(humanMessage);
     // `resp` is the generated agent's response to the user's `humanMessage`
@@ -44,7 +65,11 @@ playground({
 });
 ```

-**See the code above in action:**
+**See the code above in action via the following command:**
+
+```bash
+npx ts-node index.ts
+```

 ![ConvoStack Quickstart Example 1](../../static/img/ex1.png)
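The `reply` handler in Example 1 follows a simple contract: read the user's message from the context, ask an agent for a completion, and return the result. A self-contained sketch of that flow with a stubbed agent (no API key needed; `AgentContext`, `AgentResponse`, and `EchoAgent` are simplified illustrative stand-ins, not ConvoStack's actual interfaces):

```typescript
// Simplified stand-ins for the ConvoStack interfaces, for illustration only.
interface AgentContext {
  humanMessage: string;
}
interface AgentResponse {
  content: string;
}

// A stub agent with the same string-in, string-out `call` shape as
// Langchain's OpenAI class, so the flow runs offline.
class EchoAgent {
  async call(prompt: string): Promise<string> {
    return `You said: ${prompt}`;
  }
}

// The same shape as the playground's reply handler: message in, response out.
export async function reply(context: AgentContext): Promise<AgentResponse> {
  const agent = new EchoAgent();
  const resp = await agent.call(context.humanMessage);
  return { content: resp };
}
```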

@@ -55,7 +80,12 @@ In this example, we are constructing an [LLMChain](https://js.langchain.com/docs

 The generated response of the agent will be streamed to the user via the chatbot playground.

 ```typescript
+import * as dotenv from "dotenv";
+// Configures the OpenAI API key
+dotenv.config();
+
 import { playground } from "convostack/playground";
+import { IAgentContext, IAgentResponse } from "convostack/agent";
 import {
   ChatPromptTemplate,
   HumanMessagePromptTemplate,
@@ -65,11 +95,15 @@ import { LLMChain } from "langchain/chains";
 import { ChatOpenAI } from "langchain/chat_models/openai";

 playground({
-  reply(context: IAgentContext): Promise<IAgentResponse> {
+  async reply(context: IAgentContext): Promise<IAgentResponse> {
     // `humanMessage` is the content of each message the user sends via the chatbot playground.
     let humanMessage = context.getHumanMessage().content;
     // We can now construct an LLMChain from a ChatPromptTemplate and a chat model.
-    const chat = new ChatOpenAI({ streaming: true, temperature: 0 });
+    const chat = new ChatOpenAI({
+      streaming: true,
+      temperature: 0,
+      modelName: "gpt-3.5-turbo",
+    });
     // Pre-prompt the agent to be a language translator
     const chatPrompt = ChatPromptTemplate.fromPromptMessages([
       SystemMessagePromptTemplate.fromTemplate(
@@ -97,7 +131,11 @@ playground({
 });
 ```

-**See the code above in action:**
+**See the code above in action via the following command:**
+
+```bash
+npx ts-node index.ts
+```

 ![ConvoStack Quickstart Example 2](../../static/img/ex2.png)
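The `ChatPromptTemplate` in Example 2 fills named placeholders into a prompt string at call time, which is how the system message can be parameterized. A toy sketch of that templating idea (`formatPrompt` is a hypothetical helper, not Langchain's API):

```typescript
// Substitute {name}-style placeholders in a prompt template with values.
// Unknown placeholders are left intact so missing variables are visible.
export function formatPrompt(
  template: string,
  vars: Record<string, string>
): string {
  return template.replace(
    /\{(\w+)\}/g,
    (_match: string, name: string) => vars[name] ?? `{${name}}`
  );
}
```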

@@ -108,7 +146,12 @@ In this example, we are connecting an OpenAI [LLM](https://js.langchain.com/docs

 The generated response of the agent will be streamed to the user via the chatbot playground.

 ```typescript
+import * as dotenv from "dotenv";
+// Configures the OpenAI API key
+dotenv.config();
+
 import { playground } from "convostack/playground";
+import { IAgentContext, IAgentResponse } from "convostack/agent";
 import { ConvoStackLangchainChatMessageHistory } from "convostack/langchain-memory";
 import { ChatOpenAI } from "langchain/chat_models/openai";
 import {
@@ -121,7 +164,7 @@ import { ConversationChain } from "langchain/chains";
 import { BufferMemory } from "langchain/memory";

 playground({
-  reply(
+  async reply(
     context: IAgentContext,
     callbacks?: IAgentCallbacks
   ): Promise<IAgentResponse> {
@@ -137,7 +180,7 @@ playground({
     {
       handleLLMNewToken(token: string) {
         // Stream tokens to ConvoStack
-        callbacks.onMessagePart({
+        callbacks?.onMessagePart({
           contentChunk: token,
         });
       },
@@ -183,6 +226,10 @@ playground({
 });
 ```

-**See the code above in action:**
+**See the code above in action via the following command:**
+
+```bash
+npx ts-node index.ts
+```

 ![ConvoStack Quickstart Example 3](../../static/img/ex3.png)
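The streaming pattern in Example 3 boils down to this: the LLM invokes a callback once per generated token, each token is forwarded via `onMessagePart` (with optional chaining, since `callbacks` may be undefined), and the tokens also accumulate into the complete reply. A minimal offline sketch of that flow (`streamTokens` is illustrative, not part of ConvoStack or Langchain):

```typescript
// The per-token callback shape, mirroring the `onMessagePart` call above.
type TokenCallback = (part: { contentChunk: string }) => void;

// Simulate a streaming LLM: emit each token to the optional callback
// while accumulating the full response string to return at the end.
export function streamTokens(
  tokens: string[],
  onMessagePart?: TokenCallback
): string {
  let full = "";
  for (const token of tokens) {
    full += token;
    // Optional chaining, as in the example: a no-op when no callback is given.
    onMessagePart?.({ contentChunk: token });
  }
  return full;
}
```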

‎examples/fe-example-react/tsconfig.tsbuildinfo‎

Lines changed: 1 addition & 1 deletion
Large diffs are not rendered by default.
