This demo shows how to build a Retrieval-Augmented Generation (RAG) application
using [MariaDB](https://mariadb.com/), [LocalAI](https://localai.io/), and [Java](https://en.wikipedia.org/wiki/Java_(programming_language)).

**Note:** This demo uses an _RC version_ of MariaDB, which includes SQL syntax that might change in the next GA (stable) version.
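
For a sense of what that syntax looks like, the sketch below runs a nearest-neighbor search over product embeddings with plain JDBC. The table and column names (`products`, `embedding`), the connection settings, and the `VEC_DISTANCE_EUCLIDEAN`/`Vec_FromText` functions are assumptions based on the RC documentation and this demo's general setup, not the demo's actual code:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class VectorSearchSketch {

    public static void main(String[] args) throws Exception {
        // Hypothetical connection settings; adjust to match the docker-compose setup.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mariadb://localhost:3306/demo", "user", "password");
             PreparedStatement stmt = conn.prepareStatement(
                     // RC-version vector syntax: rank products by distance to a query embedding.
                     // Vec_FromText parses a JSON-style array of floats into a VECTOR value.
                     "SELECT name FROM products "
                     + "ORDER BY VEC_DISTANCE_EUCLIDEAN(embedding, Vec_FromText(?)) "
                     + "LIMIT 5")) {
            stmt.setString(1, "[0.12, -0.03, 0.57]"); // the user question's embedding (truncated here)
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("name"));
                }
            }
        }
    }
}
```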

## Prerequisites

…

Running `docker compose up -d` also creates the database schema and loads a [data set with around 1000 Walmart products](https://github.com/luminati-io/Walmart-dataset-samples/blob/main/walmart-products.csv).

Wait until the AI models have been downloaded successfully:

```shell
docker logs -f local-ai
```

Wait until you see the message _LocalAI API is listening!_ in the log.
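
If you would rather script this check than watch the logs, one option is to poll LocalAI's OpenAI-compatible API until it responds. The port (`8080`) and the `/v1/models` endpoint below are assumptions about how the compose file exposes LocalAI, so treat this as a sketch:

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class WaitForLocalAI {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Assumed endpoint: LocalAI's OpenAI-compatible model listing on the default port.
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://localhost:8080/v1/models")).GET().build();
        while (true) {
            try {
                HttpResponse<String> response =
                        client.send(request, HttpResponse.BodyHandlers.ofString());
                if (response.statusCode() == 200) {
                    System.out.println("LocalAI is up: " + response.body());
                    return;
                }
            } catch (IOException e) {
                // Connection refused while the container is still downloading models.
            }
            Thread.sleep(5_000); // poll every five seconds
        }
    }
}
```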

## Calculate the vector embeddings

To calculate the vector embeddings for all the products in the database, run:

```shell
./ComputeVectors.java
```

Be patient. This might take some time depending on your hardware.
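
Under the hood, this step boils down to asking LocalAI's OpenAI-compatible embeddings endpoint for a vector per product and writing it back to the table. The loop below is a minimal sketch of that idea, not the actual `ComputeVectors.java`: the model name, endpoint, table and column names, connection settings, and the hand-rolled JSON handling are all assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class ComputeVectorsSketch {

    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();
        // Hypothetical connection settings; the real demo reads its own configuration.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mariadb://localhost:3306/demo", "user", "password");
             Statement select = conn.createStatement();
             ResultSet rs = select.executeQuery("SELECT id, description FROM products");
             PreparedStatement update = conn.prepareStatement(
                     "UPDATE products SET embedding = Vec_FromText(?) WHERE id = ?")) {
            while (rs.next()) {
                update.setString(1, embed(http, rs.getString("description"))); // e.g. "[0.12,-0.03,...]"
                update.setInt(2, rs.getInt("id"));
                update.executeUpdate();
            }
        }
    }

    // Calls the (assumed) OpenAI-compatible embeddings endpoint and returns the raw
    // "[...]" float array from the response. A real implementation should use a JSON
    // library for escaping and parsing; this string handling is only illustrative.
    static String embed(HttpClient http, String text) throws Exception {
        String body = "{\"model\":\"bert-embeddings\",\"input\":\""
                + text.replace("\\", "\\\\").replace("\"", "\\\"") + "\"}";
        HttpRequest request = HttpRequest.newBuilder(URI.create("http://localhost:8080/v1/embeddings"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        String response = http.send(request, HttpResponse.BodyHandlers.ofString()).body();
        int start = response.indexOf('[', response.indexOf("\"embedding\""));
        return response.substring(start, response.indexOf(']', start) + 1);
    }
}
```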