
'OpenAI' object has no attribute 'embed_documents' #255

Closed Unanswered
VishwasK asked this question in Q&A

Getting this error when running the sample notebook in Colab. Do we need a fix for the RAG node?
--- Executing Fetch Node ---

AttributeError                            Traceback (most recent call last)
in <cell line: 2>()
      1 # execute the graph
----> 2 result = graph.execute({
      3     "user_prompt": "List me the example with their description",
      4     "url": "https://github.com/run-llama/llama_index/tree/main/docs/docs/examples"
      5 })

/usr/local/lib/python3.10/dist-packages/langchain_community/vectorstores/faiss.py in from_texts(cls, texts, embedding, metadatas, ids, **kwargs)
    928             faiss = FAISS.from_texts(texts, embeddings)
    929             """
--> 930     embeddings = embedding.embed_documents(texts)
    931     return cls.__from(
    932         texts,

AttributeError: 'OpenAI' object has no attribute 'embed_documents'
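For context on why this traceback appears: `FAISS.from_texts` simply calls `embedding.embed_documents(texts)`, so it expects an embeddings object (something like `OpenAIEmbeddings`), not a chat/completion LLM, which has no `embed_documents` method. The sketch below reproduces that duck-typing with hypothetical stand-in classes (`FakeLLM`, `FakeEmbeddings`, and a toy `from_texts` are illustrations, not real LangChain code):

```python
# Minimal sketch of the duck-typing that triggers the error.
# FakeLLM and FakeEmbeddings are hypothetical stand-ins, not real
# LangChain classes.

class FakeLLM:
    """Mimics an LLM: it can generate text but cannot embed documents."""
    def invoke(self, prompt: str) -> str:
        return "generated text"

class FakeEmbeddings:
    """Mimics an embeddings object: it exposes embed_documents()."""
    def embed_documents(self, texts):
        return [[0.0] * 3 for _ in texts]

def from_texts(texts, embedding):
    # Same call FAISS.from_texts makes at faiss.py line 930.
    return embedding.embed_documents(texts)

texts = ["doc one", "doc two"]
print(from_texts(texts, FakeEmbeddings()))  # works: two 3-dim vectors

try:
    from_texts(texts, FakeLLM())
except AttributeError as e:
    print(e)  # 'FakeLLM' object has no attribute 'embed_documents'
```

Passing the `OpenAI` LLM where the embedder belongs fails in exactly this way, which is why the fix is to supply a dedicated embeddings object.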


Replies: 2 comments 1 reply


Give the code

1 reply

Thanks @VinciGit00. I tried to run the example notebook https://colab.research.google.com/drive/1sEZBonBMGP44CtO6GQTwAlL0BGJXjtfd?usp=sharing in Google Colab and found an error in the "Build custom Graph" section. The first error is in:

rag_node = RAGNode(
    input="user_prompt & (parsed_doc | doc)",
    output=["relevant_chunks"],
    node_config={"llm": llm_model},
)

In this section I had to replace llm with llm_model for it to run, but then the error is thrown in the next section:

# execute the graph
result = graph.execute({
    "user_prompt": "List me the projects with their description",
    "url": "https://perinim.github.io/projects/"
})

# get the answer from the result
result = result.get("answer", "No answer found.")

I also tried to add an llm_embedder_model by defining a new variable pointing to text-embedding-3-small and modifying the config as below:

node_config={"llm_model": llm_model, "embedder_model": llm_model_embedder,}

Even after these modifications it did not work.
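One likely cause of the remaining failure, sketched below: the `embedder_model` must be an embeddings object (one with `embed_documents`), not a second LLM pointed at an embedding model name. This is a sketch only, assuming the `langchain_openai` package and assuming this ScrapeGraphAI version reads `embedder_model` from `node_config`; `OPENAI_API_KEY` is a placeholder for your own key.

```python
# Sketch, not a verified fix: assumes langchain_openai is installed and
# that this ScrapeGraphAI version accepts "embedder_model" in node_config.
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

llm_model = ChatOpenAI(api_key=OPENAI_API_KEY, model="gpt-3.5-turbo")

# The embedder must be an Embeddings object (it has embed_documents),
# not an LLM configured with an embedding model name.
embedder_model = OpenAIEmbeddings(
    api_key=OPENAI_API_KEY,
    model="text-embedding-3-small",
)

rag_node = RAGNode(
    input="user_prompt & (parsed_doc | doc)",
    output=["relevant_chunks"],
    node_config={"llm_model": llm_model, "embedder_model": embedder_model},
)
```

If the installed version predates the `embedder_model` option, upgrading the package (as suggested below) is probably the simpler route.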


OK, please update to the new version.

0 replies
