I deployed version 1.9.1. When vectorizing with a local embedding model, the following error is reported:
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
2025-01-20 17:55:44 Vectorizing paragraph c8eb6ba6-d70c-11ef-be73-82e3c21abf47 failed with error: Expecting value: line 1 column 1 (char 0)
Traceback (most recent call last):
File "/opt/py3/lib/python3.11/site-packages/requests/models.py", line 974, in json
return complexjson.loads(self.text, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/json/init.py", line 346, in loads
return _default_decoder.decode(s)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/maxkb/app/apps/common/event/listener_manage.py", line 138, in embedding_by_paragraph
VectorStore.get_embedding_vector().batch_save(data_list, embedding_model, is_the_task_interrupted)
File "/opt/maxkb/app/apps/embedding/vector/base_vector.py", line 102, in batch_save
self._batch_save(child_array, embedding, is_the_task_interrupted)
File "/opt/maxkb/app/apps/embedding/vector/pg_vector.py", line 62, in _batch_save
embeddings = embedding.embed_documents(texts)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/maxkb/app/apps/setting/models_provider/impl/local_model_provider/model/embedding.py", line 44, in embed_documents
result = res.json()
^^^^^^^^^^
File "/opt/py3/lib/python3.11/site-packages/requests/models.py", line 978, in json
raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
This error suggests that the backend did not receive a JSON response from the local embedding model service. I am not sure whether something in my configuration is wrong.
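For debugging, here is a minimal sketch of how the raw response from the embedding endpoint could be inspected before JSON parsing. The URL and payload below are assumptions for illustration only and need to be replaced with the address and request format the local model service actually uses:

```python
import requests

# Hypothetical address of the local embedding service; replace it with the
# endpoint MaxKB is actually configured to call for the local model.
EMBED_URL = "http://127.0.0.1:8000/embed_documents"

# Hypothetical payload shape, for illustration only.
res = requests.post(EMBED_URL, json={"texts": ["test paragraph"]}, timeout=30)

print("status code :", res.status_code)
print("content type:", res.headers.get("Content-Type"))
# An empty body or an HTML error page here would explain the JSONDecodeError.
print("raw body    :", res.text[:500])

# Only parse as JSON when the response actually claims to be JSON.
if "application/json" in res.headers.get("Content-Type", ""):
    print(res.json())
```

If the raw body turns out to be empty or an HTML error page, the problem is likely on the model-service side (wrong port, model not loaded, or a proxy in between) rather than in the vectorization code itself.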