qq_44162297 · 2024-05-01 20:56

Error calling Ollama embedding models in a RAG application

Running Ollama's embedding model throws an error, but I can't figure out the cause:

import ollama
import chromadb

documents = [
  "Llamas are members of the camelid family meaning they're pretty closely related to vicuñas and camels",
  "Llamas were first domesticated and used as pack animals 4,000 to 5,000 years ago in the Peruvian highlands",
  "Llamas can grow as much as 6 feet tall though the average llama between 5 feet 6 inches and 5 feet 9 inches tall",
  "Llamas weigh between 280 and 450 pounds and can carry 25 to 30 percent of their body weight",
  "Llamas are vegetarians and have very efficient digestive systems",
  "Llamas live to be about 20 years old, though some only live for 15 years and others live to be 30 years old",
]

client = chromadb.Client()
collection = client.create_collection(name="docs")

# store each document in a vector embedding database
for i, d in enumerate(documents):
  response = ollama.embeddings(model="mxbai-embed-large", prompt=d)
  embedding = response["embedding"]
  collection.add(
    ids=[str(i)],
    embeddings=[embedding],
    documents=[d]
  )

# an example prompt
prompt = "What animals are llamas related to?"

# generate an embedding for the prompt and retrieve the most relevant doc
response = ollama.embeddings(
  prompt=prompt,
  model="mxbai-embed-large"
)
results = collection.query(
  query_embeddings=[response["embedding"]],
  n_results=1
)
data = results['documents'][0][0]

# generate a response combining the prompt and data we retrieved in step 2
output = ollama.generate(
  model="llama2",
  prompt=f"Using this data: {data}. Respond to this prompt: {prompt}"
)

print(output['response'])


python3.11 /Users/admin/project/py/rag/ollama/ola1.py 
Traceback (most recent call last):
  File "/Users/admin/project/py/rag/ollama/ola1.py", line 18, in <module>
    response = ollama.embeddings(model="mxbai-embed-large", prompt=d)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/admin/anaconda3/lib/python3.11/site-packages/ollama/_client.py", line 198, in embeddings
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Users/admin/anaconda3/lib/python3.11/site-packages/ollama/_client.py", line 73, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError

Process finished with exit code 1
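The ResponseError is raised with an empty message here, which hides the real cause. Recent versions of the ollama Python client attach the server's reply to the exception, so a small wrapper makes the failure visible. This is a diagnostic sketch, assuming your installed client exposes the error and status_code attributes on ollama.ResponseError (check your version if it does not):

import ollama

try:
  response = ollama.embeddings(model="mxbai-embed-large", prompt="test")
  print("ok, embedding length:", len(response["embedding"]))
except ollama.ResponseError as e:
  # status 404 usually means the model has not been pulled yet;
  # a connection-level failure points at the server or a proxy in the way
  print("status:", e.status_code)
  print("error:", e.error)

A 404 is typically fixed by pulling the model first, e.g. ollama pull mxbai-embed-large on the command line or ollama.pull("mxbai-embed-large") from Python.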


7 answers

  • 稔°K 2024-08-20 09:54

    Just turn off the "ladder" (VPN/proxy) and it works.
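    That matches the traceback: the client talks to the local Ollama server over plain HTTP (http://localhost:11434 by default), and a system-wide VPN/proxy can intercept those requests so they never reach the server. If turning the proxy off entirely is not an option, a minimal sketch, assuming the HTTP_PROXY/HTTPS_PROXY environment variables are what is intercepting the call, is to exempt localhost before the client is created:

    import os

    # httpx (which the ollama client uses) reads proxy settings when the
    # client is constructed, so set this before importing/using ollama
    os.environ["NO_PROXY"] = "localhost,127.0.0.1"

    import ollama

    # pointing an explicit client at the default local address also avoids
    # any ambiguity about which host is being contacted
    client = ollama.Client(host="http://127.0.0.1:11434")
    response = client.embeddings(model="mxbai-embed-large", prompt="test")
    print(len(response["embedding"]))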

