(fix) Ollama use new streaming format

This commit is contained in:
ishaan-jaff 2023-10-11 16:59:59 -07:00
parent 7b3ee8d129
commit fabad3dc42


@@ -1040,7 +1040,7 @@ def completion(
         else:
             response_string = ""
             for chunk in generator:
-                response_string+=chunk['choices'][0]['delta']['content']
+                response_string+=chunk['content']
             ## RESPONSE OBJECT
             model_response["choices"][0]["message"]["content"] = response_string
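The diff swaps the OpenAI-style chunk shape (`chunk['choices'][0]['delta']['content']`) for a flat `chunk['content']` lookup. A minimal sketch of the accumulation loop, assuming chunks arrive as plain dicts with a top-level `content` key (the `collect_stream` helper and the stubbed chunk list are illustrative, not part of the commit):

```python
def collect_stream(generator):
    """Concatenate streamed content chunks into one response string."""
    response_string = ""
    for chunk in generator:
        # New Ollama streaming format: content lives at the top level,
        # not under choices[0]['delta'].
        response_string += chunk["content"]
    return response_string

# Stub generator standing in for the Ollama stream:
chunks = [{"content": "Hello"}, {"content": ", "}, {"content": "world"}]
print(collect_stream(chunks))  # → Hello, world
```

With the old format, the same loop would raise a `KeyError` on `'choices'`, which is presumably why the accessor had to change.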