I have tried various solutions to set up BLLM with a free model such as Ollama's Llama3.2 or Llava, but BLLM does not seem to accept these in any way.
Setting up a local model gives the error "... does not support tools/functions". I tried both of the mentioned options, starting a local model directly and using Docker, and got the same error.
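As a sanity check, tool support can be probed directly against the Ollama server, bypassing BLLM entirely. This is a minimal sketch assuming the official `ollama` Python client and a server on the default port; the tool definition is a made-up placeholder:

```python
# Minimal sketch: probe whether a model accepts tool calls, bypassing BLLM.
# Assumes the official ollama Python client and a local Ollama server.
import ollama

# Hypothetical tool definition, used only to test tool support.
tools = [{
    "type": "function",
    "function": {
        "name": "get_time",
        "description": "Return the current time",
        "parameters": {"type": "object", "properties": {}},
    },
}]

resp = ollama.chat(
    model="llama3.2",  # Llava is a vision model and does not advertise tool support
    messages=[{"role": "user", "content": "What time is it?"}],
    tools=tools,
)
print(resp.message.tool_calls)  # attribute access on recent client versions
```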
Looking into the code, the _get_model() function in helpers/llm_models.py conditions on the available models, but every one of them returns the error:
client._types.ResponseError: model 'xxxxx' not found (status code: 404).
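A 404 from Ollama usually means the requested model name has not been pulled on that server. This can be verified independently of BLLM; a minimal sketch with the official `ollama` Python client (the model name is an example):

```python
# Minimal sketch: check whether the Ollama server actually has the model,
# independent of BLLM. Assumes the official ollama Python client.
import ollama

model = "llama3.2"  # example name; must match what BLLM requests exactly
try:
    ollama.show(model)  # raises ResponseError if the server lacks the model
    print(f"{model} is available")
except ollama.ResponseError as e:
    if e.status_code == 404:
        ollama.pull(model)  # download it so 'model not found' goes away
    else:
        raise
```

If show() succeeds here but BLLM still returns a 404, the name _get_model() passes to the client presumably differs from the name of the pulled model.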
The documentation offers no help here either (and also seems to be poorly maintained).