Help: Calling models directly #1927
For my use case, I want to use TensorZero primarily as a model router with metrics. I also prefer to use OpenAI's client. At the moment, the default configuration of TensorZero works great. However, when I attempt to add any custom provider/model to my TensorZero config and then make a request against it, I hit an error (and with another configuration I hit a different one). I must be doing something wrong, right? Any help would be appreciated! And on a side note: why are there basically no docs on how to call models directly?
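For illustration, the kind of entry I'm adding looks roughly like this (a representative sketch, not my exact config; the provider details are placeholders, but the model name matches what I'm trying to call):

```toml
# Illustrative custom model entry in tensorzero.toml
[models.gpt_4o_mini_custom]
routing = ["openai"]

# Route this model through the OpenAI provider, mapped to an upstream model
[models.gpt_4o_mini_custom.providers.openai]
type = "openai"
model_name = "gpt-4o-mini"
```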
On a side note - is there a way to list available models? I feel like that's a critical endpoint that any AI gateway needs since new models/LoRAs are added weekly. |
Hi @vinchg - for custom models, you don't need the provider type prefix, i.e. you should call `tensorzero::model_name::gpt_4o_mini_custom`. If you add a provider prefix (e.g. `openai::`), we simply take the suffix and send it to that provider.
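For reference, here's a minimal sketch of calling that custom model through the OpenAI Python client. It assumes the gateway is running locally on the default port 3000 and exposes its OpenAI-compatible endpoint at `/openai/v1`; the client-side API key is a placeholder, since provider credentials are configured in the gateway's environment:

```python
from openai import OpenAI

# Point the OpenAI client at the TensorZero gateway's OpenAI-compatible
# endpoint (assumes a local gateway on port 3000).
client = OpenAI(
    base_url="http://localhost:3000/openai/v1",
    api_key="not-used",  # placeholder; real credentials live in the gateway's environment
)

response = client.chat.completions.create(
    # No provider prefix: address the custom model from the config directly.
    model="tensorzero::model_name::gpt_4o_mini_custom",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```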