Use of tensorzero in N8N #5361
MacherelR started this conversation in Feature Requests
Replies: 1 comment, 6 replies
Glad to hear! I haven't tried n8n + TensorZero, but it looks like they support OpenAI-compatible models, so it should work. You should set the base URL to your TensorZero base URL (e.g. …)
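The suggestion above can be sketched concretely. This is a minimal illustration, assuming the TensorZero gateway runs locally on its default port (3000) and exposes its OpenAI-compatible API under `/openai/v1`; the function name below is a placeholder for one you defined in your own `tensorzero.toml`:

```python
import json

# Assumption: gateway address and path; adjust to your deployment.
GATEWAY_URL = "http://localhost:3000/openai/v1"

# The request body an OpenAI-compatible client (such as n8n's OpenAI node)
# would POST to f"{GATEWAY_URL}/chat/completions". The "tensorzero::..."
# model string is how TensorZero routes OpenAI-compatible calls; replace
# "my_function" with a function defined in your tensorzero.toml.
payload = {
    "model": "tensorzero::function_name::my_function",  # placeholder name
    "messages": [
        {"role": "user", "content": "Hello from n8n!"},
    ],
}

# Endpoint and body that the n8n OpenAI credential would effectively send
# once its Base URL is pointed at the gateway.
endpoint = f"{GATEWAY_URL}/chat/completions"
print(endpoint)
print(json.dumps(payload, indent=2))
```

In n8n itself, the same effect should be achievable without custom HTTP Request nodes: set the OpenAI credential's Base URL field to the gateway URL and put the `tensorzero::...` string in the model field.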
Hi,
I have successfully deployed TensorZero and interfaced it with Ollama for model inference. It works amazingly; for example, I set it up with sgpt (see https://github.com/tbckr/sgpt). Now someone in my organisation asked me how to use it inside n8n, and I couldn't find any way other than making HTTP requests directly, which is not a great UX. Has anyone tried integrating TensorZero into n8n and had success?