
Conversation

@lucataco (Contributor)

This PR adds support for the Text Generation task type for Replicate models.

Example:


@chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines +44 to +48
```ts
if (Array.isArray(value)) {
	for (const item of value) {
		const text = extractTextFromReplicateResponse(item);
		if (typeof text === "string" && text.length > 0) {
			return text;
		}
	}
}
```


P1: Picking first array element returns user prompt instead of model output

The new extractTextFromReplicateResponse walks arrays and returns the first non‑empty string it finds. Replicate’s LLM responses are often arrays of chat messages where the first element is the user (or system) prompt and the assistant’s generated text appears later in the array. In such cases textGeneration will now echo the prompt instead of the model’s reply. Consider iterating from the end of the array or preferring elements with role === "assistant" so the final generated text is returned.

Useful? React with 👍 / 👎.
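The fix Codex suggests can be sketched as a standalone function. This is an illustrative rewrite, not the actual huggingface.js implementation: it walks the array from the end and skips message objects whose `role` is not `"assistant"`, so the model's reply wins over the echoed prompt.

```typescript
// Hedged sketch of the suggested fix; types and names are illustrative.
type ReplicateValue =
	| string
	| { role?: string; content?: unknown; [key: string]: unknown }
	| ReplicateValue[];

function extractTextFromReplicateResponse(value: ReplicateValue): string | undefined {
	if (typeof value === "string") {
		return value.length > 0 ? value : undefined;
	}
	if (Array.isArray(value)) {
		// Walk from the end so the final generated text wins over the prompt.
		for (let i = value.length - 1; i >= 0; i--) {
			const item = value[i];
			// Skip chat-message entries that are clearly not model output.
			if (typeof item === "object" && item !== null && !Array.isArray(item)) {
				const role = (item as { role?: string }).role;
				if (role !== undefined && role !== "assistant") continue;
			}
			const text = extractTextFromReplicateResponse(item);
			if (typeof text === "string" && text.length > 0) return text;
		}
		return undefined;
	}
	if ("content" in value) {
		return extractTextFromReplicateResponse(value.content as ReplicateValue);
	}
	return undefined;
}
```

With this variant, a chat-message array such as `[{ role: "user", … }, { role: "assistant", … }]` yields the assistant's text rather than the user prompt.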

@SBrandeis (Contributor) commented Oct 20, 2025

Hi @lucataco!
Does Replicate support OpenAI's chat completion API or Responses API?

Those are the APIs used to power the chat widget on model pages and our chat API at router.huggingface.co for conversational models (such as GPT-5).

If you support such APIs, adding support for them in the JS client should be very easy, see this for example:
https://github.com/huggingface/huggingface.js/blob/e80852fbf05cb61d018586dd6516eb13c4e51067/packages/inference/src/providers/groq.ts#L32-L41
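For context on why this is easy: an OpenAI-compatible provider mostly reduces to routing requests to a `/v1/chat/completions` endpoint with a bearer token. A minimal, hypothetical sketch of building such a request (the helper name is invented and the base URL is a placeholder, not a confirmed Replicate endpoint):

```typescript
// Illustrative only: builds an OpenAI-style chat completion request.
interface ChatMessage {
	role: "system" | "user" | "assistant";
	content: string;
}

function buildChatCompletionRequest(
	baseUrl: string,
	apiKey: string,
	model: string,
	messages: ChatMessage[]
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
	return {
		// OpenAI-compatible providers expose this route under their base URL.
		url: `${baseUrl}/v1/chat/completions`,
		init: {
			method: "POST",
			headers: {
				Authorization: `Bearer ${apiKey}`,
				"Content-Type": "application/json",
			},
			body: JSON.stringify({ model, messages }),
		},
	};
}
```

If the provider speaks this protocol, the JS client only needs the route mapping shown in the linked groq.ts example; everything else is shared.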
