
[Bug]: Ollama integration with Wave Term not Reading terminal #2478

@will-wrigh

Description


Current Behavior

Wave AI does not link with Ollama. The Wave AI widget does connect to Ollama, but it doesn't read the terminal.

(screenshots attached)

// This is with the internet disconnected, to ensure the model is local. Through the Wave chatbot it apparently is not local; the AI widget does work this way, however.
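Rather than pulling the network connection, the local endpoint can also be probed directly. A minimal sketch, assuming Ollama's default port (11434) and its OpenAI-compatible `/v1/models` route:

```python
import json
import urllib.request

# Hypothetical helper: list the models Ollama serves via its
# OpenAI-compatible endpoint (default port 11434 assumed).
def list_ollama_models(base_url="http://localhost:11434"):
    try:
        with urllib.request.urlopen(base_url + "/v1/models", timeout=5) as resp:
            payload = json.load(resp)
        return [model["id"] for model in payload.get("data", [])]
    except OSError:
        # Connection refused / timed out: Ollama is not reachable here.
        return None

models = list_ollama_models()
print(models)
```

If this prints a list containing `qwen3-coder:30b`, the local endpoint is up regardless of internet state.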

My config file:

```json
{
  "ai@ollama-llama": {
    "ai:*": true,
    "ai:apitoken": "ollama",
    "ai:baseurl": "http://localhost:11434/v1",
    "ai:model": "qwen3-coder:30b",
    "ai:name": "qwen3-coder:30b",
    "display:name": "Ollama - Qwen3",
    "display:order": 3
  },
  "autoupdate:channel": "latest"
}
```
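As a sanity check that the preset block itself is well-formed, the JSON above can be parsed and a few fields verified. A minimal sketch; the required-field list here is my assumption, not Wave's documented schema:

```python
import json

# The preset block from above, verbatim.
CONFIG_TEXT = """
{
  "ai@ollama-llama": {
    "ai:*": true,
    "ai:apitoken": "ollama",
    "ai:baseurl": "http://localhost:11434/v1",
    "ai:model": "qwen3-coder:30b",
    "ai:name": "qwen3-coder:30b",
    "display:name": "Ollama - Qwen3",
    "display:order": 3
  },
  "autoupdate:channel": "latest"
}
"""

config = json.loads(CONFIG_TEXT)
preset = config["ai@ollama-llama"]
# Assumed-required fields for a BYO-LLM provider; adjust to Wave's docs.
for key in ("ai:baseurl", "ai:model", "ai:apitoken"):
    assert key in preset, f"missing {key}"
print(preset["ai:baseurl"])  # → http://localhost:11434/v1
```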

Expected Behavior

I expected Ollama to read the terminal. I don't know if the AI widget was designed purely as a chat bot, but I need local LLM (BYOLLM) integration with the terminal. I've seen this working on the site before:
https://legacydocs.waveterm.dev/features/waveAI
I know this is a legacy doc.

(screenshot attached)

If this feature no longer exists, I'd like to know if and how I can download an older version that works. Please and thank you!

Steps To Reproduce

Downloaded Ollama with qwen3-coder:30b, with the config file listed above. Opened the AI widget, switched it from gpt-5-mini to Ollama, and confirmed this works.

Other steps I tried to point the Wave AI chat UI at Ollama broke the program. I would list what I've done, but I admittedly don't know what I've done: I used an AI assistant and threw its suggestions at the wall until the app broke, so I reinstalled it fresh. On request I can do more research into the steps taken, but because those steps have been undone, I believe they are not relevant; they were bad suggestions from the AI and could be wild-goose-chase in nature.

Wave Version

Client Version 0.12.1 (202510210632) Update Channel: latest

Platform

macOS

OS Version/Distribution

Sonoma

Architecture

x64

Anything else?

These details are fun highlights but should not be relevant to the problem: I'm running a 2013 Mac Pro with OpenCore and 128 GB of RAM, launching Wave through a Terminal Apple shortcut using OpenGL, because the D700s can't run the latest Metal on Sonoma.

Questionnaire

  • I'm interested in fixing this myself but don't know where to start
  • I would like to fix and I have a solution
  • I don't have time to fix this right now, but maybe later
