Fix api_token not applied when using configure_default_client #176

Open

shr00mie wants to merge 1 commit into lmstudio-ai:main from shr00mie:fix/configure-default-client-api-token-global

Conversation


@shr00mie shr00mie commented Feb 3, 2026

Pull Request: Fix api_token not applied when using configure_default_client

Branch: fix/configure-default-client-api-token-global
Base: main
Target: lmstudio-ai/lmstudio-python


Summary

When calling configure_default_client(api_host, api_token=...) and then using the convenience API (e.g. list_downloaded_models()), the LM Studio server returns "An LM Studio API token is required to make requests to this server, but none was provided." The token is never sent because the module-level _default_api_token is never updated: only _default_api_host is declared global in configure_default_client, so the assignment _default_api_token = api_token creates a local variable and the global remains None. This pull request fixes that by declaring both globals.


Problem

Expected behavior (per Authentication docs)

  • Call lms.configure_default_client(api_host, api_token="sk-lm-...") (or set LM_API_TOKEN).
  • Subsequent calls such as lms.list_downloaded_models() use the default client and send the token so token-protected LM Studio servers accept the connection.

Actual behavior

  • After configure_default_client(api_host, api_token="sk-lm-..."), the first use of the convenience API (e.g. list_downloaded_models()) calls get_default_client(), which creates Client(_default_api_host, _default_api_token).
  • The module-level _default_api_token was never set, because in configure_default_client only _default_api_host is declared global. The line _default_api_token = api_token therefore assigns to a local variable; the global _default_api_token stays None.
  • The client is therefore created with api_token=None, the websocket connects without a token, and the server responds with an authentication failure.

Root cause

In src/lmstudio/sync_api.py, configure_default_client:

def configure_default_client(api_host: str, api_token: str | None = None) -> None:
    """Set the server API host for the default global client (without creating the client)."""
    global _default_api_host   # <-- only _default_api_host is global
    if _default_client is not None:
        ...
    _default_api_host = api_host
    _default_api_token = api_token   # <-- this assigns to a LOCAL variable

So _default_api_token is never written to the module global. Later, get_default_client() does:

_default_client = Client(_default_api_host, _default_api_token)  # _default_api_token is still None
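
This is the standard Python scoping rule: an assignment inside a function binds a local name unless the name is declared global. A minimal standalone sketch (not SDK code) of the same pitfall:

_token = None   # module-level default, analogous to _default_api_token

def set_token_broken(value):
    # No `global _token` here, so this assignment creates a function-local
    # variable; the module-level _token is left untouched.
    _token = value

def set_token_fixed(value):
    global _token   # rebind the module-level name
    _token = value

set_token_broken("sk-lm-example")
print(_token)   # None -- the global was never updated
set_token_fixed("sk-lm-example")
print(_token)   # sk-lm-example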

Solution

Declare _default_api_token as global in configure_default_client so the assignment updates the module-level variable:

global _default_api_host, _default_api_token

One-line change in sync_api.py (around line 1712).
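
For reference, a sketch of the patched function, mirroring the excerpt above with only the global declaration changed (the elided branch stays as in the existing source):

def configure_default_client(api_host: str, api_token: str | None = None) -> None:
    """Set the server API host for the default global client (without creating the client)."""
    global _default_api_host, _default_api_token   # <-- both module-level names declared global
    if _default_client is not None:
        ...
    _default_api_host = api_host
    _default_api_token = api_token   # <-- now rebinds the module-level variable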


Testing

  • Before fix: With a token-protected LM Studio server, configure_default_client(host, api_token=token) followed by list_downloaded_models() raises LMStudioServerError: Authentication failed: ... none was provided.
  • After fix: The same sequence succeeds; the SDK sends the token and the server accepts the request.

Repro (with LM Studio running and an API token enabled):

import lmstudio as lms
lms.configure_default_client("localhost:1234", api_token="sk-lm-xxxxxxxx:yyyyyyyyyyyyyyyyyyyy")
models = lms.list_downloaded_models("llm")  # Before fix: auth error. After fix: returns list.
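
Until the fix is released, a workaround sketch is to construct the client explicitly and pass the token, mirroring the Client(api_host, api_token) call that get_default_client() makes; the list_downloaded_models() method on the client is assumed here to match what the module-level convenience wrapper delegates to.

import lmstudio as lms

# Workaround sketch: bypass the broken default-client path and pass the
# token directly to a Client instance. The method name on the client is an
# assumption based on the convenience wrapper it backs.
client = lms.Client("localhost:1234", api_token="sk-lm-xxxxxxxx:yyyyyyyyyyyyyyyyyyyy")
models = client.list_downloaded_models("llm")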

Checklist (per CONTRIBUTING.md)

  • Single-line change; no behavior change except fixing the bug.
  • Aligns with documented Authentication API (api_token and LM_API_TOKEN).
  • Branch is based on the latest main branch.
  • Follows existing code style (no formatting changes).
  • All tests pass (uv run pytest tests/ -m "not lmstudio" — 177 passed; full suite requires LM Studio with API token).
  • Static checks pass (uv run ruff check src/ tests/ examples/plugins and uv run mypy --strict src/ tests/).

When calling configure_default_client(api_host, api_token=...) and then
using the convenience API (e.g. list_downloaded_models()), the LM Studio
server could return 'An LM Studio API token is required to make requests
to this server, but none was provided.' The token was never sent because
the module-level _default_api_token was never updated.

Root cause: In configure_default_client only _default_api_host was
declared global. The assignment _default_api_token = api_token therefore
created a local variable; the global _default_api_token remained None.
Later, get_default_client() created Client(_default_api_host,
_default_api_token) with the token still None.

Fix: Declare both _default_api_host and _default_api_token as global in
configure_default_client so the assignment updates the module-level
variable. One-line change; no other behavior change.

github-actions bot commented Feb 3, 2026

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.


shr00mie commented Feb 3, 2026

I have read the CLA Document and I hereby sign the CLA

github-actions bot added the CLA signed label on Feb 3, 2026