Fix api_token not applied when using configure_default_client #176
Open
shr00mie wants to merge 1 commit into lmstudio-ai:main from shr00mie:fix/configure-default-client-api-token-global
Conversation
When calling `configure_default_client(api_host, api_token=...)` and then using the convenience API (e.g. `list_downloaded_models()`), the LM Studio server could return "An LM Studio API token is required to make requests to this server, but none was provided." The token was never sent because the module-level `_default_api_token` was never updated.

Root cause: in `configure_default_client` only `_default_api_host` was declared `global`. The assignment `_default_api_token = api_token` therefore created a local variable; the global `_default_api_token` remained `None`. Later, `get_default_client()` created `Client(_default_api_host, _default_api_token)` with the token still `None`.

Fix: declare both `_default_api_host` and `_default_api_token` as `global` in `configure_default_client` so the assignment updates the module-level variable. One-line change; no other behavior change.
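For context, here is a minimal standalone sketch of the pitfall (a hypothetical module, not the lmstudio code): without a `global` declaration, assignment inside the function binds a new local name and the module-level variable is left untouched.

```python
# Standalone illustration of the pitfall (hypothetical module, not lmstudio code).
_default_api_token = None

def configure_broken(api_token):
    # No "global _default_api_token" here, so this assignment binds a
    # function-local name; the module-level variable is untouched.
    _default_api_token = api_token

def configure_fixed(api_token):
    global _default_api_token  # the assignment now rebinds the module global
    _default_api_token = api_token

configure_broken("sk-lm-example")
print(_default_api_token)  # None

configure_fixed("sk-lm-example")
print(_default_api_token)  # sk-lm-example
```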
All contributors have signed the CLA ✍️ ✅

Author: I have read the CLA Document and I hereby sign the CLA
Pull Request: Fix `api_token` not applied when using `configure_default_client`

Branch: `fix/configure-default-client-api-token-global`
Base: `main`
Target: lmstudio-ai/lmstudio-python
Summary
When calling `configure_default_client(api_host, api_token=...)` and then using the convenience API (e.g. `list_downloaded_models()`), the LM Studio server returns "An LM Studio API token is required to make requests to this server, but none was provided." The token is never sent because the module-level `_default_api_token` is never updated: only `_default_api_host` is declared `global` in `configure_default_client`, so the assignment `_default_api_token = api_token` creates a local variable and the global remains `None`. This pull request fixes that by declaring both globals.

Problem
Expected behavior (per Authentication docs)
- Call `lms.configure_default_client(api_host, api_token="sk-lm-...")` (or set `LM_API_TOKEN`).
- Convenience functions such as `lms.list_downloaded_models()` use the default client and send the token, so token-protected LM Studio servers accept the connection.

Actual behavior
- After `configure_default_client(api_host, api_token="sk-lm-...")`, the first use of the convenience API (e.g. `list_downloaded_models()` → `get_default_client()`) creates `Client(_default_api_host, _default_api_token)`.
- `_default_api_token` was never set, because in `configure_default_client` only `_default_api_host` is declared `global`. The line `_default_api_token = api_token` therefore assigns to a local variable; the global `_default_api_token` stays `None`.
- The client is created with `api_token=None`, and the websocket connects without a token. The server responds with an authentication failure.

Root cause
In `src/lmstudio/sync_api.py`, `configure_default_client`:
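The code block from the original description is not preserved on this page; the following is a sketch of the pre-fix shape as described above (the exact signature and surrounding code in `sync_api.py` may differ):

```python
def configure_default_client(api_host: str, api_token: str | None = None) -> None:
    global _default_api_host              # only the host is declared global
    _default_api_host = api_host
    _default_api_token = api_token        # binds a local; module-level token stays None
```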
So `_default_api_token` is never written to the module global. Later, `get_default_client()` does:
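Again as a sketch of what the description says `get_default_client()` does (the lazy-creation detail is an assumption; the `Client(_default_api_host, _default_api_token)` call is taken from the description above):

```python
def get_default_client() -> Client:
    global _default_client
    if _default_client is None:
        # With the bug, _default_api_token is still None here,
        # so the client connects without a token.
        _default_client = Client(_default_api_host, _default_api_token)
    return _default_client
```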
Solution

Declare `_default_api_token` as `global` in `configure_default_client` so the assignment updates the module-level variable:
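A sketch of the fixed function (paraphrasing the one-line change, not quoting the file):

```python
def configure_default_client(api_host: str, api_token: str | None = None) -> None:
    global _default_api_host, _default_api_token  # declare both module globals
    _default_api_host = api_host
    _default_api_token = api_token                # now updates the module-level token
```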
One-line change in `sync_api.py` (around line 1712).

Testing
- Before the fix, `configure_default_client(host, api_token=token)` followed by `list_downloaded_models()` raises `LMStudioServerError: Authentication failed: ... none was provided.`
- Repro (with LM Studio running with API token enabled):
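The repro snippet itself is not preserved on this page; below is a minimal sketch consistent with the steps above (the host and token values are placeholders):

```python
import lmstudio as lms

# Placeholder host/token: point these at an LM Studio server with an API token enabled.
lms.configure_default_client("localhost:1234", api_token="sk-lm-...")

# Before the fix: raises LMStudioServerError ("... none was provided.")
# After the fix: prints the downloaded models.
print(lms.list_downloaded_models())
```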
Checklist (per CONTRIBUTING.md)
- `uv run pytest tests/ -m "not lmstudio"` — 177 passed; full suite requires LM Studio with API token.
- `uv run ruff check src/ tests/ examples/plugins` and `uv run mypy --strict src/ tests/`.