UN-2954: Enable line confidence extraction in LLMWhisperer V2 (#204)
Conversation
Summary by CodeRabbit
Walkthrough
Bumps the Python requirement to >=3.12,<3.13, updates llmwhisperer-client to >=2.5.0, increments the SDK version to v0.79.0, and adds WhispererConfig.INCLUDE_LINE_CONFIDENCE, mapped from extra_params.enable_highlight.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Caller as Caller
    participant Helper as get_whisperer_params()
    participant Config as WhispererConfig
    Note over Caller,Helper: Build Whisperer parameters
    Caller->>Helper: provide extra_params (contains enable_highlight)
    Helper-->>Config: set ADD_LINE_NOS (existing)
    Helper-->>Config: set INCLUDE_LINE_CONFIDENCE = extra_params.enable_highlight
    Note right of Config: New config key added
```
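The mapping shown in the sequence diagram can be sketched as a minimal Python snippet. This is a simplified assumption-laden sketch, not the SDK's actual code: the real helper lives in `llm_whisperer_v2/src/helper.py`, and the `extra_params` dict shape and string key values here are illustrative.

```python
# Simplified sketch (not the actual SDK implementation): config keys as they
# would be sent to the LLMWhisperer V2 API, mirroring WhispererConfig.
class WhispererConfig:
    ADD_LINE_NOS = "add_line_nos"
    INCLUDE_LINE_CONFIDENCE = "include_line_confidence"


def get_whisperer_params(extra_params: dict) -> dict:
    """Build whisper request params from adapter extra_params."""
    enable_highlight = bool(extra_params.get("enable_highlight", False))
    return {
        # Line numbers were already driven by the highlight flag ...
        WhispererConfig.ADD_LINE_NOS: enable_highlight,
        # ... and the new key requests per-line confidence the same way.
        WhispererConfig.INCLUDE_LINE_CONFIDENCE: enable_highlight,
    }
```

The point of mirroring `ADD_LINE_NOS` is that both keys derive from the single `enable_highlight` flag, so callers that already request highlighting get confidence data with no interface change.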
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Pre-merge checks: ✅ 2 passed
Actionable comments posted: 0
🧹 Nitpick comments (1)
pyproject.toml (1)
72-72: Consider a more flexible version constraint. The exact version pin (`==2.5.0`) prevents automatic security patches and bug fixes. Consider using a compatible release specifier like `~=2.5.0` or a version range like `>=2.5.0,<3.0.0` to allow patch updates while maintaining compatibility. However, if exact pinning is intentional for SDK stability and reproducibility, this is acceptable.
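For reference, the three specifier styles the reviewer mentions would look like this in `pyproject.toml` (an illustrative fragment, not the project's actual dependency list; only one line would be active at a time):

```toml
[project]
dependencies = [
    "llmwhisperer-client==2.5.0",          # exact pin: reproducible, but no patch updates
    # "llmwhisperer-client~=2.5.0",        # compatible release: allows any 2.5.x
    # "llmwhisperer-client>=2.5.0,<3.0.0", # range: allows minor and patch updates below 3.0
]
```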
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Cache: Disabled due to Reviews > Disable Cache setting
Knowledge base: Disabled due to Reviews -> Disable Knowledge Base setting
📒 Files selected for processing (4)
- pyproject.toml (1 hunks)
- src/unstract/sdk/__init__.py (1 hunks)
- src/unstract/sdk/adapters/x2text/llm_whisperer_v2/src/constants.py (1 hunks)
- src/unstract/sdk/adapters/x2text/llm_whisperer_v2/src/helper.py (1 hunks)
🔇 Additional comments (3)
src/unstract/sdk/__init__.py (1)
1-1: LGTM! The version bump to v0.78.2 is appropriate for this feature addition.
src/unstract/sdk/adapters/x2text/llm_whisperer_v2/src/helper.py (1)
203-204: LGTM! The addition of the `INCLUDE_LINE_CONFIDENCE` parameter is correctly mapped from `extra_params.enable_highlight`, mirroring the pattern used for `ADD_LINE_NOS`. This ensures that line confidence data is included when highlighting is enabled, as described in the PR objectives.

src/unstract/sdk/adapters/x2text/llm_whisperer_v2/src/constants.py (1)
75-75: LGTM! The new constant `INCLUDE_LINE_CONFIDENCE` follows the established naming convention and is appropriately placed alongside related configuration keys.
- Add include_line_confidence parameter to LLMWhisperer V2 whisper requests
- Enable confidence extraction when highlighting is enabled
- Bump llmwhisperer-client to version 2.5.0
- Update requires-python to >=3.12 (required by llmwhisperer-client 2.5.0)
- Bump SDK version to v0.78.2
- Update uv.lock with resolved dependencies
Force-pushed d2f8ad7 to 44ec893
Co-authored-by: Chandrasekharan M <117059509+chandrasekharan-zipstack@users.noreply.github.com>
Signed-off-by: Deepak K <89829542+Deepak-Kesavan@users.noreply.github.com>
What
Enable line-level confidence score extraction in LLMWhisperer V2 adapter for the SDK.
Why
To support displaying average confidence scores of highlighted lines in Prompt Studio, we need to request and receive line confidence data from LLMWhisperer API.
How
- Add `INCLUDE_LINE_CONFIDENCE` constant to the `WhispererConfig` class
- Update `get_whisperer_params()` to include the `include_line_confidence` parameter when highlighting is enabled
- Set `include_line_confidence` to `True` when `enable_highlight` is `True`
- Bump the `llmwhisperer-client` dependency to version `2.5.0` to support the new parameter
- Bump the SDK version to `v0.79.0`

Relevant Docs
Related Issues or PRs
Dependencies Versions / Env Variables
- `llmwhisperer-client==2.5.0` (bumped from `>=2.2.1`)

Notes on Testing
- Line metadata format: `[page, x, y, width, confidence]`

Screenshots
N/A - Backend changes only
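As a rough illustration of how a consumer such as Prompt Studio might compute the average confidence of highlighted lines, assuming each line entry follows the `[page, x, y, width, confidence]` shape mentioned under Notes on Testing (the helper name and list layout are hypothetical, not part of the SDK):

```python
# Hypothetical helper: averages the confidence field (index 4) of line
# metadata entries shaped as [page, x, y, width, confidence].
def average_line_confidence(line_metadata: list[list[float]]) -> float:
    if not line_metadata:
        return 0.0
    confidences = [entry[4] for entry in line_metadata]
    return sum(confidences) / len(confidences)


lines = [
    [1, 10, 20, 300, 0.90],
    [1, 10, 40, 280, 0.80],
]
print(average_line_confidence(lines))  # mean of the last field of each entry
```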
Checklist
I have read and understood the Contribution Guidelines.