
Fix low_level_api_chat_cpp example to match current API #1086

Merged

abetlen merged 3 commits into abetlen:main from aniljava:main on Jan 15, 2024

Conversation

@aniljava (Contributor)

No description provided.

@aniljava aniljava marked this pull request as draft January 14, 2024 07:08
@aniljava aniljava marked this pull request as ready for review January 14, 2024 07:10
@abetlen (Owner) commented Jan 15, 2024

@aniljava thank you for fixing the example! Really appreciate it.

@abetlen abetlen merged commit 1eaace8 into abetlen:main Jan 15, 2024
@cbigger commented Mar 22, 2024

I'm not sure this is entirely updated. When running the Chat.py example from the low-level API examples, I get the following error once it reaches generation:

AttributeError: module 'llama_cpp' has no attribute 'llama_eval'

llama_eval appears to be deprecated when I trace back into llama_cpp:

logits_all (bool): the llama_eval() call computes all logits, not just the last one (DEPRECATED - set llama_batch.logits instead)

The examples I saw also referenced ggml.bin models; I'm not sure whether that is intentional.
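The deprecation note above says the old whole-prompt `llama_eval()` call was replaced by batch-based decoding. A minimal sketch of a compatibility shim is below; it dispatches on whichever attribute the installed bindings actually expose. The `llama_batch_get_one`/`llama_decode` names and the exact argument lists are assumptions drawn from the deprecation message and may differ between llama-cpp-python versions, so check the bindings you have installed.

```python
# Hedged sketch: call the old llama_eval API if the bindings still have it,
# otherwise fall back to the batch-based llama_decode path. The `lib`
# parameter stands in for the imported llama_cpp module; all llama_cpp
# names and signatures here are assumptions, not a verified API.

def eval_tokens(lib, ctx, tokens, n_past):
    """Evaluate a list of tokens with whichever API the bindings expose."""
    if hasattr(lib, "llama_eval"):
        # Old API (removed in newer llama-cpp-python releases).
        return lib.llama_eval(ctx, tokens, len(tokens), n_past)
    # New API: pack the tokens into a llama_batch and decode it.
    batch = lib.llama_batch_get_one(tokens, len(tokens))
    return lib.llama_decode(ctx, batch)
```

Because the shim only probes `lib` with `hasattr`, the same caller code can run against bindings from before and after the deprecation without version checks.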
