
Tags: niceqwer55555/llama-cpp-python


v0.3.16-cu128-AVXVNNI-win-20251118

Verified

This commit was created on GitHub.com and signed with GitHub’s verified signature.
Update build-wheels-cu128-win.yml

v0.3.16-cu128-AVX-win-20251118

Update build-wheels-cu128-win.yml

v0.3.16-cu128-AVX512-win-20251119

Update build-wheels-cu128-win.yml

v0.3.16-cu128-Basic-win-20251117

Update build-wheels-cu128-win.yml

v0.3.16-cu128-AVX2-win-20251116

Update llama.cpp (20251115) and move the ggml-related code to _ggml.py.

v0.3.16-cu128-AVX2-win-20251112

Use httplib to download a model from a URL when libcurl is disabled

Note: LLAMA_HTTPLIB is OFF, cannot build llama-server. Hint: to skip building server, set -DLLAMA_BUILD_SERVER=OFF
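The httplib commits above swap a libcurl dependency for a bundled HTTP client when fetching model files. A minimal sketch of the same idea in Python using only the standard library (the function name and streaming details are illustrative assumptions, not llama.cpp's actual implementation):

```python
import shutil
import urllib.request


def download_model(url: str, dest_path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file from `url` to `dest_path` with no libcurl dependency.

    Illustrative sketch only: llama.cpp's real fallback is C++ httplib,
    not this Python helper.
    """
    with urllib.request.urlopen(url) as resp, open(dest_path, "wb") as out:
        # Copy in fixed-size chunks so memory use stays bounded
        # regardless of the model file's size.
        shutil.copyfileobj(resp, out, length=chunk_size)
    return dest_path
```

Streaming via `shutil.copyfileobj` keeps memory use constant even for multi-gigabyte GGUF files, which is the main requirement a download fallback like this has to meet.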

v0.3.16-cu128-AVX2-linux-20251112

Use httplib to download a model from a URL when libcurl is disabled

Note: LLAMA_HTTPLIB is OFF, cannot build llama-server. Hint: to skip building server, set -DLLAMA_BUILD_SERVER=OFF

v0.3.16-cu126-Basic-win-20251112

Use httplib to download a model from a URL when libcurl is disabled

Note: LLAMA_HTTPLIB is OFF, cannot build llama-server. Hint: to skip building server, set -DLLAMA_BUILD_SERVER=OFF

v0.3.16-cu126-AVX2-win-20251112

Use httplib to download a model from a URL when libcurl is disabled

Note: LLAMA_HTTPLIB is OFF, cannot build llama-server. Hint: to skip building server, set -DLLAMA_BUILD_SERVER=OFF

v0.3.16-cu126-AVX2-linux-20251112

Use httplib to download a model from a URL when libcurl is disabled

Note: LLAMA_HTTPLIB is OFF, cannot build llama-server. Hint: to skip building server, set -DLLAMA_BUILD_SERVER=OFF