This repository was archived by the owner on Feb 11, 2026. It is now read-only.

Conversation

@fabiendupont
Contributor

While building instructlab 0.23.0a0 from source on aarch64 with CUDA, I noticed compilation errors. Empirical testing showed that the errors disappear with llama_cpp_python 0.3.6, which is the current latest version.


Signed-off-by: Fabien Dupont <fdupont@redhat.com>
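For reference, a minimal sketch of reproducing the build described above: pin the proposed version explicitly when installing from source with the CUDA backend enabled. The `CMAKE_ARGS` value below is llama.cpp's standard CMake switch for CUDA; the exact flags InstructLab's own build uses may differ.

```shell
# Build llama_cpp_python 0.3.6 from source with the CUDA backend enabled.
# -DGGML_CUDA=on is llama.cpp's CMake switch for CUDA (replaces the older LLAMA_CUBLAS).
CMAKE_ARGS="-DGGML_CUDA=on" pip install --no-cache-dir llama_cpp_python==0.3.6
```

On aarch64 there is no prebuilt CUDA wheel, so `pip install` compiles the bindings locally; this is where the compilation errors with earlier versions surfaced.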
@mergify mergify bot added the labels documentation (Improvements or additions to documentation), dependencies (Relates to dependencies), and ci-failure (PR has at least one CI failure) on Jan 23, 2025
@cdoern
Contributor

cdoern commented Jan 23, 2025

This might work. 0.3.5 had a bunch of issues that kept us from bumping past 0.3.2, but 0.3.6 seems to have possibly solved them all?

@nathan-weinberg
Member

There is already a PR for this: #2368

