
Bug#1108368: llama.cpp: CVE-2025-52566



Source: llama.cpp
Version: 5713+dfsg-1
Severity: important
Tags: security upstream
X-Debbugs-Cc: carnil@debian.org, Debian Security Team <team@security.debian.org>

Hi,

The following vulnerability was published for llama.cpp.

CVE-2025-52566[0]:
| llama.cpp is an inference engine for several LLM models in C/C++.
| Prior to version b5721, a signed vs. unsigned integer overflow in
| llama.cpp's tokenizer implementation (llama_vocab::tokenize,
| src/llama-vocab.cpp:3036) caused unintended behavior in the token
| copy size comparison, allowing a heap overflow in the llama.cpp
| inference engine via carefully crafted text input during
| tokenization. This issue has been patched in version b5721.


If you fix the vulnerability, please also make sure to include the
CVE (Common Vulnerabilities & Exposures) id in your changelog entry.

For further information see:

[0] https://security-tracker.debian.org/tracker/CVE-2025-52566
    https://www.cve.org/CVERecord?id=CVE-2025-52566
[1] https://github.com/ggml-org/llama.cpp/security/advisories/GHSA-7rxv-5jhh-j6xx
[2] https://github.com/ggml-org/llama.cpp/commit/dd6e6d0b6a4bbe3ebfc931d1eb14db2f2b1d70af

Please adjust the affected versions in the BTS as needed.

Regards,
Salvatore
