Medium severity · CVSS 5.3 · NVD Advisory · Published Aug 12, 2024 · Updated Apr 27, 2026

CVE-2024-42477

Description

llama.cpp provides LLM inference in C/C++. An unvalidated type member in the rpc_tensor structure can cause a global buffer overflow, which may lead to memory data leakage. The vulnerability is fixed in release b3561.

Affected products

  • cpe:2.3:a:ggml:llama.cpp:*:*:*:*:*:*:*:*
    Range: <b3561

Patches


Vulnerability mechanics

The llama.cpp RPC server deserializes rpc_tensor structures received from clients. The tensor's type field is used to index internal per-type lookup tables, and before b3561 it was not validated against the number of defined ggml types. An attacker-supplied out-of-range value therefore reads past the end of a global table (a global buffer overflow), which can leak adjacent memory contents.
