VYPR
High severity · 8.6 · NVD Advisory · Published Apr 1, 2026 · Updated Apr 15, 2026

CVE-2026-34445

Description

Open Neural Network Exchange (ONNX) is an open standard for machine learning interoperability. Prior to version 1.21.0, the `ExternalDataInfo` class in ONNX used Python's `setattr()` to load metadata (such as file paths or data lengths) directly from an ONNX model file, without validating the keys in the file. As a result, an attacker could craft a malicious model that overwrites internal object attributes. This issue has been patched in version 1.21.0.
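The vulnerable pattern and the whitelist fix can be reduced to a minimal, illustrative sketch (the `UnsafeInfo`/`SafeInfo` classes below are hypothetical stand-ins, not the actual ONNX source):

```python
# Illustrative sketch: the pre-1.21.0 pattern copied untrusted key/value pairs
# onto the object with setattr(); the 1.21.0 fix restricts keys to a whitelist.
class UnsafeInfo:
    def __init__(self, entries: dict[str, str]) -> None:
        self.location = ""
        self.length = None
        for key, value in entries.items():
            setattr(self, key, value)  # no key validation: any key lands on the object


class SafeInfo:
    ALLOWED = frozenset({"location", "offset", "length", "checksum", "basepath"})

    def __init__(self, entries: dict[str, str]) -> None:
        self.location = ""
        self.length = None
        for key, value in entries.items():
            if key in self.ALLOWED:  # whitelist, mirroring the patched behavior
                setattr(self, key, value)


unsafe = UnsafeInfo({"location": "w.bin", "evil_attr": "x"})
safe = SafeInfo({"location": "w.bin", "evil_attr": "x"})
print(hasattr(unsafe, "evil_attr"))  # True: attribute injected from model data
print(hasattr(safe, "evil_attr"))    # False: unknown key ignored
```

With the unsafe pattern, a key such as `__class__` would reach `setattr()` directly, which is why the fix rejects unknown keys before assignment.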

Affected packages

Versions sourced from the GitHub Security Advisory.

| Package | Affected versions | Patched versions |
|---|---|---|
| onnx (PyPI) | < 1.21.0 | 1.21.0 |
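A quick way to check an installed copy against the patched release, sketched with a naive dotted-version comparison (production code should use `packaging.version.Version` instead):

```python
# Hedged sketch: is the installed onnx older than the patched 1.21.0?
from importlib.metadata import PackageNotFoundError, version


def is_affected(installed: str, patched: str = "1.21.0") -> bool:
    # Naive comparison on the first three numeric components only.
    def as_tuple(v: str) -> tuple[int, ...]:
        return tuple(int(part) for part in v.split(".")[:3])

    return as_tuple(installed) < as_tuple(patched)


try:
    print("affected:", is_affected(version("onnx")))
except (PackageNotFoundError, ValueError):
    print("onnx not installed, or version string is not plain-numeric")
```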


Patches

`e30c6935d67c`

Merged: Fix object state corruption and DoS via `ExternalDataInfo` attribute injection (#7751)

https://github.com/onnx/onnx · Ti-Tai Wang · Mar 18, 2026 · via GHSA
4 files changed · +406 −17
  • docs/Security.md · +73 −0 · modified
    @@ -93,3 +93,76 @@ Test coverage is in:
       - `TestSaveExternalDataAbsolutePathValidation` — absolute path rejection.
     
     Symlink and hardlink tests are skipped on Windows (`os.name == "nt"`).
    +
    +---
    +
    +## External Data Attribute Validation
    +
    +This section describes the security model for validating external data attributes in `ExternalDataInfo`. It covers defenses against attribute injection (CWE-915) and resource exhaustion (CWE-400) via crafted `external_data` entries in `TensorProto`.
    +
    +**Advisory:** [GHSA-538c-55jv-c5g9](https://github.com/onnx/onnx/security/advisories/GHSA-538c-55jv-c5g9)
    +
    +### Threat Model
    +
    +An attacker provides a malicious ONNX model with crafted `external_data` entries in `TensorProto`. The `external_data` field is a repeated `StringStringEntryProto` — a key-value store that accepts arbitrary strings for both key and value.
    +
    +The attack is triggered during `onnx.load()` with no explicit checker invocation required. `ExternalDataInfo.__init__` processes these key-value pairs to populate object attributes.
    +
    +Attack vectors:
    +
    +- **Arbitrary attribute injection**: Setting unknown keys (e.g. `evil_attr`) causes `setattr()` to create arbitrary attributes on the `ExternalDataInfo` object. While no current consumer iterates over attributes, injected attributes create latent risk for future code.
    +- **Dunder attribute injection**: Setting keys like `__class__` or `__dict__` corrupts the Python object's internal state, enabling type confusion attacks.
    +- **Negative offset/length**: Negative values for `offset` cause `file.seek()` to raise `OSError`. Negative `length` causes `file.read(-1)` to read the entire file to EOF, bypassing intended size limits.
    +- **Resource exhaustion (DoS)**: Setting `length` to a multi-petabyte value causes unbounded memory allocation when reading external data, even if the actual data file is small.
    +
    +Four Python consumers of `ExternalDataInfo` exist: `load_external_data_for_tensor`, `set_external_data` / `write_external_data_tensors`, `ModelContainer._load_large_initializers`, and `ReferenceEvaluator` (in `onnx/reference/reference_evaluator.py`). (The C++ checker validates paths but does not use the Python `ExternalDataInfo` class.)
    +
    +## Defense Layers
    +
    +We use a 3-layer defense-in-depth approach. Each layer addresses a different class of attack and operates at a different point in the processing pipeline.
    +
    +### Layer 1: Attribute Whitelist (CWE-915 Mitigation)
    +
    +`ExternalDataInfo.__init__` only accepts keys in `_ALLOWED_EXTERNAL_DATA_KEYS`: `location`, `offset`, `length`, `checksum`, `basepath`. Unknown keys are warned via `warnings.warn()` and ignored — this prevents arbitrary attribute injection.
    +
    +This also blocks dunder attribute injection (e.g. `__class__`, `__dict__`) that could cause object type confusion.
    +
    +**Rationale**: While we cannot prevent someone from constructing malicious protobuf directly, rejecting unknown keys at the Python object level is defense-in-depth that limits the attack surface. The whitelist is a `frozenset` to prevent runtime mutation.
    +
    +### Layer 2: Bounds Validation at Parse Time (CWE-400 Mitigation)
    +
    +`offset` and `length` must be non-negative integers. Non-numeric strings raise `ValueError`. This catches obviously invalid values early, before any file I/O occurs.
    +
    +**Rationale**: Negative `offset` causes `file.seek(-1)` to raise `OSError`; negative `length` causes `file.read(-1)` to read the entire file, bypassing intended size limits. Validating at parse time provides a clear error message at the point closest to the malicious input.
    +
    +### Layer 3: File-Size Validation at Consumption Time (CWE-400 Mitigation, Defense-in-Depth)
    +
    +In `load_external_data_for_tensor()` and `ModelContainer._load_large_initializers`, before reading: `offset <= file_size` and `offset + length <= file_size` are verified. A 1KB data file cannot cause a multi-petabyte memory allocation.
    +
    +**Rationale**: This is the critical safety net. It prevents memory exhaustion regardless of how the model was constructed — even via direct protobuf APIs that bypass Python-level parsing entirely. Validation happens at the point of actual file I/O, the last opportunity before harm occurs.
    +
    +## Why Layered Defense
    +
    +- **Layer 1 (whitelist)** catches the broadest class of attacks at parse time. It blocks attribute injection, dunder corruption, and any future unknown-key attack vector.
    +- **Layer 2 (bounds validation)** catches obviously invalid numeric values at parse time, providing clear error messages.
    +- **Layer 3 (file-size validation)** is the critical safety net that prevents actual harm at the I/O boundary. This layer cannot be bypassed even if an attacker crafts a model using protobuf APIs directly, because validation happens at the point of actual file read.
    +
    +## Protected Entry Points
    +
    +| Entry Point | File | Layers |
    +|---|---|---|
    +| `ExternalDataInfo.__init__` | `onnx/external_data_helper.py` | 1, 2 |
    +| `load_external_data_for_tensor` | `onnx/external_data_helper.py` | 1, 2, 3 |
    +| `set_external_data` | `onnx/external_data_helper.py` | 1 (whitelist by overwrite) |
    +| `ModelContainer._load_large_initializers` | `onnx/model_container.py` | 1, 2, 3 |
    +
    +## Testing
    +
    +Test coverage is in `onnx/test/test_external_data.py`:
    +
    +- `TestExternalDataInfoSecurity`:
    +  - **CWE-915 (attribute injection):** `test_unknown_key_rejected`, `test_dunder_key_rejected`, `test_multiple_unknown_keys_all_rejected`, `test_allowed_keys_constant_is_frozen`
    +  - **CWE-400 (bounds/DoS):** `test_negative_offset_rejected`, `test_negative_length_rejected`, `test_non_numeric_offset_raises`, `test_non_numeric_length_raises`
    +  - **Regression guards:** `test_valid_external_data_accepted`, `test_zero_offset_and_length_accepted`
    +- `TestLoadExternalDataFileSizeValidation`:
    +  - **File-size validation:** `test_offset_exceeds_file_size_raises`, `test_length_exceeds_available_data_raises`, `test_valid_offset_and_length_load_correctly`
    
  • onnx/external_data_helper.py · +96 −9 · modified
    @@ -8,8 +8,9 @@
     import re
     import sys
     import uuid
    +import warnings
     from itertools import chain
    -from typing import TYPE_CHECKING
    +from typing import IO, TYPE_CHECKING
     
     import onnx.checker as onnx_checker
     import onnx.onnx_cpp2py_export.checker as c_checker
    @@ -24,6 +25,27 @@
     if TYPE_CHECKING:
         from collections.abc import Callable, Iterable
     
    +# Security: 3-layer defense against malicious external_data entries (GHSA-538c-55jv-c5g9)
    +#
    +# Layer 1 (here) — Attribute whitelist: Only spec-defined keys are accepted.
    +#   Unknown keys are warned and ignored, preventing arbitrary attribute injection (CWE-915).
    +#
    +# Layer 2 (ExternalDataInfo.__init__) — Bounds validation: offset and length must be
    +#   non-negative integers. Catches invalid values at parse time (CWE-400).
    +#
    +# Layer 3 (load_external_data_for_tensor) — File-size validation: offset and length are
    +#   checked against actual file size before reading. This is the critical safety net that
    +#   prevents memory exhaustion regardless of how the model was constructed (CWE-400).
    +#
    +# 'basepath' is included because set_external_data() and model_container
    +# write it to protobuf entries; it must survive save/load round-trips.
    +_ALLOWED_EXTERNAL_DATA_KEYS = frozenset(
    +    {"location", "offset", "length", "checksum", "basepath"}
    +)
    +_SORTED_ALLOWED_KEYS = sorted(_ALLOWED_EXTERNAL_DATA_KEYS)
    +_MAX_UNKNOWN_KEYS_IN_WARNING = 10
    +_MAX_KEY_DISPLAY_LENGTH = 100
    +
     
     class ExternalDataInfo:
         def __init__(self, tensor: TensorProto) -> None:
    @@ -33,14 +55,83 @@ def __init__(self, tensor: TensorProto) -> None:
             self.checksum = None
             self.basepath = ""
     
    +        unknown_keys: set[str] = set()
    +        unknown_key_count = 0
             for entry in tensor.external_data:
    -            setattr(self, entry.key, entry.value)
    +            # Layer 1: reject unknown keys (CWE-915 defense-in-depth)
    +            if entry.key in _ALLOWED_EXTERNAL_DATA_KEYS:
    +                setattr(self, entry.key, entry.value)
    +            else:
    +                unknown_key_count += 1
    +                if len(unknown_keys) < _MAX_UNKNOWN_KEYS_IN_WARNING:
    +                    truncated = entry.key[:_MAX_KEY_DISPLAY_LENGTH]
    +                    if len(entry.key) > _MAX_KEY_DISPLAY_LENGTH:
    +                        truncated += "..."
    +                    unknown_keys.add(truncated)
    +
    +        if unknown_keys:
    +            shown = sorted(unknown_keys)
    +            extra = unknown_key_count - len(shown)
    +            key_list = repr(shown)
    +            if extra > 0:
    +                key_list += f" and {extra} more"
    +            warnings.warn(
    +                f"Ignoring unknown external data key(s) {key_list} "
    +                f"for tensor {tensor.name!r}. "
    +                f"Allowed keys: {_SORTED_ALLOWED_KEYS}",
    +                stacklevel=2,
    +            )
     
             if self.offset is not None:
                 self.offset = int(self.offset)
    +            if self.offset < 0:
    +                raise ValueError(
    +                    f"External data offset must be non-negative, got {self.offset} "
    +                    f"for tensor {tensor.name!r}"
    +                )
     
             if self.length is not None:
                 self.length = int(self.length)
    +            if self.length < 0:
    +                raise ValueError(
    +                    f"External data length must be non-negative, got {self.length} "
    +                    f"for tensor {tensor.name!r}"
    +                )
    +
    +
    +def _validate_external_data_file_bounds(
    +    data_file: IO[bytes],
    +    info: ExternalDataInfo,
    +    tensor_name: str,
    +) -> bytes:
    +    """Validate offset/length against actual file size and read data.
    +
    +    Layer 3 defense-in-depth (CWE-400): prevents memory exhaustion even if the
    +    model was crafted via direct protobuf APIs that bypass Python parsing.
    +
    +    Returns the raw bytes read from the file.
    +    """
    +    file_size = os.fstat(data_file.fileno()).st_size
    +
    +    if info.offset is not None:
    +        if info.offset > file_size:
    +            raise ValueError(
    +                f"External data offset ({info.offset}) exceeds file size "
    +                f"({file_size}) for tensor {tensor_name!r}"
    +            )
    +        data_file.seek(info.offset)
    +
    +    if info.length is not None:
    +        read_start = info.offset if info.offset is not None else 0
    +        available = file_size - read_start
    +        if info.length > available:
    +            raise ValueError(
    +                f"External data length ({info.length}) exceeds available data "
    +                f"({available} bytes from offset {read_start}) "
    +                f"for tensor {tensor_name!r}"
    +            )
    +        return data_file.read(info.length)
    +    return data_file.read()
     
     
     def _validate_external_data_path(
    @@ -110,13 +201,9 @@ def load_external_data_for_tensor(tensor: TensorProto, base_dir: str) -> None:
             open_flags |= os.O_NOFOLLOW
         fd = os.open(external_data_file_path, open_flags)
         with os.fdopen(fd, "rb") as data_file:
    -        if info.offset is not None:
    -            data_file.seek(info.offset)
    -
    -        if info.length is not None:
    -            tensor.raw_data = data_file.read(info.length)
    -        else:
    -            tensor.raw_data = data_file.read()
    +        tensor.raw_data = _validate_external_data_file_bounds(
    +            data_file, info, tensor.name
    +        )
     
     
     def load_external_data_for_model(model: ModelProto, base_dir: str) -> None:
    
  • onnx/model_container.py · +2 −7 · modified
    @@ -304,13 +304,8 @@ def _load_large_initializers(self, file_path):
                     open_flags |= os.O_NOFOLLOW
                 fd = os.open(external_data_file_path, open_flags)
                 with os.fdopen(fd, "rb") as data_file:
    -                if info.offset is not None:
    -                    data_file.seek(info.offset)
    -
    -                raw_data = (
    -                    data_file.read(info.length)
    -                    if info.length is not None
    -                    else data_file.read()
    +                raw_data = ext_data._validate_external_data_file_bounds(
    +                    data_file, info, tensor.name
                     )
     
                     dtype = onnx.helper.tensor_dtype_to_np_dtype(tensor.data_type)
    
  • onnx/test/test_external_data.py · +235 −1 · modified
    @@ -10,6 +10,7 @@
     import tempfile
     import unittest
     import uuid
    +import warnings
     from typing import TYPE_CHECKING, Any
     
     import numpy as np
    @@ -26,6 +27,8 @@
         shape_inference,
     )
     from onnx.external_data_helper import (
    +    _ALLOWED_EXTERNAL_DATA_KEYS,
    +    ExternalDataInfo,
         convert_model_from_external_data,
         convert_model_to_external_data,
         load_external_data_for_model,
    @@ -1040,7 +1043,7 @@ def test_load_rejects_parent_directory_symlink(self) -> None:
             shutil.rmtree(subdir_path)
             os.symlink(sensitive_dir, subdir_path)
     
    -        # Loading must fail because realpath resolves outside model_dir
    +        # Loading must fail because realpath resolves outside model_dir.
             loaded_model = onnx.load(model_path, load_external_data=False)
             with self.assertRaises(checker.ValidationError):
                 load_external_data_for_model(loaded_model, model_dir)
    @@ -1087,5 +1090,236 @@ def test_save_rejects_absolute_path(self) -> None:
                 save_external_data(tensor, self.temp_dir)
     
     
    +class TestExternalDataInfoSecurity(unittest.TestCase):
    +    """Tests for ExternalDataInfo hardening against attribute injection and bounds.
    +
    +    Covers all attack vectors from the security advisory: unknown key injection,
    +    dunder attribute injection, negative offset/length bypass, and validates
    +    that legitimate keys still work correctly.
    +    """
    +
    +    @staticmethod
    +    def _make_tensor_with_external_data(
    +        entries: dict[str, str],
    +        tensor_name: str = "test_tensor",
    +    ) -> TensorProto:
    +        """Create a TensorProto with given external_data key-value entries."""
    +        tensor = TensorProto()
    +        tensor.name = tensor_name
    +        tensor.data_type = TensorProto.FLOAT
    +        tensor.dims.extend([4])
    +        tensor.data_location = TensorProto.EXTERNAL
    +        for key, value in entries.items():
    +            entry = tensor.external_data.add()
    +            entry.key = key
    +            entry.value = value
    +        return tensor
    +
    +    def test_valid_external_data_accepted(self) -> None:
    +        """All valid external_data keys must be accepted and correctly parsed."""
    +        tensor = self._make_tensor_with_external_data(
    +            {
    +                "location": "weights.bin",
    +                "offset": "16",
    +                "length": "1024",
    +                "checksum": "sha256:abc123",
    +            }
    +        )
    +        info = ExternalDataInfo(tensor)
    +        self.assertEqual(info.location, "weights.bin")
    +        self.assertEqual(info.offset, 16)
    +        self.assertIsInstance(info.offset, int)
    +        self.assertEqual(info.length, 1024)
    +        self.assertIsInstance(info.length, int)
    +        self.assertEqual(info.checksum, "sha256:abc123")
    +
    +    def test_unknown_key_rejected(self) -> None:
    +        """Unknown external_data keys must not be set as object attributes (CWE-915)."""
    +        tensor = self._make_tensor_with_external_data(
    +            {"location": "weights.bin", "malicious_attr": "evil_value"}
    +        )
    +        with warnings.catch_warnings(record=True) as caught:
    +            warnings.simplefilter("always")
    +            info = ExternalDataInfo(tensor)
    +        # Unknown attribute must NOT be set on the object
    +        self.assertFalse(
    +            hasattr(info, "malicious_attr"),
    +            "Unknown key 'malicious_attr' should not become an attribute",
    +        )
    +        # Valid key must still work
    +        self.assertEqual(info.location, "weights.bin")
    +        # A warning must have been emitted for the unknown key
    +        self.assertTrue(
    +            any("malicious_attr" in str(w.message) for w in caught),
    +            "Expected warning about unknown key 'malicious_attr'",
    +        )
    +
    +    def test_dunder_key_rejected(self) -> None:
    +        """Dunder keys like '__class__' must not be injected via external_data (CWE-915).
    +
    +        Without the whitelist, setattr(self, '__class__', ...) would corrupt
    +        the object type, enabling type confusion attacks.
    +        """
    +        tensor = self._make_tensor_with_external_data({"location": "weights.bin"})
    +        # Add __class__ key via protobuf add() to mimic direct protobuf injection
    +        dunder_entry = tensor.external_data.add()
    +        dunder_entry.key = "__class__"
    +        dunder_entry.value = "builtins.dict"
    +
    +        original_class = ExternalDataInfo
    +        with warnings.catch_warnings(record=True) as caught:
    +            warnings.simplefilter("always")
    +            info = ExternalDataInfo(tensor)
    +        # Object type must not have been corrupted
    +        self.assertIsInstance(info, original_class)
    +        self.assertEqual(type(info).__name__, "ExternalDataInfo")
    +        self.assertEqual(info.location, "weights.bin")
    +        # A warning must have been emitted for the dunder key
    +        self.assertTrue(
    +            any("__class__" in str(w.message) for w in caught),
    +            "Expected warning about dunder key '__class__'",
    +        )
    +
    +    def test_negative_offset_rejected(self) -> None:
    +        """Negative offset must raise ValueError to prevent seek(-1) attacks."""
    +        tensor = self._make_tensor_with_external_data(
    +            {"location": "weights.bin", "offset": "-1"}
    +        )
    +        with self.assertRaises(ValueError) as ctx:
    +            ExternalDataInfo(tensor)
    +        self.assertIn("non-negative", str(ctx.exception).lower())
    +
    +    def test_negative_length_rejected(self) -> None:
    +        """Negative length must raise ValueError to prevent underflow attacks."""
    +        tensor = self._make_tensor_with_external_data(
    +            {"location": "weights.bin", "length": "-100"}
    +        )
    +        with self.assertRaises(ValueError) as ctx:
    +            ExternalDataInfo(tensor)
    +        self.assertIn("non-negative", str(ctx.exception).lower())
    +
    +    def test_zero_offset_and_length_accepted(self) -> None:
    +        """Zero values for offset/length should be accepted (edge case for bounds check)."""
    +        tensor = self._make_tensor_with_external_data(
    +            {"location": "weights.bin", "offset": "0", "length": "0"}
    +        )
    +        # Should not raise — zero is a valid non-negative value
    +        info = ExternalDataInfo(tensor)
    +        self.assertEqual(info.location, "weights.bin")
    +        self.assertEqual(info.offset, 0)
    +        self.assertEqual(info.length, 0)
    +
    +    def test_multiple_unknown_keys_all_rejected(self) -> None:
    +        """Multiple unknown keys in a single tensor must all be rejected."""
    +        tensor = self._make_tensor_with_external_data(
    +            {
    +                "location": "weights.bin",
    +                "evil_one": "a",
    +                "evil_two": "b",
    +                "__dict__": "c",
    +            }
    +        )
    +        with warnings.catch_warnings(record=True) as caught:
    +            warnings.simplefilter("always")
    +            info = ExternalDataInfo(tensor)
    +        self.assertFalse(hasattr(info, "evil_one"))
    +        self.assertFalse(hasattr(info, "evil_two"))
    +        self.assertEqual(info.location, "weights.bin")
    +        unknown_key_warnings = [
    +            str(w.message)
    +            for w in caught
    +            if "unknown external data key" in str(w.message).lower()
    +        ]
    +        self.assertEqual(
    +            len(unknown_key_warnings),
    +            1,
    +            "Expected 1 aggregated warning for unknown keys",
    +        )
    +        # All unknown keys should be mentioned in the single warning
    +        self.assertIn("evil_one", unknown_key_warnings[0])
    +        self.assertIn("evil_two", unknown_key_warnings[0])
    +        self.assertIn("__dict__", unknown_key_warnings[0])
    +
    +    def test_allowed_keys_constant_is_frozen(self) -> None:
    +        """The whitelist must be a frozenset to prevent runtime mutation."""
    +        self.assertIsInstance(_ALLOWED_EXTERNAL_DATA_KEYS, frozenset)
    +        self.assertEqual(
    +            _ALLOWED_EXTERNAL_DATA_KEYS,
    +            frozenset({"location", "offset", "length", "checksum", "basepath"}),
    +        )
    +
    +    def test_non_numeric_offset_raises(self) -> None:
    +        """Non-numeric offset string must raise ValueError from int() conversion."""
    +        tensor = self._make_tensor_with_external_data(
    +            {"location": "weights.bin", "offset": "abc"}
    +        )
    +        with self.assertRaises(ValueError):
    +            ExternalDataInfo(tensor)
    +
    +    def test_non_numeric_length_raises(self) -> None:
    +        """Non-numeric length string must raise ValueError from int() conversion."""
    +        tensor = self._make_tensor_with_external_data(
    +            {"location": "weights.bin", "length": "not_a_number"}
    +        )
    +        with self.assertRaises(ValueError):
    +            ExternalDataInfo(tensor)
    +
    +
    +class TestLoadExternalDataFileSizeValidation(TestLoadExternalDataBase):
    +    """Tests for defense-in-depth file-size validation in load_external_data_for_tensor."""
    +
    +    def test_offset_exceeds_file_size_raises(self) -> None:
    +        """Offset beyond file size must raise ValueError."""
    +        array = np.ones((4,), dtype=np.float32)
    +        tensor = from_array(array, name="weight")
    +        set_external_data(tensor, location="data.bin")
    +
    +        data_path = os.path.join(self.temp_dir, "data.bin")
    +        with open(data_path, "wb") as f:
    +            f.write(tensor.raw_data)
    +
    +        file_size = os.path.getsize(data_path)
    +        # Set offset beyond file size
    +        set_external_data(tensor, location="data.bin", offset=file_size + 100)
    +        tensor.ClearField("raw_data")
    +
    +        with self.assertRaisesRegex(ValueError, "offset.*exceeds file size"):
    +            load_external_data_for_tensor(tensor, self.temp_dir)
    +
    +    def test_length_exceeds_available_data_raises(self) -> None:
    +        """Length that overflows available data must raise ValueError."""
    +        array = np.ones((4,), dtype=np.float32)
    +        tensor = from_array(array, name="weight")
    +        set_external_data(tensor, location="data.bin")
    +
    +        data_path = os.path.join(self.temp_dir, "data.bin")
    +        with open(data_path, "wb") as f:
    +            f.write(tensor.raw_data)
    +
    +        file_size = os.path.getsize(data_path)
    +        # Set length much larger than file
    +        set_external_data(tensor, location="data.bin", length=file_size * 1000)
    +        tensor.ClearField("raw_data")
    +
    +        with self.assertRaisesRegex(ValueError, "length.*exceeds available data"):
    +            load_external_data_for_tensor(tensor, self.temp_dir)
    +
    +    def test_valid_offset_and_length_load_correctly(self) -> None:
    +        """Valid offset+length within file size should load correctly."""
    +        array = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)
    +        tensor = from_array(array, name="weight")
    +        raw = tensor.raw_data
    +
    +        data_path = os.path.join(self.temp_dir, "data.bin")
    +        with open(data_path, "wb") as f:
    +            f.write(raw)
    +
    +        set_external_data(tensor, location="data.bin", offset=0, length=len(raw))
    +        tensor.ClearField("raw_data")
    +
    +        load_external_data_for_tensor(tensor, self.temp_dir)
    +        self.assertEqual(tensor.raw_data, raw)
    +
    +
     if __name__ == "__main__":
         unittest.main()
    

Vulnerability mechanics
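The resource-exhaustion half of the advisory reduces to a short sketch. The `read_external` helper below is hypothetical (not ONNX's API): it shows how a tiny data file paired with an attacker-chosen `length` becomes an unbounded `read()` unless the claimed bounds are checked against the real file size, as the patch does.

```python
import os
import tempfile


def read_external(path: str, offset: int, length: int) -> bytes:
    """Hypothetical helper: read `length` bytes at `offset`, bounds-checked."""
    with open(path, "rb") as f:
        file_size = os.fstat(f.fileno()).st_size
        # The safety net from the patch: reject negative values (Layer 2)
        # and claims that exceed the actual file size (Layer 3).
        if offset < 0 or length < 0:
            raise ValueError("external data offset/length must be non-negative")
        if offset > file_size or length > file_size - offset:
            raise ValueError("external data bounds exceed file size")
        f.seek(offset)
        return f.read(length)


with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"\x00" * 1024)  # a 1 KB data file
    path = tmp.name

print(len(read_external(path, 0, 1024)))  # 1024
try:
    read_external(path, 0, 10**15)  # petabyte-scale claimed length
except ValueError as exc:
    print("rejected:", exc)
os.unlink(path)
```

Without the size check, `f.read(10**15)` attempts an allocation sized by the attacker's metadata rather than by the file's contents, which is the CWE-400 vector the fix closes.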

