vLLM is a high-throughput, memory-efficient inference and serving engine for LLMs. Versions from 0.6.5 up to (but not including) 0.8.5 that use the vLLM Mooncake integration are vulnerable to remote code execution because they exchange pickle-serialized data over unsecured ZeroMQ sockets. The vulnerable sockets were set to listen on all network interfaces, increasing the likelihood that an attacker could reach them to carry out an attack. vLLM instances that do not use the Mooncake integration are not affected. This issue has been patched in version 0.8.5.
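Why pickle over a network socket enables RCE: Python's `pickle` format can encode an arbitrary callable (via `__reduce__`) that runs during deserialization, so any service that calls `pickle.loads` on attacker-supplied bytes executes attacker-chosen code. The minimal sketch below is not vLLM or Mooncake code; it simulates the pattern with a harmless `eval` call standing in for what would be, e.g., `os.system(...)` in a real attack:

```python
import pickle

class Exploit:
    """Illustrative malicious object: pickle invokes the callable
    returned by __reduce__ when the payload is deserialized."""
    def __reduce__(self):
        # A benign eval stands in here; an attacker would substitute
        # os.system("...") or similar to run arbitrary commands.
        return (eval, ("'code executed during unpickling'",))

payload = pickle.dumps(Exploit())   # bytes an attacker could send over the socket
result = pickle.loads(payload)      # simulates the server's recv + pickle.loads
print(result)
```

The fix class for this bug pattern is to never unpickle untrusted input: use a data-only format (e.g. JSON or msgpack) for messages crossing a trust boundary.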
CVE ID: CVE-2025-32444
CVSS Base Severity: CRITICAL
CVSS Base Score: 10.0
CVSS Vector: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H
Vendor: vllm-project
Product: vllm
EPSS Score: 0.5% (estimated probability of exploitation in the wild)
EPSS Percentile: 64.75% (proportion of scored vulnerabilities at or below this score)
EPSS Date: 2025-05-29 (date the score was calculated)
SSVC Exploitation: none
SSVC Technical Impact: total
SSVC Automatable: true
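The description notes that the vulnerable sockets listened on all network interfaces, which is what made the deserialization flaw remotely reachable. A minimal stdlib sketch (not vLLM code) of the difference between binding to all interfaces and binding to loopback only:

```python
import socket

# A socket bound to "0.0.0.0" accepts connections from any network peer;
# binding to "127.0.0.1" restricts access to processes on the same host.
exposed = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
exposed.bind(("0.0.0.0", 0))        # reachable from other machines
exposed_addr = exposed.getsockname()[0]

local = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
local.bind(("127.0.0.1", 0))        # loopback only
local_addr = local.getsockname()[0]

print(exposed_addr, local_addr)
exposed.close()
local.close()
```

Restricting internal control-plane sockets to loopback or a private interface is a standard defense-in-depth measure; it does not fix the unsafe deserialization itself, but it shrinks the set of attackers who can reach it.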