A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows a Denial of Service (DoS) attack. The stream_complete method runs the LLM in a separate thread and retrieves the result via the get_response_gen method of the StreamingGeneratorCallbackHandler class. If that thread terminates abnormally before _llm.predict executes, the exception is not handled and get_response_gen enters an infinite loop. An attacker can trigger this by supplying an input of an incorrect type, causing the worker thread to terminate while the consuming process continues to run indefinitely.
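The failure mode is a producer/consumer handshake with no error path. The sketch below reproduces the pattern with standard-library primitives only; the class and method names mirror the report, but the bodies are simplified assumptions for illustration, not the llama_index source.

```python
import queue
import threading
from typing import Generator

# Minimal sketch of the pattern described in the advisory. Names mirror the
# report; the implementations are illustrative, NOT the actual library code.

class StreamingGeneratorCallbackHandler:
    """Receives tokens from the worker thread and replays them to the caller."""

    def __init__(self) -> None:
        self._token_queue = queue.Queue()
        self._done = threading.Event()

    def on_llm_new_token(self, token: str) -> None:
        self._token_queue.put(token)

    def on_llm_end(self) -> None:
        self._done.set()

    def get_response_gen(self) -> Generator[str, None, None]:
        # Spins until the done event is set. If the worker thread raises
        # before any callback fires, neither branch is ever taken and this
        # loop never exits -- the DoS condition described above.
        while True:
            if not self._token_queue.empty():
                yield self._token_queue.get_nowait()
            elif self._done.is_set():
                break


def stream_complete(llm_predict, prompt) -> Generator[str, None, None]:
    """Run a (hypothetical) llm_predict callable in a thread and stream tokens."""
    handler = StreamingGeneratorCallbackHandler()

    def worker() -> None:
        # A prompt of the wrong type makes llm_predict raise immediately; the
        # exception dies with this thread and is never surfaced to the
        # consumer of get_response_gen().
        llm_predict(prompt, callbacks=[handler])

    threading.Thread(target=worker, daemon=True).start()
    return handler.get_response_gen()
```

In this shape, the obvious hardening is to signal completion (or forward the exception) from the worker in a finally block so the consumer always observes termination; per the advisory, v0.12.5 lacks such an error path.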
CVE ID: CVE-2024-12704
CVSS Base Severity: HIGH
CVSS Base Score: 7.5
CVSS Vector: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
Vendor: run-llama
Product: run-llama/llama_index
EPSS Score: 0.06% (estimated probability of exploitation in the wild)
EPSS Percentile: 17.22% (proportion of scored vulnerabilities rated at or below this score)
EPSS Date: 2025-04-18 (date the score was calculated)