Snyk has a proof-of-concept or detailed explanation of how to exploit this vulnerability.
The probability is the direct output of the EPSS model, and conveys an overall sense of the threat of exploitation in the wild. The percentile measures the EPSS probability relative to all known EPSS scores. Note: This data is updated daily, relying on the latest available EPSS model version. Check out the EPSS documentation for more details.
Upgrade llama-index-core to version 0.12.6 or higher.
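If you are unsure which version is installed, a quick check like the sketch below can confirm whether the fixed release is in place. This is not part of the advisory; it assumes the packaging library is available in your environment.
from importlib.metadata import version
from packaging.version import Version

# Sketch (not from the advisory): compare the installed llama-index-core
# version against the fixed release, 0.12.6.
installed = Version(version("llama-index-core"))
if installed < Version("0.12.6"):
    print(f"Vulnerable: llama-index-core {installed} is below 0.12.6 -- upgrade.")
else:
    print(f"OK: llama-index-core {installed} includes the fix.")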
llama-index-core is an interface between LLMs and your data.
Affected versions of this package are vulnerable to Improper Handling of Exceptional Conditions via the stream_complete method of the LangChainLLM class. An attacker can disrupt service availability by supplying an integer instead of a string: the resulting error in the stream function terminates the streaming thread while the process continues to run.
import os

import openai
from langchain_openai import ChatOpenAI
from llama_index.llms.langchain import LangChainLLM

os.environ["OPENAI_API_KEY"] = "<YOUR_API_KEY>"
openai.api_key = os.environ["OPENAI_API_KEY"]

llm = LangChainLLM(llm=ChatOpenAI(model="gpt-3.5-turbo"))

# Passing an integer instead of a string triggers the unhandled error
# in the streaming thread.
messages = 123

# Generate response
response_gen = llm.stream_complete(messages)
for delta in response_gen:
    print(delta.delta, end="")
print("\n")