Upgrade vllm to version 0.7.2 or higher.
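After upgrading, it can be worth verifying that the running interpreter actually sees the patched release. A minimal sketch of such a check; the use of the `packaging` library is an assumption about your environment (it ships alongside pip in most setups), not part of this advisory:

```python
# Remediation sketch: after `pip install --upgrade "vllm>=0.7.2"`,
# confirm the deployed environment resolves to the patched release.
from importlib.metadata import version
from packaging.version import Version

installed = Version(version("vllm"))
assert installed >= Version("0.7.2"), f"vllm {installed} is still vulnerable"
print(f"vllm {installed}: patched")
```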
vllm is a high-throughput and memory-efficient inference and serving engine for LLMs.
Affected versions of this package are vulnerable to Use of Weak Hash due to a predictable constant value in Python 3.12's built-in hash() function. An attacker can interfere with subsequent responses and cause unintended behavior by exploiting predictable hash collisions to populate the cache with prompts known to collide with another prompt in use.
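To make the mechanism concrete, below is a minimal sketch of the weakness and of the general mitigation (deriving cache keys from a cryptographic hash instead of the builtin hash()). The function names and cache-key scheme here are illustrative assumptions, not vLLM's actual implementation:

```python
# Sketch of the weakness, not vLLM's code: cache keys derived from
# Python's builtin hash() versus SHA-256. On Python 3.12, hash(None)
# is a fixed constant, and integers hash deterministically, so a key
# chain seeded with None is predictable across processes and colliding
# inputs can be searched for offline.
import hashlib
import pickle
from typing import Optional, Tuple


def weak_block_key(parent: Optional[int], tokens: Tuple[int, ...]) -> int:
    # Builtin hash(): fast, but not collision-resistant, and fully
    # predictable here because None and ints hash deterministically.
    return hash((parent, tokens))


def strong_block_key(parent: Optional[bytes], tokens: Tuple[int, ...]) -> bytes:
    # Cryptographic hash over a canonical serialization: crafting a
    # prompt that collides with another is computationally infeasible.
    return hashlib.sha256(pickle.dumps((parent, tokens))).digest()


if __name__ == "__main__":
    print("hash(None) =", hash(None))  # constant on Python 3.12
    print("weak key  :", weak_block_key(None, (101, 102, 103)))
    print("strong key:", strong_block_key(None, (101, 102, 103)).hex())
```

With the weak scheme, an attacker who can precompute colliding token blocks can seed the cache so that a victim's prompt hits a poisoned entry; a collision-resistant key removes that avenue, which is what upgrading to a patched release addresses.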