vllm@0.9.0 vulnerabilities

A high-throughput and memory-efficient inference and serving engine for LLMs

Direct Vulnerabilities

Known vulnerabilities in the vllm package. This does not include vulnerabilities belonging to this package’s dependencies.

  • High severity: Deserialization of Untrusted Data

vllm is a high-throughput and memory-efficient inference and serving engine for LLMs.

Affected versions of this package are vulnerable to Deserialization of Untrusted Data via the SUB ZeroMQ socket, where deserialization is performed with the unsafe pickle library. An attacker on the same cluster can execute arbitrary code on the remote machine by sending maliciously crafted serialized payloads.

Note: The V0 engine has been off by default since v0.8.0, and the V1 engine is not affected. Due to the V0 engine's deprecated status and the invasive nature of a fix, the developers recommend ensuring a secure network environment if the V0 engine with multi-host tensor parallelism is still in use.
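
The following self-contained sketch (not vllm code, and not the actual exploit) illustrates the underlying primitive: pickle.loads() invokes a payload's __reduce__ hook during deserialization, so any peer able to publish to the SUB socket can execute code on the subscriber.

    import os
    import pickle

    class Payload:
        def __reduce__(self):
            # The unpickler calls this on load; an attacker would
            # substitute any command they like for "id".
            return (os.system, ("id",))

    crafted = pickle.dumps(Payload())   # what a malicious publisher would send
    pickle.loads(crafted)               # receiving side: "id" executes during deserialization

This is why the advisory treats any untrusted peer on the cluster network as equivalent to code execution on the host running the V0 engine.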

How to fix Deserialization of Untrusted Data?

There is no fixed version for vllm.

Vulnerable versions: [0.5.2,)
  • High severity: Deserialization of Untrusted Data

vllm is a high-throughput and memory-efficient inference and serving engine for LLMs.

Affected versions of this package are vulnerable to Deserialization of Untrusted Data in the MessageQueue.dequeue() API function. An attacker can execute arbitrary code by sending a malicious payload to the message queue.

How to fix Deserialization of Untrusted Data?

There is no fixed version for vllm.

Vulnerable versions: [0,)
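
Since no fixed version exists, one defence-in-depth option (a hypothetical hardening sketch, not part of vllm) is to authenticate serialized payloads with an HMAC over a pre-shared secret before they are ever passed to pickle.loads(), so that only trusted producers can place messages on the queue.

    import hashlib
    import hmac
    import pickle

    SECRET = b"pre-shared-key-distributed-out-of-band"  # assumption: trusted nodes share this key

    def sign_and_dump(obj) -> bytes:
        """Serialize obj and prepend an HMAC-SHA256 tag over the pickle bytes."""
        body = pickle.dumps(obj)
        tag = hmac.new(SECRET, body, hashlib.sha256).digest()
        return tag + body

    def verify_and_load(blob: bytes):
        """Reject the message unless the tag matches; only then unpickle."""
        tag, body = blob[:32], blob[32:]
        expected = hmac.new(SECRET, body, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("unauthenticated payload rejected before deserialization")
        return pickle.loads(body)

    # Round trip: a signed message loads; a tampered one is rejected.
    blob = sign_and_dump({"request_id": 1})
    assert verify_and_load(blob) == {"request_id": 1}

This does not make pickle safe against a peer that already holds the key; restricting network access to the queue, as the maintainers advise for the V0 engine, remains the primary mitigation.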