Latest version: 0.11.0
First published: 2 years ago
Latest version published: 9 days ago
Known vulnerabilities in the vllm package. This does not include vulnerabilities belonging to this package’s dependencies.
## Server-side Request Forgery (SSRF)

vllm is a high-throughput and memory-efficient inference and serving engine for LLMs. Affected versions of this package are vulnerable to Server-side Request Forgery (SSRF) via the MediaConnector, which fetches media from user-supplied URLs. Note: this vulnerability is particularly critical in containerized environments, where internal services and cloud metadata endpoints may be reachable from the application.

### Workaround

To address this vulnerability, it is essential to restrict the URLs that the MediaConnector can access. The principle of least privilege should be applied. It is recommended to implement a configurable allowlist or denylist for domains and IP addresses, and to validate each URL before any request is made.

### How to fix

Upgrade vllm to version 0.11.0 or higher. Vulnerable versions: [0.5.0,0.11.0).
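A URL check along the lines of the recommended workaround could be sketched as follows. This is an illustrative example only, not vllm's actual API: the `ALLOWED_HOSTS` set and the `is_allowed_url` function are hypothetical names, and a real deployment would make the allowlist configurable.

```python
import ipaddress
import socket
from urllib.parse import urlparse

# Hypothetical allowlist -- replace with the hosts your deployment actually needs.
ALLOWED_HOSTS = {"media.example.com"}

def is_allowed_url(url: str) -> bool:
    """Allow only http(s) URLs whose host is allowlisted and does not
    resolve to a private, loopback, or link-local address."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False
    host = parsed.hostname
    if host is None or host not in ALLOWED_HOSTS:
        return False
    try:
        # Reject hosts that resolve to internal addresses (SSRF pivot targets
        # such as cloud metadata services or in-cluster endpoints).
        for info in socket.getaddrinfo(host, None):
            addr = ipaddress.ip_address(info[4][0])
            if addr.is_private or addr.is_loopback or addr.is_link_local:
                return False
    except socket.gaierror:
        return False
    return True
```

Checking the resolved addresses, not just the hostname, matters because an attacker-controlled DNS name can point at an internal IP.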
## Allocation of Resources Without Limits or Throttling

Affected versions of vllm are vulnerable to Allocation of Resources Without Limits or Throttling, allowing requests to consume server resources without an enforced bound.

### How to fix

Upgrade vllm to version 0.11.0 or higher. Vulnerable versions: [,0.11.0).
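A common mitigation for this vulnerability class is to bound how much work can be in flight at once. The sketch below is generic Python, not vllm's code; `handle_request` and the limit of 4 are illustrative assumptions.

```python
import asyncio

async def handle_request(sem: asyncio.Semaphore, payload: str) -> str:
    # The semaphore caps concurrency: callers wait when all slots are taken,
    # so a flood of requests cannot allocate unbounded resources at once.
    async with sem:
        await asyncio.sleep(0.01)  # stand-in for expensive inference work
        return f"processed:{payload}"

async def main() -> list[str]:
    sem = asyncio.Semaphore(4)  # at most 4 requests run concurrently
    return await asyncio.gather(
        *(handle_request(sem, str(i)) for i in range(16))
    )

results = asyncio.run(main())
```

Real deployments typically combine such a concurrency cap with request-size limits and per-client rate limiting.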
## Covert Timing Channel

Affected versions of vllm are vulnerable to a Covert Timing Channel, through which an attacker may infer secret-dependent information from differences in response times.

### How to fix

Upgrade vllm to version 0.11.0 or higher. Vulnerable versions: [,0.11.0).
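Timing channels of this kind typically arise when secret-dependent comparisons short-circuit at the first mismatch. The advisory does not name the affected component, so the following is a generic illustration of the standard mitigation, constant-time comparison; `check_token` is a hypothetical name.

```python
import hmac

def check_token(supplied: str, expected: str) -> bool:
    # hmac.compare_digest takes time independent of where the first
    # mismatch occurs, unlike `==`, which short-circuits and can leak
    # the length of the matching prefix through response timing.
    return hmac.compare_digest(supplied.encode(), expected.encode())
```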
## Deserialization of Untrusted Data (V0 engine)

Affected versions of vllm are vulnerable to Deserialization of Untrusted Data when the V0 engine is used with multi-host tensor parallelism. Note: the V0 engine is off by default since v0.8.0, and the V1 engine is not affected. Due to the V0 engine's deprecated status and the invasive nature of a fix, the developers recommend ensuring a secure network environment if the V0 engine with multi-host tensor parallelism is still in use.

### How to fix

There is no fixed version for vllm. Vulnerable versions: [0.5.2,).
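In Python, vulnerabilities of this class typically involve `pickle`, whose deserialization can execute an attacker-chosen callable. The snippet below is a generic, deliberately benign illustration of the mechanism (it is not vllm code): a pickled object's `__reduce__` decides what runs at load time.

```python
import pickle

class Malicious:
    def __reduce__(self):
        # __reduce__ tells pickle how to reconstruct the object: a
        # (callable, args) pair that pickle.loads will invoke. A real
        # exploit would return a dangerous call such as os.system here;
        # this benign example just calls sorted().
        return (sorted, ([3, 1, 2],))

blob = pickle.dumps(Malicious())
# Unpickling does not return a Malicious instance -- it executes the
# attacker-chosen callable and returns its result.
result = pickle.loads(blob)  # -> [1, 2, 3]
```

This is why the recommended mitigation is network-level: pickle data received from any untrusted host must be treated as arbitrary code execution.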
## Deserialization of Untrusted Data

Affected versions of vllm are vulnerable to Deserialization of Untrusted Data.

### How to fix

There is no fixed version for vllm. Vulnerable versions: [0,).