org.apache.spark:spark-network-common_2.12@2.4.1 vulnerabilities

Direct Vulnerabilities

Known vulnerabilities in the org.apache.spark:spark-network-common_2.12 package. This does not include vulnerabilities belonging to this package’s dependencies.

  • Medium severity: Information Exposure

org.apache.spark:spark-network-common_2.12 is an open-source distributed general-purpose cluster-computing framework.

Affected versions of this package are vulnerable to Information Exposure. When RPC authentication and encryption are enabled (spark.authenticate and spark.network.crypto.enabled), Spark uses a bespoke mutual authentication protocol that allows for full encryption key recovery. A malicious actor able to intercept the RPC traffic could use this to recover the key and decrypt captured network traffic offline.

How to fix Information Exposure?

Upgrade org.apache.spark:spark-network-common_2.12 to version 3.1.3 or higher.

Vulnerable versions: [,3.1.3)
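As a rough illustration of that remediation, a minimal build.sbt sketch, assuming an sbt build on Scala 2.12 where the Spark artifacts are declared directly (artifact names and surrounding versions are illustrative, not taken from this advisory):

```scala
// build.sbt sketch (assumption: sbt, Scala 2.12, Spark declared directly).
// Moving to 3.1.3 or later picks up the fixed RPC key-negotiation code in
// spark-network-common_2.12.
ThisBuild / scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // Upgrading the top-level Spark dependency pulls in the patched
  // spark-network-common_2.12 transitively.
  "org.apache.spark" %% "spark-core" % "3.1.3",
  // If the module is also declared explicitly, bump it in lockstep.
  "org.apache.spark" %% "spark-network-common" % "3.1.3"
)
```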
  • Critical severity: Remote Code Execution (RCE)

org.apache.spark:spark-network-common_2.12 is an open-source distributed general-purpose cluster-computing framework.

Affected versions of this package are vulnerable to Remote Code Execution (RCE). A standalone resource manager's master may be configured to require authentication (spark.authenticate) via a shared secret. When enabled, however, a specially-crafted RPC to the master can succeed in starting an application's resources on the Spark cluster, even without the shared secret. This can be leveraged to execute shell commands on the host machine. This does not affect Spark clusters using other resource managers (YARN, Mesos, etc.).
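For context, a minimal driver-side sketch of the shared-secret configuration that paragraph refers to, assuming a hypothetical standalone master URL; spark.authenticate and spark.authenticate.secret are standard Spark settings, and on affected versions this check alone does not block the crafted-RPC bypass:

```scala
import org.apache.spark.SparkConf

// Sketch only: the master URL and the secret source are placeholders.
val conf = new SparkConf()
  .setMaster("spark://standalone-master:7077") // hypothetical standalone master
  .setAppName("authenticated-app")
  .set("spark.authenticate", "true")
  // The shared secret that an affected master fails to enforce against a
  // specially-crafted RPC.
  .set("spark.authenticate.secret", sys.env.getOrElse("SPARK_RPC_SECRET", "change-me"))
```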

How to fix Remote Code Execution (RCE)?

Upgrade org.apache.spark:spark-network-common_2.12 to version 2.4.6, 3.0.0 or higher.

Vulnerable versions: [3.0.0-preview,3.0.0), [,2.4.6)
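The dependency bump follows the same pattern as the earlier sketch; if spark-network-common_2.12 only arrives transitively (for example via spark-core), one stopgap in an sbt build is to force the patched version until the whole Spark deployment is upgraded:

```scala
// build.sbt sketch (assumption: sbt build staying on the Spark 2.4.x line).
// Forcing the patched module is a stopgap; upgrading the full Spark
// deployment to 2.4.6+ (or 3.0.0+) is the cleaner fix, since the standalone
// master itself must also run a patched version.
dependencyOverrides += "org.apache.spark" %% "spark-network-common" % "2.4.6"
```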