Remote Code Execution (RCE) Affecting org.apache.spark:spark-network-common_2.12 package, versions [3.0.0-preview, 3.0.0) [,2.4.6)
24 Jun 2020
How to fix?
Upgrade org.apache.spark:spark-network-common_2.12 to version 2.4.6, 3.0.0 or higher.
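As a sketch of the fix, an sbt build on the 2.4.x line could pin the patched version like this (the `%%` operator appends the Scala binary version, resolving to `spark-network-common_2.12`; project details here are illustrative):

```scala
// build.sbt — pin the patched release of the affected module.
// In practice this version usually comes in transitively via spark-core,
// so upgrading the top-level Spark dependency to 2.4.6 / 3.0.0 is the
// more common fix.
libraryDependencies += "org.apache.spark" %% "spark-network-common" % "2.4.6"
```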
org.apache.spark:spark-network-common_2.12 is a networking module of Apache Spark, an open-source distributed general-purpose cluster-computing framework.
Affected versions of this package are vulnerable to Remote Code Execution (RCE). A standalone resource manager's master may be configured to require authentication (`spark.authenticate`) via a shared secret. Even when this is enabled, however, a specially crafted RPC to the master can succeed in starting an application's resources on the Spark cluster without presenting the shared secret. This can be leveraged to execute shell commands on the host machine. Spark clusters using other resource managers (YARN, Mesos, etc.) are not affected.
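For context, the authentication setting in question looks like the following in `spark-defaults.conf` (secret value is a placeholder). Note that on affected versions this setting alone does not block the crafted RPC; upgrading is required:

```properties
# spark-defaults.conf — standalone-mode RPC authentication via shared secret.
# On vulnerable versions of spark-network-common, this check can be bypassed.
spark.authenticate        true
spark.authenticate.secret <your-shared-secret>
```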