org.apache.spark:spark-core_2.11@2.2.2 vulnerabilities

Direct Vulnerabilities

Known vulnerabilities in the org.apache.spark:spark-core_2.11 package. This does not include vulnerabilities belonging to this package’s dependencies.

Vulnerability / Vulnerable Versions
  • Critical severity
Arbitrary Code Execution

org.apache.spark:spark-core_2.11 is a cluster computing system for Big Data.

Affected versions of this package are vulnerable to Arbitrary Code Execution. Spark's standalone resource manager accepts code to execute on a 'master' host, which then runs that code on 'worker' hosts. By design, the master itself does not execute user code. A specially crafted request to the master can, however, cause the master to execute code as well.

Note that this does not affect standalone clusters with authentication enabled. While the master host typically has less outbound access to other resources than a worker, the execution of code on the master is nevertheless unexpected.

How to fix Arbitrary Code Execution?

There is no fixed version for org.apache.spark:spark-core_2.11.
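Since there is no fixed version, the note above points to authentication as the mitigation: standalone clusters with authentication enabled are not affected. A minimal sketch of the relevant settings in conf/spark-defaults.conf (the secret value below is a placeholder; use a strong random value):

```properties
# Enable shared-secret authentication between Spark components.
# Standalone clusters with this enabled are not affected by the issue above.
spark.authenticate          true
spark.authenticate.secret   CHANGE-ME-strong-random-secret
```

Both properties are standard Spark security settings; the same secret must be configured on the master and all workers.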

[0,)
  • Medium severity
Information Exposure

org.apache.spark:spark-core_2.11 is a cluster computing system for Big Data.

Affected versions of this package are vulnerable to Information Exposure. In certain situations Spark writes user data to local disk unencrypted, even when spark.io.encryption.enabled=true. This affects cached blocks that are fetched to disk (controlled by spark.maxRemoteBlockSizeFetchToMem); parallelize in SparkR; broadcast and parallelize in PySpark; and Python UDFs.

How to fix Information Exposure?

Upgrade org.apache.spark:spark-core_2.11 to version 2.3.3 or higher.
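For a Maven build, the fix amounts to bumping the dependency version in pom.xml; a minimal sketch (2.3.3 shown as the minimum fixed release):

```xml
<!-- Upgrade spark-core to a release at or above 2.3.3 -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.3.3</version>
</dependency>
```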

(,2.3.3)
  • High severity
Information Exposure

org.apache.spark:spark-core_2.11 is a cluster computing system for Big Data.

Affected versions of this package are vulnerable to Information Exposure. A specially crafted request to the Zinc compile server could cause it to reveal information from files readable to the developer account running the build.

Note: This vulnerability only affects developers building Spark from source code; it does not affect Spark end users.

How to fix Information Exposure?

Upgrade org.apache.spark:spark-core_2.11 to version 2.4.0 or higher.
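In an sbt build, the equivalent upgrade is a one-line change (2.4.0 shown as the minimum fixed release; `%%` appends the Scala binary suffix, so this assumes scalaVersion is set to a 2.11.x release):

```scala
// build.sbt: pull in a spark-core release at or above 2.4.0
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
```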

[1.3.0,2.4.0)
  • Medium severity
Insecure Defaults

org.apache.spark:spark-core_2.11 is a fast and general cluster computing system for Big Data.

Affected versions of this package are vulnerable to Insecure Defaults. Apache Spark's standalone master exposes a REST API for job submission in addition to the submission mechanism used by spark-submit. In standalone mode, the config property spark.authenticate.secret establishes a shared secret for authenticating job-submission requests made via spark-submit, but the REST API uses neither this nor any other authentication mechanism, and this is not adequately documented. As a result, a user could use the REST API to run a driver program without authenticating, though not to launch executors.

How to fix Insecure Defaults?

There is no fixed version for org.apache.spark:spark-core_2.11. To mitigate the problem, standalone masters should disable the REST API by setting spark.master.rest.enabled to false if it is unused, and/or ensure that all network access to the REST API is restricted to hosts trusted to submit jobs. Mesos users can stop the MesosClusterDispatcher, though that prevents running jobs in cluster mode; alternatively, they can restrict access to the MesosRestSubmissionServer to trusted hosts.
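The first mitigation above is a single entry in conf/spark-defaults.conf on the standalone master:

```properties
# Disable the unauthenticated REST submission endpoint on the
# standalone master (only spark-submit's authenticated path remains).
spark.master.rest.enabled   false
```

This only makes sense if nothing in the environment depends on REST-based job submission; otherwise fall back to network-level restrictions as described above.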

[1.3.0,)