Insecure Defaults Affecting org.apache.spark:spark-core_2.11 package, versions [1.3.0,]



  • Attack Complexity: High
  • EPSS: 97.43% (100th percentile)
  • CVSS: 4.2 (medium)

  • published 15 Aug 2018
  • disclosed 13 Aug 2018
  • credit Imran Rashid

How to fix?

There is no fixed version for org.apache.spark:spark-core. To mitigate the problem, standalone masters should disable the REST API by setting spark.master.rest.enabled to false if it is unused, and/or ensure that all network access to the REST API is restricted to hosts that are trusted to submit jobs. Mesos users can stop the MesosClusterDispatcher, though that will prevent them from running jobs in cluster mode. Alternatively, they can restrict access to the MesosRestSubmissionServer to trusted hosts.
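As a minimal sketch of the standalone-mode mitigation, the REST submission server can be disabled in `conf/spark-defaults.conf` (assuming it is unused; the port number below is the default, adjust to your deployment):

```
# conf/spark-defaults.conf
# Disable the standalone master's REST submission server (port 6066 by
# default), which accepts job submissions without authentication.
spark.master.rest.enabled  false
```

If the REST API must remain enabled, firewall rules should limit access to its port to hosts trusted to submit jobs.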


org.apache.spark:spark-core is a fast and general cluster computing system for Big Data.

Affected versions of this package are vulnerable to Insecure Defaults. Apache Spark's standalone master exposes a REST API for job submission, in addition to the submission mechanism used by spark-submit. In standalone mode, the config property spark.authenticate.secret establishes a shared secret for authenticating requests submitted via spark-submit. The REST API does not use this or any other authentication mechanism, and this is not adequately documented. Consequently, a user can run a driver program on the cluster via the REST API without authenticating, although the API cannot be used to launch executors.
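To illustrate the exposure, the following sketch builds the kind of JSON body the standalone master's REST submission endpoint accepts (a POST to `/v1/submissions/create` on port 6066 by default). All field values here are hypothetical placeholders; the point is that no shared secret or credential appears anywhere in the request:

```python
import json

# Hypothetical CreateSubmissionRequest body for the standalone master's
# REST submission endpoint (POST http://<master>:6066/v1/submissions/create).
# Note the absence of any authentication field: the endpoint does not check
# spark.authenticate.secret or any other credential.
submission = {
    "action": "CreateSubmissionRequest",
    "clientSparkVersion": "2.3.1",               # placeholder version
    "appResource": "hdfs://host/path/app.jar",   # placeholder artifact
    "mainClass": "com.example.Main",             # placeholder class
    "appArgs": [],
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    "sparkProperties": {
        "spark.app.name": "example",
        "spark.master": "spark://host:6066",
        "spark.submit.deployMode": "cluster",
    },
}

body = json.dumps(submission)
print(body)
```

Any host that can reach the master's REST port can send such a request, which is why restricting network access to trusted submission hosts is the key mitigation when the API stays enabled.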