Insecure Defaults Affecting org.apache.spark:spark-core_2.10 package, versions [1.3.0,]


  • severity

    medium

  • Attack Complexity

    High

  • snyk-id

    SNYK-JAVA-ORGAPACHESPARK-1298182

  • published

    15 Aug 2018

  • disclosed

    13 Aug 2018

  • credit

    Imran Rashid

Overview

org.apache.spark:spark-core is a fast and general cluster computing system for Big Data.

Affected versions of this package are vulnerable to Insecure Defaults. Apache Spark's standalone master exposes a REST API for job submission, in addition to the submission mechanism used by spark-submit. In standalone mode, the config property spark.authenticate.secret establishes a shared secret for authenticating jobs submitted via spark-submit. The REST API, however, does not use this or any other authentication mechanism, and this is not adequately documented. As a result, a user can use the REST API to run a driver program without authenticating, though they cannot launch executors this way.
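To illustrate the gap, the sketch below builds the kind of JSON body the standalone master's REST submission endpoint (by default served at port 6066 under /v1/submissions/create) accepts. Note that no shared secret or credential appears anywhere in the request. Field names follow Spark's CreateSubmissionRequest message format, which is not part of the documented public API; the jar path, class name, and master URL are illustrative placeholders, not values from this advisory.

```python
import json

# Sketch of a job-submission request to the standalone master's REST
# endpoint (http://<master>:6066/v1/submissions/create). Nothing in
# this payload carries spark.authenticate.secret or any other
# credential -- that absence is the insecure default described above.
# The jar path, class name, and master URL are placeholders.
submission = {
    "action": "CreateSubmissionRequest",
    "clientSparkVersion": "1.3.0",
    "appResource": "hdfs://example/path/app.jar",       # placeholder
    "mainClass": "com.example.Main",                    # placeholder
    "appArgs": [],
    "sparkProperties": {
        "spark.app.name": "example",
        "spark.master": "spark://master.example:6066",  # placeholder
    },
    "environmentVariables": {},
}

body = json.dumps(submission)
print(body)
```

Posting this body to the endpoint (for example with an HTTP client) would be accepted by an exposed master without any authentication check, which is why restricting network access to the port matters.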

Remediation

There is no fix version for org.apache.spark:spark-core. To mitigate the problem, operators of standalone masters should disable the REST API by setting spark.master.rest.enabled to false if it is unused, and/or ensure that all network access to the REST API is restricted to hosts that are trusted to submit jobs. Mesos users can stop the MesosClusterDispatcher, though that will prevent them from running jobs in cluster mode. Alternatively, they can ensure access to the MesosRestSubmissionServer is restricted to trusted hosts.
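For a standalone master, the first mitigation can be expressed as configuration, for example in spark-defaults.conf on the master host (the property names are from the text above; the file path and secret value are illustrative):

```properties
# conf/spark-defaults.conf on the standalone master (illustrative location)

# Disable the unauthenticated REST submission server if it is unused.
spark.master.rest.enabled  false

# spark.authenticate.secret protects the spark-submit path only; it does
# NOT protect the REST API, so it is not a substitute for the setting
# above or for network-level restrictions on the REST port.
spark.authenticate         true
spark.authenticate.secret  <shared-secret>
```

If the REST API must remain enabled, firewall rules restricting its port to trusted submission hosts are the remaining line of defense.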