Information Exposure Affecting org.apache.spark:spark-core_2.10 package, versions [0,]


  • snyk-id


  • published

    8 Aug 2019

  • disclosed

    7 Aug 2019

  • credit

    Thomas Graves

How to fix?

There is no fixed version for org.apache.spark:spark-core_2.10.

org.apache.spark:spark-core_2.10 is a cluster computing system for Big Data.

Affected versions of this package are vulnerable to Information Exposure. In certain situations, Spark would write user data to local disk unencrypted, even if spark.io.encryption.enabled=true. This includes cached blocks that are fetched to disk (controlled by spark.maxRemoteBlockSizeFetchToMem); in SparkR, using parallelize; in PySpark, using broadcast and parallelize; and use of Python UDFs.
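For illustration, the two settings mentioned above are standard Spark configuration keys that can be set in spark-defaults.conf; a minimal sketch (note that on affected versions, the data paths listed above wrote to disk unencrypted even with encryption enabled):

```properties
# Encrypt data that Spark spills or caches to local disk
spark.io.encryption.enabled        true

# Size threshold above which remote blocks are fetched to disk
# instead of memory; such on-disk blocks are part of the exposure
spark.maxRemoteBlockSizeFetchToMem 200m
```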