8 Aug 2019
7 Aug 2019
How to fix?
Upgrade org.apache.spark:spark-core_2.11 to version 2.3.3 or higher.
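For an sbt-based project, the upgrade is a one-line change to the dependency declaration. This is a minimal sketch assuming sbt and a Scala 2.11 build; adjust for Maven or Gradle as appropriate:

```scala
// build.sbt — pin spark-core to a fixed version at or above 2.3.3
scalaVersion := "2.11.12"

// The %% operator appends the Scala binary version, resolving to
// the patched org.apache.spark:spark-core_2.11:2.3.3 artifact.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.3"
```

After changing the version, rebuild and re-run your dependency scan to confirm the vulnerable version is no longer on the classpath (including via transitive dependencies).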
org.apache.spark:spark-core_2.11 is a cluster computing system for Big Data.
Affected versions of this package are vulnerable to Information Exposure. In certain situations Spark would write user data to local disk unencrypted, even if
spark.io.encryption.enabled=true. This includes cached blocks that are fetched to disk (controlled by
spark.maxRemoteBlockSizeFetchToMem); in SparkR, using parallelize; in PySpark, using broadcast and parallelize; and use of Python UDFs.
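The two settings named above are ordinary Spark configuration keys. The sketch below, with a hypothetical application name, shows where they would be set; note that on affected versions the encryption flag alone did not protect the disk paths listed above, which is why the version upgrade is the actual fix:

```scala
// Configuration sketch — setting I/O encryption on a Spark context.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("encrypted-io-example") // hypothetical app name
  // Requests encryption of Spark's local-disk I/O (shuffle, spilled
  // and cached blocks). On versions before 2.3.3 some of the paths
  // described above still wrote plaintext despite this flag.
  .set("spark.io.encryption.enabled", "true")
  // Remote blocks larger than this threshold are fetched to disk
  // instead of memory — one of the affected code paths.
  .set("spark.maxRemoteBlockSizeFetchToMem", "200m")

val sc = new SparkContext(conf)
```

The same keys can equally be set in spark-defaults.conf or via --conf flags to spark-submit.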