Latest version: 3.4.1 (package first published 13 years ago; latest version published 2 months ago)
org.apache.hadoop:hadoop-common is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. The table below lists known vulnerabilities in the org.apache.hadoop:hadoop-common package itself; it does not include vulnerabilities belonging to this package's dependencies.
Vulnerability | Vulnerable Versions |
---|---|
Creation of Temporary File in Directory with Insecure Permissions. Note: this vulnerability can only be exploited on unix-like systems, where the system temporary directory is shared between all local users. (See the temporary-file sketch below the table.) | [,3.4.0) |
Arbitrary Code Execution. | [2.0.0,2.10.2) [3.0.0-alpha,3.2.4) [3.3.0,3.3.3) |
Arbitrary File Write via Archive Extraction (Zip Slip). (See the extraction sketch below the table.) | [,2.10.2) [3.0.0-alpha1,3.2.3) [3.3.0,3.3.3-RC0) |
Information Exposure: when Kerberos authentication is enabled and SPNEGO through HTTP is not enabled, any user can access some servlets without authentication. | [3.0.0-alpha2,3.0.1) [2.8.0,2.10.0) |
Information Exposure: a cluster user may be able to expose private files owned by the user running the MapReduce job history server process, by constructing a configuration file containing XML directives that reference sensitive files on the job history server host. | [,2.8.3) [3.0.0-alpha,3.0.0) |

For all of the vulnerabilities above, the fix is the same: upgrade org.apache.hadoop:hadoop-common to a version outside the listed vulnerable ranges.
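The temporary-file entry is an instance of a broader pattern: a file created with default permissions in the shared system temporary directory (such as /tmp) is visible to every local user. The sketch below is not the hadoop-common code; it is a minimal, hypothetical Java illustration of the insecure pattern versus an owner-only alternative, using only standard java.nio.file APIs (the class and method names are made up for the example).

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class TempFileSketch {

    // Insecure pattern: the file is created in the shared system temp
    // directory (e.g. /tmp on unix-like systems) with default permissions,
    // so other local users may be able to read or interfere with it.
    static File sharedTempFile() throws IOException {
        return File.createTempFile("demo-", ".tmp");
    }

    // Hardened pattern: the file is created with owner-only permissions
    // (rw-------) atomically at creation time, so there is no window in
    // which it is readable by other users. Requires a POSIX file system.
    static Path ownerOnlyTempFile() throws IOException {
        Set<PosixFilePermission> ownerOnly = PosixFilePermissions.fromString("rw-------");
        FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions.asFileAttribute(ownerOnly);
        return Files.createTempFile("demo-", ".tmp", attr);
    }

    public static void main(String[] args) throws IOException {
        System.out.println("shared temp file: " + sharedTempFile());
        System.out.println("owner-only file:  " + ownerOnlyTempFile());
    }
}
```

For the CVE listed above, the actual remediation is simply to move to hadoop-common 3.4.0 or later; the sketch only shows why the pattern is risky.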
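Zip Slip, the class of bug behind the Arbitrary File Write entry, occurs when archive entry names containing traversal sequences such as ../ are extracted without validation, letting an attacker write outside the intended output directory. The following generic Java sketch (again, not the hadoop-common extraction code) shows the usual guard: resolve each entry against the destination directory and reject anything that escapes it.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class SafeUnzip {

    /** Extracts a zip stream into destDir, rejecting entries that escape it. */
    static void extract(InputStream in, Path destDir) throws IOException {
        Path root = destDir.toAbsolutePath().normalize();
        try (ZipInputStream zip = new ZipInputStream(in)) {
            for (ZipEntry entry; (entry = zip.getNextEntry()) != null; ) {
                // Normalize the candidate path and verify it is still inside destDir.
                Path target = root.resolve(entry.getName()).normalize();
                if (!target.startsWith(root)) {
                    throw new IOException("Blocked Zip Slip entry: " + entry.getName());
                }
                if (entry.isDirectory()) {
                    Files.createDirectories(target);
                } else {
                    Files.createDirectories(target.getParent());
                    Files.copy(zip, target, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }
}
```

As with every entry in the table, the remediation for the library itself is upgrading to a release outside the vulnerable ranges.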
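Because the fix in every case is an upgrade, it is worth confirming which hadoop-common build actually ends up on the classpath, since a transitive dependency can silently pin an older release. hadoop-common's org.apache.hadoop.util.VersionInfo utility exposes the build version; the snippet below simply prints the resolved version for comparison against the table.

```java
import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Prints the version of hadoop-common resolved at runtime,
        // e.g. "3.4.1"; compare it against the vulnerable ranges above.
        System.out.println("hadoop-common version: " + VersionInfo.getVersion());
    }
}
```

If the printed version is older than expected, force the newer artifact in the build (for example via Maven dependencyManagement or a Gradle resolution strategy) rather than relying on default dependency resolution.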