org.apache.hadoop:hadoop-common@2.0.1-alpha vulnerabilities

Direct Vulnerabilities

Known vulnerabilities in the org.apache.hadoop:hadoop-common package. This does not include vulnerabilities belonging to this package’s dependencies.

  • Creation of Temporary File in Directory with Insecure Permissions (medium severity)

org.apache.hadoop:hadoop-common is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models.

Affected versions of this package are vulnerable to Creation of Temporary File in Directory with Insecure Permissions in the RunJar.run() method. If sensitive data is written to such a temporary file, a local attacker can read it by accessing the shared temporary directory where it is created.

Note:

This vulnerability can only be exploited on Unix-like systems, where the system temporary directory is shared among all local users.
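
The underlying hazard is easy to demonstrate outside Hadoop. The sketch below is illustrative, not Hadoop's actual RunJar code: it contrasts a directory created in the shared system temp directory with default (umask-derived) permissions against java.nio's Files.createTempDirectory(), which creates the directory owner-only (rwx------) on POSIX systems.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirPermissions {
    public static void main(String[] args) throws IOException {
        // Vulnerable shape: a directory created this way inherits the
        // process umask, so on a shared /tmp other local users may be
        // able to list or read its contents.
        File insecure = new File(System.getProperty("java.io.tmpdir"),
                "unjar-demo-" + System.nanoTime());
        insecure.mkdir();

        // Safer shape: created with owner-only permissions on POSIX systems.
        Path secure = Files.createTempDirectory("unjar-demo-");

        System.out.println("insecure: " + insecure);
        System.out.println("secure:   " + secure);

        // Clean up the demo directories.
        insecure.delete();
        Files.delete(secure);
    }
}
```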

How to fix Creation of Temporary File in Directory with Insecure Permissions?

Upgrade org.apache.hadoop:hadoop-common to version 3.4.0 or higher.

Vulnerable versions: [,3.4.0)
  • Arbitrary Code Execution (critical severity)


Affected versions of this package are vulnerable to Arbitrary Code Execution via the FileUtil.unTar() API, due to improper escaping of the input file name before it is passed to the shell.

Note:

In vulnerable 3.3.x versions, FileUtil.unTar() is used through InMemoryAliasMap.completeBootstrapTransfer, which is only ever run by a local user.
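
The bug class here is unescaped interpolation of a file name into a shell command line. The sketch below is a generic illustration of that shape and the usual remedy, not FileUtil's actual implementation:

```java
import java.io.IOException;

public class UntarCommandDemo {
    // Vulnerable shape: an attacker-influenced file name spliced into a
    // shell command means a name like "x.tar; id" terminates the tar
    // invocation and runs the injected command.
    static void untarViaShell(String archive, String dir) throws IOException {
        String cmd = "cd '" + dir + "' && tar -xf " + archive; // name unescaped
        new ProcessBuilder("bash", "-c", cmd).inheritIO().start();
    }

    // Safer shape: pass the file name as a discrete argument so no shell
    // ever parses it.
    static void untarDirect(String archive, String dir) throws IOException {
        new ProcessBuilder("tar", "-xf", archive, "-C", dir).inheritIO().start();
    }

    public static void main(String[] args) {
        // Printed only; calling untarViaShell() with this name would run "id".
        System.out.println("hostile file name: archive.tar; id");
    }
}
```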

How to fix Arbitrary Code Execution?

Upgrade org.apache.hadoop:hadoop-common to version 2.10.2, 3.2.4, 3.3.3 or higher.

Vulnerable versions: [2.0.0,2.10.2) [3.0.0-alpha,3.2.4) [3.3.0,3.3.3)
  • Arbitrary File Write via Archive Extraction (Zip Slip) (critical severity)


Affected versions of this package are vulnerable to Arbitrary File Write via Archive Extraction (Zip Slip) in FileUtil: unpackEntries() follows symbolic links during TAR extraction, which allows writing outside the expected base directory on Windows, because getCanonicalPath() does not resolve symbolic links on that platform.
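
A common mitigation for this bug class is a containment check on every entry before writing it, as in the minimal sketch below (illustrative, not Hadoop's patch). Note that a string-level check catches ".." traversal but not the symlink case described above: once an archive entry has created a symlink inside the target, later entries can escape through it, so a robust extractor must also resolve the real on-disk parent (e.g. with Path.toRealPath(), which, unlike File.getCanonicalPath(), resolves symbolic links on Windows) or refuse to follow symlink entries.

```java
import java.io.IOException;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ExtractionPathCheck {
    // Resolve the entry name against the target directory, normalize away
    // ".." components, and reject anything that lands outside the target.
    static Path safeResolve(Path targetDir, String entryName) throws IOException {
        Path base = targetDir.toAbsolutePath().normalize();
        Path resolved = base.resolve(entryName).normalize();
        if (!resolved.startsWith(base)) {
            throw new IOException("entry escapes target directory: " + entryName);
        }
        return resolved;
    }

    public static void main(String[] args) throws IOException {
        Path target = Paths.get("out");
        System.out.println(safeResolve(target, "data/file.txt")); // accepted
        safeResolve(target, "../../etc/passwd");                  // throws
    }
}
```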

How to fix Arbitrary File Write via Archive Extraction (Zip Slip)?

Upgrade org.apache.hadoop:hadoop-common to version 2.10.2, 3.2.3, 3.3.3-RC0 or higher.

Vulnerable versions: [,2.10.2) [3.0.0-alpha1,3.2.3) [3.3.0,3.3.3-RC0)
  • Information Exposure (medium severity)


Affected versions of this package are vulnerable to Information Exposure. A malicious cluster user can construct a configuration file containing XML directives that reference sensitive files on the MapReduce job history server host, thereby exposing private files owned by the user running the job history server process.
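
The vector is classic XML external entity (XXE) expansion: a directive such as <!ENTITY x SYSTEM "file:///etc/passwd"> in a user-supplied XML file discloses local file contents when the parser expands the entity. The sketch below shows the standard JAXP hardening pattern, not Hadoop's specific patch:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class HardenedXmlParser {
    // Disable DOCTYPE declarations and external entity resolution so a
    // hostile configuration file cannot pull in local files.
    static DocumentBuilderFactory hardenedFactory() throws ParserConfigurationException {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);
        return dbf;
    }

    public static void main(String[] args) throws ParserConfigurationException {
        hardenedFactory();
        System.out.println("parser factory hardened against external entities");
    }
}
```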

How to fix Information Exposure?

Upgrade org.apache.hadoop:hadoop-common to version 2.8.3, 3.0.0 or higher.

Vulnerable versions: [,2.8.3) [3.0.0-alpha,3.0.0)
  • Cryptographic Weakness (critical severity)


Affected versions of this package are vulnerable to Cryptographic Weakness due to an insufficient secret key length. The package generates token passwords using a 20-bit secret when Kerberos security features are enabled, which makes it easier for context-dependent attackers to crack secret keys via a brute-force attack.
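
To see why 20 bits is inadequate: the whole keyspace is only about a million values. The sketch below shows the arithmetic, and generating an adequately sized HMAC key with the standard JCA API (the general remedy, not the exact code path Hadoop patched):

```java
import javax.crypto.KeyGenerator;
import java.security.NoSuchAlgorithmException;

public class SecretKeySizeDemo {
    public static void main(String[] args) throws NoSuchAlgorithmException {
        // A 20-bit secret has only 2^20 = 1,048,576 possible values, a
        // keyspace an attacker can exhaust almost instantly.
        System.out.println("20-bit keyspace: " + (1L << 20));

        // By contrast, a 256-bit HMAC key (2^256 values) is far beyond
        // any brute-force search.
        KeyGenerator kg = KeyGenerator.getInstance("HmacSHA256");
        kg.init(256);
        System.out.println("generated key bits: "
                + kg.generateKey().getEncoded().length * 8);
    }
}
```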

How to fix Cryptographic Weakness?

Upgrade org.apache.hadoop:hadoop-common to version 0.23.4, 2.0.2-alpha or higher.

Vulnerable versions: [,0.23.4) [2.0.0-alpha,2.0.2-alpha)
  • Information Exposure (critical severity)


Affected versions of this package are vulnerable to Information Exposure.

If the CredentialProvider feature is used to encrypt passwords in NodeManager configs, any container launched by that NodeManager may be able to gain access to the encryption password. The protected passwords themselves are not directly exposed.
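
For context, the sketch below shows how code typically resolves an encrypted password through Hadoop's CredentialProvider API; the exposure described above concerns the credential store's own encryption password (for example, one supplied through the process environment) being inherited by launched containers. The keystore path and alias here are hypothetical:

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;

public class CredentialLookupDemo {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // getPassword() consults the providers configured on this path
        // before falling back to any plain-text value in the config itself.
        conf.set("hadoop.security.credential.provider.path",
                "jceks://file/etc/hadoop/creds.jceks"); // hypothetical store
        char[] secret = conf.getPassword("my.service.password"); // hypothetical alias
        System.out.println(secret == null ? "alias not found" : "password resolved");
    }
}
```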

How to fix Information Exposure?

Upgrade org.apache.hadoop:hadoop-common to version 2.6.5, 2.7.3 or higher.

Vulnerable versions: [,2.6.5) [2.7.0,2.7.3)