org.apache.kafka:kafka-clients@3.3.1

  • latest version: 4.2.0

  • first published: 11 years ago

  • latest version published: 2 months ago

  • Direct Vulnerabilities

    Known vulnerabilities in the org.apache.kafka:kafka-clients package. This does not include vulnerabilities belonging to this package’s dependencies.

    • Race Condition (medium severity)

    org.apache.kafka:kafka-clients is a streaming platform that can publish and subscribe to streams of records, store streams of records in a fault-tolerant durable way, and process streams of records as they occur.

    Affected versions of this package are vulnerable to Race Condition in Sender.sendProducerData, involving ByteBuffer reuse. If delivery.timeout.ms expires before the original network request completes, the producer prematurely returns the record's buffer to the buffer pool, where it can be reused for a different request; an attacker can exploit this to cause messages to be delivered to unintended topics. Consumers with access to the destination topic but not the intended source topic may receive the reused buffer's contents without any error being emitted, and the corrupted messages may themselves cause errors or deserialization failures.

    The project maintainers note: "This bug has existed for more than a decade (since Kafka 0.x it seems), but never manifested because prior to 2.8.0 the pooled ByteBuffer (which contained record data aka your publishes) was copied into a freshly allocated ByteBuffer before any potential reuse and that fresh ByteBuffer was what got written over the network to the broker. With a change included in 2.8.0, the pooled ByteBuffer remains as-is inside of a MemoryRecords instance and this pooled ByteBuffer (which in some cases can be reused and overwritten with other data) is written over the network."

    How to fix Race Condition?

    Upgrade org.apache.kafka:kafka-clients to version 3.9.2, 4.0.2, 4.1.2, or higher.

    Vulnerable versions: [2.8.0,3.9.2), [4.0.0,4.0.2), [4.1.0,4.1.2)
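The only real fix is the upgrade above. As an illustration of the timeout relationship involved, here is a minimal Java sketch of producer settings that keep delivery.timeout.ms comfortably above request.timeout.ms plus linger.ms; the property names are standard producer configs, the values are illustrative, and tuning timeouts is not a substitute for upgrading:

```java
import java.util.Properties;

public class ProducerTimeoutConfig {
    // Sketch: producer properties with a generous delivery.timeout.ms.
    // The race involves the buffer being reclaimed when delivery.timeout.ms
    // fires before the in-flight request finishes, so the timeout should
    // always exceed request.timeout.ms + linger.ms by a wide margin.
    public static Properties producerProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("delivery.timeout.ms", "300000"); // 5 minutes
        props.put("request.timeout.ms", "30000");   // 30 seconds
        props.put("linger.ms", "5");
        return props;
    }

    public static void main(String[] args) {
        Properties p = producerProps("localhost:9092");
        System.out.println(p.getProperty("delivery.timeout.ms"));
    }
}
```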
    • Deserialization of Untrusted Data (high severity)

    Affected versions of this package are vulnerable to Deserialization of Untrusted Data due to improper handling of configuration data in the sasl.jaas.config property. An attacker can achieve arbitrary code execution by injecting a malicious configuration that causes the server to connect to an attacker-controlled LDAP server and deserialize untrusted data, leading to execution of deserialization gadget chains.

    Note:

    This is only exploitable if the attacker has AlterConfigs access for a cluster resource, or has access to a Kafka Connect worker and can create or modify connectors with an arbitrary Kafka client SASL JAAS configuration.

    How to fix Deserialization of Untrusted Data?

    Upgrade org.apache.kafka:kafka-clients to version 3.9.1 or higher.

    Vulnerable versions: [2.3.0,3.9.1)
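Since exploitation hinges on attacker-supplied sasl.jaas.config values, a hypothetical defense-in-depth check (not part of the Kafka API; the validator below is an illustration only) might reject client-supplied JAAS configurations that reference JndiLoginModule before they reach the client:

```java
public class SaslConfigValidator {
    // Illustrative hardening check: refuse any externally supplied
    // sasl.jaas.config string that references JndiLoginModule, the
    // login module abused in this vulnerability class.
    public static boolean isSafeJaasConfig(String jaasConfig) {
        if (jaasConfig == null) {
            return true; // nothing to validate
        }
        return !jaasConfig.contains("JndiLoginModule");
    }

    public static void main(String[] args) {
        String benign = "org.apache.kafka.common.security.plain.PlainLoginModule "
                + "required username=\"alice\" password=\"secret\";";
        System.out.println(isSafeJaasConfig(benign));
    }
}
```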
    • Deserialization of Untrusted Data (high severity)

    Affected versions of this package are vulnerable to Deserialization of Untrusted Data via the JndiLoginModule process in the SASL authentication mechanism. An attacker can execute arbitrary code or cause a denial of service by supplying a malicious JNDI URI in the broker's configuration.

    Note:

    This is only exploitable if the attacker can connect to the Kafka cluster and has the AlterConfigs permission on the cluster resource.

    How to fix Deserialization of Untrusted Data?

    Upgrade org.apache.kafka:kafka-clients to version 3.9.1 or higher.

    Vulnerable versions: [2.0.0,3.9.1)
    • Server-side Request Forgery (SSRF) (high severity)

    Affected versions of this package are vulnerable to Server-side Request Forgery (SSRF) due to the improper handling of sasl.oauthbearer.token.endpoint.url and sasl.oauthbearer.jwks.endpoint.url configurations. An attacker can read arbitrary contents of the disk and environment variables or make requests to an unintended location by manipulating these configurations.

    Note: This is only exploitable if configurations can be specified by an untrusted party.

    How to fix Server-side Request Forgery (SSRF)?

    Upgrade org.apache.kafka:kafka-clients to version 3.9.1 or higher.

    Vulnerable versions: [3.1.0,3.9.1)
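Beyond upgrading, the fixed versions reportedly allow these endpoint URLs to be restricted via a JVM allowlist property; the property name below is an assumption based on the upstream fix and should be verified against your client version. A minimal Java sketch:

```java
public class OauthUrlAllowlist {
    public static void main(String[] args) {
        // Assumption: fixed client versions consult this JVM system property
        // to restrict which URLs sasl.oauthbearer.token.endpoint.url and
        // sasl.oauthbearer.jwks.endpoint.url may point at. The endpoint
        // hostnames here are placeholders.
        System.setProperty("org.apache.kafka.sasl.oauthbearer.allowed.urls",
                "https://login.example.com/token,https://login.example.com/jwks");
        System.out.println(
                System.getProperty("org.apache.kafka.sasl.oauthbearer.allowed.urls"));
    }
}
```

The equivalent launch flag would be -Dorg.apache.kafka.sasl.oauthbearer.allowed.urls=… on the JVM command line.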
    • Incorrect Implementation of Authentication Algorithm (high severity)

    Affected versions of this package are vulnerable to Incorrect Implementation of Authentication Algorithm: the SCRAM implementation performs nonce verification that does not comply with RFC 5802. If TLS is not in use for SCRAM exchanges (itself an insecure configuration), an attacker can intercept and replay the authentication messages.

    Configurations with SASL_PLAINTEXT set for listeners are vulnerable.

    How to fix Incorrect Implementation of Authentication Algorithm?

    Upgrade org.apache.kafka:kafka-clients to version 3.7.2, 3.8.1, or higher.

    Vulnerable versions: [0.10.2.0,3.7.2), [3.8.0,3.8.1)
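The insecure configuration called out above is SASL_PLAINTEXT. As a sketch of the safer alternative, here are broker-side settings carrying SCRAM over TLS instead; the property names are standard broker configs, and the port and mechanism choices are illustrative:

```java
import java.util.Properties;

public class ScramListenerConfig {
    // Sketch: SCRAM over TLS. SASL_PLAINTEXT exposes the SCRAM exchange
    // to interception and replay; SASL_SSL wraps the same exchange in TLS.
    public static Properties brokerSaslProps() {
        Properties props = new Properties();
        props.put("listeners", "SASL_SSL://0.0.0.0:9093");
        props.put("security.inter.broker.protocol", "SASL_SSL");
        props.put("sasl.enabled.mechanisms", "SCRAM-SHA-512");
        props.put("sasl.mechanism.inter.broker.protocol", "SCRAM-SHA-512");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(brokerSaslProps().getProperty("listeners"));
    }
}
```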
    • Files or Directories Accessible to External Parties (medium severity)

    Affected versions of this package are vulnerable to Files or Directories Accessible to External Parties through the ConfigProvider interface and its FileConfigProvider, DirectoryConfigProvider, and EnvVarConfigProvider implementations, which allow attackers to read arbitrary contents of the disk and environment variables.

    Note:

    1. Users should upgrade to the fixed version, and it is recommended to set the JVM system property org.apache.kafka.automatic.config.providers=none.

    2. Users of Kafka Connect with one of the above ConfigProvider implementations specified in their worker config are also recommended to add appropriate allowlist.pattern and allowed.paths to restrict their operation to appropriate bounds.

    How to fix Files or Directories Accessible to External Parties?

    Upgrade org.apache.kafka:kafka-clients to version 3.8.0 or higher.

    Vulnerable versions: [2.3.0,3.8.0)
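The advisory's first recommendation can be applied from code or at launch time; a minimal Java sketch setting the JVM system property named in the note above:

```java
public class ConfigProviderLockdown {
    public static void main(String[] args) {
        // Per the advisory note: disable automatic ConfigProvider
        // resolution so FileConfigProvider and friends cannot be
        // invoked implicitly. Apply alongside the upgrade, not instead of it.
        System.setProperty("org.apache.kafka.automatic.config.providers", "none");
        System.out.println(
                System.getProperty("org.apache.kafka.automatic.config.providers"));
    }
}
```

The equivalent launch flag is -Dorg.apache.kafka.automatic.config.providers=none.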
    • Deserialization of Untrusted Data (medium severity)

    Affected versions of this package are vulnerable to Deserialization of Untrusted Data when deserialization gadgets are present on the classpath. The server will connect to the attacker's LDAP server and deserialize the LDAP response, allowing the attacker to execute Java deserialization gadget chains on the Kafka Connect server.

    Note: Exploitation requires access to a Kafka Connect worker, and the ability to create/modify connectors on it with an arbitrary Kafka client SASL JAAS config and a SASL-based security protocol.

    How to fix Deserialization of Untrusted Data?

    Upgrade org.apache.kafka:kafka-clients to version 3.4.0 or higher.

    Vulnerable versions: [2.3.0,3.4.0)
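The 3.4.0 fix is understood to have introduced a JVM property blocklisting dangerous JAAS login modules, with JndiLoginModule disallowed by default; the property name below should be verified against your client version, and extending the list is defense in depth, not a substitute for upgrading. A minimal Java sketch:

```java
public class LoginModuleBlocklist {
    public static void main(String[] args) {
        // Assumption: fixed client versions consult this JVM system property
        // and refuse SASL JAAS configs referencing the listed login modules.
        // JndiLoginModule is believed to be disallowed by default; adding
        // LdapLoginModule here is an illustrative extension.
        System.setProperty("org.apache.kafka.disallowed.login.modules",
                "com.sun.security.auth.module.JndiLoginModule,"
                + "com.sun.security.auth.module.LdapLoginModule");
        System.out.println(
                System.getProperty("org.apache.kafka.disallowed.login.modules"));
    }
}
```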