high severity
- Vulnerable module: org.xerial.snappy:snappy-java
- Introduced through: org.apache.kafka:kafka-clients@2.8.1
Detailed paths
- Introduced through: finn-no/retriable-kafka-consumer@finn-no/retriable-kafka-consumer#78e7e6eb40edf5c35e0eed7ad1c5e32b86b532b9 › org.apache.kafka:kafka-clients@2.8.1 › org.xerial.snappy:snappy-java@1.1.8.1
  Remediation: Upgrade to org.apache.kafka:kafka-clients@3.5.2.
Overview
Affected versions of this package are vulnerable to Allocation of Resources Without Limits or Throttling due to a missing upper-bound check on the chunk length in SnappyInputStream. An attacker can supply data that declares an excessively large chunk size, forcing an oversized allocation during decompression.
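The missing bound can be illustrated with a stdlib-only sketch of a length-prefixed reader that caps chunk sizes before allocating. The class name, cap, and framing below are illustrative only, not snappy-java's actual wire format or fix:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

// Illustrative sketch: validate a declared chunk length against an upper
// bound BEFORE allocating a buffer for it. Without the bound check, a
// hostile 4-byte prefix of 0x7FFFFFFF would trigger a ~2 GiB allocation.
public class BoundedChunkReader {
    static final int MAX_CHUNK = 8 * 1024 * 1024; // illustrative 8 MiB cap

    static byte[] readChunk(DataInputStream in) throws IOException {
        int len = in.readInt();
        if (len < 0 || len > MAX_CHUNK) {
            throw new IOException("chunk length out of bounds: " + len);
        }
        byte[] chunk = new byte[len]; // safe: bounded allocation
        in.readFully(chunk);
        return chunk;
    }

    public static void main(String[] args) throws IOException {
        byte[] hostile = {(byte) 0x7f, (byte) 0xff, (byte) 0xff, (byte) 0xff};
        try {
            readChunk(new DataInputStream(new ByteArrayInputStream(hostile)));
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The fixed snappy-java versions apply the same principle: reject a declared chunk size that exceeds a sane limit before any memory is reserved for it.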
Remediation
Upgrade org.xerial.snappy:snappy-java to version 1.1.10.4 or higher.
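If the kafka-clients upgrade cannot be taken immediately, a build-level dependency constraint can force the patched transitive snappy-java version. This Gradle Kotlin DSL fragment is a hypothetical sketch; the project's actual build files are not shown in this report:

```kotlin
// build.gradle.kts (illustrative) — pin the vulnerable transitive dependency
dependencies {
    constraints {
        implementation("org.xerial.snappy:snappy-java:1.1.10.4") {
            because("missing chunk-length bound check in SnappyInputStream")
        }
    }
}
```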
high severity
- Vulnerable module: org.xerial.snappy:snappy-java
- Introduced through: org.apache.kafka:kafka-clients@2.8.1
Detailed paths
- Introduced through: finn-no/retriable-kafka-consumer@finn-no/retriable-kafka-consumer#78e7e6eb40edf5c35e0eed7ad1c5e32b86b532b9 › org.apache.kafka:kafka-clients@2.8.1 › org.xerial.snappy:snappy-java@1.1.8.1
  Remediation: Upgrade to org.apache.kafka:kafka-clients@3.5.1.
Overview
Affected versions of this package are vulnerable to Denial of Service (DoS) via the hasNextChunk function due to improper validation of the chunkSize variable.
Exploiting this vulnerability is possible by passing a negative number such as 0xFFFFFFFF (which is -1), causing the code to throw a java.lang.NegativeArraySizeException. A worse case happens when passing a huge positive value such as 0x7FFFFFFF, raising a fatal java.lang.OutOfMemoryError.
PoC
package org.example;

import org.xerial.snappy.SnappyInputStream;
import java.io.*;

public class Main {
    public static void main(String[] args) throws IOException {
        byte[] data = {-126, 'S', 'N', 'A', 'P', 'P', 'Y', 0, 0, 0, 0, 0, 0, 0, 0, 0, (byte) 0x7f, (byte) 0xff, (byte) 0xff, (byte) 0xff};
        SnappyInputStream in = new SnappyInputStream(new ByteArrayInputStream(data));
        byte[] out = new byte[50];
        try {
            in.read(out);
        } catch (Exception ignored) {
        }
    }
}
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its intended and legitimate users.
Unlike other vulnerabilities, DoS attacks usually do not aim at breaching security. Rather, they focus on making websites and services unavailable to genuine users, resulting in downtime.
One popular Denial of Service vulnerability is DDoS (Distributed Denial of Service), an attack that attempts to clog the system's network pipes by generating a large volume of traffic from many machines.
When it comes to open source libraries, DoS vulnerabilities allow attackers to trigger such a crash or crippling of the service by exploiting a flaw either in the application code or in the open source libraries it uses.
Two common types of DoS vulnerabilities:
- High CPU/Memory Consumption - An attacker sends crafted requests that cause the system to take a disproportionate amount of time to process. For example, commons-fileupload:commons-fileupload.
- Crash - An attacker sends crafted requests that cause the system to crash. For example, the npm ws package.
Remediation
Upgrade org.xerial.snappy:snappy-java to version 1.1.10.1 or higher.
medium severity
- Vulnerable module: org.xerial.snappy:snappy-java
- Introduced through: org.apache.kafka:kafka-clients@2.8.1
Detailed paths
- Introduced through: finn-no/retriable-kafka-consumer@finn-no/retriable-kafka-consumer#78e7e6eb40edf5c35e0eed7ad1c5e32b86b532b9 › org.apache.kafka:kafka-clients@2.8.1 › org.xerial.snappy:snappy-java@1.1.8.1
  Remediation: Upgrade to org.apache.kafka:kafka-clients@3.5.1.
Overview
Affected versions of this package are vulnerable to Integer Overflow or Wraparound via the shuffle(int[] input) function due to improper validation of the multiplications performed on the input length.
Exploiting this vulnerability is possible by passing negative, zero, very small, or very large length values to the shuffle functions, which are later multiplied by four.
Successful exploitation results in a java.lang.ArrayIndexOutOfBoundsException or java.lang.NegativeArraySizeException, which can crash the program.
PoC
package org.example;

import org.xerial.snappy.BitShuffle;
import java.io.*;

public class Main {
    public static void main(String[] args) throws IOException {
        int[] original = new int[0x40000000];
        byte[] shuffled = BitShuffle.shuffle(original);
        System.out.println(shuffled[0]);
    }
}
The program will crash, showing the following error (or similar):
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: Index 0 out of bounds for length 0
at org.example.Main.main(Main.java:12)
Process finished with exit code 1
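The root cause is visible with a stdlib-only arithmetic check: shuffling an int[] requires length * 4 bytes, and for 0x40000000 elements that product wraps to zero in 32-bit arithmetic. A minimal demonstration (class name is illustrative):

```java
// Demonstrates the 32-bit wraparound behind the BitShuffle.shuffle crash:
// 0x40000000 ints need 0x100000000 bytes, which wraps to 0 in an int.
public class OverflowDemo {
    public static void main(String[] args) {
        int length = 0x40000000;        // 2^30 elements
        int byteLength = length * 4;    // 2^32 wraps around to 0
        System.out.println(byteLength); // prints 0
    }
}
```

A zero-sized destination buffer then makes the first write throw the ArrayIndexOutOfBoundsException shown above.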
Remediation
Upgrade org.xerial.snappy:snappy-java to version 1.1.10.1 or higher.
medium severity
- Vulnerable module: org.xerial.snappy:snappy-java
- Introduced through: org.apache.kafka:kafka-clients@2.8.1
Detailed paths
- Introduced through: finn-no/retriable-kafka-consumer@finn-no/retriable-kafka-consumer#78e7e6eb40edf5c35e0eed7ad1c5e32b86b532b9 › org.apache.kafka:kafka-clients@2.8.1 › org.xerial.snappy:snappy-java@1.1.8.1
  Remediation: Upgrade to org.apache.kafka:kafka-clients@3.5.1.
Overview
Affected versions of this package are vulnerable to Integer Overflow or Wraparound via the compress(char[] input) function in Snappy.java due to improper validation of the array length. Exploiting this vulnerability is possible when the "buf" array sized by the maxCompressedLength function is successfully allocated but is too small to hold the compressed output, causing a fatal Access Violation error.
Note: The issue most likely won't occur when using a byte array, since creating a byte array of size 0x80000000 (or any other negative value) is impossible in the first place.
PoC
package org.example;

import org.xerial.snappy.Snappy;
import java.io.*;

public class Main {
    public static void main(String[] args) throws IOException {
        char[] uncompressed = new char[0x40000000];
        byte[] compressed = Snappy.compress(uncompressed);
    }
}
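A stdlib-only check shows why char[] input overflows where byte[] cannot: each char occupies two bytes, so 0x40000000 chars need 0x80000000 bytes, which wraps to a negative int (class name is illustrative):

```java
// Demonstrates the wraparound behind the compress(char[]) issue:
// 0x40000000 chars occupy 0x80000000 bytes, negative as a 32-bit int.
public class CharLengthOverflowDemo {
    public static void main(String[] args) {
        int charCount = 0x40000000;    // 2^30 chars, a legal array size
        int byteCount = charCount * 2; // 2^31 wraps to Integer.MIN_VALUE
        System.out.println(byteCount); // prints -2147483648
    }
}
```

A byte[] of the same byte count could never be constructed in the first place, which is why the report's note says byte arrays are unaffected.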
Remediation
Upgrade org.xerial.snappy:snappy-java to version 1.1.10.1 or higher.
medium severity
- Vulnerable module: org.apache.kafka:kafka-clients
- Introduced through: org.apache.kafka:kafka-clients@2.8.1
Detailed paths
- Introduced through: finn-no/retriable-kafka-consumer@finn-no/retriable-kafka-consumer#78e7e6eb40edf5c35e0eed7ad1c5e32b86b532b9 › org.apache.kafka:kafka-clients@2.8.1
  Remediation: Upgrade to org.apache.kafka:kafka-clients@3.4.0.
Overview
org.apache.kafka:kafka-clients is a streaming platform that can publish and subscribe to streams of records, store streams of records in a fault-tolerant durable way, and process streams of records as they occur.
Affected versions of this package are vulnerable to Deserialization of Untrusted Data when there are gadgets in the classpath. The server will connect to the attacker's LDAP server and deserialize the LDAP response, which the attacker can use to execute Java deserialization gadget chains on the Kafka Connect server.
Note: Exploitation requires access to a Kafka Connect worker, and the ability to create/modify connectors on it with an arbitrary Kafka client SASL JAAS config and a SASL-based security protocol.
Mitigation
Kafka Connect users are advised to validate connector configurations and only allow trusted JNDI configurations.
Users should examine connector dependencies for vulnerable versions and remediate by upgrading their connectors, upgrading the specific dependency, or removing the connectors.
Kafka Connect users can also implement their own connector client config override policy, which can be used to control which Kafka client properties can be overridden directly in a connector config and which cannot.
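The override-policy idea in the last point can be sketched as a standalone denylist check. Real policies implement org.apache.kafka.connect.connector.policy.ConnectorClientConfigOverridePolicy; the class, method, and denied keys below are illustrative only:

```java
import java.util.Map;
import java.util.Set;

// Illustrative sketch of an override policy's core decision: reject any
// connector config that tries to override sensitive client properties
// such as the SASL JAAS config used in this exploit.
public class OverridePolicySketch {
    private static final Set<String> DENIED = Set.of(
            "sasl.jaas.config", "sasl.login.callback.handler.class");

    static boolean isAllowed(Map<String, String> clientOverrides) {
        return clientOverrides.keySet().stream().noneMatch(DENIED::contains);
    }

    public static void main(String[] args) {
        // An attacker-style override of the JAAS config is rejected.
        System.out.println(isAllowed(Map.of("sasl.jaas.config", "...")));
    }
}
```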
Details
Serialization is the process of converting an object into a sequence of bytes that can be persisted to a disk or database or sent through streams. The reverse process of creating an object from a sequence of bytes is called deserialization. Serialization is commonly used for communication (sharing objects between multiple hosts) and persistence (storing the object state in a file or a database). It is an integral part of popular protocols like Remote Method Invocation (RMI), Java Management Extensions (JMX), Java Message Service (JMS), Action Message Format (AMF), JavaServer Faces (JSF) ViewState, etc.
Deserialization of untrusted data (CWE-502) occurs when an application deserializes untrusted data without sufficiently verifying that the resulting data will be valid, allowing the attacker to control the state or flow of the execution.
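As a general hardening measure against CWE-502, Java 9+ deserialization filters (JEP 290) restrict which classes a stream may instantiate. A minimal sketch; the allowlist pattern here is illustrative, not a recommendation specific to Kafka:

```java
import java.io.*;

// Illustrative JEP 290 filter: allow only java.lang classes, reject
// everything else ("!*"), which blocks typical gadget-chain classes.
public class FilteredDeserialization {
    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(new java.util.Date()); // stand-in for a gadget
        }
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(buf.toByteArray()));
        in.setObjectInputFilter(
                ObjectInputFilter.Config.createFilter("java.lang.*;!*"));
        try {
            in.readObject();
        } catch (InvalidClassException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```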
Remediation
Upgrade org.apache.kafka:kafka-clients to version 3.4.0 or higher.
low severity
- Vulnerable module: junit:junit
- Introduced through: org.apache.logging.log4j:log4j-api@2.16.0
Detailed paths
- Introduced through: finn-no/retriable-kafka-consumer@finn-no/retriable-kafka-consumer#78e7e6eb40edf5c35e0eed7ad1c5e32b86b532b9 › org.apache.logging.log4j:log4j-api@2.16.0 › org.junit.jupiter:junit-jupiter-migrationsupport@5.7.2 › junit:junit@4.13
  Remediation: Upgrade to org.apache.logging.log4j:log4j-api@2.16.0.
- Introduced through: finn-no/retriable-kafka-consumer@finn-no/retriable-kafka-consumer#78e7e6eb40edf5c35e0eed7ad1c5e32b86b532b9 › org.apache.logging.log4j:log4j-api@2.16.0 › org.junit.vintage:junit-vintage-engine@5.7.2 › junit:junit@4.13
  Remediation: Upgrade to org.apache.logging.log4j:log4j-api@2.16.0.
Overview
junit:junit is a unit testing framework for Java.
Affected versions of this package are vulnerable to Information Exposure. The JUnit4 test rule TemporaryFolder contains a local information disclosure vulnerability. On Unix-like systems, the system's temporary directory is shared between all users on that system, so files and directories written into it are, by default, readable by other users on the same system.
Note: This vulnerability does not allow other users to overwrite the contents of these directories or files. It only affects Unix-like systems.
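The fix shipped in JUnit 4.13.1 creates the temporary directory with owner-only permissions. The same mitigation can be sketched with the NIO API, assuming a POSIX file system (Linux/macOS); the class name is illustrative:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermissions;

// Illustrative mitigation: create a temp directory readable only by its
// owner (rwx------), so other local users cannot inspect its contents.
public class SafeTempDir {
    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("junit-safe-",
                PosixFilePermissions.asFileAttribute(
                        PosixFilePermissions.fromString("rwx------")));
        System.out.println(Files.getPosixFilePermissions(dir));
        Files.delete(dir); // clean up the demo directory
    }
}
```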
Remediation
Upgrade junit:junit to version 4.13.1 or higher.