Filtered by vendor Apache
Filtered by product Commons Compress
Total: 11 CVEs
CVE | Vendors | Products | Updated | CVSS v3.1 |
---|---|---|---|---|
CVE-2024-26308 | 2 Apache, Redhat | 8 Commons Compress, Camel Quarkus, Jboss Data Grid and 5 more | 2024-11-21 | 5.5 Medium |
Allocation of Resources Without Limits or Throttling vulnerability in Apache Commons Compress. This issue affects Apache Commons Compress: from 1.21 before 1.26. Users are recommended to upgrade to version 1.26, which fixes the issue.
CVE-2024-25710 | 2 Apache, Redhat | 9 Commons Compress, Amq Streams, Camel Quarkus and 6 more | 2024-11-21 | 8.1 High |
Loop with Unreachable Exit Condition ('Infinite Loop') vulnerability in Apache Commons Compress. This issue affects Apache Commons Compress: from 1.3 through 1.25.0. Users are recommended to upgrade to version 1.26.0, which fixes the issue.
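Both advisories above are resolved by moving to Commons Compress 1.26.x. As a quick sanity check (an illustrative sketch, not part of the advisories), the snippet below reads the Implementation-Version from the commons-compress jar manifest to confirm which release actually ends up on the runtime classpath; shaded or repackaged jars may not carry that metadata.

```java
// Illustrative check of the Commons Compress release on the classpath, assuming the
// official jar's manifest carries Implementation-Version (shaded jars may not).
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;

public class CompressVersionCheck {
    public static void main(String[] args) {
        Package pkg = TarArchiveInputStream.class.getPackage();
        String version = (pkg == null) ? null : pkg.getImplementationVersion();
        // Treat null as "unknown", not "safe".
        System.out.println("commons-compress version: " + version);
    }
}
```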
CVE-2023-42503 | 1 Apache | 1 Commons Compress | 2024-11-21 | 5.5 Medium |
Improper Input Validation, Uncontrolled Resource Consumption vulnerability in Apache Commons Compress in TAR parsing. This issue affects Apache Commons Compress: from 1.22 before 1.24.0. Users are recommended to upgrade to version 1.24.0, which fixes the issue.

A third party can create a malformed TAR file by manipulating file modification time headers which, when parsed with Apache Commons Compress, will cause a denial of service via CPU consumption. In version 1.22 of Apache Commons Compress, support was added for file modification times with higher precision (issue COMPRESS-612 [1]). The format for the PAX extended headers carrying this data consists of two numbers separated by a period [2], indicating seconds and subsecond precision (for example “1647221103.5998539”). The impacted fields are “atime”, “ctime”, “mtime” and “LIBARCHIVE.creationtime”. No input validation is performed prior to the parsing of header values. Parsing of these numbers uses the BigDecimal [3] class from the JDK, which has a publicly known algorithmic complexity issue when doing operations on large numbers, causing denial of service (see issue JDK-6560193 [4]). A third party can manipulate file time headers in a TAR file by placing a number with a very long fraction (300,000 digits) or a number with exponent notation (such as “9e9999999”) within a file modification time header, and the parsing of files with these headers will take hours instead of seconds, leading to a denial of service via exhaustion of CPU resources. This issue is similar to CVE-2012-2098 [5].

Only applications using the CompressorStreamFactory class (with auto-detection of file types), TarArchiveInputStream or TarFile classes to parse TAR files are impacted. Since this code was introduced in v1.22, only that version and later versions are impacted.

[1]: https://issues.apache.org/jira/browse/COMPRESS-612
[2]: https://pubs.opengroup.org/onlinepubs/9699919799/utilities/pax.html#tag_20_92_13_05
[3]: https://docs.oracle.com/javase/8/docs/api/java/math/BigDecimal.html
[4]: https://bugs.openjdk.org/browse/JDK-6560193
[5]: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2012-2098
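For context, a minimal sketch of the parsing path this advisory describes: reading entries with TarArchiveInputStream and touching their modification times. The class and file names here are illustrative; on an affected release (1.22 through 1.23.x), a crafted PAX time header can make this loop consume CPU for hours.

```java
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;

public class TarTimesExample {
    public static void main(String[] args) throws IOException {
        Path archive = Path.of(args[0]); // an untrusted TAR file in the CVE scenario
        try (InputStream in = new BufferedInputStream(Files.newInputStream(archive));
             TarArchiveInputStream tar = new TarArchiveInputStream(in)) {
            TarArchiveEntry entry;
            // PAX headers (atime/ctime/mtime/LIBARCHIVE.creationtime) are parsed while
            // advancing to the next entry; that is where the BigDecimal cost is paid.
            while ((entry = tar.getNextTarEntry()) != null) {
                System.out.println(entry.getName() + " mtime=" + entry.getLastModifiedDate());
            }
        }
    }
}
```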
CVE-2021-36090 | 4 Apache, Netapp, Oracle and 1 more | 36 Commons Compress, Active Iq Unified Manager, Oncommand Insight and 33 more | 2024-11-21 | 7.5 High |
When reading a specially crafted ZIP archive, Compress can be made to allocate large amounts of memory that finally lead to an out-of-memory error even for very small inputs. This could be used to mount a denial of service attack against services that use Compress' zip package.
CVE-2021-35517 | 4 Apache, Netapp, Oracle and 1 more | 29 Commons Compress, Active Iq Unified Manager, Oncommand Insight and 26 more | 2024-11-21 | 7.5 High |
When reading a specially crafted TAR archive, Compress can be made to allocate large amounts of memory that finally lead to an out-of-memory error even for very small inputs. This could be used to mount a denial of service attack against services that use Compress' tar package.
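The two entries above share the same shape: a small crafted archive makes Compress do outsized work, and upgrading is the actual fix. As general hygiene, a consumer can also cap how much it is willing to expand from untrusted input. The sketch below is an illustrative pattern, not library guidance: the class name and the MAX_TOTAL_BYTES budget are made up, and the streaming factory shown here covers formats such as ZIP and TAR (7Z needs random access and is read through SevenZFile, sketched further below).

```java
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

import org.apache.commons.compress.archivers.ArchiveEntry;
import org.apache.commons.compress.archivers.ArchiveException;
import org.apache.commons.compress.archivers.ArchiveInputStream;
import org.apache.commons.compress.archivers.ArchiveStreamFactory;

public class CappedExtraction {
    // Hypothetical budget for total decompressed bytes; tune for your workload.
    private static final long MAX_TOTAL_BYTES = 64L * 1024 * 1024;

    public static void main(String[] args) throws IOException, ArchiveException {
        long total = 0;
        byte[] buffer = new byte[8192];
        // Auto-detects the archive format from a mark-supporting stream.
        try (InputStream in = new BufferedInputStream(Files.newInputStream(Path.of(args[0])));
             ArchiveInputStream archive = new ArchiveStreamFactory().createArchiveInputStream(in)) {
            ArchiveEntry entry;
            while ((entry = archive.getNextEntry()) != null) {
                int n;
                while ((n = archive.read(buffer)) != -1) {
                    total += n;
                    if (total > MAX_TOTAL_BYTES) {
                        throw new IOException("entry " + entry.getName()
                                + " pushes the archive past the configured budget");
                    }
                    // hand buffer[0..n) to downstream processing here
                }
            }
        }
    }
}
```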
CVE-2021-35516 | 4 Apache, Netapp, Oracle and 1 more | 26 Commons Compress, Active Iq Unified Manager, Oncommand Insight and 23 more | 2024-11-21 | 7.5 High |
When reading a specially crafted 7Z archive, Compress can be made to allocate large amounts of memory that finally lead to an out-of-memory error even for very small inputs. This could be used to mount a denial of service attack against services that use Compress' sevenz package.
CVE-2021-35515 | 4 Apache, Netapp, Oracle and 1 more | 28 Commons Compress, Active Iq Unified Manager, Oncommand Insight and 25 more | 2024-11-21 | 7.5 High |
When reading a specially crafted 7Z archive, the construction of the list of codecs that decompress an entry can result in an infinite loop. This could be used to mount a denial of service attack against services that use Compress' sevenz package. | ||||
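Both 7Z advisories concern Compress' sevenz package. Below is a minimal, illustrative sketch of that read path, listing entries with SevenZFile (class and file names are assumptions); with a crafted archive and an affected version, opening or iterating the archive is where the oversized allocations or the codec-list loop would occur.

```java
import java.io.File;
import java.io.IOException;

import org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry;
import org.apache.commons.compress.archivers.sevenz.SevenZFile;

public class SevenZListExample {
    public static void main(String[] args) throws IOException {
        // 7Z requires random access, so the archive is opened from a File rather than a stream.
        try (SevenZFile sevenZ = new SevenZFile(new File(args[0]))) {
            SevenZArchiveEntry entry;
            while ((entry = sevenZ.getNextEntry()) != null) {
                System.out.println(entry.getName() + " (" + entry.getSize() + " bytes)");
            }
        }
    }
}
```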
CVE-2019-12402 | 4 Apache, Fedoraproject, Oracle and 1 more | 20 Commons Compress, Fedora, Banking Payments and 17 more | 2024-11-21 | 7.5 High |
The file name encoding algorithm used internally in Apache Commons Compress 1.15 to 1.18 can get into an infinite loop when faced with specially crafted inputs. This can lead to a denial of service attack if an attacker can choose the file names inside of an archive created by Compress. | ||||
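Unlike the entries above, this one is triggered on the writing side, when the names of entries inside an archive created by Compress come from an attacker. A minimal, illustrative sketch of that pattern follows; the helper, class name, and sample names are made up.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream;

public class UntrustedNamesExample {
    // Hypothetical helper: builds a ZIP whose entry names come from callers
    // (for example, user uploads). In Compress 1.15 to 1.18 the internal
    // file-name encoding step could loop forever on crafted names.
    static void zipWithUserName(Path target, String entryName, byte[] data) throws IOException {
        try (OutputStream out = Files.newOutputStream(target);
             ZipArchiveOutputStream zip = new ZipArchiveOutputStream(out)) {
            zip.putArchiveEntry(new ZipArchiveEntry(entryName));
            zip.write(data);
            zip.closeArchiveEntry();
        }
    }

    public static void main(String[] args) throws IOException {
        zipWithUserName(Path.of("out.zip"), "report.txt",
                "hello".getBytes(StandardCharsets.UTF_8));
    }
}
```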
CVE-2018-1324 | 2 Apache, Oracle | 3 Commons Compress, Mysql Cluster, Weblogic Server | 2024-11-21 | 5.5 Medium |
A specially crafted ZIP archive can be used to cause an infinite loop inside of Apache Commons Compress' extra field parser used by the ZipFile and ZipArchiveInputStream classes in versions 1.11 to 1.15. This can be used to mount a denial of service attack against services that use Compress' zip package. | ||||
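The extra-field parser sits behind both the ZipFile and ZipArchiveInputStream classes. A small illustrative sketch of the ZipFile side, where each entry's extra fields have been run through that parser (class and file names are assumptions):

```java
import java.io.File;
import java.io.IOException;
import java.util.Enumeration;

import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipExtraField;
import org.apache.commons.compress.archivers.zip.ZipFile;

public class ExtraFieldsExample {
    public static void main(String[] args) throws IOException {
        // Enumerate entries and their parsed extra fields. In Compress 1.11 to 1.15,
        // a crafted extra field could send the parser into an infinite loop.
        try (ZipFile zip = new ZipFile(new File(args[0]))) {
            Enumeration<ZipArchiveEntry> entries = zip.getEntries();
            while (entries.hasMoreElements()) {
                ZipArchiveEntry entry = entries.nextElement();
                for (ZipExtraField field : entry.getExtraFields()) {
                    System.out.println(entry.getName() + ": extra field " + field.getHeaderId());
                }
            }
        }
    }
}
```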
CVE-2018-11771 | 3 Apache, Oracle, Redhat | 3 Commons Compress, Weblogic Server, Jboss Fuse | 2024-11-21 | 5.5 Medium |
When reading a specially crafted ZIP archive, the read method of Apache Commons Compress 1.7 to 1.17's ZipArchiveInputStream can fail to return the correct EOF indication after the end of the stream has been reached. When combined with a java.io.InputStreamReader this can lead to an infinite stream, which can be used to mount a denial of service attack against services that use Compress' zip package. | ||||
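The description spells out the vulnerable combination: ZipArchiveInputStream's read method failing to signal EOF, wrapped in a java.io.InputStreamReader. A minimal, illustrative sketch of that combination follows (class and file names are assumptions); on an affected release (1.7 to 1.17) with a crafted archive, the readLine loop may never see end-of-stream.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

import org.apache.commons.compress.archivers.zip.ZipArchiveInputStream;

public class ZipReaderExample {
    public static void main(String[] args) throws IOException {
        try (InputStream in = Files.newInputStream(Path.of(args[0]));
             ZipArchiveInputStream zip = new ZipArchiveInputStream(in)) {
            if (zip.getNextZipEntry() != null) {
                // Wrapping the archive stream in an InputStreamReader is the pattern
                // the advisory describes; a missing EOF makes this an infinite stream.
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(zip, StandardCharsets.UTF_8));
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}
```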
CVE-2012-2098 | 1 Apache | 1 Commons Compress | 2024-11-21 | N/A |
Algorithmic complexity vulnerability in the sorting algorithms in the bzip2 compressing stream (BZip2CompressorOutputStream) in Apache Commons Compress before 1.4.1 allows remote attackers to cause a denial of service (CPU consumption) via a file with many repeating inputs.
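This last one is on the compression side: the block-sorting step inside BZip2CompressorOutputStream, the class named in the description. A minimal, illustrative sketch of compressing caller-supplied bytes with that class (file names are assumptions); before 1.4.1, highly repetitive input could make the sort consume excessive CPU.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

import org.apache.commons.compress.compressors.bzip2.BZip2CompressorOutputStream;

public class Bzip2CompressExample {
    public static void main(String[] args) throws IOException {
        // Compress the (potentially attacker-supplied) input file to <input>.bz2.
        byte[] untrustedInput = Files.readAllBytes(Path.of(args[0]));
        try (OutputStream fileOut = Files.newOutputStream(Path.of(args[0] + ".bz2"));
             BZip2CompressorOutputStream bzOut = new BZip2CompressorOutputStream(fileOut)) {
            bzOut.write(untrustedInput);
        }
    }
}
```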