able@0.4.3

Vulnerabilities: 30 via 101 paths

Dependencies: 252

Source: npm


Severity
  • 13 high
  • 15 medium
  • 2 low
Status
  • 30 open
  • 0 patched
  • 0 ignored

high severity

Denial of Service (DoS)

  • Vulnerable module: ammo
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0 ammo@1.0.1
  • Introduced through: able@0.4.3 hapi@8.4.0 inert@2.1.6 ammo@1.0.1

Overview

ammo is an HTTP Range processing utility. Note: This package is deprecated and is now maintained as @hapi/ammo.

Affected versions of this package are vulnerable to Denial of Service (DoS). The Range HTTP header parser has a vulnerability that causes the function to throw a system error if the header is set to an invalid value. Because hapi does not expect the function to ever throw, the error propagates all the way up the stack. If no uncaught-exception handler is registered, the application will exit, allowing an attacker to shut down services.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against this 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the regex engine works very slowly (the number of steps grows exponentially with input size, as shown above), so an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.
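
One straightforward remediation for this class of problem is to rewrite the pattern so the engine has only one way to consume each character. For the toy pattern above, /A[BC]+D/ accepts exactly the same strings as /A(B|C+)+D/ but cannot backtrack ambiguously, so a mismatch is detected in linear time. A minimal sketch, specific to this example pattern and shown for illustration only:

// The nested-quantifier pattern backtracks catastrophically on "ACCC...CX";
// the character-class rewrite matches the same language in linear time.
const slow = /A(B|C+)+D/;
const fast = /A[BC]+D/;

const input = 'A' + 'C'.repeat(28) + 'X';

console.time('fast');
console.log(fast.test(input)); // false, returns almost immediately
console.timeEnd('fast');

// Uncomment to observe the catastrophic case (takes seconds):
// console.time('slow');
// console.log(slow.test(input));
// console.timeEnd('slow');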

Remediation

There is no fixed version for ammo.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: content
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 content@1.0.2
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 content@1.0.2
    Remediation: Upgrade to hapi@11.0.4.

Overview

content is an HTTP Content-* headers parsing library.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). An attacker may pass a specially crafted Content-Type or Content-Disposition header, causing the server to hang. Matching can take roughly 10 seconds for an input of 180 characters.

Disclosure Timeline

  • Feb 5th, 2018 - Initial Disclosure to package owner
  • Feb 5th, 2018 - Initial Response from package owner
  • Feb 28th, 2018 - Fix issued
  • Mar 5th, 2018 - Vulnerability published

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against this 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the regex engine works very slowly (the number of steps grows exponentially with input size, as shown above), so an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade content to versions 3.0.7, 4.0.4 or higher

References

high severity

Prototype Pollution

  • Vulnerable module: convict
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1
    Remediation: Upgrade to convict@6.0.1.

Overview

convict is a package that expands on the standard pattern of configuring node.js applications in a way that is more robust and accessible to collaborators, who may have less interest in digging through imperative code in order to inspect or modify settings. By introducing a configuration schema, convict gives project collaborators more context on each setting and enables validation and early failures for when configuration goes wrong.

Affected versions of this package are vulnerable to Prototype Pollution via the convict() function, as parentKey is not validated.

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the Object.prototype are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows the following high-level model:

merge (target, source)
  foreach property of source
    if property exists and is an object on both the target and the source
      merge(target[property], source[property])
    else
      target[property] = source[property]

When the source object contains a property named __proto__ (for example defined with Object.defineProperty()), the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source being the nested object supplied by the attacker. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.
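
As a concrete illustration of the model above, the sketch below uses a deliberately naive deep-merge helper (mergeDeep is hypothetical, not the API of any of the libraries named here) to show how an attacker-supplied __proto__ key ends up on Object.prototype:

// Naive recursive merge, written only to illustrate the vulnerable pattern above.
function mergeDeep(target, source) {
  for (const key of Object.keys(source)) {
    if (typeof target[key] === 'object' && target[key] !== null &&
        typeof source[key] === 'object' && source[key] !== null) {
      mergeDeep(target[key], source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

// JSON.parse creates "__proto__" as a plain own property of the payload.
const payload = JSON.parse('{"__proto__": {"polluted": true}}');
mergeDeep({}, payload);

console.log({}.polluted); // true - every object now inherits the injected property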

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of “path”, they can set this value to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.
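
A sketch of that shape is below; setByPath is a hypothetical helper used only to illustrate the path-based vector:

// Hypothetical path-based setter with the shape theFunction(object, path, value).
function setByPath(obj, path, value) {
  const keys = path.split('.');
  let cur = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    if (typeof cur[keys[i]] !== 'object' || cur[keys[i]] === null) {
      cur[keys[i]] = {};
    }
    cur = cur[keys[i]];
  }
  cur[keys[keys.length - 1]] = value;
}

// An attacker-controlled path walks onto the prototype.
setByPath({}, '__proto__.myValue', 'polluted');
console.log({}.myValue); // 'polluted'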

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (Origin: Client): This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, then any code path that relies on someobject.toString() will fail.
  • Remote Code Execution (Origin: Client): Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code.
  • Property Injection (Origin: Client): The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true, they can then achieve admin privileges.

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype: use Object.freeze(Object.prototype).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice, use Map instead of Object.
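
A few of these measures can be applied directly in application code; a minimal sketch of items 1, 4 and 5:

// 1. Freeze the prototype so pollution attempts silently fail (or throw in strict mode).
Object.freeze(Object.prototype);

// 4. Use prototype-less objects for attacker-influenced data.
const record = Object.create(null);
record.isAdmin = false;

// 5. Prefer Map, whose keys never shadow Object.prototype properties.
const store = new Map();
store.set('__proto__', 'just an ordinary key');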

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018

Remediation

Upgrade convict to version 6.0.1 or higher.

References

high severity

Denial of Service (DoS)

  • Vulnerable module: hapi
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0

Overview

hapi is an HTTP server framework.

Affected versions of this package are vulnerable to Denial of Service (DoS). The CORS request handler has a vulnerability that causes the function to throw a system error if the header contains certain invalid values. If no uncaught-exception handler is registered, the application will exit, allowing an attacker to shut down services.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against this 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the regex engine works very slowly (the number of steps grows exponentially with input size, as shown above), so an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

There is no fixed version for hapi.

References

high severity

Denial of Service (DoS)

  • Vulnerable module: hapi
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0
    Remediation: Upgrade to hapi@11.1.3.

Overview

Sending a purposefully crafted invalid date in the If-Modified-Since or Last-Modified header will cause the Hapi server to err but keep the socket open (the socket will time out after 2 minutes by default). This allows an attacker to quickly exhaust the sockets on the server, making it unavailable (a Denial of Service attack).

The vulnerability is caused by the combination of two bugs. First, the underlying V8 engine throws an exception when processing the specially crafted date, instead of stating the date is invalid as it should. Second, the Hapi server does not handle the exception well, leading to the socket remaining open.

Upgrading Hapi will address the second issue and thus fix the vulnerability.

References

high severity

Prototype Pollution

  • Vulnerable module: merge
  • Introduced through: mozlog@2.0.0

Detailed paths

  • Introduced through: able@0.4.3 mozlog@2.0.0 merge@1.2.1
    Remediation: Upgrade to mozlog@3.0.2.

Overview

merge is a library that allows you to merge multiple objects into one, optionally creating a new cloned object. It is similar to jQuery.extend but more flexible, and works in Node.js and the browser.

Affected versions of this package are vulnerable to Prototype Pollution. The merge function already checks for __proto__ keys in an object to prevent prototype pollution, but does not check for constructor or prototype keys.

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the Object.prototype are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows the following high-level model:

merge (target, source)
  foreach property of source
    if property exists and is an object on both the target and the source
      merge(target[property], source[property])
    else
      target[property] = source[property]

When the source object contains a property named __proto__ (for example defined with Object.defineProperty()), the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source being the nested object supplied by the attacker. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of “path”, they can set this value to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (Origin: Client): This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, then any code path that relies on someobject.toString() will fail.
  • Remote Code Execution (Origin: Client): Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code.
  • Property Injection (Origin: Client): The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true, they can then achieve admin privileges.

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype: use Object.freeze(Object.prototype).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice, use Map instead of Object.

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018

Remediation

Upgrade merge to version 2.1.0 or higher.

References

high severity

Prototype Pollution

  • Vulnerable module: merge
  • Introduced through: mozlog@2.0.0

Detailed paths

  • Introduced through: able@0.4.3 mozlog@2.0.0 merge@1.2.1
    Remediation: Upgrade to mozlog@3.0.2.

Overview

merge is a library that allows you to merge multiple objects into one, optionally creating a new cloned object. It is similar to jQuery.extend but more flexible, and works in Node.js and the browser.

Affected versions of this package are vulnerable to Prototype Pollution via the _recursiveMerge function.

PoC:

const merge = require('merge');

// JSON.parse creates a plain own "__proto__" property on the nested object.
const payload2 = JSON.parse('{"x": {"__proto__":{"polluted":"yes"}}}');

let obj1 = { x: { y: 1 } };

console.log("Before : " + obj1.polluted);   // undefined
merge.recursive(obj1, payload2);            // recursive merge walks into "__proto__"
console.log("After : " + obj1.polluted);    // inherited from the polluted prototype
console.log("After : " + {}.polluted);      // every object is now affected

Output:

Before : undefined
After : yes
After : yes

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the Object.prototype are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows the following high-level model:

merge (target, source)
  foreach property of source
    if property exists and is an object on both the target and the source
      merge(target[property], source[property])
    else
      target[property] = source[property]

When the source object contains a property named __proto__ (for example defined with Object.defineProperty()), the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source being the nested object supplied by the attacker. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of “path”, they can set this value to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (Origin: Client): This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, then any code path that relies on someobject.toString() will fail.
  • Remote Code Execution (Origin: Client): Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code.
  • Property Injection (Origin: Client): The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true, they can then achieve admin privileges.

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype: use Object.freeze(Object.prototype).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice, use Map instead of Object.

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018

Remediation

Upgrade merge to version 2.1.1 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: minimatch
  • Introduced through: glob@5.0.5 and browserify@9.0.8

Detailed paths

  • Introduced through: able@0.4.3 glob@5.0.5 minimatch@2.0.10
    Remediation: Upgrade to glob@5.0.15.
  • Introduced through: able@0.4.3 browserify@9.0.8 glob@4.5.3 minimatch@2.0.10
    Remediation: Upgrade to browserify@12.0.0.

Overview

minimatch is a minimal matching utility.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via complicated and illegal regexes.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against this 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the regex engine works very slowly (the number of steps grows exponentially with input size, as shown above), so an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade minimatch to version 3.0.2 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: minimatch
  • Introduced through: glob@5.0.5 and browserify@9.0.8

Detailed paths

  • Introduced through: able@0.4.3 glob@5.0.5 minimatch@2.0.10
    Remediation: Upgrade to glob@5.0.15.
  • Introduced through: able@0.4.3 browserify@9.0.8 glob@4.5.3 minimatch@2.0.10
    Remediation: Upgrade to browserify@12.0.0.

Overview

minimatch is a minimal matching utility.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against this 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the regex engine works very slowly (the number of steps grows exponentially with input size, as shown above), so an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade minimatch to version 3.0.2 or higher.

References

high severity

Prototype Override Protection Bypass

  • Vulnerable module: qs
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0 qs@2.4.2
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 qs@4.0.0
    Remediation: Upgrade to hapi@11.0.4.

Overview

qs is a querystring parser that supports nesting and arrays, with a depth limit.

Affected versions of this package are vulnerable to Prototype Override Protection Bypass. By default qs protects against attacks that attempt to overwrite an object's existing prototype properties, such as toString() and hasOwnProperty().

From qs documentation:

By default parameters that would overwrite properties on the object prototype are ignored, if you wish to keep the data from those fields either use plainObjects as mentioned above, or set allowPrototypes to true which will allow user input to overwrite those properties. WARNING It is generally a bad idea to enable this option as it can cause problems when attempting to use the properties that have been overwritten. Always be careful with this option.

Overwriting these properties can impact application logic, potentially allowing attackers to work around security controls, modify data, make the application unstable and more.

In versions of the package affected by this vulnerability, it is possible to circumvent this protection and overwrite prototype properties and functions by prefixing the name of the parameter with [ or ]. For example, qs.parse("]=toString") will return { toString: true }; as a result, calling toString() on the object will throw an exception.

Example:

qs.parse('toString=foo', { allowPrototypes: false })
// {}

qs.parse("]=toString", { allowPrototypes: false })
// { toString: true } <== prototype overwritten


Disclosure Timeline

  • February 13th, 2017 - Reported the issue to package owner.
  • February 13th, 2017 - Issue acknowledged by package owner.
  • February 16th, 2017 - Partial fix released in versions 6.0.3, 6.1.1, 6.2.2, 6.3.1.
  • March 6th, 2017 - Final fix released in versions 6.4.0, 6.3.2, 6.2.3, 6.1.2 and 6.0.4

Remediation

Upgrade qs to version 6.0.4, 6.1.2, 6.2.3, 6.3.2 or higher.

References

  • GitHub Commit
  • GitHub Issue

high severity

Command Injection

  • Vulnerable module: shell-quote
  • Introduced through: browserify@9.0.8

Detailed paths

  • Introduced through: able@0.4.3 browserify@9.0.8 shell-quote@0.0.1
    Remediation: Upgrade to browserify@12.0.0.

Overview

shell-quote is a package used to quote and parse shell commands.

Affected versions of this package are vulnerable to Command Injection. The quote function does not properly escape the special characters <, >, ;, { and }, and as a result can be used by an attacker to inject malicious shell commands or leak sensitive information.

Proof of Concept

Consider the following poc.js application:

var quote = require('shell-quote').quote;
var exec = require('child_process').exec;

var userInput = process.argv[2];

var safeCommand = quote(['echo', userInput]);

exec(safeCommand, function (err, stdout, stderr) {
  console.log(stdout);
});

Running the following command will not only print the character a as expected, but will also run another command, i.e. touch malicious.sh:

$ node poc.js 'a;{touch,malicious.sh}'
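
Independent of upgrading, a common way to sidestep shell quoting entirely is to pass arguments as an array to child_process.execFile, which does not spawn a shell; a minimal sketch:

var execFile = require('child_process').execFile;

var userInput = process.argv[2];

// execFile passes arguments directly to the program, so shell metacharacters
// in userInput are never interpreted by a shell.
execFile('echo', [userInput], function (err, stdout, stderr) {
  if (err) throw err;
  console.log(stdout);
});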

Remediation

Upgrade shell-quote to version 1.6.1 or higher.

References

high severity

Denial of Service (DoS)

  • Vulnerable module: subtext
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1

Overview

subtext is an HTTP payload parsing library. Note: This package is deprecated and is now maintained as @hapi/subtext.

Affected versions of this package are vulnerable to Denial of Service (DoS). The package fails to enforce the maxBytes configuration for payloads with chunked encoding that are written to the file system. This allows attackers to send requests with arbitrary payload sizes, which may exhaust system resources.
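
For context, maxBytes is the per-route payload limit that hapi hands down to subtext. A minimal sketch of where it is typically declared (hapi 8.x style configuration; route path and values are illustrative only):

var Hapi = require('hapi');

var server = new Hapi.Server();
server.connection({ port: 8000 });

server.route({
  method: 'POST',
  path: '/upload',
  config: {
    payload: {
      maxBytes: 1048576,   // 1 MB limit that affected subtext versions fail to enforce
      output: 'file',      // chunked payloads written to disk - the case this advisory covers
      parse: true
    }
  },
  handler: function (request, reply) {
    reply('ok');
  }
});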

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its intended and legitimate users.

Unlike other vulnerabilities, DoS attacks usually do not aim at breaching security. Rather, they are focused on making websites and services unavailable to genuine users resulting in downtime.

One popular Denial of Service vulnerability is DDoS (a Distributed Denial of Service), an attack that attempts to clog network pipes to the system by generating a large volume of traffic from many machines.

When it comes to open source libraries, DoS vulnerabilities allow attackers to trigger such a crash or crippling of the service by using a flaw either in the application code or from the use of open source libraries.

Two common types of DoS vulnerabilities:

  • High CPU/Memory Consumption - An attacker sending crafted requests that could cause the system to take a disproportionate amount of time to process. For example, commons-fileupload:commons-fileupload.

  • Crash - An attacker sending crafted requests that could cause the system to crash. For example, the npm ws package.

Remediation

There is no fixed version for subtext.

References

high severity

Prototype Pollution

  • Vulnerable module: subtext
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1

Overview

subtext is an HTTP payload parsing library. Note: This package is deprecated and is now maintained as @hapi/subtext.

Affected versions of this package are vulnerable to Prototype Pollution. A multipart payload can be constructed in a way that one of the parts’ content can be set as the entire payload object’s prototype. If this prototype contains data, it may bypass other validation rules which enforce access and privacy. If this prototype evaluates to null, it can cause unhandled exceptions when the request payload is accessed.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against this 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the regex engine works very slowly (the number of steps grows exponentially with input size, as shown above), so an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

There is no fixed version for subtext.

References

medium severity

Improper input validation

  • Vulnerable module: call
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0 call@2.0.2
    Remediation: Upgrade to hapi@11.0.4.

Overview

call is the primary HTTP router of the hapi framework.

The vulnerability arises from undefined values inside a path (the last segment being an exception) making their way into components that do not account for values being undefined (e.g. the database layer).

For example, the request URI /delete/company// may incorrectly match a route looking for /delete/company/{company}/. By itself, the bad match is not a vulnerability. However, depending on the remaining logic in the application, such a bad match may result in skipping a protection mechanism. In the above example, if the route translates to a DB delete command, it might delete all the companies from the db.
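
One defence-in-depth measure (not a substitute for upgrading) is to validate path parameters so that an empty or undefined segment is rejected before it reaches the database layer. A sketch using joi, which is already present in this dependency tree; the server instance, route and handler shown are illustrative assumptions:

var Joi = require('joi');

// Assumes an existing hapi server instance.
server.route({
  method: 'DELETE',
  path: '/delete/company/{company}/',
  config: {
    validate: {
      // Reject requests where the {company} segment is empty or missing.
      params: { company: Joi.string().min(1).required() }
    }
  },
  handler: function (request, reply) {
    // request.params.company is guaranteed to be a non-empty string here.
    reply('deleted ' + request.params.company);
  }
});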

Remediation

Upgrade to version 3.0.2 or higher.

References

https://github.com/hapijs/hapi/issues/3228
https://github.com/hapijs/call/commit/9570eee5358b4383715cc6a13cb95971678efd30

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: content
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 content@1.0.2
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 content@1.0.2
    Remediation: Upgrade to hapi@11.0.4.

Overview

content is an HTTP Content-* headers parsing library.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). An attacker may pass a specially crafted Content-Type or Content-Disposition header, causing the server to hang.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against this 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the regex engine works very slowly (the number of steps grows exponentially with input size, as shown above), so an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade content to version 3.0.6 or higher.

References

medium severity

Potentially loose security restrictions

  • Vulnerable module: hapi
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0
    Remediation: Upgrade to hapi@11.1.4.

Overview

Security restrictions (e.g. origin) get overridden by less restrictive defaults (i.e. all origins) when server-level, connection-level, or route-level CORS configurations are combined.
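
For reference, the origin restriction in question is the cors setting that hapi accepts at server, connection, or route level; a minimal sketch of a connection-level restriction (the port and origin values are illustrative assumptions):

var Hapi = require('hapi');

var server = new Hapi.Server();

// Connection-level CORS restriction: only this origin should be allowed.
// The flaw described above can cause such a restriction to be replaced by
// the permissive default (all origins) when configurations at different levels are combined.
server.connection({
  port: 8000,
  routes: {
    cors: {
      origin: ['https://app.example.com']
    }
  }
});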

References

medium severity

Prototype Pollution

  • Vulnerable module: hoek
  • Introduced through: boom@2.6.1, hapi@8.4.0 and others

Detailed paths

  • Introduced through: able@0.4.3 boom@2.6.1 hoek@2.16.3
    Remediation: Upgrade to boom@3.1.3.
  • Introduced through: able@0.4.3 hapi@8.4.0 hoek@2.16.3
    Remediation: Upgrade to hapi@13.4.0.
  • Introduced through: able@0.4.3 hapi-fxa-oauth@2.0.1 boom@2.6.1 hoek@2.16.3
    Remediation: Upgrade to hapi-fxa-oauth@3.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 accept@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 ammo@1.0.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 call@2.0.2 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 joi@6.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@13.1.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 catbox@4.3.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 catbox-memory@1.1.2 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 h2o2@4.0.2 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 heavy@3.0.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 inert@2.1.6 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 iron@2.1.3 hoek@2.16.3
    Remediation: Upgrade to hapi@13.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 kilt@1.1.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 mimos@2.0.2 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 shot@1.7.0 hoek@2.16.3
    Remediation: Upgrade to hapi@12.0.1.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 hoek@2.16.3
    Remediation: Upgrade to hapi@13.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 hoek@2.16.3
    Remediation: Upgrade to hapi@12.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 vision@2.0.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 accept@1.1.0 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 ammo@1.0.1 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 call@2.0.2 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 catbox@4.3.0 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 cryptiles@2.0.5 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 h2o2@4.0.2 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 heavy@3.0.1 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 inert@2.1.6 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 iron@2.1.3 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 vision@2.0.1 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 inert@2.1.6 ammo@1.0.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 joi@6.10.1 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 catbox@4.3.0 joi@6.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 h2o2@4.0.2 joi@6.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 heavy@3.0.1 joi@6.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 inert@2.1.6 joi@6.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 joi@6.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@13.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 vision@2.0.1 joi@6.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 h2o2@4.0.2 wreck@6.3.0 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 wreck@6.3.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 iron@2.1.3 hoek@2.16.3
    Remediation: Upgrade to hapi@13.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 content@1.0.2 hoek@2.16.3
    Remediation: Open PR to patch hoek@2.16.3.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 inert@2.1.6 ammo@1.0.1 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 iron@2.1.3 cryptiles@2.0.5 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 cryptiles@2.0.5 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 h2o2@4.0.2 wreck@6.3.0 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 wreck@6.3.0 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 iron@2.1.3 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 content@1.0.2 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 catbox@4.3.0 joi@6.10.1 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 h2o2@4.0.2 joi@6.10.1 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 heavy@3.0.1 joi@6.10.1 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 inert@2.1.6 joi@6.10.1 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 joi@6.10.1 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 vision@2.0.1 joi@6.10.1 topo@1.1.0 hoek@2.16.3
    Remediation: Upgrade to hapi@9.0.0.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 content@1.0.2 hoek@2.16.3
    Remediation: Open PR to patch hoek@2.16.3.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 b64@2.0.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 nigel@1.0.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 statehood@2.1.1 iron@2.1.3 cryptiles@2.0.5 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 content@1.0.2 boom@2.10.1 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.
  • Introduced through: able@0.4.3 hapi@8.4.0 subtext@1.1.1 pez@1.0.0 nigel@1.0.1 vise@1.0.0 hoek@2.16.3
    Remediation: Upgrade to hapi@11.0.4.

Overview

hoek is a library of utility methods for the hapi ecosystem.

Affected versions of this package are vulnerable to Prototype Pollution. The utility functions allow modification of the Object prototype. If an attacker can control part of the structure passed to one of these functions, they could add a new property or modify an existing one on the prototype.

PoC by Olivier Arteau (HoLyVieR)

var Hoek = require('hoek');
var malicious_payload = '{"__proto__":{"oops":"It works !"}}';

var a = {};
console.log("Before : " + a.oops); // undefined
Hoek.merge({}, JSON.parse(malicious_payload)); // copies the payload onto Object.prototype
console.log("After : " + a.oops); // "It works !"

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade hoek to version 4.2.1, 5.0.3 or higher.

References

medium severity

Prototype Pollution

  • Vulnerable module: minimist
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 optimist@0.6.1 minimist@0.0.10

Overview

minimist is a module for parsing argument options.

Affected versions of this package are vulnerable to Prototype Pollution. The library could be tricked into adding or modifying properties of Object.prototype using a constructor or __proto__ payload.

PoC by Snyk

require('minimist')('--__proto__.injected0 value0'.split(' '));
console.log(({}).injected0 === 'value0'); // true

require('minimist')('--constructor.prototype.injected1 value1'.split(' '));
console.log(({}).injected1 === 'value1'); // true

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, the prototype of a base object by injecting other values. Properties on Object.prototype are then inherited by all JavaScript objects through the prototype chain. When that happens, this leads either to denial of service by triggering JavaScript exceptions, or to tampering with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows the following high-level model:

merge (target, source)
  foreach property of source
    if property exists and is an object on both the target and the source
      merge(target[property], source[property])
    else
      target[property] = source[property]

When the source object contains a property named __proto__ defined with Object.defineProperty(), the condition that checks whether the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source being the object defined by the attacker. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.
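
To make this concrete, here is a minimal sketch of an unsafe recursive merge (a generic illustration, not the actual hoek or lodash implementation):

function unsafeMerge(target, source) {
    for (var key of Object.keys(source)) {
        if (typeof target[key] === 'object' && typeof source[key] === 'object') {
            unsafeMerge(target[key], source[key]); // for the key "__proto__" this recurses into Object.prototype
        } else {
            target[key] = source[key];
        }
    }
    return target;
}

// JSON.parse creates "__proto__" as an own property, so Object.keys() returns it
var payload = JSON.parse('{"__proto__": {"polluted": true}}');
unsafeMerge({}, payload);
console.log({}.polluted); // true - every object now inherits the injected property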

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of “path”, they can set this value to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.
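
As an illustration, a hypothetical helper of that shape (not taken from any particular library) could be exploited like this:

function setByPath(obj, path, value) {
    var parts = path.split('.');
    var current = obj;
    for (var i = 0; i < parts.length - 1; i++) {
        if (current[parts[i]] === undefined) {
            current[parts[i]] = {};
        }
        current = current[parts[i]]; // for "__proto__" this walks up to Object.prototype
    }
    current[parts[parts.length - 1]] = value;
}

setByPath({}, '__proto__.myValue', 'polluted');
console.log({}.myValue); // 'polluted' - the assignment landed on Object.prototype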

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (origin: client): This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, and the codebase at any point relies on someobject.toString(), it will fail.
  • Remote Code Execution (origin: client): Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code.
  • Property Injection (origin: client): The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true, they can achieve admin privileges.
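
As a small illustration of the DoS scenario above (the pollution step is simulated directly here rather than reached through a vulnerable merge):

// Simulate what a successful pollution of Object.prototype.toString would do
Object.prototype.toString = 123;

try {
    var label = 'id-' + {}; // the implicit toString() call is no longer a function
} catch (e) {
    console.log('Caught: ' + e.name); // Caught: TypeError
}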

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype: use Object.freeze(Object.prototype) (see the sketch after this list).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice use Map instead of Object.
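
A rough sketch of points 1, 4 and 5 above (illustrative only):

'use strict';

// 1. Freeze the prototype - pollution attempts now throw in strict mode
Object.freeze(Object.prototype);

// 4. Objects created without a prototype have no chain to pollute
var lookup = Object.create(null);
lookup.name = 'value';
console.log(Object.getPrototypeOf(lookup)); // null

// 5. A Map treats "__proto__" as an ordinary string key
var settings = new Map();
settings.set('__proto__', 'harmless');
console.log(settings.get('__proto__')); // 'harmless', and no object was polluted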

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018

Remediation

Upgrade minimist to version 0.2.1, 1.2.3 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: moment
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 moment@2.8.4
    Remediation: Upgrade to convict@1.1.1.

Overview

moment is a lightweight JavaScript date library for parsing, validating, manipulating, and formatting dates.

An attacker can provide the duration function with a long value that nearly matches the pattern being matched. This will cause the regular expression matching to take a long time, all the while occupying the event loop and preventing it from processing other requests and making the server unavailable (a Denial of Service attack).
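
As a rough illustration (in the same style as the validator proofs of concept later in this report), one could time moment.duration against increasingly long inputs; the payload below is only an assumed near-match for the duration pattern, not the published proof of concept:

var moment = require('moment');

function build_attack(n) {
    var ret = "-";
    for (var i = 0; i < n; i++) {
        ret += "1";
    }
    return ret + ":"; // trailing character intended to force the match to fail late
}

for (var i = 10000; i <= 50000; i += 10000) {
    var attack_str = build_attack(i);
    var time = Date.now();
    moment.duration(attack_str);
    console.log("attack_str.length: " + attack_str.length + ": " + (Date.now() - time) + " ms");
}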

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade moment to version 2.11.2 or greater.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: moment
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 moment@2.8.4
    Remediation: Upgrade to convict@2.0.0.

Overview

moment is a lightweight JavaScript date library for parsing, validating, manipulating, and formatting dates.

Affected versions of the package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks for any locale that has separate format and standalone options, when the format input can be controlled by the user.

An attacker can provide the format function with specially crafted input that nearly matches the pattern being matched. This will cause the regular expression matching to take a long time, all the while occupying the event loop and preventing it from processing other requests and making the server unavailable (a Denial of Service attack).

Disclosure Timeline

  • October 19th, 2016 - Reported the issue to package owner.
  • October 19th, 2016 - Issue acknowledged by package owner.
  • October 24th, 2016 - Issue fixed and version 2.15.2 released.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: uglify-js
  • Introduced through: uglify-js@2.4.24

Detailed paths

  • Introduced through: able@0.4.3 uglify-js@2.4.24
    Remediation: Upgrade to uglify-js@2.6.0.

Overview

The parse() function in the uglify-js package prior to version 2.6.0 is vulnerable to regular expression denial of service (ReDoS) attacks when long inputs of certain patterns are processed.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade to version 2.6.0 or greater. If a direct dependency update is not possible, use snyk wizard to patch this vulnerability.

References

medium severity

Arbitrary Code Injection

  • Vulnerable module: underscore
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 cjson@0.3.0 jsonlint@1.6.0 nomnom@1.8.1 underscore@1.6.0

Overview

underscore is a functional programming helper library for JavaScript.

Affected versions of this package are vulnerable to Arbitrary Code Injection via the template function, particularly when the variable option is taken from _.templateSettings as it is not sanitized.

PoC

const _ = require('underscore');
// The "variable" template setting is interpolated into the compiled template
// source without sanitization, so it can smuggle in arbitrary code.
_.templateSettings.variable = "a = this.process.mainModule.require('child_process').execSync('touch HELLO')";
const t = _.template("")(); // compiling and invoking the empty template runs the injected command

Remediation

Upgrade underscore to version 1.13.0-2, 1.12.1 or higher.

References

medium severity

Buffer Overflow

  • Vulnerable module: validator
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 validator@3.26.0
    Remediation: Upgrade to convict@2.0.0.

Overview

validator is a library of string validators and sanitizers.

Affected versions of this package are vulnerable to Buffer Overflow. It used a regular expression (/^(?:[A-Z0-9+\/]{4})*(?:[A-Z0-9+\/]{2}==|[A-Z0-9+\/]{3}=|[A-Z0-9+\/]{4})$/i) in order to validate Base64 strings.

Remediation

Upgrade validator to version 5.0.0 or higher.

References

medium severity

Cross-site Scripting (XSS)

  • Vulnerable module: validator
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 validator@3.26.0
    Remediation: Upgrade to convict@0.8.1.

Overview

validator is a library of string validators and sanitizers.

Affected versions of this package are vulnerable to Cross-site Scripting (XSS) in IE9 due to unescaped backticks.

Details

A cross-site scripting attack occurs when the attacker tricks a legitimate web-based application or site to accept a request as originating from a trusted source.

This is done by escaping the context of the web application; the web application then delivers that data to its users along with other trusted dynamic content, without validating it. The browser unknowingly executes malicious script on the client side (through client-side languages; usually JavaScript or HTML) in order to perform actions that are otherwise typically blocked by the browser’s Same Origin Policy.

Injecting malicious code is the most prevalent manner by which XSS is exploited; for this reason, escaping characters in order to prevent this manipulation is the top method for securing code against this vulnerability.

Escaping means that the application is coded to mark key characters, and particularly key characters included in user input, to prevent those characters from being interpreted in a dangerous context. For example, in HTML, < can be coded as &lt; and > can be coded as &gt; in order to be interpreted and displayed as themselves in text, while within the code itself, they are used for HTML tags. If malicious content is injected into an application that escapes special characters and that malicious content uses < and > as HTML tags, those characters are nonetheless not interpreted as HTML tags by the browser if they’ve been correctly escaped in the application code and in this way the attempted attack is diverted.

The most prominent use of XSS is to steal cookies (source: OWASP HttpOnly) and hijack user sessions, but XSS exploits have been used to expose sensitive information, enable access to privileged services and functionality and deliver malware.

Types of attacks

There are a few methods by which XSS can be manipulated:

  • Stored (origin: server): The malicious code is inserted in the application (usually as a link) by the attacker. The code is activated every time a user clicks the link.
  • Reflected (origin: server): The attacker delivers a malicious link externally from the vulnerable web site application to a user. When clicked, malicious code is sent to the vulnerable web site, which reflects the attack back to the user’s browser.
  • DOM-based (origin: client): The attacker forces the user’s browser to render a malicious page. The data in the page itself delivers the cross-site scripting data.
  • Mutated: The attacker injects code that appears safe, but is then rewritten and modified by the browser, while parsing the markup. An example is rebalancing unclosed quotation marks or even adding quotation marks to unquoted parameters.

Affected environments

The following environments are susceptible to an XSS attack:

  • Web servers
  • Application servers
  • Web application environments

How to prevent

This section describes the top best practices designed to specifically protect your code:

  • Sanitize data input in an HTTP request before reflecting it back, ensuring all data is validated, filtered or escaped before echoing anything back to the user, such as the values of query parameters during searches.
  • Convert special characters such as ?, &, /, <, > and spaces to their respective HTML or URL encoded equivalents (see the escaping sketch after this list).
  • Give users the option to disable client-side scripts.
  • Redirect invalid requests.
  • Detect simultaneous logins, including those from two separate IP addresses, and invalidate those sessions.
  • Use and enforce a Content Security Policy (source: Wikipedia) to disable any features that might be manipulated for an XSS attack.
  • Read the documentation for any of the libraries referenced in your code to understand which elements allow for embedded HTML.
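
A minimal sketch of the escaping approach mentioned above (illustrative only; prefer a well-tested library or your template engine's built-in escaping):

function escapeHtml(input) {
    return String(input)
        .replace(/&/g, '&amp;')
        .replace(/</g, '&lt;')
        .replace(/>/g, '&gt;')
        .replace(/"/g, '&quot;')
        .replace(/'/g, '&#39;')
        .replace(/`/g, '&#96;'); // backticks are relevant to the IE9 issue described above
}

console.log(escapeHtml('<img src=x onerror=`alert(1)`>'));
// &lt;img src=x onerror=&#96;alert(1)&#96;&gt;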

Remediation

Upgrade validator to version 3.35.0 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: validator
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 validator@3.26.0
    Remediation: Upgrade to convict@6.0.0.

Overview

validator is a library of string validators and sanitizers.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via the isSlug function.

PoC

var validator = require("validator")
function build_attack(n) {
    var ret = "111"
    for (var i = 0; i < n; i++) {
        ret += "a"
    }

    return ret + "_";
}
for (var i = 1; i <= 50000; i++) {
    if (i % 10000 == 0) {
        var time = Date.now();
        var attack_str = build_attack(i)
        validator.isSlug(attack_str)
        var time_cost = Date.now() - time;
        console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms")
    }
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade validator to version 13.6.0 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: validator
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 validator@3.26.0
    Remediation: Upgrade to convict@6.0.0.

Overview

validator is a library of string validators and sanitizers.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via the rtrim function.

PoC

var validator = require("validator")
function build_attack(n) {
    var ret = ""
    for (var i = 0; i < n; i++) {
        ret += " "
    }

    return ret + "◎";
}
for (var i = 1; i <= 50000; i++) {
    if (i % 10000 == 0) {
        var time = Date.now();
        var attack_str = build_attack(i)
        validator.rtrim(attack_str)
        var time_cost = Date.now() - time;
        console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms")
    }
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade validator to version 13.6.0 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: validator
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 validator@3.26.0
    Remediation: Upgrade to convict@6.0.0.

Overview

validator is a library of string validators and sanitizers.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via the isHSL function.

PoC

var validator = require("validator")
function build_attack(n) {
    var ret = "hsla(0"
    for (var i = 0; i < n; i++) {
        ret += " "
    }

    return ret + "◎";
}
for (var i = 1; i <= 50000; i++) {
    if (i % 1000 == 0) {
        var time = Date.now();
        var attack_str = build_attack(i)
        validator.isHSL(attack_str)
        var time_cost = Date.now() - time;
        console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms")
    }
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade validator to version 13.6.0 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: validator
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 validator@3.26.0
    Remediation: Upgrade to convict@6.0.0.

Overview

validator is a library of string validators and sanitizers.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via the isEmail function.

PoC

var validator = require("validator")
function build_attack(n) {
    var ret = ""
    for (var i = 0; i < n; i++) {
        ret += "<"
    }

    return ret + "";
}
for (var i = 1; i <= 50000; i++) {
    if (i % 10000 == 0) {
        var time = Date.now();
        var attack_str = build_attack(i)
        validator.isEmail(attack_str, { allow_display_name: true })
        var time_cost = Date.now() - time;
        console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms")
    }
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade validator to version 13.6.0 or higher.

References

low severity

CORS Bypass

  • Vulnerable module: hapi
  • Introduced through: hapi@8.4.0

Detailed paths

  • Introduced through: able@0.4.3 hapi@8.4.0
    Remediation: Upgrade to hapi@11.0.0.

Overview

Hapi versions before 11.0.0 have an incorrect implementation of the CORS protocol, and allow for configurations that, at best, return inconsistent headers and, at worst, allow cross-origin activities that are expected to be forbidden.

Details

If the connection has CORS enabled but one route has it disabled, and the route method is not GET, the OPTIONS preflight request will return the default CORS headers, and the actual request will then go through and return no CORS headers. This defeats the purpose of the per-route CORS configuration.
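
For illustration, a hapi 8.x style configuration that combines the two settings described above might look like this (a sketch, not a full reproduction):

var Hapi = require('hapi');

var server = new Hapi.Server();
server.connection({ port: 8000, routes: { cors: true } }); // CORS enabled for the connection

server.route({
    method: 'POST',
    path: '/restricted',
    config: { cors: false }, // this route opts out of CORS
    handler: function (request, reply) {
        reply('ok');
    }
});

// The OPTIONS preflight for POST /restricted is still answered with the connection's
// default CORS headers, so a cross-origin browser goes on to send the POST request.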

Remediation

Upgrade hapi to version 11.0.0 or greater.

References

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: moment
  • Introduced through: convict@0.6.1

Detailed paths

  • Introduced through: able@0.4.3 convict@0.6.1 moment@2.8.4
    Remediation: Upgrade to convict@4.0.2.

Overview

moment is a lightweight JavaScript date library for parsing, validating, manipulating, and formatting dates.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). It uses a regular expression (/[0-9]*['a-z\u00A0-\u05FF\u0700-\uD7FF\uF900-\uFDCF\uFDF0-\uFFEF]+|[\u0600-\u06FF\/]+(\s*?[\u0600-\u06FF]+){1,2}/i) in order to parse dates specified as strings. The impact is very low: about 2 seconds of matching time for input 50k characters long.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String            Number of C's  Number of steps
ACCCX             3              38
ACCCCX            4              71
ACCCCCX           5              136
ACCCCCCCCCCCCCCX  14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this behavior and make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade moment to version 2.19.3 or higher.

References