sao@0.16.4

Vulnerabilities: 10 via 15 paths
Dependencies: 390
Source: npm

Severity
  • 2 high
  • 7 medium
  • 1 low
Status
  • 10 open
  • 0 patched
  • 0 ignored

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: ansi-regex
  • Introduced through: kopy@5.1.1

Detailed paths

  • Introduced through: sao@0.16.4 kopy@5.1.1 inquirer@2.0.0 string-width@2.1.1 strip-ansi@4.0.0 ansi-regex@3.0.0
    Remediation: Upgrade to sao@1.0.0.

Overview

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) due to the sub-patterns [[\\]()#;?]* and (?:;[-a-zA-Z\\d\\/#&.:=?%@~_]*)*.

PoC

import ansiRegex from 'ansi-regex';

for(var i = 1; i <= 50000; i++) {
    var time = Date.now();
    var attack_str = "\u001B["+";".repeat(i*10000);
    ansiRegex().test(attack_str)
    var time_cost = Date.now() - time;
    console.log("attack_str.length: " + attack_str.length + ": " + time_cost+" ms")
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing it against a 30-character string takes around 52 ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String               Number of C's    Number of steps
ACCCX                3                38
ACCCCX               4                71
ACCCCCX              5                136
ACCCCCCCCCCCCCCX     14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behaviour to make the service consume CPU excessively, resulting in a Denial of Service.
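
The growth in the table above can be reproduced directly in Node. The following is a minimal sketch, not part of the original advisory; the repeat counts are illustrative and chosen so the script finishes quickly on a typical machine:

// Minimal sketch: time the vulnerable pattern against non-matching input.
// Larger counts grow the matching time roughly exponentially.
const regex = /A(B|C+)+D/;
for (const n of [20, 24, 28]) {
  const input = 'A' + 'C'.repeat(n) + 'X'; // trailing 'X' forces full backtracking
  console.time('n=' + n);
  regex.test(input);
  console.timeEnd('n=' + n);
}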

Remediation

Upgrade ansi-regex to version 6.0.1, 5.0.1 or higher.

high severity

Command Injection

  • Vulnerable module: lodash.template
  • Introduced through: download-git-repo@0.2.2

Detailed paths

  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 gulp-decompress@1.2.0 gulp-util@3.0.8 lodash.template@3.6.2

Overview

lodash.template is the Lodash method _.template exported as a Node.js module.

Affected versions of this package are vulnerable to Command Injection via the template function.

PoC

var _ = require('lodash');

_.template('', { variable: '){console.log(process.env)}; with(obj' })()
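
To see why the payload works, note that a compiled lodash template exposes its generated source through its source property. The following sketch is not part of the advisory; it uses the standalone lodash.template module to show the injected variable text landing verbatim inside the generated function:

// Sketch: inspect the generated function source to see the injection point.
var template = require('lodash.template');

var fn = template('', { variable: '){console.log(process.env)}; with(obj' });
console.log(fn.source); // the injected text appears inside the compiled function body
fn();                   // running the template executes the injected code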

Remediation

There is no fixed version for lodash.template.

medium severity

Arbitrary File Write via Archive Extraction (Zip Slip)

  • Vulnerable module: decompress
  • Introduced through: download-git-repo@0.2.2

Detailed paths

  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 gulp-decompress@1.2.0 decompress@3.0.0

Overview

decompress is a package that can be used for extracting archives.

Affected versions of this package are vulnerable to Arbitrary File Write via Archive Extraction (Zip Slip). It is possible to bypass the security measures provided by decompress and conduct ZIP path traversal through symlinks.

PoC

const decompress = require('decompress');

decompress('slip.tar.gz', 'dist').then(files => {
    console.log('done!');
});

Details

It is exploited using a specially crafted zip archive that holds path traversal filenames. When exploited, a filename in a malicious archive is concatenated to the target extraction directory, which results in the final path ending up outside of the target folder. For instance, a zip may hold a file with a "../../file.exe" location and thus break out of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.

The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ and overwriting the authorized_keys file:


2018-04-15 22:04:29 ..... 19 19 good.txt
2018-04-15 22:04:42 ..... 20 20 ../../../../../../root/.ssh/authorized_keys
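
If upgrading is not immediately possible, a common generic mitigation is to verify that each entry's resolved path stays inside the extraction directory before writing it. A minimal sketch, not part of decompress; the entry names are illustrative:

// Sketch: reject archive entries whose resolved path escapes the target directory.
const path = require('path');

function isContained(destDir, entryName) {
  const resolved = path.resolve(destDir, entryName);
  const rel = path.relative(path.resolve(destDir), resolved);
  // path.relative starts with '..' (or is absolute) when the entry escapes destDir
  return rel !== '' && !rel.startsWith('..') && !path.isAbsolute(rel);
}

console.log(isContained('dist', 'good.txt'));                                    // true
console.log(isContained('dist', '../../../../../../root/.ssh/authorized_keys')); // false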

Remediation

Upgrade decompress to version 4.2.1 or higher.

medium severity

Arbitrary File Write via Archive Extraction (Zip Slip)

  • Vulnerable module: decompress-tar
  • Introduced through: download-git-repo@0.2.2

Detailed paths

  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 gulp-decompress@1.2.0 decompress@3.0.0 decompress-tar@3.1.0

Overview

decompress-tar is a tar plugin for decompress.

Affected versions of this package are vulnerable to Arbitrary File Write via Archive Extraction (Zip Slip). It is possible to bypass the security measures provided by decompress and conduct ZIP path traversal through symlinks.

PoC

const decompress = require('decompress');

decompress('slip.tar.gz', 'dist').then(files => {
    console.log('done!');
});

Details

It is exploited using a specially crafted zip archive that holds path traversal filenames. When exploited, a filename in a malicious archive is concatenated to the target extraction directory, which results in the final path ending up outside of the target folder. For instance, a zip may hold a file with a "../../file.exe" location and thus break out of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.

The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ and overwriting the authorized_keys file:


2018-04-15 22:04:29 ..... 19 19 good.txt
2018-04-15 22:04:42 ..... 20 20 ../../../../../../root/.ssh/authorized_keys

Remediation

There is no fixed version for decompress-tar.

medium severity

Prototype Pollution

  • Vulnerable module: dot-prop
  • Introduced through: conf@0.11.2 and update-notifier@1.0.3

Detailed paths

  • Introduced through: sao@0.16.4 conf@0.11.2 dot-prop@3.0.0
    Remediation: Upgrade to sao@0.16.11.
  • Introduced through: sao@0.16.4 update-notifier@1.0.3 configstore@2.1.0 dot-prop@3.0.0
    Remediation: Upgrade to sao@0.16.11.

Overview

dot-prop is a package to get, set, or delete a property from a nested object using a dot path.

Affected versions of this package are vulnerable to Prototype Pollution. It is possible for a user to modify the prototype of a base object.

PoC by aaron_costello

const dotProp = require("dot-prop");

const object = {};
console.log("Before: " + object.b);        // undefined
dotProp.set(object, '__proto__.b', true);  // pollutes Object.prototype via the path
console.log("After: " + {}.b);             // true

Details

Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, the prototype of a base object, and every object that inherits from that prototype then picks up the injected property. Depending on how the polluted properties are used by the application, this can lead to denial of service, property tampering, or in some cases remote code execution.
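
When an upgrade is not an option, a simple defensive wrapper can reject path segments that would reach the prototype chain before delegating to dot-prop. This is a hypothetical guard, not part of the dot-prop API:

// Hypothetical guard: refuse prototype-polluting path segments before calling dot-prop.
const dotProp = require('dot-prop');

const FORBIDDEN = new Set(['__proto__', 'prototype', 'constructor']);

function safeSet(obj, dottedPath, value) {
  if (dottedPath.split('.').some(part => FORBIDDEN.has(part))) {
    throw new Error('Refusing to set prototype-polluting path: ' + dottedPath);
  }
  return dotProp.set(obj, dottedPath, value);
}

safeSet({}, 'a.b', 1);               // works as usual
// safeSet({}, '__proto__.b', true); // throws instead of polluting Object.prototype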

Remediation

Upgrade dot-prop to version 4.2.1, 5.1.1 or higher.

medium severity

Denial of Service

  • Vulnerable module: node-fetch
  • Introduced through: node-fetch@1.7.3

Detailed paths

  • Introduced through: sao@0.16.4 node-fetch@1.7.3
    Remediation: Upgrade to sao@0.21.0.

Overview

node-fetch is a light-weight module that brings window.fetch to Node.js.

Affected versions of this package are vulnerable to Denial of Service. node-fetch did not honor the size option after following a redirect, which means that when the content size was over the limit, a FetchError would never get thrown and the process would end without failure.
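
For context, the size option caps the response body in bytes; a minimal usage sketch follows (the URL and limit are illustrative). In affected versions, the cap was silently dropped once a redirect was followed:

// Sketch: the `size` option should cause a FetchError when the body exceeds the limit.
const fetch = require('node-fetch');

fetch('http://example.com/large-file', { size: 1024 })       // cap the body at 1 KB
  .then(res => res.buffer())
  .then(buf => console.log('received', buf.length, 'bytes'))
  .catch(err => console.error(err.name + ':', err.message)); // FetchError when over the limit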

Remediation

Upgrade node-fetch to version 2.6.1, 3.0.0-beta.9 or higher.

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: glob-parent
  • Introduced through: download-git-repo@0.2.2

Detailed paths

  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 vinyl-fs@2.4.4 glob-stream@5.3.5 glob-parent@3.1.0
  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 gulp-decompress@1.2.0 decompress@3.0.0 vinyl-fs@2.4.4 glob-stream@5.3.5 glob-parent@3.1.0
  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 vinyl-fs@2.4.4 glob-stream@5.3.5 micromatch@2.3.11 parse-glob@3.0.4 glob-base@0.3.0 glob-parent@2.0.0
  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 gulp-decompress@1.2.0 decompress@3.0.0 vinyl-fs@2.4.4 glob-stream@5.3.5 micromatch@2.3.11 parse-glob@3.0.4 glob-base@0.3.0 glob-parent@2.0.0

Overview

glob-parent is a package that extracts the non-magic parent path from a glob string.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) due to the enclosure regex used to check for strings ending in an enclosure that contains a path separator.

PoC by Yeting Li

var globParent = require("glob-parent");

// Build "{" followed by n slashes; long inputs make the enclosure regex extremely slow
function build_attack(n) {
  var ret = "{";
  for (var i = 0; i < n; i++) {
    ret += "/";
  }
  return ret;
}

globParent(build_attack(5000));

Details

Regular Expression Denial of Service (ReDoS) is described in detail under the ansi-regex vulnerability earlier in this report; the same catastrophic-backtracking mechanism applies here.

Remediation

Upgrade glob-parent to version 5.1.2 or higher.

medium severity

Uninitialized Memory Exposure

  • Vulnerable module: tunnel-agent
  • Introduced through: download-git-repo@0.2.2

Detailed paths

  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 caw@1.2.0 tunnel-agent@0.4.3
    Remediation: Upgrade to sao@0.16.11.

Overview

tunnel-agent is an HTTP proxy tunneling agent. Affected versions of the package are vulnerable to Uninitialized Memory Exposure.

A possible memory disclosure vulnerability exists when a value of type number is used to set the proxy.auth option of a request, resulting in possible uninitialized memory exposure in the request body.

This is a result of unobstructed use of the Buffer constructor, whose insecure default constructor increases the odds of memory leakage.

Details

Constructing a Buffer with an integer N creates a Buffer of length N with raw (not zeroed) memory.

In the following example, the first call would allocate 100 bytes of memory, while the second example will allocate the memory needed for the string "100":

// uninitialized Buffer of length 100
x = new Buffer(100);
// initialized Buffer with value of '100'
x = new Buffer('100');

tunnel-agent's request construction uses the default Buffer constructor as-is, making it easy to append uninitialized memory to an existing list. If the value of the buffer list is exposed to users, it may expose raw server side memory, potentially holding secrets, private data and code. This is a similar vulnerability to the infamous Heartbleed flaw in OpenSSL.
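
On Node.js versions that support them, Buffer.alloc and Buffer.from avoid the problem by always initializing memory. A short sketch, as general Node.js guidance rather than text from the advisory:

// Safe alternatives to the default Buffer constructor.
const zeroed = Buffer.alloc(100);       // zero-filled buffer of length 100
const fromString = Buffer.from('100');  // buffer containing the string '100'
console.log(zeroed.length, fromString.toString()); // 100 '100'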

Proof of concept by ChALkeR

require('request')({
  method: 'GET',
  uri: 'http://www.example.com',
  tunnel: true,
  proxy: {
    protocol: 'http:',
    host: '127.0.0.1',
    port: 8080,
    auth: 80   // a number here leads to an uninitialized Buffer allocation
  }
});

You can read more about the insecure Buffer behavior on our blog.

Similar vulnerabilities were discovered in request, mongoose, ws and sequelize.

Remediation

Upgrade tunnel-agent to version 0.6.0 or higher. Note: this is vulnerable only for Node.js <= 4.

medium severity

Arbitrary Code Injection

  • Vulnerable module: ejs
  • Introduced through: kopy@5.1.1

Detailed paths

  • Introduced through: sao@0.16.4 kopy@5.1.1 jstransformer-ejs@0.0.3 ejs@2.7.4

Overview

ejs is a popular JavaScript templating engine.

Affected versions of this package are vulnerable to Arbitrary Code Injection via the render and renderFile functions. If external input flows into the options parameter, an attacker is able to run arbitrary code. This includes the filename, compileDebug, and client options.

PoC

let ejs = require('ejs')
ejs.render('./views/test.ejs',{
    filename:'/etc/passwd\nfinally { this.global.process.mainModule.require(\'child_process\').execSync(\'touch EJS_HACKED\') }',
    compileDebug: true,
    message: 'test',
    client: true
})
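
The safe pattern is to keep untrusted input in the data argument and never let it reach the options argument. A minimal sketch with an illustrative template and value:

// Sketch: untrusted input stays in the data object, options stay hard-coded.
let ejs = require('ejs');

const userInput = 'test';   // attacker-controlled value
const html = ejs.render('<p><%= message %></p>', { message: userInput });
console.log(html);          // <p>test</p>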

Remediation

Upgrade ejs to version 3.1.6 or higher.

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: braces
  • Introduced through: download-git-repo@0.2.2

Detailed paths

  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 vinyl-fs@2.4.4 glob-stream@5.3.5 micromatch@2.3.11 braces@1.8.5
  • Introduced through: sao@0.16.4 download-git-repo@0.2.2 download@4.4.3 gulp-decompress@1.2.0 decompress@3.0.0 vinyl-fs@2.4.4 glob-stream@5.3.5 micromatch@2.3.11 braces@1.8.5

Overview

braces is a Bash-like brace expansion library, implemented in JavaScript.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). It used the regular expression (^\{(,+(?:(\{,+\})*),*|,*(?:(\{,+\})*),+)\}) to detect empty braces, which can result in about 10 seconds of matching time for input 50K characters long.

Disclosure Timeline

  • Feb 15th, 2018 - Initial Disclosure to package owner
  • Feb 16th, 2018 - Initial Response from package owner
  • Feb 18th, 2018 - Fix issued
  • Feb 19th, 2018 - Vulnerability published

Details

Regular Expression Denial of Service (ReDoS) is described in detail under the ansi-regex vulnerability earlier in this report; the same catastrophic-backtracking mechanism applies here.

Remediation

Upgrade braces to version 2.3.1 or higher.
