freeze@0.2.2-beta

Vulnerabilities: 66 via 216 paths

Dependencies: 579

Source: npm

Severity
  • 2 critical
  • 28 high
  • 27 medium
  • 9 low
Status
  • 66 open
  • 0 patched
  • 0 ignored

critical severity

Arbitrary Code Execution

  • Vulnerable module: front-matter
  • Introduced through: front-matter@0.2.3

Detailed paths

  • Introduced through: freeze@0.2.2-beta front-matter@0.2.3
    Remediation: Upgrade to front-matter@4.0.1.

Overview

front-matter is a package that extracts metadata (front matter) from documents.

Affected versions of this package are vulnerable to Arbitrary Code Execution due to the default usage of the js-yaml function yaml.load() instead of its secure replacement, yaml.safeLoad().
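
As an illustration (a minimal sketch assuming js-yaml 3.x semantics; this is not front-matter's actual source), the difference is that yaml.load() resolves JavaScript-specific YAML tags while yaml.safeLoad() rejects them:

const yaml = require('js-yaml'); // assumes a js-yaml 3.x release, where load() is not yet safe by default

// Untrusted front matter: the !!js/function tag is resolved by the full schema.
const untrusted = 'payload: !!js/function "function () { return process.env; }"';

const doc = yaml.load(untrusted);          // unsafe: builds a real JavaScript function
// doc.payload() now runs attacker-supplied code.

// const safeDoc = yaml.safeLoad(untrusted); // safe variant: rejects the JavaScript-specific tag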

Remediation

Upgrade front-matter to version 4.0.1 or higher.

References

critical severity

Arbitrary Code Injection

  • Vulnerable module: growl
  • Introduced through: freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine@0.1.11 mocha@1.21.5 growl@1.8.1

Overview

growl is a package that adds Growl notification support for Node.js.

Affected versions of this package are vulnerable to Arbitrary Code Injection due to unsafe use of the eval() function, which evaluates strings as JavaScript code. An attacker can craft a malicious payload to inject and run arbitrary commands.
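
As a generic illustration of the pattern (not growl's actual source), the danger is building a string from user input and handing it to eval():

function notify(message) {                     // stand-in for the call the library builds
  console.log('notify:', message);
}

const userInput = '"); require("child_process").execSync("id"); ("';

// Unsafe string building followed by eval():
eval('notify("' + userInput + '")');
// The evaluated string is now three statements, and the middle one
// runs the attacker's execSync("id") call.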

Remediation

Upgrade growl to version 1.10.0 or higher.

References

high severity

Uninitialized Memory Exposure

  • Vulnerable module: base64-url
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 express-session@1.11.3 uid-safe@2.0.0 base64-url@1.2.1
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 express-session@1.11.3 uid-safe@2.0.0 base64-url@1.2.1

Overview

base64-url is a package that provides Base64 encoding, decoding, escaping and unescaping for URL applications.

Affected versions of this package are vulnerable to Uninitialized Memory Exposure. An attacker may extract sensitive data from uninitialized memory or may cause a DoS by passing in a large number, in setups where typed user input can be passed (e.g. from JSON).

Details

The Buffer class on Node.js is a mutable array of binary data, and can be initialized with a string, array or number.

const buf1 = new Buffer([1,2,3]);
// creates a buffer containing [01, 02, 03]
const buf2 = new Buffer('test');
// creates a buffer containing ASCII bytes [74, 65, 73, 74]
const buf3 = new Buffer(10);
// creates a buffer of length 10

The first two variants simply create a binary representation of the value they received. The last one, however, pre-allocates a buffer of the specified size, which is especially useful when reading data from a stream. When the number constructor of Buffer is used, the memory is allocated but not zero-filled; the allocated buffer holds whatever was in memory at the time. If the buffer is not zeroed by using buf.fill(0), it may leak sensitive information like keys, source code, and system info.
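
A minimal sketch of the safer allocation APIs (available in Node.js 4.5+/5.10+; this is general Buffer guidance, not base64-url's internals):

const unsafe = new Buffer(10);          // deprecated: may expose whatever was previously in memory
unsafe.fill(0);                         // must be zeroed manually before use

const zeroed = Buffer.alloc(10);        // allocates and zero-fills
const fast = Buffer.allocUnsafe(10);    // explicit opt-in to uninitialized memory
fast.fill(0);                           // zero it before handing it to callers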

Remediation

Upgrade base64-url to version 2.0.0 or higher. Note: this is only exploitable on Node.js <= 4.

References

high severity

Code Injection

  • Vulnerable module: dustjs-linkedin
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 dustjs-linkedin@1.2.6

Overview

dustjs-linkedin is a Javascript templating engine designed to run asynchronously on both the server and the browser.

Affected versions of this package are vulnerable to Code Injection. Dust.js uses JavaScript's eval() function to evaluate "if" statement conditions. The input to the function is sanitized by escaping all potentially dangerous characters.

However, if the variable passed in is an array, no escaping is applied, exposing an easy path to code injection. The risk of exploit is especially high given that express, koa and many other Node.js servers allow users to force a query parameter to be an array using the param[]=value notation.
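
To illustrate how a request can force an array value (a sketch of the server-side query parsing only, not of Dust.js internals):

const qs = require('qs'); // the query-string parser used by express for extended parsing

console.log(qs.parse('name=alice'));    // { name: 'alice' }      -> string, gets escaped
console.log(qs.parse('name[]=alice'));  // { name: [ 'alice' ] }  -> array, escaping is skipped

// In vulnerable dustjs-linkedin versions, an "if" helper condition built from
// such an array value reaches eval() without the string sanitization step.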

Remediation

Upgrade dustjs-linkedin to version 2.6.0 or higher.

References

high severity

Prototype Pollution

  • Vulnerable module: dustjs-linkedin
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 dustjs-linkedin@1.2.6

Overview

dustjs-linkedin is a Javascript templating engine designed to run asynchronously on both the server and the browser.

Affected versions of this package are vulnerable to Prototype Pollution. It is possible to pollute the blocks Array attribute of the object context within the compileBlocks function. This vulnerability can be leveraged for code execution since this property is added to the compiled function, which is then executed by the vm module.

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, the prototype of the base object by injecting other values. Properties on the Object.prototype are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows the following high-level model:

merge (target, source)
  foreach property of source
    if property exists and is an object on both the target and the source
      merge(target[property], source[property])
    else
      target[property] = source[property]

When the source object contains a property named __proto__ defined with Object.defineProperty(), the condition that checks whether the property exists and is an object on both the target and the source passes, and the merge recurses with Object's prototype as the target and the attacker-defined object as the source. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.
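
A concrete sketch of such a vulnerable merge (illustrative only; not taken from lodash or Hoek):

function isObject(value) {
  return value !== null && typeof value === 'object';
}

function merge(target, source) {                  // follows the pseudocode above
  for (const key of Object.keys(source)) {
    if (isObject(target[key]) && isObject(source[key])) {
      merge(target[key], source[key]);            // recurses into "__proto__"
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

// Attacker-controlled JSON, e.g. a request body:
const payload = JSON.parse('{"__proto__": {"isAdmin": true}}');
merge({}, payload);

console.log(({}).isAdmin); // true -- Object.prototype has been polluted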

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of "path", they can set this value to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.
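
A hypothetical helper with that signature shows the problem (the names here are illustrative, not taken from any specific library):

function setByPath(obj, path, value) {     // hypothetical helper, for illustration only
  const keys = path.split('.');
  let current = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    if (current[keys[i]] === null || typeof current[keys[i]] !== 'object') {
      current[keys[i]] = {};
    }
    current = current[keys[i]];            // for "__proto__" this walks onto Object.prototype
  }
  current[keys[keys.length - 1]] = value;
}

// If the attacker controls "path":
setByPath({}, '__proto__.isAdmin', true);
console.log(({}).isAdmin); // true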

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (origin: client). This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, any code that relies on someobject.toString() would fail.
  • Remote Code Execution (origin: client). Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code.
  • Property Injection (origin: client). The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to true, they can achieve admin privileges.

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype: use Object.freeze(Object.prototype).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice, use Map instead of Object (see the sketch below).
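
Items 1, 4 and 5 translate directly into code; a minimal sketch:

// 1. Freeze the prototype so pollution attempts have no effect:
Object.freeze(Object.prototype);

// 4. Objects without a prototype cannot be reached through __proto__:
const bag = Object.create(null);
bag.isAdmin = false;                 // no prototype chain to pollute

// 5. Map keys are plain values, never prototype properties:
const settings = new Map();
settings.set('__proto__', 'just an ordinary key');
console.log(settings.get('__proto__'));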

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018

Remediation

Upgrade dustjs-linkedin to version 3.0.0 or higher.

References

high severity

Arbitrary File Write

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Arbitrary File Write. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created.

This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both \ and / characters as path separators. However, \ is a valid filename character on posix systems.

By first creating a directory, and then replacing that directory with a symlink, it is possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location. This can lead to extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite.

Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at FOO, followed by a symbolic link named foo, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but not from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the FOO directory would then be placed in the target of the symbolic link, thinking that the directory had already been created.
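
Beyond upgrading, untrusted archives can be extracted with an entry filter that rejects link entries outright (a defensive sketch; assumes node-tar 4+ and hypothetical archive and destination names):

const tar = require('tar');

tar.x({
  file: 'untrusted.tgz',            // hypothetical archive
  cwd: '/srv/extract-target',       // hypothetical destination
  filter: (entryPath, entry) =>
    entry.type !== 'SymbolicLink' && entry.type !== 'Link', // drop symlinks and hard links
}).then(() => console.log('extraction finished'));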

Remediation

Upgrade tar to version 6.1.7, 5.0.8, 4.4.16 or higher.

References

high severity

Arbitrary File Write

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Arbitrary File Write. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created.

This logic is insufficient when extracting tar files that contain two directories and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive can include directories with two forms of the path that resolve to the same file system entity, followed by a symbolic link with a name in the first form, lastly followed by a file using the second form. This leads to bypassing node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and extracting arbitrary files into that location.

Remediation

Upgrade tar to version 6.1.9, 5.0.10, 4.4.18 or higher.

References

high severity

Arbitrary File Write

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Arbitrary File Write. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain .. path portions, and resolving the sanitized paths against the extraction target directory.

This logic is insufficient on Windows systems when extracting tar files that contain a path that is not an absolute path, but specify a drive letter different from the extraction target, such as C:some\path. If the drive letter does not match the extraction target, for example D:\extraction\dir, then the result of path.resolve(extractionDirectory, entryPath) resolves against the current working directory on the C: drive, rather than the extraction target directory.

Additionally, a .. portion of the path can occur immediately after the drive letter, such as C:../foo, and is not properly sanitized by the logic that checks for .. within the normalized and split portions of the path.

Note: This only affects users of node-tar on Windows systems.

Remediation

Upgrade tar to version 6.1.9, 5.0.10, 4.4.18 or higher.

References

high severity

Arbitrary File Overwrite

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Arbitrary File Overwrite. This is due to insufficient symlink protection. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created.

This logic is insufficient when extracting tar files that contain both a directory and a symlink with the same name as the directory. This order of operations results in the directory being created and added to the node-tar directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where node-tar's symlink checks occur. By first creating a directory, and then replacing that directory with a symlink, it is possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location.

Remediation

Upgrade tar to version 3.2.3, 4.4.15, 5.0.7, 6.1.2 or higher.

References

high severity

Arbitrary File Overwrite

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Arbitrary File Overwrite. This is due to insufficient absolute path sanitization.

node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the preservePaths flag is not set to true. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example, the path /home/user/.bashrc would turn into home/user/.bashrc.

This logic is insufficient when file paths contain repeated path roots such as ////home/user/.bashrc. node-tar only strips a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. ///home/user/.bashrc) still resolves to an absolute path.
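
The corrected behaviour can be illustrated by stripping path roots in a loop rather than only once (a sketch, not node-tar's actual implementation):

const path = require('path').posix;   // posix semantics, matching the examples above

function stripAbsolutePrefix(entryPath) {
  let parsed = path.parse(entryPath);
  while (parsed.root) {                          // still absolute? strip another root
    entryPath = entryPath.slice(parsed.root.length);
    parsed = path.parse(entryPath);
  }
  return entryPath;
}

console.log(stripAbsolutePrefix('/home/user/.bashrc'));    // home/user/.bashrc
console.log(stripAbsolutePrefix('////home/user/.bashrc')); // home/user/.bashrc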

Remediation

Upgrade tar to version 3.2.2, 4.4.14, 5.0.6, 6.1.1 or higher.

References

high severity

Arbitrary Code Execution

  • Vulnerable module: ejs
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 ejs@0.8.8

Overview

ejs is a popular JavaScript templating engine. Affected versions of the package are vulnerable to Remote Code Execution by letting the attacker under certain conditions control the source folder from which the engine renders include files. You can read more about this vulnerability on the Snyk blog.

There are also Cross-site Scripting and Denial of Service vulnerabilities caused by the same behaviour.

Details

ejs provides a few different options for rendering a template, two of which are very similar: ejs.render() and ejs.renderFile(). The only difference is that render expects a string to be used for the template, while renderFile expects a path to a template file.

Both functions can be invoked in two ways. The first is calling them with template, data, and options:

ejs.render(str, data, options);

ejs.renderFile(filename, data, options, callback)

The second way is to pass only the template and data, with ejs allowing the options to be passed as part of the data object:

ejs.render(str, dataAndOptions);

ejs.renderFile(filename, dataAndOptions, callback)

If used with a variable list supplied by the user (e.g. by reading it from the URI with qs or equivalent), an attacker can control ejs options. This includes the root option, which allows changing the project root for includes with an absolute path.

ejs.renderFile('my-template', {root:'/bad/root/'}, callback);

By passing along the root directive in the line above, any includes would now be pulled from /bad/root instead of the intended path. This allows the attacker to take control of the root directory for included scripts and divert it to a library under their control, thus leading to remote code execution.

The fix introduced in version 2.5.3 blacklisted root options from options passed via the data object.
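
Independent of the fix, a defensive calling pattern is to keep user input out of the combined data-and-options argument (a sketch; the template name and root path are hypothetical):

const ejs = require('ejs');

// Risky: userInput may smuggle options such as "root" alongside template data.
// ejs.renderFile('profile.ejs', userInput, callback);

// Safer: user input only populates named template variables,
// and options are always passed explicitly.
function renderProfile(userInput, callback) {
  ejs.renderFile('profile.ejs', { user: userInput }, { root: '/srv/app/views' }, callback);
}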

Disclosure Timeline

  • November 27th, 2016 - Reported the issue to package owner.
  • November 27th, 2016 - Issue acknowledged by package owner.
  • November 28th, 2016 - Issue fixed and version 2.5.3 released.

Remediation

The vulnerability can be resolved by either using the GitHub integration to generate a pull request from your dashboard or by running snyk wizard from the command-line interface. Otherwise, upgrade ejs to version 2.5.3 or higher.

References

high severity

Arbitrary File Overwrite

  • Vulnerable module: npm
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000

Overview

npm is a package manager for JavaScript.

Affected versions of this package are vulnerable to Arbitrary File Overwrite. It fails to prevent existing globally-installed binaries from being overwritten by other package installations. For example, if a package is installed globally and creates a serve binary, any subsequent installs of packages that also create a serve binary will overwrite the first binary. This only affects files in /usr/local/bin.

For npm, this behaviour is still allowed in local installations and also through install scripts. The vulnerability is not mitigated by the --ignore-scripts install option.

Details

A Directory Traversal attack (also known as path traversal) aims to access files and directories that are stored outside the intended folder. By manipulating files with "dot-dot-slash (../)" sequences and their variations, or by using absolute file paths, it may be possible to access arbitrary files and directories stored on the file system, including application source code, configuration, and other critical system files.

Directory Traversal vulnerabilities can be generally divided into two types:

  • Information Disclosure: Allows the attacker to gain information about the folder structure or read the contents of sensitive files on the system.

st is a module for serving static files on web pages, and contains a vulnerability of this type. In our example, we will serve files from the public route.

If an attacker requests the following URL from our server, it will in turn leak the sensitive private key of the root user.

curl http://localhost:8080/public/%2e%2e/%2e%2e/%2e%2e/%2e%2e/%2e%2e/root/.ssh/id_rsa

Note: %2e is the URL-encoded version of . (dot).

  • Writing arbitrary files: Allows the attacker to create or replace existing files. This type of vulnerability is also known as Zip-Slip.

One way to achieve this is by using a malicious zip archive that holds path traversal filenames. When each filename in the zip archive gets concatenated to the target extraction folder, without validation, the final path ends up outside of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.

The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ overwriting the authorized_keys file:

2018-04-15 22:04:29 .....           19           19  good.txt
2018-04-15 22:04:42 .....           20           20  ../../../../../../root/.ssh/authorized_keys
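
A generic guard against this class of traversal resolves every entry name against the target directory and rejects anything that escapes it (an illustrative sketch, not npm's code):

const path = require('path');

function safeJoin(targetDir, entryName) {
  const resolvedTarget = path.resolve(targetDir);
  const resolvedEntry = path.resolve(targetDir, entryName);
  if (resolvedEntry !== resolvedTarget &&
      !resolvedEntry.startsWith(resolvedTarget + path.sep)) {
    throw new Error('Blocked path traversal entry: ' + entryName);
  }
  return resolvedEntry;
}

console.log(safeJoin('/tmp/out', 'good.txt'));        // ok: /tmp/out/good.txt
try {
  safeJoin('/tmp/out', '../../../../../../root/.ssh/authorized_keys');
} catch (err) {
  console.log(err.message);                           // Blocked path traversal entry: ...
}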

Remediation

Upgrade npm to version 6.13.4 or higher.

References

high severity

Arbitrary File Write

  • Vulnerable module: npm
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000

Overview

npm is a package manager for JavaScript.

Affected versions of this package are vulnerable to Arbitrary File Write. It fails to prevent access to folders outside of the intended node_modules folder through the bin field.

For npm, a properly constructed entry in the package.json bin field would allow a package publisher to modify and/or gain access to arbitrary files on a user's system when the package is installed. This behaviour is possible through install scripts. The vulnerability is not mitigated by the --ignore-scripts install option.

Details

A Directory Traversal attack (also known as path traversal) aims to access files and directories that are stored outside the intended folder. By manipulating files with "dot-dot-slash (../)" sequences and their variations, or by using absolute file paths, it may be possible to access arbitrary files and directories stored on the file system, including application source code, configuration, and other critical system files.

Directory Traversal vulnerabilities can be generally divided into two types:

  • Information Disclosure: Allows the attacker to gain information about the folder structure or read the contents of sensitive files on the system.

st is a module for serving static files on web pages, and contains a vulnerability of this type. In our example, we will serve files from the public route.

If an attacker requests the following URL from our server, it will in turn leak the sensitive private key of the root user.

curl http://localhost:8080/public/%2e%2e/%2e%2e/%2e%2e/%2e%2e/%2e%2e/root/.ssh/id_rsa

Note: %2e is the URL-encoded version of . (dot).

  • Writing arbitrary files: Allows the attacker to create or replace existing files. This type of vulnerability is also known as Zip-Slip.

One way to achieve this is by using a malicious zip archive that holds path traversal filenames. When each filename in the zip archive gets concatenated to the target extraction folder, without validation, the final path ends up outside of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.

The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ overwriting the authorized_keys file:

2018-04-15 22:04:29 .....           19           19  good.txt
2018-04-15 22:04:42 .....           20           20  ../../../../../../root/.ssh/authorized_keys

Remediation

Upgrade npm to version 6.13.3 or higher.

References

high severity

Arbitrary File Overwrite

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Arbitrary File Overwrite. Extracting tarballs that contain both a hard-link to a file that already exists on the system and a file that matches the hard-link may overwrite system files with the contents of the extracted file.

Remediation

Upgrade tar to version 2.2.2, 4.4.2 or higher.

References

high severity

Arbitrary Code Injection

  • Vulnerable module: xmlhttprequest
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 xmlhttprequest@1.4.2
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 xmlhttprequest@1.4.2

Overview

xmlhttprequest is a wrapper for the built-in http client to emulate the browser XMLHttpRequest object.

Affected versions of this package are vulnerable to Arbitrary Code Injection. Provided requests are sent synchronously (async=false on xhr.open), malicious user input flowing into xhr.send could result in arbitrary code being injected and run.

POC

const { XMLHttpRequest } = require("xmlhttprequest")

const xhr = new XMLHttpRequest()
xhr.open("POST", "http://localhost.invalid/", false /* synchronous request */)
xhr.send("\\');require(\"fs\").writeFileSync(\"/tmp/aaaaa.txt\", \"poc-20210306\");req.end();//")

Remediation

Upgrade xmlhttprequest to version 1.7.0 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: fresh
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 fresh@0.3.0
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 fresh@0.3.0
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 send@0.13.0 fresh@0.3.0
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 fresh@0.3.0
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-favicon@2.3.2 fresh@0.3.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 fresh@0.3.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 send@0.13.0 fresh@0.3.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-favicon@2.3.2 fresh@0.3.0
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 fresh@0.3.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 fresh@0.3.0

Overview

fresh is a library for HTTP response freshness testing.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks. A regular expression (/ *, */) used for parsing HTTP headers takes about 2 seconds of matching time for a 50k-character input.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30 characters long string takes around ~52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause regex engines to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this and cause the service to consume CPU excessively, resulting in a Denial of Service.

Remediation

Upgrade fresh to version 0.5.2 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: minimatch
  • Introduced through: browser-refresh@0.1.0-beta, freeze-theme-default@0.1.0-beta and others

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta minimatch@0.2.14
    Remediation: Upgrade to browser-refresh@1.7.2.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta minimatch@0.2.14
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine@0.1.11 mocha@1.21.5 glob@3.2.3 minimatch@0.2.14
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 minimatch@0.3.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine-raptor@0.1.3-beta raptor-templates@0.2.20-beta glob@3.2.11 minimatch@0.3.0
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 glob@3.2.11 minimatch@0.3.0
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 glob@4.0.6 minimatch@1.0.0
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 init-package-json@0.1.2 glob@4.5.3 minimatch@2.0.10
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 read-package-json@1.2.7 glob@4.5.3 minimatch@2.0.10
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 minimatch@0.4.0

Overview

minimatch is a minimal matching utility.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via complicated and illegal regexes.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30 characters long string takes around ~52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause regex engines to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this and cause the service to consume CPU excessively, resulting in a Denial of Service.

Remediation

Upgrade minimatch to version 3.0.2 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: minimatch
  • Introduced through: browser-refresh@0.1.0-beta, freeze-theme-default@0.1.0-beta and others

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta minimatch@0.2.14
    Remediation: Upgrade to browser-refresh@1.7.2.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta minimatch@0.2.14
    Remediation: Open PR to patch minimatch@0.2.14.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine@0.1.11 mocha@1.21.5 glob@3.2.3 minimatch@0.2.14
    Remediation: Open PR to patch minimatch@0.2.14.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 minimatch@0.3.0
    Remediation: Open PR to patch minimatch@0.3.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine-raptor@0.1.3-beta raptor-templates@0.2.20-beta glob@3.2.11 minimatch@0.3.0
    Remediation: Open PR to patch minimatch@0.3.0.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 glob@3.2.11 minimatch@0.3.0
    Remediation: Open PR to patch minimatch@0.3.0.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 glob@4.0.6 minimatch@1.0.0
    Remediation: Open PR to patch minimatch@1.0.0.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 init-package-json@0.1.2 glob@4.5.3 minimatch@2.0.10
    Remediation: Open PR to patch minimatch@2.0.10.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 read-package-json@1.2.7 glob@4.5.3 minimatch@2.0.10
    Remediation: Open PR to patch minimatch@2.0.10.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 minimatch@0.4.0
    Remediation: Open PR to patch minimatch@0.4.0.

Overview

minimatch is a minimal matching utility.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30 characters long string takes around ~52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause regex engines to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this and cause the service to consume CPU excessively, resulting in a Denial of Service.

Remediation

Upgrade minimatch to version 3.0.2 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: mocha
  • Introduced through: freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine@0.1.11 mocha@1.21.5

Overview

mocha is a JavaScript test framework for Node.js and the browser.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). If the stack trace in utils.js begins with a large error message (>= 20k characters), and the full-trace option is not enabled, utils.stackTraceFilter() will take exponential time to run.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30 characters long string takes around ~52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause regex engines to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this and cause the service to consume CPU excessively, resulting in a Denial of Service.

Remediation

Upgrade mocha to version 6.0.0 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: negotiator
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 compression@1.5.2 accepts@1.2.13 negotiator@0.5.3
    Remediation: Open PR to patch negotiator@0.5.3.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-index@1.7.3 accepts@1.2.13 negotiator@0.5.3
    Remediation: Open PR to patch negotiator@0.5.3.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 compression@1.5.2 accepts@1.2.13 negotiator@0.5.3
    Remediation: Open PR to patch negotiator@0.5.3.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-index@1.7.3 accepts@1.2.13 negotiator@0.5.3
    Remediation: Open PR to patch negotiator@0.5.3.

Overview

negotiator is an HTTP content negotiator for Node.js.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) when parsing the Accept-Language HTTP header.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30 characters long string takes around ~52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause regex engines to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this and cause the service to consume CPU excessively, resulting in a Denial of Service.

Remediation

Upgrade negotiator to version 0.6.1 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: npm-user-validate
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 npm-user-validate@0.1.5

Overview

npm-user-validate provides user validation for npm.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The regex that validates user emails took exponentially longer to process long input strings beginning with @ characters.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30 characters long string takes around ~52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the regex engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade npm-user-validate to version 1.0.1 or higher.

References

high severity

Denial of Service (DoS)

  • Vulnerable module: qs
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 qs@0.6.6
    Remediation: Open PR to patch qs@0.6.6.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 qs@0.6.6
    Remediation: Open PR to patch qs@0.6.6.

Overview

qs is a querystring parser that supports nesting and arrays, with a depth limit.

Affected versions of this package are vulnerable to Denial of Service (DoS). During parsing, the qs module may create a sparse array (an array where most elements are not filled) and grow that array to the necessary size based on the indices used on it. An attacker can specify a large index value in a query string, forcing the server to allocate a correspondingly large array. Sufficiently large values can cause the server to run out of memory and crash, enabling a Denial-of-Service attack.
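
To make the attack shape concrete, here is a minimal hedged sketch (the variable names and the index value are made up for illustration). Recent qs releases cap indices with the arrayLimit option (default 20) and fall back to a plain object, so the snippet only triggers a large allocation on old, vulnerable versions such as the ones listed above.

var qs = require('qs');

// Attacker-controlled query string with a huge array index.
var hostile = 'items[10000000]=x';
var parsed = qs.parse(hostile);

// On patched qs the index is treated as a plain object key, so no large
// array is allocated; on vulnerable versions parsed.items is a sparse
// array with roughly 10 million slots, and larger indices exhaust memory.
console.log(Array.isArray(parsed.items));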

Remediation

Upgrade qs to version 1.0.0 or higher.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its intended and legitimate users.

Unlike other vulnerabilities, DoS attacks usually do not aim at breaching security. Rather, they are focused on making websites and services unavailable to genuine users resulting in downtime.

One popular Denial of Service vulnerability is DDoS (a Distributed Denial of Service), an attack that attempts to clog network pipes to the system by generating a large volume of traffic from many machines.

When it comes to open source libraries, DoS vulnerabilities allow attackers to trigger such a crash or crippling of the service by using a flaw either in the application code or from the use of open source libraries.

Two common types of DoS vulnerabilities:

  • High CPU/Memory Consumption - An attacker sending crafted requests that could cause the system to take a disproportionate amount of time to process. For example, commons-fileupload:commons-fileupload.

  • Crash - An attacker sending crafted requests that could cause the system to crash. For example, the npm ws package.

References

high severity

Prototype Override Protection Bypass

  • Vulnerable module: qs
  • Introduced through: browser-refresh@0.1.0-beta, freeze-theme-default@0.1.0-beta and others

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 qs@4.0.0
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 body-parser@1.13.3 qs@4.0.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 qs@4.0.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 body-parser@1.13.3 qs@4.0.0
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 qs@1.0.2
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 qs@0.6.6
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 qs@0.6.6

Overview

qs is a querystring parser that supports nesting and arrays, with a depth limit.

Affected versions of this package are vulnerable to Prototype Override Protection Bypass. By default, qs protects against attacks that attempt to overwrite an object's existing prototype properties, such as toString(), hasOwnProperty(), etc.

From qs documentation:

By default parameters that would overwrite properties on the object prototype are ignored, if you wish to keep the data from those fields either use plainObjects as mentioned above, or set allowPrototypes to true which will allow user input to overwrite those properties. WARNING It is generally a bad idea to enable this option as it can cause problems when attempting to use the properties that have been overwritten. Always be careful with this option.

Overwriting these properties can impact application logic, potentially allowing attackers to work around security controls, modify data, make the application unstable and more.

In versions of the package affected by this vulnerability, it is possible to circumvent this protection and overwrite prototype properties and functions by prefixing the name of the parameter with [ or ]. For example, qs.parse("]=toString") will return {toString = true}; as a result, calling toString() on the object will throw an exception.

Example:

qs.parse('toString=foo', { allowPrototypes: false })
// {}

qs.parse("]=toString", { allowPrototypes: false })
// {toString = true} <== prototype overwritten

For more information, you can check out our blog.

Disclosure Timeline

  • February 13th, 2017 - Reported the issue to package owner.
  • February 13th, 2017 - Issue acknowledged by package owner.
  • February 16th, 2017 - Partial fix released in versions 6.0.3, 6.1.1, 6.2.2, 6.3.1.
  • March 6th, 2017 - Final fix released in versions 6.4.0, 6.3.2, 6.2.3, 6.1.2 and 6.0.4.

Remediation

Upgrade qs to version 6.0.4, 6.1.2, 6.2.3, 6.3.2 or higher.

References

  • GitHub Commit
  • GitHub Issue

high severity

Symlink File Overwrite

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
    Remediation: Open PR to patch tar@0.1.20.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20
    Remediation: Open PR to patch tar@0.1.20.

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Symlink File Overwrite. It does not properly normalize symbolic links pointing to targets outside the extraction root. As a result, packages may hold symbolic links to parent and sibling directories and overwrite those files when the package is extracted.
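
As a loose illustration of the underlying problem (this is a generic path-containment check, not the tar package's actual fix), extraction code has to verify that every entry and link target still resolves inside the extraction root:

const path = require('path');

// Returns true only if entryPath stays inside root once resolved.
function isInsideRoot(root, entryPath) {
  const resolved = path.resolve(root, entryPath);
  return resolved === root || resolved.startsWith(root + path.sep);
}

console.log(isInsideRoot('/tmp/extract', 'docs/readme.md'));   // true
console.log(isInsideRoot('/tmp/extract', '../../etc/passwd')); // false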

Remediation

Upgrade tar to version 2.0.0 or higher.

References

high severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: tough-cookie
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 tough-cookie@0.9.15

Overview

tough-cookie is an RFC6265 Cookies and CookieJar module for Node.js.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). An attacker can provide a cookie, which nearly matches the pattern being matched. This will cause the regular expression matching to take a long time, all the while occupying the event loop and preventing it from processing other requests and making the server unavailable (a Denial of Service attack).

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the regex engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade tough-cookie to version 2.3.0 or higher.

References

high severity

Denial of Service (DoS)

  • Vulnerable module: ws
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32

Overview

ws is a WebSocket client and server implementation.

Affected versions of this package did not, by default, limit the size of an incoming payload before processing it. As a result, a very large payload (over 256MB in size) could lead to a failed allocation and crash the node process - enabling a Denial of Service attack.

While 256MB may seem excessive, note that the attack is likely to be sent from another server, not an end-user computer, using data-center connection speeds. At those speeds, a payload of this size can be transmitted in seconds.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its intended and legitimate users.

Unlike other vulnerabilities, DoS attacks usually do not aim at breaching security. Rather, they are focused on making websites and services unavailable to genuine users resulting in downtime.

One popular Denial of Service vulnerability is DDoS (a Distributed Denial of Service), an attack that attempts to clog network pipes to the system by generating a large volume of traffic from many machines.

When it comes to open source libraries, DoS vulnerabilities allow attackers to trigger such a crash or crippling of the service by using a flaw either in the application code or from the use of open source libraries.

Two common types of DoS vulnerabilities:

  • High CPU/Memory Consumption - An attacker sending crafted requests that could cause the system to take a disproportionate amount of time to process. For example, commons-fileupload:commons-fileupload.

  • Crash - An attacker sending crafted requests that could cause the system to crash. For example, the npm ws package.

Remediation

Update to version 1.1.1 or greater, which sets a default maxPayload of 100MB. If you cannot upgrade, apply a Snyk patch, or provide ws with options setting the maxPayload to an appropriate size that is smaller than 256MB.
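
As a rough sketch of the configuration workaround described above (the 1 MB cap is an arbitrary example value, not a recommendation from the advisory), a ws server can be given an explicit maxPayload so oversized messages are rejected instead of triggering a huge allocation:

const WebSocket = require('ws');

// Reject any incoming message larger than 1 MB.
const wss = new WebSocket.Server({ port: 8080, maxPayload: 1024 * 1024 });

wss.on('connection', function (socket) {
  socket.on('message', function (data) {
    console.log('received ' + data.length + ' bytes');
  });
});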

References

high severity

Denial of Service (DoS)

  • Vulnerable module: ws
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32

Overview

ws is a simple-to-use WebSocket client, server and console for Node.js.

Affected versions of this package are vulnerable to Denial of Service (DoS) attacks. A specially crafted value of the Sec-WebSocket-Extensions header that used Object.prototype property names as extension or parameter names could be used to make a ws server crash.

PoC:

const WebSocket = require('ws');
const net = require('net');

const wss = new WebSocket.Server({ port: 3000 }, function () {
  const payload = 'constructor';  // or ',;constructor'

  const request = [
    'GET / HTTP/1.1',
    'Connection: Upgrade',
    'Sec-WebSocket-Key: test',
    'Sec-WebSocket-Version: 8',
    `Sec-WebSocket-Extensions: ${payload}`,
    'Upgrade: websocket',
    '\r\n'
  ].join('\r\n');

  const socket = net.connect(3000, function () {
    socket.resume();
    socket.write(request);
  });
});

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its intended and legitimate users.

Unlike other vulnerabilities, DoS attacks usually do not aim at breaching security. Rather, they are focused on making websites and services unavailable to genuine users resulting in downtime.

One popular Denial of Service vulnerability is DDoS (a Distributed Denial of Service), an attack that attempts to clog network pipes to the system by generating a large volume of traffic from many machines.

When it comes to open source libraries, DoS vulnerabilities allow attackers to trigger such a crash or crippling of the service by using a flaw either in the application code or from the use of open source libraries.

Two common types of DoS vulnerabilities:

  • High CPU/Memory Consumption - An attacker sending crafted requests that could cause the system to take a disproportionate amount of time to process. For example, commons-fileupload:commons-fileupload.

  • Crash - An attacker sending crafted requests that could cause the system to crash. For example, the npm ws package.

Remediation

Upgrade ws to version 1.1.5, 3.3.1 or higher.

References

high severity

Uninitialized Memory Exposure

  • Vulnerable module: npmconf
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 npmconf@1.0.6

Overview

npmconf is a package to reintegrate directly into npm.

Affected versions of this package are vulnerable to Uninitialized Memory Exposure. It allocates uninitialized memory and writes its content to disk when a typed number is passed as input.

Note: npmconf is deprecated and should not be used. Note: this is vulnerable only for Node <= 4.

Details

The Buffer class on Node.js is a mutable array of binary data, and can be initialized with a string, array or number.

const buf1 = new Buffer([1,2,3]);
// creates a buffer containing [01, 02, 03]
const buf2 = new Buffer('test');
// creates a buffer containing ASCII bytes [74, 65, 73, 74]
const buf3 = new Buffer(10);
// creates a buffer of length 10

The first two variants simply create a binary representation of the value they received. The last one, however, pre-allocates a buffer of the specified size, making it useful when reading data from a stream. When using the number constructor of Buffer, it will allocate the memory, but will not fill it with zeros. Instead, the allocated buffer will hold whatever was in memory at the time. If the buffer is not zeroed by using buf.fill(0), it may leak sensitive information like keys, source code, and system info.
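
For comparison, modern Node.js provides explicit allocation APIs that avoid this pitfall. The short sketch below is a general illustration, not part of the npmconf advisory:

// Buffer.alloc zero-fills the memory it returns; Buffer.allocUnsafe behaves
// like the old number constructor and must be filled before it is exposed.
const zeroed = Buffer.alloc(10);        // always ten zero bytes
const unsafe = Buffer.allocUnsafe(10);  // may contain leftover process memory
unsafe.fill(0);                         // zero it explicitly before use

console.log(zeroed);
console.log(unsafe);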

Remediation

Upgrade npmconf to version 2.1.3 or higher.

References

high severity

Arbitrary File Overwrite

  • Vulnerable module: fstream
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 fstream@0.1.31
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20 fstream@0.1.31
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 fstream@0.1.31
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20 fstream@0.1.31

Overview

fstream is a package that supports advanced FS Streaming for Node.

Affected versions of this package are vulnerable to Arbitrary File Overwrite. Extracting a tarball that contains both a hardlink to a file that already exists on the system and a file that matches the hardlink will overwrite the system's file with the contents of the extracted file.

Remediation

Upgrade fstream to version 1.0.12 or higher.

References

high severity

Prototype Pollution

  • Vulnerable module: ini
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 ini@1.2.1

Overview

ini is an INI encoder/decoder for Node.

Affected versions of this package are vulnerable to Prototype Pollution. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context.

PoC by Eugene Lim

payload.ini

[__proto__]
polluted = "polluted"

poc.js:

var fs = require('fs')
var ini = require('ini')

var parsed = ini.parse(fs.readFileSync('./payload.ini', 'utf-8'))
console.log(parsed)
console.log(parsed.__proto__)
console.log(polluted)

> node poc.js
{}
{ polluted: 'polluted' }
{ polluted: 'polluted' }
polluted

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the Object.prototype are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows this high-level model:

merge (target, source)
  foreach property of source
    if property exists and is an object on both the target and the source
      merge(target[property], source[property])
    else
      target[property] = source[property]

When the source object contains a property named __proto__ defined with Object.defineProperty(), the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source being the attacker-defined object. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.
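
A concrete (deliberately vulnerable) JavaScript version of the high-level model above might look like the sketch below; it is a generic illustration, not code taken from hoek, lodash, or ini:

function merge(target, source) {
  for (var key in source) {
    if (typeof target[key] === 'object' && typeof source[key] === 'object') {
      merge(target[key], source[key]); // also recurses into an own "__proto__" key
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

// JSON.parse creates an own "__proto__" property, so the merge above walks
// into Object.prototype and copies the attacker's keys onto it.
var payload = JSON.parse('{"__proto__": {"polluted": true}}');
merge({}, payload);
console.log({}.polluted);          // true - every plain object now inherits it
delete Object.prototype.polluted;  // clean up after the demonstration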

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of “path”, they can set this value to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (Origin: Client) - This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, and the codebase at any point relies on someobject.toString(), it will fail.

  • Remote Code Execution (Origin: Client) - Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code.

  • Property Injection (Origin: Client) - The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true, they can then achieve admin privileges.

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype - use Object.freeze(Object.prototype).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice use Map instead of Object.

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018.

Remediation

Upgrade ini to version 1.3.6 or higher.

References

medium severity

Arbitrary Code Injection

  • Vulnerable module: morgan
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 morgan@1.6.1
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 morgan@1.6.1

Overview

morgan is an HTTP request logger middleware for Node.js.

Affected versions of this package are vulnerable to Arbitrary Code Injection. An attacker could use the format parameter to inject arbitrary commands.
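
As a hedged illustration of the risky pattern (the userSuppliedFormat variable below is hypothetical and not from the advisory), the format argument must never be built from request input on affected versions:

const morgan = require('morgan');

// Safe: a fixed format chosen by the developer.
const safeLogger = morgan(':method :url :status - :response-time ms');

// Unsafe on affected versions: morgan compiles the format string into a
// logging function, so attacker-controlled input here can inject code.
// const unsafeLogger = morgan(userSuppliedFormat); // never do this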

Remediation

Upgrade morgan to version 1.9.1 or higher.

References

medium severity

npm Token Leak

  • Vulnerable module: npm
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000

Overview

This vulnerability could cause the unintentional leakage of bearer tokens. A design flaw in npm's registry allows an attacker to set up an HTTP server that could collect authentication information, and then use this authentication information to impersonate the users whose tokens they collected. The attacker could do anything the compromised users could do, including publishing new versions of packages.

Details

The primary npm registry has, since late 2014, used HTTP bearer tokens to authenticate requests from the npm command-line interface. Due to a design flaw in the CLI, these bearer tokens were sent with every request made by logged-in users, regardless of the destination of the request. (The bearer tokens should only have been included for requests made against a registry or registries used for the current install.)

This flaw allows an attacker to set up an HTTP server that could collect authentication information. They could then use this information to impersonate the users whose tokens they collected. This impersonation would allow them to do anything the compromised users could do, including publishing new versions of packages.

With the fixes npm have released, the CLI will only send bearer tokens with requests made against a registry. npm’s CLI team believe that the fix won’t break any existing registry setups. However, it’s possible the change will be breaking in some cases, due to the large number of registry software suites used.

Remediation

  1. Upgrade npm to ">= 3.8.3 || >= 2.15.1"
  2. Invalidate your current npm bearer tokens

References

medium severity

Timing Attack

  • Vulnerable module: http-signature
  • Introduced through: rapido@1.1.12 and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 http-signature@0.10.1
    Remediation: Open PR to patch http-signature@0.10.1.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 http-signature@0.10.1
    Remediation: Open PR to patch http-signature@0.10.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 http-signature@0.10.1
    Remediation: Open PR to patch http-signature@0.10.1.

Overview

http-signature is a reference implementation of Joyent's HTTP Signature scheme.

Affected versions of the package are vulnerable to Timing Attacks due to time-variable comparison of signatures.

The library implemented a character-by-character comparison, similar to the built-in string comparison mechanism, ===, rather than a constant-time string comparison. As a result, the comparison will fail faster when the first characters in the signature are incorrect. An attacker can use this difference to perform a timing attack, essentially allowing them to guess the signature one character at a time.
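
The sketch below contrasts the two approaches, using Node's built-in crypto.timingSafeEqual as the constant-time alternative; it is a generic illustration, not the fix applied in http-signature:

const crypto = require('crypto');

// Leaks timing information: returns as soon as the first character differs.
function naiveEqual(a, b) {
  if (a.length !== b.length) return false;
  for (let i = 0; i < a.length; i++) {
    if (a[i] !== b[i]) return false;
  }
  return true;
}

// Compares in constant time once the lengths are known to match.
function constantTimeEqual(a, b) {
  const bufA = Buffer.from(a);
  const bufB = Buffer.from(b);
  if (bufA.length !== bufB.length) return false;
  return crypto.timingSafeEqual(bufA, bufB);
}

console.log(naiveEqual('signature-a', 'signature-b'));        // false
console.log(constantTimeEqual('signature-a', 'signature-b')); // false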

You can read more about timing attacks in Node.js on the Snyk blog.

Remediation

Upgrade http-signature to version 1.0.0 or higher.

References

medium severity

Denial of Service (DoS)

  • Vulnerable module: qs
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 qs@0.6.6
    Remediation: Open PR to patch qs@0.6.6.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 qs@0.6.6
    Remediation: Open PR to patch qs@0.6.6.

Overview

qs is a querystring parser that supports nesting and arrays, with a depth limit.

Affected versions of this package are vulnerable to Denial of Service (DoS). When parsing a string representing a deeply nested object, qs will block the event loop for long periods of time. Such a delay may hold up the server's resources, keeping it from processing other requests in the meantime, thus enabling a Denial-of-Service attack.
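
The hedged sketch below shows the shape of such an input (the nesting depth is an arbitrary example value). Recent qs versions limit nesting with the depth option (default 5), so a patched installation only produces a shallow object, while the vulnerable versions listed above can be kept busy with far deeper payloads:

const qs = require('qs');

// Attacker-controlled query string: a[b][b][b]...[b]=1, nested 20 levels deep.
const levels = 20;
const hostile = 'a' + '[b]'.repeat(levels) + '=1';

console.log(hostile);
console.log(JSON.stringify(qs.parse(hostile)));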

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the regex engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade qs to version 1.0.0 or higher.

References

medium severity

Remote Memory Exposure

  • Vulnerable module: ws
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32
    Remediation: Open PR to patch ws@0.4.32.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32
    Remediation: Open PR to patch ws@0.4.32.

Overview

ws is a simple-to-use WebSocket client, server and console for Node.js. Affected versions of the package are vulnerable to Uninitialized Memory Exposure.

A client-side memory disclosure vulnerability exists in the ping functionality of the ws service. When a client sends a ping request and provides an integer value as ping data, it will result in leaking an uninitialized memory buffer.

This is a result of unobstructed use of the Buffer constructor, whose insecure default constructor increases the odds of memory leakage.

ws's ping function uses the default Buffer constructor as-is, making it easy to append uninitialized memory to an existing list. If the value of the buffer list is exposed to users, it may expose raw memory, potentially holding secrets, private data and code.

Proof of Concept:

var ws = require('ws')

var server = new ws.Server({ port: 9000 })
var client = new ws('ws://localhost:9000')

client.on('open', function () {
  console.log('open')
  client.ping(50) // this makes the client allocate an uninitialized buffer of 50 bytes and send it to the server

  client.on('pong', function (data) {
    console.log('got pong')
    console.log(data)
  })
})

Details

The Buffer class on Node.js is a mutable array of binary data, and can be initialized with a string, array or number.

const buf1 = new Buffer([1,2,3]);
// creates a buffer containing [01, 02, 03]
const buf2 = new Buffer('test');
// creates a buffer containing ASCII bytes [74, 65, 73, 74]
const buf3 = new Buffer(10);
// creates a buffer of length 10

The first two variants simply create a binary representation of the value they received. The last one, however, pre-allocates a buffer of the specified size, making it useful when reading data from a stream. When using the number constructor of Buffer, it will allocate the memory, but will not fill it with zeros. Instead, the allocated buffer will hold whatever was in memory at the time. If the buffer is not zeroed by using buf.fill(0), it may leak sensitive information like keys, source code, and system info.

Similar vulnerabilities were discovered in request, mongoose, ws and sequelize.

References

medium severity

Prototype Pollution

  • Vulnerable module: hoek
  • Introduced through: rapido@1.1.12 and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 hawk@1.0.0 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 hawk@1.0.0 boom@0.4.2 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 hawk@1.0.0 sntp@0.2.4 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 hawk@1.0.0 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 hawk@1.0.0 cryptiles@0.2.2 boom@0.4.2 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 hawk@1.0.0 boom@0.4.2 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 hawk@1.0.0 sntp@0.2.4 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 hawk@1.0.0 cryptiles@0.2.2 boom@0.4.2 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 hawk@1.1.1 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 hawk@1.1.1 boom@0.4.2 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 hawk@1.1.1 sntp@0.2.4 hoek@0.9.1
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 hawk@1.1.1 cryptiles@0.2.2 boom@0.4.2 hoek@0.9.1

Overview

hoek is a package of utility methods for the hapi ecosystem.

Affected versions of this package are vulnerable to Prototype Pollution. The utility functions allow modification of the Object prototype. If an attacker can control part of the structure passed to these functions, they could add or modify an existing property.

PoC by Olivier Arteau (HoLyVieR)

var Hoek = require('hoek');
var malicious_payload = '{"__proto__":{"oops":"It works !"}}';

var a = {};
console.log("Before : " + a.oops);
Hoek.merge({}, JSON.parse(malicious_payload));
console.log("After : " + a.oops);

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the regex engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade hoek to version 4.2.1, 5.0.3 or higher.

References

medium severity

Cross-site Scripting (XSS)

  • Vulnerable module: ejs
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 ejs@0.8.8

Overview

ejs is a popular JavaScript templating engine. Affected versions of the package are vulnerable to Cross-site Scripting: under certain conditions, an attacker can control and override the filename option, causing its value to be rendered as is, without escaping. You can read more about this vulnerability on the Snyk blog.

There are also Remote Code Execution and Denial of Service vulnerabilities caused by the same behaviour.

Details

ejs provides a few different options for you to render a template, two being very similar: ejs.render() and ejs.renderFile(). The only difference is that render expects a string to be used as the template, while renderFile expects a path to a template file.

Both functions can be invoked in two ways. The first is calling them with template, data, and options:

ejs.render(str, data, options);

ejs.renderFile(filename, data, options, callback)

The second way would be by calling only the template and data, while ejs lets the options be passed as part of the data:

ejs.render(str, dataAndOptions);

ejs.renderFile(filename, dataAndOptions, callback)

If used with a variable list supplied by the user (e.g. by reading it from the URI with qs or equivalent), an attacker can control ejs options. This includes the filename option, which will be rendered as is when an error occurs during rendering.

ejs.renderFile('my-template', {filename:'<script>alert(1)</script>'}, callback);

The fix introduced in version 2.5.3 blacklisted root options from options passed via the data object.

Disclosure Timeline

  • November 28th, 2016 - Reported the issue to package owner.
  • November 28th, 2016 - Issue acknowledged by package owner.
  • December 06th, 2016 - Issue fixed and version 2.5.5 released.

Remediation

The vulnerability can be resolved either by using the GitHub integration to generate a pull request from your dashboard or by running snyk wizard from the command-line interface. Otherwise, upgrade ejs to version 2.5.5 or higher.

References

medium severity

Denial of Service (DoS)

  • Vulnerable module: ejs
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 ejs@0.8.8

Overview

ejs is a popular JavaScript templating engine. Affected versions of the package are vulnerable to Denial of Service: under certain conditions, an attacker can control and override the localNames option, causing the renderer to crash. You can read more about this vulnerability on the Snyk blog.

There are also Remote Code Execution and Cross-site Scripting vulnerabilities caused by the same behaviour.

Details

ejs provides a few different options for you to render a template, two being very similar: ejs.render() and ejs.renderFile(). The only difference is that render expects a string to be used as the template, while renderFile expects a path to a template file.

Both functions can be invoked in two ways. The first is calling them with template, data, and options:

ejs.render(str, data, options);

ejs.renderFile(filename, data, options, callback)

The second way would be by calling only the template and data, while ejs lets the options be passed as part of the data:

ejs.render(str, dataAndOptions);

ejs.renderFile(filename, dataAndOptions, callback)

If used with a variable list supplied by the user (e.g. by reading it from the URI with qs or equivalent), an attacker can control ejs options. This includes the localNames option, which will cause the renderer to crash.

ejs.renderFile('my-template', {localNames:'try'}, callback);

The fix introduced in version 2.5.3 blacklisted root options from options passed via the data object.

Disclosure Timeline

  • November 28th, 2016 - Reported the issue to package owner.
  • November 28th, 2016 - Issue acknowledged by package owner.
  • December 06th, 2016 - Issue fixed and version 2.5.5 released.

Remediation

The vulnerability can be resolved either by using the GitHub integration to generate a pull request from your dashboard or by running snyk wizard from the command-line interface. Otherwise, upgrade ejs to version 2.5.5 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: marked
  • Introduced through: marked@0.3.19

Detailed paths

  • Introduced through: freeze@0.2.2-beta marked@0.3.19
    Remediation: Upgrade to marked@1.1.1.

Overview

marked is a low-level compiler for parsing markdown without caching or blocking for long periods of time.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The em regex within the src/rules.js file has multiple unused capture groups, which could lead to a denial-of-service attack if user input can reach it.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the regex engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade marked to version 1.1.1 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: tough-cookie
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 tough-cookie@0.9.15

Overview

tough-cookie is an RFC6265 Cookies and Cookie Jar module for Node.js.

Affected versions of this package are vulnerable to Regular expression Denial of Service (ReDoS) attacks. An attacker may pass a specially crafted cookie, causing the server to hang.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the regex engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade to version 2.3.3 or newer.

References

medium severity

Prototype Pollution

  • Vulnerable module: highlight.js
  • Introduced through: highlight.js@8.9.1

Detailed paths

  • Introduced through: freeze@0.2.2-beta highlight.js@8.9.1
    Remediation: Upgrade to highlight.js@9.18.2.

Overview

highlight.js is a syntax highlighter written in JavaScript. It works in the browser as well as on the server. It works with pretty much any markup, doesn’t depend on any framework, and has automatic language detection.

Affected versions of this package are vulnerable to Prototype Pollution. A malicious HTML code block can be crafted that will result in prototype pollution of the base object's prototype during highlighting. If you allow users to insert custom HTML code blocks into your page/app by parsing Markdown code blocks (or similar) and do not filter the language names the user can provide, you may be vulnerable.
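
One way to apply that filtering is sketched below, assuming the highlight.js 8/9-style API where highlight(lang, code) takes the language name first; getLanguage rejects names the library does not recognize. This is a generic mitigation sketch, not the upstream fix:

const hljs = require('highlight.js');

function safeHighlight(lang, code) {
  // Unknown or attacker-crafted language names fall back to auto-detection
  // instead of being passed through verbatim.
  if (lang && hljs.getLanguage(lang)) {
    return hljs.highlight(lang, code).value;
  }
  return hljs.highlightAuto(code).value;
}

console.log(safeHighlight('javascript', 'const answer = 42;'));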

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the Object.prototype are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows this high-level model:

merge (target, source)
  foreach property of source
    if property exists and is an object on both the target and the source
      merge(target[property], source[property])
    else
      target[property] = source[property]

When the source object contains a property named __proto__ defined with Object.defineProperty(), the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source being the attacker-defined object. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of “path”, they can set it to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.
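
A minimal sketch of this second pattern, using a deliberately naive setByPath helper (a hypothetical function written for illustration; it is not taken from any particular library):

// Naive "define property by path" helper of the form theFunction(object, path, value).
function setByPath(obj, path, value) {
  const keys = path.split('.');
  let current = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    if (typeof current[keys[i]] !== 'object' || current[keys[i]] === null) {
      current[keys[i]] = {};
    }
    current = current[keys[i]];
  }
  current[keys[keys.length - 1]] = value;
}

// An attacker-controlled path walks up into Object.prototype.
setByPath({}, '__proto__.myValue', 'polluted');
console.log({}.myValue); // 'polluted'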

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (Origin: Client). This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, any code that relies on someobject.toString() will fail.
  • Remote Code Execution (Origin: Client). Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object and then executes that evaluation, for example eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr, they are likely to be able to leverage this in order to execute code.
  • Property Injection (Origin: Client). The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges via someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to true, they can achieve admin privileges.

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype: use Object.freeze(Object.prototype) (see the sketch after this list).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice, use Map instead of Object.
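
A minimal sketch of what the first, fourth and fifth points can look like in practice (illustrative only; adapt to your own codebase):

// 1. Freeze the prototype so pollution attempts silently fail (or throw in strict mode).
Object.freeze(Object.prototype);

// 4. Prototype-less objects have no chain to pollute; "__proto__" is just a normal key.
const bag = Object.create(null);
bag['__proto__'] = 'harmless value';

// 5. Map keeps attacker-supplied keys away from object prototypes entirely.
const settings = new Map();
settings.set('__proto__', 'also harmless');
console.log(bag['__proto__'], settings.get('__proto__'));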

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018

Remediation

Upgrade highlight.js to version 9.18.2, 10.1.2 or higher.

References

medium severity

Prototype Pollution

  • Vulnerable module: minimist
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 mkdirp@0.5.1 minimist@0.0.8
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 mkdirp@0.5.1 minimist@0.0.8
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine@0.1.11 mocha@1.21.5 mkdirp@0.5.0 minimist@0.0.8

Overview

minimist is a parse argument options module.

Affected versions of this package are vulnerable to Prototype Pollution. The library could be tricked into adding or modifying properties of Object.prototype using a constructor or __proto__ payload.

PoC by Snyk

require('minimist')('--__proto__.injected0 value0'.split(' '));
console.log(({}).injected0 === 'value0'); // true

require('minimist')('--constructor.prototype.injected1 value1'.split(' '));
console.log(({}).injected1 === 'value1'); // true

Details

Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including special attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, the prototype of the base object by injecting other values. Properties on Object.prototype are then inherited by all JavaScript objects through the prototype chain. When that happens, the result is either denial of service, by triggering JavaScript exceptions, or tampering with the application's logic to force the code path the attacker injects, which can lead to remote code execution.

There are two main ways in which the pollution of prototypes occurs:

  • Unsafe Object recursive merge
  • Property definition by path

Unsafe Object recursive merge

The logic of a vulnerable recursive merge function follows the following high-level model:

merge (target, source)

  foreach property of source

    if property exists and is an object on both the target and the source

      merge(target[property], source[property])

    else

      target[property] = source[property]

When the source object contains a property named __proto__ defined with Object.defineProperty(), the condition that checks whether the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source being the object the attacker defined. Properties are then copied onto the Object prototype.

Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({},source).

lodash and Hoek are examples of libraries susceptible to recursive merge attacks.

Property definition by path

There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value)

If the attacker can control the value of “path”, they can set it to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.

Types of attacks

There are a few methods by which Prototype Pollution can be manipulated:

  • Denial of service (DoS) (Origin: Client). This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object. In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, any code that relies on someobject.toString() will fail.
  • Remote Code Execution (Origin: Client). Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object and then executes that evaluation, for example eval(someobject.someattr). In this case, if the attacker pollutes Object.prototype.someattr, they are likely to be able to leverage this in order to execute code.
  • Property Injection (Origin: Client). The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges via someuser.isAdmin, then when the attacker pollutes Object.prototype.isAdmin and sets it to true, they can achieve admin privileges.

Affected environments

The following environments are susceptible to a Prototype Pollution attack:

  • Application server
  • Web server

How to prevent

  1. Freeze the prototype: use Object.freeze(Object.prototype).
  2. Require schema validation of JSON input.
  3. Avoid using unsafe recursive merge functions.
  4. Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
  5. As a best practice, use Map instead of Object.

For more information on this vulnerability type:

Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018

Remediation

Upgrade minimist to version 0.2.1, 1.2.3 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: marked
  • Introduced through: marked@0.3.19

Detailed paths

  • Introduced through: freeze@0.2.2-beta marked@0.3.19
    Remediation: Upgrade to marked@0.6.2.

Overview

marked is a low-level compiler for parsing markdown without caching or blocking for long periods of time.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The inline.text regex may take quadratic time to scan for potential email addresses starting at every point.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade marked to version 0.6.2 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: marked
  • Introduced through: marked@0.3.19

Detailed paths

  • Introduced through: freeze@0.2.2-beta marked@0.3.19
    Remediation: Upgrade to marked@0.4.0.

Overview

marked is a low-level compiler for parsing markdown without caching or blocking for long periods of time.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). A Denial of Service condition could be triggered through exploitation of the heading regex.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade marked to version 0.4.0 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: ms
  • Introduced through: freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine@0.1.11 mocha@1.21.5 debug@2.0.0 ms@0.6.2
    Remediation: Open PR to patch ms@0.6.2.

Overview

ms is a tiny millisecond conversion utility.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks when converting a time period string (e.g. "2 days", "1h") into a milliseconds integer. A malicious user could pass extremely long strings to ms(), causing the server to take a long time to process them and blocking the event loop for that extended period.
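
If an upgrade cannot happen immediately, a simple stop-gap (shown below as an illustrative sketch, not part of the ms API; the 100-character cap is an assumption) is to reject unusually long inputs before they reach the vulnerable regular expression:

const ms = require('ms');

function safeMs(input) {
  // The regex only misbehaves on long inputs, so a simple length cap
  // keeps parsing time bounded.
  if (typeof input !== 'string' || input.length > 100) {
    throw new TypeError('refusing to parse an unusually long time string');
  }
  return ms(input);
}

console.log(safeMs('2 days')); // 172800000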

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade ms to version 0.7.1 or higher.

References

medium severity

Access Restriction Bypass

  • Vulnerable module: npm
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000

Overview

npm is a package manager for JavaScript.

Affected versions of this package are vulnerable to Access Restriction Bypass. It might allow local users to bypass intended filesystem access restrictions because the ownership of the /etc and /usr directories is changed unexpectedly, related to a "correctMkdir" issue.

Remediation

Upgrade npm to version 5.7.1 or higher.

References

medium severity

Insertion of Sensitive Information into Log File

  • Vulnerable module: npm
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000

Overview

npm is a package manager for JavaScript.

Affected versions of this package are vulnerable to Insertion of Sensitive Information into Log File. The CLI supports URLs like <protocol>://[<user>[:<password>]@]<hostname>[:<port>][:][/]<path>. The password value is not redacted and is printed to stdout and also to any generated log files.

Remediation

Upgrade npm to version 6.14.6 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: semver
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 init-package-json@0.1.2 semver@3.0.1
    Remediation: Open PR to patch semver@3.0.1.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 read-installed@2.0.7 semver@3.0.1
    Remediation: Open PR to patch semver@3.0.1.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 semver@2.2.1
    Remediation: Open PR to patch semver@2.2.1.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 semver@2.3.2
    Remediation: Open PR to patch semver@2.3.2.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 npm-registry-client@2.0.7 semver@2.3.2
    Remediation: Open PR to patch semver@2.3.2.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 npmconf@1.0.6 semver@2.3.2
    Remediation: Open PR to patch semver@2.3.2.

Overview

semver is a semantic version parser used by npm.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The semver module uses regular expressions when parsing a version string. For a carefully crafted input, the time it takes to process these regular expressions is not linear in the length of the input. Since the semver module did not enforce a limit on the version string length, an attacker could provide a long string that would take up a large amount of resources, potentially taking a server down. This issue therefore enables a potential Denial of Service attack.
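
Until the dependency tree can be moved to a fixed release, a minimal sketch in the spirit of the issue described above is to cap the version string length before parsing (the 256-character limit here is an assumption chosen for illustration):

const semver = require('semver');

function safeValid(version) {
  // Reject absurdly long "version" strings before the regex ever sees them.
  if (typeof version !== 'string' || version.length > 256) {
    return null;
  }
  return semver.valid(version);
}

console.log(safeValid('1.2.3'));            // '1.2.3'
console.log(safeValid('1'.repeat(100000))); // null, rejected without parsing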

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade semver to version 4.3.2 or higher.

References

medium severity

Insecure Defaults

  • Vulnerable module: socket.io
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19

Overview

socket.io is a node.js realtime framework server.

Affected versions of this package are vulnerable to Insecure Defaults due to CORS Misconfiguration. All domains are whitelisted by default.
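
If upgrading is not immediately possible, the exposure can be narrowed by configuring the allowed origins explicitly instead of relying on the permissive default. A minimal sketch against the socket.io 2.x API (the origins option and the hostname shown here are illustrative; verify them against the exact version you run):

const http = require('http');
const socketIo = require('socket.io');

const server = http.createServer();

// Only accept connections originating from a known host,
// instead of the default that whitelists every origin.
const io = socketIo(server, { origins: 'https://app.example.com:443' });

io.on('connection', function (socket) {
  socket.emit('hello', 'connected');
});

server.listen(3000);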

Remediation

Upgrade socket.io to version 2.4.0 or higher.

References

medium severity
new

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: uglify-js
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 uglify-js@1.2.5
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 uglify-js@1.2.5
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-minify-js@0.1.0-beta uglify-js@1.3.5

Overview

uglify-js is a JavaScript parser, minifier, compressor and beautifier toolkit.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via the string_template and the decode_template functions.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade uglify-js to version 3.14.3 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: uglify-js
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 uglify-js@1.2.5
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 uglify-js@1.2.5
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-minify-js@0.1.0-beta uglify-js@1.3.5

Overview

The parse() function in the uglify-js package prior to version 2.6.0 is vulnerable to regular expression denial of service (ReDoS) attacks when long inputs of certain patterns are processed.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade to version 2.6.0 or greater. If a direct dependency update is not possible, use snyk wizard to patch this vulnerability.

References

medium severity

Insecure Randomness

  • Vulnerable module: ws
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32
    Remediation: Open PR to patch ws@0.4.32.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32
    Remediation: Open PR to patch ws@0.4.32.

Overview

ws is a simple to use websocket client, server and console for node.js.

Affected versions of the package use the cryptographically insecure Math.random(), which can produce predictable values and should not be used in security-sensitive contexts.

Details

Computers are deterministic machines, and as such are unable to produce true randomness. Pseudo-Random Number Generators (PRNGs) approximate randomness algorithmically, starting with a seed from which subsequent values are calculated.

There are two types of PRNGs: statistical and cryptographic. Statistical PRNGs provide useful statistical properties, but their output is highly predictable and forms an easy to reproduce numeric stream that is unsuitable for use in cases where security depends on generated values being unpredictable. Cryptographic PRNGs address this problem by generating output that is more difficult to predict. For a value to be cryptographically secure, it must be impossible or highly improbable for an attacker to distinguish between it and a truly random value. In general, if a PRNG algorithm is not advertised as being cryptographically secure, then it is probably a statistical PRNG and should not be used in security-sensitive contexts.

You can read more about node's insecure Math.random() in Mike Malone's post.
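
For any value that must be unguessable (tokens, nonces, WebSocket mask keys), Node's built-in crypto module is the straightforward alternative; a minimal sketch:

const crypto = require('crypto');

// Statistical PRNG: fine for jitter or sampling, but predictable, so never for secrets.
const weakToken = Math.random().toString(36).slice(2);

// Cryptographically secure PRNG: suitable for security-sensitive values.
const strongToken = crypto.randomBytes(16).toString('hex');

console.log(weakToken, strongToken);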

Remediation

Upgrade ws to version 1.1.2 or higher.

References

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: ws
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta socket.io@0.9.19 socket.io-client@0.9.16 ws@0.4.32

Overview

ws is a simple to use websocket client, server and console for node.js.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). A specially crafted value of the Sec-Websocket-Protocol header can be used to significantly slow down a ws server.

PoC

for (const length of [1000, 2000, 4000, 8000, 16000, 32000]) {
  const value = 'b' + ' '.repeat(length) + 'x';
  const start = process.hrtime.bigint();

  value.trim().split(/ *, */);

  const end = process.hrtime.bigint();

  console.log('length = %d, time = %f ns', length, end - start);
}

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade ws to version 7.4.6, 6.2.2, 5.2.3 or higher.

References

medium severity

Remote Memory Exposure

  • Vulnerable module: request
  • Introduced through: freeze-theme-default@0.1.0-beta and rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0
    Remediation: Open PR to patch request@2.40.0.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0
    Remediation: Open PR to patch request@2.30.0.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0
    Remediation: Open PR to patch request@2.34.0.

Overview

request is a simplified http request client.

Affected versions of this package are vulnerable to Remote Memory Exposure. A potential remote memory exposure vulnerability exists in request. If a request uses a multipart attachment and the body option is of type number with value X, then X bytes of uninitialized memory will be sent in the body of the request.

Note that while the impact of this vulnerability is high (memory exposure), exploiting it is likely difficult, as the attacker needs to somehow control the body type of the request. One potential exploit scenario is when a request is composed based on JSON input, including the body type, allowing a malicious JSON to trigger the memory leak.

Details

Constructing a Buffer class with integer N creates a Buffer of length N with non-zeroed memory. Example:

var x = new Buffer(100); // uninitialized Buffer of length 100
// vs
var x = new Buffer('100'); // initialized Buffer with value of '100'

Initializing a multipart body in such manner will cause uninitialized memory to be sent in the body of the request.

Proof of concept

var http = require('http')
var request = require('request')

http.createServer(function (req, res) {
  var data = ''
  req.setEncoding('utf8')
  req.on('data', function (chunk) {
    console.log('data')
    data += chunk
  })
  req.on('end', function () {
    // this will print uninitialized memory from the client
    console.log('Client sent:\n', data)
  })
  res.end()
}).listen(8000)

request({
  method: 'POST',
  uri: 'http://localhost:8000',
  multipart: [{ body: 1000 }]
},
function (err, res, body) {
  if (err) return console.error('upload failed:', err)
  console.log('sent')
})
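
On Node versions that provide them (4.5 and later), the explicit allocation APIs remove the ambiguity that makes this leak possible; a minimal sketch, separate from the proof of concept above:

// Zero-filled buffer: nothing left over from previously used memory can leak.
var zeroed = Buffer.alloc(1000);

// If a value arrives as a number but a string body was intended, convert it explicitly.
var asString = Buffer.from(String(1000));

console.log(zeroed.length, asString.toString()); // 1000 '1000'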

Remediation

Upgrade request to version 2.68.0 or higher.

References

medium severity

Uninitialized Memory Exposure

  • Vulnerable module: tunnel-agent
  • Introduced through: freeze-theme-default@0.1.0-beta and rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 tunnel-agent@0.4.3
    Remediation: Open PR to patch tunnel-agent@0.4.3.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 tunnel-agent@0.3.0
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 tunnel-agent@0.3.0

Overview

tunnel-agent is HTTP proxy tunneling agent. Affected versions of the package are vulnerable to Uninitialized Memory Exposure.

A possible memory disclosure vulnerability exists when a value of type number is used to set the proxy.auth option of a request, resulting in possible uninitialized memory exposure in the request body.

This is a result of unguarded use of the Buffer constructor, whose number-argument form allocates uninitialized memory and therefore increases the odds of memory leakage.

Details

Constructing a Buffer class with integer N creates a Buffer of length N with raw (not "zero-ed") memory.

In the following example, the first call would allocate 100 bytes of memory, while the second example will allocate the memory needed for the string "100":

// uninitialized Buffer of length 100
x = new Buffer(100);
// initialized Buffer with value of '100'
x = new Buffer('100');

tunnel-agent's request construction uses the default Buffer constructor as-is, making it easy to append uninitialized memory to an existing list. If the value of the buffer list is exposed to users, it may expose raw server side memory, potentially holding secrets, private data and code. This is a similar vulnerability to the infamous Heartbleed flaw in OpenSSL.

Proof of concept by ChALkeR

require('request')({
  method: 'GET',
  uri: 'http://www.example.com',
  tunnel: true,
  proxy:{
      protocol: 'http:',
      host:"127.0.0.1",
      port:8080,
      auth:80
  }
});

You can read more about the insecure Buffer behavior on our blog.

Similar vulnerabilities were discovered in request, mongoose, ws and sequelize.

Remediation

Upgrade tunnel-agent to version 0.6.0 or higher. Note This is vulnerable only for Node <=4

References

medium severity

Time of Check Time of Use (TOCTOU)

  • Vulnerable module: chownr
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 chownr@0.0.2
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 npm-registry-client@2.0.7 chownr@0.0.2

Overview

chownr is a package that takes the same arguments as fs.chown()

Affected versions of this package are vulnerable to Time of Check Time of Use (TOCTOU) attacks.

It does not dereference symbolic links and changes the owner of the link instead; because the symlink check and the subsequent ownership change are separate steps, it can be tricked into descending into unintended trees if a non-symlink is replaced by a symlink at a critical moment:

      fs.lstat(pathChild, function(er, stats) {
        if (er)
          return cb(er)
        if (!stats.isSymbolicLink())
          chownr(pathChild, uid, gid, then)
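
For contrast, a minimal sketch (an illustration of the general idea, not the package's actual patch; the path and IDs are placeholders) of changing ownership without following symbolic links, using fs.lchown:

var fs = require('fs');

// fs.lchown operates on the link itself rather than whatever it points to,
// so a path swapped for a symlink between check and use cannot redirect the chown.
fs.lchown('/tmp/some-path', 1000, 1000, function (err) {
  if (err) throw err;
  console.log('ownership changed without following symlinks');
});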

Remediation

Upgrade chownr to version 1.1.0 or higher.

References

medium severity

Arbitrary Code Injection

  • Vulnerable module: ejs
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 ejs@0.8.8

Overview

ejs is a popular JavaScript templating engine.

Affected versions of this package are vulnerable to Arbitrary Code Injection via render and renderFile. If external input flows into the options parameter, an attacker is able to run arbitrary code. This includes the filename, compileDebug, and client options.

PoC

let ejs = require('ejs')
ejs.render('./views/test.ejs',{
    filename:'/etc/passwd\nfinally { this.global.process.mainModule.require(\'child_process\').execSync(\'touch EJS_HACKED\') }',
    compileDebug: true,
    message: 'test',
    client: true
})
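
Until the dependency can be upgraded, the practical mitigation is to make sure request-controlled values only ever reach the data argument, never the options argument; a minimal sketch:

let ejs = require('ejs')

// User input is confined to template data; options stay hard-coded by the application.
let userInput = 'world'
let html = ejs.render('<p>Hello, <%= name %>!</p>', { name: userInput }, {
    compileDebug: false,
    client: false
})
console.log(html) // <p>Hello, world!</p>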

Remediation

Upgrade ejs to version 3.1.6 or higher.

References

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: clean-css
  • Introduced through: freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 clean-css@2.2.23

Overview

clean-css is a fast and efficient CSS optimizer for Node.js platform and any modern browser.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks. This can cause an impact of about 10 seconds of matching time for input 70k characters long.

Disclosure Timeline

  • Feb 15th, 2018 - Initial Disclosure to package owner
  • Feb 20th, 2018 - Initial Response from package owner
  • Mar 6th, 2018 - Fix issued
  • Mar 7th, 2018 - Vulnerability published

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly thirty-five times as long as the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's  Number of steps
ACCCX               3              38
ACCCCX              4              71
ACCCCCX             5              136
ACCCCCCCCCCCCCCX    14             65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to decide that the string doesn't match. These extreme situations cause the regex engine to work very slowly (the number of steps grows exponentially with input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade clean-css to version 4.1.11 or higher.

References

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: debug
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 debug@2.2.0
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 send@0.13.0 debug@2.2.0
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 body-parser@1.13.3 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 compression@1.5.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 connect-timeout@1.6.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 express-session@1.11.3 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 finalhandler@0.4.0 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 morgan@1.6.1 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-index@1.7.3 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 send@0.13.0 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 body-parser@1.13.3 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 compression@1.5.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 connect-timeout@1.6.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 express-session@1.11.3 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 finalhandler@0.4.0 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 morgan@1.6.1 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-index@1.7.3 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 debug@2.2.0
    Remediation: Open PR to patch debug@2.2.0.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta view-engine@0.1.11 mocha@1.21.5 debug@2.0.0
    Remediation: Open PR to patch debug@2.0.0.

Overview

debug is a JavaScript debugging utility modelled after Node.js core's debugging technique.

debug uses printf-style formatting. Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks via the %o formatter (pretty-print an object all on a single line). It used the regular expression /\s*\n\s*/g to strip whitespace and replace newlines with spaces, in order to join the data into a single line. The impact is low: roughly 2 seconds of matching time for data 50k characters long.
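
As a rough illustration, the slowdown can be reproduced by exercising the quoted regex directly (a minimal sketch that does not go through debug itself; the payload below is an assumption chosen only to trigger the backtracking):

// A long run of whitespace with no newline forces the engine to backtrack
// through the remainder of the string at every starting position.
const payload = ' '.repeat(50000);
console.time('strip-newlines');
payload.replace(/\s*\n\s*/g, ' ');
console.timeEnd('strip-newlines');   // on the order of seconds on a typical machine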

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them could lead to a match. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must take to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see whether the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), and an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade debug to version 2.6.9, 3.1.0 or higher.

References

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: hawk
  • Introduced through: freeze-theme-default@0.1.0-beta and rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 hawk@1.1.1
    Remediation: Open PR to patch hawk@1.1.1.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 hawk@1.0.0
    Remediation: Open PR to patch hawk@1.0.0.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 hawk@1.0.0
    Remediation: Open PR to patch hawk@1.0.0.

Overview

hawk is an HTTP authentication scheme using a message authentication code (MAC) algorithm to provide partial HTTP request cryptographic verification.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them could lead to a match. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must take to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see whether the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), and an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

You can read more about Regular Expression Denial of Service (ReDoS) on our blog.

References

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: mime
  • Introduced through: browser-refresh@0.1.0-beta, freeze-theme-default@0.1.0-beta and others

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 send@0.13.0 mime@1.3.4
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 send@0.13.0 mime@1.3.4
    Remediation: Open PR to patch mime@1.3.4.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 mime@1.3.4
    Remediation: Open PR to patch mime@1.3.4.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 mime@1.3.4
    Remediation: Open PR to patch mime@1.3.4.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta mime@1.2.11
    Remediation: Open PR to patch mime@1.2.11.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 mime@1.2.11
    Remediation: Open PR to patch mime@1.2.11.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 request@2.34.0 form-data@0.1.4 mime@1.2.11
    Remediation: Open PR to patch mime@1.2.11.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 mime@1.2.11
    Remediation: Open PR to patch mime@1.2.11.
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 request@2.30.0 form-data@0.1.4 mime@1.2.11
    Remediation: Open PR to patch mime@1.2.11.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 mime@1.2.11
    Remediation: Open PR to patch mime@1.2.11.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta raptor-optimizer@0.2.41-beta raptor-optimizer-less@0.1.5-beta less@1.7.5 request@2.40.0 form-data@0.1.4 mime@1.2.11
    Remediation: Open PR to patch mime@1.2.11.

Overview

mime is a comprehensive, compact MIME type module.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The package uses the regex /.*[\.\/\\]/ in its lookup, which can cause a slowdown of about 2 seconds for a 50k-character input.
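
As a rough illustration, the lookup regex quoted above can be exercised directly (a minimal sketch; a real trigger would pass an attacker-controlled path through mime.lookup(), which is an assumption here):

// A long file name with no '.', '/' or '\' makes the greedy '.*' backtrack
// at every starting position before the match finally fails.
const name = 'a'.repeat(50000);
console.time('mime-lookup-regex');
/.*[\.\/\\]/.test(name);
console.timeEnd('mime-lookup-regex');   // roughly 2 seconds for 50k characters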

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them could lead to a match. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must take to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see whether the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), and an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade mime to version 1.4.1, 2.0.3 or higher.

References

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: ms
  • Introduced through: browser-refresh@0.1.0-beta and freeze-theme-default@0.1.0-beta

Detailed paths

  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 debug@2.2.0 ms@0.7.1
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 send@0.13.0 ms@0.7.1
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 send@0.13.0 debug@2.2.0 ms@0.7.1
    Remediation: Upgrade to browser-refresh@1.7.3.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 connect-timeout@1.6.2 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 send@0.13.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 body-parser@1.13.3 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 compression@1.5.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 connect-timeout@1.6.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 express-session@1.11.3 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 finalhandler@0.4.0 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 morgan@1.6.1 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-index@1.7.3 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 send@0.13.0 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 connect-timeout@1.6.2 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 body-parser@1.13.3 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 compression@1.5.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 connect-timeout@1.6.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 express-session@1.11.3 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 finalhandler@0.4.0 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 morgan@1.6.1 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-index@1.7.3 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-static@1.10.3 send@0.13.2 debug@2.2.0 ms@0.7.1
    Remediation: Open PR to patch ms@0.7.1.
  • Introduced through: freeze@0.2.2-beta browser-refresh@0.1.0-beta express@3.21.2 connect@2.30.2 serve-favicon@2.3.2 ms@0.7.2
    Remediation: Open PR to patch ms@0.7.2.
  • Introduced through: freeze@0.2.2-beta freeze-theme-default@0.1.0-beta browser-refresh@0.1.3-beta express@3.21.2 connect@2.30.2 serve-favicon@2.3.2 ms@0.7.2
    Remediation: Open PR to patch ms@0.7.2.

Overview

ms is a tiny millisecond conversion utility.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) due to an incomplete fix for the previously reported vulnerability npm:ms:20151024. The fix limited the length of accepted input strings to 10,000 characters, but turned out to be insufficient, making it possible to block the event loop for 0.3 seconds (on a typical laptop) with a specially crafted string passed to the ms() function.

Proof of concept

ms = require('ms');
ms('1'.repeat(9998) + 'Q') // Takes about ~0.3s

Note: Snyk's patch for this vulnerability limits input length to 100 characters. This new limit was deemed to be a breaking change by the author. Based on user feedback, we believe the risk of breakage is very low, while the value to your security is much greater, and therefore opted to still capture this change in a patch for earlier versions as well. Whenever patching security issues, we always suggest running tests on your code to validate that nothing has been broken.
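
If upgrading or patching is not immediately possible, a caller-side guard along the same lines is one option (a hypothetical sketch, not part of ms itself; the 100-character cap simply mirrors the limit described above):

const ms = require('ms');

// Reject oversized inputs before they reach the vulnerable regex.
function safeMs(value) {
  if (typeof value === 'string' && value.length > 100) {
    throw new Error('input too long for ms()');
  }
  return ms(value);
}

safeMs('2 days');                        // works as usual
// safeMs('1'.repeat(9998) + 'Q');      // rejected instead of blocking the event loop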

For more information on Regular Expression Denial of Service (ReDoS) attacks, go to our blog.

Disclosure Timeline

  • Feb 9th, 2017 - Reported the issue to package owner.
  • Feb 11th, 2017 - Issue acknowledged by package owner.
  • April 12th, 2017 - Fix PR opened by Snyk Security Team.
  • May 15th, 2017 - Vulnerability published.
  • May 16th, 2017 - Issue fixed and version 2.0.0 released.
  • May 21st, 2017 - Patches released for versions >=0.7.1, <=1.0.0.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them could lead to a match. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must take to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see whether the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), and an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade ms to version 2.0.0 or higher.

References

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: tar
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 tar@0.1.20
  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000 node-gyp@0.13.1 tar@0.1.20

Overview

tar is a full-featured Tar for Node.js.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). When stripping the trailing slash from file path arguments, tar uses f.replace(/\/+$/, ''), whose performance can degrade exponentially when f contains many / characters, resulting in ReDoS.

This vulnerability is not likely to be exploitable, as it requires that untrusted input be passed into the tar.extract() or tar.list() array of entries to parse/extract, which would be unusual.
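
As a rough illustration, the trailing-slash regex quoted above can be exercised directly on an adversarial path (a minimal sketch; the input below is an assumption, not a real archive entry):

// Many slashes followed by a non-slash character force the engine to retry
// the '+' quantifier from every starting position before giving up.
const f = '/'.repeat(50000) + 'x';
console.time('strip-trailing-slash');
f.replace(/\/+$/, '');
console.timeEnd('strip-trailing-slash');   // noticeably slow on a typical machine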

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them could lead to a match. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must take to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see whether the string is valid. These extreme situations cause the engine to work very slowly (the number of steps grows exponentially with input size, as shown above), and an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade tar to version 6.1.4, 5.0.8, 4.4.16 or higher.

References

low severity

Symlink attack due to predictable tmp folder names

  • Vulnerable module: npm
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000

Overview

npm is a package manager for JavaScript. Affected versions of the package are vulnerable to a symlink attack due to predictable tmp folder names, which were of the form /tmp/npm-$PID. An attacker waiting for an npm process to start could predict its tmp folder name, enter that folder, and arbitrarily change the files inside it.
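
By contrast, an unpredictable name removes the attacker's ability to guess and pre-stage the folder. A minimal sketch of that idea (illustrative only, not npm's actual implementation):

const fs = require('fs');
const os = require('os');
const path = require('path');

// mkdtemp appends six random characters, so the directory name cannot be
// predicted from the process ID alone.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'npm-'));
console.log('created', dir);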

Remediation

Upgrade npm to version 1.3.3 or higher.

References

low severity

Unauthorized File Access

  • Vulnerable module: npm
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 npm@1.2.8000

Overview

npm is a package manager for JavaScript.

Affected versions of this package are vulnerable to Unauthorized File Access. It is possible for packages to create symlinks to files outside of the node_modules folder through the bin field upon installation.

For npm, a properly constructed entry in the package.json bin field would allow a package publisher to create a symlink pointing to arbitrary files on a user's system when the package is installed. This behaviour is still possible through install scripts, and the vulnerability is not prevented by using the --ignore-scripts install option. An illustrative bin entry is sketched below.
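
A hypothetical package.json fragment showing the shape of the bin field described above (the package name and target path are invented placeholders, not a working exploit; exact path handling varied between npm versions):

{
  "name": "innocuous-cli",
  "version": "1.0.0",
  "bin": {
    "innocuous-cli": "../../../../home/victim/.bashrc"
  }
}

On an affected npm version, installing such a package could result in a bin link pointing at a file outside node_modules; fixed versions reject bin targets that escape the package folder.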

Remediation

Upgrade npm to version 6.13.3 or higher.

References

low severity

Uninitialized Memory Exposure

  • Vulnerable module: utile
  • Introduced through: rapido@1.1.12

Detailed paths

  • Introduced through: freeze@0.2.2-beta rapido@1.1.12 prompt@0.2.14 utile@0.2.1

Overview

utile is a drop-in replacement for util with some additional advantageous functions.

Affected versions of this package are vulnerable to Uninitialized Memory Exposure. A malicious user could extract sensitive data from uninitialized memory or cause a DoS by passing in a large number, in setups where typed user input can be passed.

Note Uninitialized Memory Exposure impacts only Node.js 6.x or lower, Denial of Service impacts any Node.js version.

Details

The Buffer class on Node.js is a mutable array of binary data, and can be initialized with a string, array or number.

const buf1 = new Buffer([1,2,3]);
// creates a buffer containing [01, 02, 03]
const buf2 = new Buffer('test');
// creates a buffer containing ASCII bytes [74, 65, 73, 74]
const buf3 = new Buffer(10);
// creates a buffer of length 10

The first two variants simply create a binary representation of the value they received. The last one, however, pre-allocates a buffer of the specified size, which is useful especially when reading data from a stream. When using the number constructor of Buffer, it will allocate the memory, but will not fill it with zeros. Instead, the allocated buffer will hold whatever was in memory at the time. If the buffer is not zeroed by using buf.fill(0), it may leak sensitive information like keys, source code, and system info.
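
The same point can be illustrated directly (a minimal sketch; Buffer.alloc is the zero-filled alternative available in modern Node.js, an assumption about the runtime rather than part of utile's API):

// Deprecated number constructor: contents are whatever happened to be in memory.
const risky = new Buffer(10);
risky.fill(0);                     // must be zeroed explicitly before use

// Safer alternative on Node.js >= 4.5: memory is zero-filled on allocation.
const safe = Buffer.alloc(10);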

Remediation

There is no fix version for utile.

References