Vulnerabilities | 36 via 176 paths |
---|---|
Dependencies | 622 |
Source | npm |
critical severity
- Vulnerable module: growl
- Introduced through: mocha@2.5.3
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3 › growl@1.9.2
  Remediation: Upgrade to js-data-repo-tools@1.0.0.
Overview
growl is a package adding Growl support for Node.js.
Affected versions of this package are vulnerable to Arbitrary Code Injection due to unsafe use of the eval() function. Node.js provides the eval() function by default, which is used to translate strings into JavaScript code. An attacker can craft a malicious payload to inject arbitrary commands.
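To illustrate the class of bug described here, the following is a minimal, hypothetical sketch (not growl's actual source; the notify function and payload string are invented for the example) of untrusted input reaching eval():

```js
// Hypothetical sketch of the unsafe pattern described above; not growl's code.
function notify(message) {
  console.log('notification:', message);
}

// Attacker-controlled "message" that breaks out of the string literal:
const untrusted = '" + require("child_process").execSync("id") + "';

// Vulnerable: building JavaScript source from untrusted input and eval()-ing it.
eval('notify("' + untrusted + '")'); // the embedded execSync("id") runs here

// Safer: pass data as data, never as code.
notify(untrusted); // printed verbatim, nothing executes
```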
Remediation
Upgrade growl to version 1.10.0 or higher.
References
critical severity
- Vulnerable module: sanitize-html
- Introduced through: ink-docstrap@git+https://github.com/js-data/docstrap.git#1af960b2835cb171a1e204b3dc8b74d8a0fcf9f3
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › ink-docstrap@git+https://github.com/js-data/docstrap.git#1af960b2835cb171a1e204b3dc8b74d8a0fcf9f3 › sanitize-html@1.27.5
Overview
sanitize-html is a library that allows you to clean up user-submitted HTML, preserving whitelisted elements and whitelisted attributes on a per-element basis.
Affected versions of this package are vulnerable to Arbitrary Code Execution. Tag transformations that turn an attribute value into a text node using transformTags could be vulnerable to code execution.
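As a rough sketch of the pattern in question (assuming sanitize-html's documented transformTags option; the tag names and payload are illustrative), a transform that copies an attribute value into the replacement tag's text relies on the library to re-escape that value:

```js
const sanitizeHtml = require('sanitize-html');

// Markup smuggled inside an attribute value that a transform later turns into text.
const dirty = '<a href="<img src=x onerror=alert(1)>">link</a>';

const clean = sanitizeHtml(dirty, {
  allowedTags: ['a', 'span'],
  allowedAttributes: { a: ['href'] },
  transformTags: {
    // Replace each link with a <span> whose text is the original href value.
    a: (tagName, attribs) => ({ tagName: 'span', text: attribs.href || '' }),
  },
});

// Fixed versions emit the href as escaped text; the risk described above is that
// affected versions could let that value pass through as live markup.
console.log(clean);
```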
Remediation
Upgrade sanitize-html to version 2.0.0-beta or higher.
References
high severity
- Vulnerable module: tar
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
tar is a full-featured Tar for Node.js.
Affected versions of this package are vulnerable to Arbitrary File Write. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created.
This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both \ and / characters as path separators. However, \ is a valid filename character on posix systems.
By first creating a directory, and then replacing that directory with a symlink, it is possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location. This can lead to extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite.
Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at FOO, followed by a symbolic link named foo, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but not from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the FOO directory would then be placed in the target of the symbolic link, thinking that the directory had already been created.
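Upgrading is the real fix, but as a defence-in-depth sketch (using node-tar's documented x() options; the helper below is illustrative and cannot fully compensate for the cache/symlink bugs fixed upstream), callers can also reject entries whose resolved destination escapes the target directory:

```js
const path = require('path');
const tar = require('tar'); // node-tar

// Illustrative helper: extract `archive` into `dest`, skipping any entry whose
// resolved path would land outside `dest`.
async function extractInto(archive, dest) {
  const root = path.resolve(dest);
  await tar.x({
    file: archive,
    cwd: root,
    filter: (entryPath) => {
      const resolved = path.resolve(root, entryPath);
      return resolved === root || resolved.startsWith(root + path.sep);
    },
  });
}

// Usage (hypothetical archive name):
// extractInto('upload.tgz', '/tmp/untrusted').catch(console.error);
```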
Remediation
Upgrade tar to version 6.1.7, 5.0.8, 4.4.16 or higher.
References
high severity
- Vulnerable module: tar
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
tar is a full-featured Tar for Node.js.
Affected versions of this package are vulnerable to Arbitrary File Write. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created.
This logic is insufficient when extracting tar files that contain two directories and a symlink with names containing unicode values that normalize to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts.
A specially crafted tar archive can include directories with two forms of the path that resolve to the same file system entity, followed by a symbolic link with a name in the first form, lastly followed by a file using the second form. This leads to bypassing node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and extracting arbitrary files into that location.
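A small, self-contained sketch of the name collision behind this report (purely illustrative; no tar archive involved):

```js
// NFC and NFD spellings of the same name are different JavaScript strings, but
// normalizing filesystems (and node-tar's fixed cache logic) treat them as one.
const nfc = 'caf\u00e9';   // "café" as a single precomposed code point
const nfd = 'cafe\u0301';  // "café" as 'e' plus a combining acute accent

console.log(nfc === nfd);                                    // false
console.log(nfc.normalize('NFC') === nfd.normalize('NFC'));  // true

// A directory cache keyed on the raw strings sees two distinct directories,
// while the filesystem may create only one - the mismatch exploited here.
```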
Remediation
Upgrade tar to version 6.1.9, 5.0.10, 4.4.18 or higher.
References
high severity
- Vulnerable module: tar
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
tar is a full-featured Tar for Node.js.
Affected versions of this package are vulnerable to Arbitrary File Write. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain .. path portions, and resolving the sanitized paths against the extraction target directory.
This logic is insufficient on Windows systems when extracting tar files that contain a path that is not an absolute path, but specify a drive letter different from the extraction target, such as C:some\path. If the drive letter does not match the extraction target, for example D:\extraction\dir, then the result of path.resolve(extractionDirectory, entryPath) resolves against the current working directory on the C: drive, rather than the extraction target directory.
Additionally, a .. portion of the path can occur immediately after the drive letter, such as C:../foo, and is not properly sanitized by the logic that checks for .. within the normalized and split portions of the path.
Note: This only affects users of node-tar on Windows systems.
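The path.resolve behaviour at the heart of this issue can be reproduced with Node's path.win32 on any platform (the directory names below are illustrative):

```js
const path = require('path').win32;

const extractionDir = 'D:\\extraction\\dir';

// A drive-relative entry such as "C:some\path" does not resolve under the
// extraction directory on D:; it resolves against drive C: instead.
console.log(path.resolve(extractionDir, 'C:some\\path'));
// e.g. "C:\some\path" (or under the current directory of C: on Windows)

// ".." immediately after the drive letter slips past naive ".." checks too.
console.log(path.resolve(extractionDir, 'C:../foo'));
// e.g. "C:\foo" - again outside D:\extraction\dir
```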
Remediation
Upgrade tar to version 6.1.9, 5.0.10, 4.4.18 or higher.
References
high severity
- Vulnerable module: tar
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
tar is a full-featured Tar for Node.js.
Affected versions of this package are vulnerable to Arbitrary File Overwrite. This is due to insufficient symlink protection.
node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created.
This logic is insufficient when extracting tar files that contain both a directory and a symlink with the same name as the directory. This order of operations results in the directory being created and added to the node-tar directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped.
However, this is also where node-tar's symlink checks occur. By first creating a directory, and then replacing that directory with a symlink, it is possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location.
Remediation
Upgrade tar to version 3.2.3, 4.4.15, 5.0.7, 6.1.2 or higher.
References
high severity
- Vulnerable module: tar
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
tar is a full-featured Tar for Node.js.
Affected versions of this package are vulnerable to Arbitrary File Overwrite. This is due to insufficient absolute path sanitization.
node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the preservePaths flag is not set to true. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example, the path /home/user/.bashrc would turn into home/user/.bashrc.
This logic is insufficient when file paths contain repeated path roots such as ////home/user/.bashrc. node-tar only strips a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. ///home/user/.bashrc) still resolves to an absolute path.
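A toy reproduction of the stripping gap (stripOneRoot is a stand-in for the old behaviour, not node-tar's actual code):

```js
const path = require('path');

// Stand-in for the old behaviour: remove a single leading separator only.
function stripOneRoot(entryPath) {
  return entryPath.replace(/^\//, '');
}

const entry = '////home/user/.bashrc';
const once = stripOneRoot(entry);                 // "///home/user/.bashrc"

console.log(path.isAbsolute(once));               // true - still absolute
console.log(path.resolve('/tmp/extract', once));  // "/home/user/.bashrc"

// Stripping all leading separators keeps the entry inside the target directory.
const all = entry.replace(/^\/+/, '');            // "home/user/.bashrc"
console.log(path.resolve('/tmp/extract', all));   // "/tmp/extract/home/user/.bashrc"
```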
Remediation
Upgrade tar to version 3.2.2, 4.4.14, 5.0.6, 6.1.1 or higher.
References
high severity
- Vulnerable module: ajv
- Introduced through: standard@6.0.8
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › table@3.8.3 › ajv@4.11.8
  Remediation: Upgrade to standard@11.0.0.
Overview
ajv is Another JSON Schema Validator.
Affected versions of this package are vulnerable to Prototype Pollution. A carefully crafted JSON schema could be provided that allows execution of other code by prototype pollution. (While untrusted schemas are recommended against, the worst case of an untrusted schema should be a denial of service, not execution of code.)
Details
Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as __proto__, constructor and prototype. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the Object.prototype are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.
There are two main ways in which the pollution of prototypes occurs:
- Unsafe Object recursive merge
- Property definition by path
Unsafe Object recursive merge
The logic of a vulnerable recursive merge function follows the following high-level model:
merge (target, source)
foreach property of source
if property exists and is an object on both the target and the source
merge(target[property], source[property])
else
target[property] = source[property]
When the source object contains a property named __proto__ defined with Object.defineProperty(), the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of Object and the source of Object as defined by the attacker. Properties are then copied on the Object prototype.
Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: merge({}, source).
lodash and Hoek are examples of libraries susceptible to recursive merge attacks.
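A self-contained demonstration of the mechanism (the merge function below is a deliberately naive example matching the model above, not ajv's or any particular library's implementation):

```js
// Naive recursive merge of the shape modelled above.
function merge(target, source) {
  for (const key of Object.keys(source)) {
    if (key in target &&
        typeof target[key] === 'object' && typeof source[key] === 'object') {
      merge(target[key], source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

// JSON.parse creates an own "__proto__" key, which the merge then walks into,
// landing on Object.prototype and copying the attacker's properties onto it.
const payload = JSON.parse('{"__proto__": {"isAdmin": true}}');
merge({}, payload);

console.log({}.isAdmin); // true - every plain object now appears to be "admin"
```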
Property definition by path
There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: theFunction(object, path, value). If the attacker can control the value of “path”, they can set this value to __proto__.myValue. myValue is then assigned to the prototype of the class of the object.
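The same outcome via a path-based setter (setProperty is an illustrative stand-in for the generic theFunction(object, path, value) signature above, not a specific library's API):

```js
// Illustrative path-based property setter.
function setProperty(obj, propertyPath, value) {
  const keys = propertyPath.split('.');
  let cursor = obj;
  for (const key of keys.slice(0, -1)) {
    if (typeof cursor[key] !== 'object' || cursor[key] === null) cursor[key] = {};
    cursor = cursor[key];
  }
  cursor[keys[keys.length - 1]] = value;
}

// An attacker-controlled path walks up into Object.prototype.
setProperty({}, '__proto__.polluted', 'yes');
console.log({}.polluted); // "yes"
```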
Types of attacks
There are a few methods by which Prototype Pollution can be manipulated:
Type | Origin | Short description |
---|---|---|
Denial of service (DoS) | Client | This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf ). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object . In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, if the codebase at any point was reliant on someobject.toString() it would fail. |
Remote Code Execution | Client | Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr) . In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code. |
Property Injection | Client | The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin , then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true , they can then achieve admin privileges. |
Affected environments
The following environments are susceptible to a Prototype Pollution attack:
- Application server
- Web server
How to prevent
- Freeze the prototype: use Object.freeze(Object.prototype).
- Require schema validation of JSON input.
- Avoid using unsafe recursive merge functions.
- Consider using objects without prototypes (for example, Object.create(null)), breaking the prototype chain and preventing pollution.
- As a best practice, use Map instead of Object.
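For reference, a few of the mitigations above look roughly like this in code (run each in isolation; freezing Object.prototype affects the whole process):

```js
// 1. Freeze the prototype: later pollution attempts silently fail (or throw in
//    strict mode) instead of changing shared behaviour.
Object.freeze(Object.prototype);

// 2. Objects without a prototype have no chain to pollute.
const bare = Object.create(null);
bare.role = 'user';

// 3. Map keys are plain data, so "__proto__" is just another string key.
const settings = new Map();
settings.set('__proto__', 'harmless string key');
console.log(settings.get('__proto__')); // "harmless string key"
```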
For more information on this vulnerability type:
Arteau, Oliver. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018
Remediation
Upgrade ajv to version 6.12.3 or higher.
References
high severity
- Vulnerable module: npm
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
npm is a package manager for JavaScript.
Affected versions of this package are vulnerable to Arbitrary File Overwrite. It fails to prevent existing globally-installed binaries from being overwritten by other package installations. For example, if a package was installed globally and created a serve binary, any subsequent installs of packages that also create a serve binary would overwrite the first binary. This only affects files in /usr/local/bin.
For npm, this behaviour is still allowed in local installations and also through install scripts. This vulnerability bypasses a user using the --ignore-scripts install option.
Details
A Directory Traversal attack (also known as path traversal) aims to access files and directories that are stored outside the intended folder. By manipulating files with "dot-dot-slash (../)" sequences and its variations, or by using absolute file paths, it may be possible to access arbitrary files and directories stored on file system, including application source code, configuration, and other critical system files.
Directory Traversal vulnerabilities can be generally divided into two types:
- Information Disclosure: Allows the attacker to gain information about the folder structure or read the contents of sensitive files on the system.
st is a module for serving static files on web pages, and contains a vulnerability of this type. In our example, we will serve files from the public route.
If an attacker requests the following URL from our server, it will in turn leak the sensitive private key of the root user.
curl http://localhost:8080/public/%2e%2e/%2e%2e/%2e%2e/%2e%2e/%2e%2e/root/.ssh/id_rsa
Note %2e is the URL-encoded version of . (dot).
- Writing arbitrary files: Allows the attacker to create or replace existing files. This type of vulnerability is also known as Zip-Slip.
One way to achieve this is by using a malicious zip archive that holds path traversal filenames. When each filename in the zip archive gets concatenated to the target extraction folder, without validation, the final path ends up outside of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.
The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ overwriting the authorized_keys file:
2018-04-15 22:04:29 ..... 19 19 good.txt
2018-04-15 22:04:42 ..... 20 20 ../../../../../../root/.ssh/authorized_keys
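The standard defence when extracting such archives yourself is to resolve every entry name against the destination and refuse anything that escapes it; a minimal sketch (safeJoin is an illustrative helper, not part of npm):

```js
const path = require('path');

// Resolve an archive entry name under destDir, rejecting traversal attempts.
function safeJoin(destDir, entryName) {
  const root = path.resolve(destDir);
  const resolved = path.resolve(root, entryName);
  if (resolved !== root && !resolved.startsWith(root + path.sep)) {
    throw new Error('Blocked path traversal entry: ' + entryName);
  }
  return resolved;
}

console.log(safeJoin('/srv/extract', 'good.txt')); // "/srv/extract/good.txt"
safeJoin('/srv/extract', '../../../../../../root/.ssh/authorized_keys'); // throws
```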
Remediation
Upgrade npm to version 6.13.4 or higher.
References
high severity
- Vulnerable module: npm
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
npm is a package manager for JavaScript.
Affected versions of this package are vulnerable to Arbitrary File Write. It fails to prevent access to folders outside of the intended node_modules folder through the bin field.
For npm, a properly constructed entry in the package.json bin field would allow a package publisher to modify and/or gain access to arbitrary files on a user’s system when the package is installed. This behaviour is possible through install scripts. This vulnerability bypasses a user using the --ignore-scripts install option.
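One way such a bin entry might look (purely illustrative; fixed npm versions reject targets that escape the package's own folder):

```js
// Sketch of a malicious manifest. A "bin" value is just a relative path that
// npm links into a shared bin directory; the issue described above is that
// affected versions did not confine it to the publishing package's folder.
const maliciousManifest = {
  name: 'innocent-package',
  version: '1.0.0',
  bin: {
    // Traversal in the bin target reaches files outside the package.
    'innocent-cli': '../../../../usr/local/bin/some-existing-tool',
  },
};

console.log(JSON.stringify(maliciousManifest, null, 2));
```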
Details
A Directory Traversal attack (also known as path traversal) aims to access files and directories that are stored outside the intended folder. By manipulating files with "dot-dot-slash (../)" sequences and its variations, or by using absolute file paths, it may be possible to access arbitrary files and directories stored on file system, including application source code, configuration, and other critical system files.
Directory Traversal vulnerabilities can be generally divided into two types:
- Information Disclosure: Allows the attacker to gain information about the folder structure or read the contents of sensitive files on the system.
st is a module for serving static files on web pages, and contains a vulnerability of this type. In our example, we will serve files from the public route.
If an attacker requests the following URL from our server, it will in turn leak the sensitive private key of the root user.
curl http://localhost:8080/public/%2e%2e/%2e%2e/%2e%2e/%2e%2e/%2e%2e/root/.ssh/id_rsa
Note %2e is the URL-encoded version of . (dot).
- Writing arbitrary files: Allows the attacker to create or replace existing files. This type of vulnerability is also known as Zip-Slip.
One way to achieve this is by using a malicious zip archive that holds path traversal filenames. When each filename in the zip archive gets concatenated to the target extraction folder, without validation, the final path ends up outside of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.
The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ overwriting the authorized_keys file:
2018-04-15 22:04:29 ..... 19 19 good.txt
2018-04-15 22:04:42 ..... 20 20 ../../../../../../root/.ssh/authorized_keys
Remediation
Upgrade npm to version 6.13.3 or higher.
References
high severity
- Vulnerable module: bl
- Introduced through: npm-check-updates@2.15.0
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › bl@1.1.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
- Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › bl@1.1.2
  Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
bl is a library that allows you to collect buffers and access with a standard readable buffer interface.
Affected versions of this package are vulnerable to Remote Memory Exposure. If user input ends up in the consume() argument and can become negative, BufferList state can be corrupted, tricking it into exposing uninitialized memory via regular .slice() calls.
PoC by chalker
const { BufferList } = require('bl')
const secret = require('crypto').randomBytes(256)
for (let i = 0; i < 1e6; i++) {
const clone = Buffer.from(secret)
const bl = new BufferList()
bl.append(Buffer.from('a'))
bl.consume(-1024)
const buf = bl.slice(1)
if (buf.indexOf(clone) !== -1) {
console.error(`Match (at ${i})`, buf)
}
}
Remediation
Upgrade bl to version 2.2.1, 3.0.1, 4.0.3, 1.2.3 or higher.
References
high severity
- Vulnerable module: ansi-regex
- Introduced through: npm-check-updates@2.15.0, standard@6.0.8 and others
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › inquirer@0.12.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › node-alias@1.0.4 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › node-alias@1.0.4 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › columnify@1.5.4 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › inquirer@0.12.0 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to standard@13.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-eslint@6.1.2 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › inquirer@0.12.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1Remediation: Upgrade to standard@11.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › table@3.8.3 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-eslint@6.1.2 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › inquirer@0.12.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to standard@11.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › table@3.8.3 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › columnify@1.5.4 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › inquirer@0.12.0 › string-width@1.0.2 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to standard@13.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › npmlog@4.0.2 › gauge@2.7.4 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-block-scoping@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-parameters@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › har-validator@2.0.6 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-block-scoping@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-parameters@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › har-validator@2.0.6 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › npmlog@4.0.2 › gauge@2.7.4 › string-width@1.0.2 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › npmlog@3.1.2 › gauge@2.6.0 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › npm-registry-client@7.2.1 › npmlog@3.1.2 › gauge@2.6.0 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › npmlog@4.0.2 › gauge@2.7.4 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-helpers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-block-scoping@6.26.0 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-computed-properties@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-commonjs@6.26.2 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-systemjs@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-umd@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-parameters@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-function-name@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-function-name@6.24.1 › babel-helper-function-name@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-object-super@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-parameters@6.24.1 › babel-helper-call-delegate@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › har-validator@2.0.6 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-helpers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-block-scoping@6.26.0 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-computed-properties@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-commonjs@6.26.2 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-systemjs@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-umd@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-parameters@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-function-name@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-function-name@6.24.1 › babel-helper-function-name@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-object-super@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-parameters@6.24.1 › babel-helper-call-delegate@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › har-validator@2.0.6 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › npmlog@3.1.2 › gauge@2.6.0 › string-width@1.0.2 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › npm-registry-client@7.2.1 › npmlog@3.1.2 › gauge@2.6.0 › string-width@1.0.2 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › npmlog@4.0.2 › gauge@2.7.4 › string-width@1.0.2 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › npmlog@3.1.2 › gauge@2.6.0 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › npm-registry-client@7.2.1 › npmlog@3.1.2 › gauge@2.6.0 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-function-name@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-function-name@6.24.1 › babel-helper-function-name@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-object-super@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-plugin-transform-es2015-modules-commonjs@6.26.2 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-umd@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-define-map@6.26.0 › babel-helper-function-name@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-function-name@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-function-name@6.24.1 › babel-helper-function-name@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-object-super@6.24.1 › babel-helper-replace-supers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-plugin-transform-es2015-modules-commonjs@6.26.2 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-umd@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-define-map@6.26.0 › babel-helper-function-name@6.24.1 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › npmlog@3.1.2 › gauge@2.6.0 › string-width@1.0.2 › strip-ansi@3.0.1 › ansi-regex@2.1.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › npm-registry-client@7.2.1 › npmlog@3.1.2 › gauge@2.6.0 › string-width@1.0.2 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-helpers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-define-map@6.26.0 › babel-helper-function-name@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-umd@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-plugin-transform-es2015-modules-commonjs@6.26.2 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › has-ansi@2.0.0 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-core@6.26.3 › babel-register@6.26.0 › babel-core@6.26.3 › babel-helpers@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-classes@6.24.1 › babel-helper-define-map@6.26.0 › babel-helper-function-name@6.24.1 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › babel-preset-es2015@6.24.1 › babel-plugin-transform-es2015-modules-umd@6.24.1 › babel-plugin-transform-es2015-modules-amd@6.24.1 › babel-plugin-transform-es2015-modules-commonjs@6.26.2 › babel-template@6.26.0 › babel-traverse@6.26.0 › babel-code-frame@6.26.0 › chalk@1.1.3 › strip-ansi@3.0.1 › ansi-regex@2.1.1
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › update-notifier@2.5.0 › boxen@1.3.0 › string-width@2.1.1 › strip-ansi@4.0.0 › ansi-regex@3.0.1Remediation: Upgrade to npm-check-updates@3.1.10.
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › table@3.8.3 › string-width@2.1.1 › strip-ansi@4.0.0 › ansi-regex@3.0.1Remediation: Upgrade to standard@13.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › update-notifier@2.5.0 › boxen@1.3.0 › ansi-align@2.0.0 › string-width@2.1.1 › strip-ansi@4.0.0 › ansi-regex@3.0.1Remediation: Upgrade to npm-check-updates@3.1.10.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › update-notifier@2.5.0 › boxen@1.3.0 › widest-line@2.0.1 › string-width@2.1.1 › strip-ansi@4.0.0 › ansi-regex@3.0.1Remediation: Upgrade to npm-check-updates@4.0.2.
Overview
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) due to the sub-patterns [[\\]()#;?]* and (?:;[-a-zA-Z\\d\\/#&.:=?%@~_]*)*.
PoC
import ansiRegex from 'ansi-regex';
for(var i = 1; i <= 50000; i++) {
var time = Date.now();
var attack_str = "\u001B["+";".repeat(i*10000);
ansiRegex().test(attack_str)
var time_cost = Date.now() - time;
console.log("attack_str.length: " + attack_str.length + ": " + time_cost+" ms")
}
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- A: The string must start with the letter 'A'.
- (B|C+)+: The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
- D: Finally, we ensure this section of the string ends with a 'D'.
The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause regular expressions to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this and cause the service to excessively consume CPU, resulting in a Denial of Service.
Remediation
Upgrade ansi-regex to version 4.1.1, 5.0.1, 6.0.1 or higher.
References
high severity
- Vulnerable module: minimatch
- Introduced through: mocha@2.5.3
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3 › glob@3.2.11 › minimatch@0.3.0
  Remediation: Upgrade to js-data-repo-tools@0.5.6.
Overview
minimatch is a minimal matching utility.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via complicated and illegal regexes.
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- A: The string must start with the letter 'A'.
- (B|C+)+: The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
- D: Finally, we ensure this section of the string ends with a 'D'.
The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
The entire process of testing it against a 30-character-long string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, over ten times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use RegEx 101 debugger to see the engine has to take a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause regular expressions to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this and cause the service to excessively consume CPU, resulting in a Denial of Service.
Remediation
Upgrade minimatch to version 3.0.2 or higher.
References
high severity
- Vulnerable module: minimatch
- Introduced through: mocha@2.5.3
Detailed paths
- Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3 › glob@3.2.11 › minimatch@0.3.0
  Remediation: Upgrade to js-data-repo-tools@0.5.6.
Overview
minimatch is a minimal matching utility.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade minimatch
to version 3.0.2 or higher.
References
high severity
- Vulnerable module: mocha
- Introduced through: mocha@2.5.3
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3Remediation: Upgrade to js-data-repo-tools@1.0.0.
Overview
mocha is a javascript test framework for node.js & the browser.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). If the stack trace in `utils.js` begins with a large error message (>= 20k characters) and full-trace is disabled, `utils.stackTraceFilter()` will take exponential time to run.
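A hedged reproduction sketch is shown below; it reaches into mocha's internal `lib/utils` module (not a public API), so treat the module path, threshold and timings as assumptions rather than a guaranteed reproduction:
// Hypothetical sketch: time the internal stack trace filter on a huge message.
const utils = require('mocha/lib/utils');
const hugeMessage = 'x'.repeat(30000); // comfortably above the ~20k character threshold
const stack = hugeMessage + '\n    at Context.<anonymous> (test/example.spec.js:1:1)';
console.time('stackTraceFilter');
utils.stackTraceFilter()(stack); // stackTraceFilter() returns a filter function, applied to the stack string
console.timeEnd('stackTraceFilter');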
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade mocha
to version 6.0.0 or higher.
References
high severity
- Vulnerable module: npm-user-validate
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › npm-user-validate@0.1.5Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › npm-user-validate@0.1.5Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
npm-user-validate provides user validations for npm.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The regex that validates user emails took exponentially longer to process long input strings beginning with @
characters.
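A rough way to observe this is to time the package's email validator against a long run of `@` characters; the `validate.email()` call below follows the package's documented exports, but treat it as an illustrative sketch rather than a guaranteed reproduction:
// Illustrative timing sketch -- behaviour depends on the installed (affected) version.
const validate = require('npm-user-validate');
const longAtString = '@'.repeat(100000); // long input beginning with '@' characters
console.time('email-validation');
validate.email(longAtString); // returns an Error or null; the timing is the point here
console.timeEnd('email-validation');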
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade npm-user-validate
to version 1.0.1 or higher.
References
high severity
- Vulnerable module: hawk
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3
Overview
hawk is a library for the HTTP Hawk Authentication Scheme.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) in header parsing where each added character in the attacker's input increases the computation time exponentially.
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
A fix was pushed into the master
branch but not yet published.
References
high severity
- Vulnerable module: shelljs
- Introduced through: standard@6.0.8
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0 › shelljs@0.5.3
Overview
shelljs is a wrapper for the Unix shell commands for Node.js.
Affected versions of this package are vulnerable to Improper Privilege Management. When ShellJS is used to create shell scripts which may be running as root, users with low-level privileges on the system can leak sensitive information such as passwords (depending on implementation) from the standard output of the privileged process, or shut down privileged ShellJS processes via the `exec` function by triggering EACCES errors.
Note: This only impacts the synchronous version of `shell.exec()`.
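For context, the impacted pattern is roughly the following: a script (possibly running as root) calling the synchronous form of `exec()`. The command below is purely illustrative:
// Synchronous shell.exec() -- the form this advisory applies to.
const shell = require('shelljs');
const result = shell.exec('id', { silent: true }); // runs synchronously and captures output
console.log(result.stdout);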
Remediation
Upgrade shelljs
to version 0.8.5 or higher.
References
medium severity
- Vulnerable module: codecov
- Introduced through: codecov@1.0.1
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › codecov@1.0.1Remediation: Upgrade to js-data-repo-tools@0.5.2.
Overview
codecov is an npm package for uploading reports to Codecov.
Affected versions of this package are vulnerable to Command Injection. The value provided as part of the gcov-args
argument is executed by the exec
function within lib/codecov.js.
PoC by JHU System Security Lab
var root = require("codecov");
var args = {
"options": {
'gcov-args': "& touch PWNED &"
}
}
root.handleInput.upload(args, function(){}, function(){});
Remediation
Upgrade codecov
to version 3.6.2 or higher.
References
medium severity
- Vulnerable module: codecov
- Introduced through: codecov@1.0.1
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › codecov@1.0.1Remediation: Upgrade to js-data-repo-tools@0.5.2.
Overview
codecov is an npm package for uploading reports to Codecov.
Affected versions of this package are vulnerable to Command Injection. The value provided as part of the gcov-root
argument is executed by the exec
function within lib/codecov.js. This vulnerability exists due to an incomplete fix of CVE-2020-7596.
PoC by JHU System Security Lab
var root = require("codecov");
var args = {
"options": {
'gcov-root': "& touch exploit &",
'gcov-exec': ' ',
'gcov-args': ' '
}
}
root.handleInput.upload(args, function(){}, function(){});
Remediation
Upgrade codecov
to version 3.6.5 or higher.
References
medium severity
- Vulnerable module: codecov
- Introduced through: codecov@1.0.1
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › codecov@1.0.1Remediation: Upgrade to js-data-repo-tools@0.5.2.
Overview
codecov is an npm package for uploading reports to Codecov.
Affected versions of this package are vulnerable to Command Injection via the upload
method.
Note: This vulnerability exists due to an incomplete fix of CVE-2020-7597.
Remediation
Upgrade codecov
to version 3.7.1 or higher.
References
medium severity
- Vulnerable module: sanitize-html
- Introduced through: ink-docstrap@git+https://github.com/js-data/docstrap.git#1af960b2835cb171a1e204b3dc8b74d8a0fcf9f3
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › ink-docstrap@git+https://github.com/js-data/docstrap.git#1af960b2835cb171a1e204b3dc8b74d8a0fcf9f3 › sanitize-html@1.27.5
Overview
sanitize-html is a library that allows you to clean up user-submitted HTML, preserving whitelisted elements and whitelisted attributes on a per-element basis
Affected versions of this package are vulnerable to Access Restriction Bypass. Internationalized domain name (IDN) is not properly handled. This allows attackers to bypass hostname whitelist validation set by the allowedIframeHostnames
option.
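For reference, the option is typically configured along these lines (hostnames here are placeholders); on affected versions, an iframe `src` using an internationalized domain name could evade this hostname check:
// Sketch of a typical allowedIframeHostnames configuration.
const sanitizeHtml = require('sanitize-html');
const untrustedHtml = '<iframe src="https://www.youtube.com/embed/abc123"></iframe>';
const clean = sanitizeHtml(untrustedHtml, {
  allowedTags: sanitizeHtml.defaults.allowedTags.concat(['iframe']),
  allowedAttributes: Object.assign({}, sanitizeHtml.defaults.allowedAttributes, { iframe: ['src'] }),
  allowedIframeHostnames: ['www.youtube.com'] // the intended whitelist
});
console.log(clean);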
Remediation
Upgrade sanitize-html
to version 2.3.1 or higher.
References
medium severity
- Vulnerable module: sanitize-html
- Introduced through: ink-docstrap@git+https://github.com/js-data/docstrap.git#1af960b2835cb171a1e204b3dc8b74d8a0fcf9f3
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › ink-docstrap@git+https://github.com/js-data/docstrap.git#1af960b2835cb171a1e204b3dc8b74d8a0fcf9f3 › sanitize-html@1.27.5
Overview
sanitize-html is a library that allows you to clean up user-submitted HTML, preserving whitelisted elements and whitelisted attributes on a per-element basis
Affected versions of this package are vulnerable to Validation Bypass. Hostnames set by the `allowedIframeHostnames` option are not properly validated when `allowIframeRelativeUrls` is set to `true`. This allows attackers to bypass the hostname whitelist for the iframe element.
Details
A cross-site scripting attack occurs when the attacker tricks a legitimate web-based application or site to accept a request as originating from a trusted source.
This is done by escaping the context of the web application; the web application then delivers that data to its users along with other trusted dynamic content, without validating it. The browser unknowingly executes malicious script on the client side (through client-side languages; usually JavaScript or HTML) in order to perform actions that are otherwise typically blocked by the browser’s Same Origin Policy.
Injecting malicious code is the most prevalent manner by which XSS is exploited; for this reason, escaping characters in order to prevent this manipulation is the top method for securing code against this vulnerability.
Escaping means that the application is coded to mark key characters, and particularly key characters included in user input, to prevent those characters from being interpreted in a dangerous context. For example, in HTML, `<` can be coded as `&lt;` and `>` can be coded as `&gt;` so that they are interpreted and displayed as themselves in text, while within the code itself they are used for HTML tags. If malicious content is injected into an application that escapes special characters, and that malicious content uses `<` and `>` as HTML tags, those characters are nonetheless not interpreted as HTML tags by the browser if they've been correctly escaped in the application code, and in this way the attempted attack is diverted.
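As a minimal illustration of output escaping (a generic sketch, not the sanitize-html implementation):
// Replace HTML metacharacters with their entity equivalents before echoing user input.
function escapeHtml(input) {
  return String(input)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}
console.log(escapeHtml('<script>alert(1)</script>')); // &lt;script&gt;alert(1)&lt;/script&gt;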
The most prominent use of XSS is to steal cookies (source: OWASP HttpOnly) and hijack user sessions, but XSS exploits have been used to expose sensitive information, enable access to privileged services and functionality and deliver malware.
Types of attacks
There are a few methods by which XSS can be manipulated:
Type | Origin | Description |
---|---|---|
Stored | Server | The malicious code is inserted in the application (usually as a link) by the attacker. The code is activated every time a user clicks the link. |
Reflected | Server | The attacker delivers a malicious link externally from the vulnerable web site application to a user. When clicked, malicious code is sent to the vulnerable web site, which reflects the attack back to the user’s browser. |
DOM-based | Client | The attacker forces the user’s browser to render a malicious page. The data in the page itself delivers the cross-site scripting data. |
Mutated | Client | The attacker injects code that appears safe, but is then rewritten and modified by the browser, while parsing the markup. An example is rebalancing unclosed quotation marks or even adding quotation marks to unquoted parameters. |
Affected environments
The following environments are susceptible to an XSS attack:
- Web servers
- Application servers
- Web application environments
How to prevent
This section describes the top best practices designed to specifically protect your code:
- Sanitize data input in an HTTP request before reflecting it back, ensuring all data is validated, filtered or escaped before echoing anything back to the user, such as the values of query parameters during searches.
- Convert special characters such as `?`, `&`, `/`, `<`, `>` and spaces to their respective HTML or URL encoded equivalents.
- Give users the option to disable client-side scripts.
- Redirect invalid requests.
- Detect simultaneous logins, including those from two separate IP addresses, and invalidate those sessions.
- Use and enforce a Content Security Policy (source: Wikipedia) to disable any features that might be manipulated for an XSS attack.
- Read the documentation for any of the libraries referenced in your code to understand which elements allow for embedded HTML.
Remediation
Upgrade sanitize-html
to version 2.3.2 or higher.
References
medium severity
- Vulnerable module: hoek
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › boom@2.10.1 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › sntp@1.0.9 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › cryptiles@2.0.5 › boom@2.10.1 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › boom@2.10.1 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › sntp@1.0.9 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › hawk@3.1.3 › cryptiles@2.0.5 › boom@2.10.1 › hoek@2.16.3Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
hoek provides utility methods for the hapi ecosystem.
Affected versions of this package are vulnerable to Prototype Pollution. The utility functions allow modification of the `Object` prototype. If an attacker can control part of the structure passed to these functions, they could add or modify an existing property.
PoC by Olivier Arteau (HoLyVieR)
var Hoek = require('hoek');
var malicious_payload = '{"__proto__":{"oops":"It works !"}}';
var a = {};
console.log("Before : " + a.oops);
Hoek.merge({}, JSON.parse(malicious_payload));
console.log("After : " + a.oops);
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade hoek
to version 4.2.1, 5.0.3 or higher.
References
medium severity
- Vulnerable module: minimist
- Introduced through: mocha@2.5.3
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3 › mkdirp@0.5.1 › minimist@0.0.8Remediation: Upgrade to js-data-repo-tools@1.0.0.
Overview
minimist is a parse argument options module.
Affected versions of this package are vulnerable to Prototype Pollution. The library could be tricked into adding or modifying properties of Object.prototype
using a constructor
or __proto__
payload.
PoC by Snyk
require('minimist')('--__proto__.injected0 value0'.split(' '));
console.log(({}).injected0 === 'value0'); // true
require('minimist')('--constructor.prototype.injected1 value1'.split(' '));
console.log(({}).injected1 === 'value1'); // true
Details
Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as `__proto__`, `constructor` and `prototype`. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the `Object.prototype` are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.
There are two main ways in which the pollution of prototypes occurs:
- Unsafe `Object` recursive merge
- Property definition by path
Unsafe Object recursive merge
The logic of a vulnerable recursive merge function follows the following high-level model:
merge (target, source)
foreach property of source
if property exists and is an object on both the target and the source
merge(target[property], source[property])
else
target[property] = source[property]
When the source object contains a property named `__proto__` defined with `Object.defineProperty()`, the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of `Object` and the source being the object defined by the attacker. Properties are then copied onto the `Object` prototype.
Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: `merge({}, source)`.
lodash
and Hoek
are examples of libraries susceptible to recursive merge attacks.
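A runnable sketch of the high-level merge model above (simplified for illustration, not taken from any particular library):
// Unsafe recursive merge: no guard against the special __proto__ key.
function unsafeMerge(target, source) {
  for (const key of Object.keys(source)) {
    if (typeof target[key] === 'object' && typeof source[key] === 'object') {
      unsafeMerge(target[key], source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}
// JSON.parse creates an own "__proto__" property, so the merge walks into
// Object.prototype and copies the attacker's value onto it.
const payload = JSON.parse('{"__proto__": {"polluted": true}}');
unsafeMerge({}, payload);
console.log({}.polluted); // true: every plain object now appears to have this property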
Property definition by path
There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected contains this signature: `theFunction(object, path, value)`.
If the attacker can control the value of "path", they can set this value to `__proto__.myValue`. `myValue` is then assigned to the prototype of the class of the object.
Types of attacks
There are a few methods by which Prototype Pollution can be manipulated:
Type | Origin | Short description |
---|---|---|
Denial of service (DoS) | Client | This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf ). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object . In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, if the codebase at any point was reliant on someobject.toString() it would fail. |
Remote Code Execution | Client | Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr) . In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code. |
Property Injection | Client | The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin , then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true , they can then achieve admin privileges. |
Affected environments
The following environments are susceptible to a Prototype Pollution attack:
- Application server
- Web server
How to prevent
- Freeze the prototype: use `Object.freeze(Object.prototype)`.
- Require schema validation of JSON input.
- Avoid using unsafe recursive merge functions.
- Consider using objects without prototypes (for example, `Object.create(null)`), breaking the prototype chain and preventing pollution.
- As a best practice use `Map` instead of `Object`.
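The first, fourth and fifth items in the list above can be sketched in a few lines (illustrative only):
// 1. Freeze the prototype: later pollution attempts are ignored (or throw in strict mode).
Object.freeze(Object.prototype);
// 2. Prototype-less objects have no __proto__ accessor to abuse.
const bag = Object.create(null);
bag['__proto__'] = 'just an ordinary key';
// 3. Map treats every key, including "__proto__", as plain data.
const m = new Map();
m.set('__proto__', 'also just a key');
console.log(m.get('__proto__')); // 'also just a key'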
For more information on this vulnerability type:
Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018
Remediation
Upgrade minimist
to version 0.2.1, 1.2.3 or higher.
References
medium severity
- Vulnerable module: hosted-git-info
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › hosted-git-info@2.1.5Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › hosted-git-info@2.1.5Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
hosted-git-info provides metadata and conversions from repository URLs for GitHub, Bitbucket and GitLab.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) via regular expression shortcutMatch
in the fromUrl
function in index.js. The affected regular expression exhibits polynomial worst-case time complexity.
PoC by Yeting Li
var hostedGitInfo = require("hosted-git-info")

function build_attack(n) {
  var ret = "a:"
  for (var i = 0; i < n; i++) {
    ret += "a"
  }
  return ret + "!";
}

for (var i = 1; i <= 5000000; i++) {
  if (i % 1000 == 0) {
    var time = Date.now();
    var attack_str = build_attack(i)
    var parsedInfo = hostedGitInfo.fromUrl(attack_str)
    var time_cost = Date.now() - time;
    console.log("attack_str.length: " + attack_str.length + ": " + time_cost + " ms")
  }
}
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade hosted-git-info
to version 3.0.8, 2.8.9 or higher.
References
medium severity
- Vulnerable module: npm
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
npm is a package manager for JavaScript.
Affected versions of this package are vulnerable to Access Restriction Bypass. It might allow local users to bypass intended filesystem access restrictions because the ownership of the `/etc` and `/usr` directories is changed unexpectedly, related to a "correctMkdir" issue.
Remediation
Upgrade npm
to version 5.7.1 or higher.
References
medium severity
- Vulnerable module: npm
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
npm is a package manager for JavaScript.
Affected versions of this package are vulnerable to Insertion of Sensitive Information into Log File. The CLI supports URLs like <protocol>://[<user>[:<password>]@]<hostname>[:<port>][:][/]<path>
. The password value is not redacted and is printed to stdout and also to any generated log files.
Remediation
Upgrade npm
to version 6.14.6 or higher.
References
medium severity
- Vulnerable module: tunnel-agent
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › request@2.75.0 › tunnel-agent@0.4.3Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › request@2.75.0 › tunnel-agent@0.4.3Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
`tunnel-agent` is an HTTP proxy tunneling agent. Affected versions of the package are vulnerable to Uninitialized Memory Exposure.
A possible memory disclosure vulnerability exists when a value of type `number` is used to set the proxy.auth option of a request, resulting in possible uninitialized memory exposure in the request body.
This is a result of unobstructed use of the `Buffer` constructor, whose insecure default constructor increases the odds of memory leakage.
Details
Constructing a Buffer
class with integer N
creates a Buffer
of length N
with raw (not "zero-ed") memory.
In the following example, the first call would allocate 100 bytes of memory, while the second example will allocate the memory needed for the string "100":
// uninitialized Buffer of length 100
x = new Buffer(100);
// initialized Buffer with value of '100'
x = new Buffer('100');
tunnel-agent
's request
construction uses the default Buffer
constructor as-is, making it easy to append uninitialized memory to an existing list. If the value of the buffer list is exposed to users, it may expose raw server side memory, potentially holding secrets, private data and code. This is a similar vulnerability to the infamous Heartbleed
flaw in OpenSSL.
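As an aside (not part of the original advisory), later Node.js releases expose explicit allocation APIs that avoid this pitfall:
// Explicit, intention-revealing alternatives to the legacy Buffer(number) constructor.
const zeroed = Buffer.alloc(100);        // 100 zero-filled bytes
const fromStr = Buffer.from('100');      // the bytes of the string '100'
const raw = Buffer.allocUnsafe(100);     // uninitialized memory, now an explicit opt-in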
Proof of concept by ChALkeR
require('request')({
method: 'GET',
uri: 'http://www.example.com',
tunnel: true,
proxy:{
protocol: 'http:',
host:"127.0.0.1",
port:8080,
auth:80
}
});
You can read more about the insecure Buffer
behavior on our blog.
Similar vulnerabilities were discovered in request, mongoose, ws and sequelize.
Remediation
Upgrade tunnel-agent
to version 0.6.0 or higher.
Note: This is vulnerable only for Node <= 4.
References
medium severity
- Vulnerable module: chownr
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › chownr@1.0.1Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › chownr@1.0.1Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
chownr is a package that takes the same arguments as `fs.chown()`.
Affected versions of this package are vulnerable to Time of Check Time of Use (TOCTOU) attacks.
It does not dereference symbolic links and changes the owner of the link, which can trick it into descending into unintended trees if a non-symlink is replaced by a symlink at a critical moment:
fs.lstat(pathChild, function(er, stats) {
  if (er)
    return cb(er)
  if (!stats.isSymbolicLink())
    chownr(pathChild, uid, gid, then)
})
Remediation
Upgrade chownr
to version 1.1.0 or higher.
References
low severity
- Vulnerable module: debug
- Introduced through: mocha@2.5.3
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3 › debug@2.2.0Remediation: Upgrade to js-data-repo-tools@1.0.0.
Overview
`debug` is a JavaScript debugging utility modelled after Node.js core's debugging technique.
`debug` uses printf-style formatting. Affected versions of this package are vulnerable to Regular expression Denial of Service (ReDoS) attacks via the `%o` formatter (pretty-print an object all on a single line). It used a regular expression (`/\s*\n\s*/g`) to strip whitespace and replace newlines with spaces, in order to join the data into a single line. This can cause a low impact of about 2 seconds matching time for data 50k characters long.
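The quadratic behaviour of that whitespace-stripping pattern can be illustrated directly (a sketch; exact timings vary by Node.js version and machine):
// Many whitespace characters with no newline force heavy backtracking in /\s*\n\s*/g.
const payload = ' '.repeat(50000);
console.time('whitespace-strip');
payload.replace(/\s*\n\s*/g, ' ');
console.timeEnd('whitespace-strip');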
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade debug
to version 2.6.9, 3.1.0 or higher.
References
low severity
- Vulnerable module: eslint
- Introduced through: standard@6.0.8
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › standard@6.0.8 › eslint@2.2.0Remediation: Upgrade to standard@11.0.0.
Overview
eslint is a pluggable linting utility for JavaScript and JSX
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). This can cause an impact of about 10 seconds matching time for data 100k characters long.
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A`: The string must start with the letter 'A'.
- `(B|C+)+`: The string must then follow the letter 'A' with either the letter 'B' or some number of occurrences of the letter 'C' (the inner `+` matches one or more C's). The `+` at the end of this group states that we can look for one or more matches of the whole group.
- `D`: Finally, we ensure this section of the string ends with a 'D'.

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
Testing a valid 30-character string takes around 52ms, but when given an invalid string the test takes nearly two seconds to complete, over ten times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C.
The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations cause the regex engine to work very slowly (exponentially related to input size, as shown above), which an attacker can exploit to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade eslint
to version 4.18.2 or higher.
References
low severity
- Vulnerable module: minimist
- Introduced through: mocha@2.5.3
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3 › mkdirp@0.5.1 › minimist@0.0.8Remediation: Upgrade to js-data-repo-tools@1.0.0.
Overview
minimist is a parse argument options module.
Affected versions of this package are vulnerable to Prototype Pollution due to a missing handler to Function.prototype
.
Note: this is a bypass to CVE-2020-7598
PoC by Snyk
require('minimist')('--_.constructor.constructor.prototype.foo bar'.split(' '));
console.log((function(){}).foo); // bar
Details
Prototype Pollution is a vulnerability affecting JavaScript. Prototype Pollution refers to the ability to inject properties into existing JavaScript language construct prototypes, such as objects. JavaScript allows all Object attributes to be altered, including their magical attributes such as `__proto__`, `constructor` and `prototype`. An attacker manipulates these attributes to overwrite, or pollute, a JavaScript application object prototype of the base object by injecting other values. Properties on the `Object.prototype` are then inherited by all the JavaScript objects through the prototype chain. When that happens, this leads to either denial of service by triggering JavaScript exceptions, or it tampers with the application source code to force the code path that the attacker injects, thereby leading to remote code execution.
There are two main ways in which the pollution of prototypes occurs:
- Unsafe `Object` recursive merge
- Property definition by path
Unsafe Object recursive merge
The logic of a vulnerable recursive merge function follows the following high-level model:
merge (target, source)
foreach property of source
if property exists and is an object on both the target and the source
merge(target[property], source[property])
else
target[property] = source[property]
When the source object contains a property named `__proto__` defined with `Object.defineProperty()`, the condition that checks if the property exists and is an object on both the target and the source passes, and the merge recurses with the target being the prototype of `Object` and the source being the object defined by the attacker. Properties are then copied onto the `Object` prototype.
Clone operations are a special sub-class of unsafe recursive merges, which occur when a recursive merge is conducted on an empty object: `merge({}, source)`.
lodash
and Hoek
are examples of libraries susceptible to recursive merge attacks.
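To make the model above concrete, here is a minimal sketch of a deliberately naive merge (not taken from any particular library), driven by attacker-controlled JSON; note that `JSON.parse` creates `__proto__` as an ordinary own property, which is what lets the recursion reach `Object.prototype`:

// Deliberately naive recursive merge, vulnerable to prototype pollution.
function merge(target, source) {
  for (const key of Object.keys(source)) {
    if (typeof target[key] === 'object' && typeof source[key] === 'object') {
      merge(target[key], source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

// Attacker-controlled input, e.g. a parsed JSON request body.
const payload = JSON.parse('{"__proto__": {"isAdmin": true}}');
merge({}, payload);

console.log({}.isAdmin); // true: every plain object now appears to have isAdmin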
Property definition by path
There are a few JavaScript libraries that use an API to define property values on an object based on a given path. The function that is generally affected has this signature: `theFunction(object, path, value)`. If the attacker can control the value of “path”, they can set it to `__proto__.myValue`. `myValue` is then assigned to the prototype of the class of the object.
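A minimal sketch of that pattern, using a hypothetical `setByPath` helper rather than any specific library's implementation:

// Hypothetical, deliberately unsafe path-based setter (illustrative only).
function setByPath(object, path, value) {
  const keys = path.split('.');
  let current = object;
  for (const key of keys.slice(0, -1)) {
    if (typeof current[key] !== 'object' || current[key] === null) {
      current[key] = {};
    }
    current = current[key]; // for '__proto__' this walks onto Object.prototype
  }
  current[keys[keys.length - 1]] = value;
}

setByPath({}, '__proto__.myValue', 'polluted');
console.log({}.myValue); // 'polluted'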
Types of attacks
There are a few methods by which Prototype Pollution can be manipulated:
Type | Origin | Short description |
---|---|---|
Denial of service (DoS) | Client | This is the most likely attack. DoS occurs when Object holds generic functions that are implicitly called for various operations (for example, toString and valueOf ). The attacker pollutes Object.prototype.someattr and alters its state to an unexpected value such as Int or Object . In this case, the code fails and is likely to cause a denial of service. For example: if an attacker pollutes Object.prototype.toString by defining it as an integer, if the codebase at any point was reliant on someobject.toString() it would fail. |
Remote Code Execution | Client | Remote code execution is generally only possible in cases where the codebase evaluates a specific attribute of an object, and then executes that evaluation. For example: eval(someobject.someattr) . In this case, if the attacker pollutes Object.prototype.someattr they are likely to be able to leverage this in order to execute code. |
Property Injection | Client | The attacker pollutes properties that the codebase relies on for their informative value, including security properties such as cookies or tokens. For example: if a codebase checks privileges for someuser.isAdmin , then when the attacker pollutes Object.prototype.isAdmin and sets it to equal true , they can then achieve admin privileges. |
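As a concrete illustration of the DoS row above (a contrived sketch, not tied to any particular codebase):

// Simulate a pollution primitive reaching Object.prototype.
Object.prototype.toString = 42;

const user = { name: 'alice' };
try {
  user.toString();
} catch (err) {
  console.error('service code fails:', err.message); // user.toString is not a function
}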
Affected environments
The following environments are susceptible to a Prototype Pollution attack:
- Application server
- Web server
How to prevent
- Freeze the prototype: use `Object.freeze(Object.prototype)` (see the sketch after this list).
- Require schema validation of JSON input.
- Avoid using unsafe recursive merge functions.
- Consider using objects without prototypes (for example, `Object.create(null)`), breaking the prototype chain and preventing pollution.
- As a best practice, use `Map` instead of `Object`.
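A minimal sketch of the first, fourth and fifth recommendations above (illustrative only):

// Freeze the prototype so pollution attempts fail (silently, or with a TypeError in strict mode).
Object.freeze(Object.prototype);

// Prototype-less objects: '__proto__' becomes an ordinary own property.
const record = Object.create(null);
record['__proto__'] = 'harmless';

// Map never consults the prototype chain for its entries.
const settings = new Map();
settings.set('__proto__', 'also harmless');
console.log(settings.get('__proto__')); // 'also harmless'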
For more information on this vulnerability type:
Arteau, Olivier. “JavaScript prototype pollution attack in NodeJS application.” GitHub, 26 May 2018
Remediation
Upgrade `minimist` to version 1.2.6 or higher.
References
low severity
- Vulnerable module: ms
- Introduced through: mocha@2.5.3
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › mocha@2.5.3 › debug@2.2.0 › ms@0.7.1Remediation: Upgrade to js-data-repo-tools@0.5.6.
Overview
`ms` is a tiny millisecond conversion utility.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) due to an incomplete fix for the previously reported vulnerability npm:ms:20151024. That fix limited the length of accepted input strings to 10,000 characters, but turned out to be insufficient, making it possible to block the event loop for 0.3 seconds (on a typical laptop) with a specially crafted string passed to the `ms()` function.
Proof of concept
const ms = require('ms');
ms('1'.repeat(9998) + 'Q'); // takes about 0.3s
Note: Snyk's patch for this vulnerability limits input length to 100 characters. This new limit was deemed to be a breaking change by the author. Based on user feedback, we believe the risk of breakage is very low, while the value to your security is much greater, and therefore opted to still capture this change in a patch for earlier versions as well. Whenever patching security issues, we always suggest running tests on your code to validate that nothing has been broken.
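Independently of the patch, callers can apply the same kind of guard themselves before handing untrusted input to `ms()`; a minimal sketch (the 100-character cap mirrors the patch, but any application-appropriate limit works):

const ms = require('ms');

// Cap input length before it reaches the vulnerable regex.
function safeMs(input) {
  if (typeof input !== 'string' || input.length > 100) {
    throw new TypeError('duration string is too long or not a string');
  }
  return ms(input);
}

console.log(safeMs('2 days'));      // 172800000
// safeMs('1'.repeat(9998) + 'Q');  // throws instead of blocking the event loop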
For more information on Regular Expression Denial of Service (ReDoS) attacks, go to our blog.
Disclosure Timeline
- Feb 9th, 2017 - Reported the issue to package owner.
- Feb 11th, 2017 - Issue acknowledged by package owner.
- April 12th, 2017 - Fix PR opened by Snyk Security Team.
- May 15th, 2017 - Vulnerability published.
- May 16th, 2017 - Issue fixed and version `2.0.0` released.
- May 21st, 2017 - Patches released for versions `>=0.7.1, <=1.0.0`.
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A` The string must start with the letter 'A'.
- `(B|C+)+` The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the `+` matches one or more times). The `+` at the end of this section states that we can look for one or more matches of this section.
- `D` Finally, we ensure this section of the string ends with a 'D'.
The expression would match inputs such as `ABBD`, `ABCCCCD`, `ABCBCCCD` and `ACCCCCD`.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
The entire process of testing a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, more than thirty times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C
The engine has to try each of those combinations to see whether any of them matches the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to determine whether the string is valid. These extreme situations cause the regex engine to work very slowly: the number of steps grows exponentially with input size, as shown above. An attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.
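To reproduce this growth locally, a short timing sketch against the same expression (wall-clock times will vary by machine; the step counts above come from a debugger such as RegEx 101):

// Time /A(B|C+)+D/ against non-matching strings with a growing run of C's.
const pattern = /A(B|C+)+D/;

for (const n of [5, 10, 15, 20, 24]) {
  const input = 'A' + 'C'.repeat(n) + 'X'; // trailing 'X' forces a failed match
  const start = process.hrtime.bigint();
  pattern.test(input);
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${n} C's: ${elapsedMs.toFixed(2)} ms`);
}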
Remediation
Upgrade `ms` to version 2.0.0 or higher.
References
low severity
- Vulnerable module: tar
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › tar@2.2.2Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › tar@2.2.2Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10 › node-gyp@3.4.0 › tar@2.2.2Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
tar is a full-featured Tar for Node.js.
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). When stripping the trailing slash from `files` arguments, the `f.replace(/\/+$/, '')` call's performance can degrade exponentially when `f` contains many `/` characters, resulting in ReDoS.
This vulnerability is not likely to be exploitable, as it requires that untrusted input is passed into the `tar.extract()` or `tar.list()` array of entries to parse/extract, which would be unusual.
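Applications that cannot upgrade immediately can also strip trailing slashes without a regular expression before passing untrusted entry names on; a minimal sketch (illustrative only, not the upstream fix):

// Strip trailing '/' characters without a regular expression (linear time).
function stripTrailingSlashes(path) {
  let end = path.length;
  while (end > 0 && path[end - 1] === '/') {
    end--;
  }
  return path.slice(0, end);
}

console.log(stripTrailingSlashes('some/dir////')); // 'some/dir'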
Details
Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.
The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.
Let’s take the following regular expression as an example:
regex = /A(B|C+)+D/
This regular expression accomplishes the following:
- `A` The string must start with the letter 'A'.
- `(B|C+)+` The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the `+` matches one or more times). The `+` at the end of this section states that we can look for one or more matches of this section.
- `D` Finally, we ensure this section of the string ends with a 'D'.
The expression would match inputs such as `ABBD`, `ABCCCCD`, `ABCBCCCD` and `ACCCCCD`.
In most cases, it doesn't take very long for a regex engine to find a match:
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total
$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total
The entire process of testing a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, more than thirty times as long as it took to test a valid string. The dramatic difference is due to the way regular expressions get evaluated.
Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.
Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:
- CCC
- CC+C
- C+CC
- C+C+C
The engine has to try each of those combinations to see whether any of them matches the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.
From there, the number of steps the engine must use to validate a string just continues to grow.
String | Number of C's | Number of steps |
---|---|---|
ACCCX | 3 | 38 |
ACCCCX | 4 | 71 |
ACCCCCX | 5 | 136 |
ACCCCCCCCCCCCCCX | 14 | 65,553 |
By the time the string includes 14 C's, the engine has to take over 65,000 steps just to determine whether the string is valid. These extreme situations cause the regex engine to work very slowly: the number of steps grows exponentially with input size, as shown above. An attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.
Remediation
Upgrade `tar` to version 6.1.4, 5.0.8, 4.4.16 or higher.
References
low severity
- Vulnerable module: npm
- Introduced through: npm-check-updates@2.15.0
Detailed paths
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npm@3.10.10Remediation: Upgrade to npm-check-updates@3.0.0.
-
Introduced through: js-data-repo-tools@0.5.0 › npm-check-updates@2.15.0 › npmi@2.0.1 › npm@3.10.10Remediation: Upgrade to npm-check-updates@3.0.0.
Overview
npm is a package manager for JavaScript.
Affected versions of this package are vulnerable to Unauthorized File Access. It is possible for packages to create symlinks to files outside of the `node_modules` folder through the `bin` field upon installation.
For `npm`, a properly constructed entry in the `package.json` `bin` field would allow a package publisher to create a symlink pointing to arbitrary files on a user's system when the package is installed. This behaviour is possible through install scripts. This vulnerability bypasses a user using the `--ignore-scripts` install option.
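For illustration only, this is the general shape such an entry could take in a hypothetical malicious package (the package name and target path are made up):

{
  "name": "innocuous-looking-package",
  "version": "1.0.0",
  "bin": {
    "innocuous-cli": "../../../../outside-node_modules/target-file"
  }
}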
Remediation
Upgrade `npm` to version 6.13.3 or higher.