@oclif/semantic-release@3.1.5

Vulnerabilities: 4 via 37 paths
Dependencies: 654
Source: npm
Package: @oclif/semantic-release

medium severity

Denial of Service (DoS)

  • Vulnerable module: mem
  • Introduced through: @semantic-release/npm@7.0.5 and semantic-release@17.1.1

Detailed paths

  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 os-locale@2.1.0 mem@1.1.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libcipm@4.0.8 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 os-locale@2.1.0 mem@1.1.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libnpm@3.0.1 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 os-locale@2.1.0 mem@1.1.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 os-locale@2.1.0 mem@1.1.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libcipm@4.0.8 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 os-locale@2.1.0 mem@1.1.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libnpm@3.0.1 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 os-locale@2.1.0 mem@1.1.0

Overview

mem is an optimization used to speed up consecutive function calls by caching the result of calls with identical input.

Affected versions of this package are vulnerable to Denial of Service (DoS). Old results were not removed from the cache, so the cache could grow without bound and cause a memory leak.
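
As a rough illustration of the failure mode (a sketch, not code from mem itself; the memoized lookup and the inputs are hypothetical), a cache that never evicts entries grows by one entry for every distinct input an attacker can supply:

const mem = require('mem'); // affected range: mem < 4.0.0

// Hypothetical expensive lookup, memoized on its argument.
const lookup = mem(id => ({ id, fetchedAt: Date.now() }));

// Every distinct id adds a cache entry that is never removed, so
// attacker-controlled ids make the process's memory grow without bound.
for (let i = 0; i < 1e6; i++) {
  lookup(`user-${i}`);
}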

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its intended and legitimate users.

Unlike other vulnerabilities, DoS attacks usually do not aim at breaching security. Rather, they are focused on making websites and services unavailable to genuine users, resulting in downtime.

One well-known form of Denial of Service is DDoS (Distributed Denial of Service), an attack that attempts to clog the system's network pipes by generating a large volume of traffic from many machines.

When it comes to open source libraries, DoS vulnerabilities allow attackers to trigger such a crash or crippling of the service by exploiting a flaw either in the application code or in the open source libraries it uses.

Two common types of DoS vulnerabilities:

  • High CPU/Memory Consumption - An attacker sends crafted requests that cause the system to take a disproportionate amount of time to process. For example, commons-fileupload:commons-fileupload.

  • Crash - An attacker sends crafted requests that cause the system to crash. For example, the npm ws package.

Remediation

Upgrade mem to version 4.0.0 or higher.
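
Here mem is only reached through npm's bundled dependency tree, so in practice the fix arrives with a newer npm. If you also consume mem directly, a minimal sketch of the fixed behavior, assuming mem's documented maxAge option (fetchUser is a stand-in for any expensive call):

const mem = require('mem'); // >= 4.0.0

// Stand-in for an expensive call.
const fetchUser = id => ({ id });

// With maxAge, mem 4+ actually deletes expired entries instead of only ignoring them,
// so the cache no longer grows indefinitely in long-running processes.
const cachedFetchUser = mem(fetchUser, { maxAge: 60 * 1000 });
console.log(cachedFetchUser('alice'));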

medium severity

Prototype Pollution

  • Vulnerable module: dot-prop
  • Introduced through: @semantic-release/npm@7.0.5 and semantic-release@17.1.1

Detailed paths

  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libnpx@10.2.3 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 lock-verify@2.2.0 @iarna/cli@1.2.0 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libnpx@10.2.3 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libcipm@4.0.8 lock-verify@2.2.0 @iarna/cli@1.2.0 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libnpm@3.0.1 lock-verify@2.2.0 @iarna/cli@1.2.0 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 lock-verify@2.2.0 @iarna/cli@1.2.0 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libcipm@4.0.8 lock-verify@2.2.0 @iarna/cli@1.2.0 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libnpm@3.0.1 lock-verify@2.2.0 @iarna/cli@1.2.0 update-notifier@2.5.0 configstore@3.1.2 dot-prop@4.2.0

Overview

dot-prop is a package to get, set, or delete a property from a nested object using a dot path.

Affected versions of this package are vulnerable to Prototype Pollution. It is possible for a user to modify the prototype of a base object.

PoC by aaron_costello

const dotProp = require('dot-prop');
const object = {};
console.log('Before ' + object.b); // undefined
dotProp.set(object, '__proto__.b', true); // walks through __proto__ and writes onto Object.prototype
console.log('After ' + {}.b); // true: every object now inherits b

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportionate amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing it against a 30-character string takes around 52ms. But when given an invalid string, it takes nearly two seconds to complete the test, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C.

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the Regex101 debugger shows that the engine needs a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's   Number of steps
ACCCX               3               38
ACCCCX              4               71
ACCCCCX             5               136
ACCCCCCCCCCCCCCX    14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme cases can make the regex engine work very slowly (the number of steps grows exponentially with input size, as shown above), allowing an attacker to exploit this and cause the service to consume excessive CPU, resulting in a Denial of Service.
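
For this particular example, the fix is to remove the nested quantifier so that each character can only be consumed in one way. A sketch of the rewrite (verify that the simplified pattern still matches the language you intend before adopting it):

// /A(B|C+)+D/ nests quantifiers, so a failed match backtracks exponentially.
// /A[BC]+D/ accepts the same strings (an 'A', one or more 'B' or 'C' characters, then a 'D')
// but leaves the engine a single way to consume each character, keeping matching linear.
console.time('unambiguous pattern');
console.log(/A[BC]+D/.test('ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX')); // false, returned immediately
console.timeEnd('unambiguous pattern');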

Remediation

Upgrade dot-prop to version 5.1.1 or higher.
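
A quick way to confirm the fix once the upgrade lands, sketched under the assumption that patched versions refuse to traverse __proto__, constructor and prototype path segments:

const dotProp = require('dot-prop'); // >= 5.1.1

const target = {};
dotProp.set(target, '__proto__.polluted', true); // the disallowed segment is ignored in patched versions
console.log({}.polluted); // undefined: Object.prototype was left untouched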

medium severity

Prototype Pollution

  • Vulnerable module: lodash
  • Introduced through: @semantic-release/changelog@5.0.1, @semantic-release/exec@5.0.0 and others

Detailed paths

  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/changelog@5.0.1 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/exec@5.0.0 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/git@9.0.0 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/github@7.0.7 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/github@7.0.7 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/commit-analyzer@8.0.1 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/release-notes-generator@9.0.1 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/commit-analyzer@8.0.1 conventional-commits-parser@3.1.0 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/release-notes-generator@9.0.1 conventional-commits-parser@3.1.0 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/release-notes-generator@9.0.1 conventional-changelog-writer@4.0.17 lodash@4.17.15
    Remediation: Open PR to patch lodash@4.17.15.

Overview

lodash is a modern JavaScript utility library delivering modularity, performance, & extras.

Affected versions of this package are vulnerable to Prototype Pollution. The function zipObjectDeep can be tricked into adding or modifying properties of the Object prototype. These properties will be present on all objects.

PoC

const _ = require('lodash');
_.zipObjectDeep(['__proto__.z'], [123]); // writes z onto Object.prototype
console.log(z); // 123: the bare identifier resolves through the polluted prototype chain of the global object

Details

The Denial of Service and ReDoS background is identical to the Details section of the dot-prop advisory above.

Remediation

There is no fixed version for lodash.
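
With no fixed version to move to, one stop-gap is to freeze Object.prototype at startup so pollution attempts cannot take effect. This is a sketch, not a complete mitigation: freezing the prototype can break code that legitimately extends it, so test before adopting it.

const _ = require('lodash');

// A frozen prototype rejects new properties, so the pollution attempt below has no effect
// (it fails silently in non-strict code; strict-mode writes would throw instead).
Object.freeze(Object.prototype);

_.zipObjectDeep(['__proto__.z'], [123]);
console.log({}.z); // undefined: the prototype was not modified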

medium severity

Prototype Pollution

  • Vulnerable module: yargs-parser
  • Introduced through: @semantic-release/npm@7.0.5 and semantic-release@17.1.1

Detailed paths

  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 yargs-parser@7.0.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libcipm@4.0.8 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 yargs-parser@7.0.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libnpm@3.0.1 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 yargs-parser@7.0.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 yargs-parser@7.0.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libcipm@4.0.8 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 yargs-parser@7.0.0
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libnpm@3.0.1 lock-verify@2.2.0 @iarna/cli@1.2.0 yargs@8.0.2 yargs-parser@7.0.0
  • Introduced through: @oclif/semantic-release@3.1.5 @semantic-release/npm@7.0.5 npm@6.14.5 libnpx@10.2.3 yargs@11.1.1 yargs-parser@9.0.2
  • Introduced through: @oclif/semantic-release@3.1.5 semantic-release@17.1.1 @semantic-release/npm@7.0.5 npm@6.14.5 libnpx@10.2.3 yargs@11.1.1 yargs-parser@9.0.2

Overview

yargs-parser is a mighty option parser used by yargs.

Affected versions of this package are vulnerable to Prototype Pollution. The library could be tricked into adding or modifying properties of Object.prototype using a __proto__ payload.

Our research team checked several attack vectors to verify this vulnerability:

  1. It could be used for privilege escalation.
  2. The library could be used to parse user input received from different sources:
    • terminal emulators
    • system calls from other code bases
    • CLI RPC servers

PoC by Snyk

const parser = require('yargs-parser');
console.log(parser('--foo.__proto__.bar baz')); // the dotted flag walks through __proto__
console.log(({}).bar); // 'baz' in affected versions: Object.prototype.bar was set

Details

The Denial of Service and ReDoS background is identical to the Details section of the dot-prop advisory above.

Remediation

Upgrade yargs-parser to version 5.0.0-security.0, 13.1.2, 15.0.1, 18.1.1 or higher.
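
After upgrading, the same payload should no longer reach Object.prototype. A short sketch to confirm, assuming the patched parser sanitizes __proto__ keys as described in its advisory:

const parser = require('yargs-parser'); // a patched release, e.g. >= 18.1.1

parser('--foo.__proto__.bar baz'); // the __proto__ segment is sanitized in patched versions
console.log({}.bar); // undefined: Object.prototype was not polluted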
