Last tested: 01 Aug, 2018

webpack vulnerabilities

Packs CommonJS/AMD modules for the browser. Allows you to split your codebase into multiple bundles, which can be loaded on demand. Supports loaders to preprocess files, e.g. json, jsx, es7, css, less, ... and your custom stuff.

webpack (latest)

Published 27 Jul, 2018

Known vulnerabilities: 1
Vulnerable paths: 2
Dependencies: 419

Time of Check Time of Use (TOCTOU)

medium severity

Detailed paths

  • Introduced through: webpack@4.16.3 > uglifyjs-webpack-plugin@1.2.7 > cacache@10.0.4 > chownr@1.0.1
  • Introduced through: webpack@4.16.3 > watchpack@1.6.0 > chokidar@2.0.4 > fsevents@1.2.4 > node-pre-gyp@0.10.3 > tar@4.4.4 > chownr@1.0.1

Overview

Affected versions of chownr are vulnerable to Time of Check Time of Use (TOCTOU). It does not dereference symbolic links and changes the owner of the link.
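
As a generic illustration of the time-of-check/time-of-use gap (a sketch under assumptions, not chownr's actual implementation; the helper name is invented), the snippet below shows how a path inspected with lstat() can be swapped for a symbolic link before chown() runs, redirecting the ownership change:

// Hypothetical sketch of a TOCTOU window; not chownr's real code.
const fs = require('fs');

function chownIfNotSymlink(path, uid, gid) {
  const stats = fs.lstatSync(path);         // time of check
  if (!stats.isSymbolicLink()) {
    // If an attacker swaps `path` for a symlink in this window, the
    // ownership change follows the link to a target of their choosing.
    fs.chownSync(path, uid, gid);           // time of use: follows symlinks
  }
}

// For comparison, fs.lchownSync() changes the owner of the link itself
// rather than the file it points to.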

Remediation

There is no fixed version for chownr.

Vulnerable versions of webpack

Fixed in 2.1.0-beta.0

Regular Expression Denial of Service (ReDoS)

low severity

Detailed paths

  • Introduced through: webpack@2.0.7-beta > watchpack@0.2.9 > chokidar@1.7.0 > anymatch@1.3.2 > micromatch@2.3.11 > braces@1.8.5

Overview

braces is a Bash-like brace expansion library, implemented in JavaScript.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks. The package uses a regular expression (^\{(,+(?:(\{,+\})*),*|,*(?:(\{,+\})*),+)\}) to detect empty braces, and matching can take about 10 seconds for input around 50,000 characters long.
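
As a rough way to observe the behaviour described above (a hedged sketch; the payload shape is an assumption based on the advisory, and the run may take on the order of 10 seconds), one can time the quoted pattern against a long, nearly matching input such as an opening brace followed by many commas that is never closed:

// Sketch only: time the empty-braces pattern against a crafted input.
const emptyBraces = /^\{(,+(?:(\{,+\})*),*|,*(?:(\{,+\})*),+)\}/;

const payload = '{' + ','.repeat(50000);   // nearly matches, but the brace never closes

console.time('emptyBraces.test');
emptyBraces.test(payload);                 // heavy backtracking before the match fails
console.timeEnd('emptyBraces.test');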

Disclosure Timeline

  • Feb 15th, 2018 - Initial Disclosure to package owner
  • Feb 16th, 2018 - Initial Response from package owner
  • Feb 18th, 2018 - Fix issued
  • Feb 19th, 2018 - Vulnerability published

Details

Denial of Service (DoS) describes a family of attacks, all aimed at making a system inaccessible to its original and legitimate users. There are many types of DoS attacks, ranging from trying to clog the network pipes to the system by generating a large volume of traffic from many machines (a Distributed Denial of Service - DDoS - attack) to sending crafted requests that cause a system to crash or take a disproportional amount of time to process.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Regular expressions are incredibly powerful, but they aren't very intuitive and can ultimately end up making it easy for attackers to take your site down.

Let’s take the following regular expression as an example:

regex = /A(B|C+)+D/

This regular expression accomplishes the following:

  • A The string must start with the letter 'A'
  • (B|C+)+ The string must then follow the letter A with either the letter 'B' or some number of occurrences of the letter 'C' (the + matches one or more times). The + at the end of this section states that we can look for one or more matches of this section.
  • D Finally, we ensure this section of the string ends with a 'D'

The expression would match inputs such as ABBD, ABCCCCD, ABCBCCCD and ACCCCCD.

In most cases, it doesn't take very long for a regex engine to find a match:

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCD")'
0.04s user 0.01s system 95% cpu 0.052 total

$ time node -e '/A(B|C+)+D/.test("ACCCCCCCCCCCCCCCCCCCCCCCCCCCCX")'
1.79s user 0.02s system 99% cpu 1.812 total

Testing the expression against a valid 30-character string takes around 52ms. But when given an invalid string of the same length, the test takes nearly two seconds to complete, roughly 35 times as long as the valid case. The dramatic difference is due to the way regular expressions get evaluated.

Most Regex engines will work very similarly (with minor differences). The engine will match the first possible way to accept the current character and proceed to the next one. If it then fails to match the next one, it will backtrack and see if there was another way to digest the previous character. If it goes too far down the rabbit hole only to find out the string doesn’t match in the end, and if many characters have multiple valid regex paths, the number of backtracking steps can become very large, resulting in what is known as catastrophic backtracking.

Let's look at how our expression runs into this problem, using a shorter string: "ACCCX". While it seems fairly straightforward, there are still four different ways that the engine could match those three C's:

  1. CCC
  2. CC+C
  3. C+CC
  4. C+C+C

The engine has to try each of those combinations to see if any of them potentially match against the expression. When you combine that with the other steps the engine must take, we can use the RegEx 101 debugger to see that the engine has to take a total of 38 steps before it can determine the string doesn't match.

From there, the number of steps the engine must use to validate a string just continues to grow.

String              Number of C's    Number of steps
ACCCX               3                38
ACCCCX              4                71
ACCCCCX             5                136
ACCCCCCCCCCCCCCX    14               65,553

By the time the string includes 14 C's, the engine has to take over 65,000 steps just to see if the string is valid. In these extreme situations the matching time grows exponentially with input size, as shown above; an attacker can exploit this to make the service consume excessive CPU, resulting in a Denial of Service.
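
The growth can be observed locally with a short sketch (wall-clock time rather than the debugger's step counts; exact numbers will vary by machine and regex engine):

// Time the example expression against progressively longer non-matching inputs.
const evil = /A(B|C+)+D/;

for (let n = 20; n <= 28; n++) {
  const input = 'A' + 'C'.repeat(n) + 'X';   // nearly matches, then fails on the trailing 'X'
  const start = Date.now();
  evil.test(input);
  console.log(n + " C's: " + (Date.now() - start) + " ms");   // time roughly doubles per extra C
}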

Remediation

Upgrade braces to version 2.3.1 or higher.

Fixed in 2.0.1-beta

Regular Expression Denial of Service (DoS)

medium severity

Detailed paths

  • Introduced through: webpack@2.0.0-beta > uglify-js@2.5.0

Overview

The parse() function in the uglify-js package prior to version 2.6.0 is vulnerable to regular expression denial of service (ReDoS) attacks when long inputs of certain patterns are processed.
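
A minimal sketch of the exposed surface, assuming build tooling feeds attacker-influenced source through the uglify-js 2.x parser (the source string below is a harmless placeholder; the slow input pattern from the advisory is not reproduced):

// Sketch only: on affected versions, parse() can take disproportionally long
// for crafted long inputs of certain patterns.
var UglifyJS = require('uglify-js');

var source = 'var answer = 40 + 2;';   // placeholder for attacker-influenced source

console.time('UglifyJS.parse');
var ast = UglifyJS.parse(source);
console.timeEnd('UglifyJS.parse');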

This is the same catastrophic backtracking behaviour described in the Details section of the braces vulnerability above: a long, nearly matching input of a certain pattern forces the regular expression engine through an exponential number of backtracking steps, consuming excessive CPU on the event loop.

Remediation

Upgrade uglify-js to version 2.6.0 or greater. If a direct dependency upgrade is not possible, use snyk wizard to patch this vulnerability.

Fixed in 1.5.0

Regular Expression Denial of Service (DoS)

high severity

Detailed paths

  • Introduced through: npm@1.4.15 > glob@4.0.6 > minimatch@1.0.0
  • Introduced through: npm@1.4.15 > minimatch@0.3.0
  • Introduced through: npm@1.4.15 > init-package-json@0.1.2 > glob@4.5.3 > minimatch@2.0.10
  • Introduced through: npm@1.4.15 > read-package-json@1.2.7 > glob@4.5.3 > minimatch@2.0.10
  • Introduced through: npm@1.4.15 > node-gyp@0.13.1 > minimatch@0.4.0
  • Introduced through: npm@1.4.15 > node-gyp@0.13.1 > glob@3.2.11 > minimatch@0.3.0
  • Introduced through: webpack@1.4.15 > watchpack@0.1.3 > chokidar@0.11.1 > readdirp@1.1.0 > minimatch@0.2.14

Overview

minimatch is a minimalistic matching library used for converting glob expressions into JavaScript RegExp objects. Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks.

The Regular expression Denial of Service (ReDoS) is a type of Denial of Service attack. Many regular expression implementations can reach edge cases that cause them to work very slowly (exponentially related to input size). An attacker can exploit this by supplying a specially crafted input that pushes the program into one of these extreme situations, causing the service to excessively consume CPU and resulting in a Denial of Service.

An attacker can provide the minimatch function with a long value that nearly matches the pattern being tested. This causes the regular expression matching to take a long time, occupying the event loop, preventing it from processing other requests and making the server unavailable (a Denial of Service attack).
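
A hedged sketch of the attack shape (the pattern and input below are harmless placeholders; the advisory's actual slow pattern/value pair is not reproduced):

// Sketch only: wherever attacker-controlled strings reach minimatch(), a long
// value that nearly matches the pattern can keep vulnerable versions busy.
var minimatch = require('minimatch');

var userSuppliedPath = process.argv[2] || 'src/index.js';   // placeholder for untrusted input

console.time('minimatch');
var allowed = minimatch(userSuppliedPath, 'src/**/*.js');   // illustrative pattern only
console.timeEnd('minimatch');
console.log('allowed:', allowed);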

You can read more about Regular Expression Denial of Service (ReDoS) on our blog.

Remediation

Upgrade minimatch to version 3.0.2 or greater.

Fixed in 1.4.0-beta1

Insecure Randomness

high severity

Detailed paths

  • Introduced through: webpack@1.3.7 > node-libs-browser@0.3.1 > crypto-browserify@2.1.10

Overview

crypto-browserify is an implementation of Node's crypto module for the browser.

Affected versions of the package are vulnerable to Insecure Randomness because they use the cryptographically insecure Math.random(). This function can produce predictable values and should not be used in security-sensitive contexts.

Details

Computers are deterministic machines, and as such are unable to produce true randomness. Pseudo-Random Number Generators (PRNGs) approximate randomness algorithmically, starting with a seed from which subsequent values are calculated.

There are two types of PRNGs: statistical and cryptographic. Statistical PRNGs provide useful statistical properties, but their output is highly predictable and forms an easy to reproduce numeric stream that is unsuitable for use in cases where security depends on generated values being unpredictable. Cryptographic PRNGs address this problem by generating output that is more difficult to predict. For a value to be cryptographically secure, it must be impossible or highly improbable for an attacker to distinguish between it and a truly random value. In general, if a PRNG algorithm is not advertised as being cryptographically secure, then it is probably a statistical PRNG and should not be used in security-sensitive contexts.
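
A brief sketch of the practical difference in Node (a generic illustration, not crypto-browserify's internals):

// Statistical PRNG: predictable output, never suitable for keys, tokens or IVs.
const weak = Math.floor(Math.random() * 256);

// Cryptographic PRNG: drawn from the platform CSPRNG via Node's crypto module.
const crypto = require('crypto');
const strong = crypto.randomBytes(16);

console.log(weak, strong.toString('hex'));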

You can read more about node's insecure Math.random() in Mike Malone's post.

Remediation

Upgrade crypto-browserify to version 2.1.11 or higher.

Fixed in 1.1.0-beta9

Uninitialized Memory Exposure

medium severity

Detailed paths

  • Introduced through: webpack@1.1.0-beta8 > node-libs-browser@0.1.2 > http-browserify@0.1.6 > concat-stream@0.0.8

Overview

concat-stream is a writable stream that concatenates strings or binary data and calls a callback with the result. Affected versions of the package are vulnerable to Uninitialized Memory Exposure.

A possible memory disclosure vulnerability exists when a value of type number is provided to the stringConcat() method and results in concatenation of uninitialized memory to the stream collection.

This is a result of unguarded use of the Buffer constructor, whose insecure default behaviour when given a number increases the odds of memory leakage.

Details

Constructing a Buffer with an integer N creates a Buffer of length N containing raw (not zeroed-out) memory.

In the following example, the first call allocates 100 bytes of uninitialized memory, while the second allocates only the memory needed for the string "100":

// uninitialized Buffer of length 100
x = new Buffer(100);
// initialized Buffer with value of '100'
x = new Buffer('100');

concat-stream's stringConcat function uses the default Buffer constructor as-is, making it easy to append uninitialized memory to an existing list. If the value of the buffer list is exposed to users, it may expose raw server side memory, potentially holding secrets, private data and code. This is a similar vulnerability to the infamous Heartbleed flaw in OpenSSL.
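
A simplified, hedged sketch of that pattern (not concat-stream's exact code) shows how a number slipping into the list of collected parts ends up in the Buffer constructor:

// Simplified sketch of a stringConcat-style join; not concat-stream's real code.
function stringConcat(parts) {
  var buffers = [];
  for (var i = 0; i < parts.length; i++) {
    var p = parts[i];
    buffers.push(Buffer.isBuffer(p) ? p : new Buffer(p));   // a number here allocates
  }                                                          // an uninitialized Buffer
  return Buffer.concat(buffers).toString('utf8');
}

// On Node <= 4, the 1000-byte chunk below is raw, possibly secret-bearing memory.
console.log(stringConcat(['safe', 1000]));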

You can read more about the insecure Buffer behavior on our blog.

Similar vulnerabilities were discovered in request, mongoose, ws and sequelize.

Remediation

Upgrade concat-stream to version 1.5.2 or higher. Note: this is only exploitable on Node <= 4.

Fixed in 0.11.0-beta28

Improper minification of non-boolean comparisons

high severity

Detailed paths

  • Introduced through: webpack@0.11.0-beta27 > uglify-js@2.3.6

Overview

uglify-js is a JavaScript parser, minifier, compressor and beautifier toolkit.

Tom MacWright discovered that UglifyJS versions 2.4.23 and earlier are affected by a vulnerability which allows a specially crafted JavaScript file to have altered functionality after minification. This bug was demonstrated by Yan to allow potentially malicious code to be hidden within secure code, and activated by minification.

Details

In Boolean algebra, De Morgan's laws describe the relationships between conjunctions (&&), disjunctions (||) and negations (!). In JavaScript form, they state that:

 !(a && b) === (!a) || (!b)
 !(a || b) === (!a) && (!b)

However, these laws do not hold when one of the operands is not a boolean.

Vulnerable versions of UglifyJS do not account for this restriction, and erroneously apply the laws to an expression whenever doing so shortens it.

Consider this authentication function:

function isTokenValid(user) {
    var timeLeft =
        !!config && // config object exists
        !!user.token && // user object has a token
        !user.token.invalidated && // token is not explicitly invalidated
        !config.uninitialized && // config is initialized
        !config.ignoreTimestamps && // don't ignore timestamps
        getTimeLeft(user.token.expiry); // > 0 if expiration is in the future

    // The token must not be expired
    return timeLeft > 0;
}

function getTimeLeft(expiry) {
  return expiry - getSystemTime();
}

When minified with a vulnerable version of UglifyJS, it will produce the following insecure output, where a token will never expire:

(Formatted for readability)

function isTokenValid(user) {
    var timeLeft = !(                       // negation
        !config                             // config object does not exist
        || !user.token                      // user object does not have a token
        || user.token.invalidated           // token is explicitly invalidated
        || config.uninitialized             // config isn't initialized
        || config.ignoreTimestamps          // ignore timestamps
        || !getTimeLeft(user.token.expiry)  // > 0 if expiration is in the future
    );
    return timeLeft > 0
}

function getTimeLeft(expiry) {
    return expiry - getSystemTime()
}

Remediation

Upgrade UglifyJS to version 2.4.24 or higher.

Fixed in 0.7.0

Regular Expression Denial of Service (ReDoS)

low severity

Detailed paths

  • Introduced through: webpack@0.6.2 > jade-loader@0.1.11 > jade@1.11.0 > clean-css@3.4.28

Overview

clean-css is a fast and efficient CSS optimizer for the Node.js platform and any modern browser.

Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks, with a matching time of about 10 seconds for input around 70,000 characters long.
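
A minimal sketch of the exposed surface, assuming attacker-influenced CSS reaches the optimizer (the input below is a harmless placeholder; the slow input from the advisory is not reproduced):

// Sketch only: on affected versions, minifying crafted CSS around 70,000
// characters long can take about 10 seconds due to regex backtracking.
var CleanCSS = require('clean-css');

var untrustedCss = 'a{color:red}';   // placeholder for attacker-influenced input

console.time('clean-css');
var output = new CleanCSS().minify(untrustedCss);
console.timeEnd('clean-css');
console.log(output.styles);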

Disclosure Timeline

  • Feb 15th, 2018 - Initial Disclosure to package owner
  • Feb 20th, 2018 - Initial Response from package owner
  • Mar 6th, 2018 - Fix issued
  • Mar 7th, 2018 - Vulnerability published

Details

The mechanics are the same catastrophic backtracking behaviour described in the Details section of the braces vulnerability above: a long, nearly matching input forces the regular expression engine through an exponential number of backtracking steps, consuming excessive CPU.

Remediation

Upgrade clean-css to version 4.1.11 or higher.

Sandbox Bypass

high severity

Detailed paths

  • Introduced through: webpack@0.6.2 > jade-loader@0.1.11 > jade@1.11.0 > constantinople@3.0.2

Overview

constantinople determines whether a JavaScript expression evaluates to a constant (using acorn).

Affected versions of this package are vulnerable to a sandbox bypass, which can lead to arbitrary code execution.
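
A hedged sketch of where the exposure sits, assuming a template compiler (as jade does for attribute values) passes attacker-influenced expressions to constantinople; the expression below is a harmless placeholder:

// Sketch only: on affected versions a crafted "constant" expression can
// escape the evaluation sandbox and execute arbitrary code at compile time.
var isConstant = require('constantinople');

var expression = '"constant " + 1';   // placeholder for an attacker-influenced template expression

if (isConstant(expression)) {
  // toConstant() evaluates the expression; this is where the bypass results
  // in arbitrary code execution.
  var value = isConstant.toConstant(expression);
  console.log(value);
}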

Remediation

Upgrade constantinople to version 3.1.1 or higher.
