Comparing changes

base repository: isaacs/minipass
base: f55015f024cfaf1a27b595ddcedebd99c38dc189
head repository: isaacs/minipass
compare: af6d2aeaa9254f547675f82cbde18aebf0126960

Commits on Feb 26, 2022

  1. ci: tests and funding

    isaacs committed Feb 26, 2022
    880c1b9
  2. Update license

    Work done by Isaac Z. Schlueter after November 2021 no longer
    copyright npm, Inc.
    isaacs committed Feb 26, 2022
    d183c9b
  3. chore: add copyright year to license

    License Year Bot committed Feb 26, 2022
    2d26b88
  4. 0b9e5ed
  5. drop yallist, use Array instead

    They're fast enough now, saves lots of small object gc
    isaacs committed Feb 26, 2022
    89b823a

Commits on Mar 2, 2022

  1. more comprehensive benchmarks

    isaacs committed Mar 2, 2022
    5652efa
  2. 5fac672

Commits on Jun 8, 2022

  1. performance improvements

    - Get rid of try/finally, that's no longer performant in node 16 and 18
    - Handle OBJECTMODE up front, to avoid excess checks later
    - Remove reference to 'arguments'
    - Pull some common event handling into dedicated methods, to trim down
      generic/spread behavior in emit()
    isaacs committed Jun 8, 2022
    781dd86
  2. 6472384
  3. a72f92f
  4. 18c2a43
  5. 3.2.0

    isaacs committed Jun 8, 2022
    b596282

Commits on Jun 10, 2022

  1. 6d80111
  2. benchmarks for async mode

    isaacs committed Jun 10, 2022
    9523234
  3. 3.2.1

    isaacs committed Jun 10, 2022
    547db29

Commits on Jun 20, 2022

  1. feature: add unpipe(dest) method

    PR-URL: #39
    Credit: @isaacs
    Close: #39
    Reviewed-by: @isaacs
    isaacs committed Jun 20, 2022
    9c5113e
  2. Add pipe(dest, { proxyErrors: true })

    Fix: #12
    
    PR-URL: #37
    Credit: @isaacs
    Close: #37
    Reviewed-by: @isaacs
    isaacs committed Jun 20, 2022
    ed5cbc0
  3. add type definitions

    PR-URL: #38
    Credit: @isaacs
    Close: #38
    Reviewed-by: @isaacs
    isaacs committed Jun 20, 2022
    07dee84
  4. 3.3.0

    isaacs committed Jun 20, 2022
    80662a0
  5. add types to package

    isaacs committed Jun 20, 2022
    5929447
  6. 3.3.1

    isaacs committed Jun 20, 2022
    d65917f
  7. ba370d0
  8. 3.3.2

    isaacs committed Jun 20, 2022
    3802694
  9. 7768aec
  10. 3.3.3

    isaacs committed Jun 20, 2022
    af6d2ae
Showing with 4,783 additions and 3,266 deletions.
  1. +3 −0 .github/FUNDING.yml
  2. +39 −0 .github/workflows/ci.yml
  3. +13 −0 .github/workflows/commit-if-modified.sh
  4. +15 −0 .github/workflows/copyright-year.sh
  5. +37 −0 .github/workflows/isaacs-makework.yml
  6. +16 −0 .github/workflows/package-json-repo.js
  7. +0 −5 .travis.yml
  8. +1 −1 LICENSE
  9. +122 −7 README.md
  10. +31 −0 bench/README.md
  11. +14 −0 bench/impls/README.md
  12. +18 −0 bench/impls/baseline.js
  13. +0 −1 bench/{lib/extend-transform.js → impls/core-extend-transform.js}
  14. +1 −0 bench/impls/core-passthrough.js
  15. +10 −0 bench/impls/extend-minipass-current.js
  16. +2 −3 bench/{lib → impls}/extend-minipass.js
  17. +15 −0 bench/impls/extend-through2.js
  18. +7 −0 bench/impls/minipass-current-async.js
  19. +1 −0 bench/impls/minipass-current.js
  20. +1 −0 bench/impls/minipass-latest.js
  21. +101 −0 bench/impls/push-through.js
  22. +10 −0 bench/impls/through2.js
  23. +13 −0 bench/impls/web-std.js.REMOVED-GH-42157
  24. +297 −0 bench/index.js
  25. +0 −12 bench/lib/extend-through2.js
  26. +0 −12 bench/lib/nullsink.js
  27. +0 −41 bench/lib/numbers.js
  28. +0 −15 bench/lib/timer.js
  29. +184 −0 bench/package-lock.json
  30. +8 −0 bench/package.json
  31. +436 −0 bench/results.json
  32. +211 −0 bench/results.tab
  33. +5 −0 bench/results/baseline-fast-fast-1.json
  34. +5 −0 bench/results/baseline-fast-fast-20.json
  35. +5 −0 bench/results/baseline-fast-mixed-1.json
  36. +5 −0 bench/results/baseline-fast-mixed-20.json
  37. +5 −0 bench/results/baseline-fast-slow-1.json
  38. +5 −0 bench/results/baseline-fast-slow-20.json
  39. +5 −0 bench/results/baseline-slow-fast-1.json
  40. +5 −0 bench/results/baseline-slow-fast-20.json
  41. +5 −0 bench/results/baseline-slow-slow-1.json
  42. +5 −0 bench/results/baseline-slow-slow-20.json
  43. +5 −0 bench/results/core-extend-transform-fast-fast-1.json
  44. +5 −0 bench/results/core-extend-transform-fast-fast-20.json
  45. +5 −0 bench/results/core-extend-transform-fast-mixed-1.json
  46. +5 −0 bench/results/core-extend-transform-fast-mixed-20.json
  47. +5 −0 bench/results/core-extend-transform-fast-slow-1.json
  48. +5 −0 bench/results/core-extend-transform-fast-slow-20.json
  49. +5 −0 bench/results/core-extend-transform-slow-fast-1.json
  50. +5 −0 bench/results/core-extend-transform-slow-fast-20.json
  51. +5 −0 bench/results/core-extend-transform-slow-slow-1.json
  52. +5 −0 bench/results/core-extend-transform-slow-slow-20.json
  53. +5 −0 bench/results/core-passthrough-fast-fast-1.json
  54. +3 −0 bench/results/core-passthrough-fast-fast-100.json
  55. +5 −0 bench/results/core-passthrough-fast-fast-20.json
  56. +5 −0 bench/results/core-passthrough-fast-mixed-1.json
  57. +3 −0 bench/results/core-passthrough-fast-mixed-100.json
  58. +5 −0 bench/results/core-passthrough-fast-mixed-20.json
  59. +5 −0 bench/results/core-passthrough-fast-slow-1.json
  60. +3 −0 bench/results/core-passthrough-fast-slow-100.json
  61. +5 −0 bench/results/core-passthrough-fast-slow-20.json
  62. +3 −0 bench/results/core-passthrough-mixed-1.json
  63. +5 −0 bench/results/core-passthrough-slow-fast-1.json
  64. +5 −0 bench/results/core-passthrough-slow-fast-20.json
  65. +5 −0 bench/results/core-passthrough-slow-slow-1.json
  66. +3 −0 bench/results/core-passthrough-slow-slow-100.json
  67. +5 −0 bench/results/core-passthrough-slow-slow-20.json
  68. +8 −0 bench/results/core-transform-fast-fast-20.json
  69. +20 −0 bench/results/extend-core-passthroughfast-fast-20-obj.json
  70. +8 −0 bench/results/extend-core-transform-fast-fast-20.json
  71. +5 −0 bench/results/extend-minipass-current-fast-fast-1.json
  72. +5 −0 bench/results/extend-minipass-current-fast-fast-20.json
  73. +5 −0 bench/results/extend-minipass-current-fast-mixed-1.json
  74. +5 −0 bench/results/extend-minipass-current-fast-mixed-20.json
  75. +5 −0 bench/results/extend-minipass-current-fast-slow-1.json
  76. +5 −0 bench/results/extend-minipass-current-fast-slow-20.json
  77. +5 −0 bench/results/extend-minipass-current-slow-fast-1.json
  78. +5 −0 bench/results/extend-minipass-current-slow-fast-20.json
  79. +5 −0 bench/results/extend-minipass-current-slow-slow-1.json
  80. +5 −0 bench/results/extend-minipass-current-slow-slow-20.json
  81. +5 −0 bench/results/extend-minipass-fast-fast-1.json
  82. +5 −0 bench/results/extend-minipass-fast-fast-20.json
  83. +5 −0 bench/results/extend-minipass-fast-mixed-1.json
  84. +5 −0 bench/results/extend-minipass-fast-mixed-20.json
  85. +5 −0 bench/results/extend-minipass-fast-slow-1.json
  86. +5 −0 bench/results/extend-minipass-fast-slow-20.json
  87. +5 −0 bench/results/extend-minipass-slow-fast-1.json
  88. +5 −0 bench/results/extend-minipass-slow-fast-20.json
  89. +5 −0 bench/results/extend-minipass-slow-slow-1.json
  90. +5 −0 bench/results/extend-minipass-slow-slow-20.json
  91. +5 −0 bench/results/extend-through2-fast-fast-1.json
  92. +5 −0 bench/results/extend-through2-fast-fast-20.json
  93. +5 −0 bench/results/extend-through2-fast-mixed-1.json
  94. +5 −0 bench/results/extend-through2-fast-mixed-20.json
  95. +5 −0 bench/results/extend-through2-fast-slow-1.json
  96. +5 −0 bench/results/extend-through2-fast-slow-20.json
  97. +5 −0 bench/results/extend-through2-slow-fast-1.json
  98. +5 −0 bench/results/extend-through2-slow-fast-20.json
  99. +5 −0 bench/results/extend-through2-slow-slow-1.json
  100. +5 −0 bench/results/extend-through2-slow-slow-20.json
  101. +5 −0 bench/results/minipass-current-async-fast-fast-1.json
  102. +5 −0 bench/results/minipass-current-async-fast-fast-20.json
  103. +5 −0 bench/results/minipass-current-async-fast-mixed-1.json
  104. +5 −0 bench/results/minipass-current-async-fast-mixed-20.json
  105. +5 −0 bench/results/minipass-current-async-fast-slow-1.json
  106. +5 −0 bench/results/minipass-current-async-fast-slow-20.json
  107. +5 −0 bench/results/minipass-current-async-slow-fast-1.json
  108. +5 −0 bench/results/minipass-current-async-slow-fast-20.json
  109. +5 −0 bench/results/minipass-current-async-slow-slow-1.json
  110. +5 −0 bench/results/minipass-current-async-slow-slow-20.json
  111. +5 −0 bench/results/minipass-current-fast-fast-1.json
  112. +5 −0 bench/results/minipass-current-fast-fast-20.json
  113. +5 −0 bench/results/minipass-current-fast-mixed-1.json
  114. +3 −0 bench/results/minipass-current-fast-mixed-100.json
  115. +5 −0 bench/results/minipass-current-fast-mixed-20.json
  116. +5 −0 bench/results/minipass-current-fast-slow-1.json
  117. +5 −0 bench/results/minipass-current-fast-slow-20.json
  118. +5 −0 bench/results/minipass-current-slow-fast-1.json
  119. +5 −0 bench/results/minipass-current-slow-fast-20.json
  120. +5 −0 bench/results/minipass-current-slow-slow-1.json
  121. +5 −0 bench/results/minipass-current-slow-slow-20.json
  122. +5 −0 bench/results/minipass-latest-fast-fast-1.json
  123. +5 −0 bench/results/minipass-latest-fast-fast-20.json
  124. +5 −0 bench/results/minipass-latest-fast-mixed-1.json
  125. +5 −0 bench/results/minipass-latest-fast-mixed-20.json
  126. +5 −0 bench/results/minipass-latest-fast-slow-1.json
  127. +5 −0 bench/results/minipass-latest-fast-slow-20.json
  128. +5 −0 bench/results/minipass-latest-slow-fast-1.json
  129. +5 −0 bench/results/minipass-latest-slow-fast-20.json
  130. +5 −0 bench/results/minipass-latest-slow-slow-1.json
  131. +5 −0 bench/results/minipass-latest-slow-slow-20.json
  132. +5 −0 bench/results/push-through-fast-fast-1.json
  133. +5 −0 bench/results/push-through-fast-fast-20.json
  134. +5 −0 bench/results/push-through-fast-mixed-1.json
  135. +5 −0 bench/results/push-through-fast-mixed-20.json
  136. +5 −0 bench/results/push-through-fast-slow-1.json
  137. +5 −0 bench/results/push-through-fast-slow-20.json
  138. +5 −0 bench/results/push-through-slow-fast-1.json
  139. +5 −0 bench/results/push-through-slow-fast-20.json
  140. +5 −0 bench/results/push-through-slow-slow-1.json
  141. +5 −0 bench/results/push-through-slow-slow-20.json
  142. +5 −0 bench/results/through2-fast-fast-1.json
  143. +5 −0 bench/results/through2-fast-fast-20.json
  144. +5 −0 bench/results/through2-fast-mixed-1.json
  145. +5 −0 bench/results/through2-fast-mixed-20.json
  146. +5 −0 bench/results/through2-fast-slow-1.json
  147. +5 −0 bench/results/through2-fast-slow-20.json
  148. +5 −0 bench/results/through2-slow-fast-1.json
  149. +5 −0 bench/results/through2-slow-fast-20.json
  150. +5 −0 bench/results/through2-slow-slow-1.json
  151. +5 −0 bench/results/through2-slow-slow-20.json
  152. +149 −0 index.d.ts
  153. +191 −102 index.js
  154. +1,932 −3,061 package-lock.json
  155. +19 −3 package.json
  156. +20 −0 test/async-duplicate-end.js
  157. +89 −0 test/async-stream.js
  158. +3 −1 test/basic.js
  159. +10 −0 test/pipe-ended-stream.js
  160. +82 −0 test/proxy-errors.js
  161. +14 −0 test/readable-emits-immediately.js
  162. +48 −0 test/unpipe.js
  163. +16 −2 test/write-returns-true-when-readable-triggers-flow.js
3 changes: 3 additions & 0 deletions .github/FUNDING.yml
@@ -0,0 +1,3 @@
# These are supported funding model platforms

github: [isaacs]
39 changes: 39 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,39 @@
name: CI

on: [push, pull_request]

jobs:
  build:
    strategy:
      matrix:
        node-version: [12.x, 14.x, 16.x, 17.x]
        platform:
          - os: ubuntu-latest
            shell: bash
          - os: macos-latest
            shell: bash
          - os: windows-latest
            shell: bash
          - os: windows-latest
            shell: powershell
      fail-fast: false

    runs-on: ${{ matrix.platform.os }}
    defaults:
      run:
        shell: ${{ matrix.platform.shell }}

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v1.1.0

      - name: Use Nodejs ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}

      - name: Install dependencies
        run: npm install

      - name: Run Tests
        run: npm test -- -c -t0
13 changes: 13 additions & 0 deletions .github/workflows/commit-if-modified.sh
@@ -0,0 +1,13 @@
#!/usr/bin/env bash
git config --global user.email "$1"
shift
git config --global user.name "$1"
shift
message="$1"
shift
if [ $(git status --porcelain "$@" | egrep '^ M' | wc -l) -gt 0 ]; then
  git add "$@"
  git commit -m "$message"
  git push || git pull --rebase
  git push
fi
15 changes: 15 additions & 0 deletions .github/workflows/copyright-year.sh
@@ -0,0 +1,15 @@
#!/usr/bin/env bash
dir=${1:-$PWD}
dates=($(git log --date=format:%Y --pretty=format:'%ad' --reverse | sort | uniq))
if [ "${#dates[@]}" -eq 1 ]; then
datestr="${dates}"
else
datestr="${dates}-${dates[${#dates[@]}-1]}"
fi

stripDate='s/^((.*)Copyright\b(.*?))((?:,\s*)?(([0-9]{4}\s*-\s*[0-9]{4})|(([0-9]{4},\s*)*[0-9]{4})))(?:,)?\s*(.*)\n$/$1$9\n/g'
addDate='s/^.*Copyright(?:\s*\(c\))? /Copyright \(c\) '$datestr' /g'
for l in $dir/LICENSE*; do
  perl -pi -e "$stripDate" $l
  perl -pi -e "$addDate" $l
done
37 changes: 37 additions & 0 deletions .github/workflows/isaacs-makework.yml
@@ -0,0 +1,37 @@
name: "various tidying up tasks to silence nagging"

on:
  push:
    branches:
      - main
  workflow_dispatch:

jobs:
  makework:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: Use Node.js
        uses: actions/setup-node@v2.1.4
        with:
          node-version: 16.x
      - name: put repo in package.json
        run: node .github/workflows/package-json-repo.js
      - name: check in package.json if modified
        run: |
          bash -x .github/workflows/commit-if-modified.sh \
            "package-json-repo-bot@example.com" \
            "package.json Repo Bot" \
            "chore: add repo to package.json" \
            package.json package-lock.json
      - name: put all dates in license copyright line
        run: bash .github/workflows/copyright-year.sh
      - name: check in licenses if modified
        run: |
          bash .github/workflows/commit-if-modified.sh \
            "license-year-bot@example.com" \
            "License Year Bot" \
            "chore: add copyright year to license" \
            LICENSE*
16 changes: 16 additions & 0 deletions .github/workflows/package-json-repo.js
@@ -0,0 +1,16 @@
#!/usr/bin/env node

const pf = require.resolve(`${process.cwd()}/package.json`)
const pj = require(pf)

if (!pj.repository && process.env.GITHUB_REPOSITORY) {
  const fs = require('fs')
  const server = process.env.GITHUB_SERVER_URL || 'https://github.com'
  const repo = `${server}/${process.env.GITHUB_REPOSITORY}`
  pj.repository = repo
  const json = fs.readFileSync(pf, 'utf8')
  const match = json.match(/^\s*\{[\r\n]+([ \t]*)"/)
  const indent = match[1]
  const output = JSON.stringify(pj, null, indent || 2) + '\n'
  fs.writeFileSync(pf, output)
}
5 changes: 0 additions & 5 deletions .travis.yml

This file was deleted.

2 changes: 1 addition & 1 deletion LICENSE
@@ -1,6 +1,6 @@
The ISC License

Copyright (c) npm, Inc. and Contributors
Copyright (c) 2017-2022 npm, Inc., Isaac Z. Schlueter, and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
129 changes: 122 additions & 7 deletions README.md
@@ -17,9 +17,6 @@ from this stream via `'data'` events or by calling `pipe()` into some other
stream. Calling `read()` requires the buffer to be flattened in some
cases, which requires copying memory.

There is also no `unpipe()` method. Once you start piping, there is no
stopping it!

If you set `objectMode: true` in the options, then whatever is written will
be emitted. Otherwise, it'll do a minimal amount of Buffer copying to
ensure proper Streams semantics when `read(n)` is called.
@@ -63,6 +60,10 @@ some ways superior to) Node.js core streams.
Please read these caveats if you are familiar with node-core streams and
intend to use Minipass streams in your programs.

You can avoid most of these differences entirely (for a very
small performance penalty) by setting `{async: true}` in the
constructor options.

### Timing

Minipass streams are designed to support synchronous use-cases. Thus, data
@@ -82,6 +83,82 @@ This non-deferring approach makes Minipass streams much easier to reason
about, especially in the context of Promises and other flow-control
mechanisms.

Example:

```js
const Minipass = require('minipass')
const stream = new Minipass()
stream.on('data', () => console.log('data event'))
console.log('before write')
stream.write('hello')
console.log('after write')
// output:
// before write
// data event
// after write
```

### Exception: Async Opt-In

If you wish to have a Minipass stream with behavior that more
closely mimics Node.js core streams, you can set the stream in
async mode either by setting `async: true` in the constructor
options, or by setting `stream.async = true` later on.

```js
const Minipass = require('minipass')
const asyncStream = new Minipass({ async: true })
asyncStream.on('data', () => console.log('data event'))
console.log('before write')
asyncStream.write('hello')
console.log('after write')
// output:
// before write
// after write
// data event <-- this is deferred until the next tick
```

Switching _out_ of async mode is unsafe, as it could cause data
corruption, and so is not enabled. Example:

```js
const Minipass = require('minipass')
const stream = new Minipass({ encoding: 'utf8' })
stream.on('data', chunk => console.log(chunk))
stream.async = true
console.log('before writes')
stream.write('hello')
setStreamSyncAgainSomehow(stream) // <-- this doesn't actually exist!
stream.write('world')
console.log('after writes')
// hypothetical output would be:
// before writes
// world
// after writes
// hello
// NOT GOOD!
```

To avoid this problem, once set into async mode, any attempt to
make the stream sync again will be ignored.

```js
const Minipass = require('minipass')
const stream = new Minipass({ encoding: 'utf8' })
stream.on('data', chunk => console.log(chunk))
stream.async = true
console.log('before writes')
stream.write('hello')
stream.async = false // <-- no-op, stream already async
stream.write('world')
console.log('after writes')
// actual output:
// before writes
// after writes
// hello
// world
```

### No High/Low Water Marks

Node.js core streams will optimistically fill up a buffer, returning `true`
@@ -97,6 +174,9 @@ If the data has nowhere to go, then `write()` returns false, and the data
sits in a buffer, to be drained out immediately as soon as anyone consumes
it.

Since nothing is ever buffered unnecessarily, there is much less
copying data, and less bookkeeping about buffer capacity levels.
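
A minimal sketch of that behavior against the documented API (illustrative, not part of the diff above): with nothing consuming the stream, `write()` returns `false` and the chunk just sits in the buffer; attaching a consumer drains it, and later writes flow straight through.

```js
const Minipass = require('minipass')
const mp = new Minipass({ encoding: 'utf8' })

console.log(mp.write('hello'))  // false: no consumer yet, the chunk is buffered
mp.on('data', chunk => console.log('got:', chunk))  // drains the buffered 'hello'
console.log(mp.write('world'))  // true: data now goes straight to the consumer
```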

### Hazards of Buffering (or: Why Minipass Is So Fast)

Since data written to a Minipass stream is immediately written all the way
@@ -181,6 +261,8 @@ moving on to the next entry in an archive parse stream, etc.) then be sure
to call `stream.pause()` on creation, and then `stream.resume()` once you
are ready to respond to the `end` event.

However, this is _usually_ not a problem because:

### Emit `end` When Asked

One hazard of immediately emitting `'end'` is that you may not yet have had
@@ -197,6 +279,18 @@ To prevent calling handlers multiple times who would not expect multiple
ends to occur, all listeners are removed from the `'end'` event whenever it
is emitted.
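
Illustrative sketch (not part of the diff): a listener attached after `'end'` has already been emitted is still called, exactly once.

```js
const Minipass = require('minipass')
const mp = new Minipass({ encoding: 'utf8' })
mp.end('foo')                                // stream ends before any listener exists
mp.on('data', c => console.log('data:', c))  // starts the flow; 'end' fires here, unheard
mp.on('end', () => console.log('end'))       // late listener still gets the event
```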

### Emit `error` When Asked

The most recent error object passed to the `'error'` event is
stored on the stream. If a new `'error'` event handler is added,
and an error was previously emitted, then the event handler will
be called immediately (or on `process.nextTick` in the case of
async streams).

This makes it much more difficult to end up trying to interact
with a broken stream, if the error handler is added after an
error was previously emitted.
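
Illustrative sketch of the stored-error behavior described above (not part of the diff):

```js
const Minipass = require('minipass')
const mp = new Minipass()
mp.on('error', er => console.error('first handler:', er.message))
mp.emit('error', new Error('oops'))  // emitted now, and also stored on the stream
mp.on('error', er => console.error('late handler:', er.message))  // called right away
```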

### Impact of "immediate flow" on Tee-streams

A "tee stream" is a stream piping to multiple destinations:
@@ -221,7 +315,7 @@ src.pipe(dest1) // 'foo' chunk flows to dest1 immediately, and is gone
src.pipe(dest2) // gets nothing!
```

The solution is to create a dedicated tee-stream junction that pipes to
One solution is to create a dedicated tee-stream junction that pipes to
both locations, and then pipe to _that_ instead.

```js
@@ -258,6 +352,13 @@ tee.on('data', handler2)
src.pipe(tee)
```

All of the hazards in this section are avoided by setting `{
async: true }` in the Minipass constructor, or by setting
`stream.async = true` afterwards. Note that this does add some
overhead, so should only be done in cases where you are willing
to lose a bit of performance in order to avoid having to refactor
program logic.

## USAGE

It's a stream! Use it like a stream and it'll most likely do what you
@@ -279,6 +380,10 @@ mp.end('bar')
by default if you write() something other than a string or Buffer at any
point. Setting `objectMode: true` will prevent setting any encoding
value.
* `async` Defaults to `false`. Set to `true` to defer data
emission until next tick. This reduces performance slightly,
but makes Minipass streams use timing behavior closer to Node
core streams. See [Timing](#timing) for more details.

### API

@@ -300,9 +405,19 @@ streams.
from being emitted for empty streams until the stream is resumed.
* `resume()` - Resume the stream. If there's data in the buffer, it is all
discarded. Any buffered events are immediately emitted.
* `pipe(dest)` - Send all output to the stream provided. There is no way
to unpipe. When data is emitted, it is immediately written to any and
all pipe destinations.
* `pipe(dest)` - Send all output to the stream provided. When
data is emitted, it is immediately written to any and all pipe
destinations. (Or written on next tick in `async` mode.)
* `unpipe(dest)` - Stop piping to the destination stream. This
is immediate, meaning that any asynchronously queued data will
_not_ make it to the destination when running in `async` mode.
* `options.end` - Boolean, end the destination stream when
the source stream ends. Default `true`.
* `options.proxyErrors` - Boolean, proxy `error` events from
the source stream to the destination stream. Note that
errors are _not_ proxied after the pipeline terminates,
either due to the source emitting `'end'` or manually
unpiping with `src.unpipe(dest)`. Default `false`.
* `on(ev, fn)`, `emit(ev, fn)` - Minipass streams are EventEmitters. Some
events are given special treatment, however. (See below under "events".)
* `promise()` - Returns a Promise that resolves when the stream emits
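
Putting the `pipe()` options and `unpipe()` documented above together, usage looks roughly like this (illustrative sketch, not part of the diff):

```js
const Minipass = require('minipass')
const src = new Minipass({ encoding: 'utf8' })
const dest = new Minipass({ encoding: 'utf8' })
dest.on('data', chunk => console.log('dest got:', chunk))
dest.on('error', er => console.error('dest error:', er.message))

// keep dest open when src ends, and forward src 'error' events to dest
src.pipe(dest, { end: false, proxyErrors: true })
src.write('hello')   // delivered to dest immediately (sync mode)

src.unpipe(dest)     // stop piping; error proxying stops too
src.write('world')   // no longer reaches dest
```
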
31 changes: 31 additions & 0 deletions bench/README.md

@@ -0,0 +1,31 @@
Stream Benchmarks

Run `node index.js` to benchmark all the different stream implementations
with the following configurations:

- `implementation` One of the implementations in `./impls`
- `case` How is the data emitted and how is it consumed?
- `fast-fast` Source emits data as fast as possible, destination
consumes it immediately.
- `fast-slow` Source emits data as fast as possible, destination
consumes one chunk per Promise cycle.
- `slow-fast` Source emits one data per Promise cycle, destination
consumes it immediately.
- `slow-slow` Source emits one data per Promise cycle, destination
consumes one chunk per Promise cycle.
- `fast-mixed` Source emits data as fast as possible, data is piped to
one fast destination stream and one slow destination stream.
- `pipeline` How many instances of the tested implementation are piped
together between the source and destination? Tested with `1` and `20` by
default.
- `type` What kind of data is written?
- `defaults` a buffer
- `str` a string
- `obj` the object `{i: 'object'}`

Results are written to the `./results` folder for each test case, and to
`results.json` and `results.tab`.

See [this google
sheet](https://docs.google.com/spreadsheets/d/1K_HR5oh3r80b8WVMWCPPjfuWXUgfkmhlX7FGI6JJ8tY/edit?usp=sharing)
for analysis and comparisons.
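
For a rough idea of what the `pipeline` setting means, here is an illustrative sketch (not the actual `bench/index.js`) that chains N instances between a fast source and a fast consumer:

```js
const Minipass = require('minipass')

// chain n passthrough instances head -> ... -> tail
const buildChain = n => {
  const head = new Minipass()
  let tail = head
  for (let i = 0; i < n; i++) {
    const next = new Minipass()
    tail.pipe(next)
    tail = next
  }
  return { head, tail }
}

const { head, tail } = buildChain(20)
tail.on('data', () => {})          // "fast" consumer: discard chunks immediately
head.write(Buffer.from('chunk'))   // "fast" source: write as quickly as possible
head.end()
```
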
14 changes: 14 additions & 0 deletions bench/impls/README.md
@@ -0,0 +1,14 @@
To add a new stream type to test, create a file here that exports a
function or class that can be called with `s = new Class(options)`, where
`options` can include `{encoding: 'utf8'}` or `{objectMode: true}`.

The returned object must implement at minimum the following subset of the
stream interface:

* `on(event, fn)` call `fn` when `event` happens
* `write(data)` where `data` will be an object, string, or buffer. Return
`true` if more data should be written, `false` otherwise. Emit `drain`
when ready for more data if `false` is written.
* `end()` no further data will be written, emit `'finish'` when all data
processed
* `pipe(dest)` pipe all output to `dest` stream
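
A minimal sketch of an implementation file that satisfies this interface (the file name and class name are hypothetical, not from the repository):

```js
// e.g. bench/impls/trivial-passthrough.js (hypothetical)
const EE = require('events')

class TrivialPassThrough extends EE {
  constructor (options = {}) {
    super()
    this.options = options  // { encoding } or { objectMode } accepted, unused here
  }
  write (data) {
    this.emit('data', data) // pass every chunk straight through
    return true             // always ready for more, so 'drain' is never needed
  }
  end (data) {
    if (data !== undefined) this.write(data)
    this.emit('end')
    this.emit('finish')     // all data has been processed
  }
  pipe (dest) {
    this.on('data', d => dest.write(d))
    this.on('end', () => dest.end())
    return dest
  }
}

module.exports = TrivialPassThrough
```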