
Comparing changes

base repository: ZJONSSON/node-unzipper
base: 724e71d296a6dd0cd00edfb71dffc622bdfba5f7
head repository: ZJONSSON/node-unzipper
compare: ab64d6a38b5f091384334dd7aff283f0a5073878

Commits on May 26, 2020

  1. 11587a4

Commits on Jun 12, 2020

  1. docs(parseOne): last pipe is a write

    The last pipe in the parseOne example must be a createWriteStream, otherwise the example does not make sense.

    Thanks for the great library, works perfectly :)
    vvo authored Jun 12, 2020
    3c94b7e

Commits on Jun 20, 2020

  1. Merge pull request #203 from vvo/patch-1

    docs(parseOne): last pipe is a write
    ZJONSSON authored Jun 20, 2020
    d427d86

Commits on Jun 22, 2020

  1. Merge pull request #197 from pwoldberg/fix-comment

    Get comment from centralDirectory
    ZJONSSON authored Jun 22, 2020
    36add2f
  2. fix: extract from url not working (#195)

    * fix: extract from url not working
    
    * test: extract from url
    vltansky authored Jun 22, 2020
    be3c555
  3. c2d5e09
  4. bump patch

    ZJONSSON committed Jun 22, 2020
    e365abd
  5. f41ea9d
  6. hotfix: remove ES6

    ZJONSSON committed Jun 22, 2020
    37c83b7

Commits on Jul 6, 2020

  1. 82ae9eb

Commits on Jul 18, 2020

  1. Fix default concurrency to 1

    Giovanni Esposito committed Jul 18, 2020
    51d730b

Commits on Jan 10, 2021

  1. eea8cd5

Commits on Feb 7, 2021

  1. Merge pull request #208 from george-norris-salesforce/patch-1

    directory is undefined in docs
    ZJONSSON authored Feb 7, 2021
    29e9142
  2. Merge pull request #211 from mrbabbs/fix-default-concurrency-extract

    Fix default concurrency to 1
    ZJONSSON authored Feb 7, 2021
    7261624
  3. Merge pull request #229 from dergutehirte/master

    Fixed broken unicode checks
    ZJONSSON authored Feb 7, 2021
    fddad0a
  4. Add custom source option for Open (#223)

    * Add Open.custom to provide unzipping from a custom source
    
    * Fix readme code block
    
    * Tweak readme for Open.custom
    
    * Update Open.custom example with Google Cloud Storage
    
    This better explains the use-case for using a custom source.
    jaapvanblaaderen authored Feb 7, 2021
    7f83183

Commits on Sep 9, 2021

  1. abf5dc2

Commits on Sep 10, 2021

  1. bump package version

    mheggeseth committed Sep 10, 2021
    716c220

Commits on Feb 12, 2022

  1. Merge pull request #244 from mheggeseth/fix-eocd-scan

    Ensure Successful ZIP64 Extraction
    ZJONSSON authored Feb 12, 2022
    341f258

Commits on Apr 18, 2023

  1. End stream before closing & Prefer req.destroy() before req.abort() if available

    Fixes tests for NodeJS v15 and higher
    Related to #269, #273
    Durisvk committed Apr 18, 2023
    e3d7c7c

Commits on May 10, 2023

  1. Merge pull request #274 from Durisvk/master

    End stream before closing & Prefer req.destroy() before req.abort() if available
    ZJONSSON authored May 10, 2023
    a32f156
  2. bump version

    ZJONSSON committed May 10, 2023
    ab64d6a
5 changes: 5 additions & 0 deletions .travis.yml
@@ -1,5 +1,10 @@
language: node_js
node_js:
- "lts/*"
- "17"
- "16"
- "15"
- "14"
- "13"
- "12"
- "10"
39 changes: 37 additions & 2 deletions README.md
@@ -145,7 +145,7 @@ Example:
```js
fs.createReadStream('path/to/archive.zip')
.pipe(unzipper.ParseOne())
.pipe(fs.createReadStream('firstFile.txt'));
.pipe(fs.createWriteStream('firstFile.txt'));
```

### Buffering the content of an entry into memory
@@ -220,7 +220,7 @@ Example:
```js
async function main() {
const directory = await unzipper.Open.file('path/to/archive.zip');
console.log('directory', d);
console.log('directory', directory);
return new Promise( (resolve, reject) => {
directory.files[0]
.stream()
@@ -317,6 +317,41 @@ async function main() {
main();
```

### Open.custom(source, [options])
This function can be used to provide a custom source implementation. The source parameter expects a `stream` and a `size` function to be implemented. The size function should return a `Promise` that resolves the total size of the file. The stream function should return a `Readable` stream according to the supplied offset and length parameters.

Example:

```js
// Custom source implementation for reading a zip file from Google Cloud Storage
const { Storage } = require('@google-cloud/storage');

async function main() {
const storage = new Storage();
const bucket = storage.bucket('my-bucket');
const zipFile = bucket.file('my-zip-file.zip');

const customSource = {
stream: function(offset, length) {
return zipFile.createReadStream({
start: offset,
end: length && offset + length
})
},
size: async function() {
const objMetadata = (await zipFile.getMetadata())[0];
return objMetadata.size;
}
};

const directory = await unzipper.Open.custom(customSource);
console.log('directory', directory);
// ...
}

main();
```

### Open.[method].extract()

The directory object returned from `Open.[method]` provides an `extract` method which extracts all the files to a specified `path`, with an optional `concurrency` (default: 1).
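As a quick illustration of the `extract` signature documented above, here is a minimal usage sketch; the archive path and output directory are placeholders, not files from this repository:

```js
const unzipper = require('unzipper');

async function main() {
  const directory = await unzipper.Open.file('path/to/archive.zip');
  // `concurrency` is optional; with this change it defaults to 1 when omitted
  await directory.extract({ path: 'output/path', concurrency: 5 });
}

main();
```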
11 changes: 9 additions & 2 deletions lib/Open/directory.js
@@ -141,11 +141,18 @@ module.exports = function centralDirectory(source, options) {
vars.offsetToStartOfCentralDirectory += startOffset;
}
})
.then(function() {
if (vars.commentLength) return endDir.pull(vars.commentLength).then(function(comment) {
vars.comment = comment.toString('utf8');
});
})
.then(function() {
source.stream(vars.offsetToStartOfCentralDirectory).pipe(records);

vars.extract = function(opts) {
if (!opts || !opts.path) throw new Error('PATH_MISSING');
// make sure path is normalized before using it
opts.path = path.resolve(path.normalize(opts.path));
return vars.files.then(function(files) {
return Promise.map(files, function(entry) {
if (entry.type == 'Directory') return;
@@ -166,7 +173,7 @@ module.exports = function centralDirectory(source, options) {
.on('close',resolve)
.on('error',reject);
});
}, opts.concurrency > 1 ? {concurrency: opts.concurrency || undefined} : undefined);
}, { concurrency: opts.concurrency > 1 ? opts.concurrency : 1 });
});
};

@@ -198,7 +205,7 @@ module.exports = function centralDirectory(source, options) {
return records.pull(vars.fileNameLength).then(function(fileNameBuffer) {
vars.pathBuffer = fileNameBuffer;
vars.path = fileNameBuffer.toString('utf8');
vars.isUnicode = vars.flags & 0x11;
vars.isUnicode = (vars.flags & 0x800) != 0;
return records.pull(vars.extraFieldLength);
})
.then(function(extraField) {
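A minimal usage sketch of the archive-comment support added in this file; the archive path is a placeholder, and the same behaviour is exercised by test/openComment.js further down:

```js
const unzipper = require('unzipper');

unzipper.Open.file('path/to/archive-with-comment.zip')
  .then(function(directory) {
    // `comment` is populated from the end-of-central-directory record when present
    console.log(directory.comment);
  });
```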
4 changes: 4 additions & 0 deletions lib/Open/index.js
@@ -92,6 +92,10 @@ module.exports = {
}
};

return directory(source, options);
},

custom: function(source, options) {
return directory(source, options);
}
};
4 changes: 3 additions & 1 deletion lib/Open/unzip.js
@@ -106,7 +106,9 @@ module.exports = function unzip(source,offset,_password, directoryVars) {
.on('error',function(err) { entry.emit('error',err);})
.pipe(entry)
.on('finish', function() {
if (req.abort)
if(req.destroy)
req.destroy()
else if (req.abort)
req.abort();
else if (req.close)
req.close();
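The cleanup order introduced above can be summarised as a small standalone sketch; the helper name `releaseRequest` is illustrative only, and `req` stands for whatever request-like object the source returned:

```js
function releaseRequest(req) {
  // Prefer destroy() (supported by modern Node.js requests and streams),
  // fall back to the legacy abort(), then to close().
  if (req.destroy) req.destroy();
  else if (req.abort) req.abort();
  else if (req.close) req.close();
}
```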
3 changes: 1 addition & 2 deletions lib/PullStream.js
@@ -78,10 +78,9 @@ PullStream.prototype.stream = function(eof,includeEof) {
}

if (!done) {
if (self.finished && !this.__ended) {
if (self.finished) {
self.removeListener('chunk',pull);
self.emit('error', new Error('FILE_ENDED'));
this.__ended = true;
return;
}

16 changes: 10 additions & 6 deletions lib/parse.js
@@ -26,6 +26,7 @@ function Parse(opts) {

PullStream.call(self, self._opts);
self.on('finish',function() {
self.emit('end');
self.emit('close');
});
self._readRecord().catch(function(e) {
@@ -51,16 +52,19 @@ Parse.prototype._readRecord = function () {
return self._readFile();
}
else if (signature === 0x02014b50) {
self.__ended = true;
self.reachedCD = true;
return self._readCentralDirectoryFileHeader();
}
else if (signature === 0x06054b50) {
return self._readEndOfCentralDirectoryRecord();
}
else if (self.__ended) {
return self.pull(endDirectorySignature).then(function() {
return self._readEndOfCentralDirectoryRecord();
});
else if (self.reachedCD) {
// _readEndOfCentralDirectoryRecord expects the EOCD
// signature to be consumed so set includeEof=true
var includeEof = true;
return self.pull(endDirectorySignature, includeEof).then(function() {
return self._readEndOfCentralDirectoryRecord();
});
}
else
self.emit('error', new Error('invalid signature: 0x' + signature.toString(16)));
@@ -130,7 +134,7 @@ Parse.prototype._readFile = function () {
entry.props.path = fileName;
entry.props.pathBuffer = fileNameBuffer;
entry.props.flags = {
"isUnicode": vars.flags & 0x11
"isUnicode": (vars.flags & 0x800) != 0
};
entry.type = (vars.uncompressedSize === 0 && /[\/\\]$/.test(fileName)) ? 'Directory' : 'File';

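Both `isUnicode` fixes in this changeset test bit 11 of the ZIP general-purpose bit flag (the language-encoding/UTF-8 flag, mask 0x800) rather than the old 0x11 mask, which combined unrelated bits 0 and 4. A standalone sketch of the corrected check (the function name is illustrative):

```js
// Bit 11 (mask 0x800) of the general-purpose bit flag marks UTF-8 encoded
// file names and comments in the ZIP specification.
function isUnicodeEntry(flags) {
  return (flags & 0x800) !== 0;
}

console.log(isUnicodeEntry(0x0800)); // true: UTF-8 encoded name
console.log(isUnicodeEntry(0x0008)); // false: only the data-descriptor bit is set
```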
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "unzipper",
"version": "0.10.11",
"version": "0.10.14",
"description": "Unzip cross-platform streaming API ",
"author": "Evan Oxfeld <eoxfeld@gmail.com>",
"contributors": [
6 changes: 3 additions & 3 deletions test/compressed-crx.js
@@ -56,8 +56,8 @@ test('open methods', function(t) {
var tests = [
{name: 'buffer',args: [buffer]},
{name: 'file', args: [archive]},
{name: 'url', args: [request, 'https://s3.amazonaws.com/unzipper/archive.crx']},
{name: 's3', args: [s3, { Bucket: 'unzipper', Key: 'archive.crx'}]}
// {name: 'url', args: [request, 'https://s3.amazonaws.com/unzipper/archive.crx']},
// {name: 's3', args: [s3, { Bucket: 'unzipper', Key: 'archive.crx'}]}
];

tests.forEach(function(test) {
@@ -75,4 +75,4 @@ test('open methods', function(t) {
});
});
});
});
});
27 changes: 27 additions & 0 deletions test/extractFromUrl.js
@@ -0,0 +1,27 @@
"use strict";

var test = require("tap").test;
var fs = require("fs");
var unzip = require("../");
var os = require("os");
var request = require("request");

test("extract zip from url", function (t) {
var extractPath = os.tmpdir() + "/node-unzip-extract-fromURL"; // Not using path resolve, cause it should be resolved in extract() function
unzip.Open.url(
request,
"https://github.com/h5bp/html5-boilerplate/releases/download/v7.3.0/html5-boilerplate_v7.3.0.zip"
)
.then(function(d) { return d.extract({ path: extractPath }); })
.then(function(d) {
var dirFiles = fs.readdirSync(extractPath);
var isPassing =
dirFiles.length > 10 &&
dirFiles.indexOf("css") > -1 &&
dirFiles.indexOf("index.html") > -1 &&
dirFiles.indexOf("favicon.ico") > -1;

t.equal(isPassing, true);
t.end();
});
});
16 changes: 16 additions & 0 deletions test/openComment.js
@@ -0,0 +1,16 @@
'use strict';

var test = require('tap').test;
var path = require('path');
var unzip = require('../');


test("get comment out of a zip", function (t) {
var archive = path.join(__dirname, '../testData/compressed-comment/archive.zip');

unzip.Open.file(archive)
.then(function(d) {
t.equal('Zipfile has a comment', d.comment);
t.end();
});
});
41 changes: 41 additions & 0 deletions test/openCustom.js
@@ -0,0 +1,41 @@
'use strict';

var test = require('tap').test;
var fs = require('fs');
var path = require('path');
var unzip = require('../unzip');
var Promise = require('bluebird');

test("get content of a single file entry out of a zip", function (t) {
var archive = path.join(__dirname, '../testData/compressed-standard/archive.zip');

var customSource = {
stream: function(offset,length) {
return fs.createReadStream(archive, {start: offset, end: length && offset+length});
},
size: function() {
return new Promise(function(resolve, reject) {
fs.stat(archive, function(err, d) {
if (err)
reject(err);
else
resolve(d.size);
});
});
}
};

return unzip.Open.custom(customSource)
.then(function(d) {
var file = d.files.filter(function(file) {
return file.path == 'file.txt';
})[0];

return file.buffer()
.then(function(str) {
var fileStr = fs.readFileSync(path.join(__dirname, '../testData/compressed-standard/inflated/file.txt'), 'utf8');
t.equal(str.toString(), fileStr);
t.end();
});
});
});
4 changes: 2 additions & 2 deletions test/openS3.js
@@ -17,7 +17,7 @@ s3.headObject = function(params,cb) {
return s3.makeUnauthenticatedRequest('headObject',params,cb);
};

test("get content of a single file entry out of a zip", function (t) {
test("get content of a single file entry out of a zip", { skip: true }, function(t) {
return unzip.Open.s3(s3,{ Bucket: 'unzipper', Key: 'archive.zip' })
.then(function(d) {
var file = d.files.filter(function(file) {
@@ -31,4 +31,4 @@ test("get content of a single file entry out of a zip", function (t) {
t.end();
});
});
});
});
16 changes: 14 additions & 2 deletions test/zip64.js
@@ -5,6 +5,7 @@ var path = require('path');
var unzip = require('../');
var fs = require('fs');
var Stream = require('stream');
var temp = require('temp');

var UNCOMPRESSED_SIZE = 5368709120;
var ZIP64_OFFSET = 72;
@@ -76,8 +77,19 @@ t.test('Parse files from zip64 format correctly', function (t) {
.pipe(unzip.Parse())
.on('entry', function(entry) {
t.same(entry.vars.uncompressedSize, ZIP64_SIZE, 'Parse: File header');
t.end();
});
})
.on('close', function() { t.end(); });
});

t.test('in unzipper.extract', function (t) {
temp.mkdir('node-unzip-', function (err, dirPath) {
if (err) {
throw err;
}
fs.createReadStream(archive)
.pipe(unzip.Extract({ path: dirPath }))
.on('close', function() { t.end(); });
});
});

t.end();
Binary file added testData/compressed-comment/archive.zip
Binary file not shown.
1 change: 1 addition & 0 deletions testData/compressed-comment/inflated/dir/fileInsideDir.txt
@@ -0,0 +1 @@
42
1 change: 1 addition & 0 deletions testData/compressed-comment/inflated/file.txt
@@ -0,0 +1 @@
node.js rocks