
Comparing changes

base repository: mongodb/node-mongodb-native
base: 99681e1f2846d17c4b0e2df804950839ae7a17fe
head repository: mongodb/node-mongodb-native
compare: dfb03ad5f48ab1ebdb9cae7e93f4cc54ef9e744e

Commits on Jan 24, 2020

  1. d4a51a4
  2. 41ee8e0

Commits on Jan 29, 2020

  1. Merge branch '3.5'

    mbroadst committed Jan 29, 2020 (ffe92a9)

Commits on Feb 2, 2020

  1. 2030a69
  2. 5f04660
  3. 7bbad9c
  4. feat: expand use of error labels for retryable writes

    The work here encompasses changes to the retryable writes logic of
    the driver, such that the `RetryableWriteError` label becomes the
    primary means of determining whether an operation will be retried.
    4.4+ servers will attach this label server-side, so this change
    allows us to gracefully remove client-side checking of retryable
    write errors.
    
    NODE-2379
    mbroadst committed Feb 2, 2020 (c775a4a)
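
    The following is a minimal sketch, not code from this commit, of how the
    `RetryableWriteError` label can drive a retry decision. `MongoError#hasErrorLabel`
    is part of the driver's public API; the surrounding retry helper is an assumption
    for illustration only.

```js
const { MongoError } = require('mongodb');

// Sketch: retry a write operation once if the failure carries the
// RetryableWriteError label (attached server-side on MongoDB 4.4+).
async function writeWithRetry(operation) {
  try {
    return await operation();
  } catch (err) {
    if (err instanceof MongoError && err.hasErrorLabel('RetryableWriteError')) {
      return operation(); // single retry, per the retryable writes behavior described above
    }
    throw err;
  }
}

// Usage (assumes `collection` is an open Collection):
// await writeWithRetry(() => collection.insertOne({ a: 1 }));
```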

Commits on Feb 5, 2020

  1. bb4689e
  2. 9b80c24
  3. 6449f04
  4. test: force use of a single host for sharded topology tests

    We currently expect sharded tests to run against a single mongos, but the
    driver does not correctly discover whether a single host passed in is a
    mongos. The only way to do this today is to specify the host multiple
    times. As a side effect, we need to deduplicate the seed list in the
    legacy Mongos topology type.
    mbroadst committed Feb 5, 2020 (c1bff29)
  5. a89d491
  6. 79f4c65
  7. fefc165
  8. fix: only consider MongoError subclasses for retryability

    The change to use the `RetryableWriteError` label to determine whether a
    write should be retried overlooked that non-MongoError errors can also be
    caught. Those errors should be ignored for the purposes of retryability.
    mbroadst committed Feb 5, 2020 (265fe40)
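
    A short hedged sketch of the guard this fix describes: only `MongoError`
    instances are candidates for retry; any other thrown value (for example a
    `TypeError` from user code) is never retried.

```js
const { MongoError } = require('mongodb');

// Sketch only: non-MongoError values must never be considered retryable.
function isRetryableWrite(err) {
  return err instanceof MongoError && err.hasErrorLabel('RetryableWriteError');
}
```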

Commits on Feb 6, 2020

  1. Merge branch '3.5'

    mbroadst committed Feb 6, 2020 (68170da)

Commits on Feb 10, 2020

  1. feat: support shorter SCRAM conversations

    MongoDB 4.4+ supports removing an unnecessary empty exchange during the
    SCRAM handshake.

    NODE-2301
    mbroadst committed Feb 10, 2020 (6b9ff05)
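
    As a rough illustration of the shorter conversation (an assumption based on the
    commit description, not the driver's internal code), a client can opt in by
    sending an `options` document with `saslStart`; the payload below is a placeholder.

```js
// Placeholder client-first message; a real driver builds this per the SCRAM RFC.
const clientFirstPayload = Buffer.from('n,,n=app_user,r=<client-nonce>');

const saslStartCommand = {
  saslStart: 1,
  mechanism: 'SCRAM-SHA-256',
  payload: clientFirstPayload,
  // Requests that a 4.4+ server skip the final empty exchange
  // (assumption: older servers ignore this option).
  options: { skipEmptyExchange: true }
};

console.log(saslStartCommand);
```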

Commits on Feb 11, 2020

  1. 292fe08
  2. faee15b
  3. dbc0b37
  4. a110ee4
  5. e855c83
  6. c1ed2c1
  7. 637f428
  8. refactor: begin to provide formal specs for test operations

    Until now, test operations have been built dynamically by key-checking the
    arguments object passed in. This approach works but is very brittle. This
    commit introduces a more formal specification for resolving test
    operations, to which all operations will eventually be migrated.
    mbroadst committed Feb 11, 2020 (62b39a0)
  9. 11f8792
  10. 8b8a20c
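
    Purely as an illustration of the "formal spec" idea in 62b39a0 (the names below
    are invented for this sketch and are not the test runner's actual API), a
    declarative operation table might replace ad-hoc key checking of the arguments
    object:

```js
// Hypothetical operation specs: each entry states explicitly how to run the operation,
// instead of inferring behavior from whichever keys happen to be present.
const OPERATIONS = {
  insertOne: (collection, args) => collection.insertOne(args.document, args.options),
  replaceOne: (collection, args) =>
    collection.replaceOne(args.filter, args.replacement, args.options),
  findOneAndUpdate: (collection, args) =>
    collection.findOneAndUpdate(args.filter, args.update, args.options)
};

function runOperation(collection, { name, arguments: args }) {
  const spec = OPERATIONS[name];
  if (!spec) throw new Error(`unsupported test operation: ${name}`);
  return spec(collection, args || {});
}
```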

Commits on Feb 13, 2020

  1. 24155e7
  2. 6d3f313
  3. 979d41e
  4. acdb648

Commits on Feb 21, 2020

  1. fix(sdam): use ObjectId comparison to track maxElectionId

    The code for tracking the `maxElectionId` currently assumes the id is
    represented in extended JSON. This fix modifies the test runner to parse
    the extended JSON into BSON, and modifies the comparison logic to assume
    ObjectId (a comparison sketch follows this list).

    NODE-2464
    mbroadst committed Feb 21, 2020 (db991d6)
  2. b98a00a
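
    A small sketch, assuming the bson `ObjectId` type, of comparing election ids as
    ObjectIds rather than as extended-JSON strings; this is illustrative and not the
    driver's internal implementation.

```js
const { ObjectId } = require('mongodb');

// Compare two ObjectIds (or null) like a standard comparator.
// Equal-length lowercase hex strings order the same way as the underlying bytes.
function compareObjectId(a, b) {
  if (a == null && b == null) return 0;
  if (a == null) return -1;
  if (b == null) return 1;
  const [x, y] = [a.toString(), b.toString()];
  return x < y ? -1 : x > y ? 1 : 0;
}

let maxElectionId = new ObjectId('000000000000000000000001');
const incoming = new ObjectId('000000000000000000000002');
if (compareObjectId(incoming, maxElectionId) > 0) {
  maxElectionId = incoming; // a newer election was observed
}
```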

Commits on Feb 26, 2020

  1. 7f3cfba
  2. 2d607fa
  3. Merge branch '3.5'

    mbroadst committed Feb 26, 2020 (c4add7b)
  4. e6dc1f4
  5. 69d10ba

Commits on Mar 4, 2020

  1. 96e5426

Commits on Mar 10, 2020

  1. 814e278

Commits on Mar 11, 2020

  1. fc3378c
  2. Merge branch '3.5' into 3.6

    mbroadst committed Mar 11, 2020 (787f54f)

Commits on Mar 19, 2020

  1. docs(examples): correct replaceOne usage

    Use the correct parameters in the replaceOne usage example (see the
    call-shape sketch after this list).

    NODE-2502
    nbbeeken authored Mar 19, 2020 (3b0a23f)
  2. docs: fix typo in aggregate

    Correct the collation argument description in aggregate for clarity.

    NODE-2416
    nbbeeken authored Mar 19, 2020 (52eaf53)
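
    For context on the two documentation fixes above, a hedged example of the
    corrected call shapes (collection and field names are illustrative):
    `replaceOne(filter, replacement, options)` takes a full replacement document
    with no update operators, and `aggregate` takes `collation` in its options
    object.

```js
// Assumes `collection` is an open Collection obtained from a connected MongoClient.
async function docsExamples(collection) {
  // replaceOne: the second argument is a whole replacement document, not a $set update.
  await collection.replaceOne(
    { name: 'Ada' },
    { name: 'Ada', team: 'drivers' },
    { upsert: true }
  );

  // aggregate: collation belongs in the options object, not inside the pipeline.
  const cursor = collection.aggregate(
    [{ $match: { team: 'drivers' } }, { $group: { _id: '$team', count: { $sum: 1 } } }],
    { collation: { locale: 'en_US', strength: 2 } }
  );
  return cursor.toArray();
}
```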

Commits on Mar 25, 2020

  1. Merge branch '3.5' into 3.6

    mbroadst committed Mar 25, 2020 (7da3354)

Commits on Mar 27, 2020

  1. chore: remove exotic build configurations

    By default the driver doesn't build any native code that would require
    testing on these platforms; we only need that coverage in dependencies
    such as `kerberos`, `snappy`, and `bson-ext`.
    mbroadst committed Mar 27, 2020 (40862b5)

Commits on Mar 30, 2020

  1. f8a7c63

Commits on Apr 3, 2020

  1. Merge branch '3.5' into 3.6

    mbroadst committed Apr 3, 2020 (a5acfe0)

Commits on Apr 6, 2020

  1. feat: support creating collections and indexes in transactions

    MongoDB 4.4 now supports creating collections and indexes within
    transactions. This patch adds that support, along with spec tests to
    validate the behavior (a sketch follows this commit list).

    NODE-2295
    mbroadst committed Apr 6, 2020 (17e4c88)
  2. 94d25c3
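
A minimal sketch of what 17e4c88 enables against MongoDB 4.4+; database and collection names are placeholders and error handling is kept short.

```js
// Requires a replica set or sharded cluster on MongoDB 4.4+ (DDL inside transactions).
async function createInTransaction(client) {
  const session = client.startSession();
  try {
    await session.withTransaction(async () => {
      const db = client.db('app');
      const events = await db.createCollection('events', { session });
      await events.createIndex({ createdAt: 1 }, { session });
      await events.insertOne({ createdAt: new Date() }, { session });
    });
  } finally {
    session.endSession();
  }
}
```
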
Showing 858 changed files with 62,452 additions and 14,843 deletions.
1 change: 0 additions & 1 deletion .coveralls.yml

This file was deleted.

32 changes: 0 additions & 32 deletions .eslintrc

This file was deleted.

29 changes: 29 additions & 0 deletions .eslintrc.json
@@ -0,0 +1,29 @@
{
"root": true,
"parserOptions": {
"ecmaVersion": 2017
},
"plugins": [
"prettier",
"es"
],
"extends": [
"eslint:recommended",
"plugin:prettier/recommended"
],
"env": {
"node": true,
"mocha": true,
"es6": true
},
"rules": {
"prettier/prettier": "error",
"es/no-destructuring": "error",
"es/no-rest-spread-properties": "error",
"es/no-spread-elements": "error",
"no-console": "error",
"eqeqeq": ["error", "always", { "null": "ignore" }],
"strict": ["error", "global"]
},
"ignorePatterns": ["test/benchmarks/*.js", "test/examples/*.js"]
}
1,144 changes: 1,063 additions & 81 deletions .evergreen/config.yml

Large diffs are not rendered by default.

382 changes: 351 additions & 31 deletions .evergreen/config.yml.in

Large diffs are not rendered by default.

332 changes: 300 additions & 32 deletions .evergreen/generate_evergreen_tasks.js

Large diffs are not rendered by default.

84 changes: 76 additions & 8 deletions .evergreen/install-dependencies.sh
@@ -2,35 +2,103 @@
# set -o xtrace # Write all commands first to stderr
set -o errexit # Exit the script with error if any of the commands fail

NVM_WINDOWS_URL="https://github.com/coreybutler/nvm-windows/releases/download/1.1.7/nvm-noinstall.zip"
NVM_URL="https://raw.githubusercontent.com/nvm-sh/nvm/v0.35.3/install.sh"

NODE_LTS_NAME=${NODE_LTS_NAME:-carbon}
MSVS_VERSION=${MSVS_VERSION:-2017}
NODE_ARTIFACTS_PATH="${PROJECT_DIRECTORY}/node-artifacts"
NPM_CACHE_DIR="${NODE_ARTIFACTS_PATH}/npm"
NPM_TMP_DIR="${NODE_ARTIFACTS_PATH}/tmp"

# this needs to be explicitly exported for the nvm install below
export NVM_DIR="${NODE_ARTIFACTS_PATH}/nvm"
export XDG_CONFIG_HOME=${NODE_ARTIFACTS_PATH}

# create node artifacts path if needed
mkdir -p ${NODE_ARTIFACTS_PATH}
mkdir -p ${NPM_CACHE_DIR}
mkdir -p "${NPM_TMP_DIR}"

# install Node.js
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.8/install.sh | bash
[ -s "${NVM_DIR}/nvm.sh" ] && \. "${NVM_DIR}/nvm.sh"
nvm install --lts=${NODE_LTS_NAME}
case $NODE_LTS_NAME in
"argon")
VERSION=4
;;
"boron")
VERSION=6
;;
"carbon")
VERSION=8
;;
"dubnium")
VERSION=10
;;
"erbium")
VERSION=12
;;
"fermium")
VERSION=14
;;
*)
echo "Unsupported Node LTS version $1"
exit 1
;;
esac
NODE_VERSION=$(curl --retry 8 --retry-delay 5 --max-time 50 -s -o- \
https://nodejs.org/download/release/latest-v${VERSION}.x/SHASUMS256.txt \
| head -n 1 | awk '{print $2};' | cut -d- -f2)
export NODE_VERSION=${NODE_VERSION:1}

# output node version to expansions file for use in subsequent scripts
cat <<EOT > deps-expansion.yml
NODE_VERSION: "$NODE_VERSION"
EOT

# install Node.js on Windows
if [[ "$OS" == "Windows_NT" ]]; then
# Delete pre-existing node to avoid version conflicts
rm -rf "/cygdrive/c/Program Files/nodejs"

export NVM_HOME=`cygpath -w "$NVM_DIR"`
export NVM_SYMLINK=`cygpath -w "$NODE_ARTIFACTS_PATH/bin"`
export NVM_ARTIFACTS_PATH=`cygpath -w "$NODE_ARTIFACTS_PATH/bin"`
export PATH=`cygpath $NVM_SYMLINK`:`cygpath $NVM_HOME`:$PATH

curl -L $NVM_WINDOWS_URL -o nvm.zip
unzip -d $NVM_DIR nvm.zip
rm nvm.zip

chmod 777 $NVM_DIR
chmod -R a+rx $NVM_DIR

cat <<EOT > $NVM_DIR/settings.txt
root: $NVM_HOME
path: $NVM_SYMLINK
EOT
nvm install $NODE_VERSION
nvm use $NODE_VERSION
which node || echo "node not found, PATH=$PATH"
which npm || echo "npm not found, PATH=$PATH"
npm config set msvs_version ${MSVS_VERSION}
npm config set scripts-prepend-node-path true

# install Node.js on Linux/MacOS
else
curl -o- $NVM_URL | bash
[ -s "${NVM_DIR}/nvm.sh" ] && \. "${NVM_DIR}/nvm.sh"
nvm install --no-progress $NODE_VERSION

# setup npm cache in a local directory
cat <<EOT > .npmrc
devdir=${NPM_CACHE_DIR}/.node-gyp
init-module=${NPM_CACHE_DIR}/.npm-init.js
cache=${NPM_CACHE_DIR}
tmp=${NPM_TMP_DIR}
registry=https://registry.npmjs.org
EOT
fi

# NOTE: registry was overridden to not use artifactory, remove the `registry` line when
# BUILD-6774 is resolved.

# install node dependencies
npm install
npm install --unsafe-perm
2 changes: 1 addition & 1 deletion .evergreen/run-atlas-tests.sh
@@ -7,4 +7,4 @@ NODE_ARTIFACTS_PATH="${PROJECT_DIRECTORY}/node-artifacts"
export NVM_DIR="${NODE_ARTIFACTS_PATH}/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

ATLAS_REPL="$ATLAS_REPL" ATLAS_SHRD="$ATLAS_SHRD" ATLAS_FREE="$ATLAS_FREE" ATLAS_TLS11="$ATLAS_TLS11" ATLAS_TLS12="$ATLAS_TLS12" npm run atlas
npm run atlas
52 changes: 52 additions & 0 deletions .evergreen/run-custom-csfle-tests.sh
@@ -0,0 +1,52 @@
#! /usr/bin/env bash

# Initial checks for running these tests
if [ -z ${AWS_ACCESS_KEY_ID+omitted} ]; then echo "AWS_ACCESS_KEY_ID is unset" && exit 1; fi
if [ -z ${AWS_SECRET_ACCESS_KEY+omitted} ]; then echo "AWS_SECRET_ACCESS_KEY is unset" && exit 1; fi
if [ -z ${CSFLE_KMS_PROVIDERS+omitted} ]; then echo "CSFLE_KMS_PROVIDERS is unset" && exit 1; fi

[ -s "$PROJECT_DIRECTORY/node-artifacts/nvm/nvm.sh" ] && source "$PROJECT_DIRECTORY"/node-artifacts/nvm/nvm.sh

set -o xtrace # Write all commands first to stderr
set -o errexit # Exit the script with error if any of the commands fail

# Environment Variables:
# CSFLE_GIT_REF - set the git reference to checkout for a custom CSFLE version
# CDRIVER_GIT_REF - set the git reference to checkout for a custom CDRIVER version (this is for libbson)

CSFLE_GIT_REF=${CSFLE_GIT_REF:-master}
CDRIVER_GIT_REF=${CDRIVER_GIT_REF:-1.17.4}

rm -rf csfle-deps-tmp
mkdir -p csfle-deps-tmp
pushd csfle-deps-tmp

rm -rf libmongocrypt mongo-c-driver

git clone https://github.com/mongodb/libmongocrypt.git
pushd libmongocrypt
git fetch --tags
git checkout "$CSFLE_GIT_REF" -b csfle-custom
popd # libmongocrypt

git clone https://github.com/mongodb/mongo-c-driver.git
pushd mongo-c-driver
git fetch --tags
git checkout "$CDRIVER_GIT_REF" -b cdriver-custom
popd # mongo-c-driver

pushd libmongocrypt/bindings/node

source ./.evergreen/find_cmake.sh
bash ./etc/build-static.sh

popd # libmongocrypt/bindings/node
popd # csfle-deps-tmp

npm install

cp -r csfle-deps-tmp/libmongocrypt/bindings/node node_modules/mongodb-client-encryption

export MONGODB_UNIFIED_TOPOLOGY=${UNIFIED}
export MONGODB_URI=${MONGODB_URI}
npx mocha test/functional/client_side_encryption
29 changes: 29 additions & 0 deletions .evergreen/run-kerberos-tests.sh
@@ -0,0 +1,29 @@
#!/bin/bash

set -o errexit # Exit the script with error if any of the commands fail

export PROJECT_DIRECTORY="$(pwd)"
NODE_ARTIFACTS_PATH="${PROJECT_DIRECTORY}/node-artifacts"
export NVM_DIR="${NODE_ARTIFACTS_PATH}/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

# set up keytab
mkdir -p "$(pwd)/.evergreen"
touch "$(pwd)/.evergreen/krb5.conf.empty"
export KRB5_CONFIG="$(pwd)/.evergreen/krb5.conf.empty"
echo "Writing keytab"
# DON'T PRINT KEYTAB TO STDOUT
set +o verbose
if [[ "$OSTYPE" == "darwin"* ]]; then
echo ${KRB5_KEYTAB} | base64 -D > "$(pwd)/.evergreen/drivers.keytab"
else
echo ${KRB5_KEYTAB} | base64 -d > "$(pwd)/.evergreen/drivers.keytab"
fi
echo "Running kinit"
kinit -k -t "$(pwd)/.evergreen/drivers.keytab" -p ${KRB5_PRINCIPAL}

npm install kerberos
npm run check:kerberos

# destroy ticket
kdestroy
10 changes: 10 additions & 0 deletions .evergreen/run-ldap-tests.sh
@@ -0,0 +1,10 @@
#!/bin/bash

set -o errexit # Exit the script with error if any of the commands fail

export PROJECT_DIRECTORY="$(pwd)"
NODE_ARTIFACTS_PATH="${PROJECT_DIRECTORY}/node-artifacts"
export NVM_DIR="${NODE_ARTIFACTS_PATH}/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

npm run check:ldap
20 changes: 20 additions & 0 deletions .evergreen/run-mongodb-aws-ecs-test.sh
@@ -0,0 +1,20 @@
#!/bin/bash
set -o xtrace # Write all commands first to stderr
set -o errexit # Exit the script with error if any of the commands fail

MONGODB_URI="$1"
PROJECT_DIRECTORY="$(pwd)/src"

# untar packed archive
cd $PROJECT_DIRECTORY
tar -xzf src.tgz .

# load node.js
set +x
export NVM_DIR="${PROJECT_DIRECTORY}/node-artifacts/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
set -x

# run the tests
npm install aws4
MONGODB_URI=$MONGODB_URI MONGODB_UNIFIED_TOPOLOGY=1 npx mocha test/functional/mongodb_aws.test.js
25 changes: 25 additions & 0 deletions .evergreen/run-mongodb-aws-test.sh
@@ -0,0 +1,25 @@
#!/bin/bash
# set -o xtrace # Write all commands first to stderr
set -o errexit # Exit the script with error if any of the commands fail

MONGODB_URI=${MONGODB_URI:-}

# ensure no secrets are printed in log files
set +x

# load node.js environment
export NVM_DIR="${PROJECT_DIRECTORY}/node-artifacts/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

# the default connection string, may be overridden by the environment script
export MONGODB_URI="mongodb://localhost:27017/aws?authMechanism=MONGODB-AWS"

# load the script
shopt -s expand_aliases # needed for `urlencode` alias
[ -s "$PROJECT_DIRECTORY/prepare_mongodb_aws.sh" ] && source "$PROJECT_DIRECTORY/prepare_mongodb_aws.sh"

# revert to show test output
set -x

npm install aws4
MONGODB_UNIFIED_TOPOLOGY=1 npx mocha test/functional/mongodb_aws.test.js
21 changes: 21 additions & 0 deletions .evergreen/run-ocsp-tests.sh
@@ -0,0 +1,21 @@
#!/bin/bash
set -o xtrace
set -o errexit

UNIFIED=${UNIFIED:-}

# load node.js environment
export NVM_DIR="${PROJECT_DIRECTORY}/node-artifacts/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

# $PYTHON_BINARY -m virtualenv --never-download --no-wheel ocsptest
# . ocsptest/bin/activate
# trap "deactivate; rm -rf ocsptest" EXIT HUP
# pip install pyopenssl requests service_identity
# PYTHON=python

# NOTE: `--opts {}` is used below to revert mocha to normal behavior (without mongodb specific plugins)
MONGODB_UNIFIED_TOPOLOGY=${UNIFIED} \
OCSP_TLS_SHOULD_SUCCEED=${OCSP_TLS_SHOULD_SUCCEED} \
CA_FILE=${CA_FILE} \
npx mocha --opts '{}' test/manual/ocsp_support.test.js
39 changes: 32 additions & 7 deletions .evergreen/run-tests.sh
@@ -8,26 +8,51 @@ set -o errexit # Exit the script with error if any of the commands fail
# UNIFIED Set to enable the Unified SDAM topology for the node driver
# MONGODB_URI Set the suggested connection MONGODB_URI (including credentials and topology info)
# MARCH Machine Architecture. Defaults to lowercase uname -m
# TEST_NPM_SCRIPT Script to npm run. Defaults to "test-nolint"
# SKIP_DEPS Skip installing dependencies
# NO_EXIT Don't exit early from tests that leak resources

AUTH=${AUTH:-noauth}
SSL=${SSL:-nossl}
UNIFIED=${UNIFIED:-}
UNIFIED=${UNIFIED:-0}
MONGODB_URI=${MONGODB_URI:-}
TEST_NPM_SCRIPT=${TEST_NPM_SCRIPT:-test-nolint}
if [[ -z "${NO_EXIT}" ]]; then
TEST_NPM_SCRIPT="$TEST_NPM_SCRIPT -- --exit"
fi

# ssl setup
SSL=${SSL:-nossl}
if [ "$SSL" != "nossl" ]; then
export SSL_KEY_FILE="$DRIVERS_TOOLS/.evergreen/x509gen/client.pem"
export SSL_CA_FILE="$DRIVERS_TOOLS/.evergreen/x509gen/ca.pem"
fi

# run tests
echo "Running $AUTH tests over $SSL, connecting to $MONGODB_URI"

export PATH="/opt/mongodbtoolchain/v2/bin:$PATH"
NODE_ARTIFACTS_PATH="${PROJECT_DIRECTORY}/node-artifacts"
export NVM_DIR="${NODE_ARTIFACTS_PATH}/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

if [[ -z "${SKIP_DEPS}" ]]; then
source "${PROJECT_DIRECTORY}/.evergreen/install-dependencies.sh"
else
NODE_ARTIFACTS_PATH="${PROJECT_DIRECTORY}/node-artifacts"
export NVM_DIR="${NODE_ARTIFACTS_PATH}/nvm"
if [[ "$OS" == "Windows_NT" ]]; then
export NVM_HOME=`cygpath -m -a "$NVM_DIR"`
export NVM_SYMLINK=`cygpath -m -a "$NODE_ARTIFACTS_PATH/bin"`
export NVM_ARTIFACTS_PATH=`cygpath -m -a "$NODE_ARTIFACTS_PATH/bin"`
export PATH=`cygpath $NVM_SYMLINK`:`cygpath $NVM_HOME`:$PATH
else
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
fi
fi

# only run FLE tests on hosts we explicitly choose to test on
if [[ -z "${CLIENT_ENCRYPTION}" ]]; then
unset AWS_ACCESS_KEY_ID;
unset AWS_SECRET_ACCESS_KEY;
else
npm install mongodb-client-encryption
npm install mongodb-client-encryption@latest
fi

MONGODB_UNIFIED_TOPOLOGY=${UNIFIED} MONGODB_URI=${MONGODB_URI} npm run test-nolint
MONGODB_UNIFIED_TOPOLOGY=${UNIFIED} MONGODB_URI=${MONGODB_URI} npm run ${TEST_NPM_SCRIPT}
19 changes: 19 additions & 0 deletions .evergreen/run-tls-tests.sh
@@ -0,0 +1,19 @@
#!/bin/bash

set -o errexit # Exit the script with error if any of the commands fail

export PROJECT_DIRECTORY="$(pwd)"
NODE_ARTIFACTS_PATH="${PROJECT_DIRECTORY}/node-artifacts"
export NVM_DIR="${NODE_ARTIFACTS_PATH}/nvm"
if [[ "$OS" == "Windows_NT" ]]; then
export NVM_HOME=`cygpath -m -a "$NVM_DIR"`
export NVM_SYMLINK=`cygpath -m -a "$NODE_ARTIFACTS_PATH/bin"`
export NVM_ARTIFACTS_PATH=`cygpath -m -a "$NODE_ARTIFACTS_PATH/bin"`
export PATH=`cygpath $NVM_SYMLINK`:`cygpath $NVM_HOME`:$PATH
else
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
fi
export SSL_KEY_FILE="$DRIVERS_TOOLS/.evergreen/x509gen/client.pem"
export SSL_CA_FILE="$DRIVERS_TOOLS/.evergreen/x509gen/ca.pem"

npm run check:tls
5 changes: 5 additions & 0 deletions .gitignore
@@ -29,6 +29,9 @@ manual_tests/
docs/build
docs/Makefile

# xunit test output for CI
xunit.xml

# Directory for dbs
db

@@ -47,3 +50,5 @@ build/Release
node_modules
yarn.lock

.vscode
output
7 changes: 7 additions & 0 deletions .prettierrc.json
@@ -0,0 +1,7 @@
{
"singleQuote": true,
"tabWidth": 2,
"printWidth": 100,
"arrowParens": "avoid",
"trailingComma": "none"
}
137 changes: 0 additions & 137 deletions .travis.yml

This file was deleted.

142 changes: 140 additions & 2 deletions HISTORY.md
@@ -1,7 +1,146 @@
# Change Log
# Changelog

All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.

### [3.6.6](https://github.com/mongodb/node-mongodb-native/compare/v3.6.5...v3.6.6) (2021-04-06)


### Bug Fixes

* always close gridfs upload stream on finish ([#2758](https://github.com/mongodb/node-mongodb-native/issues/2758)) ([c976a01](https://github.com/mongodb/node-mongodb-native/commit/c976a01bf385941bee07fa7f021adf1d425109a8))
* **csfle:** ensure that monitoring connections are not encrypted ([#2749](https://github.com/mongodb/node-mongodb-native/issues/2749)) ([86bddf1](https://github.com/mongodb/node-mongodb-native/commit/86bddf1ef516d6b8c752082e33c15624753579ab))
* ensure cursor readPreference is applied to find operations ([#2751](https://github.com/mongodb/node-mongodb-native/issues/2751)) ([91ba19e](https://github.com/mongodb/node-mongodb-native/commit/91ba19efdc4713903584c6161cfdd7b91b0e61f9))
* ensure monitor has rtt pinger in when calculating rtt ([#2757](https://github.com/mongodb/node-mongodb-native/issues/2757)) ([b94519b](https://github.com/mongodb/node-mongodb-native/commit/b94519ba894b4442d3dabbac59bd12784d8b7178))
* no infinite loop on windows requiring optional deps ([f2a4ff8](https://github.com/mongodb/node-mongodb-native/commit/f2a4ff870178fbbe8de616c45891368665f29f4b))
* **NODE-2995:** Add shared metadata MongoClient ([#2760](https://github.com/mongodb/node-mongodb-native/issues/2760)) ([9256242](https://github.com/mongodb/node-mongodb-native/commit/9256242d51c037059c0af5ada9639fc0a74ad033))
* **NODE-3109:** prevent servername from being IP ([#2763](https://github.com/mongodb/node-mongodb-native/issues/2763)) ([312ffef](https://github.com/mongodb/node-mongodb-native/commit/312ffef18c66a0020f19bdc1d654987d9148d709))

### [3.6.5](https://github.com/mongodb/node-mongodb-native/compare/v3.6.4...v3.6.5) (2021-03-16)


### Bug Fixes

* MongoError circular dependency warning ([#2734](https://github.com/mongodb/node-mongodb-native/issues/2734)) ([d67ffa7](https://github.com/mongodb/node-mongodb-native/commit/d67ffa7a2e3f86734c7e9b6944aab1d765b9e75e))
* move session support check to operation layer ([#2739](https://github.com/mongodb/node-mongodb-native/issues/2739)) ([8b370a7](https://github.com/mongodb/node-mongodb-native/commit/8b370a7ad784f5759c964cdfaec62e06c896dc95))
* session support detection spec compliance ([#2732](https://github.com/mongodb/node-mongodb-native/issues/2732)) ([9baec71](https://github.com/mongodb/node-mongodb-native/commit/9baec7128f612f2d9c290c85d24e33602f911499))
* use emitWarning API for internal messages ([#2743](https://github.com/mongodb/node-mongodb-native/issues/2743)) ([8bd9777](https://github.com/mongodb/node-mongodb-native/commit/8bd9777b0aedd56b81675c3e79fae63432319982))

### [3.6.4](https://github.com/mongodb/node-mongodb-native/compare/v3.6.3...v3.6.4) (2021-02-02)


### Features

* add explain support ([#2626](https://github.com/mongodb/node-mongodb-native/issues/2626)) ([a827807](https://github.com/mongodb/node-mongodb-native/commit/a8278070992d2de4134dc0841b4027a6cc745a93))
* Deprecate top-level write concern option keys ([#2624](https://github.com/mongodb/node-mongodb-native/issues/2624)) ([0516d93](https://github.com/mongodb/node-mongodb-native/commit/0516d93f74de4b58a99e8455e59678d4b09cd4a7))


### Bug Fixes

* Allow GridFS write stream to destroy ([#2702](https://github.com/mongodb/node-mongodb-native/issues/2702)) ([b5e9d67](https://github.com/mongodb/node-mongodb-native/commit/b5e9d67d5cd9b1912a349789cf2a122e00a46d1b))
* awaitable isMaster timeout must respect connectTimeoutMS ([#2627](https://github.com/mongodb/node-mongodb-native/issues/2627)) ([b365c50](https://github.com/mongodb/node-mongodb-native/commit/b365c5061ded832e1682167edac58e8a04b05fc4))
* don't add empty query string items to connection string ([8897259](https://github.com/mongodb/node-mongodb-native/commit/889725980ec1e3b4be4a74170bea0a3e3d23cf13))
* don't reset monitor if we aren't streaming topology changes ([a10171b](https://github.com/mongodb/node-mongodb-native/commit/a10171b57d2414f6df2aa8ffe9c2d3938ad838d1))
* dont parse tls/ssl file paths in uri ([#2718](https://github.com/mongodb/node-mongodb-native/issues/2718)) ([f89e4c1](https://github.com/mongodb/node-mongodb-native/commit/f89e4c1bd59c64664e8c9aa218bcb856be325d34))
* hasAtomicOperator check respects toBSON transformation ([#2696](https://github.com/mongodb/node-mongodb-native/issues/2696)) ([60936dc](https://github.com/mongodb/node-mongodb-native/commit/60936dca74167de239d1bb51a23cc9870860bdc4))
* honor ignoreUndefined on findAndModify commands ([#2671](https://github.com/mongodb/node-mongodb-native/issues/2671)) ([a25b67c](https://github.com/mongodb/node-mongodb-native/commit/a25b67c6ac13b6347cb78c4fc56613f3daf44300))
* ignore ENOTFOUND during TXT record lookup ([2036fe7](https://github.com/mongodb/node-mongodb-native/commit/2036fe7b298b9678e29ede87c1035c748ff89fcd))
* respect readPreference and writeConcern from connection string ([#2711](https://github.com/mongodb/node-mongodb-native/issues/2711)) ([b657c8c](https://github.com/mongodb/node-mongodb-native/commit/b657c8c4f3f86018cc4824f84cb22e1527d9f9af))
* restore auto direct connection behavior ([#2719](https://github.com/mongodb/node-mongodb-native/issues/2719)) ([617d9de](https://github.com/mongodb/node-mongodb-native/commit/617d9dec5180c5f7b67bd8c944c168d4cbd27e1c))
* support empty TXT records in legacy url parser ([2fa5c5f](https://github.com/mongodb/node-mongodb-native/commit/2fa5c5f2a113920baa8e67a1c0d65432690d37fc))
* transition topology state before async calls ([#2637](https://github.com/mongodb/node-mongodb-native/issues/2637)) ([9df093c](https://github.com/mongodb/node-mongodb-native/commit/9df093c1d46e1f8616c7a979324923205ac3dcd2))
* **cursor:** don't use other operation's session for cloned cursor operation ([#2705](https://github.com/mongodb/node-mongodb-native/issues/2705)) ([8082c89](https://github.com/mongodb/node-mongodb-native/commit/8082c89f8ef3624d22f4bdd6066b6f72c44f763d))
* **find:** correctly translate timeout option into noCursorTimeout ([#2700](https://github.com/mongodb/node-mongodb-native/issues/2700)) ([e257e6b](https://github.com/mongodb/node-mongodb-native/commit/e257e6b19d810920bafc579e725e09bd0607b74b))

<a name="3.6.3"></a>
## [3.6.3](https://github.com/mongodb/node-mongodb-native/compare/v3.6.1...v3.6.3) (2020-11-06)


### Bug Fixes

* add peerDependenciesMeta to mark optional deps ([#2606](https://github.com/mongodb/node-mongodb-native/issues/2606)) ([186090e](https://github.com/mongodb/node-mongodb-native/commit/186090e))
* adds topology discovery for sharded cluster ([f8fd310](https://github.com/mongodb/node-mongodb-native/commit/f8fd310))
* allow event loop to process during wait queue processing ([#2537](https://github.com/mongodb/node-mongodb-native/issues/2537)) ([4e03dfa](https://github.com/mongodb/node-mongodb-native/commit/4e03dfa))
* Change socket timeout default to 0 ([#2572](https://github.com/mongodb/node-mongodb-native/issues/2572)) ([89b77ed](https://github.com/mongodb/node-mongodb-native/commit/89b77ed))
* connection leak if wait queue member cancelled ([cafaa1b](https://github.com/mongodb/node-mongodb-native/commit/cafaa1b))
* correctly assign username to X509 auth command ([#2587](https://github.com/mongodb/node-mongodb-native/issues/2587)) ([9110a45](https://github.com/mongodb/node-mongodb-native/commit/9110a45))
* correctly re-establishes pipe destinations ([a6e7caf](https://github.com/mongodb/node-mongodb-native/commit/a6e7caf))
* Fix test filters and revert mocha version ([#2558](https://github.com/mongodb/node-mongodb-native/issues/2558)) ([0e5c45a](https://github.com/mongodb/node-mongodb-native/commit/0e5c45a))
* move kerberos client setup from prepare to auth ([#2608](https://github.com/mongodb/node-mongodb-native/issues/2608)) ([033b6e7](https://github.com/mongodb/node-mongodb-native/commit/033b6e7))
* permit waking async interval with unreliable clock ([e0e11bb](https://github.com/mongodb/node-mongodb-native/commit/e0e11bb))
* remove geoNear deprecation ([4955a52](https://github.com/mongodb/node-mongodb-native/commit/4955a52))
* revert use of setImmediate to process.nextTick ([#2611](https://github.com/mongodb/node-mongodb-native/issues/2611)) ([c9f9d5e](https://github.com/mongodb/node-mongodb-native/commit/c9f9d5e))
* sets primary read preference for writes ([ddcd03d](https://github.com/mongodb/node-mongodb-native/commit/ddcd03d))
* use options for readPreference in client ([6acced0](https://github.com/mongodb/node-mongodb-native/commit/6acced0))
* user roles take single string & DDL readPreference tests ([967de13](https://github.com/mongodb/node-mongodb-native/commit/967de13))



<a name="3.6.2"></a>
## [3.6.2](https://github.com/mongodb/node-mongodb-native/compare/v3.6.1...v3.6.2) (2020-09-10)


### Bug Fixes

* allow event loop to process during wait queue processing ([#2537](https://github.com/mongodb/node-mongodb-native/issues/2537)) ([4e03dfa](https://github.com/mongodb/node-mongodb-native/commit/4e03dfa))



<a name="3.6.1"></a>
## [3.6.1](https://github.com/mongodb/node-mongodb-native/compare/v3.6.0...v3.6.1) (2020-09-02)


### Bug Fixes

* add host/port to cmap connection ([06a2444](https://github.com/mongodb/node-mongodb-native/commit/06a2444))
* update full list of index options ([0af3191](https://github.com/mongodb/node-mongodb-native/commit/0af3191))


### Features

* **db:** deprecate createCollection strict mode ([4cc6bcc](https://github.com/mongodb/node-mongodb-native/commit/4cc6bcc))



<a name="3.6.0-beta.0"></a>
# [3.6.0-beta.0](https://github.com/mongodb/node-mongodb-native/compare/v3.5.5...v3.6.0-beta.0) (2020-04-14)

### Bug Fixes

* always return empty array for selection on unknown topology ([af57b57](https://github.com/mongodb/node-mongodb-native/commit/af57b57))
* always return empty array for selection on unknown topology ([f9e786a](https://github.com/mongodb/node-mongodb-native/commit/f9e786a))
* correctly use template string for connection string error message ([814e278](https://github.com/mongodb/node-mongodb-native/commit/814e278))
* createCollection only uses listCollections in strict mode ([d368f12](https://github.com/mongodb/node-mongodb-native/commit/d368f12))
* don't depend on private node api for `Timeout` wrapper ([e6dc1f4](https://github.com/mongodb/node-mongodb-native/commit/e6dc1f4))
* don't throw if `withTransaction()` callback rejects with a null reason ([153646c](https://github.com/mongodb/node-mongodb-native/commit/153646c))
* **cursor:** transforms should only be applied once to documents ([704f30a](https://github.com/mongodb/node-mongodb-native/commit/704f30a))
* only consider MongoError subclasses for retryability ([265fe40](https://github.com/mongodb/node-mongodb-native/commit/265fe40))
* **ChangeStream:** whitelist change stream resumable errors ([8a9c108](https://github.com/mongodb/node-mongodb-native/commit/8a9c108)), closes [#17](https://github.com/mongodb/node-mongodb-native/issues/17) [#18](https://github.com/mongodb/node-mongodb-native/issues/18)
* **sdam:** use ObjectId comparison to track maxElectionId ([db991d6](https://github.com/mongodb/node-mongodb-native/commit/db991d6))
* only mark server session dirty if the client session is alive ([611be8d](https://github.com/mongodb/node-mongodb-native/commit/611be8d))
* pass options into `commandSupportsReadConcern` ([e855c83](https://github.com/mongodb/node-mongodb-native/commit/e855c83))
* polyfill for util.promisify ([1c4cf6c](https://github.com/mongodb/node-mongodb-native/commit/1c4cf6c))
* single `readPreferenceTags` should be parsed as an array ([a50611b](https://github.com/mongodb/node-mongodb-native/commit/a50611b))
* store name of collection for more informative error messages ([979d41e](https://github.com/mongodb/node-mongodb-native/commit/979d41e))
* support write concern provided as string in `fromOptions` ([637f428](https://github.com/mongodb/node-mongodb-native/commit/637f428))
* use properly camel cased form of `mapReduce` for command ([c1ed2c1](https://github.com/mongodb/node-mongodb-native/commit/c1ed2c1))


### Features

* add MONGODB-AWS as a supported auth mechanism ([7f3cfba](https://github.com/mongodb/node-mongodb-native/commit/7f3cfba))
* bump wire protocol version for 4.4 ([6d3f313](https://github.com/mongodb/node-mongodb-native/commit/6d3f313))
* deprecate `oplogReplay` for find commands ([24155e7](https://github.com/mongodb/node-mongodb-native/commit/24155e7))
* directConnection adds unify behavior for replica set discovery ([c5d60fc](https://github.com/mongodb/node-mongodb-native/commit/c5d60fc))
* expand use of error labels for retryable writes ([c775a4a](https://github.com/mongodb/node-mongodb-native/commit/c775a4a))
* support `allowDiskUse` for find commands ([dbc0b37](https://github.com/mongodb/node-mongodb-native/commit/dbc0b37))
* support creating collections and indexes in transactions ([17e4c88](https://github.com/mongodb/node-mongodb-native/commit/17e4c88))
* support passing a hint to findOneAndReplace/findOneAndUpdate ([faee15b](https://github.com/mongodb/node-mongodb-native/commit/faee15b))
* support shorter SCRAM conversations ([6b9ff05](https://github.com/mongodb/node-mongodb-native/commit/6b9ff05))
* use error labels for retryable writes in legacy topologies ([fefc165](https://github.com/mongodb/node-mongodb-native/commit/fefc165))




<a name="3.5.7"></a>
## [3.5.7](https://github.com/mongodb/node-mongodb-native/compare/v3.5.6...v3.5.7) (2020-04-29)

@@ -16,7 +155,6 @@ All notable changes to this project will be documented in this file. See [standa
<a name="3.5.6"></a>
## [3.5.6](https://github.com/mongodb/node-mongodb-native/compare/v3.5.5...v3.5.6) (2020-04-14)


### Bug Fixes

* always return empty array for selection on unknown topology ([f9e786a](https://github.com/mongodb/node-mongodb-native/commit/f9e786a))
132 changes: 63 additions & 69 deletions README.md
@@ -1,23 +1,24 @@
[![npm](https://nodei.co/npm/mongodb.png?downloads=true&downloadRank=true)](https://nodei.co/npm/mongodb/) [![npm](https://nodei.co/npm-dl/mongodb.png?months=6&height=3)](https://nodei.co/npm/mongodb/)
# MongoDB NodeJS Driver

[![Build Status](https://secure.travis-ci.org/mongodb/node-mongodb-native.svg?branch=2.1)](http://travis-ci.org/mongodb/node-mongodb-native)
[![Coverage Status](https://coveralls.io/repos/github/mongodb/node-mongodb-native/badge.svg?branch=2.1)](https://coveralls.io/github/mongodb/node-mongodb-native?branch=2.1)
[![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/mongodb/node-mongodb-native?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
[![npm](https://nodei.co/npm/mongodb.png?downloads=true&downloadRank=true)](https://nodei.co/npm/mongodb/)

# Description
The official [MongoDB](https://www.mongodb.com/) driver for Node.js.

The official [MongoDB](https://www.mongodb.com/) driver for Node.js. Provides a high-level API on top of [mongodb-core](https://www.npmjs.com/package/mongodb-core) that is meant for end users.
**NOTE: v3.x released with breaking API changes. You can find a list of changes [here](CHANGES_3.0.0.md).**

**NOTE: v3.x was recently released with breaking API changes. You can find a list of changes [here](CHANGES_3.0.0.md).**
## Version 4.0

## MongoDB Node.JS Driver
**Looking for the latest?** We're working on the next major version of the driver, now in beta.
Check out our [beta version 4.0 here](https://github.com/mongodb/node-mongodb-native/tree/4.0), which includes a full migration of the driver to TypeScript.

| what | where |
|---------------|------------------------------------------------|
| documentation | http://mongodb.github.io/node-mongodb-native |
| api-doc | http://mongodb.github.io/node-mongodb-native/3.1/api |
| source | https://github.com/mongodb/node-mongodb-native |
| mongodb | http://www.mongodb.org |
## Quick Links

| what | where |
| ------------- | ---------------------------------------------------- |
| documentation | http://mongodb.github.io/node-mongodb-native |
| api-doc | http://mongodb.github.io/node-mongodb-native/3.6/api |
| source | https://github.com/mongodb/node-mongodb-native |
| mongodb | http://www.mongodb.org |

### Bugs / Feature Requests

@@ -43,12 +44,12 @@ Change history can be found in [`HISTORY.md`](HISTORY.md).

For version compatibility matrices, please refer to the following links:

* [MongoDB](https://docs.mongodb.com/ecosystem/drivers/driver-compatibility-reference/#reference-compatibility-mongodb-node)
* [NodeJS](https://docs.mongodb.com/ecosystem/drivers/driver-compatibility-reference/#reference-compatibility-language-node)
- [MongoDB](https://docs.mongodb.com/ecosystem/drivers/driver-compatibility-reference/#reference-compatibility-mongodb-node)
- [NodeJS](https://docs.mongodb.com/ecosystem/drivers/driver-compatibility-reference/#reference-compatibility-language-node)

# Installation
## Installation

The recommended way to get started using the Node.js 3.0 driver is by using the `npm` (Node Package Manager) to install the dependency in your project.
The recommended way to get started using the Node.js driver is by using `npm` (Node Package Manager) to install the dependency in your project.

## MongoDB Driver

@@ -66,10 +67,10 @@ You can also use the [Yarn](https://yarnpkg.com/en) package manager.

The MongoDB driver depends on several other packages. These are:

* [mongodb-core](https://github.com/mongodb-js/mongodb-core)
* [bson](https://github.com/mongodb/js-bson)
* [kerberos](https://github.com/mongodb-js/kerberos)
* [node-gyp](https://github.com/nodejs/node-gyp)
- [bson](https://github.com/mongodb/js-bson)
- [bson-ext](https://github.com/mongodb-js/bson-ext)
- [kerberos](https://github.com/mongodb-js/kerberos)
- [mongodb-client-encryption](https://github.com/mongodb/libmongocrypt#readme)

The `kerberos` package is a C++ extension that requires a build environment to be installed on your system. You must be able to build Node.js itself in order to compile and install the `kerberos` module. Furthermore, the `kerberos` module requires the MIT Kerberos package to correctly compile on UNIX operating systems. Consult your UNIX operating system package manager for what libraries to install.

@@ -110,9 +111,9 @@ This will print out all the steps npm is performing while trying to install the

A compiler tool chain known to work for compiling `kerberos` on Windows is the following.

* Visual Studio C++ 2010 (do not use higher versions)
* Windows 7 64bit SDK
* Python 2.7 or higher
- Visual Studio C++ 2010 (do not use higher versions)
- Windows 7 64bit SDK
- Python 2.7 or higher

Open the Visual Studio command prompt. Ensure `node.exe` is in your path and install `node-gyp`.

@@ -170,7 +171,7 @@ For complete MongoDB installation instructions, see [the manual](https://docs.mo

1. Download the right MongoDB version from [MongoDB](https://www.mongodb.org/downloads)
2. Create a database directory (in this case under **/data**).
3. Install and start a ``mongod`` process.
3. Install and start a `mongod` process.

```bash
mongod --dbpath=/data
@@ -194,11 +195,11 @@ const url = 'mongodb://localhost:27017';

// Database Name
const dbName = 'myproject';

const client = new MongoClient(url);
// Use connect method to connect to the server
MongoClient.connect(url, function(err, client) {
client.connect(function(err) {
assert.equal(null, err);
console.log("Connected successfully to server");
console.log('Connected successfully to server');

const db = client.db(dbName);

@@ -224,23 +225,21 @@ const insertDocuments = function(db, callback) {
// Get the documents collection
const collection = db.collection('documents');
// Insert some documents
collection.insertMany([
{a : 1}, {a : 2}, {a : 3}
], function(err, result) {
collection.insertMany([{ a: 1 }, { a: 2 }, { a: 3 }], function(err, result) {
assert.equal(err, null);
assert.equal(3, result.result.n);
assert.equal(3, result.ops.length);
console.log("Inserted 3 documents into the collection");
console.log('Inserted 3 documents into the collection');
callback(result);
});
}
};
```

The **insert** command returns an object with the following fields:

* **result** Contains the result document from MongoDB
* **ops** Contains the documents inserted with added **_id** fields
* **connection** Contains the connection used to perform the insert
- **result** Contains the result document from MongoDB
- **ops** Contains the documents inserted with added **\_id** fields
- **connection** Contains the connection used to perform the insert

Add the following code to call the **insertDocuments** function:

@@ -257,7 +256,7 @@ const dbName = 'myproject';
// Use connect method to connect to the server
MongoClient.connect(url, function(err, client) {
assert.equal(null, err);
console.log("Connected successfully to server");
console.log('Connected successfully to server');

const db = client.db(dbName);

@@ -291,11 +290,11 @@ const findDocuments = function(db, callback) {
// Find some documents
collection.find({}).toArray(function(err, docs) {
assert.equal(err, null);
console.log("Found the following records");
console.log(docs)
console.log('Found the following records');
console.log(docs);
callback(docs);
});
}
};
```

This query returns all the documents in the **documents** collection. Add the **findDocument** method to the **MongoClient.connect** callback:
@@ -313,7 +312,7 @@ const dbName = 'myproject';
// Use connect method to connect to the server
MongoClient.connect(url, function(err, client) {
assert.equal(null, err);
console.log("Connected correctly to server");
console.log('Connected correctly to server');

const db = client.db(dbName);

@@ -334,16 +333,16 @@ const findDocuments = function(db, callback) {
// Get the documents collection
const collection = db.collection('documents');
// Find some documents
collection.find({'a': 3}).toArray(function(err, docs) {
collection.find({ a: 3 }).toArray(function(err, docs) {
assert.equal(err, null);
console.log("Found the following records");
console.log('Found the following records');
console.log(docs);
callback(docs);
});
}
};
```

Only the documents which match ``'a' : 3`` should be returned.
Only the documents which match `'a' : 3` should be returned.

### Update a document

@@ -354,14 +353,13 @@ const updateDocument = function(db, callback) {
// Get the documents collection
const collection = db.collection('documents');
// Update document where a is 2, set b equal to 1
collection.updateOne({ a : 2 }
, { $set: { b : 1 } }, function(err, result) {
collection.updateOne({ a: 2 }, { $set: { b: 1 } }, function(err, result) {
assert.equal(err, null);
assert.equal(1, result.result.n);
console.log("Updated the document with the field a equal to 2");
console.log('Updated the document with the field a equal to 2');
callback(result);
});
}
};
```

The method updates the first document where the field **a** is equal to **2** by adding a new field **b** to the document set to **1**. Next, update the callback function from **MongoClient.connect** to include the update method.
@@ -379,7 +377,7 @@ const dbName = 'myproject';
// Use connect method to connect to the server
MongoClient.connect(url, function(err, client) {
assert.equal(null, err);
console.log("Connected successfully to server");
console.log('Connected successfully to server');

const db = client.db(dbName);

@@ -400,13 +398,13 @@ const removeDocument = function(db, callback) {
// Get the documents collection
const collection = db.collection('documents');
// Delete document where a is 3
collection.deleteOne({ a : 3 }, function(err, result) {
collection.deleteOne({ a: 3 }, function(err, result) {
assert.equal(err, null);
assert.equal(1, result.result.n);
console.log("Removed the document with the field a equal to 3");
console.log('Removed the document with the field a equal to 3');
callback(result);
});
}
};
```

Add the new method to the **MongoClient.connect** callback function.
@@ -424,7 +422,7 @@ const dbName = 'myproject';
// Use connect method to connect to the server
MongoClient.connect(url, function(err, client) {
assert.equal(null, err);
console.log("Connected successfully to server");
console.log('Connected successfully to server');

const db = client.db(dbName);

@@ -446,18 +444,14 @@ performance. The following function creates an index on the **a** field in the

```js
const indexCollection = function(db, callback) {
db.collection('documents').createIndex(
{ "a": 1 },
null,
function(err, results) {
console.log(results);
callback();
}
);
db.collection('documents').createIndex({ a: 1 }, null, function(err, results) {
console.log(results);
callback();
});
};
```

Add the ``indexCollection`` method to your app:
Add the `indexCollection` method to your app:

```js
const MongoClient = require('mongodb').MongoClient;
@@ -471,7 +465,7 @@ const dbName = 'myproject';
// Use connect method to connect to the server
MongoClient.connect(url, function(err, client) {
assert.equal(null, err);
console.log("Connected successfully to server");
console.log('Connected successfully to server');

const db = client.db(dbName);

@@ -487,13 +481,13 @@ For more detailed information, see the [tutorials](docs/reference/content/tutori

## Next Steps

* [MongoDB Documentation](http://mongodb.org)
* [Read about Schemas](http://learnmongodbthehardway.com)
* [Star us on GitHub](https://github.com/mongodb/node-mongodb-native)
- [MongoDB Documentation](http://mongodb.org)
- [Read about Schemas](http://learnmongodbthehardway.com)
- [Star us on GitHub](https://github.com/mongodb/node-mongodb-native)

## License

[Apache 2.0](LICENSE.md)

© 2009-2012 Christian Amor Kvalheim
© 2009-2012 Christian Amor Kvalheim
© 2012-present MongoDB [Contributors](CONTRIBUTORS.md)
5 changes: 4 additions & 1 deletion docs/reference/config.toml
@@ -1,9 +1,12 @@
baseurl = "/node-mongodb-native/steverenaker/DOCS-8756"
baseurl = "/node-mongodb-native/3.6"
languageCode = "en-us"
title = "MongoDB Node.js Driver"
theme = "mongodb"
canonifyurls = false

[params]
referenceDocsUrl = "https://docs.mongodb.com/drivers/node"

[blackfriday]
plainIdAnchors = true

46 changes: 23 additions & 23 deletions docs/reference/content/tutorials/crud.md
@@ -18,7 +18,7 @@ This tutorial covers both the basic CRUD methods and the specialized ``findAndMo
as well as the new Bulk API methods for efficient bulk write operations.

<div class="pull-right">
<input type="checkbox" checked="" class="distroPicker" data-toggle="toggle" data-on="ES2015" data-off="ES2017" data-offstyle="success">
<input type="checkbox" checked="" class="distroPicker" data-toggle="toggle" data-on="ES2017" data-off="ES2015" data-offstyle="success">
</div>


@@ -33,7 +33,7 @@ The ``insertOne`` and ``insertMany`` methods exist on the ``Collection`` class a



<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

// Insert a single document
@@ -52,7 +52,7 @@ The ``insertOne`` and ``insertMany`` methods exist on the ``Collection`` class a
});

</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Insert a single document
@@ -90,7 +90,7 @@ The ``insertOne`` and ``insertMany`` methods also accept a second argument which
The following example shows how to serialize a passed-in function when writing to a
[replica set](https://docs.mongodb.org/manual/core/replica-set-members/).

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

// Insert a single document
@@ -108,7 +108,7 @@ The following example shows how to serialize a passed-in function when writing t
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Insert a single document
@@ -140,7 +140,7 @@ when inserting documents with the ``insertMany`` method.
The Decimal128 data type requires MongoDB server version 3.4 or higher.
{{% /note %}}

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
const Long = require('mongodb').Long;
const Decimal = require('mongodb').Decimal128;

@@ -158,7 +158,7 @@ const Decimal = require('mongodb').Decimal128;
});

</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
const Long = require('mongodb').Long;
const Decimal = require('mongodb').Decimal128;

@@ -191,7 +191,7 @@ The above operation inserts the following documents into the

The ``updateOne`` and ``updateMany`` methods exist on the ``Collection`` class and are used to update and upsert documents.

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

const col = db.collection('updates');
@@ -226,7 +226,7 @@ The ``updateOne`` and ``updateMany`` methods exist on the ``Collection`` class a
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Get the updates collection
@@ -276,7 +276,7 @@ Just as for ``insert``, the ``update`` method allows you to specify a per operat

The ``deleteOne`` and ``deleteMany`` methods exist on the ``Collection`` class and are used to remove documents from MongoDB.

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

const col = db.collection('removes');
@@ -300,7 +300,7 @@ The ``deleteOne`` and ``deleteMany`` methods exist on the ``Collection`` class a
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Get the removes collection
@@ -343,7 +343,7 @@ allow the user to update or upsert a document and have the modified or existing
methods, the operation takes a write lock for the duration of the operation in order to ensure the modification is
[atomic](https://docs.mongodb.org/manual/core/write-operations-atomicity/).

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

const col = db.collection('findAndModify');
@@ -371,7 +371,7 @@ methods, the operation takes a write lock for the duration of the operation in o
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Get the findAndModify collection
@@ -414,7 +414,7 @@ The ``findOneAndUpdate`` method also accepts a third argument which can be an op

The ``findOneAndDelete`` function is designed to help remove a document.

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

const col = db.collection('findAndModify');
@@ -435,7 +435,7 @@ The ``findOneAndDelete`` function is designed to help remove a document.
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Get the findAndModify collection
@@ -471,7 +471,7 @@ Like ``findOneAndUpdate``, it allows an object of options to be passed in which

The ``bulkWrite`` function allows a simple set of bulk operations to run in a non-fluent way, in comparison to the bulk API discussed next.

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

// Get the collection
@@ -498,7 +498,7 @@ The ``bulkWrite`` function allows a simple set of bulk operations to run in a no
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Get the collection
@@ -540,7 +540,7 @@ The ``bulkWrite`` function takes an array of operations which can be objects of

Bulk write operations make it easy to write groups of operations together to MongoDB. There are some caveats and to get the best performance you need to be running against MongoDB version 2.6 or higher, which supports the new write commands. Bulk operations are split into *ordered* and *unordered* bulk operations. An *ordered* bulk operation guarantees the order of execution of writes while the *unordered* bulk operation makes no assumptions about the order of execution. In the Node.js driver the *unordered* bulk operations will group operations according to type and write them in parallel.

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

const col = db.collection('bulkops');
@@ -566,7 +566,7 @@ Bulk write operations make it easy to write groups of operations together to Mon
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Get the collection
@@ -641,7 +641,7 @@ The main method for querying the database is the ``find`` method.

The following example materializes all the documents from a query using the ``toArray`` method, but limits the number of returned results to two documents.

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

const col = db.collection('find');
@@ -659,7 +659,7 @@ The following example materializes all the documents from a query using the ``to
});
});
</code></pre></section>
<section class="javascript6 hidden"><pre><code class="hljs">
<section class="javascript6"><pre><code class="hljs">
{{% js6-connect %}}

// Get the collection
@@ -715,7 +715,7 @@ More information can be found in the [Cursor API documentation](/node-mongodb-na

The following example uses the ``next`` method.

<section class="javascript5"><pre><code class="hljs">
<section class="javascript5 hidden"><pre><code class="hljs">
{{% myproject-connect %}}

const col = db.collection('find');
@@ -733,7 +733,7 @@ The following example uses the ``next`` method.
});
});
</code></pre></section>
<section class="javascript6 hidden">
<section class="javascript6">
In ECMAScript 6, the new `generator` functions allow for what is arguably a
much cleaner and easier way to read iteration code.
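
A hedged sketch of that style using the `co` library with a generator, in line with the ES6 variants of these examples; the collection and query are illustrative:

```js
const co = require('co');

co(function*() {
  const col = db.collection('find');
  const cursor = col.find({}).limit(2);

  // `next` resolves to null once the cursor is exhausted
  let doc;
  while ((doc = yield cursor.next()) != null) {
    console.log(doc);
  }
}).catch(err => console.error(err));
```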

2 changes: 2 additions & 0 deletions docs/reference/content/tutorials/text-search.md
@@ -10,6 +10,8 @@ title = "Text Search"

# Text Search

> [Atlas Search](https://docs.atlas.mongodb.com/atlas-search) makes it easy to build fast, relevance-based search capabilities on top of your MongoDB data. Try it today on [MongoDB Atlas](https://www.mongodb.com/cloud/atlas), our fully managed database as a service.
Use the [$text](https://docs.mongodb.org/manual/reference/operator/query/text/)
operator to perform text searches on fields which have a
[text index](https://docs.mongodb.org/manual/core/index-text/).
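
A minimal sketch of that workflow, assuming a connected ``db``; the collection, indexed fields, and search term are illustrative:

```js
const col = db.collection('restaurants');

// Create the text index, then query it with $text
col
  .createIndex({ name: 'text', cuisine: 'text' })
  .then(() => col.find({ $text: { $search: 'pizza' } }).toArray())
  .then(docs => console.log(docs.length))
  .catch(err => console.error(err));
```
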
2 changes: 1 addition & 1 deletion docs/reference/data/mongodb.toml
@@ -1,6 +1,6 @@
# Update versions in config.toml as well
githubRepo = "node-mongodb-native"
githubBranch = "master"
currentVersion = "3.5"
currentVersion = "3.6"
highlightTheme = "idea.css"
apiUrl = "/api"
2 changes: 2 additions & 0 deletions docs/reference/layouts/partials/header.html
@@ -25,3 +25,5 @@
<div class="documentwrapper">
<div class="bodywrapper">
<div class="body">

{{ partial "new_version.html" . }}
4 changes: 4 additions & 0 deletions docs/reference/layouts/partials/new_version.html
@@ -0,0 +1,4 @@
<div class="alert alert-info" role="alert">
Note: You are currently viewing version 3.6 of the Node.js driver documentation.
<a href="{{.Site.Params.ReferenceDocsUrl}}">Click here</a> for the latest version.
</div>
8 changes: 4 additions & 4 deletions docs/reference/static/js/toggle-switch.js
@@ -2,11 +2,11 @@ $(document).ready(function(){
$('.distroPicker').bootstrapToggle();
$('.distroPicker').change(function () {
if ($('.distroPicker').prop('checked')) {
$('.javascript6').addClass('hidden');
$('.javascript5').removeClass('hidden');
} else {
$('.javascript5').addClass('hidden');
$('.javascript6').removeClass('hidden');
} else {
$('.javascript6').addClass('hidden');
$('.javascript5').removeClass('hidden');
}
});
});
});
26 changes: 26 additions & 0 deletions etc/update-spec-tests.sh
@@ -0,0 +1,26 @@
#!/usr/bin/env bash

# This script is used to fetch the latest tests for the specified spec.
# It puts the tests in the directory $spec_root. It should be run from the root of the repository.

set -o errexit
set -o nounset

if [ ! -d ".git" ]; then
echo "$0: This script must be run from the root of the repository" >&2
exit 1
fi

if [ $# -ne 1 ]; then
echo "$0: This script must be passed exactly one argument for which tests to sync" >&2
exit 1
fi

spec_root="test/spec"

tmpdir=$(mktemp -d -t spec_testsXXXX)
curl -sL "https://github.com/mongodb/specifications/archive/master.zip" -o "$tmpdir/specs.zip"
unzip -d "$tmpdir" "$tmpdir/specs.zip" > /dev/null
mkdir -p "$spec_root/$1"
rsync -ah --exclude '*.rst' "$tmpdir/specifications-master/source/$1/tests/" "$spec_root/$1" --delete
rm -rf "$tmpdir"
19 changes: 11 additions & 8 deletions lib/admin.js
@@ -166,10 +166,11 @@ Admin.prototype.ping = function(options, callback) {
* @param {string} username The username.
* @param {string} password The password.
* @param {object} [options] Optional settings.
* @param {(number|string)} [options.w] The write concern.
* @param {number} [options.wtimeout] The write concern timeout.
* @param {boolean} [options.j=false] Specify a journal write concern.
* @param {boolean} [options.fsync=false] Specify a file sync write concern.
* @param {(number|string)} [options.w] **Deprecated** The write concern. Use writeConcern instead.
* @param {number} [options.wtimeout] **Deprecated** The write concern timeout. Use writeConcern instead.
* @param {boolean} [options.j=false] **Deprecated** Specify a journal write concern. Use writeConcern instead.
* @param {boolean} [options.fsync=false] **Deprecated** Specify a file sync write concern. Use writeConcern instead.
* @param {object|WriteConcern} [options.writeConcern] Specify write concern settings.
* @param {object} [options.customData] Custom data associated with the user (only Mongodb 2.6 or higher)
* @param {object[]} [options.roles] Roles associated with the created user (only Mongodb 2.6 or higher)
* @param {ClientSession} [options.session] optional session to use for this operation
@@ -203,10 +204,11 @@ Admin.prototype.addUser = function(username, password, options, callback) {
* @method
* @param {string} username The username.
* @param {object} [options] Optional settings.
* @param {(number|string)} [options.w] The write concern.
* @param {number} [options.wtimeout] The write concern timeout.
* @param {boolean} [options.j=false] Specify a journal write concern.
* @param {boolean} [options.fsync=false] Specify a file sync write concern.
* @param {(number|string)} [options.w] **Deprecated** The write concern. Use writeConcern instead.
* @param {number} [options.wtimeout] **Deprecated** The write concern timeout. Use writeConcern instead.
* @param {boolean} [options.j=false] **Deprecated** Specify a journal write concern. Use writeConcern instead.
* @param {boolean} [options.fsync=false] **Deprecated** Specify a file sync write concern. Use writeConcern instead.
* @param {object|WriteConcern} [options.writeConcern] Specify write concern settings.
* @param {ClientSession} [options.session] optional session to use for this operation
* @param {Admin~resultCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
@@ -232,6 +234,7 @@ Admin.prototype.removeUser = function(username, options, callback) {
*
* @param {string} collectionName The name of the collection to validate.
* @param {object} [options] Optional settings.
* @param {boolean} [options.background] Validates a collection in the background, without interrupting read or write traffic (only in MongoDB 4.4+)
* @param {ClientSession} [options.session] optional session to use for this operation
* @param {Admin~resultCallback} [callback] The command result callback.
* @return {Promise} returns Promise if no callback passed
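
A hedged sketch of passing the newer `writeConcern` option in place of the deprecated `w`/`wtimeout`/`j`/`fsync` flags documented above; the client handle, user, roles, and concern values are illustrative:

```js
const adminDb = client.db('admin').admin();

adminDb.addUser(
  'reportingUser',
  'superSecret',
  {
    writeConcern: { w: 'majority', wtimeout: 5000 },
    roles: [{ role: 'read', db: 'reporting' }]
  },
  err => {
    if (err) throw err;
    // later: adminDb.removeUser('reportingUser', { writeConcern: { w: 'majority' } }, cb)
  }
);
```
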
15 changes: 7 additions & 8 deletions lib/aggregation_cursor.js
@@ -3,7 +3,6 @@
const MongoError = require('./core').MongoError;
const Cursor = require('./cursor');
const CursorState = require('./core/cursor').CursorState;
const deprecate = require('util').deprecate;

/**
* @fileOverview The **AggregationCursor** class is an internal class that embodies an aggregation cursor on MongoDB
@@ -203,7 +202,7 @@ class AggregationCursor extends Cursor {
/**
* Add a unwind stage to the aggregation pipeline
* @method
* @param {number} field The unwind field name.
* @param {(string|object)} field The unwind field name or stage document.
* @return {AggregationCursor}
*/
unwind(field) {
@@ -225,12 +224,6 @@ class AggregationCursor extends Cursor {
// aliases
AggregationCursor.prototype.get = AggregationCursor.prototype.toArray;

// deprecated methods
deprecate(
AggregationCursor.prototype.geoNear,
'The `$geoNear` stage is deprecated in MongoDB 4.0, and removed in version 4.2.'
);

/**
* AggregationCursor stream data event, fired for each document in the cursor.
*
@@ -329,7 +322,13 @@ deprecate(

/**
* Execute the explain for the cursor
*
* For backwards compatibility, a verbosity of true is interpreted as "allPlansExecution"
* and false as "queryPlanner". Prior to server version 3.6, aggregate()
* ignores the verbosity parameter and executes in "queryPlanner".
*
* @method AggregationCursor.prototype.explain
* @param {'queryPlanner'|'queryPlannerExtended'|'executionStats'|'allPlansExecution'|boolean} [verbosity=true] - An optional mode in which to run the explain.
* @param {AggregationCursor~resultCallback} [callback] The result callback.
* @return {Promise} returns Promise if no callback passed
*/
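
A hedged sketch of the verbosity handling described in this doc block; the collection and pipeline are illustrative:

```js
db.collection('orders')
  .aggregate([{ $match: { status: 'A' } }])
  .explain('executionStats')
  .then(explanation => console.log(explanation))
  .catch(err => console.error(err));

// Passing `true` instead is interpreted as 'allPlansExecution', and `false` as
// 'queryPlanner'; servers older than 3.6 ignore the verbosity entirely.
```
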
76 changes: 57 additions & 19 deletions lib/bulk/common.js
@@ -11,6 +11,8 @@ const applyRetryableWrites = require('../utils').applyRetryableWrites;
const applyWriteConcern = require('../utils').applyWriteConcern;
const executeLegacyOperation = require('../utils').executeLegacyOperation;
const isPromiseLike = require('../utils').isPromiseLike;
const hasAtomicOperators = require('../utils').hasAtomicOperators;
const maxWireVersion = require('../core/utils').maxWireVersion;

// Error codes
const WRITE_CONCERN_ERROR = 64;
@@ -641,6 +643,10 @@ class FindOperators {
document.hint = updateDocument.hint;
}

if (!hasAtomicOperators(updateDocument)) {
throw new TypeError('Update document requires atomic operators');
}

// Clear out current Op
this.s.currentOp = null;
return this.s.options.addToOperationsList(this, UPDATE, document);
@@ -650,12 +656,33 @@ class FindOperators {
* Add a replace one operation to the bulk operation
*
* @method
* @param {object} updateDocument the new document to replace the existing one with
* @param {object} replacement the new document to replace the existing one with
* @throws {MongoError} If operation cannot be added to bulk write
* @return {OrderedBulkOperation|UnorderedBulkOperation} A reference to the parent BulkOperation
*/
replaceOne(updateDocument) {
this.updateOne(updateDocument);
replaceOne(replacement) {
// Perform upsert
const upsert = typeof this.s.currentOp.upsert === 'boolean' ? this.s.currentOp.upsert : false;

// Establish the update command
const document = {
q: this.s.currentOp.selector,
u: replacement,
multi: false,
upsert: upsert
};

if (replacement.hint) {
document.hint = replacement.hint;
}

if (hasAtomicOperators(replacement)) {
throw new TypeError('Replacement document must not use atomic operators');
}

// Clear out current Op
this.s.currentOp = null;
return this.s.options.addToOperationsList(this, UPDATE, document);
}

/**
@@ -943,6 +970,12 @@ class BulkOperationBase {

// Crud spec update format
if (op.updateOne || op.updateMany || op.replaceOne) {
if (op.replaceOne && hasAtomicOperators(op[key].replacement)) {
throw new TypeError('Replacement document must not use atomic operators');
} else if ((op.updateOne || op.updateMany) && !hasAtomicOperators(op[key].update)) {
throw new TypeError('Update document requires atomic operators');
}

const multi = op.updateOne || op.replaceOne ? false : true;
const operation = {
q: op[key].filter,
@@ -960,7 +993,15 @@ class BulkOperationBase {
} else {
if (op[key].upsert) operation.upsert = true;
}
if (op[key].arrayFilters) operation.arrayFilters = op[key].arrayFilters;
if (op[key].arrayFilters) {
// TODO: this check should be done at command construction against a connection, not a topology
if (maxWireVersion(this.s.topology) < 6) {
throw new TypeError('arrayFilters are only supported on MongoDB 3.6+');
}

operation.arrayFilters = op[key].arrayFilters;
}

return this.s.options.addToOperationsList(this, UPDATE, operation);
}

@@ -979,6 +1020,9 @@ class BulkOperationBase {
if (op.deleteOne || op.deleteMany) {
const limit = op.deleteOne ? 1 : 0;
const operation = { q: op[key].filter, limit: limit };
if (op[key].hint) {
operation.hint = op[key].hint;
}
if (this.isOrdered) {
if (op.collation) operation.collation = op.collation;
}
@@ -1081,10 +1125,11 @@ class BulkOperationBase {
* @method
* @param {WriteConcern} [_writeConcern] Optional write concern. Can also be specified through options.
* @param {object} [options] Optional settings.
* @param {(number|string)} [options.w] The write concern.
* @param {number} [options.wtimeout] The write concern timeout.
* @param {boolean} [options.j=false] Specify a journal write concern.
* @param {boolean} [options.fsync=false] Specify a file sync write concern.
* @param {(number|string)} [options.w] **Deprecated** The write concern. Use writeConcern instead.
* @param {number} [options.wtimeout] **Deprecated** The write concern timeout. Use writeConcern instead.
* @param {boolean} [options.j=false] **Deprecated** Specify a journal write concern. Use writeConcern instead.
* @param {boolean} [options.fsync=false] **Deprecated** Specify a file sync write concern. Use writeConcern instead.
* @param {object|WriteConcern} [options.writeConcern] Specify write concern settings.
* @param {BulkOperationBase~resultCallback} [callback] A callback that will be invoked when bulkWrite finishes/errors
* @throws {MongoError} Throws error if the bulk object has already been executed
* @throws {MongoError} Throws error if the bulk object does not have any operations
@@ -1200,19 +1245,10 @@ class BulkOperationBase {
* @ignore
* @param {function} callback
* @param {BulkWriteResult} writeResult
* @param {class} self either OrderedBulkOperation or UnorderdBulkOperation
* @param {class} self either OrderedBulkOperation or UnorderedBulkOperation
*/
handleWriteError(callback, writeResult) {
if (this.s.bulkResult.writeErrors.length > 0) {
if (this.s.bulkResult.writeErrors.length === 1) {
handleCallback(
callback,
new BulkWriteError(toError(this.s.bulkResult.writeErrors[0]), writeResult),
null
);
return true;
}

const msg = this.s.bulkResult.writeErrors[0].errmsg
? this.s.bulkResult.writeErrors[0].errmsg
: 'write operation failed';
@@ -1230,7 +1266,9 @@ class BulkOperationBase {
null
);
return true;
} else if (writeResult.getWriteConcernError()) {
}

if (writeResult.getWriteConcernError()) {
handleCallback(
callback,
new BulkWriteError(toError(writeResult.getWriteConcernError()), writeResult),
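
A hedged sketch of how the atomic-operator validation added above surfaces to callers of `bulkWrite`; `col` is an assumed connected collection and the documents are illustrative:

```js
// Valid: update documents must use atomic operators, replacement documents must not
col.bulkWrite([
  { updateOne: { filter: { _id: 1 }, update: { $set: { status: 'active' } } } },
  { replaceOne: { filter: { _id: 2 }, replacement: { status: 'archived' } } }
]);

// Both of these now throw a TypeError before anything reaches the server:
//   { updateOne: { filter: { _id: 1 }, update: { status: 'active' } } }
//   { replaceOne: { filter: { _id: 2 }, replacement: { $set: { status: 'archived' } } } }
```
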
8 changes: 8 additions & 0 deletions lib/bulk/unordered.js
@@ -108,6 +108,14 @@ class UnorderedBulkOperation extends BulkOperationBase {

super(topology, collection, options, false);
}

handleWriteError(callback, writeResult) {
if (this.s.batches.length) {
return false;
}

return super.handleWriteError(callback, writeResult);
}
}

/**
292 changes: 166 additions & 126 deletions lib/change_stream.js

Large diffs are not rendered by default.

29 changes: 18 additions & 11 deletions lib/cmap/connection.js
@@ -4,13 +4,16 @@ const EventEmitter = require('events');
const MessageStream = require('./message_stream');
const MongoError = require('../core/error').MongoError;
const MongoNetworkError = require('../core/error').MongoNetworkError;
const MongoNetworkTimeoutError = require('../core/error').MongoNetworkTimeoutError;
const MongoWriteConcernError = require('../core/error').MongoWriteConcernError;
const CommandResult = require('../core/connection/command_result');
const StreamDescription = require('./stream_description').StreamDescription;
const wp = require('../core/wireprotocol');
const apm = require('../core/connection/apm');
const updateSessionFromResponse = require('../core/sessions').updateSessionFromResponse;
const uuidV4 = require('../core/utils').uuidV4;
const now = require('../utils').now;
const calculateDurationInMs = require('../utils').calculateDurationInMs;

const kStream = Symbol('stream');
const kQueue = Symbol('queue');
@@ -29,15 +32,17 @@ class Connection extends EventEmitter {
this.id = options.id;
this.address = streamIdentifier(stream);
this.bson = options.bson;
this.socketTimeout = typeof options.socketTimeout === 'number' ? options.socketTimeout : 360000;
this.socketTimeout = typeof options.socketTimeout === 'number' ? options.socketTimeout : 0;
this.host = options.host || 'localhost';
this.port = options.port || 27017;
this.monitorCommands =
typeof options.monitorCommands === 'boolean' ? options.monitorCommands : false;
this.closed = false;
this.destroyed = false;

this[kDescription] = new StreamDescription(this.address, options);
this[kGeneration] = options.generation;
this[kLastUseTime] = Date.now();
this[kLastUseTime] = now();

// retain a reference to an `AutoEncrypter` if present
if (options.autoEncrypter) {
@@ -75,10 +80,14 @@ class Connection extends EventEmitter {
stream.destroy();
this.closed = true;
this[kQueue].forEach(op =>
op.cb(new MongoNetworkError(`connection ${this.id} to ${this.address} timed out`))
op.cb(
new MongoNetworkTimeoutError(`connection ${this.id} to ${this.address} timed out`, {
beforeHandshake: this[kIsMaster] == null
})
)
);
this[kQueue].clear();

this[kQueue].clear();
this.emit('close');
});

@@ -108,7 +117,7 @@ class Connection extends EventEmitter {
}

get idleTime() {
return Date.now() - this[kLastUseTime];
return calculateDurationInMs(this[kLastUseTime]);
}

get clusterTime() {
@@ -120,7 +129,7 @@ class Connection extends EventEmitter {
}

markAvailable() {
this[kLastUseTime] = Date.now();
this[kLastUseTime] = now();
}

destroy(options, callback) {
@@ -216,6 +225,7 @@ function messageHandler(conn) {
}

const operationDescription = conn[kQueue].get(message.responseTo);
const callback = operationDescription.cb;

// SERVER-45775: For exhaust responses we should be able to use the same requestId to
// track response, however the server currently synthetically produces remote requests
@@ -224,10 +234,7 @@ function messageHandler(conn) {
if (message.moreToCome) {
// requeue the callback for next synthetic request
conn[kQueue].set(message.requestId, operationDescription);
}

const callback = operationDescription.cb;
if (operationDescription.socketTimeoutOverride) {
} else if (operationDescription.socketTimeoutOverride) {
conn[kStream].setTimeout(conn.socketTimeout);
}

@@ -326,7 +333,7 @@ function write(command, options, callback) {
if (this.monitorCommands) {
this.emit('commandStarted', new apm.CommandStartedEvent(this, command));

operationDescription.started = process.hrtime();
operationDescription.started = now();
operationDescription.cb = (err, reply) => {
if (err) {
this.emit(
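
The switch from `Date.now()`/`process.hrtime()` to `now()` and `calculateDurationInMs` keeps idle-time and duration tracking on a single monotonic clock, so wall-clock jumps cannot skew them. A hedged sketch of what such helpers typically look like (the driver's real versions live in `lib/utils.js`; this is an assumption for illustration):

```js
// Monotonic "now" based on process.hrtime, in milliseconds
function now() {
  const hr = process.hrtime();
  return Math.floor(hr[0] * 1000 + hr[1] / 1e6);
}

// Elapsed milliseconds since a value previously returned by now()
function calculateDurationInMs(started) {
  return now() - started;
}
```
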
34 changes: 16 additions & 18 deletions lib/cmap/connection_pool.js
@@ -95,7 +95,7 @@ const VALID_POOL_OPTIONS = new Set([

function resolveOptions(options, defaults) {
const newOptions = Array.from(VALID_POOL_OPTIONS).reduce((obj, key) => {
if (options.hasOwnProperty(key)) {
if (Object.prototype.hasOwnProperty.call(options, key)) {
obj[key] = options[key];
}

@@ -198,6 +198,10 @@ class ConnectionPool extends EventEmitter {
return this[kConnections].length;
}

get waitQueueSize() {
return this[kWaitQueue].length;
}

/**
* Check a connection out of this pool. The connection will continue to be tracked, but no reference to it
* will be held by the pool. This means that if a connection is checked out it MUST be checked back in or
@@ -214,7 +218,6 @@ class ConnectionPool extends EventEmitter {
return;
}

// add this request to the wait queue
const waitQueueMember = { callback };

const pool = this;
@@ -229,11 +232,8 @@ class ConnectionPool extends EventEmitter {
}, waitQueueTimeoutMS);
}

// place the member at the end of the wait queue
this[kWaitQueue].push(waitQueueMember);

// process the wait queue
processWaitQueue(this);
process.nextTick(() => processWaitQueue(this));
}

/**
@@ -246,10 +246,8 @@ class ConnectionPool extends EventEmitter {
const stale = connectionIsStale(this, connection);
const willDestroy = !!(poolClosed || stale || connection.closed);

// Properly adjust state of connection
if (!willDestroy) {
connection.markAvailable();

this[kConnections].push(connection);
}

@@ -260,7 +258,7 @@ class ConnectionPool extends EventEmitter {
destroyConnection(this, connection, reason);
}

processWaitQueue(this);
process.nextTick(() => processWaitQueue(this));
}

/**
@@ -295,7 +293,7 @@ class ConnectionPool extends EventEmitter {
this[kCancellationToken].emit('cancel');

// drain the wait queue
while (this[kWaitQueue].length) {
while (this.waitQueueSize) {
const waitQueueMember = this[kWaitQueue].pop();
clearTimeout(waitQueueMember.timer);
if (!waitQueueMember[kCancelled]) {
@@ -430,7 +428,7 @@ function createConnection(pool, callback) {

// otherwise add it to the pool for later acquisition, and try to process the wait queue
pool[kConnections].push(connection);
processWaitQueue(pool);
process.nextTick(() => processWaitQueue(pool));
});
}

@@ -449,13 +447,17 @@ function processWaitQueue(pool) {
return;
}

while (pool[kWaitQueue].length && pool.availableConnectionCount) {
while (pool.waitQueueSize) {
const waitQueueMember = pool[kWaitQueue].peekFront();
if (waitQueueMember[kCancelled]) {
pool[kWaitQueue].shift();
continue;
}

if (!pool.availableConnectionCount) {
break;
}

const connection = pool[kConnections].shift();
const isStale = connectionIsStale(pool, connection);
const isIdle = connectionIsIdle(pool, connection);
@@ -472,21 +474,17 @@ function processWaitQueue(pool) {
}

const maxPoolSize = pool.options.maxPoolSize;
if (pool[kWaitQueue].length && (maxPoolSize <= 0 || pool.totalConnectionCount < maxPoolSize)) {
if (pool.waitQueueSize && (maxPoolSize <= 0 || pool.totalConnectionCount < maxPoolSize)) {
createConnection(pool, (err, connection) => {
const waitQueueMember = pool[kWaitQueue].shift();
if (waitQueueMember == null) {
if (waitQueueMember == null || waitQueueMember[kCancelled]) {
if (err == null) {
pool[kConnections].push(connection);
}

return;
}

if (waitQueueMember[kCancelled]) {
return;
}

if (err) {
pool.emit('connectionCheckOutFailed', new ConnectionCheckOutFailedEvent(pool, err));
} else {
316 changes: 170 additions & 146 deletions lib/collection.js

Large diffs are not rendered by default.

161 changes: 29 additions & 132 deletions lib/core/auth/auth_provider.js
@@ -1,158 +1,55 @@
'use strict';

const MongoError = require('../error').MongoError;

/**
* Creates a new AuthProvider, which dictates how to authenticate for a given
* mechanism.
* @class
* Context used during authentication
*
* @property {Connection} connection The connection to authenticate
* @property {MongoCredentials} credentials The credentials to use for authentication
* @property {object} options The options passed to the `connect` method
* @property {object?} response The response of the initial handshake
* @property {Buffer?} nonce A random nonce generated for use in an authentication conversation
*/
class AuthContext {
constructor(connection, credentials, options) {
this.connection = connection;
this.credentials = credentials;
this.options = options;
}
}

class AuthProvider {
constructor(bson) {
this.bson = bson;
this.authStore = [];
}

/**
* Authenticate
* @method
* @param {SendAuthCommand} sendAuthCommand Writes an auth command directly to a specific connection
* @param {Connection[]} connections Connections to authenticate using this authenticator
* @param {MongoCredentials} credentials Authentication credentials
* @param {authResultCallback} callback The callback to return the result from the authentication
*/
auth(sendAuthCommand, connections, credentials, callback) {
// Total connections
let count = connections.length;

if (count === 0) {
callback(null, null);
return;
}

// Valid connections
let numberOfValidConnections = 0;
let errorObject = null;

const execute = connection => {
this._authenticateSingleConnection(sendAuthCommand, connection, credentials, (err, r) => {
// Adjust count
count = count - 1;

// If we have an error
if (err) {
errorObject = new MongoError(err);
} else if (r && (r.$err || r.errmsg)) {
errorObject = new MongoError(r);
} else {
numberOfValidConnections = numberOfValidConnections + 1;
}

// Still authenticating against other connections.
if (count !== 0) {
return;
}

// We have authenticated all connections
if (numberOfValidConnections > 0) {
// Store the auth details
this.addCredentials(credentials);
// Return correct authentication
callback(null, true);
} else {
if (errorObject == null) {
errorObject = new MongoError(`failed to authenticate using ${credentials.mechanism}`);
}
callback(errorObject, false);
}
});
};

const executeInNextTick = _connection => process.nextTick(() => execute(_connection));

// For each connection we need to authenticate
while (connections.length > 0) {
executeInNextTick(connections.shift());
}
}

/**
* Implementation of a single connection authenticating. Is meant to be overridden.
* Will error if called directly
* @ignore
* Prepare the handshake document before the initial handshake.
*
* @param {object} handshakeDoc The document used for the initial handshake on a connection
* @param {AuthContext} authContext Context for authentication flow
* @param {function} callback
*/
_authenticateSingleConnection(/*sendAuthCommand, connection, credentials, callback*/) {
throw new Error('_authenticateSingleConnection must be overridden');
prepare(handshakeDoc, context, callback) {
callback(undefined, handshakeDoc);
}

/**
* Adds credentials to store only if it does not exist
* @param {MongoCredentials} credentials credentials to add to store
*/
addCredentials(credentials) {
const found = this.authStore.some(cred => cred.equals(credentials));

if (!found) {
this.authStore.push(credentials);
}
}

/**
* Re authenticate pool
* @method
* @param {SendAuthCommand} sendAuthCommand Writes an auth command directly to a specific connection
* @param {Connection[]} connections Connections to authenticate using this authenticator
* Authenticate
*
* @param {AuthContext} context A shared context for authentication flow
* @param {authResultCallback} callback The callback to return the result from the authentication
*/
reauthenticate(sendAuthCommand, connections, callback) {
const authStore = this.authStore.slice(0);
let count = authStore.length;
if (count === 0) {
return callback(null, null);
}

for (let i = 0; i < authStore.length; i++) {
this.auth(sendAuthCommand, connections, authStore[i], function(err) {
count = count - 1;
if (count === 0) {
callback(err, null);
}
});
}
}

/**
* Remove credentials that have been previously stored in the auth provider
* @method
* @param {string} source Name of database we are removing authStore details about
* @return {object}
*/
logout(source) {
this.authStore = this.authStore.filter(credentials => credentials.source !== source);
auth(context, callback) {
callback(new TypeError('`auth` method must be overridden by subclass'));
}
}

/**
* A function that writes authentication commands to a specific connection
* @callback SendAuthCommand
* @param {Connection} connection The connection to write to
* @param {Command} command A command with a toBin method that can be written to a connection
* @param {AuthWriteCallback} callback Callback called when command response is received
*/

/**
* A callback for a specific auth command
* @callback AuthWriteCallback
* @param {Error} err If command failed, an error from the server
* @param {object} r The response from the server
*/

/**
* This is a result from an authentication strategy
* This is a result from an authentication provider
*
* @callback authResultCallback
* @param {error} error An error object. Set to null if no error present
* @param {boolean} result The result of the authentication process
*/

module.exports = { AuthProvider };
module.exports = { AuthContext, AuthProvider };
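
A hedged sketch of a provider written against the new `AuthContext`/`AuthProvider` surface above; the mechanism, command, and require path are illustrative, not a real driver mechanism:

```js
const AuthProvider = require('./auth_provider').AuthProvider;

class NoopAuthProvider extends AuthProvider {
  prepare(handshakeDoc, authContext, callback) {
    // optionally decorate the initial handshake (e.g. speculative authentication)
    callback(undefined, handshakeDoc);
  }

  auth(authContext, callback) {
    const connection = authContext.connection;
    const credentials = authContext.credentials;

    // a single command exchange on the connection being authenticated
    connection.command(`${credentials.source}.$cmd`, { ping: 1 }, err => callback(err));
  }
}

module.exports = NoopAuthProvider;
```
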
4 changes: 2 additions & 2 deletions lib/core/auth/defaultAuthProviders.js
@@ -4,9 +4,9 @@ const MongoCR = require('./mongocr');
const X509 = require('./x509');
const Plain = require('./plain');
const GSSAPI = require('./gssapi');
const SSPI = require('./sspi');
const ScramSHA1 = require('./scram').ScramSHA1;
const ScramSHA256 = require('./scram').ScramSHA256;
const MongoDBAWS = require('./mongodb_aws');

/**
* Returns the default authentication providers.
@@ -16,11 +16,11 @@ const ScramSHA256 = require('./scram').ScramSHA256;
*/
function defaultAuthProviders(bson) {
return {
'mongodb-aws': new MongoDBAWS(bson),
mongocr: new MongoCR(bson),
x509: new X509(bson),
plain: new Plain(bson),
gssapi: new GSSAPI(bson),
sspi: new SSPI(bson),
'scram-sha-1': new ScramSHA1(bson),
'scram-sha-256': new ScramSHA256(bson)
};
344 changes: 127 additions & 217 deletions lib/core/auth/gssapi.js
@@ -1,241 +1,151 @@
'use strict';
const dns = require('dns');

const AuthProvider = require('./auth_provider').AuthProvider;
const retrieveKerberos = require('../utils').retrieveKerberos;
const MongoError = require('../error').MongoError;

let kerberos;

/**
* Creates a new GSSAPI authentication mechanism
* @class
* @extends AuthProvider
*/
class GSSAPI extends AuthProvider {
/**
* Implementation of authentication for a single connection
* @override
*/
_authenticateSingleConnection(sendAuthCommand, connection, credentials, callback) {
const source = credentials.source;
auth(authContext, callback) {
const connection = authContext.connection;
const credentials = authContext.credentials;
if (credentials == null) return callback(new MongoError('credentials required'));
const username = credentials.username;
const password = credentials.password;
const mechanismProperties = credentials.mechanismProperties;
const gssapiServiceName =
mechanismProperties['gssapiservicename'] ||
mechanismProperties['gssapiServiceName'] ||
'mongodb';
function externalCommand(command, cb) {
return connection.command('$external.$cmd', command, cb);
}
makeKerberosClient(authContext, (err, client) => {
if (err) return callback(err);
if (client == null) return callback(new MongoError('gssapi client missing'));
client.step('', (err, payload) => {
if (err) return callback(err);
externalCommand(saslStart(payload), (err, response) => {
if (err) return callback(err);
const result = response.result;
negotiate(client, 10, result.payload, (err, payload) => {
if (err) return callback(err);
externalCommand(saslContinue(payload, result.conversationId), (err, response) => {
if (err) return callback(err);
const result = response.result;
finalize(client, username, result.payload, (err, payload) => {
if (err) return callback(err);
externalCommand(
{
saslContinue: 1,
conversationId: result.conversationId,
payload
},
(err, result) => {
if (err) return callback(err);
callback(undefined, result);
}
);
});
});
});
});
});
});
}
}
module.exports = GSSAPI;

GSSAPIInitialize(
this,
kerberos.processes.MongoAuthProcess,
source,
username,
password,
source,
gssapiServiceName,
sendAuthCommand,
connection,
mechanismProperties,
callback
function makeKerberosClient(authContext, callback) {
const host = authContext.options.host;
const port = authContext.options.port;
const credentials = authContext.credentials;
if (!host || !port || !credentials) {
return callback(
new MongoError(
`Connection must specify: ${host ? '' : 'host'}, ${port ? '' : 'port'}, ${
credentials ? '' : 'credentials'
}.`
)
);
}

/**
* Authenticate
* @override
* @method
*/
auth(sendAuthCommand, connections, credentials, callback) {
if (kerberos == null) {
try {
kerberos = retrieveKerberos();
} catch (e) {
return callback(e, null);
}
if (kerberos == null) {
try {
kerberos = retrieveKerberos();
} catch (e) {
return callback(e);
}

super.auth(sendAuthCommand, connections, credentials, callback);
}
}

//
// Initialize step
var GSSAPIInitialize = function(
self,
MongoAuthProcess,
db,
username,
password,
authdb,
gssapiServiceName,
sendAuthCommand,
connection,
options,
callback
) {
// Create authenticator
var mongo_auth_process = new MongoAuthProcess(
connection.host,
connection.port,
gssapiServiceName,
options
);

// Perform initialization
mongo_auth_process.init(username, password, function(err) {
if (err) return callback(err, false);

// Perform the first step
mongo_auth_process.transition('', function(err, payload) {
if (err) return callback(err, false);

// Call the next db step
MongoDBGSSAPIFirstStep(
self,
mongo_auth_process,
payload,
db,
username,
password,
authdb,
sendAuthCommand,
connection,
callback
);
});
const username = credentials.username;
const password = credentials.password;
const mechanismProperties = credentials.mechanismProperties;
const serviceName =
mechanismProperties['gssapiservicename'] ||
mechanismProperties['gssapiServiceName'] ||
'mongodb';
performGssapiCanonicalizeHostName(host, mechanismProperties, (err, host) => {
if (err) return callback(err);
const initOptions = {};
if (password != null) {
Object.assign(initOptions, { user: username, password: password });
}
kerberos.initializeClient(
`${serviceName}${process.platform === 'win32' ? '/' : '@'}${host}`,
initOptions,
(err, client) => {
if (err) return callback(new MongoError(err));
callback(null, client);
}
);
});
};
}

//
// Perform first step against mongodb
var MongoDBGSSAPIFirstStep = function(
self,
mongo_auth_process,
payload,
db,
username,
password,
authdb,
sendAuthCommand,
connection,
callback
) {
// Build the sasl start command
var command = {
function saslStart(payload) {
return {
saslStart: 1,
mechanism: 'GSSAPI',
payload: payload,
payload,
autoAuthorize: 1
};

// Write the command on the connection
sendAuthCommand(connection, '$external.$cmd', command, (err, doc) => {
if (err) return callback(err, false);
// Execute mongodb transition
mongo_auth_process.transition(doc.payload, function(err, payload) {
if (err) return callback(err, false);

// MongoDB API Second Step
MongoDBGSSAPISecondStep(
self,
mongo_auth_process,
payload,
doc,
db,
username,
password,
authdb,
sendAuthCommand,
connection,
callback
);
});
});
};

//
// Perform first step against mongodb
var MongoDBGSSAPISecondStep = function(
self,
mongo_auth_process,
payload,
doc,
db,
username,
password,
authdb,
sendAuthCommand,
connection,
callback
) {
// Build Authentication command to send to MongoDB
var command = {
}
function saslContinue(payload, conversationId) {
return {
saslContinue: 1,
conversationId: doc.conversationId,
payload: payload
conversationId,
payload
};

// Execute the command
// Write the command on the connection
sendAuthCommand(connection, '$external.$cmd', command, (err, doc) => {
if (err) return callback(err, false);
// Call next transition for kerberos
mongo_auth_process.transition(doc.payload, function(err, payload) {
if (err) return callback(err, false);

// Call the last and third step
MongoDBGSSAPIThirdStep(
self,
mongo_auth_process,
payload,
doc,
db,
username,
password,
authdb,
sendAuthCommand,
connection,
callback
);
});
}
function negotiate(client, retries, payload, callback) {
client.step(payload, (err, response) => {
// Retries exhausted, raise error
if (err && retries === 0) return callback(err);
// Adjust number of retries and call step again
if (err) return negotiate(client, retries - 1, payload, callback);
// Return the payload
callback(undefined, response || '');
});
};

var MongoDBGSSAPIThirdStep = function(
self,
mongo_auth_process,
payload,
doc,
db,
username,
password,
authdb,
sendAuthCommand,
connection,
callback
) {
// Build final command
var command = {
saslContinue: 1,
conversationId: doc.conversationId,
payload: payload
};

// Execute the command
sendAuthCommand(connection, '$external.$cmd', command, (err, r) => {
if (err) return callback(err, false);
mongo_auth_process.transition(null, function(err) {
if (err) return callback(err, null);
callback(null, r);
}
function finalize(client, user, payload, callback) {
// GSS Client Unwrap
client.unwrap(payload, (err, response) => {
if (err) return callback(err);
// Wrap the response
client.wrap(response || '', { user }, (err, wrapped) => {
if (err) return callback(err);
// Return the payload
callback(undefined, wrapped);
});
});
};

/**
* This is a result from an authentication strategy
*
* @callback authResultCallback
* @param {error} error An error object. Set to null if no error present
* @param {boolean} result The result of the authentication process
*/

module.exports = GSSAPI;
}
function performGssapiCanonicalizeHostName(host, mechanismProperties, callback) {
const canonicalizeHostName =
typeof mechanismProperties.gssapiCanonicalizeHostName === 'boolean'
? mechanismProperties.gssapiCanonicalizeHostName
: false;
if (!canonicalizeHostName) return callback(undefined, host);
// Attempt to resolve the host name
dns.resolveCname(host, (err, r) => {
if (err) return callback(err);
// Get the first resolve host id
if (Array.isArray(r) && r.length > 0) {
return callback(undefined, r[0]);
}
callback(undefined, host);
});
}
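
A hedged sketch of exercising the rewritten GSSAPI provider from application code; the host, principal, and options are illustrative:

```js
const MongoClient = require('mongodb').MongoClient;

const uri =
  'mongodb://appuser%40EXAMPLE.COM@kerberos.example.com:27017/' +
  '?authMechanism=GSSAPI&authSource=%24external';

MongoClient.connect(uri, { useUnifiedTopology: true }, (err, client) => {
  if (err) throw err;
  // Kerberos client setup and the saslStart/saslContinue exchange above happen
  // during connect; the service name and gssapiCanonicalizeHostName are read from
  // the mechanism properties when supplied.
  client.close();
});
```
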
32 changes: 29 additions & 3 deletions lib/core/auth/mongo_credentials.js
@@ -47,7 +47,24 @@ class MongoCredentials {
this.password = options.password;
this.source = options.source || options.db;
this.mechanism = options.mechanism || 'default';
this.mechanismProperties = options.mechanismProperties;
this.mechanismProperties = options.mechanismProperties || {};

if (this.mechanism.match(/MONGODB-AWS/i)) {
if (this.username == null && process.env.AWS_ACCESS_KEY_ID) {
this.username = process.env.AWS_ACCESS_KEY_ID;
}

if (this.password == null && process.env.AWS_SECRET_ACCESS_KEY) {
this.password = process.env.AWS_SECRET_ACCESS_KEY;
}

if (this.mechanismProperties.AWS_SESSION_TOKEN == null && process.env.AWS_SESSION_TOKEN) {
this.mechanismProperties.AWS_SESSION_TOKEN = process.env.AWS_SESSION_TOKEN;
}
}

Object.freeze(this.mechanismProperties);
Object.freeze(this);
}

/**
@@ -69,12 +86,21 @@ class MongoCredentials {
* based on the server version and server supported sasl mechanisms.
*
* @param {Object} [ismaster] An ismaster response from the server
* @returns {MongoCredentials}
*/
resolveAuthMechanism(ismaster) {
// If the mechanism is not "default", then it does not need to be resolved
if (this.mechanism.toLowerCase() === 'default') {
this.mechanism = getDefaultAuthMechanism(ismaster);
if (this.mechanism.match(/DEFAULT/i)) {
return new MongoCredentials({
username: this.username,
password: this.password,
source: this.source,
mechanism: getDefaultAuthMechanism(ismaster),
mechanismProperties: this.mechanismProperties
});
}

return this;
}
}
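
A hedged sketch of the new immutable resolution behaviour: `resolveAuthMechanism` now returns a fresh `MongoCredentials` instead of mutating the (frozen) original. The require path, ismaster document, and field values are illustrative:

```js
const MongoCredentials = require('./mongo_credentials').MongoCredentials;

const creds = new MongoCredentials({
  username: 'app',
  password: 'secret',
  source: 'admin',
  mechanism: 'default'
});

const resolved = creds.resolveAuthMechanism({ saslSupportedMechs: ['SCRAM-SHA-256'] });
console.log(creds.mechanism);    // still 'default' -- the original instance is frozen
console.log(resolved.mechanism); // a concrete mechanism chosen from the ismaster response
```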

18 changes: 6 additions & 12 deletions lib/core/auth/mongocr.js
@@ -3,27 +3,21 @@
const crypto = require('crypto');
const AuthProvider = require('./auth_provider').AuthProvider;

/**
* Creates a new MongoCR authentication mechanism
*
* @extends AuthProvider
*/
class MongoCR extends AuthProvider {
/**
* Implementation of authentication for a single connection
* @override
*/
_authenticateSingleConnection(sendAuthCommand, connection, credentials, callback) {
auth(authContext, callback) {
const connection = authContext.connection;
const credentials = authContext.credentials;
const username = credentials.username;
const password = credentials.password;
const source = credentials.source;

sendAuthCommand(connection, `${source}.$cmd`, { getnonce: 1 }, (err, r) => {
connection.command(`${source}.$cmd`, { getnonce: 1 }, (err, result) => {
let nonce = null;
let key = null;

// Get nonce
if (err == null) {
const r = result.result;
nonce = r.nonce;
// Use node md5 generator
let md5 = crypto.createHash('md5');
@@ -43,7 +37,7 @@ class MongoCR extends AuthProvider {
key
};

sendAuthCommand(connection, `${source}.$cmd`, authenticateCommand, callback);
connection.command(`${source}.$cmd`, authenticateCommand, callback);
});
}
}
256 changes: 256 additions & 0 deletions lib/core/auth/mongodb_aws.js
@@ -0,0 +1,256 @@
'use strict';
const AuthProvider = require('./auth_provider').AuthProvider;
const MongoCredentials = require('./mongo_credentials').MongoCredentials;
const MongoError = require('../error').MongoError;
const crypto = require('crypto');
const http = require('http');
const maxWireVersion = require('../utils').maxWireVersion;
const url = require('url');

let aws4;
try {
aws4 = require('aws4');
} catch (e) {
// don't do anything;
}

const ASCII_N = 110;
const AWS_RELATIVE_URI = 'http://169.254.170.2';
const AWS_EC2_URI = 'http://169.254.169.254';
const AWS_EC2_PATH = '/latest/meta-data/iam/security-credentials';

class MongoDBAWS extends AuthProvider {
auth(authContext, callback) {
const connection = authContext.connection;
const credentials = authContext.credentials;

if (maxWireVersion(connection) < 9) {
callback(new MongoError('MONGODB-AWS authentication requires MongoDB version 4.4 or later'));
return;
}

if (aws4 == null) {
callback(
new MongoError(
'MONGODB-AWS authentication requires the `aws4` module, please install it as a dependency of your project'
)
);

return;
}

if (credentials.username == null) {
makeTempCredentials(credentials, (err, tempCredentials) => {
if (err) return callback(err);

authContext.credentials = tempCredentials;
this.auth(authContext, callback);
});

return;
}

const username = credentials.username;
const password = credentials.password;
const db = credentials.source;
const token = credentials.mechanismProperties.AWS_SESSION_TOKEN;
const bson = this.bson;

crypto.randomBytes(32, (err, nonce) => {
if (err) {
callback(err);
return;
}

const saslStart = {
saslStart: 1,
mechanism: 'MONGODB-AWS',
payload: bson.serialize({ r: nonce, p: ASCII_N })
};

connection.command(`${db}.$cmd`, saslStart, (err, result) => {
if (err) return callback(err);

const res = result.result;
const serverResponse = bson.deserialize(res.payload.buffer);
const host = serverResponse.h;
const serverNonce = serverResponse.s.buffer;
if (serverNonce.length !== 64) {
callback(
new MongoError(`Invalid server nonce length ${serverNonce.length}, expected 64`)
);
return;
}

if (serverNonce.compare(nonce, 0, nonce.length, 0, nonce.length) !== 0) {
callback(new MongoError('Server nonce does not begin with client nonce'));
return;
}

if (host.length < 1 || host.length > 255 || host.indexOf('..') !== -1) {
callback(new MongoError(`Server returned an invalid host: "${host}"`));
return;
}

const body = 'Action=GetCallerIdentity&Version=2011-06-15';
const options = aws4.sign(
{
method: 'POST',
host,
region: deriveRegion(serverResponse.h),
service: 'sts',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
'Content-Length': body.length,
'X-MongoDB-Server-Nonce': serverNonce.toString('base64'),
'X-MongoDB-GS2-CB-Flag': 'n'
},
path: '/',
body
},
{
accessKeyId: username,
secretAccessKey: password,
token
}
);

const authorization = options.headers.Authorization;
const date = options.headers['X-Amz-Date'];
const payload = { a: authorization, d: date };
if (token) {
payload.t = token;
}

const saslContinue = {
saslContinue: 1,
conversationId: 1,
payload: bson.serialize(payload)
};

connection.command(`${db}.$cmd`, saslContinue, err => {
if (err) return callback(err);
callback();
});
});
});
}
}

function makeTempCredentials(credentials, callback) {
function done(creds) {
if (creds.AccessKeyId == null || creds.SecretAccessKey == null || creds.Token == null) {
callback(new MongoError('Could not obtain temporary MONGODB-AWS credentials'));
return;
}

callback(
undefined,
new MongoCredentials({
username: creds.AccessKeyId,
password: creds.SecretAccessKey,
source: credentials.source,
mechanism: 'MONGODB-AWS',
mechanismProperties: {
AWS_SESSION_TOKEN: creds.Token
}
})
);
}

// If the environment variable AWS_CONTAINER_CREDENTIALS_RELATIVE_URI
// is set then drivers MUST assume that it was set by an AWS ECS agent
if (process.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI) {
request(
`${AWS_RELATIVE_URI}${process.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI}`,
(err, res) => {
if (err) return callback(err);
done(res);
}
);

return;
}

// Otherwise assume we are on an EC2 instance

// get a token

request(
`${AWS_EC2_URI}/latest/api/token`,
{ method: 'PUT', json: false, headers: { 'X-aws-ec2-metadata-token-ttl-seconds': 30 } },
(err, token) => {
if (err) return callback(err);

// get role name
request(
`${AWS_EC2_URI}/${AWS_EC2_PATH}`,
{ json: false, headers: { 'X-aws-ec2-metadata-token': token } },
(err, roleName) => {
if (err) return callback(err);

// get temp credentials
request(
`${AWS_EC2_URI}/${AWS_EC2_PATH}/${roleName}`,
{ headers: { 'X-aws-ec2-metadata-token': token } },
(err, creds) => {
if (err) return callback(err);
done(creds);
}
);
}
);
}
);
}

function deriveRegion(host) {
const parts = host.split('.');
if (parts.length === 1 || parts[1] === 'amazonaws') {
return 'us-east-1';
}

return parts[1];
}

function request(uri, options, callback) {
if (typeof options === 'function') {
callback = options;
options = {};
}

options = Object.assign(
{
method: 'GET',
timeout: 10000,
json: true
},
url.parse(uri),
options
);

const req = http.request(options, res => {
res.setEncoding('utf8');

let data = '';
res.on('data', d => (data += d));
res.on('end', () => {
if (options.json === false) {
callback(undefined, data);
return;
}

try {
const parsed = JSON.parse(data);
callback(undefined, parsed);
} catch (err) {
callback(new MongoError(`Invalid JSON response: "${data}"`));
}
});
});

req.on('error', err => callback(err));
req.end();
}

module.exports = MongoDBAWS;
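
A hedged sketch of connecting with the new MONGODB-AWS mechanism; the URI is illustrative, and credentials may come from the standard AWS environment variables (as wired up in `mongo_credentials.js` above) or from the ECS/EC2 temporary credentials fetched here:

```js
const MongoClient = require('mongodb').MongoClient;

// AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_SESSION_TOKEN may be set in the
// environment instead of embedding credentials in the URI
const uri =
  'mongodb://db.example.com:27017/' +
  '?authSource=%24external&authMechanism=MONGODB-AWS';

MongoClient.connect(uri, { useUnifiedTopology: true }, (err, client) => {
  if (err) throw err;
  client.close();
});
```
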
17 changes: 5 additions & 12 deletions lib/core/auth/plain.js
@@ -1,25 +1,18 @@
'use strict';

const retrieveBSON = require('../connection/utils').retrieveBSON;
const AuthProvider = require('./auth_provider').AuthProvider;

// TODO: can we get the Binary type from this.bson instead?
const BSON = retrieveBSON();
const Binary = BSON.Binary;

/**
* Creates a new Plain authentication mechanism
*
* @extends AuthProvider
*/
class Plain extends AuthProvider {
/**
* Implementation of authentication for a single connection
* @override
*/
_authenticateSingleConnection(sendAuthCommand, connection, credentials, callback) {
auth(authContext, callback) {
const connection = authContext.connection;
const credentials = authContext.credentials;
const username = credentials.username;
const password = credentials.password;

const payload = new Binary(`\x00${username}\x00${password}`);
const command = {
saslStart: 1,
@@ -28,7 +21,7 @@ class Plain extends AuthProvider {
autoAuthorize: 1
};

sendAuthCommand(connection, '$external.$cmd', command, callback);
connection.command('$external.$cmd', command, callback);
}
}
