docs/api.md (+22 -22)
@@ -40,7 +40,7 @@
 ## `pino([options], [destination]) => logger`

 The exported `pino` function takes two optional arguments,
-[`options`](#options) and [`destination`](#destination) and
+[`options`](#options) and [`destination`](#destination), and
 returns a [logger instance](#logger).

 <a id=options></a>
@@ -70,7 +70,7 @@ Additional levels can be added to the instance via the `customLevels` option.
 Default: `undefined`

 Use this option to define additional logging levels.
-The keys of the object correspond the namespace of the log level,
+The keys of the object correspond to the namespace of the log level,
 and the values should be the numerical value of the level.

 ```js
@@ -88,7 +88,7 @@ logger.foo('hi')
 Default: `false`

 Use this option to only use defined `customLevels` and omit Pino's levels.
-Logger's default `level` must be changed to a value in `customLevels`in order to use `useOnlyCustomLevels`
+Logger's default `level` must be changed to a value in `customLevels` to use `useOnlyCustomLevels`
 Warning: this option may not be supported by downstream transports.

 ```js
@@ -100,13 +100,13 @@ const logger = pino({
 level: 'foo'
 })
 logger.foo('hi')
-logger.info('hello') // Will throw an error saying info in not found in logger object
+logger.info('hello') // Will throw an error saying info is not found in logger object
 ```

 #### `depthLimit` (Number)

 Default: `5`

-Option to limit stringification at a specific nesting depth when logging circular object.
+Option to limit stringification at a specific nesting depth when logging circular objects.

 #### `edgeLimit` (Number)
@@ -266,11 +266,11 @@ Default: `undefined`
 As an array, the `redact` option specifies paths that should
 have their values redacted from any log output.

-Each path must be a string using a syntax which corresponds to JavaScript dot and bracket notation.
+Each path must be a string using a syntax that corresponds to JavaScript dot and bracket notation.

 If an object is supplied, three options can be specified:
 * `paths` (array): Required. An array of paths. See [redaction - Path Syntax ⇗](/docs/redaction.md#paths) for specifics.
-* `censor` (String|Function|Undefined): Optional. When supplied as a String the `censor` option will overwrite keys which are to be redacted. When set to `undefined` the key will be removed entirely from the object.
+* `censor` (String|Function|Undefined): Optional. When supplied as a String the `censor` option will overwrite keys that are to be redacted. When set to `undefined` the key will be removed entirely from the object.
 The `censor` option may also be a mapping function. The (synchronous) mapping function has the signature `(value, path) => redactedValue` and is called with the unredacted `value` and `path` to the key being redacted, as an array. For example given a redaction path of `a.b.c` the `path` argument would be `['a', 'b', 'c']`. The value returned from the mapping function becomes the applied censor value. Default: `'[Redacted]'`
 value synchronously.
 Default: `'[Redacted]'`
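The `(value, path) => redactedValue` contract described in this hunk can be sketched with a tiny stand-alone redactor. This is a hypothetical helper for illustration only; pino's real redaction is delegated to the `fast-redact` package and behaves differently in detail:

```js
// Hypothetical helper illustrating the (value, path) => redactedValue contract.
// Not pino's implementation (pino uses fast-redact internally).
function redactPath (obj, path, censor) {
  const copy = JSON.parse(JSON.stringify(obj)) // naive deep clone, fine for a sketch
  let node = copy
  for (const key of path.slice(0, -1)) node = node[key]
  const last = path[path.length - 1]
  // the mapping function receives the unredacted value and the path array
  node[last] = censor(node[last], path)
  return copy
}

const log = { a: { b: { c: 'secret' } } }
const out = redactPath(log, ['a', 'b', 'c'], (value, path) => `[Redacted:${path.join('.')}]`)
console.log(out.a.b.c) // '[Redacted:a.b.c]'
```

Note the censor function sees `path` as `['a', 'b', 'c']`, matching the `a.b.c` example in the text above.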
@@ -356,7 +356,7 @@ const formatters = {
 Changes the shape of the log object. This function will be called every time
 one of the log methods (such as `.info`) is called. All arguments passed to the
-log method, except the message, will be pass to this function. By default it does
+log method, except the message, will be passed to this function. By default, it does
 not change the shape of the log object.

 ```js
@@ -503,7 +503,7 @@ pino({ transport: {}}, '/path/to/somewhere') // THIS WILL NOT WORK, DO NOT DO TH
 pino({ transport: {}}, process.stderr) // THIS WILL NOT WORK, DO NOT DO THIS
 ```

-when using the `transport` option. In this case an `Error` will be thrown.
+when using the `transport` option. In this case, an `Error` will be thrown.

 * See [pino.transport()](#pino-transport)
@@ -513,7 +513,7 @@ The `onChild` function is a synchronous callback that will be called on each cre
 Any error thrown inside the callback will be uncaught and should be handled inside the callback.
-  // Exceute call back code for each newly created child.
+  // Execute callback code for each newly created child.
 }})
 // `onChild` will now be executed with the new child.
 parent.child(bindings)
@@ -567,7 +567,7 @@ path, e.g. `/tmp/1`.
 Default: `false`

 Using the global symbol `Symbol.for('pino.metadata')` as a key on the `destination` parameter and
-setting the key it to `true`, indicates that the following properties should be
+setting the key to `true` indicates that the following properties should be
 set on the `destination` object after each log line is written:

 * the last logging level as `destination.lastLevel`
@@ -613,7 +613,7 @@ The parameters are explained below using the `logger.info` method but the same a
 #### `mergingObject` (Object)

 An object can optionally be supplied as the first parameter. Each enumerable key and value
-of the `mergingObject` is copied in to the JSON log line.
+of the `mergingObject` is copied into the JSON log line.

 ```js
 logger.info({MIX: {IN: true}})
@@ -658,7 +658,7 @@ the following placeholders:
 * `%s` – string placeholder
 * `%d` – digit placeholder
-* `%O`, `%o` and `%j` – object placeholder
+* `%O`, `%o`, and `%j` – object placeholder

 Values supplied as additional arguments to the logger method will
 then be interpolated accordingly.
@@ -776,7 +776,7 @@ Write a `'error'` level log, if the configured `level` allows for it.
 Write a `'fatal'` level log, if the configured `level` allows for it.

-Since `'fatal'` level messages are intended to be logged just prior to the process exiting the `fatal`
+Since `'fatal'` level messages are intended to be logged just before the process exits, the `fatal`
 method will always sync flush the destination.
 Therefore it's important not to misuse `fatal` since
 it will cause performance overhead if used for any
@@ -832,7 +832,7 @@ Options for child logger. These options will override the parent logger options.
 ##### `options.level` (String)

 The `level` property overrides the log level of the child logger.
-By default the parent log level is inherited.
+By default, the parent log level is inherited.
 After the creation of the child logger, it is also accessible using the [`logger.level`](#logger-level) key.

 ```js
@@ -921,9 +921,9 @@ The core levels and their values are as follows:
 The logging level is a *minimum* level based on the associated value of that level.

-For instance if `logger.level` is `info` *(30)* then `info` *(30)*, `warn` *(40)*, `error` *(50)* and `fatal` *(60)* log methods will be enabled but the `trace` *(10)* and `debug` *(20)* methods, being less than 30, will not.
+For instance, if `logger.level` is `info` *(30)* then the `info` *(30)*, `warn` *(40)*, `error` *(50)*, and `fatal` *(60)* log methods will be enabled but the `trace` *(10)* and `debug` *(20)* methods, being less than 30, will not.

-The `silent` logging level is a specialized level which will disable all logging,
+The `silent` logging level is a specialized level that will disable all logging,
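The minimum-level rule in this hunk reduces to a numeric comparison against the core level values. A simplified model (not pino's actual implementation, which precomputes enabled methods):

```js
// Core pino level values, as listed in the surrounding docs.
const levels = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 }

// A log method is enabled when its value is >= the logger's configured value.
function isEnabled (loggerLevel, methodLevel) {
  return levels[methodLevel] >= levels[loggerLevel]
}

console.log(isEnabled('info', 'warn'))  // true  (40 >= 30)
console.log(isEnabled('info', 'debug')) // false (20 <  30)
```

This is why setting `logger.level` to `'info'` enables `info` through `fatal` while silencing `trace` and `debug`.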
docs/bundling.md (+4 -4)
@@ -2,17 +2,17 @@
 Due to its internal architecture based on Worker Threads, it is not possible to bundle Pino *without* generating additional files.

-In particular, a bundler must ensure that the following files are also bundle separately:
+In particular, a bundler must ensure that the following files are also bundled separately:

 * `lib/worker.js` from the `thread-stream` dependency
 * `file.js`
 * `lib/worker.js`
 * `lib/worker-pipeline.js`
 * Any transport used by the user (like `pino-pretty`)

-Once the files above have been generated, the bundler must also add information about the files above by injecting a code which sets `__bundlerPathsOverrides` in the `globalThis` object.
+Once the files above have been generated, the bundler must also add information about them by injecting code that sets `__bundlerPathsOverrides` on the `globalThis` object.

-The variable is a object whose keys are identifier for the files and the values are the paths of files relative to the currently bundle files.
+The variable is an object whose keys are identifiers for the files and whose values are the paths of the files relative to the bundled files.

-Note that `pino/file`, `pino-worker`, `pino-pipeline-worker` and `thread-stream-worker` are required identifiers. Other identifiers are possible based on the user configuration.
+Note that `pino/file`, `pino-worker`, `pino-pipeline-worker`, and `thread-stream-worker` are required identifiers. Other identifiers are possible based on the user configuration.
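A minimal sketch of such an injected override map, using the four required identifiers from the text above. The `/dist/...` paths are hypothetical bundler output names, not real files:

```js
// Sketch: the bundler injects code like this into the bundle entry point.
// The /dist/... paths are hypothetical; a real bundler plugin computes them.
globalThis.__bundlerPathsOverrides = {
  'pino/file': '/dist/pino-file.js',
  'pino-worker': '/dist/pino-worker.js',
  'pino-pipeline-worker': '/dist/pino-pipeline-worker.js',
  'thread-stream-worker': '/dist/thread-stream-worker.js'
}
```

Pino and `thread-stream` consult this global at runtime to locate their worker files instead of resolving them relative to `node_modules`.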
docs/ecosystem.md (+3 -3)
@@ -43,7 +43,7 @@ in a MongoDB database.
 + [`pino-noir`](https://github.com/pinojs/pino-noir): redact sensitive information
 in logs.
 + [`pino-pretty`](https://github.com/pinojs/pino-pretty): basic prettifier to
-make log lines humanreadable.
+make log lines human-readable.
 + [`pino-socket`](https://github.com/pinojs/pino-socket): send logs to TCP or UDP
 destinations.
 + [`pino-std-serializers`](https://github.com/pinojs/pino-std-serializers): the
@@ -64,7 +64,7 @@ the logger for the [Rill framework](https://rill.site/).
 + [`pino-colada`](https://github.com/lrlna/pino-colada): cute ndjson formatter for pino.
 + [`pino-fluentd`](https://github.com/davidedantonio/pino-fluentd): send Pino logs to Elasticsearch,
-MongoDB and many [others](https://www.fluentd.org/dataoutputs) via Fluentd.
+MongoDB, and many [others](https://www.fluentd.org/dataoutputs) via Fluentd.
 + [`pino-pretty-min`](https://github.com/unjello/pino-pretty-min): a minimal
 prettifier inspired by the [logrus](https://github.com/sirupsen/logrus) logger.
 + [`pino-rotating-file`](https://github.com/homeaway/pino-rotating-file): a hapi-pino log transport for splitting logs into separate, automatically rotating files.
@@ -73,4 +73,4 @@ prettifier inspired by the [logrus](https://github.com/sirupsen/logrus) logger.
 + [`pino-dev`](https://github.com/dnjstrom/pino-dev): simple prettifier for pino with built-in support for common ecosystem packages.
 + [`@newrelic/pino-enricher`](https://github.com/newrelic/newrelic-node-log-extensions/blob/main/packages/pino-log-enricher): a log customization to add New Relic context to use [Logs In Context](https://docs.newrelic.com/docs/logs/logs-context/logs-in-context/)
 + [`pino-lambda`](https://github.com/FormidableLabs/pino-lambda): log transport for cloudwatch support inside aws-lambda
-+ [`cloud-pine`](https://github.com/metcoder95/cloud-pine): transport that provide abstraction and compatibility with [`@google-cloud/logging`](https://www.npmjs.com/package/@google-cloud/logging).
++ [`cloud-pine`](https://github.com/metcoder95/cloud-pine): transport that provides abstraction and compatibility with [`@google-cloud/logging`](https://www.npmjs.com/package/@google-cloud/logging).
-Although it is works, we recommend using one of these options instead if you are able:
+Although it works, we recommend using one of these options instead if you are able:

 1. If the only change desired is the name then a transport can be used. One such
 transport is [`pino-text-level-transport`](https://npm.im/pino-text-level-transport).
@@ -202,7 +202,7 @@ $ npm i pino-debug
 $ DEBUG=* node -r pino-debug app.js
 ```

-[`pino-debug`](https://github.com/pinojs/pino-debug) also offers finegrain control to map specific `debug`
+[`pino-debug`](https://github.com/pinojs/pino-debug) also offers fine-grained control to map specific `debug`
 namespaces to `pino` log levels. See [`pino-debug`](https://github.com/pinojs/pino-debug)
 for more.
@@ -211,8 +211,8 @@ for more.
 Pino uses [sonic-boom](https://github.com/mcollina/sonic-boom) to speed
 up logging. Internally, it uses [`fs.write`](https://nodejs.org/dist/latest-v10.x/docs/api/fs.html#fs_fs_write_fd_string_position_encoding_callback) to write log lines directly to a file
-descriptor. On Windows, unicode output is not handled properly in the
-terminal (both `cmd.exe` and powershell), and as such the output could
+descriptor. On Windows, Unicode output is not handled properly in the
+terminal (both `cmd.exe` and PowerShell), and as such the output could
 be visualized incorrectly if the log lines include utf8 characters. It
 is possible to configure the terminal to visualize those characters
 correctly with the use of [`chcp`](https://ss64.com/nt/chcp.html) by
@@ -222,7 +222,7 @@ Node.js.
 <a id="stackdriver"></a>
 ## Mapping Pino Log Levels to Google Cloud Logging (Stackdriver) Severity Levels

-Google Cloud Logging uses `severity` levels instead log levels. As a result, all logs may show as INFO
+Google Cloud Logging uses `severity` levels instead of log levels. As a result, all logs may show as INFO
 level logs while completely ignoring the level set in the pino log. Google Cloud Logging also prefers that
 log data is present inside a `message` key instead of the default `msg` key that Pino uses. Use a technique
 similar to the one below to retain log levels in Google Cloud Logging
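The core of such a technique is a lookup from pino's numeric level values to Cloud Logging severity names, typically wired into pino's `formatters.level` hook (which receives `(label, number)`). Treat the exact severity table below as an assumption based on Cloud Logging's documented severities:

```js
// Sketch: map pino numeric levels to Cloud Logging severity names.
// The table is an assumption, not an official pino mapping.
const severityByLevel = {
  10: 'DEBUG',    // trace
  20: 'DEBUG',    // debug
  30: 'INFO',     // info
  40: 'WARNING',  // warn
  50: 'ERROR',    // error
  60: 'CRITICAL'  // fatal
}

// Shape usable as pino's `formatters.level` hook: (label, number) => object.
function level (label, number) {
  return { severity: severityByLevel[number] || 'DEFAULT', level: number }
}

console.log(level('warn', 40)) // { severity: 'WARNING', level: 40 }
```

Passing this function as `formatters: { level }` to `pino()` would emit a `severity` key alongside the numeric level, which Cloud Logging can then classify.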
-The [pino-datadog](https://www.npmjs.com/package/pino-datadog) module is a transport that will forward logs to [DataDog](https://www.datadoghq.com/) through it's API.
+The [pino-datadog](https://www.npmjs.com/package/pino-datadog) module is a transport that will forward logs to [DataDog](https://www.datadoghq.com/) through its API.

 Given an application `foo` that logs via pino, you would use `pino-datadog` like so:
@@ -742,7 +742,7 @@ const transport = pino.transport({
 projectId: 1,
 projectKey: "REPLACE_ME",
 environment: "production",
-// aditional options for airbrake
+// additional options for airbrake
 performanceStats: false,
 },
 },
@@ -757,7 +757,7 @@ pino(transport)
 <a id="pino-socket"></a>
 ### pino-socket

-[pino-socket][pino-socket] is a transport that will forward logs to a IPv4
+[pino-socket][pino-socket] is a transport that will forward logs to an IPv4
 UDP or TCP socket.

 As an example, use `socat` to fake a listener:
@@ -841,9 +841,9 @@ https://github.com/deviantony/docker-elk to setup an ELK stack.
 <a id="pino-stackdriver"></a>
 ### pino-stackdriver
-The [pino-stackdriver](https://www.npmjs.com/package/pino-stackdriver) module is a transport that will forward logs to the [Google Stackdriver](https://cloud.google.com/logging/) log service through it's API.
+The [pino-stackdriver](https://www.npmjs.com/package/pino-stackdriver) module is a transport that will forward logs to the [Google Stackdriver](https://cloud.google.com/logging/) log service through its API.

-Given an application `foo` that logs via pino, a stackdriver log project `bar` and credentials in the file `/credentials.json`, you would use `pino-stackdriver`
+Given an application `foo` that logs via pino, a stackdriver log project `bar`, and credentials in the file `/credentials.json`, you would use `pino-stackdriver`
+Here we discuss some technical details of how Pino communicates with its [worker threads](https://nodejs.org/api/worker_threads.html).
+
+Pino uses [`thread-stream`](https://github.com/pinojs/thread-stream) to create a stream for transports.
+When we create a stream with `thread-stream`, `thread-stream` spawns a [worker](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/index.js#L50-L60) (an independent JavaScript execution thread).
+
+### Error messages
+How are error messages propagated from a transport worker to Pino?
+
+Let's assume we have a transport with an error listener:
+```js
+// index.js
+const transport = pino.transport({
+  target: './transport.js'
+})
+
+transport.on('error', err => {
+  console.error('error caught', err)
+})
+
+const log = pino(transport)
+```
+
+When our worker emits an error event, the worker has listeners for it: [error](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/lib/worker.js#L59-L70) and [unhandledRejection](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/lib/worker.js#L135-L141). These listeners send the error message to the main thread where Pino is present.
+
+When Pino receives the error message, it further [emits](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/index.js#L349) the error message. Finally, the error message arrives at our `index.js` and is caught by our error listener.