How to use the stream-json/streamers/StreamValues.streamValues function in stream-json

To help you get started, we’ve selected a few stream-json examples, based on popular ways it is used in public projects.
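Before the project examples, here is a minimal, self-contained sketch of what streamValues() does. The wiring and the inline input are illustrative assumptions, not code from any of the projects below: parser({ jsonStreaming: true }) tokenizes a stream of concatenated top-level JSON values, and streamValues() assembles each of them into a JavaScript value, emitting { key, value } pairs where key is a running index.

const { chain } = require('stream-chain');
const { parser } = require('stream-json');
const { streamValues } = require('stream-json/streamers/StreamValues');
const { Readable } = require('stream');

// Hypothetical input: three top-level JSON values, one after another.
const source = Readable.from(['{"a": 1}\n', '{"b": 2}\n', '42\n']);

const pipeline = chain([
  source,
  parser({ jsonStreaming: true }), // accept multiple top-level values
  streamValues()                   // emit { key: <index>, value: <assembled value> }
]);

pipeline.on('data', ({ key, value }) => console.log(key, value));
pipeline.on('end', () => console.log('done'));

Running this prints 0 { a: 1 }, 1 { b: 2 }, and 2 42, followed by 'done'.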

ravendb/ravendb-nodejs-client: src/Documents/Commands/MultiGet/MultiGetCommand.ts (view on GitHub)
                defaultTransform: "camel",
                ignorePaths: [/\./],
            })
            .collectResult({
                initResult: [] as GetResponse[],
                reduceResults: (result: GetResponse[], next) => {
                    return [...result, next["value"]];
                }
            })
            .process(bodyStream);

        const responsesResultsPromise = this._pipeline()
            .parseJsonAsync([
                pick({ filter: "Results" }),
                pick({ filter: /^\d+\.Result\b/i }),
                streamValues()
            ])
            .collectResult({
                initResult: [] as string[],
                reduceResults: (result: string[], next) => {
                    // TODO try read it another way
                    const resResult = JSON.stringify(next["value"]);
                    return [...result, resResult];
                }
            })
            .process(bodyStream);

        const [responses, responsesResults] = await Promise.all([responsesPromise, responsesResultsPromise]);
        for (let i = 0; i < responses.length; i++) {
            const res = responses[i];
            res.result = responsesResults[i];
            const command = this._commands[i];
esri-es/arcgis_websocket_server: streamserver/pipelines/default.js (view on GitHub)
function compose(ctx) {
  let pipeline = [
    parser({jsonStreaming: true}),
    streamValues()
  ];
  if (sanityCheck(CUSTOM_PIPELINE)) {
    pipeline.push(..._injectCtx(CUSTOM_PIPELINE,ctx));
  } else {
    console.log(`Default Pipeline setup...[Skipping custom pipeline]`);
    if (CUSTOM_PIPELINE.length > 0) {
      console.warn(`Something is wrong : Please review your custom pipeline`);
      process.exit(12);
    }
  }

  return pipeline;
}
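A possible way to consume the array returned by compose() is to hand it to stream-chain, which turns the list of stages into a single duplex pipeline. The ctx shape and the sample message below are assumptions for illustration, not part of the esri-es code; sanityCheck, CUSTOM_PIPELINE and _injectCtx live elsewhere in that module.

const { chain } = require('stream-chain');

// Hypothetical wiring; compose() is the function shown above.
const pipeline = chain(compose({ logger: console }));

pipeline.on('data', ({ key, value }) => console.log('message', key, value));

// Raw JSON text written to the chain enters the parser stage first.
pipeline.write('{"geometry": {"x": 1, "y": 2}}');
pipeline.end();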
ravendb/ravendb-nodejs-client: src/Mapping/Json/Streams/Pipelines.ts (view on GitHub)
export function getRestOfOutputPipeline(
    bodyStream: stream.Stream,
    ignoreFields: string | RegExp): RavenCommandResponsePipeline {
    return RavenCommandResponsePipeline.create()
        .parseJsonAsync([
            ignore({ filter: ignoreFields }),
            streamValues()
        ])
        .streamKeyCaseTransform("camel");
}
lifeomic/cli: lib/cmds/fhir_cmds/ingest.js (view on GitHub)
function getCSVChain () {
  return chain([
    csvParser(),
    asObjects(),
    StreamValues.streamValues()
  ]);
}
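In this snippet, csvParser and asObjects presumably come from stream-csv-as-json, chain from stream-chain, and StreamValues from stream-json/streamers/StreamValues; the imports are not shown above. A hedged usage sketch, assuming a local CSV file with a header row: every chunk emitted by the chain has the usual streamValues() shape { key, value }, with value holding one row assembled into an object keyed by the column names.

const fs = require('fs');

// Hypothetical file name; getCSVChain() is the function shown above.
fs.createReadStream('patients.csv')
  .pipe(getCSVChain())
  .on('data', ({ key, value }) => {
    console.log(`row ${key}:`, value);
  })
  .on('end', () => console.log('done reading CSV'));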

stream-json

stream-json is the micro-library of Node.js stream components for creating custom JSON processing pipelines with a minimal memory footprint. It can parse JSON files far exceeding available memory by streaming individual primitives using a SAX-inspired API.
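The SAX-inspired layer mentioned above is the token stream produced by parser() itself; streamers such as streamValues() sit on top of it and assemble tokens back into JavaScript values. A rough sketch of the token-level view, with a hypothetical file name:

const fs = require('fs');
const { parser } = require('stream-json');

fs.createReadStream('huge.json') // hypothetical multi-gigabyte JSON file
  .pipe(parser())
  .on('data', token => {
    // Tokens look like { name: 'startObject' }, { name: 'keyValue', value: 'id' },
    // { name: 'numberValue', value: '42' }, and so on.
    if (token.name === 'keyValue') {
      console.log('saw key:', token.value);
    }
  });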

License: BSD-3-Clause