How to use the Apify.client property in apify

To help you get started, we’ve selected a few apify examples based on popular ways the library is used in public projects.

github apifytech/apify-js — examples/call_actor.js (view on GitHub)
Apify.main(async () => {
    // Launch the web browser.
    const browser = await Apify.launchPuppeteer();

    console.log('Obtaining email address...');
    const user = await Apify.client.users.getUser();

    // Load charts and get last traded price of BTC
    console.log('Extracting data from');
    const page = await browser.newPage();
    await page.goto(''); // target URL omitted in this snippet
    const tradedPricesHtml = await page.$eval('#ticker-top ul', el => el.outerHTML);

    // Send prices to your email. For that, you can use an actor we already
    // have available on the platform under the name: apify/send-mail.
    // The second parameter to the invocation is the actor's
    // desired input. You can find the required input parameters by checking
    // the actor's documentation page:
    console.log(`Sending email to ${user.email}...`);
    await Apify.call('apify/send-mail', {
        subject: ' BTC',
        // ... (remainder of the input omitted in this snippet)
    });
});
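The actor input in the snippet above is truncated. As an illustration only, here is a sketch of what an input object for a mail-sending actor might look like; the field names (`to`, `subject`, `html`) and all sample values are assumptions for illustration, not taken from the actor's documentation:

```javascript
// Hypothetical stand-ins for values the snippet obtains at runtime:
// `user` for Apify.client.users.getUser(), `tradedPricesHtml` for the
// HTML scraped from the page.
const user = { email: 'jane@example.com' };
const tradedPricesHtml = '<ul><li>BTC: 42000</li></ul>';

// Sketch of the input object that would be passed as the second
// argument to Apify.call('apify/send-mail', mailInput).
const mailInput = {
    to: user.email,
    subject: 'Latest BTC prices',
    html: `<h1>BTC prices</h1>${tradedPricesHtml}`,
};

console.log(mailInput);
```

`Apify.call()` runs the named actor on the platform and waits for it to finish, passing the object as the actor's input.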

github apifytech/actor-scraper — src/bootstrap.js (view on GitHub)
const fetchInput = async () => {
    const input = await Apify.getValue('INPUT');

    const crawler = input.crawlerId
        ? await Apify.client.crawlers.getCrawlerSettings({ crawlerId: input.crawlerId })
        : {};

    // NOTE: Old crawler settings may contain null values; replace them with defaults.

    const mergedInput = _.defaults(input, crawler, INPUT_DEFAULTS, {
        actId: APIFY_ACT_ID,
        runId: APIFY_ACT_RUN_ID,
    });

    mergedInput.crawlPurls = mergedInput.crawlPurls || [];
    mergedInput.crawlPurls.forEach((purl) => {
        purl.parsedPurl = new PseudoUrl(purl.value);
    });
    // ... (remainder of the function omitted in this snippet)
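The merge above uses lodash's `_.defaults`, where earlier sources take precedence and later ones only fill in keys that are still `undefined`. A minimal, self-contained sketch of that merge semantics (all key names and values below are hypothetical):

```javascript
// Re-implementation of the _.defaults() precedence rule for illustration:
// walk the sources left to right, and only set a key if it is still
// undefined in the accumulator.
const defaults = (...sources) => sources.reduce((acc, src) => {
    for (const [key, value] of Object.entries(src)) {
        if (acc[key] === undefined) acc[key] = value;
    }
    return acc;
}, {});

const input = { maxDepth: 5 };                              // run input wins first
const crawler = { maxDepth: 10, timeoutSecs: 60 };          // stored crawler settings
const INPUT_DEFAULTS = { timeoutSecs: 30, retries: 3 };     // library defaults

const merged = defaults(input, crawler, INPUT_DEFAULTS);
console.log(merged); // { maxDepth: 5, timeoutSecs: 60, retries: 3 }
```

Two caveats: lodash's `_.defaults` mutates its first argument in place, whereas this sketch builds a fresh object; and `_.defaults` only fills keys that are `undefined`, not `null`, which is why the snippet's NOTE says null values from old crawler settings must be replaced separately.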
github apifytech/actor-scraper — src/main.js (view on GitHub)
const fetchInput = async () => {
    const input = await Apify.getValue('INPUT');

    if (!input.crawlerId) return input;

    const crawler = await Apify.client.crawlers.getCrawlerSettings({ crawlerId: input.crawlerId });

    return Object.assign({}, input, crawler);
};
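Note that main.js uses the opposite precedence from bootstrap.js: with `Object.assign`, later sources win, so the stored crawler settings override the local input. A small sketch (values hypothetical):

```javascript
// Object.assign({}, input, crawler): later sources take precedence,
// so crawler settings override overlapping keys in the input.
const input = { crawlerId: 'abc', maxDepth: 5 };
const crawler = { maxDepth: 10, startUrls: ['https://example.com'] };

const result = Object.assign({}, input, crawler);
console.log(result.maxDepth); // 10 — the crawler setting wins
```

Passing `{}` as the first argument keeps both `input` and `crawler` unmutated, since `Object.assign` writes into its first argument.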


The scalable web crawling and scraping library for JavaScript/Node.js. Enables development of data extraction and web automation jobs (not only) with headless Chrome and Puppeteer.
