How to use the baal.active.active_loop.ActiveLearningLoop class in baal

To help you get started, we’ve selected a few baal examples, based on popular ways it is used in public projects.
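The example below is taken from baal's CIFAR-10 VGG-16 script and assumes that the dataset wrapper, heuristic, model, criterion, and hyperparameters have already been built (in the original script, hyperparams and args come from argument parsing, which is not reproduced here). A minimal sketch of that setup is shown first; the module paths are those of the baal 1.x releases this example targets, while the dataset root, initial pool size, and hyperparameter values are placeholders rather than the script's actual choices.

import torch
import torch.nn as nn
from torch import optim
from torchvision import datasets, transforms
from torchvision.models import vgg16
from tqdm import tqdm

from baal.active import ActiveLearningDataset
from baal.active.active_loop import ActiveLearningLoop
from baal.active.heuristics import BALD
from baal.bayesian.dropout import patch_module
from baal.modelwrapper import ModelWrapper

use_cuda = torch.cuda.is_available()
transform = transforms.ToTensor()
train_ds = datasets.CIFAR10('/tmp', train=True, transform=transform, download=True)
test_set = datasets.CIFAR10('/tmp', train=False, transform=transform, download=True)

# Track which samples are labelled and which remain in the unlabelled pool.
active_set = ActiveLearningDataset(train_ds)
active_set.label_randomly(1000)   # seed the labelled set before the first step

heuristic = BALD()                # rank pool samples by BALD uncertainty
model = vgg16(num_classes=10)
criterion = nn.CrossEntropyLoss()

# Placeholder values; the original script reads these from the command line.
hyperparams = {'lr': 0.001, 'batch_size': 32, 'iterations': 20,
               'learning_epoch': 20, 'n_data_to_label': 100}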


github ElementAI / baal / experiments / vgg_mcdropout_cifar10.py
    # Replace Dropout layers with MCDropout so dropout stays active at prediction time
    model = patch_module(model)

    if use_cuda:
        model.cuda()
    optimizer = optim.SGD(model.parameters(), lr=hyperparams["lr"], momentum=0.9)

    # Wraps the model into a usable API.
    model = ModelWrapper(model, criterion)

    logs = {}
    logs['epoch'] = 0

    # Prediction uses a smaller batch size since each sample is
    # predicted `iterations` times, which makes it slower than training.
    active_loop = ActiveLearningLoop(active_set,
                                     model.predict_on_dataset,
                                     heuristic,
                                     hyperparams.get('n_data_to_label', 1),  # samples to label per step
                                     batch_size=10,
                                     iterations=hyperparams['iterations'],
                                     use_cuda=use_cuda)

    for epoch in tqdm(range(args.epoch)):
        model.train_on_dataset(active_set, optimizer, hyperparams["batch_size"], 1, use_cuda)  # train for 1 epoch on the currently labelled data

        # Validation!
        model.test_on_dataset(test_set, hyperparams["batch_size"], use_cuda)
        metrics = model.metrics

        if epoch % hyperparams['learning_epoch'] == 0:
            should_continue = active_loop.step()
            # step() returns False once there is nothing left to label
            if not should_continue:
                break
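In the baal releases this example targets, each call to active_loop.step() runs model.predict_on_dataset over the unlabelled pool (using the batch_size and iterations passed to the loop), ranks the pool with the heuristic, labels the top n_data_to_label samples, and returns False once nothing is left to label, which is what terminates the loop above.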