How to use the d2l.base.Timer function in d2l

To help you get started, we've selected a few d2l examples based on popular ways the library is used in public projects.


Source: dsgiitr/d2l-pytorch, d2l/train.py (view on GitHub)
    # Initialization: small random weights and zero bias, wrapped with the
    # legacy torch.autograd.Variable API. `feature_dim`, `trainer`,
    # `hyperparams`, `data_iter` and `num_epochs` are parameters of the
    # enclosing training function; `Timer`, `Animator`, `linreg`,
    # `squared_loss` and `evaluate_loss` come from the d2l package.
    w1 = np.random.normal(scale=0.01, size=(feature_dim, 1))
    b1 = np.zeros(1)
    w = Variable(torch.from_numpy(w1), requires_grad=True)
    b = Variable(torch.from_numpy(b1), requires_grad=True)

    # Build the optimizer requested by the caller.
    if trainer.__name__ == 'SGD':
        optimizer = trainer([w, b], lr=hyperparams['lr'],
                            momentum=hyperparams['momentum'])
    elif trainer.__name__ == 'RMSprop':
        optimizer = trainer([w, b], lr=hyperparams['lr'],
                            alpha=hyperparams['gamma'])

    net, loss = lambda X: linreg(X, w, b), squared_loss
    # Train
    animator = Animator(xlabel='epoch', ylabel='loss',
                        xlim=[0, num_epochs], ylim=[0.22, 0.35])
    # Timer() begins timing on construction; it is stopped and restarted
    # below so that evaluation/plotting time is excluded.
    n, timer = 0, Timer()

    for _ in range(num_epochs):
        for X, y in data_iter:
            X, y = Variable(X), Variable(y)
            optimizer.zero_grad()
            output = net(X)
            l = loss(output, y).mean()
            l.backward()
            optimizer.step()
            n += X.shape[0]
            if n % 200 == 0:
                # Pause the timer while logging the running loss, then resume.
                timer.stop()
                animator.add(n / X.shape[0] / len(data_iter),
                             evaluate_loss(net, data_iter, loss))
                timer.start()
    print('loss: %.3f, %.3f sec/epoch' % (animator.Y[0][-1], timer.avg()))
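
The excerpt above only exercises Timer through stop(), start() and avg(). Below is a minimal, self-contained sketch of a timer with that same interface, plus a short usage example; the names SketchTimer and the example workload are hypothetical, and the real d2l.base.Timer may differ in its details.

import time

class SketchTimer:
    """Record multiple running times (sketch of a d2l-style Timer API)."""
    def __init__(self):
        self.times = []
        self.start()  # begin timing immediately, as the training loop above assumes

    def start(self):
        """Start (or restart) the current interval."""
        self.tik = time.time()

    def stop(self):
        """Stop the current interval and record its duration."""
        self.times.append(time.time() - self.tik)
        return self.times[-1]

    def avg(self):
        """Average duration of the recorded intervals."""
        return sum(self.times) / len(self.times)

    def sum(self):
        """Total recorded time."""
        return sum(self.times)

# Example usage (hypothetical workload): time two chunks of work while
# excluding the pause between them, mirroring the training loop above.
timer = SketchTimer()
sum(i * i for i in range(100_000))  # first timed chunk
timer.stop()
# ... untimed work (e.g. evaluation or plotting) would go here ...
timer.start()
sum(i * i for i in range(100_000))  # second timed chunk
timer.stop()
print('avg %.4f sec, total %.4f sec' % (timer.avg(), timer.sum()))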