How to use the dnn.pytorch.layer module in dnn

To help you get started, we’ve selected a few dnn examples, based on popular ways it is used in public projects.


Example from NUSTM/pytorch-dnnnlp, pytorch/model.py:
def __init__(self, emb_matrix, args):
        """
        Initialize the model data and layers
        * emb_matrix [np.array]: word embedding matrix
        * args [dict]: all model arguments
        """
        nn.Module.__init__(self)
        base.base.__init__(self, args)

        self.emb_mat = layer.embedding_layer(emb_matrix, self.emb_type)
        self.pos_emb_mat = layer.positional_embedding_layer(self.n_hidden)
        self.drop_out = nn.Dropout(self.drop_prob)
        self.transformer = nn.ModuleList([
            layer.transformer_layer(self.emb_dim, self.n_hidden, self.n_head) for _ in range(self.n_layer)
        ])
        self.predict = layer.softmax_layer(self.emb_dim, self.n_class)
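The stacking pattern above — a list comprehension wrapped in `nn.ModuleList` so that every layer's parameters are registered with the parent module — can be sketched with plain `torch.nn` layers. The class and parameter names below are illustrative, not the repo's API:

```python
import torch
import torch.nn as nn

class StackedEncoder(nn.Module):
    def __init__(self, dim=32, n_layer=4):
        super().__init__()
        # nn.ModuleList (unlike a plain Python list) registers each
        # sub-layer, so its parameters appear in .parameters()
        self.blocks = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(n_layer)]
        )

    def forward(self, x):
        for block in self.blocks:
            x = torch.relu(block(x))
        return x

enc = StackedEncoder()
out = enc(torch.randn(2, 32))
print(out.shape)  # torch.Size([2, 32])
```

Had the blocks been kept in a plain list, `enc.parameters()` would miss them and the optimizer would never update their weights.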

Example from NUSTM/pytorch-dnnnlp, pytorch/model.py:
def init_weights(self):
        for m in self.modules():
            if isinstance(m, (layer.transformer_layer, layer.softmax_layer)):
                m.init_weights()
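The `init_weights` pattern — walking `self.modules()` and re-initializing only selected layer types — works the same way with standard `torch.nn` layers. A minimal sketch, assuming ordinary `nn.Linear` layers in place of the repo's custom `transformer_layer`/`softmax_layer`:

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(8, 16)
        self.predict = nn.Linear(16, 3)

    def init_weights(self):
        # self.modules() yields this module and every submodule,
        # so one loop reaches all layers regardless of nesting
        for m in self.modules():
            if isinstance(m, nn.Linear):
                nn.init.xavier_uniform_(m.weight)
                nn.init.zeros_(m.bias)

model = TinyModel()
model.init_weights()
print(model.predict.bias.abs().sum().item())  # 0.0 — biases zeroed
```

Dispatching on `isinstance` keeps the initialization logic in one place even as new layer types are added to the model.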