probnmn.models.question_reconstructor

class probnmn.models.question_reconstructor.QuestionReconstructor(vocabulary: allennlp.data.vocabulary.Vocabulary, input_size: int = 256, hidden_size: int = 256, num_layers: int = 2, dropout: float = 0.0)

Bases: probnmn.modules.seq2seq_base.Seq2SeqBase

A wrapper over probnmn.modules.seq2seq_base.Seq2SeqBase. This sequence-to-sequence model accepts tokenized and padded program sequences and decodes them into question sequences.

Parameters
vocabulary: allennlp.data.vocabulary.Vocabulary

AllenNLP’s vocabulary. This vocabulary has three namespaces: “questions”, “programs”, and “answers”, which contain the respective token-to-integer mappings.

input_size: int, optional (default = 256)

The dimension of the inputs to the LSTM.

hidden_size: int, optional (default = 256)

The dimension of the outputs of the LSTM.

num_layers: int, optional (default = 2)

Number of recurrent layers in the LSTM.

dropout: float, optional (default = 0.0)

Dropout probability for the outputs of the LSTM at each layer except the last.
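
For illustration, a minimal instantiation sketch. The vocabulary directory path below is hypothetical; it is assumed to contain a serialized AllenNLP vocabulary with the three namespaces described above.

    from allennlp.data.vocabulary import Vocabulary

    from probnmn.models.question_reconstructor import QuestionReconstructor

    # Hypothetical path: assumes a vocabulary saved to disk with
    # "questions", "programs" and "answers" namespaces.
    vocabulary = Vocabulary.from_files("data/vocabulary")

    question_reconstructor = QuestionReconstructor(
        vocabulary=vocabulary,
        input_size=256,
        hidden_size=256,
        num_layers=2,
        dropout=0.0,
    )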

classmethod from_config(config: probnmn.config.Config)

Instantiate this class directly from a Config.
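
A sketch of config-based instantiation. The YAML file name is hypothetical, and the Config object is assumed to be constructible from a config file path and to carry the hyperparameters that from_config reads.

    from probnmn.config import Config
    from probnmn.models.question_reconstructor import QuestionReconstructor

    # Hypothetical config file; assumed to hold the model hyperparameters
    # (input size, hidden size, number of layers, dropout).
    _C = Config("configs/question_coding.yml")
    question_reconstructor = QuestionReconstructor.from_config(_C)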