
I have a Sequential model in PyTorch:

model = nn.Sequential(
    nn.Embedding(alphabet_size, 64),
    nn.LSTM(64, ...),
    nn.Flatten(),
    nn.Linear(...),
    nn.Softmax(),
)

I would like to force the batch size on the Embedding layer:

nn.Embedding(alphabet_size, 64)

as in Keras:

Embedding(alphabet_size, 64, batch_input_shape=(batch_size, time_steps))

How can I do this in PyTorch?

1 Answer


Your question is already answered by the example in this existing post:

PyTorch sequential model and batch_size
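In short: PyTorch has no equivalent of Keras's batch_input_shape because it doesn't need one. Every nn.Module infers the batch size from the first dimension of its input tensor at runtime, so you never fix it in the model definition. Here is a minimal sketch of your model; the hidden size (128), vocabulary size, sequence length, and class count are assumptions, since your original code elides them with ..., and a small wrapper is needed because nn.LSTM returns a tuple that nn.Sequential can't pass along:

import torch
import torch.nn as nn

alphabet_size = 100   # assumed vocabulary size
time_steps = 20       # assumed sequence length
num_classes = 10      # assumed number of output classes

# nn.LSTM returns (output, (h_n, c_n)); nn.Sequential can only pass a
# single tensor between layers, so this wrapper keeps just the output.
class LSTMOutput(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

    def forward(self, x):
        output, _ = self.lstm(x)  # output: (batch, time_steps, hidden_size)
        return output

model = nn.Sequential(
    nn.Embedding(alphabet_size, 64),           # (batch, time_steps) -> (batch, time_steps, 64)
    LSTMOutput(64, 128),                       # hidden size 128 is an assumption
    nn.Flatten(),                              # -> (batch, time_steps * 128)
    nn.Linear(time_steps * 128, num_classes),  # -> (batch, num_classes)
    nn.Softmax(dim=1),
)

# The batch size comes from the input, not the model definition:
x = torch.randint(0, alphabet_size, (32, time_steps))  # batch of 32
print(model(x).shape)  # torch.Size([32, 10])

x = torch.randint(0, alphabet_size, (8, time_steps))   # batch of 8 also works
print(model(x).shape)  # torch.Size([8, 10])

If you truly must reject inputs of the wrong batch size, you can check x.shape[0] in a custom forward and raise an error, but there is no built-in way to declare it the way Keras does.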
