
I am following the NLP tutorials on PyTorch's tutorials website. I am getting different output from what it should show, so I copy-pasted the whole code as-is, and the output is still different.

My code is shared in this gist:

Example: An LSTM for Part-of-Speech Tagging

For the 1st sentence

[‘The’, ‘dog’, ‘ate’, ‘the’, ‘apple’]
[‘DET’, ‘NN’, ‘V’, ‘DET’, ‘NN’]

the output comes out as below:

tensor([[-0.7662, -0.6405, -4.8002],
[-2.7163, -0.0698, -6.6515],
[-3.1324, -5.7668, -0.0479],
[-0.0528, -3.3832, -4.0481],
[-2.4527, -0.0931, -5.8702]])

I am getting the sequence 1 1 2 0 1 rather than the expected 0 1 2 0 1.
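Each row of the tensor holds log-probabilities over the three tags (DET, NN, V), and the predicted tag for a word is the index of the largest value in its row. A quick check in plain Python (equivalent to `scores.argmax(dim=1)` in PyTorch) shows where the sequences come from:

```python
# Log-probability rows from the output tensor above; columns = (DET, NN, V).
scores = [
    [-0.7662, -0.6405, -4.8002],
    [-2.7163, -0.0698, -6.6515],
    [-3.1324, -5.7668, -0.0479],
    [-0.0528, -3.3832, -4.0481],
    [-2.4527, -0.0931, -5.8702],
]
# Predicted tag index per word = position of the row maximum.
preds = [row.index(max(row)) for row in scores]
print(preds)  # [1, 1, 2, 0, 1] -- 'The' is tagged NN instead of DET
```

The first row's maximum (-0.6405) sits at index 1 rather than index 0, which is exactly the mismatch described above.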

Can anyone please check this and point out why I am getting different output?

  • Not able to reproduce the error with the given gist; I face a different error at the beginning,
    – Ryan
    Commented Jun 26, 2018 at 11:31
  • @Ryan that's pretty strange! I just checked this again in Google Colab: colab.research.google.com/drive/… and am able to get the same tensor output. Moreover, please note I am using PyTorch version 0.4
    – Anis
    Commented Jun 26, 2018 at 12:17
  • Oh, I seem to be using 0.3. Maybe that's why I'm not able to reproduce the error.
    – Ryan
    Commented Jun 27, 2018 at 6:10
  • nvm, I made that 500 epochs and it now outputs correctly. But I really wonder why that is?
    – Anis
    Commented Jun 30, 2018 at 20:07

1 Answer


I updated the number of epochs to 500, i.e., ran the training loop 500 times, and it now outputs the correct sequence. With only a handful of epochs the model on this tiny dataset simply has not converged yet, so the scores for 'The' still favor the wrong tag; more passes over the data let the loss drop far enough for the argmax to flip to DET.
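A minimal sketch of the tutorial-style tagger with the epoch count raised to 500 (model sizes, optimizer, and learning rate follow the tutorial's defaults; exact loss values will differ between runs and PyTorch versions):

```python
import torch
import torch.nn as nn
import torch.optim as optim

torch.manual_seed(1)

training_data = [
    ("The dog ate the apple".split(), ["DET", "NN", "V", "DET", "NN"]),
    ("Everybody read that book".split(), ["NN", "V", "DET", "NN"]),
]
word_to_ix = {}
for sent, _ in training_data:
    for w in sent:
        word_to_ix.setdefault(w, len(word_to_ix))
tag_to_ix = {"DET": 0, "NN": 1, "V": 2}

class LSTMTagger(nn.Module):
    def __init__(self, embedding_dim, hidden_dim, vocab_size, tagset_size):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embedding_dim)
        self.lstm = nn.LSTM(embedding_dim, hidden_dim)
        self.fc = nn.Linear(hidden_dim, tagset_size)

    def forward(self, sentence):
        emb = self.embeddings(sentence)
        out, _ = self.lstm(emb.view(len(sentence), 1, -1))
        return torch.log_softmax(self.fc(out.view(len(sentence), -1)), dim=1)

def encode(seq, to_ix):
    return torch.tensor([to_ix[s] for s in seq], dtype=torch.long)

model = LSTMTagger(embedding_dim=6, hidden_dim=6,
                   vocab_size=len(word_to_ix), tagset_size=len(tag_to_ix))
loss_fn = nn.NLLLoss()
opt = optim.SGD(model.parameters(), lr=0.1)

def total_loss():
    # Sum the loss over the whole (tiny) training set, without gradients.
    with torch.no_grad():
        return sum(
            loss_fn(model(encode(sent, word_to_ix)),
                    encode(tags, tag_to_ix)).item()
            for sent, tags in training_data
        )

loss_before = total_loss()

for epoch in range(500):  # 500 instead of the tutorial's smaller count
    for sent, tags in training_data:
        model.zero_grad()
        scores = model(encode(sent, word_to_ix))
        loss = loss_fn(scores, encode(tags, tag_to_ix))
        loss.backward()
        opt.step()

loss_after = total_loss()

with torch.no_grad():
    pred = model(encode(training_data[0][0], word_to_ix)).argmax(dim=1)
    # After enough epochs this converges to the expected 0 1 2 0 1.
    print(pred.tolist())
```

The key change versus the under-trained run is simply the `range(500)` bound; everything else matches the tutorial's setup.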
