I get this error when running the testing part (tried it on two separate systems and got the same error):
Start...
vocabulary size. source = 50004; target = 31280
number of XENT training sentences. 1000
number of PG training sentences. 1000
maximum batch size. 64
Building model...
('use_critic: ', False)
Loading from checkpoint at /media/BACKUP/ghproj_d/code_summarization/github-python/result/model_rf_hybrid_1_29_reinforce.pt
Traceback (most recent call last):
File "a2c-train.py", line 349, in
main()
File "a2c-train.py", line 254, in main
checkpoint = torch.load(opt.load_from, map_location=lambda storage, loc: storage)
File "/usr/local/lib/python2.7/dist-packages/torch/serialization.py", line 231, in load
return _load(f, map_location, pickle_module)
File "/usr/local/lib/python2.7/dist-packages/torch/serialization.py", line 369, in _load
magic_number = pickle_module.load(f)
cPickle.UnpicklingError: could not find MARK
Any help would be appreciated.
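For reference, cPickle's "could not find MARK" generally means the file being unpickled is truncated or is not a pickle stream at all; one common cause (an assumption here, not confirmed for this repo) is that only a Git LFS pointer file was downloaded instead of the actual checkpoint. A minimal sanity check on the file, using the path from the traceback above:

from __future__ import print_function
import os

# Path copied from the traceback above; adjust for your setup.
ckpt = "/media/BACKUP/ghproj_d/code_summarization/github-python/result/model_rf_hybrid_1_29_reinforce.pt"

# A trained checkpoint is normally many megabytes; a size of only a few
# hundred bytes suggests a Git LFS pointer or an interrupted download.
print("size (bytes):", os.path.getsize(ckpt))

with open(ckpt, "rb") as f:
    head = f.read(64)

# torch.save output begins with binary pickle (or zip) bytes; readable ASCII
# text such as "version https://git-lfs.github.com/spec/v1" means only the
# pointer file was fetched, not the real checkpoint.
print(repr(head))

If the file looks short or starts with plain text, re-downloading the checkpoint (or regenerating it with the training step) should resolve the unpickling error.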