Hi, thanks for making the code public!

I have a question regarding the function _get_parallel_step_context. Here,

attention-learn-to-route/nets/attention_model.py, Line 378 in c66da2c

num_steps would always be 1, as the current_node reads the prev_a of the TSP state. This then means that the branch

attention-learn-to-route/nets/attention_model.py, Line 427 in c66da2c
if num_steps == 1:  # We need to special case if we have only 1 step, may be the first or not

will always be hit, and lines 436 to 449 will never be used. Is this correct, or am I missing something here?

Thanks in advance, and looking forward to your reply!
Jingwei
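For concreteness, this is roughly the shape situation I have in mind during sequential decoding (a toy sketch with made-up sizes, not the repository's code):

```python
import torch

# Toy sizes, assumed for illustration only.
batch_size, graph_size, embed_dim = 4, 10, 128
embeddings = torch.randn(batch_size, graph_size, embed_dim)  # encoder output

# During rollout, the state's prev_a holds a single "current" node per instance,
# so the steps dimension of current_node is always 1.
prev_a = torch.randint(0, graph_size, (batch_size, 1))  # (batch_size, 1)
current_node = prev_a

_, num_steps = current_node.size()
print(num_steps)  # 1 -> the `if num_steps == 1` branch is always taken

# Gathering the embedding of the current node keeps that singleton steps dimension.
step_context = embeddings.gather(
    1, current_node[..., None].expand(batch_size, num_steps, embed_dim)
)  # (batch_size, 1, embed_dim)
```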
This is correct. The reason there is a steps dimension is that it can be used to evaluate the model on a given tour in a single forward pass, which is much more efficient than one step at a time. This could be useful for, e.g., supervised training (teacher forcing) or things like experience replay. This code is a leftover from some early experiments in that direction which I thought might still be useful to somebody.
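For illustration, here is a rough sketch of that single-forward-pass idea (this is not the repository's actual decoder: the toy scoring, the placeholder for the first step, and all names below are made up, and masking of already-visited nodes is omitted):

```python
import torch
import torch.nn.functional as F

# Hypothetical example: score a *given* tour in one forward pass by building the
# step context for all steps at once (num_steps == graph_size instead of 1).
batch_size, graph_size, embed_dim = 4, 10, 128
embeddings = torch.randn(batch_size, graph_size, embed_dim)                  # encoder output
tour = torch.stack([torch.randperm(graph_size) for _ in range(batch_size)])  # fixed tours to evaluate

# "Previous node" at every step: for step t > 0 it is tour[:, t - 1]; step 0 has
# no predecessor, so the first node is reused as a placeholder here.
prev = torch.cat([tour[:, :1], tour[:, :-1]], dim=1)                         # (batch_size, num_steps)

# One gather builds the step context for the whole tour at once.
step_context = embeddings.gather(
    1, prev[..., None].expand(batch_size, graph_size, embed_dim)
)                                                                            # (batch_size, num_steps, embed_dim)

# Toy attention-style scoring over all steps in a single batched matmul.
logits = torch.bmm(step_context, embeddings.transpose(1, 2))                 # (batch_size, num_steps, graph_size)
log_p = F.log_softmax(logits, dim=-1)

# Log-likelihood of the given tours, summed over steps -- usable as a
# teacher-forcing loss or for re-scoring stored tours (experience replay).
ll = log_p.gather(2, tour[..., None]).squeeze(-1).sum(dim=1)                 # (batch_size,)
print(ll.shape)  # torch.Size([4])
```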