Hi, this is very impressive work. I have a question, though: is the training script correct, given that the `arch` parameter is set to 'transformer'? If so, where is the core model structure code?
This parameter can be set according to the author's paper "Wait-info Policy: Balancing Source and Target at Information Level for Simultaneous Machine Translation". For example, use "transformer_wmt_en_de" for the de2en Transformer-base model, and "transformer_vaswani_wmt_en_de_big" for the de2en Transformer-big model.
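To illustrate, a minimal sketch of how the `--arch` flag is passed to `fairseq-train` (the repository builds on fairseq); the data path and the other hyperparameters here are placeholders, not the paper's exact training configuration:

```shell
# Hypothetical invocation: select the Transformer-base architecture
# from the paper via --arch. Replace data-bin/wmt15_de_en and the
# optimizer/LR settings with the values from the repository's script.
fairseq-train data-bin/wmt15_de_en \
    --arch transformer_wmt_en_de \
    --optimizer adam --lr 0.0005 \
    --max-tokens 4096 \
    --save-dir checkpoints/de2en_base
```

Swapping `--arch transformer_wmt_en_de` for `--arch transformer_vaswani_wmt_en_de_big` would select the Transformer-big configuration instead.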