Start & End Tokens in LSTM when making predictions
I see examples of LSTM sequence-to-sequence generation models that use start and end tokens for each sequence.
I would like to understand: when making predictions with such a model on an arbitrary input sequence, is it required to include the start and end tokens in that sequence?
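For reference, this is roughly the kind of decoding loop I have in mind (a minimal sketch only; `decoder_step`, the token ids, and the vocabulary size are placeholders, not my actual model):

```python
import numpy as np

# Hypothetical vocabulary and special-token ids (placeholders, not a real model).
START_ID, END_ID, VOCAB_SIZE = 1, 2, 10
MAX_LEN = 20

def decoder_step(prev_token_id, state):
    """Stand-in for one step of a trained seq2seq LSTM decoder.
    Returns a next-token score vector and an updated decoder state."""
    rng = np.random.default_rng(prev_token_id + state)
    logits = rng.random(VOCAB_SIZE)
    return logits, state + 1

# Inference: the decoder is seeded with the start token, and generation
# stops when the end token is predicted (or a length limit is reached).
token_id, state, generated = START_ID, 0, []
for _ in range(MAX_LEN):
    logits, state = decoder_step(token_id, state)
    token_id = int(np.argmax(logits))
    if token_id == END_ID:
        break
    generated.append(token_id)

print(generated)
```

My question is specifically about the input side: does the sequence I feed to the encoder also need to be wrapped in start and end tokens, or are those only relevant to the decoder?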
Topic lstm tensorflow rnn nlp
Category Data Science