Should validation data be broken down into batches or not?

I am using fit_generator to train the model. The training data is read from a generator function that yields batches of a constant size. Now I want to know what approach I should adopt for the validation data: should I write a generator for the validation set as well, or load it completely into memory and pass it to fit_generator?

Note: the validation dataset fits into memory quite easily, so batching is optional for me.
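For concreteness, here is a minimal sketch of the two options I am choosing between (the data, model, and generator are synthetic and purely illustrative):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Synthetic data purely for illustration.
x_train, y_train = np.random.rand(1000, 8), np.random.randint(0, 2, 1000)
x_val, y_val = np.random.rand(200, 8), np.random.randint(0, 2, 200)

def batch_generator(x, y, batch_size=32):
    """Yield batches of a constant size, forever."""
    while True:
        idx = np.random.randint(0, len(x), size=batch_size)
        yield x[idx], y[idx]

model = Sequential([Dense(16, activation="relu", input_shape=(8,)),
                    Dense(1, activation="sigmoid")])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Option 1: validation set loaded completely into memory (a plain tuple).
model.fit_generator(batch_generator(x_train, y_train),
                    steps_per_epoch=len(x_train) // 32,
                    epochs=2,
                    validation_data=(x_val, y_val))

# Option 2: a generator for the validation set as well; validation_steps
# tells Keras how many batches make up one validation pass.
model.fit_generator(batch_generator(x_train, y_train),
                    steps_per_epoch=len(x_train) // 32,
                    epochs=2,
                    validation_data=batch_generator(x_val, y_val),
                    validation_steps=len(x_val) // 32)
```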


If the validation batch size is 1, there is no difference compared with loading the data completely. With any other batch size, however, the reported evaluation metric is an average of its per-batch values, so the exact value over the whole validation set will not necessarily be obtained.
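To illustrate the averaging point (a minimal sketch in plain NumPy, not the answer's own code): for a metric such as accuracy, an unweighted mean of per-batch values can differ from the value computed over the full validation set whenever the batches are not all the same size.

```python
import numpy as np

# Toy labels and predictions, purely for illustration.
y_true = np.array([1, 1, 1, 1, 1, 0, 0])
y_pred = np.array([1, 1, 1, 1, 0, 0, 1])

# Accuracy over the full set: 5 correct out of 7, roughly 0.714.
full_accuracy = np.mean(y_true == y_pred)

# Accuracy averaged over batches of 3 (the last batch has only 1 sample):
# per-batch accuracies are 1.0, 2/3 and 0.0, whose mean is roughly 0.556.
batch_size = 3
batch_accuracies = [
    np.mean(y_true[i:i + batch_size] == y_pred[i:i + batch_size])
    for i in range(0, len(y_true), batch_size)
]
mean_of_batch_accuracies = np.mean(batch_accuracies)

print(full_accuracy, mean_of_batch_accuracies)  # 0.714... vs 0.555...
```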
