Splitting a large multi-class dataset into train and test using a leave-one-out scheme
I am doing some supervised learning with neural networks, and I have a Targets array containing 1906 samples with 664 unique values; by design, each unique value occurs at least twice. I want to split this dataset into train and test using a leave-one-out scheme: randomly pick one sample from each class for the test set and use the rest for training. Is there a smarter way to do this than explicitly iterating over all the unique values? I am using Python, NumPy, scikit-learn and PyTorch, by the way!
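To illustrate what I mean, here is roughly the explicit per-class loop I would otherwise write (the synthetic `targets` array and the seed are just placeholders standing in for my real data):

```python
import numpy as np

rng = np.random.default_rng(0)  # placeholder seed, for reproducibility only

# Stand-in for my real Targets array: 1906 labels, 664 classes,
# every class guaranteed to appear at least twice.
targets = rng.permutation(np.concatenate([
    np.repeat(np.arange(664), 2),
    rng.integers(0, 664, size=1906 - 2 * 664),
]))

test_idx = []
for cls in np.unique(targets):                 # loop over all 664 classes
    cls_idx = np.flatnonzero(targets == cls)   # positions of this class
    test_idx.append(rng.choice(cls_idx))       # one random sample per class -> test set
test_idx = np.array(test_idx)
train_idx = np.setdiff1d(np.arange(len(targets)), test_idx)  # everything else -> train set
```

This works, but it loops over all 664 classes in Python, so I am wondering whether there is a more idiomatic or vectorized way to get the same split.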
Topic numpy cross-validation scikit-learn neural-network python
Category Data Science