Splitting a large multi-class dataset into train and test using a leave-one-out-per-class scheme

I am doing some supervised learning with neural networks, and I have a targets array containing 1906 samples with 664 unique values; by design, each unique value occurs at least twice. Is there a smarter way to split this dataset into train and test using a leave-one-out scheme, i.e. randomly picking 1 sample from each class for the test set and using the rest for training, before I resort to explicitly iterating over all my values? I am using Python, numpy, sklearn and PyTorch, btw!

Topic numpy cross-validation scikit-learn neural-network python

Category Data Science


Never mind! Found it!

Just doing classes, indices, counts = np.unique(y, return_index=True, return_counts=True) returns, in indices, the index of the first occurrence of each unique value. You can then slice the test set off as x_test = X[indices] and remove those rows from the training set with np.delete(X, indices, axis=0) (and likewise for y). Note that this picks the first sample of each class rather than a random one.
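A minimal sketch of both variants, using toy X and y arrays standing in for the real data (the shapes and class counts here are made up; only the np.unique / np.delete pattern is the point). The second variant recovers the random-per-class pick the question originally asked for:

```python
import numpy as np

# Toy data: 5 classes, each appearing at least twice (mirroring the
# "min. count of each unique value == 2" design from the question).
rng = np.random.default_rng(0)
y = np.repeat(np.arange(5), [2, 3, 2, 4, 2])   # 13 samples, 5 classes
X = rng.normal(size=(len(y), 3))               # hypothetical feature matrix

# Variant 1: np.unique with return_index=True gives the index of the
# FIRST occurrence of each class in y (deterministic, not random).
classes, first_idx = np.unique(y, return_index=True)
X_test, y_test = X[first_idx], y[first_idx]
X_train = np.delete(X, first_idx, axis=0)
y_train = np.delete(y, first_idx)

# Variant 2: pick one RANDOM sample per class instead of the first one.
rand_idx = np.array(
    [rng.choice(np.flatnonzero(y == c)) for c in classes]
)
X_test_r, y_test_r = X[rand_idx], y[rand_idx]
X_train_r = np.delete(X, rand_idx, axis=0)
y_train_r = np.delete(y, rand_idx)
```

Either way, the test set ends up with exactly one sample per class and the remaining samples form the training set; since every class has at least two members, every class is still represented in training.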
