Open
Labels: enhancement (New feature or request)
Description
This could be done for EMNIST or something else.
1. Train an autoencoder (encoder + decoder) to learn an efficient encoding / feature representation in an unsupervised manner.
2. Throw away the decoder and fix the encoder weights.
3. Add some layers on top (perhaps a single softmax layer would suffice) and do supervised learning for those layers only (or maybe fine-tune the whole network) to classify digits/letters, perhaps training on only a subset of the labeled data.
This would be quite an easy and nice example of unsupervised learning, encoder/decoder architectures, transfer learning, unsupervised pretraining, and a few other buzzwords :)
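A minimal NumPy sketch of the proposed pipeline, using synthetic data as a stand-in for EMNIST; all shapes, hyperparameters, and the synthetic data generation are illustrative assumptions, not a final design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for EMNIST: 200 "images" with 64 features, 3 classes.
# Each class has its own mean, so the classes are actually separable.
means = rng.normal(scale=2.0, size=(3, 64))
y = rng.integers(0, 3, size=200)
X = means[y] + rng.normal(size=(200, 64))

# --- Step 1: unsupervised pretraining of a one-hidden-layer autoencoder ---
d_in, d_hid = 64, 16
W_enc = rng.normal(scale=0.1, size=(d_in, d_hid))
b_enc = np.zeros(d_hid)
W_dec = rng.normal(scale=0.1, size=(d_hid, d_in))
b_dec = np.zeros(d_in)

lr = 0.01
for _ in range(300):
    H = np.tanh(X @ W_enc + b_enc)        # encoder
    X_hat = H @ W_dec + b_dec             # linear decoder
    err = X_hat - X                       # reconstruction error
    # Backprop of the mean-squared reconstruction loss (no labels used)
    gW_dec = H.T @ err / len(X)
    gb_dec = err.mean(axis=0)
    dH = (err @ W_dec.T) * (1 - H ** 2)   # tanh derivative
    gW_enc = X.T @ dH / len(X)
    gb_enc = dH.mean(axis=0)
    W_dec -= lr * gW_dec; b_dec -= lr * gb_dec
    W_enc -= lr * gW_enc; b_enc -= lr * gb_enc

# --- Step 2: throw away the decoder, freeze the encoder ---
Z = np.tanh(X @ W_enc + b_enc)            # fixed features from frozen encoder

# --- Step 3: supervised training of a softmax layer only ---
W_cls = np.zeros((d_hid, 3))
b_cls = np.zeros(3)
Y = np.eye(3)[y]                          # one-hot labels
for _ in range(300):
    logits = Z @ W_cls + b_cls
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    G = (P - Y) / len(Z)                  # softmax cross-entropy gradient
    W_cls -= 0.5 * Z.T @ G
    b_cls -= 0.5 * G.sum(axis=0)

acc = ((Z @ W_cls + b_cls).argmax(axis=1) == y).mean()
print(f"train accuracy with frozen encoder: {acc:.2f}")
```

In a real implementation on EMNIST one would of course use a proper framework, a held-out test set, and convolutional layers, but the structure is the same: the encoder weights are updated only in step 1 and left untouched during the supervised phase.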