This repo contains an implementation of JointVAE, a framework for jointly disentangling continuous and discrete factors of variation in data in an unsupervised manner.
Examples
MNIST
CelebA
FashionMNIST
dSprites
Discrete and continuous factors on MNIST
dSprites comparisons
Usage
The train_model.ipynb notebook contains code for training a JointVAE model.
The load_model.ipynb notebook contains code for loading a trained model.
Example usage
```python
from jointvae.models import VAE
from jointvae.training import Trainer
from torch.optim import Adam
from viz.visualize import Visualizer as Viz

# Build a dataloader for your data
dataloader = get_my_dataloader(batch_size=32)

# Define the latent distribution
latent_spec = {'cont': 20, 'disc': [10, 5, 5, 2]}

# Build a Joint-VAE model
model = VAE(img_size=(3, 64, 64), latent_spec=latent_spec)

# Build a trainer and train the model
optimizer = Adam(model.parameters())
trainer = Trainer(model, optimizer,
                  cont_capacity=[0., 5., 25000, 30.],
                  disc_capacity=[0., 5., 25000, 30.])
trainer.train(dataloader, epochs=10)

# Visualize samples from the model
viz = Viz(model)
samples = viz.samples()

# Do all sorts of fun things with the model
...
```
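The four-element capacity lists passed to the trainer appear to encode an annealing schedule. Assuming they mean `[min_capacity, max_capacity, num_iters, gamma]` (an interpretation of this repo's API, not stated in the README), the capacity target grows linearly over training, as in this self-contained sketch:

```python
def annealed_capacity(iteration, cap_min=0.0, cap_max=5.0, num_iters=25000):
    """Linearly increase the KL capacity C from cap_min to cap_max
    over num_iters training iterations, then hold it constant.

    The values here mirror the cont_capacity/disc_capacity lists in
    the usage example; their meaning is an assumption, not repo code.
    """
    frac = min(iteration / num_iters, 1.0)
    return cap_min + frac * (cap_max - cap_min)

# The final list entry (gamma) would then weight a loss term of the
# form gamma * |KL - C|, pulling the KL divergence toward C.
print(annealed_capacity(12500))  # 2.5, halfway through the schedule
```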
Trained models
The trained models referenced in the paper are included in the trained_models folder. The load_model.ipynb notebook provides code to load and use these trained models.
Data sources
The MNIST and FashionMNIST datasets can be automatically downloaded using torchvision.
CelebA
All CelebA images were resized to be 64 by 64. Data can be found here.
Chairs
All Chairs images were center cropped and resized to 64 by 64. Data can be found here.
Applications
Image editing
Inferring unlabelled quantities
Citing
If you find this work useful in your research, please cite using:
@inproceedings{dupont2018learning,
title={Learning disentangled joint continuous and discrete representations},
author={Dupont, Emilien},
booktitle={Advances in Neural Information Processing Systems},
pages={707--717},
year={2018}
}
More examples
License
MIT
About
PyTorch implementation of JointVAE, a framework for disentangling continuous and discrete factors of variation 🌟