This repository contains an implementation of the NeurIPS 2019 paper Controlling Neural Level Sets.
This paper presents a simple and scalable approach to directly control level sets of a deep neural network. Our method consists of two parts: (i) sampling of the neural level sets, and (ii) relating the samples' positions to the network parameters. The latter is achieved by a sample network that is constructed by adding a single fixed linear layer to the original network. In turn, the sample network can be used to incorporate the level set samples into a loss function of interest.
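The first part, sampling the level set, can be illustrated with a minimal NumPy sketch: points are moved onto the zero level set by generalized Newton steps. This is only a toy illustration using an analytic implicit function (a unit sphere) in place of a trained network; the function names are hypothetical, not from this repository.

```python
import numpy as np

def f(x):
    # Toy implicit function whose zero level set is the unit sphere.
    return (x * x).sum(axis=-1) - 1.0

def grad_f(x):
    # Analytic gradient of f; for a neural network this would come from autograd.
    return 2.0 * x

def project_to_level_set(x, n_steps=10):
    # Generalized Newton iteration: x <- x - f(x) * grad / ||grad||^2,
    # which moves each point onto the zero level set of f.
    for _ in range(n_steps):
        g = grad_f(x)
        step = f(x)[..., None] * g / (g * g).sum(axis=-1, keepdims=True)
        x = x - step
    return x

rng = np.random.default_rng(0)
pts = project_to_level_set(rng.normal(size=(100, 3)))
```

The second part of the method then treats such projected points as a function of the network parameters via the fixed linear layer described above, so gradients of a loss on the samples can flow back to the network.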
The code is compatible with Python 3.7 and PyTorch 1.2. In addition, the following packages are required:
pyhocon, plotly, skimage, trimesh, pandas, advertorch, GPUtil, plyfile.
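For convenience, the dependencies above could be collected into a requirements file along these lines (note that skimage is installed via the scikit-image package; exact pinned versions are not specified by this repository):

```text
pyhocon
plotly
scikit-image
trimesh
pandas
advertorch
GPUtil
plyfile
```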
Usage
Robustness to adversarial examples:
cd ./code
python training_adv/exp_runner.py --conf ./confs/adv/[mnist_or_cifar]_ours.conf
Generating meshes from the learned implicit representation using the marching cubes algorithm:
python training_recon/post_plot_surface.py
Outputs are saved in:
../exps/expname/[timestamp]/
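For intuition on the mesh-extraction step, here is a minimal sketch of marching cubes with scikit-image (one of the listed requirements), using a toy analytic sphere in place of a trained network; the grid size and coordinate range are illustrative assumptions, not values from this repository.

```python
import numpy as np
from skimage import measure

# Sample an implicit function (here the signed distance to a unit sphere)
# on a regular grid; a trained network would be evaluated the same way.
n = 64
xs = np.linspace(-1.5, 1.5, n)
grid = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), axis=-1)
volume = np.linalg.norm(grid, axis=-1) - 1.0

# Extract the zero level set as a triangle mesh.
spacing = (xs[1] - xs[0],) * 3
verts, faces, normals, values = measure.marching_cubes(
    volume, level=0.0, spacing=spacing
)
verts += xs[0]  # shift from grid coordinates back to world coordinates
```

The resulting vertex/face arrays can then be saved as a mesh, e.g. with trimesh or plyfile, both of which are listed in the requirements.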
Citation
If you find our work useful in your research, please consider citing:
@inproceedings{atzmon2019controlling,
title={Controlling neural level sets},
author={Atzmon, Matan and Haim, Niv and Yariv, Lior and Israelov, Ofer and Maron, Haggai and Lipman, Yaron},
booktitle={Advances in Neural Information Processing Systems},
pages={2032--2041},
year={2019}
}