@article{gala2021consistent,
title={Consistent cross-modal identification of cortical neurons with coupled autoencoders},
author={Gala, Rohan and Budzillo, Agata and Baftizadeh, Fahimeh and Miller, Jeremy and Gouwens, Nathan and Arkhipov, Anton and Murphy, Gabe and Tasic, Bosiljka and Zeng, Hongkui and Hawrylycz, Michael and S{\"u}mb{\"u}l, Uygar},
journal={Nature Computational Science},
volume={1},
number={2},
pages={120--127},
year={2021},
publisher={Nature Publishing Group}
}
Abstract
Consistent identification of neurons in different experimental modalities is a key problem in neuroscience. While methods to perform multimodal measurements in the same set of single neurons have become available, parsing complex relationships across different modalities to uncover neuronal identity is a growing challenge. Here, we present an optimization framework to learn coordinated representations of multimodal data, and apply it to a large multimodal dataset profiling mouse cortical interneurons. Our approach reveals strong alignment between transcriptomic and electrophysiological characterizations, enables accurate cross-modal data prediction, and identifies cell types that are consistent across modalities.
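As context for the code below: a coupled autoencoder trains one encoder–decoder pair per modality while penalizing disagreement between the latent representations, so that cells land in a shared coordinate system regardless of which measurement they came from. A schematic objective, written in our own notation rather than the paper's exact formulation (the published loss additionally constrains the latent space to prevent degenerate, collapsed solutions), is:

$$\min_{e_T,\,d_T,\,e_E,\,d_E}\;\; \lVert X_T - d_T(e_T(X_T)) \rVert^2 \;+\; \lVert X_E - d_E(e_E(X_E)) \rVert^2 \;+\; \lambda\,\lVert e_T(X_T) - e_E(X_E) \rVert^2$$

Here $e_m$ and $d_m$ are the encoder and decoder for modality $m \in \{T, E\}$ (transcriptomics and electrophysiology), $X_m$ is the corresponding data matrix, and $\lambda$ weights the cross-modal coupling term.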
Data
data/proc/ contains the processed dataset used in Gala et al. 2021.
See notebooks/data_proc_T.ipynb and notebooks/data_proc_E.ipynb for the pre-processing steps applied to the transcriptomic (T) and electrophysiological (E) data, respectively.
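To step through the pre-processing interactively, the notebooks can be opened with Jupyter (assuming Jupyter is installed in your environment):

```bash
# Opens the transcriptomic pre-processing notebook; data_proc_E.ipynb works the same way
jupyter notebook notebooks/data_proc_T.ipynb
```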
Code
Create a conda environment and install the dependencies (see requirements.yml). The models can be run with TensorFlow versions 2.1 through 2.5.
Clone this repository.
Navigate to the directory containing setup.py in this repository, and run pip install -e .
Use cplAE_TE/train.py to start training a model. The shell sketch below walks through these steps end to end.
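The commands below are a minimal sketch of the full sequence, under stated assumptions: the environment name cplAE_TE is hypothetical (requirements.yml may define its own, in which case drop the -n flag), and the repository URL is left as a placeholder.

```bash
# Create and activate the conda environment from the provided spec
# (the name "cplAE_TE" is hypothetical; requirements.yml may define its own)
conda env create -f requirements.yml -n cplAE_TE
conda activate cplAE_TE

# Clone the repository and install it in editable mode
# (replace <repository-url> with this repository's actual URL)
git clone <repository-url>
cd <repository-directory>
pip install -e .

# Start training a model; if train.py exposes argparse options,
# `python cplAE_TE/train.py --help` will list them
python cplAE_TE/train.py
```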
You can also experiment with a minimal version of the coupled autoencoder code (see the minimal folder in this repository), which is hosted in a cloud environment on CodeOcean.