Densely Connected Convolutional Network (DenseNet) is a network architecture in which each layer is directly connected to every other layer in a feed-forward fashion. It is similar to ResNet, but in contrast to ResNet, DenseNet concatenates layer outputs instead of summing them. If you need a quick introduction to how DenseNet works, please read the original paper [1]. It is well written and easy to understand.
I implemented a DenseNet in Python using Keras with TensorFlow as the backend. Because of this, I can't guarantee that the implementation works with Theano or CNTK. In the coming months I will update the code to TensorFlow 2.x, and I will also try to optimize the architecture in my own way with some modifications.
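To make the concatenation point concrete, here is a minimal dense block sketched in tf.keras. This is a sketch under stated assumptions, not code from this repository: the function name `dense_block`, the BN-ReLU-Conv ordering and the layer sizes are illustrative. The essential part is the `Concatenate` at the end of each layer, where a ResNet block would use `Add`.

```python
# Minimal sketch of DenseNet-style connectivity in tf.keras.
# Function name, BN-ReLU-Conv ordering and layer sizes are illustrative
# assumptions, not code taken from this repository.
import tensorflow as tf
from tensorflow.keras import layers


def dense_block(x, num_layers, growth_rate):
    """Each layer sees the concatenation of all preceding feature maps."""
    for _ in range(num_layers):
        y = layers.BatchNormalization()(x)
        y = layers.Activation("relu")(y)
        y = layers.Conv2D(growth_rate, kernel_size=3, padding="same")(y)
        # Concatenate the new features with everything before them --
        # a ResNet block would use layers.Add() here instead.
        x = layers.Concatenate()([x, y])
    return x


inputs = tf.keras.Input(shape=(28, 28, 1))          # e.g. Fashion-MNIST
x = layers.Conv2D(16, kernel_size=3, padding="same")(inputs)
x = dense_block(x, num_layers=4, growth_rate=12)    # 16 + 4 * 12 = 64 channels
```

Because every layer only adds `growth_rate` new feature maps on top of the concatenated input, the number of channels grows linearly with depth inside a block.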
You can find several implementations on GitHub.
## Results

### Fashion-MNIST
I used this notebook to evaluate the model on Fashion-MNIST with the following parameters:
| Dense Blocks | Depth | Growth Rate | Dropout | Bottleneck | Compression | Batch Size / Epochs | Training (loss / acc) | Validation (loss / acc) | Test (loss / acc) |
|---|---|---|---|---|---|---|---|---|---|
| 5 | 35 | 20 | 0.4 | False | 0.9 | 100 / 80 | 0.1366 / 0.9681 | 0.1675 / 0.9643 | 0.2739 / 0.9459 |
Feel free to try it yourself with other parameters.
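The sketch below shows how the parameters in the table above could map onto a training run with tf.keras. It is a hedged example, not the actual code in this repository: `build_densenet` is a hypothetical stand-in for the model constructor, its keyword arguments simply mirror the table columns, and the optimizer and validation split are assumptions. Only the dataset, the hyperparameter values, the batch size and the epoch count come from the table.

```python
# Hedged sketch of a training run matching the table above.
# `build_densenet` is a hypothetical constructor, not this repository's API;
# optimizer and validation split are assumptions.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

model = build_densenet(        # hypothetical constructor, arguments mirror the table
    input_shape=(28, 28, 1),
    dense_blocks=5,
    depth=35,
    growth_rate=20,
    dropout_rate=0.4,
    bottleneck=False,
    compression=0.9,
    num_classes=10,
)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=100, epochs=80, validation_split=0.1)
model.evaluate(x_test, y_test)
```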