All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the data remain unchanged, so convolutions inside a dense block all use stride 1 (with appropriate padding). Pooling layers are inserted between dense blocks to further reduce dimensionality.
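The mechanics can be sketched in NumPy. This is a simplified illustration, not a full DenseNet: each layer is stood in for by a random per-pixel linear map (a 1x1 "convolution") followed by ReLU, so that height and width never change and channel-wise concatenation is valid; a 2x2 average pool between blocks stands in for the transition layer. The function names (`dense_block`, `transition_pool`) and the growth-rate value are illustrative assumptions.

```python
import numpy as np

def relu(x):
    # Elementwise ReLU activation.
    return np.maximum(x, 0.0)

def conv_layer(x, growth_rate, rng):
    # Stand-in for a BN-ReLU-Conv layer: a random 1x1 "convolution"
    # (a per-pixel linear map over channels), so H and W are unchanged.
    c, h, w = x.shape
    weights = rng.standard_normal((growth_rate, c))
    out = np.einsum('oc,chw->ohw', weights, x)  # (growth_rate, h, w)
    return relu(out)

def dense_block(x, num_layers, growth_rate, seed=0):
    # Each layer's output is concatenated channel-wise with its input.
    # This only works because the spatial dimensions never change
    # inside the block (all stride-1, padded convolutions).
    rng = np.random.default_rng(seed)
    for _ in range(num_layers):
        new_features = conv_layer(x, growth_rate, rng)
        x = np.concatenate([x, new_features], axis=0)
    return x

def transition_pool(x):
    # 2x2 average pooling between dense blocks halves H and W.
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).mean(axis=(2, 4))

x = np.ones((16, 8, 8))  # (channels, height, width)
y = dense_block(x, num_layers=4, growth_rate=12)
print(y.shape)           # channels grow to 16 + 4*12 = 64; H, W unchanged
z = transition_pool(y)
print(z.shape)           # spatial dimensions halved between blocks
```

Running this prints `(64, 8, 8)` and then `(64, 4, 4)`: inside the block only the channel count grows, and only the pooling step between blocks reduces the spatial resolution.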