All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the feature maps remain unchanged, so all convolutions inside a dense block have a stride of one. Pooling layers are inserted between dense blocks to downsample the feature maps.
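The shape arithmetic behind these constraints can be illustrated with a minimal NumPy sketch (the tensor sizes here are illustrative, not taken from any particular DenseNet configuration): stride-one, padded convolutions preserve height and width, which is what allows channel-wise concatenation, while pooling between blocks halves the spatial dimensions.

```python
import numpy as np

def conv_out_size(size, kernel=3, stride=1, padding=1):
    """Spatial output size of a convolution along one dimension."""
    return (size + 2 * padding - kernel) // stride + 1

# A 3x3 convolution with stride 1 and padding 1 preserves H and W,
# so outputs inside a dense block can be concatenated with inputs.
h = w = 32
assert conv_out_size(h) == 32

# Two feature maps produced inside a dense block, shape (N, C, H, W).
x = np.zeros((1, 64, 32, 32))
y = np.zeros((1, 32, 32, 32))

# Channel-wise concatenation works because H and W match.
z = np.concatenate([x, y], axis=1)
print(z.shape)  # (1, 96, 32, 32)

# Pooling between dense blocks halves the spatial dimensions
# (a strided slice stands in for 2x2 pooling here).
pooled = z[:, :, ::2, ::2]
print(pooled.shape)  # (1, 96, 16, 16)
```

If the two tensors had different heights or widths, `np.concatenate` along the channel axis would raise an error, which is why downsampling is deferred to the transitions between dense blocks.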