Information-Preserving Networks and the Mirrored Transform
Buonanno A.
2019-01-01
Abstract
A deep network with rectifier units (ReLUs) is used to build a multi-layer transform under the constraint that the input can be reconstructed in the backward flow. The structure is analysed with respect to the number of units needed in each layer to preserve information. Specific networks with random and fixed weights are presented, and the Mirrored Transform is proposed. The resulting network generalizes classical linear transforms and provides a progressive unfolding of the input space into embeddings that can serve as the basis for classification and filtering in an unsupervised manner.
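One way such an information-preserving ReLU layer can be realized is with mirrored weight pairs: applying ReLU to both Wx and -Wx keeps the full pre-activation, since ReLU(a) - ReLU(-a) = a, and an orthonormal W is inverted by its transpose. The sketch below illustrates this idea only as an assumption about the construction; the paper's actual Mirrored Transform may differ in its details.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Random orthonormal weight matrix W (QR of a Gaussian matrix),
# so that W.T @ W is the identity and the linear part is invertible.
W, _ = np.linalg.qr(rng.standard_normal((d, d)))

def forward(x):
    """Mirrored ReLU layer: one unit pair per row of W."""
    a = W @ x
    return np.maximum(a, 0.0), np.maximum(-a, 0.0)

def backward(h_pos, h_neg):
    """Reconstruct the input: ReLU(a) - ReLU(-a) = a, then invert W."""
    return W.T @ (h_pos - h_neg)

x = rng.standard_normal(d)
x_rec = backward(*forward(x))
print(np.allclose(x, x_rec))  # exact reconstruction from the mirrored pair
```

Note that the mirrored pair doubles the number of units per layer (2d units for a d-dimensional input), which is consistent with the abstract's concern about how many units each layer needs in order to preserve information.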