Information-Preserving Networks and the Mirrored Transform

Buonanno A.
2019-01-01

Abstract

A deep network with rectifier units (ReLUs) is used to build a multi-layer transform under the constraint that the input can be reconstructed in the backward flow. The structure is analysed with respect to the number of units needed in each layer to preserve information. Specific networks with random and fixed weights are presented, and the Mirrored Transform is proposed. The resulting network generalizes classical linear transforms and provides a progressive unfolding of the input space into embeddings that can serve as a basis for classification and filtering in an unsupervised manner.
Year: 2019
ISBN: 978-1-7281-0824-7
Keywords: ReLU Networks; Unsupervised Deep Networks
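The abstract outlines layers that stay invertible despite the ReLU nonlinearity. Below is a minimal sketch, in Python with NumPy, of one standard way to achieve this: pair every weight vector w with its mirror -w, so that relu(w.x) and relu(-w.x) together retain the full pre-activation, since relu(a) - relu(-a) = a. Whether this pairing is exactly the paper's Mirrored Transform is an assumption; all function names and the random fixed-weight setup are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def relu(a):
    return np.maximum(a, 0.0)

def mirrored_forward(W, x):
    # Forward flow: stack the ReLU responses of W and of its mirror -W.
    # Together they keep all of W @ x, because relu(a) - relu(-a) = a.
    return relu(W @ x), relu(-W @ x)

def mirrored_backward(W, h_pos, h_neg):
    # Backward flow: recover the pre-activation, then invert the
    # linear map (assumes W is square and invertible).
    pre_activation = h_pos - h_neg  # equals W @ x exactly
    return np.linalg.solve(W, pre_activation)

# Usage: a random fixed-weight layer on a 4-dimensional input,
# echoing the abstract's networks with random and fixed weights.
W = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
h_pos, h_neg = mirrored_forward(W, x)
x_rec = mirrored_backward(W, h_pos, h_neg)
print(np.allclose(x, x_rec))  # True: the input is preserved

Note that this construction uses 2n ReLU units to preserve an n-dimensional input, which illustrates why the number of units per layer matters for information preservation, the question the abstract raises.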

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12079/54109