Deep ReLU (Rectified Linear Unit) neural networks exhibit complex functional symmetries: vastly different architectures and parameters (weights and biases) can realize the same function.
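As a minimal sketch of two such symmetries (a NumPy illustration, not taken from the study): permuting the hidden neurons of a layer, or positively rescaling a neuron's incoming weights while inversely rescaling its outgoing weights, changes the parameters without changing the function, since relu(c·z) = c·relu(z) for any c > 0.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def net(x, W1, b1, W2, b2):
    # One-hidden-layer ReLU network: f(x) = W2 @ relu(W1 @ x + b1) + b2
    return W2 @ relu(W1 @ x + b1) + b2

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

# Symmetry 1: permuting hidden neurons leaves the function unchanged.
perm = rng.permutation(4)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

# Symmetry 2: positive rescaling. Scale a neuron's inputs by c > 0
# and its outputs by 1/c; relu(c*z) = c*relu(z) makes this invisible.
c = np.array([0.5, 2.0, 3.0, 1.5])
W1s, b1s, W2s = c[:, None] * W1, c * b1, W2 / c

x = rng.normal(size=2)
print(net(x, W1, b1, W2, b2))
print(net(x, W1p, b1p, W2p, b2))   # same output, permuted parameters
print(net(x, W1s, b1s, W2s, b2))   # same output, rescaled parameters
```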
Complete Identification
A new study focuses on the complete identification problem: given a function f, deriving the architectures and parameters of all feedforward ReLU networks that realize it. The proposed method translates ReLU networks into Lukasiewicz logic formulae and effects functionally equivalent network transformations through algebraic rewrites governed by the logic's axioms.
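To illustrate why such a translation is possible (a sketch of the general correspondence; the paper's exact encoding may differ): the Lukasiewicz connectives on [0, 1] are themselves clipped linear maps, so each one is computable by a single ReLU unit plus affine operations.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Lukasiewicz connectives on [0, 1]:
def neg(x):            # negation: 1 - x (purely affine)
    return 1.0 - x

def strong_and(x, y):  # strong conjunction: max(0, x + y - 1)
    return np.maximum(0.0, x + y - 1.0)

def strong_or(x, y):   # strong disjunction: min(1, x + y)
    return np.minimum(1.0, x + y)

# Each connective as a single ReLU unit:
def strong_and_relu(x, y):
    return relu(x + y - 1.0)        # weights (1, 1), bias -1

def strong_or_relu(x, y):
    return 1.0 - relu(1.0 - x - y)  # one ReLU unit wrapped in affine maps

xs = np.random.default_rng(1).uniform(0, 1, size=(2, 1000))
assert np.allclose(strong_and(*xs), strong_and_relu(*xs))
assert np.allclose(strong_or(*xs), strong_or_relu(*xs))
```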
Analogies with Shannon
A compositional normal form is proposed to facilitate the mapping from Lukasiewicz logic formulae back to ReLU networks. Using Chang's completeness theorem, it is shown that for every functional equivalence class, all ReLU networks in that class are connected by a finite set of symmetries corresponding to the finite set of axioms of Lukasiewicz logic. This idea is reminiscent of Shannon's work on switching circuit design, where circuits are translated into Boolean formulae and synthesis is effected by algebraic rewriting governed by the axioms of Boolean logic.
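A toy example of an axiom-driven rewrite (illustrative only, not the paper's normal form): the Lukasiewicz identity (x → y) → y = max(x, y), where x → y = min(1, 1 − x + y), relates two syntactically different formulae, and hence two structurally different ReLU networks, that compute the same function.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def implies(x, y):        # x -> y = min(1, 1 - x + y), one ReLU unit
    return 1.0 - relu(x - y)

def max_via_axiom(x, y):  # (x -> y) -> y : two ReLU layers
    return implies(implies(x, y), y)

def max_direct(x, y):     # max(x, y) = y + relu(x - y) : one ReLU layer
    return y + relu(x - y)

# Both formulae compute the same function on [0, 1]^2,
# even though the corresponding networks differ in depth and weights.
xs, ys = np.random.default_rng(2).uniform(0, 1, size=(2, 1000))
assert np.allclose(max_via_axiom(xs, ys), max_direct(xs, ys))
```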