Preprint

Abstract

In this paper, we lay out a novel model of neuroplasticity: a horizontal-vertical integration model of neural processing. The horizontal plane consists of a network of neurons connected by adaptive transmission links, in line with standard computational neuroscience approaches. Each individual neuron also has a vertical dimension, in which internal parameters steer the externally expressed membrane parameters that determine neural transmission. The vertical system consists of (a) external parameters at the membrane layer, divided into compartments (spines, boutons), (b) internal parameters in the sub-membrane zone and the cytoplasm with its protein signaling network, and (c) core parameters in the nucleus carrying genetic and epigenetic information. In such models, each node (i.e., neuron) of the horizontal network has its own internal memory, so neural transmission and information storage are systematically separated. This is an important conceptual advance over synaptic weight models. We discuss the membrane-based (external) filtering and selection of outside signals for processing: not every transmission event leaves a trace. We also illustrate neuron-internal computing strategies, from intracellular protein signaling to the nucleus as the core system. We aim to show that the individual neuron plays an important role in the computation of signals, and that many assumptions derived from the synaptic-weight-adjustment hypothesis of memory may not hold in a real brain. We present the neuron as a self-programming device rather than one passively determined by ongoing input. We believe a new approach to neural modeling will benefit the third wave of AI. Ultimately, we strive to build a flexible memory system that processes facts and events automatically.
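The separation of transmission (horizontal links) from storage (vertical, neuron-internal state) described above can be sketched as a minimal data structure. This is an illustrative assumption of how such a model might be organized, not an implementation from the paper; all class and attribute names are hypothetical.

```python
# Illustrative sketch of a horizontal-vertical integration model.
# Names (Neuron, HorizontalNetwork, receive, transmit) are assumptions
# for exposition, not taken from the paper.

class Neuron:
    """A node in the horizontal network with its own vertical (internal) memory."""

    def __init__(self, neuron_id):
        self.neuron_id = neuron_id
        # (a) external parameters at the membrane, per compartment (spines, boutons)
        self.external = {"spines": {}, "boutons": {}}
        # (b) internal parameters: sub-membrane zone and cytoplasmic signaling state
        self.internal = {}
        # (c) core parameters: nuclear (genetic/epigenetic) information
        self.core = {}

    def receive(self, signal, store=False):
        """Transmission and storage are separate: not every event leaves a trace."""
        # Membrane-level (external) parameters shape the transmitted signal.
        output = self.external["spines"].get("gain", 1.0) * signal
        if store:  # membrane-based filtering decides what is stored internally
            self.internal[len(self.internal)] = signal
        return output


class HorizontalNetwork:
    """Neurons connected by adaptive transmission links (the horizontal plane)."""

    def __init__(self):
        self.neurons = {}
        self.links = {}  # (pre_id, post_id) -> transmission strength

    def add_neuron(self, neuron_id):
        self.neurons[neuron_id] = Neuron(neuron_id)

    def connect(self, pre_id, post_id, strength=1.0):
        self.links[(pre_id, post_id)] = strength

    def transmit(self, pre_id, post_id, signal, store=False):
        weighted = self.links[(pre_id, post_id)] * signal
        return self.neurons[post_id].receive(weighted, store=store)
```

In this sketch, a link can carry many transient signals while the receiving neuron stores only those passing its membrane filter, which is the key distinction from pure synaptic-weight models where every adjustment is a stored trace.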