Lecture Notes in Computer Science
ABSTRACT: Inspired by a broader perspective viewing intelligent system dynamics in terms of the geometry of "cognitive spaces," we conduct a preliminary investigation of the application of information-geometry-based learning to ECAN (Economic Attention Networks), the component of the integrative OpenCog AGI system concerned with attention allocation and credit assignment. We generalize Amari's "natural gradient" algorithm for network learning to encompass ECAN and other recurrent networks, and apply it to small example cases of ECAN, demonstrating a dramatic improvement in the effectiveness of attention allocation compared to prior (Hebbian-learning-like) ECAN methods. Scaling the method up to realistically sized ECAN networks as used in OpenCog remains for the future, but should be achievable using sparse matrix methods on GPUs.
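For reference, the core update the abstract builds on is Amari's standard natural gradient, in which the ordinary gradient is preconditioned by the inverse Fisher information matrix (this is the textbook form, not the ECAN-specific generalization described in the paper):

```latex
\theta_{t+1} \;=\; \theta_t \;-\; \eta \, F(\theta_t)^{-1} \, \nabla_\theta L(\theta_t),
\qquad
F(\theta) \;=\; \mathbb{E}_{p(x\mid\theta)}\!\left[ \nabla_\theta \log p(x\mid\theta)\, \nabla_\theta \log p(x\mid\theta)^{\top} \right]
```

Preconditioning by \(F^{-1}\) makes the update invariant to reparameterization of \(\theta\), which is what allows it to be carried over to network types, such as recurrent attention networks, beyond those Amari originally considered.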
Conference Paper: Three Hypotheses about the Geometry of Mind.
ABSTRACT: We present a novel perspective on the nature of intelligence, motivated by the OpenCog AGI architecture, but intended to have a much broader scope. Memory items are modeled using probability distributions, and memory subsystems are conceived as “mindspaces” – geometric spaces corresponding to different memory categories. Two different metrics on mindspaces are considered: one based on algorithmic information theory, and another based on traditional (Fisher information based) “information geometry”. Three hypotheses regarding the geometry of mind are then posited: 1) a syntax-semantics correlation principle, stating that in a successful AGI system, these two metrics should be roughly correlated; 2) a cognitive geometrodynamics principle, stating that on the whole intelligent minds tend to follow geodesics in mindspace; 3) a cognitive synergy principle, stating that shorter paths may be found through the composite mindspace formed by considering multiple memory types together, than by following the geodesics in the mindspaces corresponding to individual memory types.
Artificial General Intelligence - 4th International Conference, AGI 2011, Mountain View, CA, USA, August 3-6, 2011. Proceedings; 01/2011
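The second of the two metrics mentioned is the standard Fisher information metric of information geometry; its textbook definition (stated here for orientation, not quoted from the paper) assigns to each point \(\theta\) of a parameterized family of distributions \(p(x\mid\theta)\) the Riemannian metric tensor

```latex
g_{ij}(\theta) \;=\; \mathbb{E}_{p(x\mid\theta)}\!\left[
  \frac{\partial \log p(x\mid\theta)}{\partial \theta_i}\,
  \frac{\partial \log p(x\mid\theta)}{\partial \theta_j}
\right]
```

Geodesics in a mindspace, as invoked by the cognitive geometrodynamics principle, are then the paths minimizing the induced length \(\int \sqrt{g_{ij}(\theta)\, d\theta^i\, d\theta^j}\).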
ABSTRACT: In the domain of intelligent systems, the management of mental resources is typically called “attention”. Attention exists because all moderately complex environments – and the real-world environments of everyday life in particular – are a source of vastly greater information than can be processed in real time by the available cognitive resources of any known intelligence, human or otherwise. General-purpose artificial intelligence (AI) systems operating with limited resources under time constraints in such environments must select carefully which information will be processed and which will be ignored. Even in the (rare) cases where sufficient resources may be available, attention could help make better use of them. All real-world tasks come with time limits, and managing these is a key part of the role of intelligence. Many AI researchers ignore this fact. As a result, the majority of existing AI architectures are incorrectly based on an (explicit or implicit) assumption of infinite or sufficient computational resources. Attention has not yet been recognized as a key cognitive process of AI systems, and in particular not of artificial general intelligence systems. This dissertation argues for the absolute necessity of an attention mechanism for artificial general intelligence (AGI) architectures. We examine several issues related to attention and resource management, review prior work on these topics in cognitive psychology and AI, and present a design for a general attention mechanism for AGI systems. The proposed design – inspired by constructivist AI methodologies – aims at architectural and modal independence, and comprehensively addresses and integrates all principal factors associated with attention to date.
05/2013, Degree: Ph.D. Computer Science, Supervisor: Kristinn R. Thórisson