Cognitive computing is a nascent interdisciplinary domain. It is a confluence of cognitive science, neuroscience, data science, and cloud computing. Cognitive science is the study of the mind and offers theories as well as mathematical and computational models of human cognition. Cognitive science is itself an interdisciplinary domain and draws upon philosophy, linguistics, psychology, and anthropology, among others.
Neuroscience is the study of the nervous system including its development, structure and function. More specifically, neuroscientists study the structure of the brain, and how behavior and cognitive functions are regulated by the brain. Brain imaging techniques such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), diffusion tensor imaging (DTI), and arterial spin labeling (ASL) enable probing brain functions both qualitatively and quantitatively.
Data science is also an interdisciplinary domain. It deals with processes and systems to extract information and knowledge from structured and unstructured data using machine learning algorithms and statistical methods. The end goal is to discover patterns, generate actionable insights, and answer predictive questions.
Cloud computing provides turnkey solutions such as platform-as-a-service, infrastructure-as-a-service, and software-as-a-service. It uses high-performance CPUs, GPUs, neuromorphic processors, virtually unlimited memory and storage, and high-speed networks to provide computing resources on demand. A fixed pool of these resources is dynamically provisioned among various applications and continually adjusted so that the applications can maintain performance amid fluctuating workloads. Cloud computing achieves economies of scale and helps cognitive computing applications perform at scale without upfront computing investments. Applications are billed only for the resources they actually use.
Broadly, there are two lines of research in the cognitive computing discipline. The first is centered on cognitive science as the foundation and encompasses neuroscience, philosophy, psychology, anthropology, and linguistics research. The second is more recent and is based on computer science as the foundation. It encompasses data science, statistics, and sub-disciplines of computer science such as high-performance computing, cloud computing, natural language processing, computer vision, machine learning, information retrieval, and data management. These two lines of research are not only complementary but also mutually reinforcing, accelerating discoveries and innovation.
It is this synergistic confluence that makes cognitive computing powerful and gives it the potential for groundbreaking discoveries and advances. Advances in the computing discipline, especially, are poised to bring about transformational changes to the way research is conducted in the field. IBM's TrueNorth cognitive computing system is a case in point. Its design is inspired by the function and efficiency of the human brain. The TrueNorth architecture provides a spiking neuron model as its building block. Its programming paradigm is based on an abstraction called a corelet, which represents a network of neurosynaptic cores. The corelet encapsulates all details except the external inputs and outputs. An object-oriented language is available for programming corelets. A library of reusable corelets as well as an integrated development environment help accelerate the development of cognitive computing applications. Using this environment, IBM has already implemented several algorithms including hidden Markov models, convolutional networks, and restricted Boltzmann machines. These algorithms have been incorporated into applications such as speaker recognition, sequence prediction, and collision avoidance. As of this writing, Nvidia has released the Tesla P100 GPU, which specifically targets machine learning algorithms that employ deep learning. The P100 features over 15 billion transistors on a single chip. These computing advances will propel research in cognitive science and neuroscience.
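To make the corelet abstraction concrete, the following minimal sketch models a corelet as a composable unit that hides its internal neurosynaptic cores and exposes only external input and output pins. The class and method names are hypothetical and chosen purely for illustration; they are not IBM's Corelet Programming Language or its API.

```python
# Hypothetical sketch of the corelet idea: a unit that encapsulates a network of
# neurosynaptic cores and exposes only its external inputs and outputs.
# Names are invented for illustration; this is not IBM's corelet API.

class NeurosynapticCore:
    """Stand-in for one core: a small crossbar of neurons and synapses."""
    def __init__(self, num_neurons=256):
        self.num_neurons = num_neurons


class Corelet:
    """Encapsulates a network of cores; only the external pins are visible."""
    def __init__(self, name, cores, input_pins, output_pins):
        self.name = name
        self._cores = cores              # internal detail, hidden from users
        self.input_pins = input_pins     # externally visible inputs
        self.output_pins = output_pins   # externally visible outputs

    def compose(self, other, wiring):
        """Wire this corelet's outputs to another's inputs, yielding a new corelet."""
        combined = Corelet(
            name=f"{self.name}+{other.name}",
            cores=self._cores + other._cores,
            input_pins=self.input_pins,
            output_pins=other.output_pins,
        )
        combined.wiring = wiring         # map: self.output_pins -> other.input_pins
        return combined


# Usage: build two corelets and compose them into a larger one.
extractor = Corelet("edge_detector", [NeurosynapticCore() for _ in range(4)],
                    input_pins=["pixels"], output_pins=["edges"])
classifier = Corelet("digit_classifier", [NeurosynapticCore() for _ in range(2)],
                     input_pins=["edges"], output_pins=["digit"])
pipeline = extractor.compose(classifier, wiring={"edges": "edges"})
print(pipeline.name, pipeline.input_pins, pipeline.output_pins)
```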
The goal of this handbook is to bring together a coherent body of knowledge and recent research in cognitive computing. It promotes a unified view of the domain and lays the foundation for cognitive computing as an academic discipline and a research enterprise. To the best of the editors' knowledge, this handbook is the first to formally define cognitive computing and provide an academic exposition of the field. The handbook aims to serve as a catalyst in advancing research in cognitive computing.
Audience
The handbook aims to meet the needs of both students and industry practitioners. It is especially suited for students in advanced undergraduate and beginning graduate courses on cognitive computing, neuroscience, and cognitive science. It is also a good source for graduate students who plan to pursue research in cognitive computing, and a good reference for industry practitioners who wish to learn about cognitive computing.
Organization
The handbook comprises 11 chapters, which are organized into three sections. Section A, which consists of two chapters, provides an introduction to cognitive computing and sets the backdrop for reading the rest of the handbook. Section B comprises five chapters and covers complex analytics and machine learning. Lastly, Section C, consisting of four chapters, discusses applications of cognitive computing.
Chapter 1: Cognitive Computing: Concepts, Architectures, Systems and Applications. This chapter provides an interdisciplinary introduction to cognitive computing, with the aim of presenting a unified view of the discipline. It begins with an overview of cognitive science, data science, and cognitive computing. It then discusses the principal technology enablers of cognitive computing, three major categories of cognitive architectures, and cognitive computing systems and their applications. Current trends and future research directions in cognitive computing are indicated, and the chapter concludes by listing various cognitive computing resources.
Chapter 2: Cognitive Computing and Neural Networks: Reverse Engineering the Brain. IBM, Nvidia, and Qualcomm have developed microprocessors that mimic the neurons and synapses of the human brain. These microprocessors are called neuromorphic chips; IBM's TrueNorth and the Human Brain Project's SpiNNaker are examples. This chapter presents the principles and theory needed as a backdrop for understanding these advances from a cognitive science perspective. It describes neural networks found within the mammalian neocortex, along with associated formal and computational models, that appear to form the basis of human cognition.
Chapter 3: Visual Analytic Decision-Making Environments for Large-Scale Time Evolving Graphs. Data scientists are faced with the challenge of analyzing large-scale graphs that change dynamically. Existing tools and metaphors for data collection, processing, storage, and analysis are not suitable for handling such large-scale, time-evolving graphs. This chapter describes visual analytics as a cognitive computing approach to improve decision making with large-scale dynamic graphs. It provides a conceptual introduction to time-varying graphs, describes the functional components of systems for visual analytics including performance considerations, and presents a visual graph analytics sandbox architecture along with sample applications implemented within it.
Chapter 4: CyGraph: Graph-Based Analytics and Visualization for Cybersecurity. The adversarial nature and complex interdependencies of networked machines demand a cognitive systems approach to cybersecurity. This chapter describes CyGraph, a graph-based cognitive system for protecting mission-critical computing assets and applications. CyGraph brings together isolated data and events into a comprehensive property-graph model, providing an overall picture for decision support and situational awareness. CyGraph features CyQL (CyGraph Query Language), a domain-specific query language for expressing graph patterns of interest, with interactive visualization of query results. CyGraph integrates with third-party tools for visualizing graph state changes and can also synthesize graph models with particular statistical properties.
Chapter 5: Cognitive Analytics: Going Beyond Big Data Analytics and Machine Learning. Traditional data analytics evolved from the database domain and focused exclusively on structured data stored in relational databases. It was propelled to the next stage in its evolution with the advent of data warehouses and data mining. Cognitive analytics is the third stage in this evolutionary path and goes beyond structured data by integrating semi-structured and unstructured data into the analytic process. This chapter provides an introduction to cognitive analytics. It describes types of learning and classes of machine learning algorithms in the context of cognitive analytics, proposes a reference architecture for cognitive analytics and indicates ways to implement it, and describes a few cognitive analytics applications.
Chapter 6: A Cognitive Random Forest: An Intra- and Inter-Cognitive Computing for Big Data Classification Under Cune-Condition. This chapter addresses the classification problem in the big data context, where the data is often noisy, inconsistent, and incomplete. To solve the classification problem, a cognitive model (called STE-M) is proposed. A cognitive computing architecture, called Cognitive Random Forest, is also proposed to implement STE-M; it amalgamates the STE-M model with a set of random forest classifiers to enhance continuous learning. The architecture is implemented and validated.
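For readers who want a concrete picture of the generic building block mentioned above, the minimal sketch below trains a plain random forest classifier on noisy synthetic data. It illustrates only the off-the-shelf random forest component and makes no attempt to reproduce the chapter's STE-M model or Cognitive Random Forest architecture.

```python
# Minimal sketch of the generic random forest building block (not the chapter's
# STE-M model or Cognitive Random Forest architecture). Requires scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic, noisy data (5% of labels flipped) to mimic imperfect real-world inputs.
X, y = make_classification(n_samples=5000, n_features=20, n_informative=10,
                           flip_y=0.05, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

# An ensemble of decision trees; aggregating their votes reduces the variance
# (and overfitting) of any single tree.
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)
print("held-out accuracy:", forest.score(X_test, y_test))
```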
Chapter 7: Bayesian Additive Regression Tree for Seemingly Unrelated Regression with Automatic Tree Selection. This chapter introduces a flexible Bayesian additive regression tree model for seemingly unrelated regression, called BART-SUR, which is suitable for situations where the response variable is a vector whose components are highly correlated. BART-SUR can jointly model the correlation structure among the related response variables and provide a highly flexible and nonlinear regression structure for each of the individual regression functions. The number of trees in BART-SUR is selected adaptively by treating it as a model parameter and assigning a prior distribution to it; this adaptive tree selection makes BART-SUR extremely fast. The author demonstrates the superiority of BART-SUR over several popular off-the-shelf methods, such as random forests, neural networks, wavelet regression, and support vector machines, through two simulation studies and three real data applications.
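For orientation, the display below combines the standard sum-of-trees form of BART with the classical SUR error structure. It is only a sketch of the ingredients; the exact BART-SUR specification, including the prior placed on the number of trees, is given in the chapter itself.

```latex
% Standard BART sum-of-trees form for each component of the response vector,
% paired with a classical SUR error structure. A sketch of the ingredients only;
% the chapter gives the exact BART-SUR specification.
\begin{align*}
  y_{ij} &= \sum_{k=1}^{m_i} g(\mathbf{x}_{ij};\, T_{ik}, M_{ik}) + \varepsilon_{ij},
            \qquad i = 1,\dots,q \ \text{(response components)}, \quad j = 1,\dots,n,\\
  (\varepsilon_{1j},\dots,\varepsilon_{qj})^{\top} &\sim \mathcal{N}(\mathbf{0}, \Sigma),
            \qquad \Sigma \ \text{captures the correlation among the $q$ responses},\\
  m_i &\sim \pi(m_i), \qquad \text{a prior on the number of trees, enabling adaptive tree selection,}
\end{align*}
where $g(\mathbf{x};\, T, M)$ denotes the piecewise-constant function defined by tree $T$ with leaf parameters $M$.
```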
Chapter 8: Cognitive Systems for the Food-Water-Energy Nexus. Meeting the food, water, and energy needs of a growing world population is a grand challenge. These resources are often not produced in the places where they are consumed, which entails transportation and storage costs. One can avoid storing a resource if good forecast models for supply and demand exist. Developing such models requires handling large-scale datasets efficiently, building forecasting models using machine learning methods, and leveraging optimization techniques to incorporate forecasting results into a decision-making process. Towards these goals, this chapter first discusses methods for making the most of sensor data. Next, forecasting methods, ranging from a few minutes ahead to days or even years ahead, are described. Finally, the chapter discusses how to use the outputs of these analytics tools to support decision-making processes in the context of energy.
Chapter 9: Cognitive Computing Applications in Education and Learning. Education and learning applications stand out among the many uses of cognitive computing due to their practical appeal as well as their research challenges. This chapter discusses the role of cognitive computing in teaching and learning environments. More specifically, the chapter examines the important roles played by Educational Data Mining (EDM) and Learning Analytics (LA) researchers in improving student learning. It describes an architecture for personalized eLearning and summarizes relevant research.
Chapter 10: Large Scale Data Enabled Evolution of Spoken Language Research and Applications. Human languages are used in two forms: written and spoken. Text and speech are the media for written and spoken language, respectively. Human languages are the most natural means of communication between cognitive computing systems and their users. The emergence of big data and data science is accelerating research and applications in the analysis and understanding of human/natural languages. This chapter provides an introductory tutorial on the core tasks in speech processing, reviews recent large-scale data-driven approaches to solving problems in spoken language, describes current trends in speech research, and indicates future research directions.
Chapter 11: IoT and Cognitive Computing. Internet of Things (IoT) technologies are now being deployed ever more widely. The confluence of IoT and cognitive computing provides unprecedented opportunities to develop deeper insights from the data generated by IoT devices. These actionable insights have the potential to bring about transformational changes that affect people, cities, and industry. This chapter explores the state of the art and future opportunities in bringing IoT and cognitive computing together to solve a range of problems in areas such as smart cities and connected health care.