
Walid Saba
Doctor of Philosophy
Northeastern University | NEU
Follow my blog on AI/NLU here https://medium.com/ontologik
About
37 Publications
6,596 Reads
74 Citations (since 2017)
Introduction
Walid Saba
R&D, ontologik.ai
Walid does research in Artificial Intelligence, NLU and Formal Ontology. See https://medium.com/ontologik
Publications (37)
We argue that the relative success of large language models (LLMs) is not a reflection on the symbolic vs. subsymbolic debate but a reflection on employing an appropriate strategy of bottom-up reverse engineering of language at scale. However, due to the subsymbolic nature of these models, whatever knowledge these systems acquire about language will...
Knowledge graphs (KGs) have become the standard technology for the representation of factual information in applications such as recommendation engines, search, and question-answering systems. However, the continual updating of KGs, as well as the integration of KGs from different domains and KGs in different languages, remains a major challe...
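As a rough illustration only (my own sketch in Haskell, not the formalism of this work), a knowledge graph can be pictured as a set of subject-predicate-object triples, with naive integration as set union; the continual-update and cross-domain, cross-lingual issues flagged above are precisely what such a union glosses over:

    import qualified Data.Set as Set

    -- A knowledge-graph fact as a (subject, predicate, object) triple.
    type Triple = (String, String, String)
    type KG     = Set.Set Triple

    -- Naive integration of two KGs as set union; entity alignment,
    -- conflicting updates, and multilingual labels are all ignored here.
    merge :: KG -> KG -> KG
    merge = Set.union

    -- A tiny example graph.
    example :: KG
    example = Set.fromList
      [ ("Ottawa", "capitalOf", "Canada")
      , ("Canada", "locatedIn", "North America") ]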
Large language models (LLMs) have achieved a milestone that undeniably changed many held beliefs in artificial intelligence (AI). However, there remain many limitations of these LLMs when it comes to true language understanding, limitations that are a byproduct of the underlying architecture of deep neural networks. Moreover, and due to their su...
The purpose of this paper is twofold: (i) we will argue that formal semantics might have faltered due to its failure in distinguishing between two fundamentally very different types of concepts, namely ontological concepts, that should be types in a strongly-typed ontology, and logical concepts, that are predicates corresponding to properties of, a...
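A minimal sketch of the distinction being drawn, in Haskell and with invented names (Human, Artifact, heavy, madeBy are mine, not the paper's): ontological concepts show up as types, while logical concepts show up as predicates over objects of those types:

    -- Ontological concepts as types in a toy strongly-typed ontology.
    newtype Human    = Human String
    newtype Artifact = Artifact String

    -- Logical concepts as predicates: 'heavy' is a property of artifacts,
    -- 'madeBy' a relation between an artifact and a human.
    heavy :: Artifact -> Bool
    heavy (Artifact name) = name == "piano"   -- stand-in for world knowledge

    madeBy :: Artifact -> Human -> Bool
    madeBy _ _ = False                        -- stand-in; no facts asserted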
In the concluding remarks of Ontological Promiscuity Hobbs (1985) made what we believe to be a very insightful observation: given that semantics is an attempt at specifying the relation between language and the world, if "one can assume a theory of the world that is isomorphic to the way we talk about it ... then semantics becomes nearly trivial"....
The Winograd Schema (WS) challenge, proposed as an alternative to the Turing Test, has become the new standard for evaluating progress in natural language understanding (NLU). In this paper we will not however be concerned with how this challenge might be addressed. Instead, our aim here is threefold: (i) we will first formally "situate" the WS cha...
The Winograd Schema (WS) challenge has been proposed as an alternative to the Turing Test as a test for machine intelligence. In this short paper we "situate" the WS challenge in the data-information-knowledge continuum, suggesting in the process what a good WS is. Furthermore, we suggest that the WS is a special case of a more general phenomenon i...
This is a short Commentary on Trinh & Le (2018) ("A Simple Method for Commonsense Reasoning") that outlines three serious flaws in the cited paper and discusses why data-driven approaches cannot be considered as serious models for the commonsense reasoning needed in natural language understanding in general, and in reference resolution, in particul...
We argue that logical semantics might have faltered due to its failure in distinguishing between two fundamentally very different types of concepts: ontological concepts, that should be types in a strongly-typed ontology, and logical concepts, that are predicates corresponding to properties of and relations between objects of various ontological ty...
The Communications Web site, http://cacm.acm.org, features more than a dozen bloggers in the BLOG@CACM community. In each issue of Communications, we'll publish selected posts or excerpts. Follow us on Twitter at http://twitter.com/blogCACM and at http://cacm.acm.org/blogs/blog-cacm.
Edwin Torres considers the enduring value of code comm...
We suggest modeling concepts as types in a strongly-typed ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. In such a framework, certain types of ambiguities in natural language are explained by the notion of polymorphism. In this paper we suggest such a typed compositional semantics for nom...
Over two decades ago a "quiet revolution" overwhelmingly replaced knowledge-based approaches in natural language processing (NLP) by quantitative (e.g., statistical, corpus-based, machine learning) methods. Although it is our firm belief that purely quantitative approaches cannot be the only paradigm for NLP, dissatisfaction with purely engineering...
In this paper we suggest a typed compositional semantics for nominal compounds of the form [Adj Noun] that models adjectives as higher-order polymorphic functions, and where types are assumed to represent concepts in an ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. In addition to [Adj...
We suggest modeling concepts as types in a strongly-typed ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. In such a framework, certain types of ambiguities in natural language are explained by the notion of polymorphism. In this paper we suggest such a typed compositional semantics for nom...
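One way to picture how polymorphism can explain such ambiguities, offered only as a sketch: a Haskell type class can stand in for a polymorphic adjective whose meaning shifts with the ontological type of its argument (the types Car and Friend, and the thresholds, are invented for illustration):

    -- Toy ontological types.
    data Car    = Car    { carAge :: Int }
    data Friend = Friend { friendshipYears :: Int }

    -- 'old' as a polymorphic adjective: one name, type-dependent meaning.
    class Old a where
      old :: a -> Bool

    instance Old Car    where old c = carAge c > 15
    instance Old Friend where old f = friendshipYears f > 20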
In this note we suggest that difficulties encountered in natural language semantics are, for the most part, due to the use of mere symbol manipulation systems that are devoid of any content. In such systems, where there is hardly any link with our common-sense view of the world, and it is quite difficult to envision how one can formally account for...
We argue for a compositional semantics grounded in a strongly typed ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. Assuming the existence of such a structure, we show that the semantics of various natural language phenomena may become nearly trivial.
The purpose of this paper is twofold: (i) we argue that the structure of commonsense knowledge must be discovered, rather than invented; and (ii) we argue that natural language, which is the best known theory of our (shared) commonsense knowledge, should itself be used as a guide to discovering the structure of commonsense knowledge. In addition to...
We argue for a compositional semantics grounded in a strongly typed ontology that reflects our commonsense view of the world and the way we talk about it. Assuming such a structure we show that the semantics of various natural language phenomena may become nearly trivial.
Intelligent systems (e.g., natural language understanding systems, planning systems, etc.) require the storage of and reasoning with a vast amount of background (commonsense) knowledge. This 'knowledge bottleneck' has in fact led several researchers to abandon the 'knowledge intensive' approaches in AI in favor of more quantitative methods, with ve...
We describe a mental state model for agents negotiating in a virtual marketplace. Buying and selling agents enter the marketplace with an 'attitude' formulated as a complex function of prior experience(s), market conditions, product information as well as personal characteristics such as importance of time vs. importance of price and the commitment...
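A toy rendering of 'attitude' as a function of such inputs; the field names and weights below are hypothetical and only illustrate the shape of such a model, not its actual formulation in the paper:

    -- Hypothetical inputs to a negotiating agent's attitude.
    data MarketInputs = MarketInputs
      { priorExperience :: Double  -- 0 (bad) .. 1 (good)
      , marketPressure  :: Double  -- 0 (buyer's market) .. 1 (seller's)
      , timeImportance  :: Double  -- weight of time vs. price
      , commitment      :: Double  -- commitment to closing the deal
      }

    -- A simple weighted blend yielding, say, a concession rate in [0,1].
    attitude :: MarketInputs -> Double
    attitude m = min 1 ( 0.3 * priorExperience m
                       + 0.3 * marketPressure m
                       + 0.2 * timeImportance m
                       + 0.2 * commitment m )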
In this paper we briefly report on our progress on these three tracks
A mental state model for autonomous agent negotiation is described. In this model, agent negotiation is assumed to be a function of the agents' mental state (attitude) and their prior experiences. The mental state model we describe here subsumes both competitive and cooperative agent negotiations. The model is first instantiated by buying and se...
The purpose of this paper is to suggest that quantifiers in natural languages do not have a fixed truth functional meaning as has long been held in logical semantics. Instead we suggest that quantifiers can best be modeled as complex inference procedures that are highly dynamic and sensitive to the linguistic context, as well as time and memory con...
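A hedged sketch of that idea in Haskell: a quantifier as a procedure over a contextual domain, with an explicit budget standing in for time and memory constraints (the budget parameter and the particular procedures are illustrative, not the paper's definitions):

    -- A quantifier as a resource-bounded procedure, not a fixed truth function.
    type Quantifier a = Int          -- budget: how many individuals to inspect
                      -> [a]         -- the contextual domain
                      -> (a -> Bool) -- the property being quantified over
                      -> Bool

    most :: Quantifier a
    most budget domain p =
      let sample = take budget domain
          yes    = length (filter p sample)
      in  yes * 2 > length sample

    every :: Quantifier a
    every budget domain p = all p (take budget domain)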
It is by now widely accepted that a number of tasks in natural language understanding (NLU) require the storage of and reasoning with a vast amount of background (commonsense) knowledge. While several efforts have been made to build such ontologies, a consensus on a scientific methodology for ontological design is yet to emerge. In this paper we su...
Despite overwhelming evidence suggesting that quantifier scope is a phenomenon that must be treated at the pragmatic level, most computational treatments of scope ambiguities have thus far been a collection of syntactically motivated preference rules. This might be in part due to the prevailing wisdom that a commonsense inferencing strategy would re...
Quantification in natural language is an important phenomenon that seems to touch on some pragmatic and inferential aspects of language understanding. In this paper we focus on quantifier scope ambiguity and suggest a cognitively plausible model that resolves a number of problems that have traditionally been addressed in isolation. Our claim here is...
Traditional approaches to the resolution of quantifier scope ambiguity are based on devising syntactic and semantic rules to eliminate a multitude of otherwise equally valid readings. This approach is neither cognitively nor computationally plausible. Instead we suggest a cognitively plausible model to quantifier scope using a "quantificational res...
Quantification in natural language is an important phenomenon as it relates to scoping, reference resolution, and, more importantly, to inference. In this paper we argue that the reasoning involved in quantifier scoping and reference resolution is highly dependent on the linguistic context as well as time and memory constraints. Time and memory cons...
This book deals with a well-known problem in artificial intelligence, namely commonsense reasoning. The author tackles this problem from the standpoint of classical AI, building on theories of nonmonotonic reasoning developed by McCarthy, McDermott, Poole, Reiter, and Allen. However, the proposals given in this book differ from traditional approach...
In this paper we describe a database interface that is loosely based upon some of the concepts proposed by Richard Montague in his approach to the interpretation of natural language. The system is implemented as an executable attribute grammar specified in a higher order, lazy, pure functional programming language. The attribute grammar formalism pr...
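For flavour only (the actual system is an executable attribute grammar, which this does not reproduce), a Montague-style fragment over a toy database can be sketched in Haskell, with determiners denoting higher-order functions from predicates to truth values; the entities and predicates below are invented:

    -- Entities are database values; predicates are characteristic functions.
    type Entity = String
    type Pred   = Entity -> Bool

    db :: [Entity]                       -- the rows of a toy database
    db = ["alice", "bob", "carol"]

    employee, manager :: Pred            -- hypothetical table predicates
    employee e = e `elem` ["alice", "bob", "carol"]
    manager  e = e == "carol"

    -- Determiners as generalized quantifiers over the database domain.
    every, some :: Pred -> Pred -> Bool
    every p q = all q (filter p db)
    some  p q = any q (filter p db)

    -- "some employee is a manager"  ~>  some employee manager  ==  True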
We suggest modeling concepts as types in a strongly-typed ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. In such a framework, certain types of ambiguities in natural language are explained by the notion of polymorphism. In this paper we suggest such a typed compositional semantics for nom...
Thesis (Ph. D.)--Carleton University, 1999. Includes bibliographical references.