Chapter

From Computing with Numbers to Computing with Words — from Manipulation of Measurements to Manipulation of Perceptions

Wiley
Annals of the New York Academy of Sciences

Abstract

Interest in issues relating to consciousness has grown markedly during the last several years. And yet, nobody can claim that consciousness is a well-understood concept that lends itself to precise analysis. It may be argued that, as a concept, consciousness is much too complex to fit into the conceptual structure of existing theories based on Aristotelian logic and probability theory. An approach suggested in this paper links consciousness to perceptions and perceptions to their descriptors in a natural language. In this way, those aspects of consciousness which relate to reasoning and concept formation are linked to what is referred to as the methodology of computing with words (CW). Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language (e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc.). Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech, and summarizing a story. Underlying this remarkable capability is the brain's crucial ability to manipulate perceptions: perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood, and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes.
As a methodology, computing with words provides a foundation for a computational theory of perceptions: a theory which may have an important bearing on how humans make, and machines might make, perception-based rational decisions in an environment of imprecision, uncertainty, and partial truth. A basic difference between perceptions and measurements is that, in general, measurements are crisp, whereas perceptions are fuzzy. One of the fundamental aims of science has been and continues to be that of progressing from perceptions to measurements. Pursuit of this aim has led to brilliant successes. We have sent men to the moon; we can build computers that are capable of performing billions of computations per second; we have constructed telescopes that can explore the far reaches of the universe; and we can date the age of rocks that are millions of years old. But alongside the brilliant successes stand conspicuous underachievements and outright failures. We cannot build robots that can move with the agility of animals or humans; we cannot automate driving in heavy traffic; we cannot translate from one language to another at the level of a human interpreter; we cannot create programs that can summarize nontrivial stories; our ability to model the behavior of economic systems leaves much to be desired; and we cannot build machines that can compete with children in the performance of a wide variety of physical and cognitive tasks. It may be argued that underlying the underachievements and failures is the unavailability of a methodology for reasoning and computing with perceptions rather than measurements. An outline of such a methodology, referred to as a computational theory of perceptions, is presented in this paper. The computational theory of perceptions (CTP) is based on the methodology of CW. In CTP, words play the role of labels of perceptions, and, more generally, perceptions are expressed as propositions in a natural language.
CW-based techniques are employed to translate propositions expressed in a natural language into what is called the Generalized Constraint Language (GCL). In this language, the meaning of a proposition is expressed as a generalized constraint, X isr R, where X is the constrained variable, R is the constraining relation, and isr is a variable copula in which r is an indexing variable whose value defines the way in which R constrains X. Among the basic types of constraints are possibilistic, veristic, probabilistic, random set, Pawlak set, fuzzy graph, and usuality. The wide variety of constraints in GCL makes GCL a much more expressive language than the language of predicate logic. In CW, the initial and terminal data sets, IDS and TDS, are assumed to consist of propositions expressed in a natural language. These propositions are translated, respectively, into antecedent and consequent constraints. Consequent constraints are derived from antecedent constraints through the use of rules of constraint propagation. The principal constraint propagation rule is the generalized extension principle. The derived constraints are re-translated into a natural language, yielding the terminal data set (TDS). The rules of constraint propagation in CW coincide with the rules of inference in fuzzy logic. A basic problem in CW is that of explicitation of X, R and r in a generalized constraint, X isr R, which represents the meaning of a proposition, p, in a natural language. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers; and second, when there is a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW and CTP. 
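The constraint-propagation step described above can be sketched in code. The following is a minimal illustration, not Zadeh's formal machinery: a possibilistic constraint on X ("X is small", with an assumed triangular membership function) is propagated through f(x) = x*x via the generalized extension principle, mu_Y(y) = sup of mu_X(x) over all x with f(x) = y.

```python
# A minimal sketch (not Zadeh's formal machinery) of a possibilistic
# constraint "X is small" and its propagation through f(x) = x**2 by the
# generalized extension principle: mu_Y(y) = sup over {x : f(x) = y} of mu_X(x).
# The membership function for "small" is an illustrative assumption.

def mu_small(x):
    """Possibility that x qualifies as 'small' (assumed triangular shape)."""
    return max(0.0, 1.0 - abs(x) / 10.0)

def propagate(f, mu_x, domain):
    """Induce the constraint on Y = f(X) from the constraint on X."""
    mu_y = {}
    for x in domain:
        y = f(x)
        mu_y[y] = max(mu_y.get(y, 0.0), mu_x(x))  # sup over the preimages of y
    return mu_y

domain = [i / 2 for i in range(-20, 21)]          # discretized universe for X
mu_y = propagate(lambda x: x * x, mu_small, domain)
print(mu_y[4.0])   # possibility that Y = 4, attained at X = +/-2 -> 0.8
```

On this discretized universe the derived constraint assigns Y = 4 the possibility 0.8, inherited from the most possible preimage X = 2; re-translation of such a derived constraint into a word ("Y is fairly small") is the step CW performs last.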
At this juncture, the computational theory of perceptions, which is based on CW, is in its initial stages of development. In time, it may come to play an important role in the conception, design and utilization of information/intelligent systems. Furthermore, it may contribute to a better understanding of those aspects of consciousness that relate to reasoning and concept formation. The role model for CW and CTP is the human mind.


... An important aspect of fuzzy logic is granular computing (clump formation), since information granulation is essential for human problem solving [38] and for human cognition [40]. According to Hobbs [9] [10]. ...
... The focus of computation usually lies on the manipulation of numbers and symbols. Computing-with-Words, in contrast, is a methodology in which words and sentences are used as the objects of computation [40]. Computing-with-Words deals with the imprecise, uncertain, and vague information found in propositions that occur in natural language [33]. Where a tolerance for imprecision exists, it can be exploited by using words instead of numbers [18, 37, 40]. ...
... Computing-with-Words, in contrast, is a methodology in which words and sentences are used as the objects of computation [40]. Computing-with-Words deals with the imprecise, uncertain, and vague information found in propositions that occur in natural language [33]. Where a tolerance for imprecision exists, it can be exploited by using words instead of numbers [18, 37, 40]. Computing-with-Words builds on the ability of the human brain (e.g. ...
Article
Full-text available
Abstract: This article offers an overview of the development and interrelationships of the individual elements of fuzzy logic, of which fuzzy set theory forms the foundation. The core problem lies in handling linguistic information, which is often characterized by imprecision. The various technical applications of fuzzy logic offer a way to build more intelligent computer systems that can cope with imprecise information. Such systems are indications of the emergence of a new era of cognitive computing, which this article also addresses. For better understanding, the article is accompanied by an example from meteorology (i.e., snow in Adelboden).
... In addition, in the FLp we use precise theorems, classical deducibility and formal logic, whereas the FLu operates with informal and approximate reasoning (Fig. 1). In practice the FLe stems from Zadeh's previous theories on information granulation, precisiated language and computing with words, as well as on the theory of perceptions [21, 22, 23, 24, 26, 27]. Zadeh's ideas mean that we can apply both traditional bivalent-based and novel approximate validity, definitions, axioms, theories and explanations, inter alia. ...
... This procedure is analogous to operationalization in statistics and mathematical modeling because in these contexts the original, often linguistic and imprecise, terms and hypotheses are replaced with their quantitative and numerically measurable counterparts. A formal FLe language usually contains such linguistic variables whose values are specified by using primitive terms, linguistic modifiers (hedges), negations, connectives, quantifiers and qualifiers [21, 24, 27]. For example, if our variable is age of persons, we may use such values as young, very young, not old, young or very young and some persons are old. ...
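The linguistic variable just described can be sketched concretely. The following is an illustrative assumption, not the cited paper's formalization: the base term "young" gets a piecewise-linear membership function, and the hedges follow the common fuzzy-logic conventions ("very" concentrates by squaring, "not" takes the complement).

```python
# Sketch of a linguistic variable "age" with hedges, following common
# fuzzy-logic conventions (very = squaring, not = complement).
# The base membership function for "young" is an illustrative assumption.

def young(age):
    """Membership of 'young': full below 25, fading linearly to 0 at 45."""
    if age <= 25:
        return 1.0
    if age >= 45:
        return 0.0
    return (45 - age) / 20

very = lambda mu: (lambda x: mu(x) ** 2)     # concentration hedge
not_ = lambda mu: (lambda x: 1.0 - mu(x))    # negation

very_young = very(young)
old = not_(young)                            # treating 'old' as complement of 'young'
not_old = not_(old)

print(young(35), very_young(35), not_old(35))   # prints: 0.5 0.25 0.5
```

A 35-year-old is thus "young" to degree 0.5 but "very young" only to degree 0.25, which is exactly the kind of graded valuation the operationalization step replaces crisp categories with.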
Article
Full-text available
Lotfi Zadeh's fuzzy extended logic is applied to approximate linguistic reasoning. The prevailing fuzzy reasoning methods still seem to have some bivalent commitments in truth valuation and thus an alternative many-valued resolution is presented at meta-level.
... Fuzzy models demonstrate significant appropriateness in scenarios characterized by limited data availability. They facilitate the development of fuzzy rules through the integration of linguistic expressions and expert knowledge, as suggested by Zadeh [28,29]. ...
Article
Full-text available
Resource conflicts constitute a major global issue in areas rich in natural resources. The modeling of factors influencing natural resource conflicts (NRCs), including environmental, health, socio-economic, political, and legal aspects, presents a significant challenge compounded by inadequate data. Quantitative research frequently emphasizes large-scale conflicts. This study presents a novel multilevel approach, SEFLAME-CM—Spatially Explicit Fuzzy Logic-Adapted Model for Conflict Management—for advancing understanding of the relationship between NRCs and drivers under territorial and rebel-based typologies at a community level. SEFLAME-CM is hypothesized to yield a more robust positive correlation between the risk of NRCs and the interacting conflict drivers, provided that the conflict drivers and input variables remain the same. Local knowledge from stakeholders is integrated into spatial decision-making tools to advance sustainable peace initiatives. We compared our model with spatial multi-criteria evaluation for conflict management (SMCE-CM) and spatial statistics. The results from the Moran’s I scatter plots of the overall conflicts of the SEFLAME-CM and SMCE-CM models exhibit substantial values of 0.99 and 0.98, respectively. Territorial resource violence due to environmental drivers increases coast-wards, more than that stemming from rebellion. Weighing fuzzy rules and conflict drivers enables equal comparison. Environmental variables, including proximity to arable land, mangrove ecosystems, polluted water, and oil infrastructures are key factors in NRCs. Conversely, socio-economic and political factors seem to be of lesser importance, contradicting prior research conclusions. In Third World nations, local communities emphasize food security and access to environmental services over local political matters amid competition for resources. 
The synergistic integration of fuzzy logic analysis and community perception to address sustainable peace while simultaneously connecting environmental and socio-economic factors is SEFLAME-CM’s contribution. This underscores the importance of a holistic approach to resource conflicts in communities and the dissemination of knowledge among specialists and local stakeholders in the sustainable management of resource disputes. The findings can inform national policies and international efforts in addressing the intricate underlying challenges while emphasizing the knowledge and needs of impacted communities. SEFLAME-CM, with improvements, proficiently illustrates the capacity to model intricate real-world issues.
... In situations where there is limited data availability, fuzzy models prove to be highly suitable. They enable the generation of fuzzy rules based on a combination of linguistic statements and expert knowledge, as proposed by Zadeh [27,28]. In this paper, we present an integrative approach that facilitates the inclusion of community knowledge in modeling NRBCs. ...
Preprint
Full-text available
Resource conflicts represent a significant global challenge in regions abundant with natural resources. Modelling the myriad factors driving natural resource-based conflicts (NRBCs), spanning environmental, health, socio-economic, and political dimensions, is a complex endeavor exacerbated by data scarcity. Furthermore, existing quantitative studies often focus solely on large-scale conflicts. This article introduces a novel algorithm, the Spatially Explicit Fuzzy Logic-Adapted Model for Conflict Management (SEFLAME-CM), which integrates the local knowledge of stakeholders into spatial decision-making technologies to support sustainable peace efforts. The results are validated with spatial multi-criteria evaluation (SMCE) using spatial statistics. The Moran’s I scatter plots for the overall conflicts reveal significant values of 0.99 and 0.98 for both the SEFLAME-CM and SMCE, respectively, with significant spatial autocorrelation. While there remains room for improvement in enhancing the model's quality, SEFLAME-CM demonstrates its capacity to transparently model complex real-world problems. The findings underscore the imperative for a holistic approach to addressing environmental degradation, socio-economic, and political drivers of resource conflicts at the community level. Our paper demonstrates the significance of spatial information technologies and knowledge exchange between experts and local stakeholders in effectively managing resource conflicts. These insights should inform national policies and international interventions, ensuring that the complex underlying issues are addressed while prioritizing the knowledge and needs of affected communities.
... The countries that have recognized the potential of AI have already started investing funds in resources to conduct further AI research programs. Many leading countries, including Canada, Italy, Austria, Singapore, Japan, the USA and Britain, have already initiated and announced plans for AI research and development programs [14]. ...
Chapter
Technological advancements continue to grow at an incredible speed, which results in increased collaboration between humans and technology. Though technology-based AI is a relatively new term, foundational research on artificial intelligence began long ago. AI can store huge amounts of data and process it at very high speed, can solve problems, and can compete with human abilities. Today AI is a modern technology that studies how the human brain thinks, learns, decides, and works while trying to solve a problem, and the outcomes of that study are used to develop intelligent software and systems. AI will become an integral part of the lives of businesses and individuals; it might exceed human abilities, but can it replace sophisticated behavior such as love, moral choice, and emotions? In this study we analyse and examine how humans and AI might evolve together and what the future of humans might be with the help of AI. Will people be at a greater advantage than they are today?
... Lotfi A. Zadeh (1999) observed that fuzzy logic, which is designed around people's great capacity for making approximations without exact values, poses many complications. Handling a large amount of conflicting information is a further complexity, because conflicting information is unavoidable yet hard to process. ...
Article
Full-text available
Artificial Intelligence is intelligence displayed by machines, in contrast with human beings. AI is a combination of machines and learning techniques, which are used for various applications in the market. Artificial Intelligence is the future, and it is already part of our daily lives: taking a ride in a driverless car, analyzing patterns of online behaviour, and detecting credit card fraud all involve artificial intelligence. This study explores the areas and functions in quick service restaurants where artificial intelligence can enhance the serviceability of the outlets. The present paper tries to understand the different aspects of AI prevalent and suitable for hospitality-related activities. This will enable us to know whether AI's strengths can be used in quick service restaurants and will also provide insights into which functional areas of quick service restaurants artificial intelligence can be used in. It endeavors to lay a path for further research in this direction. The study finds that artificial intelligence can be used in different functional areas such as customer interactions, the ordering process, preparing and delivering food, online table reservation, customer service, choosing meals, payment options, advertising, etc. It will not only increase customer satisfaction but will also help restaurant chains generate revenue and create brand loyalty.
... Self-perceived health information is usually expressed by means of numerical values although it is subjective, uncertain or vague. The fuzzy linguistic approach represents qualitative aspects as linguistic values by means of linguistic variables (see [27]). This approach is adequate when attempting to qualify phenomena related to human perceptions as in the problem we address here: self-perceived health. ...
Chapter
Full-text available
The concept of quality of life is a subjective feeling that only the patient is able to define. The absence of disease is one of the determinants of well-being and quality of life. Generally, self-perceived health status is measured by specific or generic questionnaires. The health information collected in the questionnaires is usually expressed by numerical values although the indicators evaluated are qualitative and subjective. This contribution proposes a linguistic approach where health information provided by patients is modelled by means of linguistic information in order to manage the uncertainty and subjectivity of such assessments. The contribution introduces a new model for measuring self-perceived health that can manage linguistic information and computes a final linguistic evaluation for each patient, applying an effective aggregation operator. A real case study is also presented to show the usefulness and effectiveness of the proposed model in the case of diabetes.
Article
Perth Airport is a major airport along the southwest coast of Australia. Even though, on average, fog only occurs about twelve times a year, the lack of suitable alternate aerodromes nearby for diversion makes fog forecasts for Perth Airport very important to long-haul international flights. Fog is most likely to form in the cool season between April and October. This study developed an objective fuzzy logic fog forecasting model for Perth Airport for the cool season. The fuzzy logic fog model was based on outputs from a high-resolution operational NWP model called LAPS125 that ran twice daily at 00 and 12 UTC, but fuzzy logic was employed to deal with the inaccuracy of NWP prediction and uncertainties associated with relationships between fog predictors and fog occurrence. The outcome of the fuzzy logic fog model falls into one of four categories from low to high fog risk, FM0, FM5, FM15 or FM30, intended to map to approximate fog probabilities of 0, 5, 15 and 30%, respectively. The model was found useful in its five-year performance in the cool seasons between 2004 and 2008 and required little recalibration if mist was treated as if it were also a fog event in the skill evaluation. To generate an operational fog forecast for Perth Airport, the outcome of the fuzzy logic fog model was averaged with the outcomes of two other fog forecasting methods using a simple consensus approach. The fog forecast so generated is known as the operational consensus forecast. Skill assessment using a frequency distribution diagram, the Hansen and Kuiper skill score, and the Relative Operating Characteristic curve showed that the operational consensus forecast outperformed all three individual methods. Of the three methods, the fuzzy logic fog model ranked second: it performed better than the other objective method, called GASM, but worse than the subjective method, which relied on the forecaster’s subjective assessment.
The skills of the fuzzy logic fog model can be further improved with the tuning of fuzzy functions. In addition, similar models can be customised for other airports. The study also suggested the use of the simple consensus approach to enhance forecasting skills for other stations or weather phenomena if there were two or more independent forecasting methods available. Keywords: fog, fog forecasting, fuzzy logic, NWP, consensus, Perth Airport
Article
Full-text available
The research described in this article is a continuation of work on a computational model of quality of life (QoL) satisfaction. In the proposed approach, overall life satisfaction is aggregated into personal life satisfaction (PLUS). The model described in the article is based on well-known and commonly used clinimetric scales (e.g., in psychiatry, psychology and physiotherapy). The simultaneous use of multiple scales, and the complexity of describing quality of life with them, require complex fuzzy computational solutions. The aim of the study is twofold: (1) to develop a fuzzy model that allows for the detection of changes in life satisfaction scores (data on the influence of the COVID-19 pandemic and the war in the neighboring country were used); and (2) to develop more detailed guidelines than the existing ones for further similar research on more advanced intelligent systems with computational models which allow for sensing, detecting and evaluating mental state. We are concerned with developing practical solutions with higher scientific and clinical utility for both small datasets and big data to use in remote patient monitoring. Two exemplary groups of specialists at risk of occupational burnout were assessed three times at different intervals in terms of life satisfaction. The assessment was made on Polish citizens because specific data could be gathered before and during the pandemic and during the war in Ukraine (a neighboring country), which allows a better analysis of, and reflection on, the practical application of the model. A research group (physiotherapists, n = 20) and a reference group (IT professionals, n = 20) participated in the study. Four clinimetric scales were used for assessment: the Perceived Stress Scale (PSS10), the Maslach Burnout Scale (MBI), the Satisfaction with Life Scale (SWLS), and the Nordic Musculoskeletal Questionnaire (NMQ).
The assessment was complemented by statistical analyses and fuzzy models based on a hierarchical fuzzy system. Although several models for understanding changes in life satisfaction scores have been previously investigated, the novelty of this study lies in the use of data from three consecutive time points for the same individuals and the way they are analyzed, based on fuzzy logic. In addition, the new hierarchical structure of the model used in the study provides flexibility and transparency in the process of remotely monitoring changes in people’s mental well-being and a quick response to observed changes. The aforementioned computational approach was used for the first time.
Article
Full-text available
Featured Application Potential application of the work concerns automatic or semi-automatic systems for the assessment and classification of characteristics of life satisfaction as far as early risk of work-related stress or burnout. Abstract The general goal of the research in this article is to devise an algorithm for assessing overall life satisfaction—a term often referred to as Quality of Life (QoL). It is aggregated into its own proposition, called personal life usual satisfaction (PLUS). An important assumption here is that the model is based on already known and commonly used solutions, such as medical (psychological and physiotherapeutic) questionnaires. Thanks to this, the developed solution allows us to obtain a synergy effect from the existing knowledge, without the need to design new, complicated procedures. Fuzzy multivariate characterization of life satisfaction presents a challenge for a complete analysis of the phenomenon. The complexity of description using multiple scales, including linguistic ones, requires additional computational solutions, as presented in this paper. The detailed aim of this study is twofold: (1) to develop a fuzzy model reflecting changes in life satisfaction test scores as influenced by the corona virus disease 2019 (COVID-19) pandemic, and (2) to develop guidelines for further research on more advanced models that are clinically useful. Two groups affected by professional burnout to different degrees were assessed for life satisfaction twice (before and during the pandemic): a study group (physiotherapists, n = 25) and a reference group (computer scientists, n = 25). The Perceived Stress Score (PSS10), Maslach Burnout Inventory (MBI), Satisfaction with Life Scale (SWLS), and Nordic Musculoskeletal Questionnaire (NMQ) were used. The resultant model is based on a hierarchical fuzzy system.
The novelty of the proposed approach lies in the combination of the use of data from validated clinimetric tests with the collection of data from characteristic time points and the way in which they are analyzed using fuzzy logic through transparent and scalable hierarchical models. To date, this approach is unique and has no equivalent in the literature. Thanks to the hierarchical structure, the evaluation process can be defined as a modular construction, which increases transparency and makes the whole procedure more flexible.
Article
In this paper we experimentally assess, from both algorithmic and pragmatic perspectives, the adequacy of linguistic descriptions of real data generated by two metaheuristics: simulated annealing and genetic algorithms. The type of descriptions we consider are fuzzy quantified statements (both Zadeh's type-1 and type-2) involving three well-known quantification models (Zadeh's scalar and fuzzy models, and Delgado's GD). We conducted an empirical validation using real observation and prediction meteorological data, where the adequacy of the generated descriptions was assessed both automatically (metrics-based) and manually (by human experts). Results indicate that, overall, the genetic approach performs better than simulated annealing in terms of the quality of the obtained descriptions and execution time. The significance of this outperforming depends on the type of meteorological data and the quantification model selected. Tests of statistical significance point out that for type-1 descriptions no significant differences exist between the two metaheuristics in the prediction case. For type-2 descriptions, significant differences exist for Delgado's GD model for both types of data. For Zadeh's scalar and fuzzy quantification, significance depends on the type of data (observation or prediction). Globally, the outperforming of the genetic approach over simulated annealing (i) is significant in 4 out of 12 scenarios considered (all of them type-2), and (ii) is not significant in the other 8 out of 12 scenarios (all type-1 and two type-2). Human expert assessment of the adequacy of the descriptions was also conducted, showing that both metaheuristics behave similarly for type-1 descriptions, while genetic algorithms produce more suitable type-2 linguistic descriptions.
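One of the quantification models compared above, Zadeh's scalar model, admits a compact sketch: the truth of a type-1 statement "Q of the X are A" is the quantifier's membership evaluated at the relative sigma-count of A. The quantifier shape and data below are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch of Zadeh's scalar (sigma-count) evaluation of a type-1 fuzzy
# quantified statement "Q of the X are A". The quantifier 'most' and the
# sample memberships are illustrative assumptions.

def mu_most(p):
    """Fuzzy quantifier 'most': 0 below 0.3, 1 above 0.8, linear in between."""
    return min(1.0, max(0.0, (p - 0.3) / 0.5))

def truth_of_quantified(quantifier, memberships):
    """Truth of 'Q of the records are A' via the relative sigma-count of A."""
    sigma_count = sum(memberships)          # scalar cardinality of fuzzy set A
    return quantifier(sigma_count / len(memberships))

warm_days = [1.0, 0.9, 0.8, 0.7, 0.2, 0.1]  # mu_warm for six observed days
print(truth_of_quantified(mu_most, warm_days))  # ~0.63 for this sample
```

The sigma-count 3.7 over 6 records gives a proportion of about 0.62, so "most days are warm" holds to degree roughly 0.63 under the assumed quantifier; the fuzzy and GD models mentioned above replace this single scalar proportion with fuzzy cardinalities.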
Chapter
Human beings are animals endowed with a great curiosity. They continuously ask themselves how things are, where they come from, and where they go to. Questioning is at the origins of reasoning; and possibly, without the capability of self-questioning and guessing, neither directed thinking, nor reasoning, will exist. Their existence makes them a matter of study.
Article
In this paper we present a model based on computational intelligence and natural language generation for the automatic generation of textual summaries from numerical data series, aiming to provide insights which help users to understand the relevant information hidden in the data. Our model includes a fuzzy temporal ontology with temporal references which addresses the problem of managing imprecise temporal knowledge, which is relevant in data series. We fully describe a real use case of application in the environmental information systems field, providing linguistic descriptions about the air quality index (AQI), which is a very well-known indicator provided by all meteorological agencies worldwide. We consider two different data sources of real AQI data provided by the official Galician (NW Spain) Meteorology Agency: (i) AQI distribution in the stations of the meteorological observation network and (ii) time series which describe the state and evolution of the AQI in each meteorological station. Both application models were evaluated following the current standards and good practices of manual human expert evaluation of the Natural Language Generation field. Assessment results by two experts meteorologists were very satisfactory, which empirically confirm that the proposed textual descriptions fit this type of data and service both in content and layout.
Article
Full-text available
In this research, prediction of crude oil cuts from the first stage of the refining process is laid out using a rough set theory (RST) based adaptive neuro-fuzzy inference system (ANFIS) soft sensor model to enhance the performance of the oil refinery process. The RST was used to reduce the fuzzy rule sets of the ANFIS model and the features in its decision table. Also, discretisation methods were used to optimise the discretisation of continuous data. This helps to predict the two critical variables of the light naphtha product, Reid Vapor Pressure (RVP) and American Petroleum Institute gravity (API gravity), which determine the cut's quality. Hence, a real-time process at the Al Doura oil refinery is examined, and the process data on refining crude oil from these two sources improve the knowledge provided by the data. The response variables represent the feedback measured value of the cascade controller at the top of the splitter in the crude distillation unit (CDU) in the rectifying section, which controls the flow of reflux liquid towards the splitter's head. The proposed adaptive soft sensor model succeeded in fitting the results from laboratory tests, and a steady-state control system was achieved through an embedded virtual sensor. The predictive control system employs a cascade ANFIS controller in parallel with the soft sensor model to keep the purity of the distillate product within the stated range of the oil refinery's quality control. The results obtained from the proposed ANFIS-based cascade control have no over/undershoots, and the rise time and settling time are improved by 26.65% and 84.63%, respectively, compared with the conventional proportional-integral-derivative (PID) based cascade control. Furthermore, the results of the prediction and control model are compared with those of other machine learning techniques.
Article
This paper revisits the Communication and Engagement Toolkit for CO2 Capture and Storage (CCS) projects proposed by Ashworth and colleagues in collaboration with the Global CCS Institute. The paper proposes a new method, based on the toolkit, for understanding the social context where CCS will be deployed. In practice, the proposed method can be used to harness social data collected on the CCS project. The outcome of this application is the development of a predictive tool for gaining insight into the future, to guide strategic decisions that may enhance deployment. Methodologically, the proposed predictive tool is an artificial intelligence (AI) tool. It uses a fuzzy deep neural network to develop the computational ability to reason about social behavior. The hybridization of fuzzy logic and deep neural network algorithms makes the predictive tool an explainable AI system: the predictions of the algorithm are interpretable using fuzzy logical rules. The practical feasibility of the proposed system has been demonstrated using an experimental sample of 198 volunteers. Their perceptions, emotions and sentiments were tested using a standard questionnaire from the literature, on a hypothetical CCS project based on 26 predictors. The generalizability of the algorithm to predict future reactions was tested on 84 out-of-sample respondents. In the simulation experiment, we observed approximately 90% performance, measured by comparing the algorithm's predictions with the self-reported reactions of the out-of-sample subjects. The implications of the proposed tool for enhancing the predictive power of the conventional CCS Communication and Engagement toolkit are discussed.
Article
Full-text available
In the face of today’s global challenges, oil and gas companies must define long-term priorities and opportunities in implementing complex Arctic offshore projects, taking into account environmental, economic, technological and social aspects. In this regard, ensuring strategic sustainability is the basis for long-term development. The aim of the study is to analyze existing approaches to the concept of “strategic sustainability” of an offshore Arctic oil and gas project and to develop a methodological approach to assessing the strategic sustainability of offshore oil and gas projects. In the theoretical part of the study, the approaches to defining strategic sustainability were reviewed and classified, and the most appropriate definition of strategic sustainability for an offshore oil and gas project was chosen. The hierarchy analysis method was used for the strategic sustainability assessment. Specific criteria have been proposed to reflect the technical, geological, investment, social and environmental characteristics important to an offshore oil and gas project. The strategic sustainability of five offshore oil and gas projects was analyzed using an expert survey as part of the hierarchy analysis method. Recommendations were made on the development of an offshore project management system to facilitate the emergence of new criteria and improve the quality of the strategic sustainability assessment of offshore projects in the Arctic.
Article
Full-text available
In this contribution an original concept of stochastic stability (P‐stability), formulated here to represent the realistic stochastic nature of materials, is employed for stability analysis of material performance characteristics. Stochasticity in materials can be related, for example, to manufacturing processes, potential treatment of the material, or properties of raw materials. On the other hand, performance characteristics obtained through the novel fuzzy-set-based multi‐scale methodology are also of a stochastic nature. The particular strength of this newly developed P‐stability is its ability to analyse not only individual characteristics but also multiple characteristics simultaneously. The methodology also allows the actual accuracy of the obtained characteristics to be estimated and quantified.
Article
In computing with words, it has been stressed that words mean different things to different people, which entails that decision makers (DMs) have personalized individual semantics (PISs) attached to linguistic expressions in linguistic group decision making (GDM). In particular, the PISs of DMs are not fixed; they change during the consensus building process, which indicates the necessity of continual PIS learning. Therefore, in this article, we propose a continual-PIS-learning-based consensus approach in linguistic GDM. Specifically, a continual PIS learning model with a consistency-driven methodology is proposed to update the PISs, taking into account all the linguistic preference data given by DMs during the consensus process. Then, consensus measurement and feedback recommendation based on PIS are developed to monitor the consensus process. Finally, numerical examples and simulation analysis are presented to illustrate and justify the use of the continual-PIS-learning-based consensus approach.
Conference Paper
Full-text available
Artificial Intelligence (AI) has become a first-class citizen in the cities of the 21st century. New applications include features based on the opportunities that AI brings, such as the medical diagnostic support systems, recommendation systems and intelligent assistance systems that we use every day. People are also increasingly concerned about the security and reliability of these AI-based systems. Moreover, trust, fairness, accountability, transparency and ethics are becoming central issues for AI-based systems. Institutions are beginning to issue regulations and to sponsor projects to promote AI transparency and to ensure that every decision made by an AI-based system can be convincingly explained to humans. In this context, Explainable AI (XAI) has become a hot topic within the research community. In this paper we conduct an experimental study with 15 datasets to validate the feasibility of using a pool of gray-box classifiers to automatically explain a black-box classifier.
Article
To be acceptably safe one must identify the risks one is exposed to and decide what risk-reducing measures are required. It is uncertain whether a threat really will materialize, and determining the size and probability of the risk is also full of uncertainty. When performing an analysis and preparing for decision making under uncertainty, failure rate data, information on consequence severity or on a probability value, or even on whether an event can occur at all, is quite frequently lacking. In those cases, a possible way, and sometimes the only way, to proceed is to revert to expert judgment. Even when historical data are available, an expert can be asked whether and to what extent such data still hold in the current situation. Expert elicitation, however, comes with an uncertainty that depends on the expert's reliability, which becomes very visible when two or more experts give different or even conflicting answers. This is not a new problem, and very bright minds have thought about how to tackle it in a rational and objective way. So far, however, the topic has not been given much attention in daily process safety and risk assessment practice. This paper therefore has a review and applied character and presents various approaches with detailed explanations and examples.
Article
Full-text available
The paper contains a discussion of solutions to a symmetric type of fuzzy stochastic differential equations. The symmetric equations under study have drift and diffusion terms placed symmetrically on both sides of the equations. We claim that such symmetric equations have unique solutions in the case that the equations’ coefficients satisfy a certain generalized Lipschitz condition. To show this, we prove that an approximation sequence converges to the solution. Then, a study of the stability of the solution is given. Some inferences for symmetric set-valued stochastic differential equations end the paper.
Article
The present paper proposes two construction ways to study the general forms of ordinal sums of fuzzy implications with the intent of unifying the ordinal sums existing in the literature. The first ordinal sum construction way, which we call “Implication Complementing”, is to study how to complement a specific fuzzy implication to the linear transformations of given fuzzy implications defined on respective disjoint subsquares whose principal diagonals are segments of the principal diagonal of the unit square, in order that the resulting ordinal sum on the unit square is a fuzzy implication. The second way, which we call “Implication Reconstructing”, is to study how to reconstruct an initial fuzzy implication through replacing its some values on given rectangular regions of the unit square with the linear transformations of respective given fuzzy implications such that the redefined function is a new fuzzy implication. In both ways, necessary and sufficient conditions for the final reconstructed functions to be fuzzy implications are given and several new constructions of ordinal sums of fuzzy implications are obtained, which would generalize the existing ordinal sums from several aspects. In particular, by adopting the idea behind the second way, the generalized ordinal sums of fuzzy implications are proposed, in which the regions where the linear transformations of given fuzzy implications are defined are neither necessarily subsquares nor necessarily along the principal or minor diagonal of the unit square.
Article
Knowledge discovery from databases copes with several problems including the heterogeneity of data and interpreting the solution in an understandable and convenient form for domain experts. Fuzzy logic approaches based on the computing with words paradigm are very appealing since they offer the possibility to express useful knowledge from a large volume of data by linguistic terms, which are easily understandable for diverse users. In this paper, the novel descriptive data mining algorithm based on fuzzy functional dependencies has been proposed. In the first step, data are fuzzified, which ensures the same manipulation of crisp and fuzzy data. The data mining step is based on revealing fuzzy functional dependencies among considered attributes. In the final step, the mined knowledge is interpreted linguistically by the fuzzy modifiers and quantifiers. The proposed algorithm has been explained on illustrative data and tested on real-world dataset. Finally, its benefits, weak points and possible future research topics are discussed.
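The final linguistic-interpretation step described above can be sketched with a Zadeh-style relative quantifier; the quantifier breakpoints and the sample membership degrees below are illustrative assumptions, not taken from the paper:

```python
def most(p):
    """Relative fuzzy quantifier 'most' in Zadeh's style.
    The breakpoints 0.3 and 0.8 are illustrative assumptions."""
    if p <= 0.3:
        return 0.0
    if p >= 0.8:
        return 1.0
    return (p - 0.3) / 0.5

def truth_of_summary(degrees, quantifier):
    """Degree of truth of a linguistic summary 'Q objects are P',
    given per-object membership degrees in the fuzzy predicate P."""
    return quantifier(sum(degrees) / len(degrees))

# e.g. truth of "most measured salaries are high"
high_degrees = [1.0, 0.9, 0.8, 0.2]
print(truth_of_summary(high_degrees, most))  # ~0.85
```

Summaries of this form let crisp and fuzzified attributes be reported to users in the same linguistic vocabulary.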
Article
Full-text available
A novel methodology, based on the theory of fuzzy sets, to obtain materials with pre‐defined sets of strength properties has been analysed from the position of identifying the necessary and sufficient number of experiments needed to predict these macro characteristics and establishing which micro parameters significantly influence the macroscale results. The procedure to estimate, with a user‐defined degree of accuracy, the minimum number of experiments and significant micro parameters has been tested and verified using experimental data, obtained from digital images of material microsections under different heat treatment conditions while analysing strength properties of reinforcing steel. The results confirm the possibility of using the developed methodologies for the performance properties evaluation of materials based on the minimum number of experiments and identification of the key grain‐phase parameters.
Article
Full-text available
Object recognition is a complex neuronal process determined by interactions between many visual areas: from the retina and thalamus to the ventral visual pathway. These structures transform a variable, single-pixel signal in the photoreceptors into a stable object representation. Neurons in visual area V4, midway along the ventral stream, act as such stable shape detectors. A feed-forward hierarchy of receptive fields (RFs) increasing in size and complexity leads to the grandmother-cell concept. Our question is how these processes might identify an object or its elements in order to recognize it under new, unseen conditions. We propose a new approach to this problem by extending the classical definition of the RF to a fuzzy detector. RF properties are also determined by the computational properties of the bottom-up and top-down pathways comparing the stimulus with many predictions. The “driver-type” logic (DTL) of bottom-up computations looks for a large number of possible object parts (hypotheses, the rough set (RS) upper approximation), as the object's elements are similar to RF properties. The optimal combination is chosen, in unsupervised, parallel, multi-hierarchical pathways, by the “modulator-type” logic (MTL) of top-down computations (the RS lower approximation). Interactions between DTL (hypotheses) and MTL (predictions) terminate when the RS boundary becomes small: the object is recognized.
Article
Full-text available
The paper substantiates the need to treat economic efficiency indicators of bank activity as fuzzy quantities. Formulations of the problem of fuzzy regression analysis and modelling available in the literature are analyzed, and three main approaches to fuzzy regression analysis are presented. A general mathematical and substantive formulation of the problem of fuzzy multivariate regression analysis for commercial bank competitiveness is proposed, and the sequence of its solution is described. An example of numerical computations for one of the large Ukrainian banks is given. The results of the obtained solution were analyzed for reliability and accuracy and compared against classical crisp regression analysis. The finishing steps for obtaining final, accurate numerical results of the solution process are described. In summary, convincing arguments concerning the expediency of applying this approach to the problem of determining the competitiveness of banks are formulated and presented.
Article
Swiss Post (Die Schweizerische Post) is regarded as a predestined provider of smart-city services because, as a diversified group, it continuously optimizes, with the help of modern technologies, a wide variety of offerings in fundamental areas such as communication, logistics and mobility that are essential to the functioning of a smart city, while placing the customer at the center. Its postal services are intended to generate an advantage for the customer by means of adaptive and interactive systems and thus simplify everyday life. To achieve this goal, the theoretical concepts of soft and cognitive computing are being worked out in collaboration with the Human-IST Institute of the University of Fribourg and then planned and implemented directly in concrete practical projects. The article thus provides an overview of the current state of both the theory and the projects of Swiss Post.
A new methodology to obtain metallic functional materials with predefined sets of strength properties has been developed. It has been shown that in order to accurately estimate set of material properties at the macro-level, information from the micro-level needs to be taken into account. As a result a two-level estimation model, based on the theory of fuzzy sets, has been proposed. To demonstrate the developed methodology, a reinforcing steel has been analysed. Using microstructural information, derived from an available set of experimentally obtained digital images of material microsections under different heat treatment conditions, macroscopic strength properties of reinforcing steel have been determined.
Article
In this paper, an algebraic construction–called rotation–is introduced, which produces a fuzzy implication from a fuzzy implication. This construction method is similar to the rotation construction for triangular norms. An infinite number of new families of such fuzzy implications can be constructed in this way which provides a broad spectrum of choices for e.g. fuzzy connectives in fuzzy set theory. A preservation of the logical properties of the initial implication in the final one is investigated.
Article
Alzheimer’s Disease (AD) is the most frequent neurodegenerative form of dementia. Although dementia cannot be cured, it is very important to detect preclinical AD as early as possible. Several studies have demonstrated the effectiveness of the joint use of structural Magnetic Resonance Imaging (MRI) and cognitive measures to detect and track the progression of the disease. Since hippocampal atrophy is a well-known biomarker of AD progression, we propose here a novel methodology exploiting it as a searchlight to detect the best discriminating features for classifying subjects with Mild Cognitive Impairment (MCI) converting (MCI-c) or not converting (MCI-nc) to AD. In particular, we define a significant subdivision of the hippocampal volume into fuzzy classes, and for each class we train Support Vector Machine (SVM) classifiers on cognitive and morphometric measurements of normal controls (NC) and AD patients. From the ADNI database, we used MRI scans and cognitive measurements at baseline for 372 subjects: 98 subjects with AD and 117 NC as a training set, and 86 with MCI-c and 71 with MCI-nc as an independent test set. The accuracy of early diagnosis was evaluated by means of a longitudinal analysis. The proposed methodology was able to accurately predict the disease onset even after one year (median AUC = 88.2%, interquartile range 87.2%–89.0%). Besides its robustness, the proposed fuzzy methodology naturally incorporates the degree of uncertainty intrinsically affecting neuroimaging features. Thus, it might be applicable to several other pathological conditions involving morphometric changes of the brain.
Article
Full-text available
The division of the internal structure and external space of geographical entities is the foundation of spatial analysis, query and reasoning. Most current division methods are crisp and inconsistent with human cognitive habits. Existing geographic information systems (GISs) usually analyze spatial data directly by certain spatial analysis methods and then use natural language or words to explain the analysis results, so the encoding process is absent, although it is necessary in an intelligent GIS (IGIS). Fuzzy geographical entities and phenomena occur throughout the real world, and the semantics of words related to spatial locations and relations usually involve uncertainty. First, the geographical perceptual computing (GPC) model based on CWW is proposed; it includes four modules: input, geo-encoder, geographical CWW engine and geo-decoder. Then, the trapezoidal fuzzy set is adopted to represent spatial words, and a fine fuzzy spatial partitioning model of line objects based on CWW is proposed. The interior of a line object is divided into several parts according to fuzzy logic and human cognitive habits. The exterior of a line entity is then divided into several parts by combining direction-relation and distance-relation models with fuzzy logic methods. This model provides a full natural language description of the interior and exterior of crisp or fuzzy line entities and is consistent with human cognitive habits. An application case of this model is provided at the end and demonstrates its superiority.
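The trapezoidal representation of spatial words mentioned above can be sketched as follows; the distance word "near" and its kilometre breakpoints are my own illustrative assumptions, not values from the paper:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: ramps up on [a, b], equals 1 on [b, c],
    ramps down on [c, d], and is 0 elsewhere."""
    if b <= x <= c:
        return 1.0
    if a < x < b:
        return (x - a) / (b - a)
    if c < x < d:
        return (d - x) / (d - c)
    return 0.0

# Hypothetical spatial word "near" over distance in km.
def near(km):
    return trapezoid(km, -1.0, 0.0, 2.0, 5.0)

print(near(1.0))   # 1.0  (fully "near")
print(near(3.5))   # 0.5  (partially "near")
print(near(6.0))   # 0.0  (not "near")
```

A vocabulary of such words (near, medium, far, and direction terms) gives the geo-encoder a numeric handle on linguistic spatial descriptions.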
Article
Full-text available
The article presents the possibilities of semantic modeling in the development of feedback tools in the social sciences. A new approach to the computational theory of perceptions (CTP) for the analysis of mental objects is proposed. The article demonstrates the implementation of relativistic psychometrics for the study of mental response (opinions, expectations and attitudes). The problem of image understanding and its significance is considered in a combination of soft and hard computing. It is shown that the modeling of an object (its coding and decoding in a ‘mental map’) obeys semiotic and mathematical logic. Computing with perceptions for the rules of mental representation proves their identity to the laws of conservation. The article demonstrates the versatility of the semiotic description of objects in Minkowski space. It also confirms, by mathematical solution, C. S. Peirce's metaphor according to which the semiology of language is a truly universal algebra of relations.
Chapter
Fuzzy logic [161] has been successfully applied in many areas, such as control of industrial processes, control of robotic manipulators, control of servo-motors, complex decision making, diagnostic systems and others [10,93,94,98]. When using fuzzy logic, input data in the form of linguistic values are represented by membership functions, which define fuzzy sets of crisp values and their corresponding membership degrees. Many parameters of fuzzy-logic-based systems should be defined with the help of an expert. At the same time, however, the associated parameter-construction processes are performed by trial and error or by heuristic algorithms [143]. Moreover, a designer who knows the characteristics of the system also needs initial rules. Some researchers have suggested a number of mechanisms to generate fuzzy rules and have developed methods for modifying them on the basis of experience [131,134,140,141,142]. Among them we must distinguish the self-organizing fuzzy controller capable of forming and modifying fuzzy rules [134,142], the clustering algorithm for fuzzy partitioning of the input data space [140], and the least-squares algorithm for defining a succession of parameters [140,141] that can be used to construct fuzzy-logic-based systems.
Article
In group decision making (GDM) dealing with Computing with Words (CW), the importance of the statement that words mean different things to different people has been highlighted because of its influence on the final decision. Different proposals have been developed and applied in the specialized literature that either group such different meanings (uncertainty) to provide one representation for all people, or use multi-granular linguistic term sets with semantics for each granularity. Although these models are quite useful, they do not yet model individually the different meanings each person attaches when eliciting linguistic information. Hence, in this paper a personalized individual semantics (PIS) model is proposed to personalize individual semantics by means of an interval numerical scale and the 2-tuple linguistic model. Specifically, a consistency-driven optimization-based model to obtain and represent the PIS is introduced. A new CW framework based on the 2-tuple linguistic model is then defined; this framework allows us to deal with PIS to facilitate CW while keeping the idea that words mean different things to different people. To justify the feasibility and validity of the PIS model, it is applied to solve linguistic GDM problems with a consensus reaching process.
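The 2-tuple linguistic representation underlying this kind of model can be sketched as below. This is a standard textbook formulation, not code from the paper; note that Python's round uses banker's rounding at exact .5 ties, a known edge case:

```python
def to_2tuple(beta, terms):
    """Translate beta in [0, g] (g = len(terms) - 1) into a linguistic
    2-tuple (s_i, alpha) with symbolic translation alpha in [-0.5, 0.5)."""
    i = int(round(beta))
    return terms[i], beta - i

def from_2tuple(term, alpha, terms):
    """Inverse translation back to a numerical value in [0, g]."""
    return terms.index(term) + alpha

terms = ["none", "low", "medium", "high", "perfect"]
t, a = to_2tuple(2.8, terms)
print(t, round(a, 2))   # high -0.2
```

Because the translation is invertible, aggregation can be done on the numerical side without losing the linguistic labels.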
Book
This book deals with the theory, design principles, and application of hybrid intelligent systems using type-2 fuzzy sets in combination with other paradigms of Soft Computing technology such as Neuro-Computing and Evolutionary Computing. It provides a self-contained exposition of the foundation of type-2 fuzzy neural networks and presents a vast compendium of its applications to control, forecasting, decision making, system identification and other real problems. Type-2 Fuzzy Neural Networks and Their Applications is helpful for teachers and students of universities and colleges, for scientists and practitioners from various fields such as control, decision analysis, pattern recognition and similar fields.
Chapter
Glossary Definition of the Subject Introduction: Granular Computing and Ordered Data Philosophical Basis of DRSA Granular Computing Dominance-Based Rough Set Approach Fuzzy Set Extensions of the Dominance-Based Rough Set Approach Variable-Consistency Dominance-Based Rough Set Approach (VC-DRSA) Dominance-Based Rough Approximation of a Fuzzy Set Monotonic Rough Approximation of a Fuzzy Set Versus Classical Rough Set Dominance-Based Rough Set Approach to Case-Based Reasoning An Algebraic Structure for Dominance-Based Rough Set Approach Conclusions Future Directions Bibliography
Article
Any computing with words (CW) system is required to assign a phrase in natural language to the fuzzy values it provides as its output. This paper explores different linguistic approximation methods for CW systems. The outputs of these methods are evaluated through various measures such as fuzziness, specificity, validity, and sigma-count. We illustrate that certain linguistic methods may result in complex and incomprehensible phrases in natural language; some might even include an invalid linguistic term in their linguistic approximation.
Chapter
In the last sections of the previous chapter we introduced some simple examples of practical fuzzy-logic-based applications in which an FRBS always had two crisp inputs and a crisp output. In this chapter we shall discover models of greater complexity that, more than examples, are intended to be a guideline for more ambitious goals. These models, besides being practical, are meant to inspire the reader.
Article
Concept maps and Lotfi Zadeh’s fuzzy extended logic are applied to such computerized approximate reasoning models as modus ponens and modus tollens. A statistical application is also sketched. A pedagogical approach is mainly adopted, but these ideas are also applicable to the conduct of inquiry in general.
Article
The purpose of this paper is to explore synergies and gaps in research in Conceptual Spaces (CS) and Computing with Words (CWW), which both attempt to address aspects of human cognition such as judgement and intuition. Both CS and CWW model concepts in term of collections of properties, and use similarity as a key computational device. We outline formal methods developed in CWW for modelling and manipulating constructs when membership values are imprecise. These could be employed in CS modelling. On the other hand, CS offers a more comprehensive theoretical framework than CWW for the construction of properties and concepts on collections of domains. We describe a specific formalism of CS based on fuzzy sets, and discuss problems with it and with alternative methods for aggregating property memberships into concept membership. To overcome the problems, we present a model in which all constructs are fuzzy sets on a plane, and similarity of two constructs is an inverse function of the average separation between their membership functions.
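The idea of similarity as an inverse function of the average separation between membership functions can be sketched as follows; the two height-domain properties and the specific formula 1/(1 + separation) are illustrative assumptions consistent with, but not taken from, the paper:

```python
def similarity(mu_a, mu_b, xs):
    """Similarity of two fuzzy constructs as an inverse function of the
    average separation between their membership functions, sampled on xs."""
    separation = sum(abs(mu_a(x) - mu_b(x)) for x in xs) / len(xs)
    return 1.0 / (1.0 + separation)

# Two illustrative properties on a height domain in cm (shapes assumed).
tall  = lambda cm: min(1.0, max(0.0, (cm - 160) / 30))
lanky = lambda cm: min(1.0, max(0.0, (cm - 170) / 30))
xs = range(140, 211)

print(similarity(tall, tall, xs))    # 1.0 (identical constructs)
print(similarity(tall, lanky, xs))   # < 1.0
```

Identical constructs have zero separation and hence maximal similarity, so the measure behaves as a graded counterpart of equality.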
Article
We consider recently introduced fuzzy stochastic differential equations with solutions of decreasing fuzziness. In general, such equations do not have solutions that can be written in explicit, closed form; therefore, some methods of constructing approximate solutions are proposed in this paper. In the considered framework, approximate solutions are measurable and adapted fuzzy stochastic processes. We analyze two kinds of sequences of approximate solutions. It is shown that each sequence of approximate solutions can be used to prove the existence and uniqueness of the solution to fuzzy stochastic differential equations of decreasing fuzziness; in fact, both sequences converge to the unique solution. The rates of convergence of both sequences are investigated. All the results apply immediately to set-valued stochastic differential equations with solutions of decreasing diameter.
Article
Full-text available
We define fuzzy constraint networks and prove a theorem about their relationship to fuzzy logic. Then we introduce Khayyam, a fuzzy constraint-based programming language in which any sentence in the first-order fuzzy predicate calculus is a well-formed constraint statement. Finally, using Khayyam to address an equipment selection application, we illustrate the expressive power of fuzzy constraint-based languages.
Conference Paper
Full-text available
After reviewing the notion of crisp constraint networks and their relationship to semantics in classical logic, the authors define fuzzy constraint networks and their relationship to fuzzy logic. Then they introduce Khayyam, a fuzzy constrained-based programming language which implements much of Zadeh's PRUF formalism. In Khayyam, any sentence in the first-order fuzzy predicate calculus is a well-formed constrained statement. Finally, using Khayyam to address an equipment selection application, the expressive power of constraint-based languages is illustrated
Article
Full-text available
In classical Constraint Satisfaction Problems (CSPs) knowledge is embedded in a set of hard constraints, each one restricting the possible values of a set of variables. However, constraints in real-world problems are seldom hard, and CSPs are often idealizations that do not account for preference among feasible solutions. Moreover, some constraints may have priority over others. Lastly, constraints may involve uncertain parameters. This paper advocates the use of fuzzy sets and possibility theory as a realistic approach to representing these three aspects. Fuzzy constraints encompass both preference relations among possible instantiations and priorities among constraints. In a Fuzzy Constraint Satisfaction Problem (FCSP), a constraint is satisfied to a degree (rather than satisfied or not satisfied), and the acceptability of a potential solution becomes a gradual notion. Even if the FCSP is partially inconsistent, best instantiations are provided owing to the relaxation of some constraints. Fuzzy constraints are thus flexible. The CSP notions of consistency and k-consistency can be extended to this framework, and the classical algorithms used in CSP resolution (e.g., tree search and filtering) can be adapted without losing much of their efficiency. Most classical theoretical results remain applicable to FCSPs. In the paper, various types of constraints are modelled in the same framework. The handling of uncertain parameters is carried out in the same setting because possibility theory can account for both preference and uncertainty. The presence of uncertain parameters leads to ill-defined CSPs, where the set of constraints defining the problem is not precisely known.
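The gradual notion of acceptability can be sketched with a toy fuzzy CSP; the meeting-hour constraints below are my own illustration, not an example from the paper:

```python
def acceptability(assignment, constraints):
    """Acceptability of a potential solution in a fuzzy CSP: the minimum
    satisfaction degree over all flexible constraints."""
    return min(c(assignment) for c in constraints)

# Toy problem: pick a meeting hour. Each fuzzy constraint returns a
# degree in [0, 1] instead of a hard True/False.
prefers_morning = lambda h: max(0.0, 1.0 - h / 12)            # earlier is better
after_nine      = lambda h: 0.0 if h < 9 else min(1.0, (h - 8) / 2)

hours = range(24)
best = max(hours, key=lambda h: acceptability(h, [prefers_morning, after_nine]))
print(best)   # 9
```

Even though no hour satisfies both constraints fully, the min-based degree still ranks candidates and yields a best relaxed solution.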
Article
Full-text available
Given a set of objects in a scene whose identifications are ambiguous, it is often possible to use relationships among the objects to reduce or eliminate the ambiguity. A striking example of this approach was given by Waltz [13]. This paper formulates the ambiguity-reduction process in terms of iterated parallel operations (i.e., relaxation operations) performed on an array of (object, identification) data. Several different models of the process are developed, convergence properties of these models are established, and simple examples are given.
Book
The concept of fuzzy sets is one of the most fundamental and influential tools in computational intelligence. Fuzzy sets can provide solutions to a broad range of problems of control, pattern classification, reasoning, planning, and computer vision. This book bridges the gap that has developed between theory and practice. The authors explain what fuzzy sets are, why they work, when they should be used (and when they shouldn't), and how to design systems using them. The authors take an unusual top-down approach to the design of detailed algorithms. They begin with illustrative examples, explain the fundamental theory and design methodologies, and then present more advanced case studies dealing with practical tasks. While they use mathematics to introduce concepts, they ground them in examples of real-world problems that can be solved through fuzzy set technology. The only mathematics prerequisites are a basic knowledge of introductory calculus and linear algebra. Bradford Books imprint
Article
In this paper, we focus on two kinds of linear programming problems with fuzzy numbers, called interval number and fuzzy number linear programming problems, respectively. Linear programming problems with interval number coefficients are approached by taking maximum value range and minimum value range inequalities as constraint conditions, reducing them to two classical linear programming problems, and obtaining an optimal interval solution. Fuzzy linear programming problems with fuzzy number coefficients are approached in two ways: the "fuzzy decisive set approach" and the "interval number linear programming approach for several membership levels". Finally, we give numerical solutions for the illustrative examples.
Article
This paper deals with multiobjective integer linear programming problems with fuzzy parameters in the objective functions and in the constraints by using the concept of the α-level set of fuzzy numbers. A parametric study is carried out on the problem of concern. In addition, a solution algorithm is described to solve the formulated model. This algorithm is based mainly on a weighting method together with the technique of Gupta and Ravindran. A simple numerical example is included to clarify the theory developed in the paper.
Article
This book is an introduction to fuzzy arithmetic.
Article
A fuzzy restriction may be visualized as an elastic constraint on the values that may be assigned to a variable. In terms of such restrictions, the meaning of a proposition of the form “x is P,” where x is the name of an object and P is a fuzzy set, may be expressed as a relational assignment equation of the form R(A(x)) = P, where A(x) is an implied attribute of x, R is a fuzzy restriction on x, and P is the unary fuzzy relation which is assigned to R. For example, “Stella is young,” where young is a fuzzy subset of the real line, translates into R(Age(Stella))= young. The calculus of fuzzy restrictions is concerned, in the main, with (a) translation of propositions of various types into relational assignment equations, and (b) the study of transformations of fuzzy restrictions which are induced by linguistic modifiers, truth-functional modifiers, compositions, projections and other operations. An important application of the calculus of fuzzy restrictions relates to what might be called approximate reasoning, that is, a type of reasoning which is neither very exact nor very inexact. The main ideas behind this application are outlined and illustrated by examples.
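The relational assignment equation R(Age(Stella)) = young can be made concrete by choosing a membership function for young (the particular shape below is an assumption for illustration only):

```python
def young(age):
    # Illustrative membership function for "young": fully compatible up to
    # age 25, decreasing linearly to 0 at age 45.
    if age <= 25:
        return 1.0
    if age >= 45:
        return 0.0
    return (45 - age) / 20.0

# "Stella is young" assigns the fuzzy restriction `young` to the implied
# attribute Age(Stella); the degree to which a candidate age satisfies the
# elastic constraint is its membership value.
restriction = {"Age(Stella)": young}
degree = restriction["Age(Stella)"](30)
```

The constraint is elastic in exactly the sense the abstract describes: age 30 does not violate it outright but satisfies it only to degree 0.75 under this membership function.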
Article
A basic idea suggested in this paper is that a linguistic hedge such as very, more or less, much, essentially, slightly, etc. may be viewed as an operator which acts on the fuzzy set representing the meaning of its operand. For example, in the case of the composite term very tall man, the operator very acts on the fuzzy meaning of the term tall man.
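In Zadeh's treatment, very is commonly modeled as concentration (squaring the membership function) and more or less as dilation (taking its square root). A minimal sketch, with an invented membership function for tall:

```python
import math

def tall(height_cm):
    # Illustrative membership for "tall": 0 below 160 cm, 1 above 190 cm.
    return min(1.0, max(0.0, (height_cm - 160) / 30.0))

def very(mu):
    return lambda x: mu(x) ** 2           # concentration: sharpens the set

def more_or_less(mu):
    return lambda x: math.sqrt(mu(x))     # dilation: broadens the set

very_tall = very(tall)
# A 175 cm person is tall to degree 0.5, but very tall only to degree 0.25.
```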
Article
Natural Logic. In 1970, G. Lakoff published a paper [8] in which he introduced the concept of natural logic with the following goals: to express all concepts capable of being expressed in natural language, to characterize all the valid inferences that can be made in natural language, to mesh with adequate linguistic descriptions of all natural languages.
Article
Fuzzy linear programming problems are analyzed within the fuzzy set context to uncover redundancies, infeasibilities, variables whose values are fixed, and implied bounds on rows and columns. Applications include pre-analysis of perturbed constraint matrices and post-optimization analysis of fuzzy linear programming problems.
Article
A man whose height is four feet is short; adding one tenth of an inch to a short man's height leaves him short; therefore, a man whose height is four feet and one tenth of an inch is short. Now begin again and argue in the same pattern. A man whose height is four feet and one tenth of an inch is short; adding one tenth of an inch to a short man's height leaves him short; therefore, a man whose height is four feet and two tenths of an inch is short. In this way, it seems, we can reach the absurd result that a man whose height is four feet plus any number of tenths of an inch is short. For, if the first argument is sound, so is the second; and if the second, so is the third; and so on. There appears to be no good reason for stopping at any one point rather than at another; it is hard to see why the chain of arguments should ever be broken. But the conclusion is ridiculous; it is, for example, preposterous to say that a man whose height is seven feet is, nevertheless, short.
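Fuzzy logic offers one standard diagnosis of this sorites paradox: "short" holds to a degree, and each step of the chain erodes the truth value slightly, so full truth is never transmitted to the absurd conclusion. A numerical sketch (the membership function is invented for illustration):

```python
def short(height_inches):
    # Illustrative membership for "short": fully true up to 50 in.,
    # falling linearly to false at 70 in.
    return min(1.0, max(0.0, (70 - height_inches) / 20.0))

height = 48.0           # four feet
truth = short(height)   # 1.0: a four-foot man is unreservedly short
while height < 84.0:    # add tenths of an inch up to seven feet
    height += 0.1
# Each premise "h + 0.1 is short" is almost, but not exactly, as true as
# "h is short"; over hundreds of steps the degree slides from 1.0 to 0.0.
final_truth = short(height)
```

No single step breaks the chain; the degree of truth simply drains away gradually, which is why no non-arbitrary stopping point can be found.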
Article
There are three basic concepts that underlie human cognition: granulation, organization and causation. Informally, granulation involves decomposition of whole into parts; organization involves integration of parts into whole; and causation involves association of causes with effects. Granulation of an object A leads to a collection of granules of A, with a granule being a clump of points (objects) drawn together by indistinguishability, similarity, proximity or functionality. For example, the granules of a human head are the forehead, nose, cheeks, ears, eyes, etc. In general, granulation is hierarchical in nature. A familiar example is the granulation of time into years, months, days, hours, minutes, etc. Modes of information granulation (IG) in which the granules are crisp (c-granular) play important roles in a wide variety of methods, approaches and techniques. Crisp IG, however, does not reflect the fact that in almost all of human reasoning and concept formation the granules are fuzzy (f-granular). The granules of a human head, for example, are fuzzy in the sense that the boundaries between cheeks, nose, forehead, ears, etc. are not sharply defined. Furthermore, the attributes of fuzzy granules, e.g., length of nose, are fuzzy, as are their values: long, short, very long, etc. The fuzziness of granules, their attributes and their values is characteristic of ways in which humans granulate and manipulate information. The theory of fuzzy information granulation (TFIG) is inspired by the ways in which humans granulate information and reason with it. However, the foundations of TFIG and its methodology are mathematical in nature. The point of departure in TFIG is the concept of a generalized constraint. A granule is characterized by a generalized constraint which defines it. The principal types of granules are: possibilistic, veristic and probabilistic. 
The principal modes of generalization in TFIG are fuzzification (f-generalization), granulation (g-generalization), and fuzzy granulation (f.g-generalization), which is a combination of fuzzification and granulation. F.g-generalization underlies the basic concepts of linguistic variable, fuzzy if-then rule and fuzzy graph. These concepts have long played a major role in the applications of fuzzy logic and differentiate fuzzy logic from other methodologies for dealing with imprecision and uncertainty. What is important to recognize is that no methodology other than fuzzy logic provides a machinery for fuzzy information granulation. To Didier Dubois and Henri Prade, who have contributed in so many major ways to the development of fuzzy logic and its applications.
Article
The approach described in this paper represents a substantive departure from the conventional quantitative techniques of system analysis. It has three main distinguishing features: 1) use of so-called ``linguistic'' variables in place of or in addition to numerical variables; 2) characterization of simple relations between variables by fuzzy conditional statements; and 3) characterization of complex relations by fuzzy algorithms. A linguistic variable is defined as a variable whose values are sentences in a natural or artificial language. Thus, if tall, not tall, very tall, very very tall, etc. are values of height, then height is a linguistic variable. Fuzzy conditional statements are expressions of the form IF A THEN B, where A and B have fuzzy meaning, e.g., IF x is small THEN y is large, where small and large are viewed as labels of fuzzy sets. A fuzzy algorithm is an ordered sequence of instructions which may contain fuzzy assignment and conditional statements, e.g., x = very small, IF x is small THEN Y is large. The execution of such instructions is governed by the compositional rule of inference and the rule of the preponderant alternative. By relying on the use of linguistic variables and fuzzy algorithms, the approach provides an approximate and yet effective means of describing the behavior of systems which are too complex or too ill-defined to admit of precise mathematical analysis.
Article
The approach described in this paper provides a framework for the definition of such concepts through the use of fuzzy algorithms which have the structure of a branching questionnaire. The starting point is a relational representation of the definiendum as a composite question whose constituent questions are either attributional or classificational in nature. The constituent questions as well as the answers to them are allowed to be fuzzy. By putting the relational representation into an algebraic form, one can derive a fuzzy relation which defines the meaning of the definiendum. This fuzzy relation, then, provides a basis for an interpolation of the relational representation. To transform a relational representation into an efficient branching questionnaire, the tableau of the relation is subjected to a process of compactification which identifies the conditionally redundant questions. From a maximally compact representation, various efficient realizations which have the structure of a branching questionnaire, with each realization corresponding to a prescribed order of asking the constituent questions, can readily be determined. Then, given the cost of constituent questions as well as the conditional probabilities of answers to them, one can compute the average cost of deducing the answer to the composite question. In this way, a relational representation of a concept leads to an efficient branching questionnaire which may serve as its operational definition.
Article
In this paper, we focus on large-scale linear programming problems with block angular structure, for which the Dantzig-Wolfe decomposition method has been successfully applied. By considering the vague nature of human judgements, we assume that the decision maker may have a fuzzy goal for the objective function and fuzzy constraints for the coupling constraints. Having elicited the corresponding linear membership functions through interaction with the decision maker, if we adopt the convex fuzzy decision for combining them, it is shown that, under some appropriate conditions, the formulated problem can be reduced to a number of independent linear subproblems, and the overall satisficing solution for the decision maker is obtained directly, simply by solving the subproblems.
Article
By a linguistic variable we mean a variable whose values are words or sentences in a natural or artificial language. For example, Age is a linguistic variable if its values are linguistic rather than numerical, i.e., young, not young, very young, quite young, old, not very old and not very young, etc., rather than 20, 21, 22, 23, .... In more specific terms, a linguistic variable is characterized by a quintuple (L, T(L), U, G, M) in which L is the name of the variable; T(L) is the term-set of L, that is, the collection of its linguistic values; U is a universe of discourse; G is a syntactic rule which generates the terms in T(L); and M is a semantic rule which associates with each linguistic value X its meaning, M(X), where M(X) denotes a fuzzy subset of U. The meaning of a linguistic value X is characterized by a compatibility function, c: U → [0,1], which associates with each u in U its compatibility with X. Thus, the compatibility of age 27 with young might be 0.7, while that of 35 might be 0.2. The function of the semantic rule is to relate the compatibilities of the so-called primary terms in a composite linguistic value (e.g., young and old in not very young and not very old) to the compatibility of the composite value. To this end, hedges such as very, quite, extremely, etc., as well as the connectives and and or, are treated as nonlinear operators which modify the meaning of their operands in a specified fashion. The concept of a linguistic variable provides a means of approximate characterization of phenomena which are too complex or too ill-defined to be amenable to description in conventional quantitative terms. In particular, treating Truth as a linguistic variable with values such as true, very true, completely true, not very true, untrue, etc., leads to what is called fuzzy logic.
By providing a basis for approximate reasoning, that is, a mode of reasoning which is neither very exact nor very inexact, such logic may offer a more realistic framework for human reasoning than the traditional two-valued logic. It is shown that probabilities, too, can be treated as linguistic variables with values such as likely, very likely, unlikely, etc. Computation with linguistic probabilities requires the solution of nonlinear programs and leads to results which are imprecise to the same degree as the underlying probabilities. The main applications of the linguistic approach lie in the realm of humanistic systems, especially in the fields of artificial intelligence, linguistics, human decision processes, pattern recognition, psychology, law, medical diagnosis, information retrieval, economics and related areas.
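A small sketch of the semantic rule M for the linguistic variable Age, with compatibility functions chosen (as an assumption, for illustration) so that the compatibility of age 27 with young is 0.7 and that of 35 is 0.2, as in the abstract; hedges and connectives then act as operators on compatibilities:

```python
def clamp(v):
    return min(1.0, max(0.0, v))

# Compatibility functions c: U -> [0, 1] for the primary terms (shapes are
# illustrative, calibrated to give c_young(27) = 0.7 and c_young(35) = 0.2).
def young(u):
    return clamp((38.2 - u) / 16.0)

def old(u):
    return clamp((u - 50.0) / 20.0)

# Hedges and connectives as nonlinear operators on compatibility functions:
def very(c):
    return lambda u: c(u) ** 2

def NOT(c):
    return lambda u: 1.0 - c(u)

def AND(c1, c2):
    return lambda u: min(c1(u), c2(u))

# Composite linguistic value "not very young and not very old":
composite = AND(NOT(very(young)), NOT(very(old)))
```

The compatibility of the composite value at any age is computed from the primary terms' compatibilities, exactly as the semantic rule prescribes.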
Article
PRUF - an acronym for Possibilistic Relational Universal Fuzzy - is a meaning representation language for natural languages which departs from the conventional approaches to the theory of meaning in several important respects. First, a basic assumption underlying PRUF is that the imprecision that is intrinsic in natural languages is, for the most part, possibilistic rather than probabilistic in nature. Second, the logic underlying PRUF is not a two-valued or multivalued logic, but a fuzzy logic, FL, in which the truth-values are linguistic, that is, are of the form true, not true, very true, more or less true, not very true, etc., with each such truth-value representing a fuzzy subset of the unit interval. Third, the quantifiers in PRUF - like the truth-values - are allowed to be linguistic, i.e., may be expressed as most, many, few, some, not very many, almost all, etc. Based on the concept of the cardinality of a fuzzy set, such quantifiers are given a concrete interpretation which makes it possible to translate into PRUF propositions exemplified by "Many tall men are much taller than most men," "All tall women are blonde is not very true," etc.
Article
A fuzzy set is a class of objects with a continuum of grades of membership. Such a set is characterized by a membership (characteristic) function which assigns to each object a grade of membership ranging between zero and one. The notions of inclusion, union, intersection, complement, relation, convexity, etc., are extended to such sets, and various properties of these notions in the context of fuzzy sets are established. In particular, a separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
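The extended notions mentioned above have standard definitions: membership in the complement is 1 − μ(x), union takes the pointwise maximum of memberships, and intersection the pointwise minimum. A minimal sketch over a finite universe (the membership values are illustrative):

```python
U = [0, 1, 2, 3, 4]

# Fuzzy sets as membership dictionaries over the universe U.
A = {0: 1.0, 1: 0.8, 2: 0.5, 3: 0.2, 4: 0.0}
B = {0: 0.0, 1: 0.3, 2: 0.5, 3: 0.9, 4: 1.0}

complement = {x: 1.0 - A[x] for x in U}          # membership in "not A"
union = {x: max(A[x], B[x]) for x in U}          # A or B
intersection = {x: min(A[x], B[x]) for x in U}   # A and B

# Inclusion: A is contained in B iff membership in A never exceeds that in B.
def included(A, B):
    return all(A[x] <= B[x] for x in U)
```

Ordinary sets are recovered as the special case where every membership grade is exactly 0 or 1.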
Conference Paper
In this paper, we develop a new algorithm called Fuzzy Q-Learning (or FQ-Learning) which extends Watkins's Q-learning method. It can be used for decision processes in which the goals and/or the constraints, but not necessarily the system under control, are fuzzy in nature. An example of a fuzzy constraint is: the weight of object A must not be substantially heavier than w, where w is a specified weight. Similarly, an example of a fuzzy goal is: the robot must be in the vicinity of door k. We show that FQ-Learning provides an alternative solution to this problem which is simpler than the Bellman-Zadeh fuzzy dynamic programming approach. We apply the algorithm to a multistage decision making problem.
Conference Paper
This paper describes an inference system for uncertain predicates, providing an alternative to the maximal entropy method used by Paris and Vencovska. In the Appendix we give an example of the application of the process, and a formal definition of the logics that underlie the system.
Book
Fuzzy sets were introduced by Zadeh [9] in 1965 to represent/manipulate data and information possessing nonstatistical uncertainties. Fuzzy sets serve as a means of representing and manipulating data that are not precise, but rather fuzzy.
Article
The concept of constraint propagation of label sets is discussed. It is shown how this idea can be extended to environments in which the constraints and label sets are imprecise. This requires the introduction of fuzzy sets. An algorithmic procedure is provided for including default-type constraints into the constraint propagation problem.
Article
Montague's difficult notation and complex model theory have tended to obscure potential insights for the computer scientist studying natural language. Despite his strict insistence on an abstract model-theoretic interpretation for his formalism, we feel that Montague's work can be related to procedural semantics in a fairly direct way. A simplified version of Montague's formalism is presented, and its key concepts are explicated in terms of computational analogues. Several examples are presented within Montague's formalism but with a view toward developing a procedural interpretation. We provide a natural translation from intensional logic into Lisp. This allows one to express the composition of meaning in much the way Montague does, using subtle patterns of functional application to distribute the meanings of individual words throughout a sentence. The paper discusses some of the insights this research has yielded on knowledge representation and suggests some new ways of looking at intensionality, context, and expectation.
Article
The year 1990 may well be viewed as the beginning of a new trend in the design of household appliances, consumer electronics, cameras, and other types of widely used consumer products. The trend in question relates to a marked increase in what might be called the Machine Intelligence Quotient (MIQ) of such products compared to what it was before 1990. Today, we have microwave ovens and washing machines that can figure out on their own what settings to use to perform their tasks optimally.
Article
One of the fundamental tenets of modern science is that a phenomenon cannot be claimed to be well understood until it can be characterized in quantitative terms. Viewed in this perspective, much of what constitutes the core of scientific knowledge may be regarded as a reservoir of concepts and techniques which can be drawn upon to construct mathematical models of various types of systems and thereby yield quantitative information concerning their behavior.
Conference Paper
The authors propose a unified treatment of prioritized and flexible constraints, both being represented by possibility distributions. An approach based on possibility theory is described for representing and solving such fuzzy constraint satisfaction problems (FCSP) involving both types of constraints. Arc- and path-consistency-based methods for constraint satisfaction problems are extended to this possibility theory framework. An illustrative example is given.
Article
The past two decades have witnessed profound changes in the composition, functions and the level of complexity of electrical as well as electronic systems which are employed in modern technology. As a result, classical RLC network theory, which was the mainstay of electrical engineering at a time when RLC networks were the bread and butter of the electrical engineer, has been and is being increasingly relegated to the status of a specialized branch of a much broader discipline - system theory - which is concerned with systems of all types regardless of their physical identity and purpose. This paper presents a brief survey of the evolution of system theory, together with an exposition of some of its main concepts, techniques and problems. The discussion is centered on the notion of state and emphasizes the role played by state-space techniques. The paper concludes with a brief statement of some of the key problems of system theory.
Article
A fuzzy algorithm is an ordered set of fuzzy instructions that upon execution yield an approximate solution to a given problem. Two unrelated aspects of fuzzy algorithms are considered in this paper. The first is concerned with the problem of maximization of a reward function. It is argued that the conventional notion of a maximizing value for a function is not sufficiently informative and that a more useful notion is that of a maximizing set. Essentially, a maximizing set serves to provide information not only concerning the point or points at which a function is maximized, but also about the extent to which the values of the reward function approximate to its supremum at other points in its range. The second is concerned with the formalization of the notion of a fuzzy algorithm. In this connection, the notion of a fuzzy Markoff algorithm is introduced and illustrated by an example. It is shown that the generation of strings by a fuzzy algorithm bears a resemblance to a birth-and-death process and that the execution of the algorithm terminates when no more “live” strings are left.
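One common formalization of the maximizing set (assumed here for illustration) grades each point by how closely the reward function approaches its supremum: for a nonnegative f, μ(x) = f(x) / sup f. A minimal sketch over a finite search space with invented reward values:

```python
# Reward function over a finite search space (values are illustrative).
f = {"a": 2.0, "b": 5.0, "c": 4.0, "d": 1.0}

sup_f = max(f.values())

# The maximizing set grades every point by how nearly it maximizes f,
# rather than merely naming the argmax.
maximizing_set = {x: v / sup_f for x, v in f.items()}
```

Unlike the bare maximizing value "b", the maximizing set also records that "c" comes within 80% of the supremum while "d" falls far short, which is exactly the extra information the abstract argues for.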
Article
As its name suggests, computing with words (CW) is a methodology in which words are used in place of numbers for computing and reasoning. The point of this note is that fuzzy logic plays a pivotal role in CW and vice versa. Thus, as an approximation, fuzzy logic may be equated to CW. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers, and second, when there is a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost, and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW. In CW, a word is viewed as a label of a granule, that is, a fuzzy set of points drawn together by similarity, with the fuzzy set playing the role of a fuzzy constraint on a variable. The premises are assumed to be expressed as propositions in a natural language. In coming years, computing with words is likely to evolve into a basic methodology in its own right, with wide-ranging ramifications on both basic and applied levels.
Synergetic Computation for Constraint Satisfaction Problems Involving Continuous and Fuzzy Variables by Using Occam
  • O. Katai
  • S. Matsubara
  • H. Masuichi
  • M. Ida
Fuzzy Approach to Reasoning and Decision-Making
  • V. Novak
  • M. Ramik
  • M. Cerny
  • J. Nekola
Vagueness, Ambiguity and all the Rest
  • P. Bosch
“Extending Constraint Satisfaction Problem Solving in Structural Design”, 5th Int
  • G. Qi
  • G. Friedrich
“On Reaching Consensus by Groups of Intelligent Agents”, Methodologies for Intelligent Systems. Amsterdam: North-Holland
  • H. Rasiowa
  • M. Marek
Cognition et système. Paris: l’Interdisciplinaire Système(s)
  • R. Vallée