Chapter · PDF Available

Fuzzy Logic in Artificial Intelligence

Abstract

After a basic introduction to fuzzy logic, we discuss its role in artificial and computational intelligence. Then we present innovative applications of fuzzy logic, focusing on fuzzy expert systems, with one typical example explored in some detail. The article concludes with suggestions on how artificial intelligence and fuzzy logic can benefit from each other. I. Introduction. In 1948, Alan Turing wrote a paper [1] marking the beginning of a new era, the era of the intelligent machine, which raised questions that remain unanswered today. This era was heavily influenced by the appearance of the computer, a machine that allowed humans to automate their way of thinking. However, human thinking is not exact. If you had to park your car in exactly one precise spot, you would find it extremely difficult. To allow computers to truly mimic the way humans think, the theories of fuzzy sets and fuzzy logic were created. They should be viewed as formal mathematical theories for the representation of un...
Expert Systems Laboratory (Labor Expertensysteme)
... The main focus of this paper is the production of polynomial automorphisms, which can be used to generalize fuzzy connectives. Achieving this is important because of the crucial role that fuzzy connectives play in the fields of fuzzy logic and, consequently, artificial intelligence [1]. To be more specific, the above-mentioned fields will benefit from advancements in the study of fuzzy connectives, like the one presented in this paper, as they make possible further developments and optimizations. ...
... The definition displayed in this subsection can be found in the following references: Baczyński M., p. 1, [22], and Trillas E., p. 49, [23]. ...
... The following definition can be found in: Klement [22], and Yun S., p. 16, [24]. A function T : [0, 1]² → [0, 1] is called a triangular norm, or t-norm, if it satisfies, for all x, y, z ∈ [0, 1], the following conditions: (T1) T(x, y) = T(y, x) (commutativity); (T2) T(x, T(y, z)) = T(T(x, y), z) (associativity); (T3) if y ≤ z, then T(x, y) ≤ T(x, z) (monotonicity); (T4) T(x, 1) = x (boundary condition). ...
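The axioms (T1)–(T4) can be verified numerically for a concrete t-norm. A minimal sketch for the minimum t-norm T(x, y) = min(x, y); the grid resolution is an arbitrary illustrative choice:

```python
import itertools

def t_min(x, y):
    """Goedel (minimum) t-norm, the largest of all t-norms."""
    return min(x, y)

grid = [i / 10 for i in range(11)]  # sample points in [0, 1]

for x, y, z in itertools.product(grid, repeat=3):
    assert t_min(x, y) == t_min(y, x)                    # (T1) commutativity
    assert t_min(x, t_min(y, z)) == t_min(t_min(x, y), z)  # (T2) associativity
    if y <= z:
        assert t_min(x, y) <= t_min(x, z)                # (T3) monotonicity
    assert t_min(x, 1.0) == x                            # (T4) boundary condition
print("all t-norm axioms hold on the grid")
```

A finite grid check is of course not a proof, but it is a useful sanity test when experimenting with candidate connectives.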
Article
Full-text available
Fuzzy logic is becoming one of the most influential fields of modern mathematics, with applications that impact not only other sciences but society in general. This newfound interest in fuzzy logic is in part due to the crucial role it plays in the development of artificial intelligence. As a result, new tools and practices for the development of the above-mentioned field are in high demand. This is one of the issues this paper was composed to address. More specifically, a sizable part of fuzzy logic is the study of fuzzy connectives. However, the current method used to generalize them is restricted to basic automorphisms, which hinders the creation of new fuzzy connectives. For this reason, this paper conceives a new generalization method that generalizes the fuzzy connectives using polynomial automorphism functions instead. The creation of these automorphisms is achieved through numerical analysis, an endeavor that is supported with programming applications that, using mathematical modeling, validate and visualize the research. Furthermore, the automorphisms satisfy all the necessary criteria that have been established for use in the generalization process and, consequently, are used to successfully generalize fuzzy connectives. The result of the new generalization method is the creation of new usable and flexible fuzzy connectives, which is very promising for the future development of the field.
... Turing (1950) and McCarthy et al. (2006) have long asked questions about whether computers can really think and mimic humans, leading to the development of Fuzzy Logic (FL) theories, for instance. FL is a method of reasoning based on vague and imprecise information and resembles human reasoning (Klement and Slany, 1993). ...
Article
Full-text available
Technology has mostly been embraced in qualitative research, as it has not directly conflicted with qualitative methods' paradigmatic underpinnings. However, Artificial Intelligence (AI), and in particular the process of automating the analysis of qualitative research, has the potential to conflict with the assumptions of interpretivism. This short article aims to explore how AI technologies, such as Natural Language Processing (NLP), have started to be used to analyze qualitative data. While this can speed up the analysis process, it has also sparked debates within the interpretive paradigm about the validity and ethics of these methods. I argue that contextual understanding and final interpretation should mostly remain with the human researcher, since AI might overlook the subtleties of human communication. This is because automated programmes with clear rules and formulae do not work well under interpretivism's assumptions. Nevertheless, AI may be embraced in qualitative research in a partial automation process that enables researchers to conduct rigorous, rapid studies that more easily incorporate the many benefits of qualitative research. It is possible that AI and other technological advancements may lead to new research paradigms that better underpin the contemporary digital researcher. For example, we might see the rise of a “computational” paradigm. While AI promises to enhance efficiency and rigor in data analysis, concerns remain about its alignment with interpretivism.
... Fuzzy logic is a type of mathematical logic that allows for reasoning with imprecise or uncertain information. It is particularly useful in situations where traditional binary logic systems are inadequate, such as in control systems (Lee 1990) and artificial intelligence (Klement and Slany 1993) applications. ...
Article
Full-text available
During the development of a field, many fluid samples are taken from wells. Selecting a robust fluid sample as the reservoir representative helps to have a better field characterization, reliable reservoir simulation, valid production forecast, efficient well placement and finally achieving optimized ultimate recovery. First, this paper aims to detect and separate the samples that have been collected under poor conditions or analyzed in a non-standard way. Moreover, it introduces a novel ranking method to score the samples based on the amount of coordination with other fluid samples in the region. The dataset includes 136 fluid samples from five reservoirs in Iranian fields, each of them consisting of 21 key parameters. Five acknowledged machine learning based anomaly detection techniques are implemented to compare fluid samples and detect those whose results deviate from others, indicating non-standard samples. To ensure the proper detection of outlier data, the results are compared with the traditional validation method of gas-oil ratio estimation. All five outlier detection methods demonstrate acceptable performance with average accuracy of 79% compared to traditional validation. Furthermore, the fluid samples with the highest scores in scoring-based algorithms are introduced as the best reservoir’s representative fluid. Finally, fuzzy logic is used to obtain a final score for each sample, taking the results of the six methods as input and ranking the samples based on their output score. The study confirms the robustness of the novel approach for fluid validation using outlier detection techniques and the value of machine learning and fuzzy logic for sample ranking, excelling in considering all critical fluid parameters simultaneously over traditional methods.
... This technology, which has been around since the 1960s and has been widely employed in medical research for decades, is used for classification applications such as predicting whether a patient will contract a specific disease. The weights of the variables or "features" that connect inputs to outputs are taken into consideration while analyzing problems [42][43][44]. It has been compared to how neurons process signals, though that comparison captures only very rudimentary brain activity. ...
Article
Full-text available
Incubator control for premature babies has benefited greatly from the development of creative methods and applications of artificial intelligence. Due to the immaturity of the epidermis, premature infants lose fluid and heat through the skin early in life, which causes hyperosmolar dehydration and hypothermia. An incubator is therefore required to maintain the baby's healthy temperature, ideally keeping the baby at the same temperature as in the mother's womb. Treating premature infants demands a temperature regulation system with good measurement and control quality. The purpose of this research is to assess current trends in artificial intelligence-based fuzzy logic incubator control for preterm infants. This study's systematic literature review followed the Preferred Reporting Items for Systematic Reviews (PRISMA). After screening and selection, 188 suitable articles met the inclusion criteria. The outcomes demonstrated that incubator control for premature infants offered the best environment for newborns with growth or disease-related issues. An incubator is a sealed space, free of dust and bacteria, with the ability to regulate temperature, humidity, and oxygen to maintain a stable environment.
... Some of the earliest work was done by Fuji Electric in a water-treatment plant and by Hitachi in a subway system. Only from 1990 onward did some US companies begin to use fuzzy logic applications in an industrial context (ABAR, 2004). A large part of the work developed over time has consisted of applications in control systems and in highly complex systems that attempt to assist or replace human reasoning (KLEMENT, 1994). Such control systems can be found in industry, as a way of checking processes; in transport vehicles, so that a given function is executed properly; and even in machines that assist us in everyday life. ...
Article
Full-text available
The concept of fuzzy logic was created by Lotfali Askar-Zadeh in 1965 and is considered one of the main artificial intelligence techniques of the last 50 years. In this contribution, an analytical model of an inverted pendulum on a cart was first developed and simulated numerically in Python. Next, a fuzzy controller, also in Python, was developed to keep the inverted pendulum in the vertical position according to the mechanical properties of the system and the initial conditions. The controller considers the pendulum's angle and angular velocity when adjusting the horizontal force applied to the cart. In conclusion, the results show that the fuzzy controller reacted to the system satisfactorily, reaching an equilibrium position for different pendulum lengths.
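A controller of this kind can be sketched in miniature: triangular membership functions for angle and angular velocity, min-based rule firing, and weighted-average defuzzification. All membership ranges, rules, and force values below are illustrative assumptions, not the article's actual design:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_force(angle, angular_velocity):
    """Toy Mamdani-style controller: fuzzify inputs, fire rules with min,
    defuzzify as a weighted average of singleton output forces."""
    neg_a, zero_a, pos_a = tri(angle, -2, -1, 0), tri(angle, -1, 0, 1), tri(angle, 0, 1, 2)
    neg_w, zero_w, pos_w = (tri(angular_velocity, -2, -1, 0),
                            tri(angular_velocity, -1, 0, 1),
                            tri(angular_velocity, 0, 1, 2))
    # rule base: (firing strength, output singleton force)
    rules = [
        (min(neg_a, neg_w), -10.0),   # falling left, moving left  -> push hard left
        (min(zero_a, zero_w), 0.0),   # upright and still          -> no force
        (min(pos_a, pos_w), 10.0),    # falling right, moving right -> push hard right
    ]
    num = sum(w * f for w, f in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

A real controller would use a fuller rule base (typically all combinations of the linguistic terms) and physically calibrated universes of discourse; the structure, however, is exactly this.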
Article
Full-text available
The digitalization of classic power systems enables the monitoring of diverse processes and occurrences in electric power systems and big data collection. Manipulating data plays a key role in decision-making and planning, and machine learning (ML) algorithms can mimic human-like reasoning and automate decision processes. This manuscript centers on applying ML algorithms to create maintenance schedules for power generators, primarily based on stator winding testing. The investigation showcases the implementation of multiple ML algorithms, encompassing fuzzy logic (FL), adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN), autoencoder neural network (AENN), and decision tree (DT). This research's primary objective is to determine which ML algorithm best fits the assessments made by human experts. Based on real data from Serbian power plants, the best prediction correlation with the human expert is shown by the DT algorithm at 69.45%, followed by ANN at 68.05%, ANFIS at 59.56%, FL at 47.81% and, last, AENN at 6.83%. The presented ML algorithm results, together with risk maps, prove to be a valuable instrument for condition- and risk-based maintenance. The presented decision-making methodology can be used to optimize power plant maintenance schedules, minimize downtime, and maximize operational efficiency in a power system. This investigation contributes to the progress of the electrical power industry by showcasing the capabilities of ML in facilitating informed decision-making processes and enabling proactive maintenance strategies.
Article
Digital technology refers to any technology that uses digital signals or electronic data to process, store, and transmit information. Some examples of digital technologies include social media platforms, cloud computing, artificial intelligence, virtual and augmented reality, and blockchain technology. Digital technology has the potential to play a significant role in achieving sustainable development goals by providing solutions for a wide range of environmental, social, and economic challenges. In this manuscript, we investigate digital technology implementation under sustainable development and identify which area of sustainable development is most in need of digital technology. Further, we investigate operational laws based on the Schweizer-Sklar t-norm and t-conorm and originate aggregation operators based on these laws in the environment of the bipolar complex fuzzy set, namely the bipolar complex fuzzy Schweizer-Sklar power averaging, bipolar complex fuzzy Schweizer-Sklar power weighted averaging, bipolar complex fuzzy Schweizer-Sklar power geometric, and bipolar complex fuzzy Schweizer-Sklar power weighted geometric operators, and then deduce decision-making techniques utilizing these operators. Afterward, we tackle a numerical example, with artificial data, of digital technology implementation under sustainable development, and find the area of sustainable development most in need of digital technology. Moreover, we reveal the impact of one of these digital technologies, artificial intelligence, in the field of healthcare, and study a numerical example with hypothetical data by employing the originated decision-making technique. Finally, we compare the deduced operators with numerous current operators to reveal their superiority and benefits.
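The Schweizer-Sklar family mentioned above has the closed form T_p(x, y) = max(x^p + y^p − 1, 0)^(1/p) for p ≠ 0, with the product t-norm as the limit p → 0 and the Łukasiewicz t-norm at p = 1; its dual t-conorm is S_p(x, y) = 1 − T_p(1 − x, 1 − y). A minimal sketch (the zero guard avoids 0 raised to a negative power):

```python
def ss_tnorm(x, y, p):
    """Schweizer-Sklar t-norm T_p(x, y) = max(x^p + y^p - 1, 0)^(1/p), p != 0.
    p = 1 gives the Lukasiewicz t-norm; p -> 0 recovers the product t-norm."""
    if x == 0.0 or y == 0.0:
        return 0.0  # T(0, y) = 0 for every t-norm; also avoids 0**p for p < 0
    return max(x ** p + y ** p - 1.0, 0.0) ** (1.0 / p)

def ss_tconorm(x, y, p):
    """Dual Schweizer-Sklar t-conorm S_p(x, y) = 1 - T_p(1 - x, 1 - y)."""
    return 1.0 - ss_tnorm(1.0 - x, 1.0 - y, p)
```

For example, ss_tnorm(0.8, 0.9, 1.0) reduces to max(0.8 + 0.9 − 1, 0) = 0.7, the Łukasiewicz value; the bipolar complex fuzzy operators of the paper apply this scalar machinery componentwise, which is beyond this sketch.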
Book
K. Menger [Statistical metrics. Proc. Natl. Acad. Sci. USA 28, 535-537 (1942)] proposed a probabilistic generalization of the theory of metric spaces by introducing the concept of probabilistic (statistical) metric space. Menger's paper constituted the starting point for a field of research known as the theory of probabilistic metric spaces. This monograph presents an organized body of advanced material on this theory, incorporating much of the authors' own research. It begins with the introductory Chapter 1, devoted to historical aspects of the theory. The remaining chapters are divided into two major parts. Chapters 2 through 7 develop the mathematical tools needed for the study of probabilistic metric spaces. This study properly begins with Chapter 8 and continues through Chapter 15. Chapter 8 contains the basic definitions and simple properties. Chapters 9, 10, and 11 are devoted to special classes of probabilistic spaces: random metric spaces, distribution-generated spaces, and transformation-generated spaces. Chapters 12 and 13 deal with topologies and generalized topologies. Chapter 14 is devoted to betweenness. The final chapter is concerned with related structures such as probabilistic normed, inner-product, and information spaces. An extensive literature accompanies the text. Clearly written, this unified and self-contained monograph on probabilistic metric spaces will be particularly useful to researchers interested in this field. It is also suitable as a text for a graduate course on selected topics in applied probability.
Article
“Someday, perhaps soon, we will build a machine that will be able to perform the functions of a human mind, a thinking machine” [88], the first sentence in Hillis’ book on the Connection Machine, a legendary computing machine that provided a large number of tiny processors and memory cells connected by a programmable communications network. Alan Turing probably had a very similar vision much earlier in the 20th century. What was real processor and real memory for Hillis was pencil and paper for Turing.
Conference Paper
The most important operations on fuzzy sets, namely intersection, union, and complementation, as well as the addition of fuzzy numbers, are discussed from the very general point of view of the theory of triangular norms. Triangular norms were introduced and studied in the theory of probabilistic metric spaces and provide a unifying concept for these operations.
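Pointwise, these t-norm-based set operations can be sketched as follows, here with the product t-norm, its dual t-conorm (the probabilistic sum), and the standard complement; the membership values and universe are illustrative:

```python
def t_prod(a, b):
    """Product t-norm."""
    return a * b

def s_prod(a, b):
    """Dual t-conorm of the product: the probabilistic sum."""
    return a + b - a * b

def complement(mu):
    """Standard fuzzy complement, applied pointwise."""
    return {u: 1.0 - m for u, m in mu.items()}

def intersect(mu_a, mu_b, t=t_prod):
    """Fuzzy intersection via a t-norm, applied pointwise."""
    return {u: t(mu_a[u], mu_b[u]) for u in mu_a}

def union(mu_a, mu_b, s=s_prod):
    """Fuzzy union via a t-conorm, applied pointwise."""
    return {u: s(mu_a[u], mu_b[u]) for u in mu_a}

# illustrative fuzzy sets over the universe {alice, bob}
young = {"alice": 0.9, "bob": 0.4}
tall  = {"alice": 0.5, "bob": 0.8}
print(intersect(young, tall))  # alice: 0.9*0.5 = 0.45, bob: 0.4*0.8 = 0.32
```

Swapping in a different t-norm/t-conorm pair (minimum/maximum, Łukasiewicz, and so on) changes the semantics of "and" and "or" without touching the rest of the machinery, which is exactly the unifying role the paper attributes to triangular norms.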