Article

A Mathematical Theory of Communication

Authors: Claude E. Shannon

Abstract

Bell System Technical Journal, vol. 27, pp. 379-423 (July) and pp. 623-656 (October), 1948


... Let X be a random variable defined on R^n with density function f(x), which is assumed to be differentiable. The differential entropy H(X) and the Fisher information I(X) of X are, respectively, defined to be H(X) = −∫ f(x) log f(x) dx and I(X) = ∫ |∇f(x)|²/f(x) dx. In 1948, Shannon [1] proposed the entropy power inequality (EPI) N_{X+Y} ≥ N_X + N_Y, where X and Y are independent random variables defined on R^n and N(X) := exp((2/n) H(X))/(2πe). As one of the most important inequalities in information theory, Shannon's EPI has many proofs and applications [2][3][4][5][6]. ...
... The main idea of the proof is that the proof of inequality (1) can be reduced to proving that a quadratic polynomial is a sum of squares (SOS) [18] of linear forms, which can be solved with SDP [19]. The SOS is given explicitly, which provides a rigorous proof of the theorem. ...
... In Theorem 1, we do not assume that X is a log-concave random variable. If the log-concavity condition is added, then from Toscani [20], 1/I(X_t) is concave, which implies inequality (1); the proof can be found in Lemma 2. A drawback of the SDP-based approach is that the proof is difficult for people to check: although the SOS gives an explicit proof of the theorem, it is too large to verify manually. ...
Article
Full-text available
Recently, Ledoux, Nair, and Wang proved that the Fisher information along the heat flow is log-convex in dimension one, that is, d²/dt² log(I(X_t)) ≥ 0 for n = 1, where X_t is a random variable with density function satisfying the heat equation. In this paper, we consider the high-dimensional case and prove that the Fisher information is square-root convex in dimension two, that is, d²/dt² √(I(X_t)) ≥ 0 for n = 2. The proof is based on the semidefinite programming approach.
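The entropy power and the EPI quoted above can be checked numerically in the Gaussian case, where the differential entropy has the closed form H(X) = (1/2) log(2πe σ²) for n = 1. A minimal Python sketch (illustrative variable names, NumPy only; not code from the cited works):

```python
import numpy as np

def gaussian_entropy(var):
    """Differential entropy (nats) of a 1-D Gaussian with variance `var`."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

def entropy_power(h, n=1):
    """Entropy power N(X) = exp(2 H(X) / n) / (2 pi e)."""
    return np.exp(2.0 * h / n) / (2 * np.pi * np.e)

var_x, var_y = 1.7, 0.6                       # variances of independent X and Y
n_x = entropy_power(gaussian_entropy(var_x))   # equals var_x for a Gaussian
n_y = entropy_power(gaussian_entropy(var_y))   # equals var_y for a Gaussian
n_sum = entropy_power(gaussian_entropy(var_x + var_y))  # X + Y is Gaussian too

# Shannon's EPI: N_{X+Y} >= N_X + N_Y, with equality in the Gaussian case.
print(n_sum, n_x + n_y)                        # both ~2.3
assert n_sum >= n_x + n_y - 1e-12
```

For Gaussians the entropy power reduces to the variance, so the EPI becomes additivity of variances; for non-Gaussian summands the inequality is generally strict.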
... On the other hand, different measures have been introduced for studying causal interactions, such as the directed transfer function [18], directed coherence [14,15], partial directed coherence [16,17], and Granger causality [12,93]. Conversely, more general approaches, such as mutual information [75,95] and transfer entropy [13,96], can investigate non-linear dependencies between the recorded signals, starting from the definition of entropy given by Shannon [95] and based on the estimation of probability distributions of the observed data. Importantly, under the Gaussian assumption [19], model-free and model-based measures converge and can be inferred from the linear parametric representation of multivariate vector autoregressive (VAR) models [12,20,24]. ...
... The statistical dependencies among electrophysiological signals can be evaluated using information theory. Concepts of mutual information, mutual information rate, and information transfer are widely used to assess the information exchanged between two interdependent systems [75,95], the dynamic interdependence between two systems per unit of time [22,23], and the dynamic information transferred to the target from the other connected systems [13,96], respectively. The main advantage of these approaches lies in the fact that they are probabilistic and can thus be stated in a fully model-free formulation. ...
Article
Full-text available
Understanding how different areas of the human brain communicate with each other is a crucial issue in neuroscience. The concepts of structural, functional and effective connectivity have been widely exploited to describe the human connectome, consisting of brain networks, their structural connections and functional interactions. Although high-spatial-resolution imaging techniques such as functional magnetic resonance imaging (fMRI) are widely used to map this complex network of multiple interactions, electroencephalographic (EEG) recordings offer high temporal resolution and are thus well suited to describe both spatially distributed and temporally dynamic patterns of neural activation and connectivity. In this work, we provide a technical account and a categorization of the most-used data-driven approaches to assess brain functional connectivity, intended as the study of the statistical dependencies between the recorded EEG signals. Different pairwise and multivariate, as well as directed and non-directed connectivity metrics are discussed with a pros–cons approach, in the time, frequency, and information-theoretic domains. The establishment of conceptual and mathematical relationships between metrics from these three frameworks, and the discussion of novel methodological approaches, will allow the reader to delve into the problem of inferring functional connectivity in complex networks. Furthermore, emerging trends for the description of extended forms of connectivity (e.g., high-order interactions) are also discussed, along with graph-theory tools exploring the topological properties of the network of connections provided by the proposed metrics. Applications to EEG data are reviewed. In addition, the importance of source localization, and the impacts of signal acquisition and pre-processing techniques (e.g., filtering, source localization, and artifact rejection) on the connectivity estimates are recognized and discussed. By going through this review, the reader can delve deeply into the entire process of EEG pre-processing and analysis for the study of brain functional connectivity, thereby learning to exploit novel methodologies and approaches to the problem of inferring connectivity within complex networks.
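As a concrete illustration of the information-theoretic measures mentioned in the snippets and the review above, the mutual information between two jointly Gaussian signals has the closed form I(X;Y) = -(1/2) log(1 - ρ²), against which a simple plug-in estimator can be checked. This is a generic sketch (made-up data, histogram estimator), not the connectivity toolchain discussed in the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated Gaussian "signals" with correlation rho.
rho, n = 0.6, 200_000
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

def mutual_information(x, y, bins=40):
    """Plug-in mutual information estimate (nats) from a 2-D histogram.
    Slightly biased upward for finite samples and finite bins."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

print(mutual_information(x, y))        # histogram estimate
print(-0.5 * np.log(1 - rho**2))       # Gaussian closed form, ~0.223 nats
```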
... Others have studied relaxations of the measure in Equation (1) obtained by introducing free parameters (see Kaniadakis [10] and Sharma and Mittal [11]). In the remainder of this paper, we will concentrate on the Renyi entropy (RE), which was first discussed in relation to coding and information theory by Renyi [12] as one of the first endeavors to extend the Shannon entropy (Shannon [13]). The definition of the RE is ...
... in which "log" represents the natural logarithm. Specifically, the Shannon differential entropy [13] can be calculated as ...
Article
Full-text available
The measurement of uncertainty across the lifetimes of engineering systems has drawn more attention in recent years. It is a helpful metric for assessing how predictable a system's lifetime is. In these circumstances, Renyi entropy, an extension of Shannon entropy, is particularly appealing. In this paper, we use the system signature to derive an explicit formula for the Renyi entropy of the residual lifetime of a coherent system when all components of the system have survived to time t. In addition, several results are established for this entropy, including bounds and ordering properties. The findings of this study make it possible to compare the residual lifetime predictability of two coherent systems with known signatures.
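For reference, the Renyi entropy mentioned in the snippets above is commonly defined, for a density f and order α ≠ 1, as H_α(X) = (1/(1−α)) log ∫ f(x)^α dx, and it recovers the Shannon differential entropy as α → 1. A short numerical check for an exponential lifetime distribution (illustrative only; the coherent-system formulas of the paper are not reproduced):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0                                  # exponential rate parameter
f = lambda x: lam * np.exp(-lam * x)       # lifetime density on [0, inf)

def renyi_entropy(alpha):
    """H_alpha = log( integral of f^alpha ) / (1 - alpha), alpha != 1 (nats)."""
    integral, _ = quad(lambda x: f(x) ** alpha, 0, np.inf)
    return np.log(integral) / (1.0 - alpha)

shannon = 1.0 - np.log(lam)                # closed form for Exp(lam): 1 - ln(lam)
print(renyi_entropy(0.999), shannon)       # Renyi -> Shannon as alpha -> 1
print(renyi_entropy(2.0))                  # closed form: ln(2 / lam)
```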
... In information theory, Shannon and Weaver [70] were the first to establish the concept of a divergence measure. The measure is a logarithmic function and is defined as follows. ...
... In this step, the enhanced logarithm function is employed to calculate the overall performances of the alternatives. It is derived from a non-linear function pioneered by Shannon and Weaver [70] and variants have been researched [71][72][73]. The computation is performed using the following equation: ...
Article
Full-text available
In multi-criteria decision-making (MCDM) research, the criteria weights are crucial components that significantly impact the results. Many researchers have proposed numerous methods to establish the weights of the criteria. This paper provides a modified technique, the fuzzy method based on the removal effects of criteria (MEREC), by modifying the normalization technique and enhancing the logarithm function used to assess the overall performance of alternatives in the weighting process. Since MCDM problems are intrinsically ambiguous or complex, fuzzy theory is used to interpret linguistic phrases as triangular fuzzy numbers. Comparative analyses were conducted through a case study of staff performance appraisal at a Malaysian academic institution, and a simulation-based study was used to validate the effectiveness and stability of the presented method. The results of the fuzzy MEREC are compared with those from several other objective weighting techniques based on correlation coefficients, outlier tests and central processing unit (CPU) time. The comparative analyses demonstrate that the fuzzy MEREC weights are valid, as the correlation coefficient values are consistent throughout the study. Furthermore, the simulation-based study demonstrates that, even in the presence of outliers in the set of alternatives, fuzzy MEREC is able to offer consistent weights for the criteria. Fuzzy MEREC also requires less CPU time than the existing MEREC techniques. Hence, the modified method is a suitable and efficient alternative for computing objective criteria weights in MCDM problems.
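For readers unfamiliar with entropy-based objective weighting in MCDM, the classical Shannon-entropy weighting scheme that MEREC-style methods refine can be sketched as follows. This is a generic illustration with made-up data, not the fuzzy MEREC procedure of the paper:

```python
import numpy as np

# Decision matrix: rows = alternatives, columns = criteria (made-up benefit-type data).
X = np.array([
    [7.0, 0.60, 120.0],
    [5.5, 0.80,  90.0],
    [8.2, 0.55, 150.0],
    [6.1, 0.70, 110.0],
])

# 1. Normalize each column so it behaves like a probability distribution.
P = X / X.sum(axis=0)

# 2. Shannon entropy of each criterion, scaled to [0, 1] by log(m) with m alternatives.
m = X.shape[0]
E = -(P * np.log(P)).sum(axis=0) / np.log(m)

# 3. Degree of divergence: criteria whose values differ more across alternatives
#    carry more information and therefore receive larger weights.
d = 1.0 - E
w = d / d.sum()
print(np.round(w, 4))   # objective criteria weights, summing to 1
```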
... The Shannon, or block, entropy [38] is defined by H_n = −∑_{a_1,a_2,…,a_n} P(a_1, a_2, …, a_n) log_2 P(a_1, a_2, …, a_n). ...
Article
Full-text available
We propose a new high-speed secret key distillation system via public discussion based on the common randomness contained in the speech signal of the protocol participants. The proposed system consists of subsystems for quantization, advantage distillation, information reconciliation, an estimator for predicting conditional Renyi entropy, and universal hashing. The parameters of the system are optimized in order to achieve the maximum key distillation rate. By introducing a deep neural block for the prediction of conditional Renyi entropy, the lengths of the distilled secret keys are adaptively determined. The optimized system gives a key rate of over 11% and negligible information leakage to the eavesdropper, while NIST tests show the high cryptographic quality of produced secret keys. For a sampling rate of 16 kHz and quantization of input speech signals with 16 bits per sample, the system provides secret keys at a rate of 28 kb/s. This speed opens the possibility of wider application of this technology in the field of contemporary information security.
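The block entropy H_n quoted in the snippet above can be estimated directly from symbol counts. A minimal plug-in estimator in Python (illustrative; the quantization and advantage-distillation stages of the proposed system are not reproduced):

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, n):
    """Plug-in estimate of H_n = -sum P(a_1..a_n) log2 P(a_1..a_n), in bits."""
    blocks = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
seq = rng.integers(0, 2, size=100_000)       # fair binary source
print(block_entropy(seq, 1))                 # ~1 bit
print(block_entropy(seq, 3))                 # ~3 bits for an i.i.d. source
```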
... Reference [1] made a significant contribution to statistics by coining the term "entropy", which refers to the measurement of uncertainty in a probability distribution. If X is a nonnegative random variable (rv) that admits an absolutely continuous cumulative distribution function (cdf) F(x) with the corresponding probability density function (pdf) f (x), then the Shannon entropy concept associated with X can be defined as follows: ...
Article
Full-text available
Shannon developed the idea of entropy in 1948, which relates to the measure of uncertainty associated with a random variable X. The contribution of the extropy function as a dual complement of entropy is one of the key modern results based on Shannon's work. In order to develop the inferential aspects of the extropy function, this paper proposes a non-parametric kernel-type estimator as a new method of measuring uncertainty. Here, the observations exhibit α-mixing dependence. Asymptotic properties of the estimator are proved under appropriate regularity conditions. For comparison's sake, a simple non-parametric estimator is proposed, and in this respect, the performance of the estimator is investigated using a Monte Carlo simulation study based on mean-squared error and two real-life data sets.
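A very simple kernel-type construction in the spirit of the article can be built from a Gaussian KDE: since H(X) = −E[log f(X)] and the extropy is commonly defined as J(X) = −(1/2)∫ f(x)² dx = −(1/2) E[f(X)], resubstitution estimates follow by averaging the fitted density over the sample. The sketch below uses i.i.d. data and is not the α-mixing estimator analyzed in the paper:

```python
import numpy as np
from scipy.stats import gaussian_kde

x = np.random.default_rng(2).exponential(scale=1.0, size=5000)  # Exp(1) sample

kde = gaussian_kde(x)             # kernel density estimate f_hat
f_hat = kde(x)                    # f_hat evaluated at the sample points

H_hat = -np.mean(np.log(f_hat))   # resubstitution entropy estimate (nats)
J_hat = -0.5 * np.mean(f_hat)     # resubstitution extropy estimate

# Boundary bias of the KDE near x = 0 makes these rough for the exponential.
print(H_hat)   # true differential entropy of Exp(1) is 1
print(J_hat)   # true extropy of Exp(1) is -1/4, since integral of f^2 is 1/2
```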
... The reliability of a system decreases as uncertainty increases, and systems with longer lifetimes and lower uncertainty are better systems (see, e.g., Ebrahimi and Pellerey [1]). It has found applications in numerous areas described in Shannon's seminal work [2]. Information theory provides a measure of the uncertainty associated with a random phenomenon. ...
Article
Full-text available
Measuring the uncertainty of the lifetime of technical systems has become increasingly important in recent years. This criterion is useful for measuring the predictability of a system over its lifetime. In this paper, we consider a coherent system consisting of n components with the property that, at time t, all components of the system are alive. We then apply the system signature to determine and use the Tsallis entropy of the remaining lifetime of the coherent system. This is a useful criterion for measuring the predictability of the lifetime of a system. Various results, such as bounds and order properties for this entropy, are investigated. The results of this work can be used to compare the predictability of the remaining lifetime of two coherent systems with known signatures.
... [Figure captions: (1) real patient DNA sequence, N = 745 bases (UB leukemia patient DB); (2) simulated 4-element sequence from a uniform distribution, N = 745 bases; panels (a) nucleotide distribution, (b) nucleotide frequencies (A = 1; C = 2; G = 3; T = 4).] Claude Shannon originally proposed the formula of information entropy in 1948 [21] ...
Preprint
Full-text available
The purpose of this study is to provide an accurate formula for calculating the entropy of short DNA sequences and to demonstrate how to use it to examine leukemia patient survival. We used the IDIBAPS leukemia patient database with 117 anonymized records. A generalized form of the Robust Entropy Estimator (EnRE) for short DNA sequences was proposed and its key features were shown. The survival analysis was performed using the statistical package IBM SPSS. EnRE entropy was calculated for leukemia patients in two samples: (A) two groups divided by the median EnRE, and (B) two groups of patients formed according to their membership in the 1st and 4th quartiles of EnRE. The results of the survival analysis are statistically significant: (A) p < 0.05; (B) p < 0.005. The death hazard for a patient with EnRE below the median is 1.556 times that of a patient with EnRE above the median, and the death hazard for a patient in the 1st quartile (lowest EnRE) is 2.143 times that of a patient in the 4th quartile (highest EnRE). The transition from median-based to quartile-based patient groups, with greater EnRE differentiation, confirmed the unique significance of the entropy of DNA sequences for leukemia patient survival.
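A naive plug-in estimate of the Shannon entropy of a short nucleotide sequence, of the kind that robust estimators such as EnRE aim to improve on, looks like this (illustrative only; the EnRE formula itself is not reproduced):

```python
import numpy as np
from collections import Counter

def sequence_entropy_bits(seq):
    """Plug-in Shannon entropy of the nucleotide composition, in bits."""
    counts = np.array(list(Counter(seq).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

dna = "ATGCGCTAGCTAACGTTTAGC" * 35     # toy sequence, ~735 bases
print(sequence_entropy_bits(dna))      # <= 2 bits; equals 2 only if all four bases are equally likely
# Plug-in estimates are biased low for short sequences, which is what motivates
# robust or corrected estimators such as the EnRE proposed in the preprint.
```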
... This information is processed and interpreted by gene regulatory processes to make 'decisions' about responding to environmental challenges and carrying out physiological, neural, and behavioral changes. Thus, nutrigenomic mechanisms could provide a critical path for information flow in biological systems (Shannon, 1948;Reinagel, 2000;Smith, 2000;Fabris, 2009). A clear advantage could reside in their ability to amplify transient, and often minor, variations in nutrient and activity levels into strong reactions, which can be used to orchestrate responses to current and future environmental challenges. ...
Article
Full-text available
Diet profoundly influences brain physiology, but how metabolic information is transmuted into neural activity and behavior changes remains elusive. Here, we show that the metabolic enzyme O-GlcNAc Transferase (OGT) moonlights on the chromatin of the D. melanogaster gustatory neurons to instruct changes in chromatin accessibility and transcription that underlie sensory adaptations to a high-sugar diet. OGT works synergistically with the Mitogen Activated Kinase/Extracellular signal Regulated Kinase (MAPK/ERK) rolled and its effector stripe (also known as EGR2 or Krox20) to integrate activity information. OGT also cooperates with the epigenetic silencer Polycomb Repressive Complex 2.1 (PRC2.1) to decrease chromatin accessibility and repress transcription in the high-sugar diet. This integration of nutritional and activity information changes the taste neurons' responses to sugar and the flies' ability to sense sweetness. Our findings reveal how nutrigenomic signaling generates neural activity and behavior in response to dietary changes in the sensory neurons.
... be the number of genes in the regulon gene set of the rth regulator, where I{·} is the indicator function, then the discrete joint sampling distribution has the dimensionality (N r × G). Subsequently, we recognize that from an information theoretic perspective, both versions of the alternative hypothesis reduce the Shannon entropy [18] of the discrete joint sampling distribution for the gene set members due to an increase in the probability mass at extremes of the nonparametric differential gene expression signature. Furthermore, if we consider the magnitude of the difference in the protein activity between the test phenotype (A) and the reference phenotype (B), we conclude that a greater difference in the activity of the rth regulator corresponds with a greater increase in probability mass at the extremes of the nonparametric differential gene expression signature and therefore a more substantial reduction in the joint Shannon entropy of the discrete joint probability distribution. ...
Article
Full-text available
Gene sets are being increasingly leveraged to make high-level biological inferences from transcriptomic data; however, existing gene set analysis methods rely on overly conservative, heuristic approaches for quantifying the statistical significance of gene set enrichment. We created Nonparametric analytical-Rank-based Enrichment Analysis (NaRnEA) to facilitate accurate and robust gene set analysis with an optimal null model derived using the information theoretic Principle of Maximum Entropy. By measuring the differential activity of ~2500 transcriptional regulatory proteins based on the differential expression of each protein’s transcriptional targets between primary tumors and normal tissue samples in three cohorts from The Cancer Genome Atlas (TCGA), we demonstrate that NaRnEA critically improves on two widely used gene set analysis methods: Gene Set Enrichment Analysis (GSEA) and analytical-Rank-based Enrichment Analysis (aREA). We show that the NaRnEA-inferred differential protein activity is significantly correlated with differential protein abundance inferred from independent, phenotype-matched mass spectrometry data in the Clinical Proteomic Tumor Analysis Consortium (CPTAC), confirming the statistical and biological accuracy of our approach. Additionally, our analysis crucially demonstrates that the sample-shuffling empirical null models leveraged by GSEA and aREA for gene set analysis are overly conservative, a shortcoming that is avoided by the newly developed Maximum Entropy analytical null model employed by NaRnEA.
... Furthermore, an alpha rarefaction analysis was performed using the MetONTIIME statistical package [22,23] in order to examine the association between the sequencing depth and the richness of the bacterial community under study. Alpha diversity at the genus level was assessed using two distinct metrics, Shannon diversity index [24] and observed features. Finally, based on the classification results, pie charts and bar plots were built using Microsoft Office Excel (Redmond, WA, USA). ...
Article
Full-text available
Mediterranean mussels (Mytilus galloprovincialis), due to their nutritional mechanisms which involve filtering huge amounts of water, are affected by seawater pollution and can host microbial diversity of environmental origin, as well as pathogenic bacteria that must be constantly monitored. Herein, we applied a Next Generation Sequencing (NGS) metabarcoding approach in order to study the M. galloprovincialis microbiota. Collection of samples was conducted during winter and summer months from various mussel farm zones located in specific farm regions in the Thermaikos gulf, the northern Aegean Sea, Greece. A microbiological test was performed for the enumeration of Escherichia coli and the presence of Salmonella sp. DNA extraction and amplification of the whole bacterial 16S rRNA gene, followed by NGS amplicon sequencing and taxonomic classification, were carried out. Statistically significant differences (p < 0.05) in the abundance of the most dominant bacterial phyla, families and genera between winter and summer time periods, regions, as well as zones within each region of sampling, were evaluated with z-score computation. According to the obtained results, the most prevalent taxa at the genus level were Mycoplasma (12.2%), Anaplasma (5.8%), Ruegeria (5.2%) and Mariniblastus (2.1%). Significant differences in the abundance of the most dominant genera were found at all levels of comparison (seasons, regions and zones within each region), highlighting the dynamic character of microorganisms, which might be affected by microenvironmental, temporal and spatial changes. The present research contributes to the characterization of M. galloprovincialis microbiome in areas that have not been studied previously, setting the baseline for future, more thorough investigations of the specific bivalve species and its bacterial profile in the above geographic regions.
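The Shannon diversity index used in the alpha-diversity assessment is computed from relative genus abundances. A minimal sketch with hypothetical counts (not the MetONTIIME/QIIME pipeline itself):

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon diversity H' = -sum p_i ln p_i over taxa with nonzero counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# Hypothetical genus-level read counts for one mussel sample.
genus_counts = {"Mycoplasma": 1220, "Anaplasma": 580, "Ruegeria": 520,
                "Mariniblastus": 210, "other": 7470}
H = shannon_diversity(list(genus_counts.values()))
print(H)                                 # natural-log Shannon index
print(H / np.log(len(genus_counts)))     # Pielou's evenness, in [0, 1]
```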
... To prevent malicious attackers from guessing the connection between sensitive data and user identity through background knowledge attacks, this paper proposes an active protection strategy that protects the privacy of data through the information value of the data. According to the information entropy theory proposed by Shannon [15], if a transaction or piece of information can take a larger number of possible values, it contains more information and is then more likely to identify a data record uniquely. In extreme cases, such as ID card numbers, the information entropy is very large and the value can uniquely identify a person. ...
Article
Full-text available
In recent years, cloud computing has attracted extensive attention from industry and academia due to its convenience and ubiquity. As a new Internet-based IT service model, cloud computing has brought revolutionary changes to traditional computing and storage services. More and more individual users and enterprises are willing to deploy their own data and applications on cloud platforms, but the accompanying security issues have become an obstacle to the development of cloud computing. Multi-tenancy and virtualization technologies are the main reasons why cloud computing faces many security problems. Through the virtualization of storage resources, multi-tenant data are generally stored on shared physical storage resources, and labels are generally used to distinguish the data of different tenants. However, such simple labels cannot resist attacks by a potentially malicious tenant, and the data still carry a risk of leakage. Based on this, this paper proposes a data partitioning method for the multi-tenant scenario to prevent privacy leakage of user data. We demonstrate the use of the proposed approach in protecting patient data in medical records in health informatics. Experiments show that the proposed algorithm can partition attributes at a finer granularity and effectively protect the sensitive information in the data.
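The entropy argument in the snippet above, i.e. that attributes with more possible values behave like identifiers, can be made concrete by ranking table columns by their Shannon entropy. A hypothetical sketch, not the partitioning algorithm of the paper:

```python
import math
from collections import Counter

def column_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of a column."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy patient records (hypothetical attributes).
records = [
    {"gender": "F", "blood_type": "A", "zip": "10001", "patient_id": "P001"},
    {"gender": "M", "blood_type": "O", "zip": "10001", "patient_id": "P002"},
    {"gender": "F", "blood_type": "B", "zip": "10002", "patient_id": "P003"},
    {"gender": "M", "blood_type": "A", "zip": "10003", "patient_id": "P004"},
]

for attr in records[0]:
    ent = column_entropy([r[attr] for r in records])
    print(f"{attr:11s} {ent:.2f} bits")
# patient_id reaches the maximum log2(4) = 2 bits: it uniquely identifies a record,
# so high-entropy attributes are the ones to separate or mask when partitioning data.
```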
... Shape grammar shares the same explanatory principles as Chomsky's generative grammars, with its elegant formalism of "seeing and doing" [70], but similarly, it also lacks the expressivity of Norvig's statistical and probabilistic models. If one is to agree with Chomsky that a linguist's job is to define rules that could distinguish between what is grammatical and ungrammatical, then it would not be ... [footnotes: 68 (Shannon 1948); 69 (Breiman 2001)] ...
... A Mathematical Theory of Communication, Claude E. Shannon's 1948 article [23], was the first to establish the idea of entropy. Entropy is "a measure of the uncertainty associated with a random variable", according to Wikipedia. ...
Article
Full-text available
In this work, we focus on the centered Hausdorff measure, the packing measure, and the Hewitt–Stromberg measure that determines the modified lower box dimension of Moran fractal sets. The equivalence of these measures is shown for a class of Moran sets satisfying the strong separation condition. We give a sufficient condition for the equality of the Hewitt–Stromberg dimension, Hausdorff dimension, and packing dimensions. As an application, we obtain some relevant conclusions about the Hewitt–Stromberg measures and dimensions of the image measure of t-invariant ergodic Borel probability measures. Moreover, we give some statistical interpretation to dimensions and the corresponding geometrical measures.
... The investigation of fundamental bounds in the form of entropy inequalities, limiting the distribution of information within a multipartite system, marked the birth of information theory as a field of study [1]. In the same way, the exploration of fundamental limitations on the distribution of quantum information within a system lies at the core of the rapidly evolving field of quantum information theory. ...
Preprint
Full-text available
We present a novel inequality on the purity of a bipartite state depending solely on the difference of the local Bloch vector lengths. For two qubits this inequality is tight for all marginal states, and so it extends the previously known solution for the 2-qubit marginal problem and opens a new research avenue. We further use this inequality to construct a 3-dimensional Bloch model of the 2-qubit quantum state space in terms of Bloch lengths, thus providing a geometrically pleasing visualization of this difficult-to-access high-dimensional state space. This allows us to characterize quantum states relying on a strongly reduced set of parameters alone and to investigate the interplay between local properties of the marginal systems and global properties encoded in the correlations.
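The quantities entering such an inequality, the global purity Tr ρ² and the local Bloch vector lengths of the two marginals, can be computed directly for any 2-qubit density matrix. A NumPy sketch using a Werner state as an example (the new purity inequality itself is not reproduced here):

```python
import numpy as np

# Pauli matrices.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def partial_trace(rho, keep):
    """Reduced state of a 2-qubit density matrix; keep = 0 (first) or 1 (second)."""
    r = rho.reshape(2, 2, 2, 2)
    return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)

def bloch_length(rho1):
    """Length of the Bloch vector r_i = Tr(rho1 sigma_i) of a single-qubit state."""
    return np.linalg.norm([np.real(np.trace(rho1 @ s)) for s in (sx, sy, sz)])

# Werner state: p |psi-><psi-| + (1 - p) I/4.
psi_minus = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
p = 0.7
rho = p * np.outer(psi_minus, psi_minus.conj()) + (1 - p) * np.eye(4) / 4

purity = np.real(np.trace(rho @ rho))
rA = bloch_length(partial_trace(rho, 0))
rB = bloch_length(partial_trace(rho, 1))
print(purity, rA, rB)   # Werner marginals are maximally mixed: rA = rB = 0
```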
Article
This study aims to measure the current cost of living of European Union (EU) countries using Multi-Criteria Decision-Making (MCDM) methods. The data required for the research were obtained from the Numbeo website and cover mid-2021. The study includes 27 alternatives and five criteria (rent index, cost of living plus rent index, groceries index, restaurant price index, and local purchasing power index). The Entropy method was used to weight the criteria, while an integrated COPRAS-ARAS model was used to evaluate the alternatives. The robustness and reliability of the results were verified through sensitivity analysis. In this context, the results obtained with the Entropy-based COPRAS-ARAS integrated model were compared with those obtained with the Entropy-based SAW, PIV, ROV, CoCoSo and MARCOS methods. In the final step, the results obtained from the various MCDM methods were combined into a rational final ranking using the Copeland method. The study found Romania to be the cheapest country in terms of current cost of living, while Luxembourg was found to be the most expensive. This study is important as the first to address current cost-of-living analysis with MCDM methods, and it is expected to fill a gap in the literature.
Chapter
Full-text available
To improve the user experience, service providers may systematically record and analyse user interactions with a service using event logs. User journeys model these interactions from the user’s perspective. They can be understood as event logs created by two independent parties, the user and the service provider, both controlling their share of actions. We propose multi-party event logs as an extension of event logs with information on the parties, allowing user journeys to be analysed as weighted games between two players. To reduce the size of games for complex user journeys, we identify decision boundaries at which the outcome of the game is determined. Decision boundaries identify subgames that are equivalent to the full game with respect to the final outcome of user journeys. The decision boundary analysis from multi-party event logs has been implemented and evaluated on the BPI Challenge 2017 event log with promising results, and can be connected to existing process mining pipelines.
Article
Full-text available
The one-dimensional cellular automata (CA) system detailed herein uses a hybrid mechanism to attain reversibility, and this approach is adapted to create a novel block cipher algorithm called HCA (Hybrid Cellular Automata). CA are widely used for modeling complex systems and display inherently parallel properties. Therefore, applications derived from CA tend to fit very well in the current computational paradigm, where multithreading potential is very desirable. The HCA system has recently been granted a patent by the Brazilian agency INPI. Analyses performed on the model are presented here, including a theoretical discussion of its reversibility. Finally, the cryptographic robustness of HCA is empirically evaluated through avalanche property compliance and the NIST randomness suite.
Article
Classical systems of mereology identify a maximal set of jointly exhaustive and pairwise disjoint (RCC5) relations. The amount of information that is carried by each member of this set of (crisp) relations is determined by the number of bits of information that are required to distinguish all the members of the set. It is postulated in this paper that vague mereological relations are limited in the amount of information they can carry. That is, if a crisp mereological relation can carry N bits of information, then a vague mereological relation can carry only 1 ⩽ n < N bits. The aim of this paper is to explore these ideas in the context of various systems of vague mereological relations. The resulting formalism is non-classical in the sense of quantum information theory.
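As a worked instance of the information bound discussed above: distinguishing the five jointly exhaustive and pairwise disjoint RCC5 relations requires N = log_2 5 ≈ 2.32 bits, so, on the paper's postulate, a vague mereological relation would carry only 1 ⩽ n < 2.32 bits, i.e. at best enough to distinguish coarser groupings of the five crisp relations.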
Chapter
Logic minimization plays a significant part in decreasing the complexity of a circuit, since the number of gates is diminished… To date, conventional approaches such as Boolean laws, the Karnaugh map, and Quine-McCluskey have been used for Boolean expression simplification. These approaches have several drawbacks, to name a few: logical synthesis complexity, multiple solutions when using K-maps, the number of cells increasing exponentially with the number of variables in a Boolean function, and the greater computational time needed when solving with Quine-McCluskey. Moreover, the implementation of the K-map and Quine-McCluskey methods is difficult in the logic synthesis of chip design. The solution to all of the above drawbacks is the use of Binary Decision Diagrams, which are faster and applicable to large circuits. The main purpose of this work is to reduce Boolean expressions considering the DC (don't care) function and also to calculate the entropy of the simplified Boolean expression using binary decision diagrams.
Keywords: Binary decision diagrams; Boolean expression; Entropy; Logic gates; Karnaugh map; Minimal SOP; Quine-McCluskey method
Chapter
This chapter presents results on learning control of quantum systems. In Sect. 5.2, two differential evolution (DE) algorithms are proposed and numerical results on quantum control via DE are presented for inhomogeneous open quantum ensembles and synchronization of quantum networks. In Sect. 5.3, closed-loop learning control using DE is applied to ultrafast quantum engineering, and experimental results on optimal and robust control of molecules using femtosecond laser pulses are presented. Numerical and experimental results on learning control design of quantum autoencoders are presented in Sect. 5.4. In Sect. 5.5, some progress on quantum control using reinforcement learning is introduced.
Article
Full-text available
Building retrofit has received renewed interest in recent years, driven by energy-savings and indoor environmental quality goals. Digital technologies such as building performance simulation and optimization algorithms have been used to identify optimal retrofit schemes, yet the existing approaches are limited by the slow running speed of physics-based models and sub-optimal results. This study describes a novel framework, the Building Performance Optimization using Artificial Neural Network (BPO-ANN), which can automatically identify optimal building retrofit schemes. A robust Artificial Neural Network model was developed and validated as a surrogate to rapidly assess building performances, which was then connected to a genetic algorithm in search of Pareto optimal solutions. The impact of key design attributes on building performances has been assessed using sensitivity analysis. The BPO-ANN framework has been tested in a high-performing campus building in Northern China under two competing objectives: building energy demand and occupant thermal comfort. It can automatically identify optimal design schemes, which were expected to achieve an energy saving of 4% and reduce the annual thermal discomfort percentage by 4%. Sensitivity analysis suggested that window-to-wall ratio and HVAC setpoint have contributed the most to the performances of the campus building, followed by the roof U-value and wall U-value. The study has contributed methodologically to simulation-based optimization methods, with novelties in the use of neural network algorithms to accelerate the otherwise time-consuming physics-based simulation models. It has also contributed a robust procedure for the tuning of hyperparameters in neural network models, with marked improvements in model prediction and computational efficiency.
Article
Full-text available
Biochemical chain reactions are signal transduction cascades that can transmit biological information about the intracellular environment. In this study, we modelled a chain reaction as a code string in order to apply information theory. Herein, we assumed that cell signal transduction selects a strategy that maximizes the transduced signal per signal event duration. To investigate this, we calculated the information transmission capacity of the reaction chain by maximizing the average entropy production rate per reaction time, reflecting the idea of the entropy coding method. Moreover, we defined a signal cascade trajectory. Subsequently, we found that the logarithm of the forward-to-reverse transition ratio per reaction time is equal to the entropy production rate, which yields a form of the fluctuation theorem in signal transduction. Our findings suggest the application of information entropy theory to analysing signal transduction.
Article
Full-text available
Statistics of natural scenes are not uniform—their structure varies dramatically from ground to sky. It remains unknown whether these nonuniformities are reflected in the large-scale organization of the early visual system and what benefits such adaptations would confer. Here, by relying on the efficient coding hypothesis, we predict that changes in the structure of receptive fields across visual space increase the efficiency of sensory coding. Using the mouse (Mus musculus) as a model species, we show that receptive fields of retinal ganglion cells change their shape along the dorsoventral retinal axis, with a marked surround asymmetry at the visual horizon, in agreement with our predictions. Our work demonstrates that, according to principles of efficient coding, the panoramic structure of natural scenes is exploited by the retina across space and cell types.
Chapter
Statistical and machine learning methods have many applications in the environmental sciences, including prediction and data analysis in meteorology, hydrology and oceanography; pattern recognition for satellite images from remote sensing; management of agriculture and forests; assessment of climate change; and much more. With rapid advances in machine learning in the last decade, this book provides an urgently needed, comprehensive guide to machine learning and statistics for students and researchers interested in environmental data science. It includes intuitive explanations covering the relevant background mathematics, with examples drawn from the environmental sciences. A broad range of topics is covered, including correlation, regression, classification, clustering, neural networks, random forests, boosting, kernel methods, evolutionary algorithms and deep learning, as well as the recent merging of machine learning and physics. End‑of‑chapter exercises allow readers to develop their problem-solving skills, and online datasets allow readers to practise analysis of real data.
Article
Full-text available
Speech and song have been transmitted orally for countless human generations, changing over time under the influence of biological, cognitive, and cultural pressures. Cross-cultural regularities and diversities in human song are thought to emerge from this transmission process, but testing how underlying mechanisms contribute to musical structures remains a key challenge. Here, we introduce an automatic online pipeline that streamlines large-scale cultural transmission experiments using a sophisticated and naturalistic modality: singing. We quantify the evolution of 3,424 melodies orally transmitted across 1,797 participants in the United States and India. This approach produces a high-resolution characterization of how oral transmission shapes melody, revealing the emergence of structures that are consistent with widespread musical features observed cross-culturally (small pitch sets, small pitch intervals, and arch-shaped melodic contours). We show how the emergence of these structures is constrained by individual biases in our participants (vocal constraints, working memory, and cultural exposure), which determine the size, shape, and complexity of evolving melodies. However, their ultimate effect on population-level structures depends on social dynamics taking place during cultural transmission. When participants recursively imitate their own productions (individual transmission), musical structures evolve slowly and heterogeneously, reflecting idiosyncratic musical biases. When participants instead imitate others' productions (social transmission), melodies rapidly shift toward homogeneous structures, reflecting shared structural biases that may underpin cross-cultural variation. These results provide the first quantitative characterization of the rich collection of biases that oral transmission imposes on music evolution, giving us a new understanding of how human song structures emerge via cultural transmission.
Article
Full-text available
Deep learning is widely used in various fields due to the advancement of algorithms, the enrichment of high-efficiency databases, and the increase in computing power. Especially in satellite communication, the learning and parallel computing capabilities of neural networks make them ideal for decoding. Many researchers have recently applied deep learning neural networks to decode high-density parity check (HDPC) codes (such as BCH and RS codes), improving the decoder's performance. This review aims to provide general insights on applying neural network decoders to satellite communications. Due to the neural network's learning ability, a neural network-based decoder can be trained to adjust its weights, thereby reducing the influence of non-white noise in satellite communications, such as interference between the satellite and the terrestrial network and mutual interference among satellites. To compensate for non-white noise, short cycles in the Tanner graph, and unreliable information, a decoder system model for satellite communication constructed from three neural networks is presented.
Article
In some experiments, such as stress testing and industrial quality control experiments, only values that are larger or smaller than all previous ones are observed. The study of such extremes is of great importance. Extensive research on record values using the distribution function approach is available in the literature; however, a quantile-based study of the same has not been considered so far. Motivated by this, in this article we introduce a quantile function approach to record values, which is an equivalent alternative to the traditional distribution function approach. We study various properties of quantile-based measures of record values. We also obtain some stochastic comparison and ageing properties of quantile-based record values. The L-moment estimation method for the hazard quantile function of record values is explained using a real-data example.
Article
Carbon fiber reinforced polymer (CFRP) materials have been widely used in aerospace and other fields because of their excellent properties, such as high temperature resistance and corrosion resistance, so nondestructive inspection technology for CFRP materials has become a hot research topic. In this paper, we propose a method based on infrared thermography and the Attention U-Net algorithm to characterize the shape of defects on CFRP material surfaces. Firstly, the CFRP surface is scanned by a line laser and the trend of its temperature distribution is recorded using an infrared thermography camera. Subsequently, temperature analysis and the image-information entropy value are calculated for individual defect image blocks in order to select images with clear defect contours. Next, the Attention U-Net is used to segment the defects in the image blocks, and the defect shape is characterized. By calculating the evaluation indexes of image segmentation, the method in this paper achieves 99.57% accuracy, 97.06% recall, and 96.63% precision, and the processing time for a single image is 0.13 s. Finally, the algorithm of this paper is compared with other algorithms to verify the advantages of this research in the task of detecting surface defects in CFRP materials quickly and with high accuracy.
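The image-information entropy used here to pick blocks with clear defect contours is typically the Shannon entropy of the gray-level histogram. A small NumPy sketch (illustrative block sizes and data; not the authors' implementation):

```python
import numpy as np

def image_entropy(block, levels=256):
    """Shannon entropy (bits) of the gray-level histogram of an image block."""
    hist, _ = np.histogram(block, bins=levels, range=(0, levels))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(3)
flat_block = np.full((64, 64), 128, dtype=np.uint8)           # uniform background
noisy_block = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # high-contrast content

print(image_entropy(flat_block))    # 0.0 -- carries no information, block can be discarded
print(image_entropy(noisy_block))   # close to 8 bits -- candidate for segmentation
```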
Article
Full-text available
We computationally explore the relationship between surface–subsurface exchange and hydrological response in a headwater-dominated high elevation, mountainous catchment in East River Watershed, Colorado, USA. In order to isolate the effect of surface–subsurface exchange on the hydrological response, we compare three model variations that differ only in soil permeability. Traditional methods of hydrograph analysis that have been developed for headwater catchments may fail to properly characterize catchments, where catchment response is tightly coupled to headwater inflow. Analyzing the spatially distributed hydrological response of such catchments gives additional information on the catchment functioning. Thus, we compute hydrographs, hydrological indices, and spatio-temporal distributions of hydrological variables. The indices and distributions are then linked to the hydrograph at the outlet of the catchment. Our results show that changes in the surface–subsurface exchange fluxes trigger different flow regimes, connectivity dynamics, and runoff generation mechanisms inside the catchment, and hence, affect the distributed hydrological response. Further, changes in surface–subsurface exchange rates lead to a nonlinear change in the degree of connectivity—quantified through the number of disconnected clusters of ponding water—in the catchment. Although the runoff formation in the catchment changes significantly, these changes do not significantly alter the aggregated streamflow hydrograph. This hints at a crucial gap in our ability to infer catchment function from aggregated signatures. We show that while these changes in distributed hydrological response may not always be observable through aggregated hydrological signatures, they can be quantified through the use of indices of connectivity.
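The connectivity index described in the abstract, the number of disconnected clusters of ponded cells, can be computed from a gridded ponding-depth field with standard connected-component labelling. A sketch with synthetic data (assuming SciPy is available; the depth field and threshold are made up, not the authors' model output):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(4)

# Synthetic ponding-depth field (m) on a 100 x 100 grid of the catchment surface.
depth = rng.gamma(shape=0.3, scale=0.02, size=(100, 100))

ponded = depth > 0.01                       # cells with standing water above a threshold
labels, n_clusters = ndimage.label(ponded)  # 4-connected components by default

print(n_clusters)     # number of disconnected ponded clusters
print(ponded.mean())  # ponded area fraction, a companion metric
# Fewer, larger clusters indicate higher surface connectivity; many small clusters
# indicate disconnected runoff generation, as discussed in the abstract.
```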
Article
Full-text available
Understanding speech requires mapping fleeting and often ambiguous soundwaves to meaning. While humans are known to exploit their capacity to contextualize to facilitate this process, how internal knowledge is deployed online remains an open question. Here, we present a model that extracts multiple levels of information from continuous speech online. The model applies linguistic and nonlinguistic knowledge to speech processing, by periodically generating top-down predictions and incorporating bottom-up incoming evidence in a nested temporal hierarchy. We show that a nonlinguistic context level provides semantic predictions informed by sensory inputs, which are crucial for disambiguating among multiple meanings of the same word. The explicit knowledge hierarchy of the model enables a more holistic account of the neurophysiological responses to speech compared to using lexical predictions generated by a neural network language model (GPT-2). We also show that hierarchical predictions reduce peripheral processing via minimizing uncertainty and prediction error. With this proof-of-concept model, we demonstrate that the deployment of hierarchical predictions is a possible strategy for the brain to dynamically utilize structured knowledge and make sense of the speech input.
Article
Full-text available
Background The human skin contains a diverse microbiome that provides protective functions against environmental pathogens. Studies have demonstrated that bacteriophages modulate bacterial community composition and facilitate the transfer of host-specific genes, potentially influencing host cellular functions. However, little is known about the human skin virome and its role in human health. In particular, how viral-host relationships influence skin microbiome structure and function is poorly understood. Results Population dynamics and genetic diversity of bacteriophage communities in viral metagenomic data collected from three anatomical skin locations from 60 subjects at five different time points revealed that cutaneous bacteriophage populations are mainly composed of tailed Caudovirales phages that carry auxiliary genes that help improve metabolic remodeling and increase bacterial host fitness through antimicrobial resistance. Sequence variation in the MRSA-associated antimicrobial resistance gene erm(C) was evaluated using targeted sequencing to further confirm the presence of antimicrobial resistance genes in the human virome and to demonstrate how the functionality of such genes may influence persistence and, in turn, stabilization of bacterial hosts and their functions. Conclusions This large temporal study of human skin-associated viruses indicates that the human skin virome is associated with auxiliary metabolic genes and antimicrobial resistance genes that help increase bacterial host fitness.
Chapter
This chapter discusses the pedagogical importance of folk culture at an early age and introduces the Hungarian legal framework and regulations regarding the introduction of folk culture in Hungarian kindergartens. It looks at four surveys addressing kindergarten teachers' and parents' attitudes in this regard. Thus, the topic of adapting folk culture to the contemporary pedagogical programs of kindergartens in Hungary is investigated from diverse perspectives. The chapter concludes with possible methods, monitored and evaluated in the practice of Hungarian institutions, through which folk cultural elements can be effectively adapted to the everyday programs of kindergartens. Hence the results and suggestions can be seen as tools for effectively communicating intangible cultural heritage to the young generation, ensuring the inheritance of such values in the future as well.
Article
Multiple stressors affect freshwater systems and cause a deficient ecological status according to the European Water Framework Directive (WFD). To select effective mitigation measures and improve the ecological status, knowledge of the stressor hierarchy and of individual and joint effects is necessary. However, compared to common stressors like nutrient enrichment and morphological degradation, the relative importance of micropollutants such as pesticides and pharmaceuticals is largely unaddressed. We used WFD monitoring data from Saxony (Germany) to investigate the importance of 85 environmental variables (including 34 micropollutants) for 18 benthic invertebrate metrics at 108 sites. The environmental variables were assigned to five groups (natural factors, nutrient enrichment, metals, micropollutants and morphological degradation) and were ranked according to their relative importance as groups and individually within and across groups using Principal Component Analyses (PCAs) and Boosted Regression Trees (BRTs). Overall, natural factors contributed the most to the total explained deviance of the models. This variable group represented not only typological differences between sampling sites but also a gradient of human impact through strongly anthropogenically influenced variables such as electric conductivity and dissolved oxygen. These large-scale effects can mask the individual importance of the other variable groups, which may act more specifically at a subset of sites. Accordingly, micropollutants were not represented by a few dominant variables but rather by a diverse palette of different chemicals with similar contributions. As a group, micropollutants contributed similarly to metals, nutrient enrichment and morphological degradation. However, the importance of micropollutants might be underestimated due to limitations of current chemical monitoring practices.
Chapter
The chapter problematizes the frequent lack of a genuinely global perspective in educational approaches at World Heritage sites, manifesting in limitations of educational contents and aims to local conservation, local history or art history, despite World Heritage sites' professed ‘outstanding universal value' (OUV). To overcome such limitations, the authors strategically include the approach of Education for Sustainable Development (ESD) in the context of World Heritage sites. Considering the emergence of a ‘postdigital' condition, the authors sketch specific uses of communication at World Heritage sites to support reflexive formal and informal education processes. In doing so, the authors establish an original approach to reflexive World Heritage Education (WHE). The chapter demonstrates on the basis of practical examples the possibilities for reflexive, postdigital educational approaches at World Heritage sites by referring to the educational pilot project “Young Climate Action for World Heritage.” The chapter concludes by identifying gaps for further research.
Article
A distinguishing feature of neural computation and information processing is that it fits models that describe the most efficient strategies for performing different cognitive tasks. Efficiency determines a distinctive sense of teleology involving optimal performance and resource management through a specific strategy. I articulate this kind of teleology and call it efficient teleological function. I argue that efficient teleological function is compatible with mechanistic explanation and, most likely, neural computational mechanisms are efficiently functional in this sense. They are members of a distinctive class of computational mechanisms whose efficiency is intertwined with their functionality. This is illustrated by widely discussed approaches to mind, such as Barlow’s efficient coding hypothesis or the ones associated with the so-called “predictive mind”, which propose that the brain employs highly efficient coding strategies to save energy resources that are critical to the organism’s survival.
Article
Objectives: To develop an automated international classification of diseases (ICD) coding tool using natural language processing (NLP) and discharge summary texts from Thailand. Materials and methods: The development phase included 15,329 discharge summaries from Ramathibodi Hospital from January 2015 to December 2020. The external validation phase included Medical Information Mart for Intensive Care III (MIMIC-III) data. Three algorithms were developed: naïve Bayes with term frequency-inverse document frequency (NB-TF-IDF), convolutional neural network with neural word embedding (CNN-NWE), and CNN with PubMedBERT (CNN-PubMedBERT). In addition, two state-of-the-art models were also considered; convolutional attention for multi-label classification (CAML) and pretrained language models for automatic ICD coding (PLM-ICD). Results: The CNN-PubMedBERT model provided average micro- and macro-area under precision-recall curve (AUPRC) of 0.6605 and 0.5538, which outperformed CNN-NWE (0.6528 and 0.5564), NB-TF-IDF (0.4441 and 0.3562), and CAML (0.6257 and 0.4964), with corresponding differences of (0.0077 and −0.0026), (0.2164 and 0.1976), and (0.0348 and 0.0574), respectively. However, CNN-PubMedBERT performed less well relative to PLM-ICD, with corresponding AUPRCs of 0.7202 and 0.5865. The CNN-PubMedBERT model was externally validated using two subsets of MIMIC-III; MIMIC-ICD-10, and MIMIC-ICD-9 datasets, which contained 40,923 and 31,196 discharge summaries. The average micro-AUPRCs were 0.3745, 0.6878, and 0.6699, corresponding to directly predictive MIMIC-ICD-10, MIMIC-ICD-10 fine-tuning, and MIMIC-ICD-9 fine-tuning approaches; the average macro-AUPRCs for the corresponding models were 0.2819, 0.4219 and 0.5377, respectively. Discussion: CNN-PubMedBERT performed second-best to PLM-ICD, with considerable variation observed between average micro- and macro-AUPRC, especially for external validation, generally indicating good overall prediction but limited predictive value for small sample sizes. External validation in a US cohort demonstrated a higher level of model prediction performance. Conclusion: Both PLM-ICD and CNN-PubMedBERT models may provide useful tools for automated ICD-10 coding. Nevertheless, further evaluation and validation within Thai and Asian healthcare systems may prove more informative for clinical application.
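A minimal multi-label TF-IDF baseline in the spirit of the NB-TF-IDF model can be assembled with scikit-learn. This is a simplified sketch with toy data and hypothetical ICD-10 codes, not the authors' pipeline or hyperparameters:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.pipeline import make_pipeline

# Toy discharge summaries and their ICD-10 code sets (hypothetical examples).
texts = [
    "chest pain troponin elevated myocardial infarction",
    "cough fever infiltrate community acquired pneumonia",
    "polyuria polydipsia elevated hba1c type 2 diabetes",
    "crushing chest pain st elevation diabetes on metformin",
]
codes = [["I21"], ["J18"], ["E11"], ["I21", "E11"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(codes)              # one indicator column per ICD code

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # term frequency-inverse document frequency
    OneVsRestClassifier(MultinomialNB()), # one binary NB classifier per code
)
model.fit(texts, Y)

pred = model.predict(["fever and productive cough with lobar infiltrate"])
print(mlb.inverse_transform(pred))        # expected to include "J18" on this toy data
```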