Publications (42) · 28.97 Total Impact
ABSTRACT: We describe a large-scale project in applied automated deduction concerned with the following problem of considerable interest in loop theory: if $Q$ is a loop with commuting inner mappings, does it follow that $Q$ modulo its center is a group and $Q$ modulo its nucleus is an abelian group? This problem has been answered affirmatively in several varieties of loops. The solution usually involves sophisticated techniques of automated deduction, and the resulting derivations are very long, often with no higher-level human proofs available.
Automated Reasoning and Mathematics: Essays in Memory of William W. McCune, Lecture Notes in Artificial Intelligence 7788, edited by Maria Paola Bonacina and Mark E. Stickel, 01/2013: pages 151-164; Springer. ISSN: 0302-9743
ABSTRACT: In [“Time, truth and logic”, Techn. Rep., Australian National Univ., Canberra (1989), http://ftp.rsise.anu.edu.au/papers/slaney/TTL/TTL.ps.gz], J. Slaney, T. Surendonk and R. Girle introduced a little-known deductive system F** in connection with the problem of the indeterminacy of future contingents. The main result of this paper shows that, up to definitional equivalence, F** has a familiar description: it is precisely Nelson’s constructive logic with strong negation [D. Vakarelov, Stud. Log. 36, 109–125 (1977; Zbl 0385.03055)].
Bulletin of the Section of Logic 01/2010; 39(3).
ABSTRACT: We consider the class of pointed varieties of algebras having a lattice term reduct, and we show that each such variety gives rise in a natural way, and according to a regular pattern, to at least three interesting logics. Although the mentioned class includes several logically and algebraically significant examples (e.g. Boolean algebras, MV algebras, Boolean algebras with operators, residuated lattices and their subvarieties, algebras from quantum logic or from depth relevant logic), we consider here in greater detail Abelian ℓ-groups, where such logics respectively correspond to: i) Meyer and Slaney’s Abelian logic [31]; ii) Galli et al.’s logic of equilibrium [21]; iii) a new logic of “preservation of truth degrees”.
Logica Universalis 09/2008; 2(2):209-233. DOI:10.1007/s11787-008-0034-2
ABSTRACT: The goal of this two-part series of papers is to show that constructive logic with strong negation N is definitionally equivalent to a certain axiomatic extension NFL_ew of the substructural logic FL_ew. The main result of Part I of this series [41] shows that the equivalent variety semantics of N (namely, the variety of Nelson algebras) and the equivalent variety semantics of NFL_ew (namely, a certain variety of FL_ew-algebras) are term equivalent. In this paper, the term equivalence result of Part I [41] is lifted to the setting of deductive systems to establish the definitional equivalence of the logics N and NFL_ew. It follows from the definitional equivalence of these systems that constructive logic with strong negation is a substructural logic.
Studia Logica 08/2008; 89(3):401-425. DOI:10.1007/s11225-008-9138-1 · 0.60 Impact Factor
ABSTRACT: The goal of this two-part series of papers is to show that constructive logic with strong negation N is definitionally equivalent to a certain axiomatic extension NFL_ew of the substructural logic FL_ew. In this paper, it is shown that the equivalent variety semantics of N (namely, the variety of Nelson algebras) and the equivalent variety semantics of NFL_ew (namely, a certain variety of FL_ew-algebras) are term equivalent. This answers a long-standing question of Nelson [30]. Extensive use is made of the automated theorem prover Prover9 in order to establish the result. The main result of this paper is exploited in Part II of this series [40] to show that the deductive systems N and NFL_ew are definitionally equivalent, and hence that constructive logic with strong negation is a substructural logic over FL_ew.
Studia Logica 04/2008; 88(3):325-348. DOI:10.1007/s11225-008-9113-x · 0.60 Impact Factor
ABSTRACT: Ideally, biosignatures can be detected at the early infection phase and used both for developing diagnostic patterns and for prognostic triage. Such biosignatures are important for vaccine validation and for providing risk stratification to a population, such as identifying individuals who have been exposed to biological or chemical agents and who are at high risk of developing an infection. The research goal is to detect broad-based biosignature models; the work initially focuses on developing effective computer-augmented pathology tied to animal models developed at the University of New Mexico (UNM). Using lung tissue from infected and naïve mice, feature extraction from images of the tissue under a specialized microscope, and Bayesian networks to analyze the data sets of features, we were able to differentiate normal from diseased samples and viral from bacterial samples in mid to late stages of infection. This effort has shown the potential effectiveness of computer-augmented pathology in this application. The extended research intends to couple analysis of serum, microarray analysis of organs, proteomic data, and the pathology. The rationale for the current invasive procedure on animal models is to facilitate the development of data analysis and machine learning techniques that can eventually be generalized to the task of discovering non-invasive and early-stage biosignatures for human models.
Computers in Biology and Medicine 12/2007; 37(11):1539-52. DOI:10.1016/j.compbiomed.2007.02.005 · 1.24 Impact Factor
Article: Gene expression overlap affects karyotype prediction in pediatric acute lymphoblastic leukemia
Leukemia 07/2007; 21(6):1341-4. DOI:10.1038/sj.leu.2404640 · 10.43 Impact Factor
Article: Characterisations of Nelson algebras
ABSTRACT: Nelson algebras arise naturally in algebraic logic as the algebraic models of Nelson's constructive logic with strong negation. This note gives two characterisations of the variety of Nelson algebras up to term equivalence, together with a characterisation of the finite Nelson algebras up to polynomial equivalence. The results answer a question of Blok and Pigozzi and clarify some earlier work of Brignole and Monteiro.
Revista de la Unión Matemática Argentina 01/2007 · 0.18 Impact Factor
04/2006: pages 316-332;
ABSTRACT: The skew Boolean propositional calculus (SBPC) is a generalization of the classical propositional calculus that arises naturally in the study of certain well-known deductive systems. In this article, we consider a candidate presentation of SBPC and prove that it constitutes a Hilbert-style axiomatization. The problem reduces to establishing that the logic presented by the candidate axiomatization is algebraizable in the sense of Blok and Pigozzi. In turn, this process is equivalent to verifying that four particular formulas are derivable from the candidate presentation. Automated deduction methods played a central role in proving these four theorems. In particular, our approach relied heavily on the method of proof sketches.
Journal of Automated Reasoning 01/2006; 37(1):3-20. DOI:10.1007/s10817-006-9029-y · 0.88 Impact Factor
ABSTRACT: A conjecture of Padmanabhan on provability in cancellative semigroups is addressed. Several of Levi’s group-theory commutator theorems are proved for cancellative semigroups. The proofs, found by automated deduction, support the conjecture.
Semigroup Forum 08/2005; 71(1):152-157. DOI:10.1007/s00233-005-0506-0 · 0.37 Impact Factor
ABSTRACT: We present short single axioms for ortholattices, orthomodular lattices, and modular ortholattices, all in terms of the Sheffer stroke. The ortholattice axiom is the shortest possible. We also give multi-equation bases in terms of the Sheffer stroke and in terms of join, meet, and complementation. Proofs are omitted but are available in an associated technical report and on the Web. We used computers extensively to find candidates, reject candidates, and search for proofs that candidates are single axioms.
Algebra Universalis 01/2005; 52(4):541-549. DOI:10.1007/s00012-004-1902-0 · 0.44 Impact Factor
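A small illustration (not from the paper, which treats the harder ortholattice case): in the two-element Boolean algebra, the Sheffer stroke x|y = not(x and y) alone suffices to define complement, join, and meet, which is what makes axiomatizations purely in terms of the stroke possible. The function names below are invented for the sketch.

```python
from itertools import product

def stroke(x, y):
    # Sheffer stroke: x|y = not (x and y)
    return not (x and y)

def complement(x):
    # x' = x|x
    return stroke(x, x)

def join(x, y):
    # x v y = (x|x)|(y|y)
    return stroke(stroke(x, x), stroke(y, y))

def meet(x, y):
    # x ^ y = (x|y)|(x|y)
    return stroke(stroke(x, y), stroke(x, y))

# Verify the three definitions against the usual operations by truth table.
for x, y in product([False, True], repeat=2):
    assert complement(x) == (not x)
    assert join(x, y) == (x or y)
    assert meet(x, y) == (x and y)
```

Exhaustive checks of this kind are cheap for candidate rejection; the hard part, as the abstract notes, is the automated search for proofs that a surviving candidate really is a single axiom.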
ABSTRACT: No abstract prepared.
ABSTRACT: We present new techniques for the application of a Bayesian network learning framework to the problem of classifying gene expression data. The focus on classification permits us to develop techniques that address in several ways the complexities of learning Bayesian nets. Our classification model reduces the Bayesian network learning problem to the problem of learning multiple subnetworks, each consisting of a class label node and its set of parent genes. We argue that this classification model is more appropriate for the gene expression domain than other structurally similar Bayesian network classification models, such as Naive Bayes and Tree Augmented Naive Bayes (TAN), because our model is consistent with prior domain experience suggesting that a relatively small number of genes, taken in different combinations, is required to predict most clinical classes of interest. Within this framework, we consider two different approaches to identifying parent sets which are supported by the gene expression observations and any other currently available evidence. One approach employs a simple greedy algorithm to search the universe of all genes; the second approach develops and applies a gene selection algorithm whose results are incorporated as a prior to enable an exhaustive search for parent sets over a restricted universe of genes. Two other significant contributions are the construction of classifiers from multiple, competing Bayesian network hypotheses and algorithmic methods for normalizing and binning gene expression data in the absence of prior expert knowledge. Our classifiers are developed under a cross-validation regimen and then validated on corresponding out-of-sample test sets. The classifiers attain a classification rate in excess of 90% on out-of-sample test sets for two publicly available datasets. We present an extensive compilation of results reported in the literature for other classification methods run against these same two datasets. Our results are comparable to, or better than, any we have found reported for these two sets when a train-test protocol as stringent as ours is followed.
Journal of Computational Biology 02/2004; 11(4):581-615. DOI:10.1089/1066527041887294 · 1.74 Impact Factor
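The subnetwork idea and the greedy parent search can be sketched as follows. This is a minimal illustration, not the authors' implementation: the scoring function is a crude majority-vote stand-in for a proper Bayesian score, and all names (`fit_score`, `greedy_parents`, `classify`) are invented here. Inputs are assumed to be already binned (discrete) expression values.

```python
from collections import Counter

def fit_score(data, labels, parents):
    # Fraction of samples whose label matches the majority label among
    # training samples sharing the same parent-gene configuration
    # (a crude stand-in for a Bayesian network score).
    by_config = {}
    for row, lab in zip(data, labels):
        key = tuple(row[g] for g in parents)
        by_config.setdefault(key, Counter())[lab] += 1
    return sum(c.most_common(1)[0][1] for c in by_config.values()) / len(labels)

def greedy_parents(data, labels, n_genes, max_parents=3):
    # Greedily grow the class label node's parent set, one gene at a time,
    # stopping when no candidate improves the score.
    parents = []
    while len(parents) < max_parents:
        candidates = [g for g in range(n_genes) if g not in parents]
        if not candidates:
            break
        best = max(candidates, key=lambda g: fit_score(data, labels, parents + [g]))
        if fit_score(data, labels, parents + [best]) <= fit_score(data, labels, parents):
            break
        parents.append(best)
    return parents

def classify(data, labels, parents, sample):
    # Predict the majority class among training samples matching the new
    # sample's parent-gene configuration; fall back to the global majority.
    cfg = tuple(sample[g] for g in parents)
    counts = Counter(lab for row, lab in zip(data, labels)
                     if tuple(row[g] for g in parents) == cfg)
    return (counts or Counter(labels)).most_common(1)[0][0]
```

Because only the class node's parents matter for prediction, the search space is the set of small gene subsets rather than the space of full network structures, which is what makes both the greedy and the restricted exhaustive search tractable.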
Article: ON A HOMOMORPHISM PROPERTY OF HOOPS
ABSTRACT: We present a syntactic proof that equation
Bulletin of the Section of Logic 01/2004; 33(3).
Article: Yet Another Single Law for Lattices
ABSTRACT: In this note we show that the equational theory of all lattices is defined by a single absorption law. The identity, of length 29 with 8 variables, is shorter than previously known such equations defining lattices.
Algebra Universalis 08/2003; 50(2). DOI:10.1007/s00012-003-1832-2 · 0.44 Impact Factor
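The paper's length-29 identity is not reproduced in this listing. As a small illustration of what an absorption law is, the two classical absorption identities, x ^ (x v y) = x and x v (x ^ y) = x, can be checked exhaustively in a concrete lattice, here the divisors of 12 under meet = gcd and join = lcm.

```python
from math import gcd
from itertools import product

def lcm(a, b):
    # Least common multiple: the join in a divisibility lattice.
    return a * b // gcd(a, b)

# The divisor lattice of 12: {1, 2, 3, 4, 6, 12}, ordered by divisibility.
divisors = [d for d in range(1, 13) if 12 % d == 0]

# Check both classical absorption laws on every pair of elements.
for x, y in product(divisors, repeat=2):
    assert gcd(x, lcm(x, y)) == x   # x ^ (x v y) = x
    assert lcm(x, gcd(x, y)) == x   # x v (x ^ y) = x
```

An absorption law in the paper's sense is a single equation of this shape (one side a bare variable) strong enough, on its own, to axiomatize all lattices.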
ABSTRACT: This article answers two questions (posed in the literature), each concerning the guaranteed existence of proofs free of double negation. A proof is free of double negation if none of its deduced steps contains a term of the form n(n(t)) for some term t, where n denotes negation. The first question asks for conditions on the hypotheses that, if satisfied, guarantee the existence of a double-negation-free proof when the conclusion is free of double negation. The second question asks about the existence of an axiom system for classical propositional calculus whose use, for theorems with a conclusion free of double negation, guarantees the existence of a double-negation-free proof. After giving conditions that answer the first question, we answer the second question by focusing on the Łukasiewicz three-axiom system. We then extend our studies to infinite-valued sentential calculus and to intuitionistic logic and generalize the notion of being double-negation free. The double-negation proofs of interest rely exclusively on the inference rule condensed detachment, a rule that combines modus ponens with an appropriately general rule of substitution. The automated reasoning program OTTER played an indispensable role in this study.
Studia Logica 02/2003; 80(2-3). DOI:10.1007/s11225-005-8469-4 · 0.60 Impact Factor
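The syntactic condition itself, that no deduced step contains a subterm n(n(t)), is easy to make precise. The sketch below (illustrative only, not the paper's machinery) represents formulas as nested tuples, e.g. ('i', ('n', ('n', 'p')), 'q') for i(n(n(p)), q) with 'n' as negation, and scans each proof step for a doubled negation.

```python
def has_double_negation(term):
    # A term is a variable (a string) or a tuple (operator, arg1, ...).
    if isinstance(term, str):
        return False
    op, *args = term
    # Direct match: n applied to a term whose head is also n.
    if op == 'n' and not isinstance(args[0], str) and args[0][0] == 'n':
        return True
    # Otherwise, recurse into the subterms.
    return any(has_double_negation(a) for a in args)

def proof_is_double_negation_free(steps):
    # A proof qualifies iff no deduced step contains a subterm n(n(t)).
    return not any(has_double_negation(s) for s in steps)
```

The paper's questions are then about when a proof passing this check is guaranteed to exist, given hypotheses and a conclusion that already pass it.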
ABSTRACT: In this article, we present a short 2-basis for Boolean algebra in terms of the Sheffer stroke and prove that no such 2-basis can be shorter. We also prove that the new 2-basis is unique (for its length) up to applications of commutativity. Our proof of the 2-basis was found by using the method of proof sketches and relied on the use of an automated reasoning program.
Journal of Automated Reasoning 01/2003; 31(1):1-9. DOI:10.1023/A:1027322305654 · 0.88 Impact Factor
ABSTRACT: In this note, we present a first-order proof that the equation (((x y) y) x) x = (((y x) x) y) y holds in the quasivariety HBCK.
ABSTRACT: We present new techniques for the application of the Bayesian network learning framework to the problem of classifying gene expression data. Our techniques address the complexities of learning Bayesian nets in several ways. First, we focus on classification and demonstrate how this reduces the Bayesian net learning problem to the problem of learning subnetworks consisting of a class label node and its set of parent genes. We then consider two different approaches to identifying parent sets which are supported by current evidence; one approach employs a simple greedy algorithm to search the universe of all genes, and a second approach develops and applies a gene selection algorithm whose results are incorporated as a prior to enable an exhaustive search for parent sets over a restricted universe of genes. Two other significant contributions are the construction of classifiers from multiple, competing Bayesian network hypotheses and algorithmic methods for normalizing and binning gene expression data in the absence of prior expert knowledge. Our classifiers are first developed under a cross-validation regimen against two publicly available data sets and then validated on corresponding out-of-sample test sets. The classifiers attain a classification rate in excess of 90% on each of these out-of-sample test sets.
Publication Stats
407 Citations
28.97 Total Impact Points
Institutions

1984–2013

University of New Mexico
 Department of Computer Science
Albuquerque, New Mexico, United States
