Article

Maple in Mathematics and the Sciences

Abstract

This first special issue of MapleTech presents an ensemble of some of the most significant mathematical and scientific results ever obtained with a computer-algebra (symbolic computation) system. These include the computational resolution of one of the two cases of Fermat's Last Theorem before the full theorem was finally proved by A. Wiles, as well as exact solutions to general relativistic and quantum mechanical problems. All of these results were obtained with the Maple symbolic computation system.
Chapter
The nineteenth century witnessed a gradual transformation of mathematics: in fact, a gradual revolution, if that is not a contradiction in terms. For the genesis of their ideas, mathematicians turned more and more from the sensory and empirical to the intellectual and abstract. Although this subtle change had already begun in the sixteenth and seventeenth centuries with the introduction of such nonintuitive concepts as negative and complex numbers, instantaneous rates of change, and infinitely small quantities, these were often used (successfully) to solve physical problems and thus elicited little demand for justification.
Chapter
The axiomatic method is, without doubt, the single most important contribution of ancient Greece to mathematics. The explicit recognition that mathematics deals with abstractions, and that proof by deductive reasoning from explicitly stated postulates offers a foundation for mathematics, was indeed an extraordinary development. When, how, and why this came about is open to conjecture.
Chapter
Interest in solving equations in integers or rational numbers dates back to antiquity. I tried to show some fundamental problems which are still unsolved. Euclid and Diophantus already solved the equation $a^2 + b^2 = c^2$ and gave a formula for all its solutions. The next hardest equation, $y^2 = x^3 + ax + b$, has given rise to very great problems which have been at the center of mathematics since the 19th century. No one knows how to give an effective method for finding all solutions. I described some of the structures which the solutions have, and the context in which one would like to find such a method.
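For reference, the formula in question is Euclid's parametrization: every primitive solution of $a^2 + b^2 = c^2$ has the form $a = m^2 - n^2$, $b = 2mn$, $c = m^2 + n^2$ for coprime integers $m > n > 0$ of opposite parity, and every other solution is an integer multiple of a primitive one.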
Article
This paper takes a look at the events leading to the British Industrial Revolution and renews the argument that a theory of useful knowledge is required to fully understand the timing of the event, as well as the reasons why it did not peter out after a few decades the way previous waves of technological progress had done. It develops such a theory in some detail, distinguishing between Useful Knowledge and Techniques. It is argued that in the century before 1760, fundamental developments associated with the Enlightenment and the Scientific Revolution took place that made the Industrial Revolution possible.
Chapter
Responding to the double-faced influence of string theory on mathematical practice and rigour, the mathematical physicists Arthur Jaffe and Frank Quinn have contemplated the idea that there exists a ‘theoretical’ mathematics (alongside ‘theoretical’ physics) whose basic structures and results still require independent corroboration by mathematical proof. In this paper, I shall take the Jaffe-Quinn debate mainly as a problem of mathematical ontology and analyse it against the backdrop of two philosophical views that are appreciative of informal mathematical development and conjectural results: Lakatos’s methodology of proofs and refutations and John von Neumann’s opportunistic reading of Hilbert’s axiomatic method. The comparison of the two approaches shows that mitigating Lakatos’s falsificationism makes his insights about mathematical quasi-ontology more relevant to 20th-century mathematics, in which new structures are introduced by axiomatisation and are not necessarily motivated by informal ancestors. The final section discusses the consequences of string theorists’ claim to finality for the theory’s mathematical make-up. I argue that ontological reductionism as advocated by particle physicists and the quest for mathematically deeper axioms do not necessarily lead to identical results.
Chapter
For von Neumann, mathematical precision and rigour without conceptual clarity was neither possible nor desirable, whether in the physical sciences or in mathematics. It seems justified to say that what drove von Neumann in his research, especially in physics, was the desire to achieve conceptual clarity and to formulate conceptually consistent theories. Von Neumann’s work on quantum mechanics, and especially his abandoning of the Hilbert space formalism, corroborates this interpretation to a large extent. In arriving at acceptable theories von Neumann relied on the method of an opportunistically interpreted soft axiomatics, a method of axiomatisation which was not affected by Gödel’s results. Von Neumann himself, when speaking of the method in physics, emphasized that the aim of theoretical physics is to create mathematical models. His success in creating powerful mathematical models in physics was due to his unparalleled skill and talent in combining algebraic-combinatorial techniques with analysis.
Article
Students (and not only students) usually think of mathematics as monolithic and fixed for all time, and of mathematicians (if they give any thought to the creators of the subject they are dealing with) as in total agreement on mathematical means and ends. How many of them realize that not only the methods and results of mathematics but also its basic concepts are tentative rather than final, in flux rather than eternal? That the ideas of number, of function, of continuity, even of proof have ever been different from what they are today? We focus in this essay on the concept of proof. It is not only that the notion and practice of proof have evolved over time, but that at any given time they were often debated by contemporary mathematicians. Since the way that mathematicians prove, or arrive at, their results is usually a reflection of their overall view of mathematics, we will have to consider the broader picture. From antiquity onwards we note a divergence of views among mathematicians on how best to do mathematics, on what methods to use for attacking problems and establishing results. Some advocate formal, rigorous proofs, others intuitive, heuristic ones (and some do not see the difference between the two). Adherents of the synthetic method battle supporters of the analytic method. Rationalists confront empiricists and formalists oppose intuitionists (to use current terms rather loosely). Of course the tensions between these groups have, in general, been healthy for mathematics (though, perhaps, less so for the protagonists). This essay consists of examples from various historical periods which illustrate the above themes of the pluralistic nature of mathematics.
Conference Paper
In this article, we survey some applications of Maple where it has proved useful as a problem-solving tool. The application areas include relativity, quantum theory, audio engineering, number theory, and asbestos fiber analysis. They should give the reader some idea of the potential capabilities and versatility of the Maple system. Most of the applications presented here involve new approaches at the forefront of research. For the reader new to the field of symbolic computation, we hope to show how the symbolic computation tools available in Maple have provided new possibilities in various fields of research.
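As a minimal, purely illustrative sketch (not drawn from any of the surveyed applications), the following Maple commands show the exact, symbolic style of computation the survey has in mind:

```maple
# Exact symbolic results rather than floating-point approximations (illustrative only).
int(exp(-x^2)*cos(x), x = 0 .. infinity);          # closed form: sqrt(Pi)*exp(-1/4)/2
dsolve(diff(y(t), t, t) + omega^2*y(t) = 0, y(t)); # exact general solution of the harmonic oscillator
```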
Article
We consider the integral $\epsilon_b(k, K) = \int_0^\infty x\, e^{-\eta x^2}\, J_b(Kx)\, Y_b(kx)\, dx$, where K, k, and b are all positive real numbers. We reduce this integral to a linear combination of two integrals. The first of these is an exponential integral, which can be expressed as a difference of two Shkarofsky functions, or can easily be evaluated numerically. The second is the original integral, but with k and K both replaced by kK. We express this as a Meijer G function, and then reduce it to the sum of an associated Bessel function and a modified Bessel function.
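A quick numerical sanity check of $\epsilon_b(k, K)$ is straightforward in Maple; the parameter values below are arbitrary illustrations, not values used in the paper:

```maple
# Direct numerical quadrature of the integral for sample parameters (illustrative choices only).
eta := 1/2:  K := 2:  k := 3:  b := 1:
evalf(Int(x*exp(-eta*x^2)*BesselJ(b, K*x)*BesselY(b, k*x), x = 0 .. infinity));
```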
Chapter
The optimized inner projection (OIP) method is used to determine a lower bound to the exact correlation energy for the Pariser-Parr-Pople (PPP) and Hubbard Hamiltonian models of the cyclic polyene with ten sites over the whole range of the coupling constant, for which the exact full configuration interaction (FCI) solutions are available. The evaluation of the energy-dependent effective interaction, which enters the OIP method, is achieved using the diagrammatic technique of many-body perturbation theory (MBPT). Both diagrammatic and algebraic representations for the effective interaction within the doubly excited manifold and involving triply and quadruply excited intermediate states are given. The actual computations are carried out using the symbolic computation language MAPLE and its interface with FORTRAN. Newton’s method must be employed to solve for the OIP correlation energy in the intermediately and highly correlated regions of the coupling constant. The method provides good results in both the weakly and the very strongly correlated regions, yielding the exact result in the fully correlated limit, but is rather unsatisfactory for intermediate values of the coupling constant, including the region that corresponds to the spectroscopic parametrization. A comparison with the exact FCI results, as well as with several other approximate approaches to the many-electron correlation problem, is carried out and discussed along with general aspects of symbolic computation.
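A hedged, generic sketch of the Newton step mentioned above: the function F below is a hypothetical stand-in for an energy-dependent effective interaction, not the paper's actual MBPT-derived expression.

```maple
# Hypothetical self-consistency condition E = F(E); Newton's method finds the root of g(E) = F(E) - E.
F := E -> -1.0 + 0.3/(1 - 0.1*E):   # stand-in for the energy-dependent effective interaction
g := E -> F(E) - E:                 # self-consistency condition g(E) = 0
E := -0.8:                          # starting guess
for i to 8 do E := evalf(E - g(E)/D(g)(E)) end do:
E;                                  # Newton iterate approximating the self-consistent energy
```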
Article
The following theorem is proved in this paper: "If the first case of Fermat's Last Theorem does not hold for a sufficiently large prime l, then for all pairs of positive integers N, k with N ≤ 94, 0 ≤ k ≤ N − 1." The proof of this theorem is based on a recent paper of Skula and uses computer techniques.
Article
We show that if the first case of Fermat's Last Theorem is false for prime exponent p, then $p^2$ divides $q^p - q$ for all primes $q \le 89$. As a corollary we state the theorem of the title.
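A small illustrative check of the divisibility condition in this statement is easy in a system such as Maple; the pair below is chosen only because 1093 is a known Wieferich prime, so the condition holds for q = 2 (it is not part of the paper's computation):

```maple
# Test whether p^2 divides q^p - q, i.e. whether q^p mod p^2 equals q; illustrative only.
p := 1093:  q := 2:                        # 1093 is a Wieferich prime, so this check returns true
evalb( (q &^ p mod p^2) = (q mod p^2) );
```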
Article
Two basic aspects of the familiar Heyser energy-time curve (ETC), as applied to electroacoustic system measurements, are addressed: 1) its inherently acausal nature, which means that it should not be interpreted literally as representing the energy flow of a physical system, and, more importantly, 2) the way in which the appearance of an actual ETC is affected by the detailed nature of the data from which it is computed, and especially by any frequency-domain window that is used in its computation. Theoretical and experimental data are used to illustrate the variety of behavior that can occur and to show how the processing can either enhance or falsify the measurement.
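For orientation (this is the usual construction in the audio-measurement literature, not a detail taken from the paper itself): the ETC is obtained from the measured impulse response $h(t)$ by forming the analytic signal $a(t) = h(t) + i\,\hat{h}(t)$, where $\hat{h}$ is the Hilbert transform of $h$, and plotting $E(t) = |a(t)|^2 = h(t)^2 + \hat{h}(t)^2$, usually on a decibel scale; the frequency-domain window mentioned above is applied to the measured transfer-function data before this construction.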