A wealth of evidence has forced my colleagues and me to conclude that 65 million years ago a mountain‐sized object hit Earth and caused the extinction of most of the then‐existing species, bringing the Cretaceous period of geological history to a close and opening the Tertiary period. Much of the evidence for this lies in the unusual layer of clay that separates those periods in the geological record, shown in figure 1. For example, this stratum contains anomalously high concentrations of iridium, an element whose abundance in the crust of the Earth is only one ten‐thousandth that in meteorites and, presumably, in other “bolides,” or large pieces of Solar System debris. The evidence indicates that the collision of Earth with a large piece of Solar System debris such as a meteoroid, asteroid or comet caused the great extinctions of 65 million years ago, leading to the transition from the age of the dinosaurs to the age of the mammals.
Cells sense chemical gradients, communicate gradient information throughout the cell, and change their shape in response. Statistics, materials science, and more underlie those essential biological processes.
At first glance the nucleic acids and proteins that are the basis of life do not stand out in any way among all the possible polymeric structures. If we look at their functions, however, we find one unique feature of these biological polymers: self‐replication, the distinctive property of living systems. What is self‐replication, and how could this biologically primordial property have originated in an unorganized medium? The solution to the problem of life's origin lies in resolving the paradox of how polymers of rather common structure can exhibit such a distinctive function. Biological polymers have a preferred chirality and can replicate themselves. Physical arguments provide insight into which of these unique and apparently related properties evolved first, and by what mechanism.
We present the long-term systematic features of citation statistics revealed by publicly available data. The age of a citation is the difference between the year in which the citation occurs and the publication year of the cited paper. Citation data provide a wealth of quantitative information: they can be used to identify influential research, new trends in research, unanticipated connections across fields, and downturns in subfields that are exhausted. The data also reveal idiosyncratic features in the citation histories of individual articles.
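The citation-age definition above is directly computable. The sketch below applies it to a toy record; the data layout (pairs of citing year and cited paper's publication year) is a hypothetical choice for illustration, not a format used by the study.

```python
from collections import Counter

def citation_ages(citations):
    """Age of each citation: citing year minus the cited paper's publication year.

    `citations` is an iterable of (citing_year, cited_publication_year) pairs;
    this layout is assumed for illustration only.
    """
    return [citing - published for citing, published in citations]

# A toy record: one paper published in 1998, cited in three later years
ages = citation_ages([(2000, 1998), (2003, 1998), (2010, 1998)])

# The age distribution is the quantity whose long-term trends are discussed above
distribution = Counter(ages)
```

Histograms of such ages, aggregated over many papers, are what reveal the systematic features described in the text.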
We reconsider the crucial 1927 Solvay conference in the context of current research in the foundations of quantum theory. Contrary to folklore, the interpretation question was not settled at this conference and no consensus was reached; instead, a range of sharply conflicting views were presented and extensively discussed. Today, there is no longer an established or dominant interpretation of quantum theory, so it is important to re-evaluate the historical sources and keep the interpretation debate open. In this spirit, we provide a complete English translation of the original proceedings (lectures and discussions), and give background essays on the three main interpretations presented: de Broglie's pilot-wave theory, Born and Heisenberg's quantum mechanics, and Schroedinger's wave mechanics. We provide an extensive analysis of the lectures and discussions that took place, in the light of current debates about the meaning of quantum theory. The proceedings contain much unexpected material, including extensive discussions of de Broglie's pilot-wave theory (which de Broglie presented for a many-body system), and a "quantum mechanics" apparently lacking in wave function collapse or fundamental time evolution. We hope that the book will contribute to the ongoing revival of research in quantum foundations, as well as stimulate a reconsideration of the historical development of quantum physics. A more detailed description of the book may be found in the Preface. (Copyright by Cambridge University Press (ISBN: 9780521814218).) Comment: 553 pages, 33 figures. Draft of a book (as of Sept. 2006, same as v1). Published in Oct. 2009, with corrections and an appendix, by Cambridge University Press (available at http://www.cambridge.org/catalogue/catalogue.asp?isbn=9780521814218)
Adaptive optics and interferometry, two techniques that will improve the limiting resolution of optical and infrared observations by factors of tens or even thousands, are discussed. The real-time adjustment of optical surfaces to compensate for wavefront distortions will improve image quality and increase sensitivity. The phased operation of multiple telescopes separated by large distances will make it possible to achieve very high angular resolution and precise positional measurements. Infrared and optical interferometers that will manipulate light beams and measure interference directly are considered. Angular resolutions of single telescopes will be limited to around 10 milliarcseconds even with adaptive-optics techniques. Interferometry would surpass this limit by a factor of 100 or more. Future telescope arrays with 100-m baselines (resolution of 2.5 milliarcseconds at a 1-micron wavelength) are also discussed.
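The quoted array resolution follows from the diffraction limit. A minimal sketch, assuming the Rayleigh criterion (theta ≈ 1.22 λ/B, a standard factor not stated in the text above), reproduces the 2.5-milliarcsecond figure for a 100-m baseline at 1 micron:

```python
import math

# Radians to milliarcseconds: 1 rad = (180/pi)*3600 arcsec, 1 arcsec = 1000 mas
RAD_TO_MAS = 180 / math.pi * 3600 * 1000

def resolution_mas(wavelength_m, baseline_m):
    """Diffraction-limited angular resolution, Rayleigh criterion, in mas."""
    return 1.22 * wavelength_m / baseline_m * RAD_TO_MAS

# 100-m baseline at a 1-micron wavelength: roughly the 2.5 mas quoted above
theta = resolution_mas(1e-6, 100.0)
```

The same formula with a single-telescope aperture in place of the baseline gives the ~10 mas limit mentioned for adaptive optics.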
Nonequilibrium phenomena in hypersonic flows are examined on the basis of theoretical models and selected experimental data, in an introduction intended for second-year graduate students of aerospace engineering. Chapters are devoted to the physical nature of gas atoms and molecules, transitions of internal states, the formulation of the master equation of aerothermodynamics, the conservation equations, chemical reactions in CFD, the behavior of air flows in nonequilibrium, experimental aspects of nonequilibrium flow, a review of experimental results, and gas-solid interaction. Diagrams, graphs, and tables of numerical data are provided.
The inflationary universe scenario is the current paradigm of early-universe cosmology. Inflation has been successful phenomenologically: It not only provides answers to several questions that cannot be addressed in the standard Big Bang (SBB) cosmology, but it also led to the first quantitative and predictive theory for the origin of structure in the universe, a theory whose predictions have been verified to great accuracy by cosmic microwave background (CMB) anisotropy experiments. At present, however, models of inflation suffer from serious conceptual problems that have driven attempts to find alternative paradigms.
In a reference frame fixed to the solar system's center of mass, a
satellite's energy will change as it is deflected by a planet. But a number of
satellites flying by Earth have also experienced energy changes in the
Earth-centered frame -- and that's a mystery.
The Arctic is currently considered a region in transformation. Glaciers have been retreating, permafrost has been diminishing, snow-covered areas have been decreasing, and sea ice and ice sheets have been thinning. This paper provides an overview of the unique role that satellite sensors have played in detecting changes in the Arctic and demonstrates that many of the changes are not just local but pan-Arctic phenomena. Changes from the upper atmosphere to the surface are discussed, and it is apparent that the magnitude of the trends tends to vary from region to region and from season to season. Previous reports of a warming Arctic and a retreating perennial ice cover have also been updated, and results show that changes are ongoing. Feedback effects that can lead to amplification of the signals and the role of satellite data in enhancing global circulation models are also discussed.
The Gum Nebula is seen on photographs of the southern skies as an extensive region of ionized hydrogen that surrounds the well known Vela X supernova remnant. One of the largest objects in our galaxy, the Nebula was first recognized as a single emission complex by Colin S. Gum during the 1950's. Until recently, astronomers assumed that it was a “Strömgren sphere,” excited by ultraviolet light from the hot stars gamma Velorum and zeta Puppis within it. This excitation process accounts for the Orion Nebula and other H‐II regions. The synchrotron and thermal bremsstrahlung mechanisms account for the radiation from supernova remnants, such as the Crab Nebula and Vela X; these objects consist of rapidly expanding matter ejected by the supernova explosions. Did radiation from a supernova explosion ionize this huge mass of hydrogen? Four theories propose ways that the Nebula could have been created by energy from the supernova.
All nuclei in the periodic table of the elements, as well as electrons and positrons, are present in the stream of cosmic-ray particles. The cosmic-ray particles constitute the only sample of matter from outside the solar system which reaches the earth. Some of the most accurate knowledge of the extrasolar-element abundance distribution is based on the study of these particles. Observational data concerning the cosmic rays are discussed along with cosmic-ray sources, questions of particle interactions and propagation, the electron spectrum, and the significance of the positron component. The directions of cosmic ray research in the immediate future are also considered, giving attention to some fundamental questions which have not yet been answered.
The subject of gamma-ray astronomy is discussed with emphasis on celestial gamma rays with energies in excess of 10 MeV. Early observations of such gamma rays are reviewed, a gamma-ray spark-chamber telescope is described together with a gas Cerenkov-counter telescope, and the gamma-ray sky is delineated. It is shown that the diffuse high-energy gamma radiation from the galactic plane probably results primarily from cosmic-ray interactions with interstellar matter. Mechanisms for gamma-ray production are identified, and it is noted that the general galactic radiation may prove to be of great value in studies of galactic structure. Possible sources are considered for the diffuse celestial radiation, and discrete sources are described, including the Crab pulsar, the Vela remnant, the Cygnus region, and Gould's Belt. Future developments in gamma-ray astronomy are considered.
Stanley Fischer had a long and distinguished career as an academic economist at MIT, and was Vice President, Development Economics and Chief Economist at the World Bank, before becoming First Deputy Managing Director of the International Monetary Fund in 1994. He is now President of Citigroup International and Vice Chairman of Citigroup. In this interview, Brian Snowdon discusses with Stanley Fischer several important issues relating to the contemporary world economy, including problems of stabilisation, inflation and growth, the economics and politics of transition, exchange rate regimes, the IMF, the East Asian crisis, and globalisation and economic development.
Since the mid-eighties there has been an accumulation of metallic materials whose thermodynamic and transport properties differ significantly from those predicted by Fermi liquid theory. Examples of these so-called non-Fermi liquids include the strange metal phase of high transition temperature cuprates, and heavy fermion systems near a quantum phase transition. We report on a class of non-Fermi liquids discovered using gauge/gravity duality. The low energy behavior of these non-Fermi liquids is shown to be governed by a nontrivial infrared (IR) fixed point which exhibits nonanalytic scaling behavior only in the temporal direction. Within this class we find examples whose single-particle spectral function and transport behavior resemble those of strange metals. In particular, the contribution from the Fermi surface to the conductivity is inversely proportional to the temperature. In our treatment these properties can be understood as being controlled by the scaling dimension of the fermion operator in the emergent IR fixed point. Comment: 26 pages, 9 figures
SONIC BOOMS are explosive sounds that occur without warning. Sometimes annoying because of their startle effects and their ability to shake buildings, these sounds pose a unique problem for the orderly development of high‐speed air transport. Although booms from military aircraft are widely observed around the world, the real concern is for proposed commercial air‐transport operations that will cause repeated booms over very large areas. What are the effects of sonic booms, and what steps can be taken to limit exposure to an acceptable level? How are sonic booms created? What do they do to people and buildings? Will the future be dominated by the boom?
Bose-Einstein condensation (BEC) is a phenomenon that occurs in a macroscopic system of bosons (particles obeying Bose-Einstein statistics) at low temperatures: a nonzero fraction of all the particles in the system (thus a macroscopic number of particles) occupy a single one-particle state. This would, of course, happen for a system of distinguishable, noninteracting particles at zero temperature, but in this case the phenomenon disappears as soon as the temperature becomes comparable to the energy splitting between the single-particle ground state and the first excited state, a quantity which tends to zero with the size of the system. By contrast, in BEC the macroscopic occupation occurs at all temperatures below a transition temperature, usually denoted T_c, which, while a function of intensive parameters such as density and interaction strength, is constant in the thermodynamic limit.
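For reference, in the special case of a uniform, noninteracting Bose gas (an idealization, not a restriction made in the text above), the transition temperature has the standard textbook form, depending only on the intensive particle density n and the particle mass m:

```latex
k_B T_c \;=\; \frac{2\pi\hbar^2}{m}\left(\frac{n}{\zeta(3/2)}\right)^{2/3},
\qquad \zeta(3/2)\approx 2.612 .
```

Since n is intensive, this T_c is indeed unchanged in the thermodynamic limit, as stated above.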
The fundamental reason for the occurrence of BEC lies in the requirement, which follows from considerations of quantum field theory, that the wave function of a system of identical bosons should be symmetric under the exchange of any two particles. This has the consequence that states that differ only by such an exchange must be counted as identical, i.e. counted only once. Thus, for example, while for a system of N distinguishable objects to be partitioned between two boxes the number of ways of putting M of them into one box is given by the familiar binomial formula N!/(M!(N − M)!), for bosons there is exactly one way for each M. The effect is to remove the “entropic” factor, which for distinguishable objects militates against putting a large fraction of them in a single one-particle state.
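The counting argument above is easy to check directly. A minimal sketch (the value N = 10 is an arbitrary illustrative choice):

```python
from math import comb

N = 10  # number of particles; illustrative choice

# Distinguishable objects: the binomial factor N!/(M!(N-M)!) counts the ways
# to put M of the N objects into one of the two boxes
distinguishable = [comb(N, M) for M in range(N + 1)]

# Identical bosons: exactly one symmetrized state for each occupation M
bosonic = [1] * (N + 1)

# The binomial factor peaks at M = N/2, the "entropic" bias against putting
# a large fraction of distinguishable objects in one box; for bosons every
# occupation is weighted equally
most_likely_M = distinguishable.index(max(distinguishable))
```

The same absence of the entropic factor is what permits macroscopic occupation of one state in BEC.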
Data from NASA's Earth Radiation Budget (ERB) Experiment instruments carried by the Earth Radiation Budget Satellite and by NOAA-9 and -10 are evaluated, with a view to clarifying the role played by clouds in the global radiation energy balance. While an individual water droplet scatters 85 percent of incident energy in the forward direction, a cloud of such drops can scatter 75 percent or more of the energy backward. The resulting enhancement of surface-atmosphere albedo reduces the solar radiation absorbed by the atmospheric column. Clouds also significantly enhance the long-wave opacity of the atmosphere; like gaseous absorption, this reduces the radiation emitted to space.
Your opponent's serve was almost perfect, but you vigorously returned it beyond his outstretched racquet to win the point. Now the tennis ball sits wedged in the chain-link fence around the court. What happened to the ball's kinetic energy? It has gone to heat the fence, of course, and you realize that if the fence were much colder, you might be able to measure that heat and determine just how energetic your swing really was. Calorimetry has been a standard measurement technique since James Joule and Julius von Mayer independently concluded, about 150 years ago, that heat is a form of energy. But only in the past 15 years or so has calorimetry been applied, at millikelvin temperatures, to the measurement of the energy of individual photons and particles with exquisite sensitivity. In this article, we have tried to show that continuing research in low-temperature physics leads to a greater understanding of high-temperature astrophysics. Adaptations of the resulting spectrometers will be useful tools for fields of research beyond astrophysics.
We study the statistics of citations from all Physical Review journals for the 110-year period from 1893 until 2003. In addition to characterizing the citation distribution and identifying publications with the highest citation impact, we investigate how citations evolve with time. There is a positive correlation between the number of citations to a paper and the average age of citations. Citations from a publication have an exponentially decaying age distribution; that is, a paper tends to cite recent literature. In contrast, the citations to a publication are consistent with a power-law age distribution, with an exponent close to -1 over a time range of 2 -- 20 years. We also identify a number of strongly-correlated citation bursts and other dramatic features in the time history of citations to individual publications.
In 1963 Edward Lorenz revealed deterministic predictability to be an illusion
and gave birth to a field that still thrives. This Feature Article discusses
Lorenz's discovery and developments that followed from it.
The vast majority of compounds crystallize into a regular form in which a unit cell is repeated indefinitely, except for generally localized defects, impurities and boundaries. In a few compounds, however, at sufficiently low temperatures interactions between electrons and ions across unit cells make this regular array unstable with respect to small distortions. The stable state is one in which the charge density, the spin density, or the ion positions display long‐period modulations. The period of these modulations may be incommensurate with the spacing of the underlying lattice, so that the material is no longer truly periodic, having two unrelated periods. In this article we shall focus on charge‐density waves, in which the electron density and also the ion positions exhibit a periodic variation. At low temperatures some crystals undergo a phase transition to a state in which the electron density displays periodic modulations incommensurate with the crystal lattice.
For thousands of years, code-makers and code-breakers have been competing for supremacy. Their arsenals may soon include a powerful new weapon: quantum mechanics. We give an overview of quantum cryptology as of November 2000. Comment: 14 pages, 4 figures. Originally appeared in Physics Today: <http://www.physicstoday.org/pt/vol-53/iss-11/p22.html>. This article may be downloaded for personal use only. Any other use requires prior permission of both the author and the American Institute of Physics
Superconducting circuits, which can test quantum mechanics at macroscopic scales and can be used to conduct atomic-physics experiments on a silicon chip, are discussed. Josephson junctions, which consist of superconducting grains or electrodes separated by an insulating oxide, act like nonlinear inductors in a circuit. Superconducting circuits experience significant levels of noise due to their coupling to the environment. They nevertheless provide researchers with tools to test fundamental quantum mechanics in several ways.
Quantum mechanics works exceedingly well in all practical applications. No example of conflict between its predictions and experiment is known. Without quantum physics we could not explain the behavior of solids, the structure and function of DNA, the color of the stars, the action of lasers or the properties of superfluids. Yet well over half a century after its inception, the debate about the relation of quantum mechanics to the familiar physical world continues. How can a theory that can account with precision for everything we can measure still be deemed lacking? The environment surrounding a quantum system can, in effect, monitor some of the system's observables. As a result, the eigenstates of those observables continuously decohere and can behave like classical states.
An account is given of the physical processes governing the formation of stratospheric particles, in order to clarify the interactions between polar stratospheric clouds and the Antarctic ozone-destruction mechanism. Attention is given to the successive stages of particle nucleation, condensation/evaporation and sedimentation/coagulation phenomena, and the ways in which polar stratospheric clouds are observed. Considerable evidence exists that polar stratospheric cloud particles are composed of nitric acid. The relatively small ozone depletion in the Arctic is attributed to the much shorter lifetime of Arctic stratospheric clouds.
This is an introduction to conformal invariance and two-dimensional critical phenomena for graduate students and condensed-matter physicists. After explaining the algebraic foundations of conformal invariance, numerical methods and their application to the Ising, Potts, Ashkin-Teller and XY models, tricritical behaviour, the Yang-Lee singularity and the XXZ chain are presented. Finite-size scaling techniques and their conformal extensions are treated in detail. The vicinity of the critical point is studied using the exact $S$-matrix approach, the truncation method, the thermodynamic Bethe ansatz and asymptotic finite-size scaling functions. The integrability of the two-dimensional Ising model in a magnetic field is also dealt with. Finally, the extension of conformal invariance to surface critical phenomena is described and an outlook towards possible applications in critical dynamics is given. Comment: xvi + 260 pages (here only the first xvi are given), Latex, UGVA-DPT 1992/11-794
Attention is given to experimental and theoretical approaches to plasma physics, plasma phenomena in laboratory and space, field and particle aspects of plasmas, the present state of the classical theory, boundary conditions and circuit dependence, and cosmology. Electric currents in space plasmas are considered, taking into account dualism in physics, particle-related phenomena in plasma physics, magnetic field lines, filaments, local plasma properties and the circuit, electric double layers, field-aligned currents as 'cables', an expanding circuit, different types of plasma regions, the cellular structure of space, and the fine structure of active plasma regions. Other topics discussed are related to circuits, the theory of cosmic plasmas, the origin of the solar system, the coexistence of matter and antimatter, annihilation as a source of energy, the Hubble expansion in a Euclidean space, and a model for the evolution of the Metagalaxy.
Ideas of statistical physics are very relevant for cosmic structures, especially considering that the field is undergoing a period of exceptional development, with many new data appearing on a monthly basis. In past years we have focused mostly on galaxy distributions and their statistical properties. This led to an interesting debate, which will be resolved by the next generation of data in a couple of years. In addition, we discuss here the statistical properties of the fluctuations of the cosmic microwave background, which are small in amplitude but complex in structure. We finally discuss the connection between these observations and the Harrison-Zeldovich spectrum, and its further implications for theories of structure formation and for cosmological $N$-body simulations.
A review of the historical development of solar cosmic ray research is presented and details concerning the solar atmosphere, the interplanetary space, and solar activity are considered, giving attention to solar-atmosphere structure, problems of radiative transfer, questions of solar magnetism, solar wind, and interplanetary plasmas. Solar flares and associated phenomena are discussed along with the generation of solar cosmic ray events, the mechanism of solar flares, the acceleration process of solar cosmic rays, the propagation of solar cosmic rays, and relations between the flow of energetic protons and solar active regions. Questions regarding the origin theory of cosmic rays are also explored, taking into account the solar origin theory and problems of flare stars.
SOME OF THE MOST exciting recent developments in physics have been in the realms of the very small and the very large. Chief among the latter was the discovery of “quasars,” which has quadrupled our depth penetration of the universe (if quasars are where most astronomers think they are), and which has also stimulated the possibly irrelevant but intrinsically fruitful investigations into gravitational collapse. Then came the discovery of the 3°K background radiation, which seems to have taken us back a long way in time to an apparent “big‐bang” origin of the universe. Both bode ill for the steady‐state theory to which many of us had been attracted. Most recently, Robert H. Dicke's speculations and observations on the oblate sun have battered at the edifice of general relativity, and thus at the very foundations of modern cosmology. Even advances in the realm of the very small affect our knowledge of the very large; for, after all, most of the phenomena we see in the sky are manifestations of either gravitational or nuclear activity.
Questions asked by cosmologists today concern the large‐scale structure of the universe. How do intergalactic distances expand? Is the universe isotropic and homogeneous? Flat or curved? Finite or infinite? Did it all start with a “big bang?”
This is the LaTeX version of my book "Particle Physics and Inflationary Cosmology'' (Harwood, Chur, Switzerland, 1990). I decided to put it to hep-th, to make it easily available. Many things happened during the 15 years since the time when it was written. In particular, we have learned a lot about the high temperature behavior in the electroweak theory and about baryogenesis. A discovery of the acceleration of the universe has changed the way we are thinking about the problem of the vacuum energy: Instead of trying to explain why it is zero, we are trying to understand why it is anomalously small. Recent cosmological observations have shown that the universe is flat, or almost exactly flat, and confirmed many other predictions of inflationary theory. Many new versions of this theory have been developed, including hybrid inflation and inflationary models based on string theory. There was substantial progress in the theory of reheating of the universe after inflation, and in the theory of eternal inflation. It's clear, therefore, that some parts of the book should be updated, which I might do sometime in the future. I hope, however, that this book may be of some interest even in its original form. I am using it in my lectures on inflationary cosmology at Stanford, supplementing it with the discussion of the subjects mentioned above. I would suggest reading this book in parallel with the book by Liddle and Lyth "Cosmological Inflation and Large Scale Structure,'' with the book by Mukhanov "Physical Foundations of Cosmology,'' to be published soon, and with my review article hep-th/0503195, which contains a discussion of some (but certainly not all) of the recent developments in inflationary theory. Comment: 270 pages, 35 figures
This is a review of the basic theoretical ideas of quantum criticality, and
of their connection to numerous experiments on correlated electron compounds. A
shortened, modified, and edited version appeared in Physics Today. This arXiv
version has additional citations to the literature.
The time lag between the publication of a Nobel discovery and the conferment
of the prize has been rapidly increasing for all disciplines, especially for
Physics. Does this mean that fundamental science is running out of fundamental discoveries?
Data obtained from the Viking missions to Mars have led to comparisons between characteristics of that planet and certain aspects of earth. Topics discussed include: thermal and compositional evolution of the planets' interiors, the origin of their atmospheres, the history of their fluid environments, the evolution of organic material, and climatic changes. Examples of convective and magmatic activity on the crusts of the two planets are contrasted; outgassing evidenced in the crustal zones is related to the evolution of atmosphere. Analysis of experimental degassing of meteorites provides further insight into the origin and composition of planetary atmospheres. Attributes of the Martian surface, which is apparently an oxidizing rather than reducing medium, are reviewed, and explanations proposed to account for its lack of an organic richness comparable to that found on earth. Data on long-range climatic changes on earth, related to precession, obliquity, and eccentricity of the planet, may contribute to understanding of the Martian polar caps, and vice versa.
New approaches to economics, a broadening of the scope of physics, and the emergence of the new field of econophysics, are discussed. An emerging body of work by physicists is addressing questions of economic organization and function, and many books are being written on econophysics in general or on specific subtopics. Physics departments worldwide are granting PhD theses for research in economics, and in Europe several professors are specializing in econophysics. Some physics and economics departments are planning to design a basic course teaching the essential elements of both physics and economics.
This paper is a non-technical, informal presentation of our theory of the second law of thermodynamics as a law that is independent of statistical mechanics and that is derivable solely from certain simple assumptions about adiabatic processes for macroscopic systems. It is not necessary to assume a priori concepts such as "heat", "hot and cold", "temperature". These are derivable from entropy, whose existence we derive from the basic assumptions. See cond-mat/9708200 and math-ph/9805005. Comment: LaTeX file. To appear in the April 2000 issue of PHYSICS TODAY
NASA's Small Explorer program, or 'SMEX', was designed to support disciplines heretofore accommodated within NASA's astrophysics/space physics/upper-atmosphere science Explorer program. Under the aegis of SMEX, a principal investigator proposes an entire mission and its experiments; emphasis is placed on the passing of spacecraft-design experience to a new generation of scientists and engineers. The first SMEX mission selected for implementation is the Solar Anomalous and Magnetospheric Particle Explorer, which is scheduled for launch in June 1992.
An overview of the radio-astronomy field is given, and projects ready for construction are presented. A very-long-baseline array consisting of ten 25-m antennas, with a limiting wavelength of 7 mm and an angular resolution at that wavelength of 2 x 10^-4 arcsec, is discussed. Eighty percent of the phase information will be obtained by closure around the 36 independent triangles, and high-quality aperture-synthesis maps will be produced at all wavelengths. The 25-m telescope will be capable of several applications, including the discovery of new molecules in our galaxy (in particular, in the envelope of the evolved carbon star IRC +10216), the detection of CO to distances of perhaps 100 million light years, and the understanding of the events which occur as stars are formed from molecular clouds and as energy is fed back into the molecular gas by new stars. The submillimeter-wave telescope will exploit the last atmospheric radio windows in which astronomical observations can be made from the earth's surface. The need for funding is stressed.
Iron-based superconductors were discovered seven years ago, in 2008. This
short review summarizes what we learned about these materials over the last
seven years, what the open questions are, and what new physics we expect to extract
from studies of this new class of high-temperature superconductors.
The Fermi-Pasta-Ulam (FPU) problem, first written up in a Los Alamos report in May 1955, marked the beginning of both a new field, nonlinear physics, and the age of computer simulations of scientific problems. The idea was to simulate the one-dimensional analogue of atoms in a crystal: a long chain of masses linked by springs that obey Hooke's law (a linear interaction), but with a weak nonlinear term. A purely linear interaction would ensure that energy introduced into a single Fourier vibrational mode always remains in that mode; the nonlinear term allows the transfer of energy between modes. Under certain conditions, the weakly nonlinear system exhibits surprising behavior: The energy does not drift toward the equipartition predicted by statistical physics but periodically returns to the original mode. That highly remarkable result, known as the FPU paradox, shows that nonlinearity is not enough to guarantee the equipartition of energy.
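The setup described above, a chain of unit masses linked by Hooke's-law springs with a weak nonlinear correction, with the energy tracked mode by mode, can be sketched in a few lines. The parameter values, the quadratic (alpha-type) form of the nonlinearity, and the velocity-Verlet integrator are illustrative assumptions, not details taken from the 1955 report.

```python
import numpy as np

def fpu_alpha(N=32, alpha=0.25, dt=0.05, steps=4000):
    """Integrate an FPU-alpha chain with fixed ends via velocity Verlet.

    Returns the time history of the harmonic energy in each Fourier mode,
    with all the energy initially placed in the lowest mode.
    """
    i = np.arange(1, N + 1)
    k = np.arange(1, N + 1)
    # Orthonormal sine transform for fixed boundary conditions
    S = np.sqrt(2.0 / (N + 1)) * np.sin(np.outer(i, k) * np.pi / (N + 1))
    omega = 2.0 * np.sin(k * np.pi / (2 * (N + 1)))  # normal-mode frequencies

    u = np.sin(i * np.pi / (N + 1))  # start in the lowest mode
    v = np.zeros(N)

    def accel(u):
        up = np.concatenate(([0.0], u, [0.0]))   # fixed ends
        d = np.diff(up)                          # spring extensions
        lin = d[1:] - d[:-1]                     # linear (Hooke's-law) force
        nonlin = alpha * (d[1:]**2 - d[:-1]**2)  # weak quadratic nonlinearity
        return lin + nonlin

    a = accel(u)
    mode_energy = []
    for _ in range(steps):
        u = u + v * dt + 0.5 * a * dt**2         # velocity-Verlet step
        a_new = accel(u)
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
        q, p = S.T @ u, S.T @ v                  # mode amplitudes and velocities
        mode_energy.append(0.5 * (p**2 + (omega * q)**2))
    return np.array(mode_energy)
```

Plotting the first few columns of the returned array against time is the standard way to see the near-recurrences of the initial mode that constitute the FPU paradox; with alpha = 0 the energy never leaves mode 1 at all.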
This respected high-level text is aimed at students and professionals working on random processes in various areas, including physics and finance. The first author, Melvin Lax (1922-2002) was a distinguished Professor of Physics at City College of New York and a member of the U. S. National Academy of Sciences, and is widely known for his contributions to our understanding of random processes in physics. Most chapters of this book are outcomes of the class notes which Lax taught at the City University of New York from 1985 to 2001. The material is unique as it presents the theoretical framework of Lax's treatment of random processes, from basic probability theory to Fokker-Planck and Langevin Processes, and includes diverse applications, such as explanations of very narrow laser width, analytical solutions of the elastic Boltzmann transport equation, and a critical viewpoint of mathematics currently used in the world of finance. Available in OSO: http://www.oxfordscholarship.com/oso/public/content/physics/9780198567769/toc.html
The theory of neutron and gamma-ray production in flares is reviewed and comparisons of the calculations with data are made. The principal conclusions pertain to the accelerated proton and electron numbers and spectra in flares and to the interaction site of these particles in the solar atmosphere. For the June 21, 1980 flare, from which high-energy neutrons and high-energy (MeV) photons were seen, the electron-to-proton ratio is energy dependent and much smaller than unity at energies greater than 1 MeV. The interaction site of these particles appears to be the solar chromosphere.