November 2015
·
15 Reads
·
4 Citations
American Scientist
September 2015
·
24 Reads
·
2 Citations
American Scientist
Probabilistic algorithms, which make random choices at various points in their execution, have long been essential tools in simulation, optimization, cryptography, number theory, and statistics. Monte Carlo simulations and other probabilistic models can be written in any programming language that offers access to a pseudorandom number generator. The idea of a probabilistic programming language (PPL) goes further: in a language of this kind, random variables and probability distributions are first-class citizens, with the same rights and privileges as other data types. What a PPL offers is an environment where probabilistic concepts can be expressed naturally and concisely, and where procedures for computing with probabilities are built into the infrastructure of the language. Some of the algorithms embedded in PPLs may also be embedded in people.
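The flavor of a PPL can be conveyed even in ordinary code. Below is a minimal sketch (an illustration, not code from the article, and not a real PPL): a hand-rolled rejection sampler in Python for a toy coin-bias model. In a genuine PPL the random variables would be first-class and the inference machinery would be supplied by the language.

```python
# Minimal sketch (illustration only): the flavor of probabilistic programming,
# approximated with a hand-rolled rejection sampler in plain Python.
import random

def model():
    # Prior: the coin's bias is uniform on [0, 1].
    bias = random.random()
    # Likelihood: simulate 10 flips with that bias.
    flips = [random.random() < bias for _ in range(10)]
    return bias, flips

def infer(observed_heads, samples=100_000):
    # Rejection sampling: keep only runs that reproduce the observation.
    accepted = []
    for _ in range(samples):
        bias, flips = model()
        if sum(flips) == observed_heads:
            accepted.append(bias)
    return sum(accepted) / len(accepted)  # posterior mean of the bias

print(infer(observed_heads=8))  # roughly 0.75 for 8 heads in 10 flips
```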
July 2015
·
4 Reads
American Scientist
May 2015
·
36 Reads
·
1 Citation
American Scientist
March 2015
·
24 Reads
·
2 Citations
American Scientist
A full-scale computer simulation of the galaxy must trace the motions of at least 100 billion stars and other objects over several billion years; the challenge is to solve a 100-billion-body problem. Computational astronomers are preparing to simulate the motions of all the stars in a galaxy the size and shape of the Milky Way, tracing their trajectories over a period of several billion years. A Dutch-Japanese collaboration led by Simon Portegies Zwart of the Leiden Observatory has completed a pilot study of 51 billion stars.
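To see why the problem is hard, consider the naive approach. The sketch below (an illustration, not the collaboration's code) computes direct-summation gravity in Python; the pairwise force calculation costs O(N²) operations per time step, which is why simulations at this scale rely on hierarchical approximations such as tree codes rather than brute force.

```python
# Minimal sketch (illustration only): direct-summation N-body gravity,
# whose O(N^2) pairwise force loop makes a 100-billion-body problem
# intractable without approximation schemes.
import numpy as np

def accelerations(pos, mass, G=1.0, soft=1e-3):
    """Gravitational acceleration on each body.
    pos: (N, 3) positions; mass: (N,) masses."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]                        # vectors to every other body
        r2 = (d ** 2).sum(axis=1) + soft ** 2   # softened squared distances
        r2[i] = np.inf                          # skip self-interaction
        acc[i] = G * (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return acc

# One leapfrog step for N random bodies; the cost grows as N^2.
N = 1000
pos = np.random.randn(N, 3)
vel = np.zeros((N, 3))
mass = np.ones(N) / N
dt = 0.01
vel += 0.5 * dt * accelerations(pos, mass)
pos += dt * vel
```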
January 2015
·
24 Reads
·
6 Citations
American Scientist
Three communities in the world of computation (computer science, computational science, and software development) are bound together by common interests but set apart by distinctly different aims and agendas. The object of study in computer science is the computing process itself, detached from any particular hardware or software. In computational science, the computer is not an object of study but a scientific instrument, a device for answering questions about the natural world. Software development is all about making things, not answering questions. According to the author, Brian Hayes, computing has been a fragmented field. As the push to teach more people to code grows, an increasing number of people will acquire programming skills, but there is no unified approach to programming, so the field is likely to continue to fragment as these new coders take the discipline in their own directions. Hayes notes that several attempts have been made over the years to create uniform, professionalized standards and certifications for programming, but they have usually faltered. To him, ongoing fragmentation seems unwise and unhealthy.
September 2014
·
14 Reads
·
1 Citation
American Scientist
William Shanks was one of the finest computers of the Victorian era. His specialty was mathematical constants, and his most ambitious project was a record-setting computation of pi. Biographical details about Shanks are hard to come by. Pencil-and-paper computation was a skill more highly prized in the 19th century than it is today. In Shanks's own words: "Towards the close of the year 1850, the Author first formed the design of rectifying the Circle to upwards of 300 places of decimals. He was fully aware, at that time, that the accomplishment of his purpose would add little or nothing to his fame as a Mathematician, though it might as a Computer." Redoing the calculation in a modern programming language is straightforward; the only hidden subtlety is that the numeric variables must be able to accommodate numbers of arbitrary size and precision. Shanks does not reveal much about his computational methods.
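Shanks worked from Machin's arctangent formula, pi/4 = 4 arctan(1/5) - arctan(1/239). The sketch below (an illustration, not a reconstruction of Shanks's hand procedure) redoes the computation in Python, whose built-in arbitrary-precision integers take care of the hidden subtlety mentioned above.

```python
# Minimal sketch (illustration only): pi by Machin's formula with Python's
# arbitrary-precision integers; all values are scaled by a power of ten.
def arctan_inv(x, digits):
    """arctan(1/x) scaled by 10**(digits + 10), via the Gregory series."""
    scale = 10 ** (digits + 10)          # ten guard digits against rounding
    term = scale // x
    total, n, sign = term, 1, 1
    while term:
        n += 2
        sign = -sign
        term = scale // (x ** n)
        total += sign * term // n
    return total

def machin_pi(digits):
    # pi = 4 * (4*arctan(1/5) - arctan(1/239)), then drop the guard digits.
    pi_scaled = 4 * (4 * arctan_inv(5, digits) - arctan_inv(239, digits))
    return pi_scaled // 10 ** 10

print(machin_pi(50))   # 3141592653589793... (decimal point omitted)
```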
July 2014
·
20 Reads
American Scientist
The literary scholar needs a quiet room, a reading lamp, a notebook, a receptive mind, and algorithms for n-gram analysis, part-of-speech tagging, word-sense disambiguation, and sentence parsing. The idea of applying mathematical and computational tools to literature is hardly new; two early practitioners illustrate the point. One, Mendenhall, was a man of science who made a few brief forays into statistical language studies. The other, Sherman, was a professor of English literature who yearned to import scientific methods into his field. The first half of Sherman's Analytics of Literature is a fairly conventional introduction to rhetoric and poetics, with chapters on meter and rhyme, figures of speech, the emotional force of words, and a lot of close reading. Then Sherman suddenly goes all quantitative, launching into a discussion of sentence length. Sherman was motivated by broader questions than the authorship puzzles that concerned Mendenhall. While teaching the historical development of English literature, Sherman took note of pervasive changes in sentence structure.
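The kinds of counts involved are easy to mechanize today. The sketch below (an illustration, not code from the column) computes the sentence-length statistic that Sherman tallied by hand, along with a simple word-bigram tally of the sort an n-gram analysis begins with.

```python
# Minimal sketch (illustration only): sentence-length statistics and a
# word-bigram count, two of the simplest tools of computational stylometry.
import re
from collections import Counter

def sentence_lengths(text):
    # Split on ., !, ? and count the words in each sentence.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return [len(s.split()) for s in sentences]

def bigrams(text):
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(zip(words, words[1:]))

sample = ("Call me Ishmael. Some years ago, never mind how long precisely, "
          "I thought I would sail about a little and see the watery part of the world.")
lengths = sentence_lengths(sample)
print(sum(lengths) / len(lengths))      # average sentence length in words
print(bigrams(sample).most_common(3))   # the three most frequent word pairs
```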
May 2014
·
41 Reads
·
1 Citation
American Scientist
Endowing a computer with human perceptual skills, such as understanding spoken language or recognizing faces, has been on the agenda of computer science since the era of vacuum tubes and punch cards. The training procedure for the simple motif-detecting network described here is a form of error correction. If the network gives the correct answer for an image, do nothing. If the system makes the wrong choice, there must be at least one neuron in the input layer that responded incorrectly, either accepting a motif it should have rejected or vice versa. Find all such errant neurons and instruct them to reverse their classification of the current motif. A network trained in this way can recognize any pattern of vertical stripes, regardless of the stripes' width; it is a wallpaper sensor. Other subsets of the 16 motifs yield networks triggered by horizontal stripes or by diagonals. The ability to detect stripes in various orientations is intriguing, in that the primary visual cortex of the mammalian brain is full of stripe-sensitive neurons.
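The error-correction rule described above can be made concrete in a few lines. The sketch below is a deliberately simplified toy (an assumption of mine, not the article's actual network): each "neuron" is a lookup-table entry over the 16 possible 2x2 binary motifs, and any entry that votes incorrectly flips its classification of the offending motif.

```python
# Toy version (illustration only, not the article's network) of the
# flip-on-error rule: a lookup table over the 16 possible 2x2 binary motifs,
# trained to accept vertical-stripe motifs and reject everything else.
from itertools import product

MOTIFS = list(product((0, 1), repeat=4))    # (top-left, top-right, bottom-left, bottom-right)

def is_vertical_stripe(m):
    # Target concept: each column is uniform (top equals bottom).
    return m[0] == m[2] and m[1] == m[3]

table = {m: False for m in MOTIFS}          # one accept/reject "neuron" per motif

for _ in range(3):                          # a few passes; this toy converges quickly
    for m in MOTIFS:
        if table[m] != is_vertical_stripe(m):
            table[m] = not table[m]         # errant neuron reverses its classification

print(all(table[m] == is_vertical_stripe(m) for m in MOTIFS))  # True
```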
March 2014
·
5 Reads
At first, peas served as particles in Ernesto Altshuler’s experiment. A mechanical dispenser would drop the chícharos one by one into the space between two glass plates, forming a tidy two-dimensional approximation of a sand pile. Lattice structure appeared, then vanished, as the pile self-organized and went critical—avalanche! But Havana’s insects soon found the peas in Altshuler’s physics lab. For a physicist working under the harsh economic conditions of Cuba in the early 1990s, options were few. Yet Altshuler’s solution came as a byproduct of the crisis: Because of fuel shortages, the country had begun importing Chinese bicycles, and ball bearings were available in abundance. The peas were replaced by steel beads, but Altshuler and his students still call their machine the chícharotron.
... (Burningham, 2016) We could call this pan-connectivity different things; several terms have been popularly used to describe group members who use their individual intelligence to further the goals of the group, most notably "swarm intelligence" and "collective intelligence." Such examples are readily found in nature, from the murmuration and initiating movements of bird species and stigmergic behavior in nest-building wasps to movement between hives and colonies in bees and ants (Garnier et al., 2007; Hayes, 2011; Langridge et al., 2008; Ramseyer et al., 2009; Visscher & Seeley, 2007). ...
January 2011
American Scientist
... With Dyson's estimate of n = 137, there would be (2(137) + 1)!! ∼ 10^277 irreducible diagrams. When one considers that for QED vertex diagrams there are seven diagrams at two loops, 72 diagrams at three loops, 891 diagrams at four loops, and 12 672 diagrams at five loops, and that the evaluation of the two-loop diagrams was not completed until 1957 (Kinoshita 2003), the three-loop diagrams took until 1996, and we have not yet completely evaluated the four-loop diagrams (Hayes 2004), there is no immediate worry that we will reach the point at which the S-matrix series begins to diverge. In fact, the Bekenstein Bound of 10^123 bits allowed in the visible universe today shows that it is impossible in principle to evaluate Feynman diagrams to the order required to see the S-matrix sum begin to grow, at least in the current epoch of universal history. ...
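As an aside, the order-of-magnitude claim quoted above is easy to verify; the snippet below (added for illustration, not taken from the cited sources) confirms that (2·137 + 1)!! has about 277 decimal digits.

```python
# Quick check (illustration only): (2*137 + 1)!! is on the order of 10^277.
from math import log10

def double_factorial(n):
    result = 1
    for k in range(n, 1, -2):
        result *= k
    return result

print(round(log10(double_factorial(2 * 137 + 1))))  # prints 277
```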
May 2004
American Scientist
... Markov's findings opened up new horizons for the scientific community: the discovery of the Markov chain changed how researchers interpret probabilistic events [74]. Markov chains have reached widespread use in today's society, even though we might be unaware of it. ...
Reference:
Simple test
March 2013
American Scientist
... Deep Dream is a computer vision program that uses a CNN algorithm [4] to transform images using motifs that the network has learned to recognize, turning mountaintops into birds and landscapes into scenes full of turtle-dogs and other chimeric creatures [5]. Applying DD is essential in some situations: it cannot be ignored in producing images [6] and videos [7], [8], in generating music [9], [10], in NLP, and also in security [11]. ...
Reference:
A Systematic Review of Deep Dream
November 2015
American Scientist
... Writing code is a technical process, but it is generated through social processes and human actions that are embedded in and shaped by historical developments. The production of code reflects a set of sharp historical divisions in the disciplines of computing (between computer science, computational science, and software development) that Brian Hayes (2015) has traced out in a recent analysis of different 'cultures of code.' While computer science is concerned with understanding underlying algorithms, software development is concerned with the production of tangible artefacts, and computational science treats the computer not as an object of study but as a scientific instrument. ...
January 2015
American Scientist
... As illustrated in [13], Alice kept a sum of money in a locked suitcase. She sent the suitcase to Bob and asked him to count the money. ...
September 2012
American Scientist
... The 10,200 parking spots saved result in an additional 0.1 km² of available space. We do not account for parking space for robotaxis, following numerous studies that assume robotaxis would either be stored off the roads in special facilities (AV parks) for maintenance and charging or would be circulating all day (Hayes, 2011; ITF, 2015; Nourinejad et al., 2018; Zhang & Guhathakurta, 2017). Moreover, these garages are not part of the public domain, since they belong to private stakeholders (the robotaxis are operated by private stakeholders; see 5.2.2, scenario 3 robotaxis); hence, they would not affect public infrastructure. ...
September 2011
American Scientist
... However, it is unclear how broad these classes are, and it is in general hard to foresee whether a given physical model would have a sign problem in any QMC simulation. The situation is not dissimilar to the study of many intriguing problems in the NP complexity class, where a seemingly infeasible problem might surprisingly turn out to have a polynomial-time solution [4]. ...
January 2008
American Scientist
... Table 1 shows a list of estimated r̄ for small clusters comprising configurations of up to six nanospheres, for 43 and 145 nm nanospheres, assuming that the nanospheres are in contact [40]. Beyond six nanospheres per cluster, there is a larger number of touching configurations, and even more partially touching configurations, which yield an ever larger distribution of r̄ for clusters with the same particle count; see Figure S3 in the SI. Hence, larger clusters rapidly complicate the analysis and would require vast computational efforts. ...
November 2012
American Scientist
... The May-June 2009 issue of American Scientist (Vol. 97, No. 3) contained, serendipitously, four (possibly five) articles on the fundamental role played by simulation, in its synergetic interactions with theoretical analysis, experiment, computation, prediction, and dynamics, in macroeconomics ([60]), physics ([105]), engineering ([101]), and a 'revisit' to the Limits to Growth report ([55]). In particular, the example of simulation in macroeconomics, by Brian Hayes, is an exemplary exposition of the Phillips Machine, devised and constructed as an electro-mechanical-hydraulic analogue computing machine, encapsulating early Keynesian Monetary Macrodynamics, and capable of interacting with macroeconomic theory and even settling controversial theoretical debates decisively. ...
May 2009
American Scientist