Computer scientists are working with biomedical researchers, policy specialists, and medical practitioners to usher in a new era in healthcare. A recently convened panel of experts considered various research opportunities for technology-mediated social participation in Health 2.0.
Numerous application areas, including bioinformatics and computational biology, demand increasing amounts of processing capability. In many cases, the computation cores and data types are well suited to field-programmable gate arrays (FPGAs). The challenge is identifying the design techniques that can extract high performance from the FPGA fabric.
Translational research projects target a wide variety of diseases, test many different kinds of biomedical hypotheses, and employ a large assortment of experimental methodologies. Diverse data, complex execution environments, and demanding security and reliability requirements make the implementation of these projects extremely challenging and require novel e-Science technologies.
While the potential benefits of smart home technology are widely recognized, a lightweight design is needed for the benefits to be realized at a large scale. We introduce the CASAS "smart home in a box", a lightweight smart home design that is easy to install and provides smart home capabilities out of the box with no customization or training. We discuss types of data analysis that have been performed by the CASAS group and can be pursued in the future by using this approach to designing and implementing smart home technologies.
The Performance Evaluation Subcommittee of the Technical Committee on Computer Architecture sponsored a workshop at Argonne National Laboratory in October 1971. This issue is in part a result of that workshop.
A team of Caltech researchers has recently demonstrated the world's first integrated optoelectronic circuit: a combination of optical (laser detectors) and electronic (transistors) elements "grown" monolithically on a single layered crystal.
Scientists at IBM Research have used a computer to transcribe speech, composed of sentences drawn from a 1000-word vocabulary and read at a normal speaking pace, into printed form with what is believed to be the best accuracy yet obtained under complex experimental conditions: 91 percent.
Using voice synthesis, the IBM Audio Typing Unit simulates actual speech, offering audio functions to assist a blind typist. Consisting of an audio keypad and console, the unit can be attached to the IBM Mag Card II, Mag Card/A, Memory, or Memory 100 host typewriter.
Methods for designing new computer systems have changed rapidly.
Consider general purpose microprocessors: gone are the days when one or
two expert architects would use hunches, experience, and rules of thumb
to determine a processor's features. Marketplace competition has long
since forced companies to replace this ad hoc process with a targeted
and highly systematic process that focuses new designs on specific
workloads. Although the process differs from company to company, there
are common elements. The main advantage of a systematic process is that
it produces a finely tuned design targeted at a particular market. At
its core are models of the processor's performance and its workloads.
Developing and verifying these models is the domain now called
performance analysis. We cover some of the advances in dealing with
modern problems in performance analysis. Our focus is on architectural
performance, typically measured in cycles per instruction (CPI).
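As a rough illustration of the metric just mentioned, the sketch below computes CPI as total execution cycles divided by instructions retired. It is a minimal Python sketch; the workload names and counts are made up for illustration, not drawn from any real design study.

    # Minimal sketch of the cycles-per-instruction (CPI) metric:
    # CPI = total execution cycles / instructions retired.
    def cpi(cycles: int, instructions: int) -> float:
        return cycles / instructions

    # Hypothetical per-workload measurements (made-up numbers).
    workloads = {
        "integer": (1_200_000_000, 900_000_000),
        "floating_point": (2_500_000_000, 1_100_000_000),
    }
    for name, (cycles, instructions) in workloads.items():
        print(f"{name}: CPI = {cpi(cycles, instructions):.2f}")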
University of Washington scientists are using high-speed electrostatic printer/plotters for on-the-spot analysis of oceanographic data. The Gould 5000 units are part of the $3 million computer system on the university research ship Thomas G. Thompson.
Over 5,000 blind people in the Boston area have a new friend in a talking computer system that allows them to type letter-perfect correspondence, proofread manuscripts, calculate bookkeeping problems, and write computer programs.
The Australian Bureau of Animal Health has begun developing a national animal disease information system, which the bureau says is the largest of its kind in the world, based on 20 Data General Nova 3 minicomputers. The system will be a key factor in the eradication of brucellosis, a long-persisting bacterial disease in cattle, to protect Australia's international markets for beef and dairy products. The minicomputers are presently being installed at veterinary diagnostic laboratories throughout Australia.
The paper discusses the properties of a software component and the use of object-oriented techniques. It considers the varieties of components in terms of four viewpoints: level of software process task, level of abstraction, level of execution, and level of accessibility.
After 20 years in academia and Silicon Valley, the new Provost of Stanford University calls for a shift in focus for systems research. Performance, long the centerpiece, needs to share the spotlight with availability, maintainability, and other qualities. Although performance increases over the past 15 years have been truly amazing, it will be hard to continue these trends by sticking to the basically evolutionary path that the research community is currently on. The author advocates a more revolutionary approach to systems problems and thinks the approach needs to be more integrated: researchers need to think about hardware and software as a continuum, not as separate parts. He sees society on the threshold of a “post-PC” era, in which computing is ubiquitous and information services are everyday utilities. When everyone starts using these systems, guess what? They expect them to work and to be easy to use. So this era will drive system design toward greater availability, maintainability, and scalability, which will require a real refocusing of current research.
The IEEE Computer Society Computer Elements Technical Committee has been tracking the progress of new technology for many years. The December 1974 workshop in Phoenix once again focused on the impact of semiconductor LSI, but papers and discussions ranged from social problems of the world to that last uncertain electron tipping a flip-flop out of its metastable state. The various factors in the success of emerging technologies were identified, with technical novelty far down the list: no surprise to the older committee members, but often a disappointment to the young innovator.
The Kurzweil Reading Machine converts print to speech and is designed as a reading tool for the blind and visually handicapped. The system handles ordinary printed material (books, letters, reports, memoranda, etc.) in most common styles and sizes of type. The output produced is a synthetic voice using full-word English speech. The reader operates the device by placing printed material face down on the glass plate forming the top surface of the scanning unit; he then presses the "page" button on the control panel and listens to the synthetic speech produced as an electronic camera scans the page and transmits its image to a minicomputer housed within the device. The computer separates the image into discrete character forms, recognizes the letters, groups the letters into words, computes the pronunciation of each word, and then produces the speech sounds associated with each phoneme. The machine operates at normal speech rates, about 150 words per minute.
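The paragraph above describes a five-stage pipeline: character segmentation, letter recognition, word grouping, pronunciation, and phoneme-by-phoneme synthesis. The Python sketch below mirrors those stages in order; every function is a hypothetical stand-in for the machine's internal processing, not its actual code, and a plain string stands in for the scanned page image.

    # Toy pipeline mirroring the stages described above. All helpers are
    # hypothetical stand-ins; a string stands in for the scanned page image.
    def segment_characters(page):
        return list(page)                      # isolate discrete character forms

    def recognize_letter(char_form):
        return char_form                       # classify each form as a letter

    def group_into_words(letters):
        return "".join(letters).split()        # group recognized letters into words

    def pronounce(word):
        return list(word)                      # stand-in for computing a phoneme sequence

    def speak(phoneme):
        print(phoneme, end=" ")                # stand-in for producing the speech sound

    def read_page_aloud(page):
        letters = [recognize_letter(c) for c in segment_characters(page)]
        for word in group_into_words(letters):
            for phoneme in pronounce(word):
                speak(phoneme)
            print()

    read_page_aloud("The quick brown fox")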
For pt. I, see ibid., vol. 4, pp. 64-70 (1996). As microprocessor
advances begin to level off, communication network deployment will keep
accelerating, and software engineering must face the prospect of radical
change if it is to keep pace. The paper considers how the intersection
of these ascending and descending technologies will propel the high-tech
world into a new model of computing by the year 2012. Software is the
steam that drives the engines of the Information Age, but clearly it is
not keeping up with developments on the hardware side. Historical trends
suggest that further progress in programmer productivity and
programming-language power over the next 16 (10,000 in base 2) years is
highly unlikely. With a large percentage of programmers maintaining
legacy code, the resources available for innovation are limited. In
fact, software innovation will have to come from a five percent fringe
of artisans and nontraditional thinkers outside the current programming
language and software engineering establishment.
Forecasts technological breakdowns and breakthroughs for the next
16 (10,000 to the base 2) years. Change has always been a part of recent
history. Indeed, Earth-shaking change occurs about every 150-200 years.
It takes about 50 years to make the transition from the old to the new,
and we are nearing the end of just such a 50-year period. Change is
caused by both technological breakthroughs and technological breakdowns.
In the current 50-year transition, the breakthrough is in networking and
software development, and the breakdown is in processor (VLSI)
technology. Both forces will propel the high-tech world into a new model
of computing by the year 2012. The new model will be based on a
networked, global megacomputer that obeys the Gustafson-Barsis speedup
law instead of the Amdahl law of parallelism. The next century's
information superhighway will actually be a network of cable TV
operators, not telephone companies. A new era of programming that
eliminates traditional programming languages (and scolds the software
engineering community for failure) will arise and lead to a software
economy, an electronic commerce dominated by software artisans.
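The two parallelism laws named above are what make the megacomputer forecast plausible. As a hedged sketch (the 95 percent parallel fraction and 1,000-processor count are arbitrary examples, not figures from the article), Amdahl's law bounds fixed-size speedup by the serial fraction of the work, while the Gustafson-Barsis law lets speedup grow with machine size because the problem scales along with it:

    # Fixed-size (Amdahl) versus scaled (Gustafson-Barsis) speedup.
    # p is the parallelizable fraction of the work, n the processor count.
    def amdahl_speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    def gustafson_barsis_speedup(p: float, n: int) -> float:
        return (1.0 - p) + p * n

    # Example: 95% parallel work on a 1,000-processor machine.
    print(amdahl_speedup(0.95, 1000))            # ~19.6: capped by the 5% serial part
    print(gustafson_barsis_speedup(0.95, 1000))  # 950.05: grows with machine size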
The Smart Dust project is probing microfabrication technology's
limitations to determine whether an autonomous sensing, computing, and
communication system can be packed into a cubic millimeter mote (a small
particle or speck) to form the basis of integrated, massively
distributed sensor networks. Although we've chosen a somewhat arbitrary
size for our sensor systems, exploring microfabrication technology's
limitations is our fundamental goal. Because of its discrete size,
substantial functionality, connectivity, and anticipated low cost, Smart
Dust will facilitate innovative methods of interacting with the
environment, providing more information from more places less intrusively.