Computer

Published by IEEE (Institute of Electrical and Electronics Engineers)

Online ISSN: 1558-0814 · Print ISSN: 0018-9162

Articles


Figure 1. Research opportunities in developing the national health information infrastructure. Technology-mediated social participation systems have applications within the spheres of personal, clinical, and population health information. 
Figure 2. The Community Health Data Initiative encourages the use of mashups, social networking tools, enhanced search, and other technological innovations to support active community participation based on credible public-health sources. Source: Department of Health and Human Services (HHS). 
Social Participation in Health 2.0

November 2010 · 753 Reads

Thomas Finholt · [...] · John C Thomas
Computer scientists are working with biomedical researchers, policy specialists, and medical practitioners to usher in a new era in healthcare. A recently convened panel of experts considered various research opportunities for technology-mediated social participation in Health 2.0.

Achieving High Performance with FPGA-Based Computing

March 2007 · 232 Reads

Numerous application areas, including bioinformatics and computational biology, demand increasing amounts of processing capability. In many cases, the computation cores and data types are suited to field-programmable gate arrays. The challenge is identifying the design techniques that can extract high performance potential from the FPGA fabric.
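As a rough illustration of the kind of computation that suits FPGA fabric, the Python sketch below emulates a bit-parallel DNA base-comparison kernel of the sort common in bioinformatics; the 2-bit base encoding is an assumption chosen for illustration, not a technique taken from the article.

# Illustrative sketch only: emulates, in software, the kind of wide,
# bit-parallel datapath an FPGA implements directly in hardware.
# The 2-bit base encoding is an assumption, not from the article.

ENCODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(seq):
    """Pack a DNA string into one integer, 2 bits per base."""
    word = 0
    for base in seq:
        word = (word << 2) | ENCODE[base]
    return word

def count_matches(a, b):
    """Count positions where two equal-length sequences agree.
    On an FPGA, the XOR plus per-field zero test below is a single
    combinational circuit evaluated every clock cycle; Python only
    emulates that dataflow."""
    assert len(a) == len(b)
    diff = pack(a) ^ pack(b)  # a 2-bit field is 00 iff the bases match
    return sum(1 for i in range(len(a)) if ((diff >> (2 * i)) & 0b11) == 0)

print(count_matches("ACGTACGT", "ACGAACGT"))  # prints 7: one mismatch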

e-Science, caGrid, and Translational Biomedical Research

November 2008 · 77 Reads

Translational research projects target a wide variety of diseases, test many different kinds of biomedical hypotheses, and employ a large assortment of experimental methodologies. Diverse data, complex execution environments, and demanding security and reliability requirements make the implementation of these projects extremely challenging and require novel e-Science technologies.

Figure 1. CASAS smart home components.
Table 1. Summary of costs for CASAS "smart home in a box" components.
Figure 3. Visualization of discovered patterns: P1 (top left), P2 (top right), and P3 (bottom left).
Figure 4. Activity trends for a smart home resident.
Figure 5. Snapshot of CASAS activity visualizer. The visualizer renders sensor events on a computer or mobile device while plotting usage of resources such as electricity.
CASAS: A smart home in a box

July 2013 · 5,108 Reads

While the potential benefits of smart home technology are widely recognized, a lightweight design is needed for the benefits to be realized at a large scale. We introduce the CASAS "smart home in a box", a lightweight smart home design that is easy to install and provides smart home capabilities out of the box with no customization or training. We discuss types of data analysis that have been performed by the CASAS group and can be pursued in the future by using this approach to designing and implementing smart home technologies.
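As a hypothetical illustration of the kind of analysis the abstract alludes to, the sketch below assigns coarse activity labels to timestamped motion-sensor events; the event format, sensor IDs, and sensor-to-activity rule are all invented for illustration and are not the CASAS schema.

# Hypothetical sketch: labeling activities from timestamped
# motion-sensor events. The event format, sensor IDs, and the
# sensor-to-activity rule are invented, not the CASAS schema.
from datetime import datetime

events = [
    ("2013-07-01 07:02:11", "M003"),  # bedroom motion sensor fires
    ("2013-07-01 07:05:42", "M007"),  # kitchen motion sensor fires
    ("2013-07-01 07:06:03", "M007"),
]

KITCHEN_SENSORS = {"M007", "M008"}

def label(events):
    """Tag each event with a coarse activity from sensor location and time."""
    labeled = []
    for ts, sensor in events:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").hour
        in_kitchen = sensor in KITCHEN_SENSORS
        activity = "meal preparation" if in_kitchen and 6 <= hour < 9 else "other"
        labeled.append((ts, sensor, activity))
    return labeled

for row in label(events):
    print(row)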

Performance Evaluation

October 1972 · 22 Reads

The Performance Evaluation Subcommittee of the Technical Committee on Computer Architecture sponsored a workshop at Argonne National Laboratory in October 1971. This issue is in part a result of that workshop.

New Applications & Recent Research

June 1981 · 15 Reads

A team of Caltech researchers has recently demonstrated the world's first integrated optoelectronic circuit: a combination of optical elements (lasers, detectors) and electronic elements (transistors) "grown" monolithically on a single layered crystal.

New Applications & Recent Research

October 1980 · 18 Reads

Scientists at IBM Research have used a computer to transcribe speech, composed of sentences drawn from a 1000-word vocabulary and read at a normal speaking pace, into printed form with what is believed to be the best accuracy yet obtained under complex experimental conditions: 91 percent.

New Applications & Recent Research

April 1980 · 13 Reads

Using voice synthesis, the IBM Audio Typing Unit simulates actual speech, offering audio functions to assist a blind typist. Consisting of an audio keypad and console, the unit can be attached to the IBM Mag Card II, Mag Card/A, Memory, or Memory 100 host typewriter.

Performance analysis and its impact on design

June 1998 · 24 Reads

Methods for designing new computer systems have changed rapidly. Consider general-purpose microprocessors: gone are the days when one or two expert architects would use hunches, experience, and rules of thumb to determine a processor's features. Marketplace competition has long since forced companies to replace this ad hoc process with a targeted and highly systematic process that focuses new designs on specific workloads. Although the process differs from company to company, there are common elements. The main advantage of a systematic process is that it produces a finely tuned design targeted at a particular market. At its core are models of the processor's performance and its workloads. Developing and verifying these models is the domain now called performance analysis. We cover some of the advances in dealing with modern problems in performance analysis. Our focus is on architectural performance, typically measured in cycles per instruction (CPI).
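For readers unfamiliar with the metric, the standard textbook identities relating CPI to execution time (not specific to this article) are:

\[
  \text{CPI} = \frac{\text{clock cycles}}{\text{IC}},
  \qquad
  \text{execution time} = \text{IC} \times \text{CPI} \times t_{\text{clock}}
\]

where IC is the dynamic instruction count and t_clock the clock period. For example, IC = 10^9 instructions, CPI = 1.25, and a 500 MHz clock (t_clock = 2 ns) give 10^9 × 1.25 × 2 ns = 2.5 s.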



New Application

April 1975 · 13 Reads

University of Washington scientists are using high-speed electrostatic printer/plotters for on-the-spot analysis of oceanographic data. The Gould 5000 units are part of the $3 million computer system on the university research ship Thomas G. Thompson.


New Applications

April 1973 · 10 Reads

Over 5,000 blind people in the Boston area have a new friend in a talking computer system that allows them to type letter-perfect correspondence, proofread manuscripts, calculate bookkeeping problems, and write computer programs.

New Applications

October 1975 · 62 Reads

The Australian Bureau of Animal Health has begun developing what it describes as the world's largest national animal disease information system, based on 20 Data General Nova 3 minicomputers. The system will be a key factor in the eradication of brucellosis, a long-persisting bacterial disease in cattle, to protect Australia's international markets for beef and dairy products. The minicomputers are presently being installed at veterinary diagnostic laboratories throughout Australia.

On To Components

February 1999 · 35 Reads

The paper discusses the properties of a software component and the use of object-oriented techniques. It considers the varieties of components in terms of four viewpoints: level of software process task, level of abstraction, level of execution, and level of accessibility.

The Future of Systems Research

September 1999 · 99 Reads

After 20 years in academia and Silicon Valley, the new Provost of Stanford University calls for a shift in focus for systems research. Performance, long the centerpiece, needs to share the spotlight with availability, maintainability, and other qualities. Although performance increases over the past 15 years have been truly amazing, it will be hard to continue these trends by sticking to the basically evolutionary path that the research community is currently on. The author advocates a more revolutionary approach to systems problems and thinks the approach needs to be more integrated: researchers need to think about hardware and software as a continuum, not as separate parts. He sees society on the threshold of a "post-PC" era, in which computing is ubiquitous and everyone will use information services and everyday utilities. When everyone starts using these systems, they will expect them to work and to be easy to use. This era will therefore drive system design toward greater availability, maintainability, and scalability, which will require a real refocusing of current research directions.

Technology in the Real World

June 1975 · 13 Reads

The IEEE Computer Society Computer Elements Technical Committee has been tracking the progress of new technology for many years. The December 1974 workshop in Phoenix once again focused on the impact of semiconductor LSI, but papers and discussions ranged from social problems of the world to that last uncertain electron tipping a flip-flop out of its metastable state. The various factors in the success of emerging technologies were identified, with technical novelty far down the list: no surprise to the older committee members, but often a disappointment to the young innovator.

New Applications & Recent Research

October 1981 · 32 Reads

The Kurzweil Reading Machine converts print to speech and is designed as a reading tool for the blind and visually handicapped. The system handles ordinary printed material (books, letters, reports, memoranda, and so on) in most common styles and sizes of type. The output is a synthetic voice using full-word English speech. The reader operates the device by placing printed material face down on the glass plate forming the top surface of the scanning unit; he then presses the "page" button on the control panel and listens to the synthetic speech produced as an electronic camera scans the page and transmits its image to a minicomputer housed within the device. The computer separates the image into discrete character forms, recognizes the letters, groups the letters into words, computes the pronunciation of each word, and then produces the speech sounds associated with each phoneme. The machine operates at normal speech rates, about 150 words per minute.
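The abstract describes a clean scan-to-speech pipeline; the toy Python sketch below mirrors those stages end to end. Every function name and the tiny lexicon are invented for exposition; the actual machine ran proprietary recognition and synthesis code on its built-in minicomputer.

# Toy mirror of the pipeline described above: segment characters,
# group them into words, compute pronunciations, produce speech.
# All names and the lexicon are invented for exposition.

PRONUNCIATIONS = {"CAT": "K AE T", "SAT": "S AE T"}  # toy lexicon

def segment_characters(page):
    return list(page)                    # toy: one "character form" per char

def group_into_words(chars):
    return "".join(chars).split()        # spaces delimit words

def pronounce(word):
    return PRONUNCIATIONS.get(word, " ".join(word))  # toy fallback: spell out

def synthesize(phonemes):
    print("speaking:", phonemes)         # stand-in for the voice output

def read_page_aloud(page):
    words = group_into_words(segment_characters(page))
    for word in words:
        synthesize(pronounce(word))      # word -> phonemes -> speech

read_page_aloud("CAT SAT")               # speaking: K AE T / speaking: S AE T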



The next 10,000₂ years. II

June 1996 · 10 Reads

For part I, see ibid., vol. 4, pp. 64-70 (1996). As microprocessor advances begin to level off, communication network deployment will keep accelerating, and software engineering must face the prospect of radical change if it is to keep pace. The paper considers how the intersection of these ascending and descending technologies will propel the high-tech world into a new model of computing by the year 2012. Software is the steam that drives the engines of the Information Age, but clearly it is not keeping up with developments on the hardware side. Historical trends suggest that further progress in programmer productivity and programming-language power over the next 10,000₂ years is highly unlikely. With a large percentage of programmers maintaining legacy code, the resources available for innovation are limited. In fact, software innovation will have to come from a five percent fringe of artisans and nontraditional thinkers outside the current programming language and software engineering establishment.


The next 10,000₂ years. I

May 1996 · 15 Reads

Forecasts technological breakdowns and breakthroughs for the next 16 (10,000 to the base 2) years. Change has always been a part of recent history; indeed, Earth-shaking change occurs about every 150-200 years. It takes about 50 years to make the transition from the old to the new, and we are nearing the end of just such a 50-year period. Change is caused by both technological breakthroughs and technological breakdowns. In the current 50-year transition, the breakthrough is in networking and software development, and the breakdown is in processor (VLSI) technology. Both forces will propel the high-tech world into a new model of computing by the year 2012. The new model will be based on a networked, global megacomputer that obeys the Gustafson-Barsis speedup law instead of Amdahl's law of parallelism. The next century's information superhighway will actually be a network of cable TV operators, not telephone companies. A new era of programming that eliminates traditional programming languages (and scolds the software engineering community for failure) will arise and lead to a software economy: an electronic commerce dominated by software artisans.
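For reference, the two speedup laws the abstract contrasts, in their standard textbook formulations (with p the parallel fraction of the work and N the number of processors; these forms are not taken from the article itself):

\[
  S_{\text{Amdahl}} = \frac{1}{(1-p) + p/N},
  \qquad
  S_{\text{Gustafson-Barsis}} = (1-p) + pN
\]

With p = 0.95 and N = 1000, Amdahl's law caps speedup near 1/(1-p) = 20 (the exact value is about 19.6), while Gustafson-Barsis, which lets the problem size grow with N, gives 0.05 + 0.95 × 1000 = 950.05. This is why a global megacomputer looks plausible under one law and hopeless under the other.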

Figure 1. Conceptual diagram showing a Smart Dust mote's major components: a power system, sensors, an optical transceiver, and an integrated circuit.
Figure 3. Autonomous bidirectional communication mote with a MEMS optics chip containing a corner-cube retroreflector on the large die, a CMOS application-specific integrated circuit (ASIC) for control on the 300 × 360 micron die, and a hearing aid battery for power. The total volume is 63 mm³.
Figure 4. Conceptual diagram of steered agile laser transmitter (side view). A laser emits an infrared beam that is collimated with a lens. The lens directs the narrow laser beam onto a beam-steering mirror, aiming the beam toward the intended receiver.
Figure 5. Scanning electron micrograph of the first-generation steered agile laser transmitter. The chip combines a laser diode and ball lens with a micromachined two-degree-of-freedom beam-steering mirror. The optical path runs from the top of the laser diode's front facet, through the ball lens, reflects off the left-hand mirror plate, then finally reflects off the substrate before leaving the chip.
Figure A. Model of a crawling microrobot developed by University of California researchers. This device measures less than one cubic centimeter. Developers folded 2-micron-thick silicon sheets to create insect-like legs with microhinges on the folds and joints. The hollow structure has lightweight, rigid legs. Silicon tendons inside the legs couple each rigid leg segment to electrostatic motors on the robot's body.  
Smart dust: communicating with a cubic-millimeter computer (Computer 34(1):44-51, DOI 10.1109/2.895117)

February 2001 · 6,692 Reads

The Smart Dust project is probing microfabrication technology's limitations to determine whether an autonomous sensing, computing, and communication system can be packed into a cubic-millimeter mote (a small particle or speck) to form the basis of integrated, massively distributed sensor networks. Although we've chosen a somewhat arbitrary size for our sensor systems, exploring microfabrication technology's limitations is our fundamental goal. Because of its discrete size, substantial functionality, connectivity, and anticipated low cost, Smart Dust will facilitate innovative methods of interacting with the environment, providing more information from more places less intrusively.
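As a back-of-envelope illustration of why a cubic millimeter is such a tight budget, the sketch below estimates mote lifetime from battery volume, energy density, and average power draw; the energy density and power figures are assumptions for illustration, not numbers from the article.

# Back-of-envelope lifetime estimate for a millimeter-scale mote.
# The ~1 J/mm^3 energy density (typical of small batteries) and the
# 10-microwatt average draw are illustrative assumptions only.

battery_volume_mm3 = 1.0        # suppose the battery gets the whole 1 mm^3
energy_density_j_per_mm3 = 1.0  # assumed stored energy per mm^3
avg_power_w = 10e-6             # assumed duty-cycled average power draw

energy_j = battery_volume_mm3 * energy_density_j_per_mm3
lifetime_s = energy_j / avg_power_w
print(f"lifetime: {lifetime_s:.0f} s (about {lifetime_s / 86400:.1f} days)")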

Top-cited authors