IT Professional

Published by Institute of Electrical and Electronics Engineers
Online ISSN: 1520-9202
Publications
Article
The National Library of Medicine's MedlinePlus Connect service extends the reach of the consumer health website MedlinePlus.gov to deliver relevant information to patients and providers via health IT systems, electronic health records, and patient portals.
 
Article
MedlinePlus Mobile provides a core set of the authoritative health information found on MedlinePlus.gov, optimized for mobile devices. The authors detail the technical approach taken to create this website, from the system architecture through development and testing.
 
Article
Knowledge management applications must convey two types of information. The first type, support information, includes descriptive explanations that serve as a basis for understanding (who, what, when, where, with what, and why). The second type, guidance, strengthens the ability to act through explanations of how to accomplish a task. The relationship between IT and knowledge management is strong and much needed. Only a complete rethinking of the elements of software design to incorporate both support information and guidance will yield increased functionality and improved user performance. In choosing or developing various products, ask yourself: Will this information technology help users gain understanding? Will it improve their ability to act? Information technology that is designed to enhance both understanding and action can add great value to business applications. Taking this next step in IT design will enable dramatic improvements in workplace performance.
 
Article
The author looks at the security measures companies have taken in the wake of the 11 September tragedy. She found that little has changed in the past 10 months. If anything, "business as usual" seems to be the rule; for example, the majority of network administrators still have no comprehensive network security plan. This and other security fundamentals continue to be sorely lacking. Security is not a product but a process; to make digital systems secure, one has to start building processes.
 
Article
The events leading to 11 September 2001 seem to indicate an interaction of oversights that, in concert, compromised security. Here, we use a semiotic model to explain some of what went wrong prior to 9/11. Semiotics is the study of signs and symbols as a way to better understand their meaning and contextual relations. We begin with so-called information. For information to be useful, it must be necessary; and to be necessary, it must be universal, in the same way that a mathematical expression is understandable by mathematicians worldwide, regardless of their native language. Current systems literature provides little that addresses what is necessary and why there is a requirement for universality in information representation and processing. Our semiotic model, adapted from existing semiotic models, provides such a universal model, defining it in five levels.
 
Article
Software engineering has developed over the last few decades into a discipline of many diverse areas of interest. Examples include testing, programming, design, architecture, maintenance, metrics, and evolution. Specialty conferences and publications proliferate, stay for a short time, and then disappear, while software engineering remains a nontraditional engineering discipline - part craft, part art, and part logic. This article describes 13 challenges faced by the software engineering research and practitioner communities and hints at what to do about them. The challenges are: (1) software quality, (2) return on investment, (3) process improvement, (4) metrics and measurement, (5) standards confusion, (6) standards interoperability, (7) legacy software, (8) testing stoppage criteria, (9) interoperability and composability, (10) operational profiles, (11) designing in, (12) product certification, and (13) services.
 
Article
Railway vehicles are complex systems that are an essential component of any national transportation infrastructure. But railway vehicles are also mission-critical systems that must support the comfort and safety of passengers. It is therefore essential that these systems be the epitome of fault-tolerant control technology. The Rail Transit Vehicle Interface (RTVI) standards committee of the IEEE's Vehicular Technology Society developed IEEE 1473 (Standard for Communications Protocol Aboard Trains) for use on passenger rail vehicles in North America. As it turns out, the standard has become international and appears in a wide range of applications on freight and passenger trains. The adoption of the open 1473 protocol and the widespread use of wireless communications standards such as IEEE 802.11 provide additional opportunities for further integrating train line applications with emerging Web services, promising exciting scenarios for travelers and enhanced operational success for systems maintainers. Maintenance engineers at wayside stations could also integrate information from moving trains with diagnostic data in stationary repositories of service data for enhanced system safety and longevity.
 
Article
Web 2.0, the second phase in the Web's evolution, is attracting the attention of IT professionals, businesses, and Web users. Web 2.0 is also called the wisdom Web, people-centric Web, participative Web, and read/write Web. Web 2.0 harnesses the Web in a more interactive and collaborative manner, emphasizing peers' social interaction and collective intelligence, and presents new opportunities for leveraging the Web and engaging its users more effectively. Within the last two to three years, Web 2.0, ignited by successful Web 2.0-based social applications such as MySpace, Flickr, and YouTube, has been forging new applications that were previously unimaginable.
 
Article
How SOA, Web services, and software engineering are keys to building Web-based solutions.
 
Article
Recently, the relationship between Web 2.0 and service-oriented architectures (SOAs) has received an enormous amount of coverage because of the notion of complexity-hiding and reuse, along with the concept of loosely coupling services. Some argue that Web 2.0 and SOAs have significantly different elements and thus cannot be regarded as parallel philosophies. Others, however, consider the two concepts complementary and regard Web 2.0 as the global SOA. This article investigates these two philosophies and their respective applications from both a technological and a business perspective.
 
Article
Open mobile devices use wireless broadband access and human-computer interface (HCI) technology for Web 2.0 applications. Device manufacturers are working to develop devices with input/output user interfaces (UIs), microphones, touch panels, screen displays, stereo earphones, and speakers. Mobile devices with Web 2.0 applications can help mobile users shop online or watch movies without paper tickets. A mobile computing software architecture based on the Google Calendar service can provide customized services and applications in mobile commerce. The new iPhone and Android mobile devices use HCI technology for remote plug-and-play operations with Web 2.0 applications. HCI and advanced communication technology can improve devices' display quality and bandwidth.
 
Article
Web 2.0 has been described as a set of services that let Web users easily share their opinions and resources. Many of these services appear to be naturally evolved services, such as bulletin board systems (BBSs) and photo sharing. Moreover, some of them lack credibility, raise privacy concerns, and may be considered counterproductive in particular circumstances. The key developments evidenced by Web 2.0 are the great increase of content on the Web and the ability to easily search through it and combine parts of it to form new content. Educators, employers, and other Web 2.0 users need to view its services critically for several reasons; viewed this way, the services may appear evolutionary rather than revolutionary. However, many of them will find their place in education, commerce, and entertainment, and will continue to evolve in how they provide useful content and add value to the Web experience.
 
Article
Service-oriented architecture (SOA) and Web 2.0 have revitalized the Internet in terms of profitability and usability. SOA is a model for organizing and utilizing distributed capabilities that may be under the control of different ownership domains. The basic concept behind SOA is to provide a set of services that each application can access for common functionality. An SOA also increases standardization across the enterprise; because applications share base functionality, a higher degree of consistency exists between them. The most common path for a company moving toward SOA is through Web services, which can adapt almost any code to become a new Web service, enabling technology that moves businesses forward and speeds process evolution.
 
Article
Web 2.0 promises improved and seamless user interaction and management of the Web environment and its resources, guaranteeing required services and flexible generation of user applications. This occurs via an agile composition of services from available application (foundation) tools. Hence, Web 2.0 developers merely create the core of these applications, while the boundaries dynamically change and expand in accordance with user interaction (a use model). In other words, the resources, tools, and services act as plug-in entities that users can add or remove from the application's operational framework according to their needs. In this article we propose a service-oriented architecture, or SOA, model for managing the operation of Web 2.0 platform actors, improving the services provided to consumers.
 
Article
Recent hardware and software developments have paved the way for exploiting Web 2.0 in healthcare. This survey reviews these developments and the related security risks and considers how Enterprise 2.0 can help.
 
Article
CIOs and IT professionals should know the status of carbon management systems as well as the evolutionary trend in this software to better position themselves to exploit future opportunities in a low-carbon economy.
 
Article
Welcome to a new year of IT Professional. Our editorial board enters this year optimistic that the technical recession of the last two years finally appears to be easing.
 
Article
This year, IT Professional has chosen the following pattern of themes and topics for its issues. The first issue focuses on one of the hottest real-world topics of the day: IT operations during crisis situations. These articles provide you with architectural insights as well as experience-based knowledge of how IT operations are changing to accommodate the threat of emergencies and wartime situations. An excellent overview of the architectural issues and interoperability concerns associated with public-safety wireless communications leads off this issue. In addition, a topical article related to the crisis communications theme describes the benefits of commercially available satellite communications to augment military needs in Iraq. Other articles in this first issue will continue the pattern of mixing themes and topics as appropriate. We hope many readers will benefit from these choices.
 
Article
With the increasing demand for networked and XML services, IT professionals need conferences like EDOC to discuss emerging technologies and next-generation enterprise computing.
 
Article
The odds are good that the appointments you've made with US or Canadian associates through Microsoft's Exchange or IBM's Lotus Notes or Domino are going to be off by an hour in the last three weeks of March 2007. Your legacy Java runtime environments also will likely produce incorrect time-sensitive results. Why? Buried among the hundreds of provisions in the 1,700 pages of the United States' Energy Policy Act of 2005 is a modest change to the rules that establish when daylight saving time starts and ends. The changes, which the government ostensibly implemented as part of a federal energy conservation effort, require that beginning in 2007, daylight saving time (DST) start three weeks earlier and end one week later than in previous years. (For more information on the policy's history, see the sidebar "US Laws governing daylight saving time.") These new start and end dates can greatly affect IT systems. Users will expect the effects to be transparent, so the burden of managing the situation will fall to IT professionals. Unfortunately, the fixes are not automatic. In this article, we'll describe the possible risks to computer networks and discuss some remediation strategies to keep your systems running on time.
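The rule change the abstract describes is easy to demonstrate with current timezone data. A minimal sketch in Python (assuming a Python 3.9+ runtime with the standard `zoneinfo` module and up-to-date tz data, not part of the original article):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Under the 2005 Energy Policy Act, US DST starts the second Sunday of
# March from 2007 onward, instead of the first Sunday of April. A system
# with stale timezone rules reports mid-March 2007 local times an hour
# off; with correct data, the two years differ as shown below.
eastern = ZoneInfo("America/New_York")

march_2006 = datetime(2006, 3, 15, 12, 0, tzinfo=eastern)
march_2007 = datetime(2007, 3, 15, 12, 0, tzinfo=eastern)

print(march_2006.dst())  # old rules: mid-March 2006 is still standard time
print(march_2007.dst())  # new rules: mid-March 2007 is already on DST
```

Systems that hard-coded the pre-2007 rules (as many legacy Java runtimes did) effectively compute the 2006 result for 2007 dates, which is exactly the one-hour appointment skew the article discusses.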
 
Article
A set of IT predictions and predilections is formulated in "Wishful Thinking for 2008," relying on an unpublished number of unknown sources. The 2008 IT predictions find an increase in demands and a reduction in productivity related to cybersecurity issues. High-profile cases involving identity theft, accidental release of customer or patient records, and failure-to-comply litigation and prosecutions are likely to erupt in 2008. Customer-facing social networks can help marketing gather information, improve communications, and lead to improved customer understanding. A conglomerated "Office of the CIO" is expected to replace a single CIO in many companies in 2008. The predictions also suggest that 2008 will see green IT initiatives in companies involved in social engagement.
 
Article
Phillip A. Laplante, professor at Penn State University, has made some predictions for information technology (IT) professionals for 2009. He predicted that future cyberattacks may be coordinated with one or more physical strikes on infrastructure or political establishments. Physical attacks may involve damage to undersea telecom cables to disconnect services in the Middle East and Southwest Asia. He also noted security lapses found in the US Transportation Security Administration's traveler dispute-resolution website. He further predicted that in the near future, senior-level executives will put more pressure on cost savings and improved productivity from virtualization technologies. Spending on business intelligence initiatives will decrease, while new open-source-based business intelligence initiatives will gain importance in the coming years.
 
Article
Digital data has experienced explosive growth over the past few years, and evidence suggests that this trend will continue. As the world becomes more digitized, data storage becomes increasingly relevant to the individual, the IT community, and the economy. Data storage will undoubtedly continue to be a big issue in the near future; over the next five years, expect continued demand for capacity, proliferation of form factors (sizes and shapes of storage devices), and governmental regulations. So what does 2010 look like? Expect continued emphasis on management technologies for heterogeneous environments, less-expensive storage media, and tools for content management. In short, expect cheaper, better, and faster technology for creating, managing, and storing digital data.
 
Article
IT education is about as agile as a barge when it comes to reacting to changes in the IT industry. It takes a while, but eventually the educational institutions do align their programs with advances in the technical and business requirements of IT hiring needs and corporate research agendas. Because of institutional inertia, however, program changes are usually years behind changes in industry fundamentals. IT education in 2010 changed in the following ways: 1) specialized educational programs fully reflect changes in the global IT community; 2) universities incorporate product commoditization into their curriculums; 3) remote learning technologies become sophisticated and highly used; and 4) IT education increases its focus on business and communication skills.
 
Article
The changes expected to take place in the field of information technology by 2010 are evaluated. As regards peer-set nodes and paths, by 2010 each member of a peer set will have its own internal model of what its environment is supposed to look like. 2010 will also see the evolution of an unusually devious form of spyware, the HIV-like attackers (shivas), named in reference to their AIDS-like ability to take over a computer's immune system. Another development likely to take place by 2010 is a partial collapse of the international patent system, centering on claims of "patent feudalism" in the software and biotechnology industries.
 
Article
In 2010, the line between work and personal time becomes even more blurred. Although productivity tools such as Blackberries, cell phones, and wireless networking allow everyone to stay connected anywhere, they make everyone more available than ever. Companies now expect to reach anyone 24/7, regardless of time or location. In 2010, work-life balance is a serious issue for employees struggling to juggle their lives and careers; it is also a critical issue for employers fighting to attract and retain IT personnel. The technology that keeps everyone connected 24/7 is racing ahead of our understanding of its implications. If companies are not thoughtful about helping employees balance their personal lives, they might find more employees balancing their lives by leaving the profession.
 
Article
As is traditional this time of year, Phillip A. Laplante prognosticates on the state of IT for next year and reflects on his last set of predictions.
 
Article
The progress of IT in the US is the direct result of federal investments. Advances in information technology will provide the basis for much of the world's economic growth as we head into the next century (2000). During the past five years (1994-1999), production in computers, semiconductors, and communications equipment quadrupled at a time when total industrial production grew by 28 percent. In the coming decades, the opportunities for innovation in IT are larger than they ever have been - and more important. If the past is any guide, the most important advances will come from unexpected directions, facilitated by advanced technical capabilities that result from fundamental research. Thus, federal investments in basic, high-risk research for IT will be even more essential to ensuring the intellectual and technical underpinnings for these as yet unforeseeable developments.
 
Article
Can IT and the Internet promote and strengthen global civil society and democracy? How does the Internet create conditions for improving the delivery of services to the common public? Reviewing ICT initiatives in India leads to a set of recommendations for developing and implementing ICT-based initiatives of interest to common citizens and the government, some of which might be useful for Tunisia, Egypt, and other new democracies working to build a more inclusive e-society and an effective system of democratic governance.
 
Article
Optical fiber promises higher data rates, but first it must get around a few hurdles presented by physics. Optical networking's capacity is increasing in response to Web surfing, images attached to e-mail, and the development of remote-collaboration applications. All these applications consume significant amounts of bandwidth. Optical networking uses two basic methods to increase an optical fiber's transmission capacity. First, you can increase data transmission rates. Second, you can couple multiple wavelengths onto a single fiber. An obvious outgrowth of these two basic methods is to combine them by transmitting more quickly on individual wavelengths and packing more wavelengths onto a common optical fiber. In November 2000, Global Crossing teamed up with Lucent Technologies to upgrade a 70km fiber network in Belgium to a 40Gbps operating rate for an approximately one-week OC-768 field trial. Although the field trial produced promising results, an actual upgrade of the existing optical infrastructure might not occur for several years. This pessimism comes from several factors, most of which relate to physics. The most significant problem arises from a simple fact of physics: as the data rate increases, pulses become narrower. Although the field trial indicated OC-768 can operate over modern optical fiber, the jury is still out on the effect of multiple wavelengths, each operating at 40 Gbps.
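The two capacity levers the abstract names, and the narrowing-pulse problem, reduce to simple arithmetic. A back-of-the-envelope sketch (assuming one bit per symbol and an illustrative 80-channel DWDM count, neither taken from the article):

```python
def bit_slot_picoseconds(rate_bps: float) -> float:
    """Duration of one bit slot in picoseconds at a given line rate."""
    return 1.0 / rate_bps * 1e12

# Quadrupling the line rate from OC-192 to OC-768 shrinks each bit slot
# (and hence each pulse) by the same factor, worsening dispersion effects.
oc192_slot = bit_slot_picoseconds(10e9)   # ~100 ps per bit at 10 Gbps
oc768_slot = bit_slot_picoseconds(40e9)   # ~25 ps per bit at 40 Gbps

# The second lever multiplies capacity by the wavelength count instead:
wavelengths = 80                          # hypothetical DWDM channel count
aggregate_tbps = wavelengths * 40e9 / 1e12

print(oc192_slot, oc768_slot, aggregate_tbps)
```

Combining both levers, as the abstract suggests, gives terabit-class aggregate capacity per fiber, but every added channel runs at the same narrow 25 ps pulse width that makes 40 Gbps transmission physically demanding.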
 
Article
Although the IEEE 802.11 standard has been around since 1997, work continues to make it more adaptable to the demand for higher data rates and true wireless flexibility. Until recently, few organizations used wireless LANs because they cost too much, their data rates were too low, they posed occupational safety problems because of concerns about the health effects of electromagnetic radiation, and the spectrum used required a license. Today, these problems have largely diminished, and wireless LAN popularity is skyrocketing. Wireless LANs must meet requirements typical of any LAN. They must also meet requirements specific to their intended environment. IEEE 802.11 defines several services that the wireless LAN must provide if its usefulness is to match the functionality inherent in wired LANs. IEEE 802.11 is poised to have a significant impact on the LAN marketplace. As the demand for mobility and freedom from wiring requirements increases, the standard offers a comprehensive yet flexible approach to wireless LAN products.
 
Article
It is important to be aware of the security issues a wireless LAN can pose, and take steps to use its built-in security features. IEEE 802.11b-related security issues range from security management to inherent weaknesses of the underlying technologies. In addition, WLANs are subject to the same attacks that target traditional LANs.
 
Finite-state machine for a DFS algorithm.  
Snapshot of MiSer's rate-power combination table.  
Article
The original purpose of the IEEE 802.11h standard was to extend WLAN operation in Europe to the 5 GHz band, where WLAN devices need dynamic frequency selection (DFS) and transmit power control (TPC) to coexist with the band's primary users, i.e., radar and satellite systems. Although 802.11h defines the mechanisms or protocols for DFS and TPC, it does not cover the implementations themselves. With some amendments to the 802.11h standard to address this need, the mechanisms for DFS and TPC can be used for many other applications, with the benefits of automatic frequency planning, power consumption reduction, range control, interference reduction, and quality-of-service enhancement. MiSer is presented as a simple application example for saving significant communication energy in 802.11h devices.
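The figure caption above mentions MiSer's rate-power combination table. A minimal sketch of that style of lookup, with entirely invented numbers (the table values, units, and rate floor below are illustrative assumptions, not taken from the standard or from MiSer itself):

```python
# Hypothetical MiSer-style table: for each (rate, transmit-power) pair,
# a precomputed estimate of energy per successfully delivered bit.
# All values here are made up for illustration.
table = {
    # (rate_mbps, power_mw): nanojoules per delivered bit
    (6, 50): 9.0,
    (6, 100): 8.5,
    (12, 50): 7.2,
    (12, 100): 6.8,
    (24, 100): 5.9,
}

def best_combination(min_rate_mbps: float):
    """Pick the (rate, power) pair that meets the rate floor with the
    lowest estimated energy per delivered bit."""
    feasible = {k: v for k, v in table.items() if k[0] >= min_rate_mbps}
    return min(feasible, key=feasible.get)

print(best_combination(10))
```

The point of the offline table is that the runtime decision collapses to a constrained minimum over precomputed entries, which is cheap enough for per-frame transmit decisions.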
 
Article
From its inception in 1980, the IEEE 802 committee has led the way in developing LAN (local area network) standards. The committee's work on wireless LANs (WLANs) began in 1987 within the IEEE 802.4 working group. Developing an ISM (industrial, scientific, and medical)-based WLAN using the equivalent of a token-passing bus media access control (MAC) protocol was the committee's initial goal. After some work, the committee decided that a token-bus-controlled radio medium would cause inefficient use of the radio frequency spectrum. In 1990, the committee formed a new working group, IEEE 802.11, specifically devoted to WLANs, with a charter to develop a MAC protocol and physical-medium specification. Since then, demand for WLANs, at different frequencies and data rates, has exploded. Keeping pace with this demand, the IEEE 802.11 working group has issued an ever-expanding list of standards.
 
Article
Sorel Reisman discusses the world view of academia and how "members of the academy" need a dose of reality before they bash the for-profit world.
 
Article
This paper argues that there is no such thing as academic freedom. People who speak out on issues, whether within their institutions or in public, risk retaliation by others more powerful than they are. And there are many forms of retaliation. Even tenure can't guarantee protection from powerful colleagues. Tenure and academic freedom were originally conceived as a way to give scholars the freedom to explore and express, without fear of retaliation, unconventional ideas that might challenge existing or preconceived social or scientific philosophies or practices.
 
Article
The Computer Society has a long tradition of serving the academic side of our membership, but many of the recent efforts have targeted the "practitioner" side. Of course, in reality, there's no hard division between the two, and IT Pro and the CS seek to serve those at all points along the continuum.
 
Article
By defining a protocol that supplies Web clients and servers with cryptographic parameters, the Secure Sockets Layer protocol enables the safe exchange of sensitive data, a crucial aspect of any e-business. The protocol's sticking point is that encrypting and decrypting data requires a tremendous amount of CPU processing power. The burden is especially apparent on the server side, because multiple Web clients often connect to a single Web server. For e-commerce transactions, it's important to implement SSL in a way that doesn't overburden your Web server's CPU and slow down the entire operation. Although the original Web servers that supported SSL did so exclusively in software, SSL adapter cards soon became available to help off-load the server's CPU load and increase performance. Today, content switches with SSL accelerators can encrypt and decrypt data at the network edge, eliminating the need for a Web server's CPU to perform any SSL-related calculations. The article focuses on the relative merits of these newer implementations. A look at the original software-only approach and its drawbacks clarifies the reasons that hardware acceleration for SSL became necessary.
 
Article
Wireless technologies continue to advance to provide faster, more reliable, and more secure service. The articles in this issue provide case studies and practical information to help the IT professional understand existing wireless broadband options as well as their future direction.
 
Article
Today, digitally stored information isn't only ubiquitous, it's also increasing in volume at an exponential rate. And not only is the volume increasing, but so is the variety, as well as the ways of combining information from different sources to derive insights. Not surprisingly, our most pressing technological and business problem is finding what we need in this sea of information. The dominant paradigm for addressing this problem is information retrieval (Modern Information Retrieval, Ricardo Baeza-Yates and Berthier Ribeiro-Neto, ACM Press, 1999). In this paradigm, the user enters a query (typically a few words typed into a search box), and the system retrieves documents matching the query, ranking the matches based on an estimate of their relevancy to the query. If the system finds many matches, the user sees only the highest-ranked matches. The popularity of Web search systems such as Google shows that the information retrieval paradigm can be effective. An information access framework empowers users by explicitly focusing on the interaction between users and the system. The key problem for information access systems isn't guessing which matching document is most relevant, but establishing a dialogue in which users progressively communicate their information goals while the system provides immediate, incremental feedback that guides users in the pursuit of those goals.
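The query-then-rank paradigm the abstract describes can be sketched in a few lines. A toy TF-IDF ranker (corpus, tokenization, and weighting below are illustrative simplifications, not the scoring any particular search engine uses):

```python
import math
from collections import Counter

# Tiny toy corpus standing in for a document collection.
docs = {
    "d1": "information retrieval ranks matching documents",
    "d2": "users type a query into a search box",
    "d3": "the system ranks documents by estimated relevance to the query",
}

def ranked(query: str):
    """Score each document against the query with TF-IDF and return
    document ids, highest-ranked first."""
    tokenized = {d: text.split() for d, text in docs.items()}
    n = len(docs)
    scores = {}
    for d, words in tokenized.items():
        tf = Counter(words)
        score = 0.0
        for term in query.split():
            df = sum(1 for w in tokenized.values() if term in w)
            if df:  # rarer terms get a higher inverse-document-frequency weight
                score += tf[term] * math.log(1 + n / df)
        scores[d] = score
    return sorted(scores, key=scores.get, reverse=True)

print(ranked("query documents"))
```

The information access framework the abstract advocates differs in exactly the step this sketch stops at: instead of returning a one-shot ranked list, the system would use these scores as the starting point of an interactive dialogue with the user.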
 
Article
Networks are fast becoming the backbone of many businesses, but the technology behind them is probably the most complex and least understood IT subject. It's not enough to have a particular network in place - you must also keep data flowing from site to site and to and from the Internet at adequate speed. This is where the choice of access technologies - the various ways your enterprise can connect to larger networks - is crucial. Your choices have never been more diverse. After years of talk about how to upgrade the "last mile" (the connection from your location to a hub or central office), vendors now have products to offer. Some are relatively immature, having only recently become commercially available. Regardless, they can make an immediate difference in throughput. To benefit, you'll need to navigate some complex vendor options and keep an eye on future developments in technology and standards. To help guide your decision making, the authors present a brief overview of new network access technologies. Be aware that deployment depends on location and market: some of these technologies may be available in your area, others may not. The following are covered: phone line access, digital subscriber lines, cable modems, and direct broadcast satellites.
 
Article
Technology advances and the continuing convergence of computing and telecommunications have made an unprecedented amount of information available to the public. For many people with disabilities, however, accessibility issues limit the impact of such widespread availability. Of the many types of disabilities-mobility, hearing, and learning impairments, for example-vision impairments are most pervasive in the general population, especially among seniors. The world's rapidly aging population is redefining visually impaired, which refers to individuals with low vision (that is, people for whom ordinary eyeglasses, contact lenses, or intraocular lens implants don't provide clear vision), color blindness, and blindness. In 1998, the US Congress amended the Rehabilitation Act, strengthening provisions covering access to government-posted information for people with disabilities. As amended, Section 508 requires federal agencies to ensure that all assets and technologies are accessible and usable by employees and the public, regardless of physical, sensory, or cognitive disabilities. Most current assistive technologies for visually impaired users are expensive, difficult to use, and platform dependent. A new approach by the US National Library of Medicine (NLM), National Institutes of Health (NIH), addresses these weaknesses by locating the assistive capability at the server, thus freeing visually impaired individuals from the software expense, technical complexity, and substantial learning curve of other assistive technologies. NLM's Senior Health Web site (http://nihseniorhealth.gov), a talking Web (a Web application that presents Web content as speech to users), demonstrates the approach's effectiveness.
 
Article
In deciding how to handle messaging access in a Microsoft Exchange environment, IT managers have an important and interesting tradeoff to make. On the one hand, employees have legitimate communication and e-mail requirements. On the other hand, providing access to messaging from outside the corporate intranet raises considerable security concerns. Obviously, no rule applies universally, and every company or organization must draw its own conclusions and make its own recommendations. This article's purpose is to provide the information you need to make the appropriate tradeoff. Outlook Web Access is not a public messaging service. Instead, it is a Microsoft software product that lets Exchange server users access their accounts with a standard Web browser.
 
Article
The Americans with Disabilities Act requires organizations to provide physical access to people with disabilities. With the advent of the Web, federal legislation may extend into cyberspace. (The "Federal Support" sidebar highlights some relevant legislation.) How courts will apply ADA is still uncertain. Federal cases in which ADA has been applied to cyberspace are still under review at various court levels. On the state level, however, the New York State attorney general recently required Priceline and Ramada Inn to make their Web sites accessible and subject to independent accessibility audits. Other research has examined the accessibility of college and university, corporate, and retail Web sites relative to the ADA. The authors conducted a study on the accessibility of the Web sites of US government agencies and branches of the US government and their contractors, which, like university sites, are mandated to be accessible. They also attempt to determine the types of accessibility barriers that occur most frequently on these sites. Because the study is preliminary, findings are basic estimates of accessibility; you should not construe them as statistically representative of all sites offered. However, our findings on accessibility barriers can serve as a guideline to Web designers seeking to create sites that effectively communicate their contents to people with disabilities.
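One concrete form such an accessibility audit takes is an automated scan for common barriers. A minimal sketch using Python's standard `html.parser` (the check shown, missing `alt` text on images, is one well-known barrier; the sample markup is invented for illustration):

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Count <img> tags and flag those missing the alt attribute
    that screen readers rely on."""
    def __init__(self):
        super().__init__()
        self.images = 0
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.images += 1
            if "alt" not in dict(attrs):
                self.missing_alt += 1

# Hypothetical page fragment: one accessible image, one barrier.
page = '<p><img src="seal.gif" alt="Agency seal"><img src="chart.gif"></p>'
auditor = AltTextAuditor()
auditor.feed(page)
print(auditor.images, auditor.missing_alt)
```

Real audit tools check many more criteria (form labels, frame titles, color contrast), but the pattern is the same: parse the markup and count rule violations per page.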
 
 
Figures: A GenBank data record (Homo sapiens, accession NM_001619, taxon 9606, with lineage, sequence, and feature annotations). Recognizing translation initiation sites (sequence HSU27655, accession U27655, Homo sapiens). Mining literature for protein interactions. Pathway extracted by Pies.
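The GenBank record shown in the figure can be approximated as a nested data structure. The sketch below mirrors the figure's field names; the values are abridged and the traversal at the end is purely illustrative.

```python
# Illustrative only: a nested structure mirroring the GenBank record
# fields shown in the figure (field names from the figure; values abridged).
record = {
    "uid": 6138971,
    "title": "Homo sapiens adrenergic ...",
    "accession": "NM_001619",
    "organism": "Homo sapiens",
    "taxon": 9606,
    "lineage": ["Eukaryota", "Metazoa"],
    "seq": "CTCGGCCTCGGGCGCGGC",
    "features": [
        {
            "name": "source",
            "continuous": True,
            "position": [
                {"accn": "NM_001619", "start": 0, "end": 3602, "negative": False}
            ],
            "anno": [{"anno_name": "organism", "descr": "Homo sapiens"}],
        }
    ],
}

# A simple traversal: collect every annotation description across features.
descrs = [a["descr"] for f in record["features"] for a in f["anno"]]
print(descrs)
```

Data management of records like this one, at genome scale, is one of the two bioinformatics themes the article below discusses.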
Article
A new IT discipline - bioinformatics - fuses computing, mathematics, and biology to meet the many computational challenges of modern molecular biology and medical research. Its two major themes - data management and knowledge discovery - rely on effectively adapting techniques developed in IT to biological data, with IT scientists playing an essential role. The future of molecular biology and biomedicine will depend greatly on advances in informatics. Reviewing researchers' many achievements in bioinformatics, the authors conclude that the marriage between molecular biology and information technology has advanced both fields, and that, although many computational challenges lie ahead, more fruitful outcomes of this multidisciplinary partnership are likely.
 
Article
As the threat of malicious activity through the Internet increased, the US government began reviewing protection requirements for national and international security systems. Legislation resulting from this review included the Federal Information Security Management Act (FISMA), the Paperwork Reduction Act of 1995, and the Information Technology Management Reform Act of 1996 (also known as the Clinger-Cohen Act). This legislation sought to foster trust between the government and the public through significant security improvements in these systems. To this end, the government established a peer-review process we've come to know as certification and accreditation, or C&A.
 
Article
New accreditation criteria cover university programs in information systems. The authors consider how IT professionals can contribute. They discuss accreditation efforts to date and current information systems accreditation.
 
Article
Data conversion looks easy to a lot of people. It's not, particularly for a large project. There are just too many ways things can go wrong: you can lose data or corrupt it, and you can stall the conversion if you don't have enough computing resources. Project size really makes a difference. Some type of mapping tool can improve your chances of success, because the data in the specification is the same data that drives the program that loads the target tables. Automation eliminates the mistakes that cause inconsistency. The tools and processes the authors developed on a large project worked well on a smaller project, but the reverse certainly would not have been true. A systematic process is required, with some way to validate each step.
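The mapping-tool idea - one specification that both documents the conversion and drives the load - can be sketched as follows. The field names, target columns, and transforms are hypothetical, chosen only to show the pattern:

```python
# Hypothetical sketch of a specification-driven conversion: the mapping
# below is both the human-readable spec and the data that drives the
# loader, so the documentation cannot drift from the program's behavior.
MAPPING = [
    # (source field, target column, transform)
    ("cust_no", "customer_id", int),
    ("cust_nm", "name",        str.strip),
    ("bal_amt", "balance",     float),
]

def convert_row(source_row):
    """Build one target-table row from a source record via the mapping."""
    return {target: fn(source_row[src]) for src, target, fn in MAPPING}

row = convert_row({"cust_no": "1042", "cust_nm": " Ada Lovelace ", "bal_amt": "99.50"})
print(row)
```

Validation of each step then reduces to checking the mapping itself (row counts in, rows out, rejected records), rather than auditing hand-written load code against a separate document.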
 
Article
A critical task in an enterprise IT architecture project is to identify and understand key business requirements to ensure that the planned IT systems will fully support and evolve with the business. This article illustrates a more effective approach: leveraging the combined power of value chain, which captures the static business view, and use cases, which animate the business model. In this way, enterprise architects can rapidly define the business's main elements and understand how key systems interact to support business activities.
 
Top-cited authors
San Murugesan
  • BRITE Professional Services
Nir Kshetri
  • University of North Carolina at Greensboro
Phillip A. Laplante
  • Pennsylvania State University
Robert L. Grossman
  • University of Chicago
Keith William Miller
  • University of Missouri - St. Louis