Social Science Computer Review

Published by SAGE Publications
ISSN: 0894-4393
Comparison of Sample Hospitals to AHA Hospitals With 50+ Beds
Frequencies of EMR Component Capabilities (in percentages)
Components of the EMR Sophistication Classes (columns: EMR component; ancillary based; ancillary + data aggregation; ancillary to bedside)
Mean Posterior Probabilities by Class 
Hospital Characteristics and Odds Ratios for EMR Sophistication 
Article
Many believe that electronic medical record systems hold promise for improving the quality of health care services. The body of research on this topic is still in the early stages, however, in part because of the challenge of measuring the capabilities of electronic medical record systems. The purpose of this study was to identify classes of Electronic Medical Record (EMR) system sophistication in hospitals as well as hospital characteristics associated with the sophistication categories. The data used were from the American Hospital Association (AHA) and the Health Information Management and Systems Society (HIMSS). The sample included acute care hospitals in the United States with 50 beds or more. We used latent class analysis to identify the sophistication classes and logistic regression to identify relationships between these classes and hospital characteristics. Our study identifies cumulative categories of EMR sophistication: ancillary-based, ancillary/data aggregation, and ancillary-to-bedside. Rural hospital EMRs are likely to be ancillary-based, while hospitals in a network are likely to have either ancillary-based or ancillary-to-bedside EMRs. Future research should explore the effect of network membership on EMR system development.
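
A minimal sketch of the latent class approach described above, assuming binary EMR capability indicators: a Bernoulli mixture model fitted with EM, which yields class sizes, item-response probabilities, and posterior class probabilities of the kind summarized in the tables listed above. The data, the number of indicators, and the three-class solution are illustrative assumptions, not the study's data.

```python
# Hedged sketch: latent class analysis of binary EMR component indicators,
# implemented as a Bernoulli mixture fitted with EM. Toy data only.
import numpy as np

def fit_lca(X, n_classes=3, n_iter=200, seed=0):
    """X: (n_hospitals, n_items) 0/1 matrix of EMR capability indicators."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)               # class sizes
    theta = rng.uniform(0.25, 0.75, size=(n_classes, d))   # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior probability of each class for each hospital
        log_lik = (X[:, None, :] * np.log(theta) +
                   (1 - X[:, None, :]) * np.log(1 - theta)).sum(axis=2)
        log_post = np.log(pi) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update class sizes and item-response probabilities
        pi = np.clip(post.mean(axis=0), 1e-12, None)
        theta = (post.T @ X + 1e-6) / (post.sum(axis=0)[:, None] + 2e-6)
    return pi, theta, post

# toy data: 500 hospitals, 6 binary EMR capability indicators
X = (np.random.default_rng(1).random((500, 6)) < 0.5).astype(int)
pi, theta, post = fit_lca(X)
print("class sizes:", pi.round(3))
print("mean posterior probability of assigned class:",
      post.max(axis=1).mean().round(3))
```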
 
Article
This article reviews a multimedia application in the area of survey measurement research: adding audio capabilities to a computer-assisted interviewing system. Hardware and software issues are discussed, and potential hardware devices that operate from DOS platforms are reviewed. Three types of hardware devices are considered: PCMCIA devices, parallel port attachments, and laptops with built-in sound.
 
Article
Special treatments are needed for the rehabilitation of brain-injured patients, as for other medical illnesses. The ethical criteria for outcome are patient welfare and positive results in recovery, assessed from multidisciplinary perspectives. Pervasive ethical issues include inappropriately and/or poorly trained rehabilitation practitioners; trends toward replacing clinicians with computers; exclusion of neuropsychologists from treatment planning and monitoring of cognitive rehabilitation; sociopolitical biases that limit access to treatment; limited vocational goal formulation; overlooking common sources of head injury; violation of patient confidentiality; poorly developed treatment programs, including software; use of standardized psychological tests as training exercises; and unnormed computerized assessment software. This paper addresses these problems with a focus on brain-injured patients. Solutions are recommended and can serve as a basis for ethical considerations in other disciplines. Recommended ethical approaches are illustrated using computer-assisted neuropsychological rehabilitation. Keywords: brain injured, ethics, rehabilitation, computers, gender, neuropsychological, treatment.
 
Article
Health research on transgender people has been hampered by the challenges inherent in studying a hard-to-reach, relatively small, and geographically dispersed population. The Internet has the potential to facilitate access to transgender samples large enough to permit examination of the diversity and syndemic health disparities found among this population. In this article, we describe the experiences of a team of investigators using the Internet to study HIV risk behaviors of transgender people in the United States. We developed an online instrument, recruited participants exclusively via websites frequented by members of the target population, and collected data using online quantitative survey and qualitative synchronous and asynchronous interview methods. Our experiences indicate that the Internet environment presents the investigator with some unique challenges and that commonly expressed criticisms about Internet research (e.g., lack of generalizable samples, invalid study participants, and multiple participation by the same subject) can be overcome with careful method design, usability testing, and pilot testing. The importance of both usability and pilot testing are described with respect to participant engagement and retention and the quality of data obtained online.
 
Conference Paper
Following a brief consideration of the role of rhetoric in law and science, the paper explores how rhetorical accounts of new technologies influence the course of legislative, judicial, and regulatory decisions. It proposes that such themes have the powerful capacity to determine outcomes and to shape modern concepts of individual as well as societal freedoms and rights.
 
Conference Paper
Communication is the fastest growing industry in the world due to a transformation of world information flows caused by the convergence of telecommunication, information technology, and mass media institutions. The emerging conglomerates are leading the world towards the information age. The new technology at their disposal is creating ever-increasing world information flows which are changing political, economic, and cultural landscapes, and redefining work, education, and development. The speed and uptake of new technology could not have been predicted, and its far-reaching political, economic, and social implications have only recently begun to emerge. The paper argues that an international communication policy is necessary to ensure the world does not divide into the information rich and the information poor.
 
Article
Electoral prediction from Twitter data is an appealing research topic. It seems relatively straightforward, and the prevailing view is overly optimistic. This is problematic because, while simple approaches are assumed to be good enough, core problems are not addressed. Thus, this paper aims to (1) provide a balanced and critical review of the state of the art; (2) cast light on the presumed predictive power of Twitter data; and (3) depict a roadmap to push the field forward. Hence, a scheme to characterize Twitter prediction methods is proposed. It covers every aspect from data collection to performance evaluation, through data processing and vote inference. Using that scheme, prior research is analyzed and organized to explain the main approaches taken to date as well as their weaknesses. This is the first meta-analysis of the whole body of research regarding electoral prediction from Twitter data. It reveals that the presumed predictive power of Twitter data has been rather exaggerated: although social media may provide a glimpse of electoral outcomes, current research does not provide strong evidence that it can replace traditional polls. Finally, future lines of research are provided, along with a set of requirements they must fulfill.
 
Article
This paper presents a framework for thinking about the different types of spreadsheet modeling applications available for teaching economics. Spreadsheet applications are categorized by the degree to which students are involved in the spreadsheet's construction, and the degree to which students are involved in the mathematics of the model. Examples of applications from each category call attention to the high degree of flexibility that instructors have in designing spreadsheet applications, and suggest alternative ways that instructors can tailor applications to fit their specific instructional style and setting. While the costs associated with developing spreadsheet applications are significant, we believe that the costs are manageable, even for instructors who are new to spreadsheets. We conclude that the advantages imparted to students by carefully tailored spreadsheet applications are significant, and we hope to encourage the development and use of such applications.
 
Article
Respondents follow simple heuristics in interpreting the visual features of questions. The authors carried out two experiments in two panels to investigate how visual heuristics affect the answers to survey questions. In the first experiment, the authors varied the distance between scale points in a 5-point scale to investigate whether respondents use the conceptual or visual midpoint of a scale. In the second experiment, the authors used different end point labels of a 5-point scale, adding different shadings of color and numbers that differed both in sign and value (2 to -2), to study whether options that are similar in appearance are considered conceptually closer than options that are dissimilar in appearance. The authors predicted that there is a hierarchy of features that respondents attend to, with verbal labels taking precedence over numerical labels, and numerical labels taking precedence over visual cues. The results confirmed the hypothesis: the effect of spacing of response options and different end points was only apparent in polar point scales and not in fully labeled scales. In addition, this study on two panels, one consisting of extremely trained respondents and the other of relatively fresh respondents, shows that trained respondents are affected by the distance between response options whereas relatively new respondents are not. To reduce the effect of visual cues, and taking into account the robustness of results, the authors suggest it is better to use fully labeled 5-point scales in survey questions.
 
Article
Conducted a pilot study of the Smithtown program, a computer program that uses the discovery world approach, in a lecture-oriented introductory college microeconomics course of 150 students. Features of the program are reviewed. Activity profiling shows that Ss asked many more questions when working on the program than in lectures or recitations. Also, Ss who completed the laboratory work with Smithtown earned final course grades that were about one-quarter of a letter grade higher than those who did not complete the lab work. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
 
Article
Examined chapters in leading economics textbooks that were accompanied by tutorial software to evaluate the role and contribution of the software to the understanding of the text. Texts included works by Baumol and Blinder, Byrns and Stone, Ekelund and Tollison, R. G. Lipsey et al., E. Mansfield, McConnell and Brue, Samuelson and Nordhaus, and B. R. Schiller. A heuristic analysis of both book text and software was done by comparing a count of the problems in the chapters with an evaluation of their subject matter and questions. Findings demonstrated that the texts relied almost exclusively on essay, short-answer, and computational questions, while the software used multiple-choice responses. It is concluded that the software packages do supplement the texts, as they reinforce and extend the understanding of economics, but they cannot substitute for the text. Major advantages of the software tutorials were information location and instant student feedback. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
 
Article
Reviews research on CAPI within US government agencies, at European statistical agencies, and at university-based research organizations in the US. CAPI was developed to reduce the time needed to collect and process survey data, to improve the quality of the information collected, to reduce survey costs, and to implement more complex questionnaire designs than are possible with paper and pencil. However, 4 areas of concern have slowed deployment of CAPI technology: respondent acceptance, interviewer acceptance, impact on data quality, and costs. Although most interviewers and respondents seem to like and accept CAPI, areas that still need to be addressed include the impact on response rates by people who dislike being interviewed by computer and the eventual cost of the system once the learning curve and initial startup and equipment costs have been settled. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
 
Article
NIST, under sponsorship from the Bureau of the Census, has collected a database consisting of 2,100 pages of binary image data of hand printed characters including numerals and text. NIST Special Database 1 contains handwriting samples from 2,100 writers geographically distributed across the United States. NIST is currently using this database to research field-isolation, box detection and removal, character segmentation, and writer-independent neural character recognition. In addition to advancing the design of algorithms, this database and its study can aid the social sciences. Observations can be compiled and used to improve form design and field layout strategies. Region-based studies and comparisons are also possible due to regionally distributed and referenced populations of writers in the database. In creating this large database, obstacles to the effective archiving of images have been dealt with and eliminated. This paper describes the database's content in terms of its colle...
 
Article
Examined the extent to which computer-mediated (CM) communication instilled a state of deindividuation (low private and low public self-awareness [SA]) vs a state of high private and low public SA among 55 Ss with previous computer experience. Ss participated in face-to-face (FTF) or CM problem-solving discussions and completed self-report measures of private and public SA. The CM group reported significantly higher levels of private SA and marginally lower levels of public SA, suggesting that CM Ss were not deindividuated. CM Ss with low public SA were more likely to evaluate the social context negatively. The social evaluations of FTF Ss were not related to their levels of SA. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
 
Article
The paper questions three common assumptions: (1) that electronic forums will certainly strengthen pre-existing face-to-face territorial (and other intact) communities; (2) that electronic forums will replace them with comparably pro-social forms; and (3) that "market forces" will naturally foster electronic forums that are prosocial. It proposes a strong research program that examines how the social design/organization of electronic forums strengthens or weakens group life in workplaces and communities. Keywords: electronic forums, community, social capital, communication.
 
Article
Reports results of the use of the Gathering program, a computer simulation based on the principle of perception control theory, on C. McPhail and R. Wolstein's collective locomotion experiment (see record 1987-34354-001). The program was used to explain purposive individual and collective locomotion behavior found in the experiment. Ss in five 12-person treatment groups were given a set of instructions, similar to those used by demonstration organizers, involving movement from one point to another. Results show that coordination increased as specificity of instructions rose. The simulation's ability to produce this relationship supports the hypothesis that collective behavior is the result of Ss with similar or related reference signals. However, the current version of the Gathering program is not sophisticated enough to simulate all details of McPhail and Wolstein's study. Suggestions are provided for improving the program. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
 
Article
This article argues that for the social sciences to develop the software and services needed for effective use of computer networks, they will have to adopt and abide by standards that have been developed largely outside the social sciences. These standards are essentially those employed in the World Wide Web. It is suggested that these standards will prove sufficiently powerful to meet the needs of the social sciences and that, in fact, a qualitatively superior level of information services can be made available to students and researchers through such tools. Communication, cooperation, and collaboration are urged for efficient and rapid development of these services for the social sciences.
 
Conference Paper
Increasingly, government agencies are facing the challenge of effective implementation of information technologies critical to their digital government programs and initiatives. This article reports two user-centric evaluation studies of COPLINK, an integrated knowledge management system that supports and enhances law enforcement officers' crime-fighting activities. Specifically, the evaluations concentrate on system usability and user acceptance in the law enforcement setting. The article describes the study designs, highlights the analysis results, and discusses their implications for digital government research and practices. Findings from these studies provide valuable insights into digital government system evaluation and, at the same time, shed light on how government agencies can design adequate management interventions to foster technology acceptance and use.
 
Conference Paper
The UrbanSim project provides a case study in Digital Government, and this article examines progress to date in developing and applying the system in a range of metropolitan areas. Digital Government is meant here in the context of an innovative, cross-cutting initiative of the National Science Foundation. The project integrates academic research on urban simulation modeling and policy evaluation with research on human-computer interaction and software engineering, and uses a value-sensitive design to ensure that the system addresses the needs of governments and citizens. This article addresses the importance of the problem domain and the project objectives, presents a range of challenges, and outlines the design and application of UrbanSim in response to these. It discusses issues arising in the application of the model in the Salt Lake City metropolitan region, where a lawsuit over a highway project has precipitated use of UrbanSim to assess the interactions of transportation, land use, and environmental outcomes, and concludes with an assessment and directions for future research.
 
Article
Artificial Intelligence: The Very Idea is a lucid and admirably objective consideration of what artificial intelligence can reasonably aspire to and how that relates to the capabilities of human minds as understood in psychology and philosophy. Despite the virtual absence of technical notation, important computational ideas and techniques are presented with remarkable fidelity. The author bids us to consider (though not necessarily to accept) not only the very idea that one might artifice intelligence, but also the startling related idea that our own intelligence is equivalent to an artifice, that "we are, at root, computers ourselves."
 
Article
Inter-Nation Simulation IV (INS4) is a dynamic microcomputer model of the international system based on Harold Guetzkow's work at Northwestern University in the 1950s. About six decision making teams are assigned to prototype nations with defined characteristics concerning internal satisfaction with domestic and foreign policy, population and natural resources, research and development, military capability and defense, capital formation and depreciation, political structure, and national budget. A standard simulation run consists of a briefing, a trial period, several decision sessions dealing with a particular issue (four alternative scenarios are provided), and a debriefing.
 
Article
This article illustrates the structural changes in hyperlink networks from Web 1.0 to Web 2.0 and describes Web 1.0 using hyperlink data obtained from websites of South Korean National Assembly members between 2000 and 2001. The websites were sparsely knitted and formed a hub-spike network. Hyperlinks were created to enhance the interface and navigation ability of websites. The article also examines how hyperlink patterns began to change in 2005 and 2006 when Web 2.0 (blogs) was introduced. A key difference between Web 1.0 and Web 2.0 was that the Assembly members were relatively well connected in the blogosphere. Furthermore, prominent Web 1.0 hubs with many links tended to disappear, but butterfly networks based on political homophily emerged. Finally, the hyperlink network of Twitter, a recent Web 2.0 application, is examined. Twitter’s network diagram shows that online social ties between politicians are becoming denser.
 
Article
Artificial Intelligence and Psychiatry is a detailed discussion of the relationship of computer science to psychiatric practice. To illustrate his points, the author reviews schematically a number of concepts of artificial intelligence programming without introducing technical knowledge of programming or mathematics. It is well referenced and its tone is very positive, stopping short of strident.
 
Article
This book is most useful for the moderately experienced programmer who is taking an initial foray into dBASE II. It provides both a well-conceived approach to thinking through basic problems in constructing and using a database and a generally valid model for dBASE II programs. Its most significant flaws lie in attempting to reach too varied an audience and in failing to provide a completely valid programming model. Nonetheless, supplemented with access to an experienced dBASE II programmer, it can be the right guide to begin using dBASE II as a programming language. (Reviewed by Larry S. Luton, Eastern Washington University)
 
Article
Social scientists seeking empirical knowledge of the effects of computer use in organizations continue to rely on the findings of the University of California, Irvine's Urban Information Systems Project. Danziger and Kraemer's book is one result of that project and a recent addition to the CORPS (Computing, Organizations, Policy, and Society) monograph series, which promises to have a major impact on the sociology of computing literature. Their research includes a detailed 1975 survey of computer use in the governments of all large U.S. cities and counties, supplemented by systematic field studies of 42 municipal governments and 50 additional exploratory case studies chosen for their particular interest. Besides being a model of clearly designed and presented survey research, this book transcends many of survey research's limitations by incorporating direct observation, in-depth interview material, and the authors' considerable personal experience with computing in organizations. Despite its quantitative focus, this is a remarkably readable book of interest to readers concerned with the qualitative dimensions of computer use. It would also be suitable for students in a graduate or advanced undergraduate class.
 
Article
The program is designed for those desiring to apply more specific statistical tests to the Campbell-Fiske multitrait-multimethod matrix. The package uses nonparametric tests to test hypotheses about four Campbell-Fiske convergent and discriminant validity criteria, with an option to use reliability coefficients as input for attenuation correction of MTMM coefficients. The package also contains an additional program which provides the same analyses but presents the results in a more traditional ANOVA format.
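
As a rough illustration of one Campbell-Fiske criterion (each convergent validity should exceed the heterotrait-heteromethod correlations in its row and column), the sketch below simply counts comparisons; it is not the package's nonparametric tests, and the 3-trait, 2-method correlation matrix is invented.

```python
# Hedged sketch: descriptive check of a Campbell-Fiske discriminant validity
# criterion on an invented MTMM correlation matrix (3 traits x 2 methods).
import numpy as np

traits, methods = 3, 2
labels = [(t, m) for m in range(methods) for t in range(traits)]  # trait-within-method order
R = np.array([
    [1.00, .30, .25, .55, .20, .15],
    [ .30, 1.00, .35, .22, .60, .18],
    [ .25, .35, 1.00, .17, .21, .50],
    [ .55, .22, .17, 1.00, .28, .24],
    [ .20, .60, .21, .28, 1.00, .33],
    [ .15, .18, .50, .24, .33, 1.00],
])

wins = comparisons = 0
for i, (ti, mi) in enumerate(labels):
    for j, (tj, mj) in enumerate(labels):
        if i < j and ti == tj and mi != mj:          # convergent validity r_ij
            validity = R[i, j]
            for k, (tk, mk) in enumerate(labels):
                if tk != ti:
                    if mk == mj:                     # same row, heteromethod block
                        comparisons += 1
                        wins += validity > R[i, k]
                    if mk == mi:                     # same column, heteromethod block
                        comparisons += 1
                        wins += validity > R[k, j]
print(f"criterion satisfied in {wins}/{comparisons} comparisons")
```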
 
Article
The media of the 1990s are variously described as interactive media or as belonging to the so-called multimedia era. Cable television, laptop computers, fiber optics, direct broadcast satellites, cellular telephones, and facsimile technology are being joined by the latest in interactive television, desktop video conferencing, and cable television modems, all of which bring together the streams of computer-based information of the World Wide Web with the visual images of cable and broadcast television to create a stream of information into the nation's households and workplaces. Political campaigns in the 1990s are tapping into this dizzying array of multimedia. In the 1996 election, the latest multimedia technologies were evident throughout the election season. This article reviews the application of these new technologies in political campaigns, with particular attention to the use of the World Wide Web in the 1996 presidential election campaign.
 
Article
We first develop the network paradigm that is currently dominating the way we think about the internet and introduce varieties of social networking that are being fashioned in interactive web environments. This serves to ground our arguments about Web 2.0 technologies. These constitute ways in which users of web-based services can take on the role of producers as well as consumers of information that derive from such services with sharing becoming a dominant mode of adding value to such data. These developments are growing Web 2.0 from the ground up, enabling users to derive hitherto unknown, hidden and even new patterns and correlations in data that imply various kinds of social networking. We define crowdsourcing and crowdcasting as essential ways in which large groups of users come together to create data and to add value by sharing. This is highly applicable to new forms of mapping. We begin by noting that maps have become important services on the internet with nonproprietary services such as Google Maps being ways in which users can fashion their own functionality. We review various top-down and bottom-up strategies and then present our own contributions in the form of GMapCreator that lets users fashion new maps using Google Maps as a base. We have extended this into an archive of pointers to maps created by this software, which is called MapTube, and we demonstrate how it can be used in a variety of contexts to share map information, to put existing maps into a form that can be shared, and to create new maps from the bottom up using a combination of crowdcasting, crowdsourcing and traditional broadcasting. We conclude by arguing that these developments define a neogeography which is essentially ‘mapping for the masses’.
 
Article
Digital cable modems, wireless Internet service providers (ISPs), digital subscriber line (DSL) services, fiber optics systems, and Internet access through a growing number of direct satellite television providers have transformed the way that millions of Americans receive the latest news and information about political campaigns. Multilingual video, text, and audio files on candidate Web sites, dial-up Internet services provided through the Republican National Committee and the Democratic National Committee, electronic mail updates of policy announcements, Web site rebuttals to the statements made during the nationally televised presidential debates, full-color printable flyers and bumper stickers for laser printing, and other forms of interactive software and multimedia were just a few of the new developments in the 2000 presidential race. This article addresses the changing multimedia technologies—including the use of local television markets and the new uses of the World Wide Web—in the 2000 presidential election.
 
Percentage of major party candidates who adopted Facebook in congressional campaigns, 2006–2012.  
Logistic Regression Analysis of Facebook Adoption in the 2012 U.S. House Elections.
Logistic Regression Analysis of Facebook Adoption by Nonincumbents in the 2012 U.S. House Elections.
Article
Diffusion of innovation theory is used to explain adoption of Facebook in the 2012 campaigns for the U.S. Congress and to identify characteristics that differentiate the small subset of candidates who did not create a Facebook presence from the large majority who did. Models of Facebook adoption for House candidates reveal that there are no differences between Republicans and Democrats. Nonadopters are significantly more likely to be challengers or open-seat candidates, poorly financed candidates, candidates in noncompetitive races, and older. Among nonincumbents, Republicans and candidates from Republican-oriented districts are more likely to adopt. This study serves as one of the first examinations of social media adoption by congressional candidates in the 2012 elections and discusses relevant developments and trends in adoption since Facebook's introduction in 2006.
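
A hedged sketch of the kind of adoption model reported above: a logistic regression of Facebook adoption on candidate characteristics, summarized as odds ratios. The predictor names and simulated data are assumptions for illustration, not the study's variables.

```python
# Hedged sketch: logistic regression of adoption on candidate traits, with odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 800
df = pd.DataFrame({
    "incumbent": rng.integers(0, 2, n),
    "log_receipts": rng.normal(12, 2, n),      # campaign funding (log dollars)
    "competitive_race": rng.integers(0, 2, n),
    "candidate_age": rng.normal(55, 10, n),
})
# simulated adoption outcome driven by the predictors above
logit = (-4 + 1.2 * df.incumbent + 0.3 * df.log_receipts
         + 0.8 * df.competitive_race - 0.03 * df.candidate_age)
df["adopted_facebook"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["incumbent", "log_receipts",
                        "competitive_race", "candidate_age"]])
res = sm.Logit(df["adopted_facebook"], X).fit(disp=False)
print(np.exp(res.params).round(2))   # odds ratios; values > 1 raise the odds of adoption
```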
 
Concept map between key words on official Barack Obama Facebook page. Note. R² = .176, Stress = .433.
Concept map between key words on official Mitt Romney Facebook page. Note. R² = .175, Stress = .437.
Concept map of spatial distances between key words on #election2012 Twitter page. Note. R² = .108, Stress = .437.
Article
By being embedded in everyday life, social networking sites (SNSs) have altered the way campaign politics are understood and engaged with by politicians and citizens alike. However, the actual content of social media has remained a vast but somewhat amorphous and understudied entity. The study reported here examines public sentiment as it was expressed in just over 1.42 million social media units on Facebook and Twitter to provide broad insights into dominant topics and themes that were prevalent in the 2012 U.S. election campaign online. Key findings include the fact that contrary to what one might expect, neither presidential candidate was framed in an overly critical manner in his opponent’s Facebook space nor on Twitter’s dedicated nonpartisan election page. Beyond this, similarities and divergences in sentiment across social media spaces are observed that allow for a better understanding of what is being communicated in political social media.
 
Article
This research examined the influence of attention to specific forms of traditional and online media on young adults’ online and offline political participation as well as voter turnout during the fall 2012 presidential campaign. A three-wave panel survey demonstrated that attention to traditional media did not increase offline and online political participation in September; instead, participation was heightened by attention to online sources, particularly presidential candidate websites, Facebook, Twitter, and blogs. In the following months, individual-level change in participation was attributable to attention to several online media sources as well as change in media attention. In the case of voter turnout, results suggest that television attention was positively linked to voter likelihood in September but was negatively linked to individual-level change in voter turnout in November.
 
Article
A posttest-only experimental design is used to test the effects of humorous negative video ads from the 2012 presidential campaign on evaluations of Barack Obama and Mitt Romney among young people. Findings show that ads targeting Romney had a negative effect on attitudes toward him. Romney also had his evaluations lowered as the result of respondents viewing third-party ads attacking Obama. This may be consistent with some research that suggests that negative political ads have a “backlash,” or “boomerang” effect on their source. Obama, on the other hand, was largely insulated from both target and source effects. The study suggests first that candidates who decide to “go negative” may not be able to insulate themselves from unintended negative effects by framing their ads in a humorous fashion. Moreover, because the anti-Obama ads were sponsored by a third party, it suggests that the source really might matter in terms of the effects ads have on “their” candidates.
 
Predicting Offline Participation, Online Participation, and Political Interest.
Predicting Selective Exposure, Selective Avoidance, and Strength of Party Affiliation.
Article
The increasing popularity of social network sites (SNSs) in election campaigns provides a unique climate for scholarly inquiry. The study reported here builds upon Zhang, Johnson, Seltzer, and Bichard and investigates the impact of different types of SNS use on voters’ attitudes and behavior during the 2012 U.S. presidential campaign. Sites such as Facebook, Google Plus, Twitter, and YouTube are included to offer a robust assessment of distinct relationships. A national online panel of Internet users was utilized to examine reliance on SNSs and the multiple consequences on political attitudes and behavior such as political participation, political interest, selective exposure, selective avoidance, and strength of party affiliation. The findings are evaluated for theoretical and practical implications on democratic governance.
 
Functions of Twitter based on usage conventions 
Number of users posting #s21 for the first time per day
Usage patterns during different time spans 
Retweets 
Hyperlinks 
Article
Political actors increasingly use the microblogging service, Twitter, for the organization, coordination, and documentation of collective action. These interactions with Twitter leave digital artifacts that can be analyzed. In this article, we look at Twitter messages commenting on one of the most contentious protests in Germany's recent history, the protests against the infrastructure project Stuttgart 21. We analyze all messages containing the hashtag #s21 that were posted between May 25, 2010, and November 14, 2010, by the 80,000 most followed Twitter users in Germany. We do this to answer three questions: First, what distinguishes events that resulted in high activity on Twitter from events that did not? Second, during times of high activity, does the behavior of Twitter users vary from their usual behavior patterns? Third, were the artifacts (retweets, links) that dominated conversations during times of high activity indicative of tactical support of the protests or of symbolic association with it?
 
Online Activities 
Article
This study examines what online activities politically interested Internet users regularly engage in and how online activities are linked to motivations for using the Internet. This study found that politically interested web users were motivated to go online for different reasons than the general public and therefore they participated in different activities online. However, only for entertainment sites did audience activity serve as an important intervening variable between gratifications sought and obtained.
 
Article
Computer-mediated discussion is an increasingly popular method of engaging in political talk with other citizens. This article presents a case study of a Usenet newsgroup focused on abortion, and discusses the creation of a public sphere by the conversants. The notion of the public sphere is discussed, and measures allowing an assessment of its democratic character are proposed. A formula for estimating entropy is developed and applied to data obtained from the case study of an ongoing discussion. A high level of inequality in participation among conversants is noted, with very few of the discussants responsible for an extraordinarily high proportion of the content. This inequality, though tempered by analysis of the newsgroup on a day-to-day basis, calls into question the democratic character of the public sphere represented by this conversation.
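
A minimal sketch of one way such an entropy measure of participation inequality can be computed, assuming simple Shannon entropy of the per-author message distribution normalized by its maximum; the message counts below are invented.

```python
# Hedged sketch: normalized Shannon entropy of posting activity in a discussion.
import numpy as np

def normalized_entropy(msg_counts):
    counts = np.asarray(msg_counts, dtype=float)
    p = counts / counts.sum()              # share of messages per author
    p = p[p > 0]
    h = -(p * np.log2(p)).sum()            # Shannon entropy in bits
    return h / np.log2(len(counts))        # 1 = perfectly even; toward 0 = dominated by few

# a few prolific posters and many occasional ones
counts = [400, 350, 300, 120] + [3] * 60
print(round(normalized_entropy(counts), 3))
```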
 
Article
As Internet use has proliferated worldwide, there has been debate whether some users develop disturbed patterns of Internet use (i.e., Internet abuse). This article highlights relevant literature on Internet abuse and computer-mediated communication effects that supports and disputes major questions about Internet abuse. Is the addiction paradigm appropriate for Internet use? Is behavior that has been labeled Internet abuse symptomatic of other problems such as depression, sexual disorders, or loneliness? What are alternative explanations for this phenomenon? Is there adequate research to support Internet abuse as a distinct disorder?
 
Article
Whether substance abuse treatment centers affect neighborhood crime is hotly debated. Empirical evidence on this issue is lacking because of the difficulty of distinguishing the crime effect of treatment centers in high-crime areas, the inability to make before-and-after comparisons for clinics founded before computerized crime data, and the need for appropriate control sites. The authors present an innovative method (without an actual data analysis) to overcome these challenges. Clinic addresses and crime data are geocoded by street address. Crimes are counted within concentric-circular, 25-meter “buffers” around the clinics. Regression analyses are used to calculate the “crime slope” (β) among the buffers. A negative β indicates more crimes closer to the site. A similar process is used to evaluate crimes around control sites: convenience stores, hospitals, and residential points. This innovative technique provides valid empirical evidence on crime around substance abuse treatment centers.
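
A hedged sketch of the buffer-and-slope procedure described above: crimes are counted in concentric 25-meter rings around a site, and ring counts are regressed on ring distance, so a negative slope (β) indicates crime concentrating near the site. The distances are simulated rather than geocoded crime data.

```python
# Hedged sketch: crime counts per 25-m buffer regressed on buffer distance.
import numpy as np

rng = np.random.default_rng(0)
# simulated distances (meters) from one clinic to each geocoded crime;
# an exponential pull toward the site stands in for real geocoded points
dist = rng.exponential(scale=150, size=1000)

edges = np.arange(0, 525, 25)                    # concentric 25-m buffers out to 500 m
counts, _ = np.histogram(dist, bins=edges)
ring_mid = (edges[:-1] + edges[1:]) / 2          # midpoint distance of each buffer

beta, intercept = np.polyfit(ring_mid, counts, 1)
print(f"crime slope beta = {beta:.3f}")          # beta < 0: more crimes closer to the site
```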
 
Correlations Among Key Variables (Internet Addiction Index [IAI], Overall Grade, Academic Competence)
Regressing Internet Addiction on Demographics, Internet Literacy, and Internet Activities (outcome: Internet Addiction Symptoms)
Hierarchical Regression of Demographics, Internet Literacy, Addiction Symptoms, Internet Activities, and Mass Media Use on Academic Performance
Article
This study examines the interrelationships among Internet literacy, Internet addiction symptoms, Internet activities, and academic performance. Data were gathered from a probability sample of 718 children and adolescents, aged 9–19, in Hong Kong, using face-to-face interviews. Regression results show that adolescent Internet addicts tended to be male, in low-income families, and not confident in locating, browsing, and accessing information from multiple resources, but that they were technologically savvy and frequent users of social networking sites (SNS) and online games for leisure. Contrary to what was hypothesized, Internet literacy, especially in publishing and technology, increases, not decreases, the likelihood of someone getting addicted to the Internet. As expected, Internet activities, especially SNS and online games, were significantly and positively linked to Internet addiction as well as to all Internet addiction symptoms. This finding suggests that leisure-oriented Internet activities can be much more addictive than other applications such as communicating by e-mail or browsing webpages. Furthermore, the higher subjects scored on tool and social-structural literacy, the better their academic performance would be; however, technical literacy skills, such as publishing and technology literacy, were not significant predictors for academic performance. This indicates that adolescents who can locate, browse, and access different information resources and who are knowledgeable about the context under which the information was created performed better both in overall grades and in academic competence.
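
A minimal sketch of a hierarchical (blockwise) regression of the kind reported above, assuming simulated variables: predictor blocks are entered in stages and the change in R² is inspected at each step. Variable names and data are illustrative assumptions.

```python
# Hedged sketch: blockwise OLS with R-squared increments per entered block.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 718
df = pd.DataFrame({
    "age": rng.integers(9, 20, n),
    "male": rng.integers(0, 2, n),
    "internet_literacy": rng.normal(0, 1, n),
    "addiction_symptoms": rng.normal(0, 1, n),
    "sns_hours": rng.normal(0, 1, n),
})
df["academic_performance"] = (0.2 * df.internet_literacy
                              - 0.3 * df.addiction_symptoms
                              - 0.1 * df.sns_hours
                              + rng.normal(0, 1, n))

blocks = [["age", "male"],
          ["internet_literacy"],
          ["addiction_symptoms"],
          ["sns_hours"]]

cols, prev_r2 = [], 0.0
for step, block in enumerate(blocks, 1):
    cols += block
    X = sm.add_constant(df[cols])
    r2 = sm.OLS(df["academic_performance"], X).fit().rsquared
    print(f"step {step}: R2 = {r2:.3f}  (delta R2 = {r2 - prev_r2:.3f})")
    prev_r2 = r2
```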
 
Article
This narrative recounts experiences at the Virginia Polytechnic Institute and State University in the design process of two university courses meant to test the potential and the limits of the technologies represented in the Internet. The courses described evolved as a means for teaching a junior and senior level political theory class for students at distant locations. These courses were designed to be fully Internet-based, i.e., students could take the class from anywhere if they had Internet access. These two courses were designed to exploit the capabilities of the World Wide Web to carry text, e-mail, and online chat sessions as the communications means for teaching political science without convening face-to-face meetings at preset times. The students were able to: access all course materials (syllabi, course assignments, and class notes), read the assigned texts in full, original forms, submit all graded work online and receive grades back online, and interact with the instructor and each other in one-on-one or group e-mail and chat sessions. The course was self-paced and self-directed. The course design was constrained by the need to make source materials available (finding already available text and securing copyright permission for other text), the need to implement standard hypertext markup language to present the text, maintenance of the Web site, and the administrative and institutional framework of a large university. (JLS)
 
Article
The Patient-Reported Outcomes Measurement Information System (PROMIS) network, funded as part of the National Institutes of Health's Roadmap initiative, is in the process of developing a revolutionary computerized adaptive testing system for use in the clinical research community as a standardized method to select and implement patient-reported outcome measures. Soliciting end-user feedback on the system has posed logistical challenges, given the magnitude of the system's scope and the diversity of the target audience and their research needs. This case study presents the application of multiple qualitative methods—participant observation, usability testing, and focus groups—to determine end-users' acceptance of the system and its usability. Findings from these methods highlight the value in using a multifaceted approach to solicit end-user input to software development.
 
Article
Communication in cyberspace creates the possibility of forming communities of interest that can either compete with or reinforce communities in place. For some individuals, computer networks offer opportunities for greater participation in public life; for others, participation in on-line groups has the potential to diminish commitment to local communities. On-line communities contain unique forms of immediacy, asynchronicity, and anonymity that give them dynamics not shared by communities based on physical presence and face-to-face interaction. These new forms of community, and the technological and cultural resources required for participation in them, have the potential for creating new forms of stratification and hence new barriers to universal access.
 
Cultural theory typology.
Article
The Internet facilitates a level of interoperability that generates considerable innovation and opportunity. Yet threats to governments, businesses, and individuals who use the Internet are increasing exponentially. This article deploys an anthropological understanding of risk in order to examine public sector action and capacity with respect to the multidimensional challenge of cyber-security. Our objectives are threefold: to gain a fuller appreciation of the interplay of political, technological, organizational, and social dimensions of cyber-security; to understand how this interplay is further shaped by clashing values and perceptions of risk; and to offer some prescriptive insight into the sorts of roles for government most likely to maximize systemic resilience and learning in an increasingly interdependent and virtual environment. Governments, the private sector, and civil society must engage in more shared responsibilities and collective learning in what is a highly fragile and dynamic cyberspace.
 
Participation modes. e1–e14 are the measurement errors. Each R² is reported in boldface and italic above each item. The standardized regression weights are reported below the arrows from the modes to the activities. χ² = 426.190; df = 71; CFI = .901; PCFI = .609; RMSEA = .078; N = 808. Note: CFI = comparative fit index; df = degrees of freedom; PCFI = parsimony comparative fit index; RMSEA = root mean square error of approximation.
Differences Between Participation Modes 
Model Results 
Participation Rates 
Article
In this article, the authors investigate whether and how young people combine online and offline civic activities in modes of participation. The authors discuss four participation modes in which online and offline activities may converge: Politics, Activism, Consumption, and Sharing. Applying confirmatory factor analysis to survey data about civic participation among Dutch youth (aged 15−25 years; N = 808), the authors find that online and offline activities are combined in the Politics, Activism, and Sharing modes, and that these three modes correlate significantly with each other. Conversely, the Consumption mode can only be validated as a separate offline participation mode. The results confirm the conclusion of previous studies that youth's participation patterns are relatively dependent on mode, and add that their participation is concurrently relatively independent of place (offline vs. online).
 
Characteristics of Respondents and Parental Referents of Adult Children
Recommended Moves to Retirement Community (Percentages)
Logistic Regression Results Showing Effects of Substantive Variables in Vignette Structure on Recommendations for Move to a Retirement Community, Older Adults and Adult Children
Logistic Regression Results Showing Effects of Incidental Variables and Effects of Positioning of Financial Dimension
Article
This article illustrates an innovative method of administering stated choice studies (or vignette experiments) using computers and the Internet. The use of video clips to deliver information to research participants makes vignettes more realistic, helps to engage interest of research participants, and can reduce framing effects. The method also provides research participants with interactive options before making judgments. A study to determine the views of older people regarding residential options is used to illustrate the method. Even older people with limited experience in using computers participated successfully. The study findings showed that research participants responded both to the audiovisual characteristics of vignette persons and to the variables in the vignette structure.
 
Article
The objective of this article is to analyze the quality of the information collected by a self-administered survey addressed to citizens of Andalusia, who were offered the possibility of answering by post or via the Internet. The study showed the advantages of using web-based self-administered questionnaires. Web surveys showed a low number of unanswered questions, more detailed answers to open questions, and longer answers to questions than those generated from paper questionnaires. The five open questions with text answers elicited longer responses from the web survey, around 63 characters more. In the open questions with numerical answers, the use of a drop-box (select list) generated better responses than the blank space left in the paper questionnaire.
 
Article
Despite the extensive use of web surveys today, there are certain methodological factors related to participant cooperation and data quality, which remain unclear and require further study. Here, the authors compare responses to a survey administered in two formats—electronic or by post—in terms of overall response rate and the quality of the data collected. Web and mail questionnaires were sent to a sample of 572 PhD holders, asking them about aspects related to their academic career and personal and family data to investigate the factors that determine scientific productivity. The web questionnaire elicited a significantly higher response rate than the mail questionnaire. Response rates did not differ between males and females; however, topic salience had an effect on the response rate. Finally, data quality was higher in web surveys than in the mail surveys, with fewer overall errors, fewer missing items, and longer responses in open-ended questions.
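
A hedged sketch of how such a response-rate comparison might be tested, assuming a simple chi-square test on a 2 × 2 table of responded versus did-not-respond by survey mode; the counts below are invented, not the study's data (the study's sample was 572 PhD holders).

```python
# Hedged sketch: chi-square test of response rate by survey mode (invented counts).
from scipy.stats import chi2_contingency

#            responded, did not respond
table = [[190, 96],    # web questionnaire
         [140, 146]]   # mail questionnaire
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```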
 
Article
This text lists Internet sites useful to political scientists seeking to uncover online resources. It is not intended to be a comprehensive listing. Rather, it is a resource inventory of selected sites most useful for those with general interests in political science and public administration. Permission is granted to reproduce for non-commercial handout purposes. Keywords: political science, public administration, telecommunications, Internet, World Wide Web, federal government online research.
 