Computing Crime: Information Technology, Police
Effectiveness, and the Organization of Policing
Luis Garicano
University of Chicago
and CEPR
Paul Heaton
University of Chicago
December 4, 2006
Abstract
How does information technology (IT) affect the organization of police work?
How does it in turn affect police crime-fighting effectiveness? To answer these
questions, we construct a new panel data set of police departments covering
1987-2003. We find that while IT adoption had substantial effects on a wide
range of police organizational practices, it had, by itself, a negligible impact on
crime-fighting effectiveness. These results are robust to various methods for con-
trolling for agency-level characteristics and the endogeneity of IT use. We then
suggest and test two explanations for this puzzle. First, we demonstrate that
use of a particular technology, computerized record-keeping, increased recorded
crime rates. Second, we provide evidence that IT investments only had a sub-
stantial impact on crime clearance rates and crime rates when undertaken as
part of a broad set of complementary organizational practices such as those in
the Compstat program.
JEL Classification: L23, M5, O33, K42
Garicano thanks the Toulouse Network for Information Technology (TNIT) and Heaton the Na-
tional Consortium on Violence Research (NCOVR) for financial support. We also thank Daron Ace-
moglu, David Autor, Austan Goolsbee, Robert Topel, and Toulouse Network members for their com-
ments. The authors can be reached at luis.garicano@gsb.uchicago.edu and psheaton@uchicago.edu.
1 Introduction
Crime fighting is essentially an information processing task: police agents must use the information available at the local and aggregate levels to prevent and solve crimes.[1] Thus, we should expect large changes in the cost of processing information to have an important impact on the organization of police work. In this paper we study the impact of IT on the organization and effectiveness of policing using a newly constructed panel data set of police agencies covering the period 1987-2003, which we have merged with FBI local-level crime data.
Our paper contributes to a large literature on the impact of IT on the organization
of work. Despite the growing size of this literature, our knowledge of IT’s impact is still
spotty, in part due to the lack of availability of firm-level data on organizational change
and information technology adoption over time. Some previous studies of the impact
of IT cannot examine organizational changes because they use the industry as the unit
of analysis [e.g. Stiroh (2002), Autor, Katz, and Krueger (1998), and Berman, Bound,
and Griliches (1994)], or because they rely on a cross-section of firms [e.g. Acemoglu,
Aghion, Lelarge, Van Reenen, and Zilibotti (2006) and Bresnahan, Brynjolfsson, and Hitt (2002)[2]]. Others do follow individual firms over time, but either have no data on
information technology adoption [e.g. Rajan and Wulf (2006), Berman, Bound, and
Griliches (1994)], or no information on organizational change [e.g. Brynjolfsson and
Hitt (2003)]. Only a very small number of previous papers provide firm-level evidence on the evolution of information technology, skill usage, and organizational change; notably Caroli and Van Reenen (2001), which reports data for the 1980s in the UK and the early 1990s for France, and Doms, Dunne, and Troske (1997), who study a panel of manufacturing firms between 1987 and 1992. Like these papers, our paper utilizes firm-level data on the evolution of skills, organization, and information technology adoption. However, it is the first to systematically examine non-manufacturing firms (in our case, public organizations) and the first to study the majority of the firms within the observed industry. Moreover, it is the only paper to include a long panel (16 years) covering most of the period of the recent IT revolution. Finally, by merging in agency-level data on crime rates and arrest levels, we are able to incorporate rich productivity measures into our analysis.

[1] Following Arrow (1974), a large literature studies organizations as information processing and problem solving institutions, e.g. Radner and Van Zandt (1992), Bolton and Dewatripont (1994), and Garicano (2000).
[2] This paper has panel data on IT and inputs, but cross-sectional information for organizational variables.
We start by studying the impact of computerization on productivity and organiza-
tion using a panel of police departments. Our main identification strategy compares,
controlling for city size and other characteristics, the organization and productivity
of departments that adopted more computing technology to that of departments that
adopted less. Consistent with previous research,[3] we find that IT adoption and skill
are complementary: departments that adopt IT increase police training and introduce
college requirements for new recruits. The evidence suggests that this increase in train-
ing is primarily related to the need to learn to use new devices, rather than IT-induced
enhancement in the training process. Moreover, adopting departments become larger,
increasingly employ special units, and include a larger fraction of support personnel.
In sum, departments become more highly skilled and their organization in many ways
more complex. Despite these changes, we find little evidence that general IT adoption
resulted by itself in an increase in the effectiveness of police work, as manifested both
in clearance rates and in crime rates. We carefully analyze the generality of these re-
sults, and find them robust to alternative samples (by size, by period, early adopters,
growing versus non-growing cities, etc.) and specifications of the IT measure.
Correctly interpreting the underlying causal mechanisms at work is an important
consideration here–our findings indicating that IT promotes organizational change
could reflect reverse causality or omitted variable bias. Given the nature of organi-
zational change, which often involves simultaneous adjustments on a number of dimen-
sions and which may be driven by factors unobserved to researchers, sorting out causal
pathways can be challenging. Using the available data, we attempt to address several
alternative explanations for the strong relationship between IT use and our organizational measures. By including both year and agency fixed effects in our specifications, we first remove variation that may be due to systematic differences across departments (such as geography) as well as macroeconomic trends. We also find our results robust to inclusion of time trends by state or initial level of computerization.

[3] See, for example, Autor, Katz, and Krueger (1998) and Lehr and Lichtenberg (1999).
If, as agencies increase in size, their optimal structure involves increasing use of IT
and changes in organizational form, failure to adequately account for agency size could
suggest a spurious effect of IT on organization. In each of our baseline regressions we
flexibly control for the relevant aspect of agency size or workload. As additional checks,
we rerun our regressions first limiting the sample to the largest and smallest agencies and then including a full set of agency-size decile and year interactions as controls. The strong positive relationships between IT, worker training, and worker skill persist in these specifications.
Poorly managed departments may undergo overhauls that affect both IT use and
organizational variables. Using civil litigation cases filed against an agency in 1987 as a
measure of initial department quality, we uncover little evidence suggesting differential
IT adoption by poorly functioning agencies. Alternatively, younger, dynamic cities,
such as Houston or Seattle, may have unobserved characteristics that promote both
IT use and different bureaucratic evolution. Limiting the sample to shrinking cities or
cities with little population change does not alter our conclusions, however.
We also estimate specifications including leads of IT intensity as additional ex-
planatory variables to assess whether exogenous organizational reform could prompt
IT adoption (reverse causality), but obtain little indication of such effects. Another
possibility is that agencies with larger budgets are able to implement both information
technology and superior organizational practices such as increased training. However,
the strong relationship between IT and organization persists when we directly control
for equipment expenditure in our regressions, suggesting that this relationship is not
driven primarily by resource availability.
As a final check, we employ two different instrumental variables (PC availability in
the broader area and body armor use) that attempt to capture variation in the supply of
and demand for IT exogenous to our organization and effectiveness measures. Although
limited, our instrumental variables analysis supports the hypothesis that IT adoption
leads to organizational change. Taken as a whole, our evidence is most consistent
with a causal effect of information technology on organizational structure, with the
large technological changes driving IT adoption in the broader economy contributing
to both computerization and organizational change in the police sector but having little
apparent effect on productivity.
These findings are puzzling: while computers matter organizationally, their effects
do not show up in the productivity numbers.[4] We propose two possible explanations
for this puzzle, and test them: improvements in crime measurement and complemen-
tarities.
First, although some information technologies, such as those that identify crime
‘hot-spots’, should improve deterrence, others could actually worsen crime statistics.
For example, if crime reporting is improved, reported crime rates will increase while clearance rates will drop. Our data contain detailed questions on computer functions, such as record-keeping, police dispatch, fleet management, etc. We test for heterogeneous effects of different technologies by simultaneously entering record-keeping and deployment measures in our panel regressions. Offense reports increase by 10% when computers are used for record keeping. Consistent with this hypothesis, such increases take place for crimes that are more likely to suffer from under-reporting, e.g. larceny, rather than those which are severe and thus always reported, such as homicide. Deployment technologies, in contrast, are negatively (albeit weakly) associated with offense rates.
Second, we consider the complementarities hypothesis, first advanced formally by
Milgrom and Roberts (1990).[5] Although IT by itself may have little impact, its impact may be substantial when introduced within the context of an organizational and human
resource system designed to take advantage of it. In the specific context of police
work, the complementarity hypothesis takes one very salient form: Compstat. The system of practices summarized by this name was initially introduced in the New York Police Department by Police Commissioner William Bratton under Mayor Rudolph Giuliani's leadership and then spread throughout the country. The program aimed to combine real-time geographic information on crime with strong accountability by middle managers in the form of daily group meetings, geographic resource allocation, and data intensive police techniques. The program was widely credited in the press and by policymakers with playing a substantial role in the recent precipitous drop in crime experienced by some cities.[6]

To test the complementarity hypothesis, we study the impact of information tech-
nology when it is adopted together with skilled officers, new problem-solving tech-
niques, extensive use of ‘output’ information in evaluation and deployment of officers,
and a geographic-based structure.[7] Although the data available for testing this hypothesis are much shorter and more limited (questions on these types of practices were only introduced in the survey in 1997), they clearly endorse this hypothesis. We find crime clearance rates were an average of 2.2 percentage points higher in agencies implementing this integrated set of practices. Similarly, crime rates are negatively associated with Compstat use. Moreover, the individual practices composing Compstat have no independent ameliorative impact on crime levels or clearance rates.[8]
We conclude that IT can increase police effectiveness, but that (1) its impact is obscured by large increases in recorded crime, and (2) the increase in effectiveness only takes place when IT is introduced in conjunction with certain organizational practices oriented to take advantage of new data availability.

[4] There are precedents in the public sector for large increases in IT that lead to no observable efficiency gains. Goolsbee and Guryan (2006), for example, find that more access to the Internet by schools does not measurably increase student achievement.
[5] In their analysis of modern manufacturing, Milgrom and Roberts (1990) argue that, given the existence of complementarities among organizational practices, a range of organizational choices may have to be altered together for a particular technological advance to improve efficiency. In the presence of complementarities success is not "a matter of small adjustments, made independently at each of several margins, but rather involve[s] substantial and closely coordinated changes in a whole range of the firm's activities." (p. 513)
[6] Some previous research has disputed the claim of a large effect of Compstat; see, for example, Levitt (2004).
[7] Our approach is similar to Ichniowski, Shaw, and Prennushi (1997), who study complementarities among HRM practices and their impact on productivity. IT is not, however, a focus of their study.
[8] Again, the causal interpretation of this increase must be qualified. If a system of complementary changes must be undertaken, the fact that some departments choose not to undertake these changes may reflect some omitted variable, such as the quality of management of the department, in which case the 2.2% is biased upwards. This problem is common to a large extent to all of the literature on organizational change (see e.g. Ichniowski and Shaw (Forthcoming)).
2 Data Description
The data are drawn from the Law Enforcement Management and Administrative Sta-
tistics (LEMAS) series, a triennial survey of law enforcement agencies in the United
States covering the years 1987-2003.[9] Although not specifically designed as a longitudinal survey, the broad coverage of the survey makes it possible to identify numerous agencies at multiple points in time.[10] The surveys provide rich data on a wide variety of police operations, including shift scheduling, equipment usage, agency structure and functions, officer compensation, and administrative policies. To supplement the LEMAS data, we have matched the surveyed agencies with annual arrest and offense data from the FBI's Uniform Crime Reports (UCR) and place-level demographic data from the Census where possible.
One of the strengths of this data set is that it contains questions on a variety of
different types of IT use and covers a period of enormous IT expansion. Figure 1 plots
aggregate trends in IT use by police agencies. The upper graph details use of different
types of information technologies, including PC’s, mobile data terminals (typically used
by officers to access vehicle, criminal background, or other information while in the
field), and mainframes and servers. In 1987, fewer than 20% of the surveyed agencies
used any computers, but over the next 12 years computer use showed substantial
increases, with PC use growing more rapidly than more specialized technologies. By
the end of the sample over 90% of responding agencies reported IT use. The large
increase in mainframe and server use near the end of the sample is likely attributable
to the increased importance of the Internet in the latter half of the 1990’s.
[9] The 1996 survey was conducted in 1997, and an additional survey was conducted in 1999.
[10] All state police agencies and all agencies with 100 or more officers are automatically sampled, with probability sampling for the remaining agencies. In each year roughly 3,000 of the approximately 19,000 U.S. law enforcement agencies are represented.
The middle chart in Figure 1 details use of computerized data files by function.
Growth trends in data use for arrests, service calls, and stolen property were comparable
through the entire sample. The final chart demonstrates a significant asymmetry in
IT adoption according to agency size. A substantial proportion of the largest police
agencies already had computing technologies by 1987. In stark contrast, only 2 of the
over 2200 departments with 100 or fewer employees in 1987 reported using computers.
Between 1987 and 1990 there was widespread adoption of computers by mid-sized
agencies, with the smallest agencies lagging behind until the final years of the sample.
Our preferred measure of computer usage is a computer index constructed as the proportion of nine computer/data functions present in an agency in a given year.[11] The advantage of this index relative to simple dichotomous IT measures is that it captures not only the existence of IT within an organization, but the degree to which IT is used to perform disparate tasks.[12] During our sample the computer index increases from an average of .11 in 1987 to .64 in 2003.
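To make the construction concrete, here is a minimal sketch of how such an index could be computed from LEMAS-style indicator variables (the column names are ours, not the survey's; this is an illustration rather than the authors' code):

    import pandas as pd

    # Hypothetical 0/1 indicators for the nine computer/data functions listed in
    # footnote [11], one row per agency-year.
    FUNCTION_COLS = [
        "computer_crime_analysis", "computer_investigation", "computer_dispatch",
        "records_arrests", "records_service_calls", "records_criminal_histories",
        "records_stolen_property", "records_traffic_citations", "records_warrants",
    ]

    def add_computer_index(df: pd.DataFrame) -> pd.DataFrame:
        """Computer index = share of the nine functions an agency reports in a given year."""
        out = df.copy()
        out["computer_index"] = out[FUNCTION_COLS].mean(axis=1)
        return out

    # An agency reporting three of the nine functions receives an index of 3/9 = 0.33.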
Table 1 provides summary statistics describing the measures of output, organiza-
tional structure, and worker skill that we employ in our analysis. The varying number
of observations reflects the fact that not all survey items are available in all years and
UCR and Census data were not available for all agencies. In total, approximately 8600
police agencies are represented at some point in the sample, with over 1800 agencies
registering observations in 4 or more sample years. Many of the police agencies in the
sample are county or city agencies within moderately-sized jurisdictions, but university,
state, and large city police departments are also represented. The areas represented
are slightly poorer, less educated, and less diverse than the overall United States.
[11] The nine functions we include are computer use for crime analysis, investigation, and dispatch, and data record use for arrests, service calls, criminal histories, stolen property, traffic citations, and warrants.
[12] In Section 4 we demonstrate that our basic results are not sensitive to this choice of IT measure.
3 Effects of IT Adoption
In this section we examine our organizational and outcome measures for evidence re-
garding the effects of computerization. In our basic analysis we use OLS regressions
and separately report specifications including year or agency and year fixed effects and
additional time trends. In all regressions we attempt to control for other relevant fac-
tors that may affect our outcomes of interest. We interpret our coefficients as measures
of the effect of IT on the outcomes of interest. This interpretation is appropriate if
differential acquisition of information technology is driven by factors exogenous to the
agency, such as variations in the cost of technology over time and place. In Section 4, we re-estimate the regressions from this section using instrumental variables as well as other reasonable specifications. Accounting for possible endogeneity of IT adoption, limiting the sample, using alternative IT measures, and richly controlling for potential confounding factors does not alter our basic conclusions.
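For concreteness, the baseline estimating equation implied by this description can be sketched (in our notation; the paper does not display it explicitly) as

    y_{it} = \beta \, \mathrm{ComputerIndex}_{it} + X_{it}' \gamma + \alpha_i + \delta_t + \varepsilon_{it},

where y_{it} is an organizational or outcome measure for agency i in survey year t, X_{it} collects the workload, size, and demographic controls, \alpha_i and \delta_t are agency and year fixed effects (included or excluded depending on the specification), and \beta is the coefficient of interest on the computer index.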
3.1 Agency Size and Complexity
To measure agency size, we count the total number of sworn and non-sworn personnel
reported by each agency.[13] The distribution of agency sizes is highly right-skewed, and the median size of surveyed agencies rose from 25 in 1987 to 57 in 2003. Among agencies reporting in all sample years, the median size also rose from 328 to 472 employees over
the sample period, suggesting a general trend towards larger departments. In order to
control for workload in our regressions relating industry size to IT use, we include as
controls indicators for deciles of number of offenses reported by an agency.
We consider several measures of agency complexity. Our first measure of agency
complexity is the agency’s total number of special units. Special units are smaller
groups of officers with specific expertise and focus on a particular crime class or ad-
ministrative program, such as gang investigations or school outreach. In some cases
officers are involved in special units as their primary assignment, while in other cases
officers with general policing responsibility also participate in special units on a part-time or ad hoc basis.[14] Unsurprisingly, agency size is highly correlated with special units as well as our other complexity measures. In our empirical analysis we control for agency size in all specifications.

[13] We consider part-time employees as equivalent to 1/2 of a full-time employee.
Our next complexity measure is the number of hierarchical levels in the agency,
which we infer based upon whether or not separate salary ranges were reported for
chiefs, sergeants, and junior-level officers in the survey. This is a somewhat crude
proxy for the degree of hierarchy in the agency given that many larger police agencies
have substantially more than three administrative levels. Our final measure is the
number of written departmental policies.[15] This measure is likely affected by not only
the complexity of the organization (a more complex organization requires more policies)
but also the quality of management oversight and use of documentation, all factors that
may be affected by IT.
The top rows of Table 2 demonstrate that IT use is positively and significantly
associated with our measures of agency size and complexity. The coefficients on log
employees are much larger in the specification without agency fixed effects, likely be-
cause of the enormous cross-agency size heterogeneity. Even the smallest point estimate
of 8.5%, however, suggests a substantial effect of IT on size. IT appears to promote
specialization and use of written directives as well, although the strength of the rela-
tionship weakens as more fixed effects and trends are added to our regressions. IT is
not significantly associated with the number of hierarchical levels after controlling for
agency fixed effects and time trends.
3.2 Agency Composition
In addition to affecting the tasks performed by workers, new technologies also can
potentially shift the types of workers employed by agencies. The LEMAS data differentiates sworn officers, which are typically officers with arrest powers, from other types of police employees.[16] It also provides separate counts of the number of employees in field operations and technical support roles. Theoretically an IT-induced improvement in the productivity of arresting officers could lead to an increase or decrease in the field share of officers depending on the form of the policing production functions. Many federal programs for funding IT improvements in police agencies, however, were instituted with the explicit goal of shifting police personnel from desk to street duty.[17]

[14] The special units that are consistent across sample years that we include are special units for child abuse, community crime prevention, family violence, drug education, drunk driving, missing children, police/prosecutor relations, career criminals, and victim assistance.
[15] The possible policies are policies governing general police conduct, use of deadly force, handling of domestic cases, and interactions with juveniles, the homeless, and the mentally ill.
Table 2 suggests that IT adoption has little effect on employment of sworn officers.
The point estimates for all four specifications relating officers with arrest powers to
IT use are negative and of small magnitude. Similarly, after controlling for agency-
level heterogeneity there is little evidence that IT increases the proportion of workers
assigned to field operations. Increases in IT intensity are associated with increases of 4-
7% in the percentage of staff that are assigned technical support roles. This increase is
similar in magnitude to the increase in overall agency size, suggesting that the primary
effect of IT on personnel is to enlarge the police department by adding employees that
maintain the technology infrastructure as opposed to employees that perform more
traditional police functions.[18]
3.3 Worker Skill
Most years of the survey ask whether the department requires new entrants to have
college experience, permitting us to directly link computer use to demand for college-
educated labor. Although a number of papers have posited a link between increases in
the demand for skilled labor and computerization (Mincer 1991, Bound and Johnson
1992), most past empirical studies of this proposition have relied on indirect evidence, such as cross-industry differentials in worker skill and computer use (Berman, Bound, and Griliches 1994, Autor, Katz, and Krueger 1998). Implicit in most models of computer-driven changes in skill premia is the notion that highly computerized firms would be more likely to hire better educated workers because such workers can more productively use the available information technology.

[16] One caution regarding this data is that some officers may officially have arrest powers even if they essentially perform administrative duties.
[17] For example, the descriptive material for Making Officer Redeployment Effective (MORE), a Federal grant program administered by the Department of Justice Office of Community Oriented Policing Services (COPS), states, "The MORE program increases the time available to law enforcement professionals for community policing activities by funding technology that enables a department to operate more efficiently." (Office of Community Oriented Policing Services 2002)
[18] These results sharply contrast with Doms, Dunne, and Troske (1997), who find using panel data that increases in technology were not associated with increases in the non-production share of labor.
A lower row of Table 2 reports linear probability regressions[19] of an indicator for whether an agency requires entrants to have prior college experience on the computer index. To account for factors potentially correlated with computer use and demand for
education, all specifications include controls for agency size as well as the income level
and adult educational distribution in the agency’s geographic area. A shift to complete
computerization is associated with a 3.3% increase in the probability of requiring col-
lege education, a sizable increase given that only about 13% of agencies require college
education. Indeed, in 1997, agencies reporting PC use were twice as likely to report college requirements for entering workers as those without.
In addition to the relationship between worker education and IT, some past authors
have suggested that IT adoption can impose increased training requirements for new
workers, leading to skill increases generated within the firm (Bresnahan, Brynjolfsson,
and Hitt 2002). The bottom row of Table 2 reports estimates of the effect of IT adoption
on the number of training hours provided for new hires by the police agency. Although
controlling for year fixed effects reduces the estimated effects of IT adoption somewhat,
in all specifications the estimates are large and highly statistically significant.[20]
IT use may affect training through two channels. One direct consequence of IT
adoption is the necessity of training new officers on the appropriate use of the technol-
ogy. Such training may be particularly necessary when agencies use IT specialized for
police work, such as mobile terminals or proprietary record-keeping software. Another possibility is that the incorporation of information technology into the training process (for example, by allowing for computer-based interactive activities) increases the value of training hours for activities not directly involving computers. Unfortunately, the training data do not contain additional detail that might permit disentanglement of these mechanisms. However, because we do have information on different IT functions, we can examine the extent to which increased training is associated with general-purpose IT such as PCs, which have the potential to affect the overall training process, versus more specialized technologies.

[19] Logit and probit regressions provide similar results.
[20] Although the magnitudes of the estimated effect of IT use on training appear large, they are not unreasonable given the enormous increase in training observed over the sample period. Between 1987 and 1990, the period which saw the largest rise in adoption of IT, average training hours rose from 163 to 633, an almost four-fold increase. By 2003, almost half of sampled agencies required new officers to undergo more than 6 months of training.
Table 3 tests this proposition by entering PC use and mobile computer use indi-
vidually into the training regression. We posit that mobile data computers, which
are typically specially designed systems that provide officers with in-vehicle access to
crime information, cannot enhance general training but likely require device-specific
instruction. Although the estimated coefficients for both types of IT are positive and
significant, the coefficient on mobile terminals is more than twice as large as that on
PCs. The control coefficients indicate that larger police agencies and agencies with col-
lege requirements also conduct more training. These results suggest that the increase
in training is due primarily to actual use of new information technologies as opposed
to technical improvements in training.
3.4 Policing Outcomes
One way in which our analysis departs from prior work is our use of public-sector data.
Many past studies of IT have focused on manufacturing industries with intuitive output
measures such as productivity and profits. Although some authors have argued that a
primary role of IT in police organizations is to improve police problem solving (Brown and Brudney 2003), there is little empirical evidence connecting IT use to improved enforcement. In this section we first examine the extent to which IT enhancements led to increases in clearance rates.[21] To examine the deterrent aspect of enforcement we also consider offending rates per population as dependent variables. To examine possible asymmetries in the usefulness of IT across crimes, we also separately examine violent and property arrests.[22] Although arrest and offending measures are indicators of enforcement output, these measures must be viewed with appropriate caution. In
particular, simple arrest and offending measures fail to account for conviction rates,
elimination of concentrated areas of crime, and other factors that may enter into the
objective functions of police departments.
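In the notation of footnote [21] below, the clearance measure is simply

    \mathrm{ClearanceRate}_{it} = \frac{\mathrm{Crimes\ cleared\ by\ arrest}_{it}}{\mathrm{Offenses\ reported}_{it}},

which can exceed one because arrests in a given year may clear offenses reported in earlier years.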
Table 4 reports our results. Although there appears to be a positive effect of IT on
clearance rates in the first specification, the effects disappear after controlling for place-
level heterogeneity with agency fixed effects. The data do not suggest that agencies
that increased IT substantially over the sample period had greater clearance rates than
those with little IT adjustment. Estimates for offending rates, on the other hand, are
positive and sometimes statistically significant, although including agency fixed effects
and trends weakens the relationship somewhat. For property crime, an increase of
.53 in the computer index is associated with a .0016 percentage point, or roughly 5%
increase in offenses.
The final rows of Table 4 consider the possibility that the effect of IT is to allow
agencies to achieve the same clearance rates with less risk to officers. Such safety
enhancements might occur through substitution away from street officers to adminis-
trative personnel or by providing information to officers that allows them to identify
risky individuals and locations. The departmental organization data provided some
evidence of the former effect, although the evidence regarding percent sworn personnel
was mixed.
Increases in the IT index are generally associated with decreases in both officers assaulted and officers killed, although the coefficients are statistically significant only in the more parsimonious specifications. The fact that the effect on officers assaulted disappears after controlling for computer index-year trends may reflect differential investment patterns in other technologies that can improve officer safety by high and low computer agencies.

[21] The clearance rate is the number of crimes for which an arrest was made divided by the total number of offenses reported to police. Because arrests can occur for crimes committed in prior years, clearance rates can be above 1. For a more detailed discussion of the use of clearance rates in crime research see Skogan and Frydl (2004, 160).
[22] Further disaggregation to individual index crimes provides comparable results.
It is surprising that IT appears to exert little effect on policing outcomes given the
widespread use of IT in modern police departments. In the next section of the paper
we examine whether these results may be driven by misspecification, endogeneity, or
omitted variables. In Section 5 we provide two potential explanations for the weak link
between general IT and policing outcomes.
4 Robustness Checks
4.1 Alternative IT Measures and Samples
In Appendix Table A-1 we examine the robustness of our basic results to alternative
choices of computerization measure. Specification I uses a binary variable for PC use in
place of the computer index as a measure of IT while specification II uses an indicator
for any computing technology. Specification III improves the panel quality of the data
by limiting the analysis to agencies observed in 4 or more time periods. Specification
IV omits 1987 from the sample to assess whether the results are driven mainly by
the large increase in computer use that occurred between 1987 and 1990. The final
specification limits the analysis to agencies which reported some IT use in their first
year in the sample. This specification examines whether the observed effects can be
attributed to changes in IT intensity as opposed to simple adoption of any IT.
Employing different measures of IT use changes the results little, although the
alternative measures are more weakly associated with worker skill and training than
the index. The results for agencies observed in 4 or more years, which tend to be
slightly larger agencies, are also consistent with the baseline. Limiting the analysis
to 1990-1999 reduces the estimated coefficients on technical support staff and training
by about half, which is unsurprising since it seems likely that the largest training and
administrative changes would occur during the early stages of IT adoption. Agencies
with IT in 1987 saw the smallest increases in technical support staff and these agencies,
which are much more likely than average to include college requirements, saw relatively
small effects of IT on demand for skilled entrants. Overall, the results appear fairly
robust to sample and specification changes.
4.2 Alternative Interpretations
Up to this point we have interpreted our OLS results as estimates of the effects of
computerization on organizational outcomes. Such a causal interpretation is warranted
if agencies adopt new information technology due to exogenous factors such as changes
in the supply of these technologies. Clearly it is possible that agencies adopt new
technology because of organizational restructuring, in which case our estimates reflect
causality running from outcomes to IT, or that a third factor causes both IT and
organization to change together. In this section we conduct specific tests of some of
the main alternative explanations and then employ instrumental variables strategies
to address endogeneity more generally. Although no single approach we employ can
completely rule out endogeneity as an explanation for our results, taken as a whole the
evidence for a causal effect of IT on organizations is strong.
Pre-existing trends: Departments adopting IT could be those that observed
previous negative trends in performance or other variables. As the department reforms,
it installs IT systems and adopts a wide range of changes that lead to upward biases of the impact of IT, a form of the "Ashenfelter dip."[23] We examine this by replicating our analysis including leads in the computer index as additional explanatory variables.[24] Any tendency toward decay in our dependent variables prior to the introduction of IT should be manifest in negative coefficients on these leads.
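In the notation of the baseline specification sketched in Section 3, this test amounts to estimating (our notation, not the authors' exact equation)

    y_{it} = \beta_0 \, \mathrm{CI}_{it} + \beta_{+1} \, \mathrm{CI}_{i,t+1} + X_{it}' \gamma + \alpha_i + \delta_t + \varepsilon_{it},

where CI denotes the computer index; a negative and significant \beta_{+1} would indicate that outcomes were already deteriorating before IT arrived.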
Table 5 demonstrates that for our organizational measures there is little evidence
of an Ashenfelter dip, with none of the coefficients on the leads statistically different from zero and equal numbers of positive and negative point estimates. The coefficients on the index itself, in contrast, typically maintain the same sign and significance as the baseline.

[23] Ashenfelter (1978) observed that participants in training programs had unexplained drops in pay just before enrollment.
[24] One limitation of such analysis in our setting is that many agencies are observed in only a small number of years. Introducing leads thus changes our sample composition somewhat.
Delayed effects: Another potential explanation for the apparent lack of relation-
ship between IT and outcomes is that IT only improves policing after an initial learning
period. The second specification in Table 5 tests for this possibility with lags of the
computer index as additional regressors. Given the 3-year separation between most
surveys, it seems likely that any delayed benefits of IT implementation would be cap-
tured by this lagged term.[25] The coefficients on the lagged term, however, are generally negligible and statistically indistinguishable from zero. The coefficients on the contempora-
neous index, in contrast, remain positive and significant for the agency size measures
and officer training.
Previous mismanagement: A strong association between IT use and reorgani-
zation could arise if agencies which are poorly managed or poorly functioning undergo
organizational overhauls in an attempt to improve performance. The fact that our
instrumental variables estimates, which exploit only variation in the index that is cor-
related with more general changes in IT or resource availability, generate similar results to our baseline suggests that this is not the case. The LEMAS survey contains additional
data that we can use to test this alternative explanation.
The 1987 LEMAS survey includes questions on the number of civil litigation cases
filed against the agency in that year.[26] One reasonable conjecture is that more troubled
agencies would have experienced higher levels of litigation. If being troubled drives both
organizational change and IT change, we should observe different trajectories for the
IT index for agencies with high versus low initial levels of litigation. Appendix Table
A-2 examines this possibility by regressing long differences in the computer index on
the litigation level in 1987 and finds little evidence that agencies with high litigation
exhibited differential IT changes. The magnitudes of the differences in IT change
between high and zero litigation agencies are small relative to overall change and not statistically different from zero.

[25] In unreported regressions we include an additional lag and obtain similar results.
[26] Unfortunately this question was not asked in subsequent years.
Size-related heterogeneity: Failure to adequately control for agency size could
also suggest a spurious relationship between IT and our organizational variables if,
as agencies grow, they require increased IT as well as a more specialized, educated
workforce. This concern seems particularly salient given the strong relationship be-
tween IT use and agency size documented in Figure 1. We have already attempted to
address this possibility by including controls capturing the relevant aspect of agency,
community, or workload size in all of our basic specifications.
As an additional check we allow for highly flexible effects of size on the organiza-
tional variables by re-estimating the regressions in Tables 2 and 4 including a full set of
interactions between indicators for deciles of agency size and year. We also rerun the
analysis limiting the sample to agencies with fewer than 25 employees or agencies with
greater than 100 employees. The results of these regressions are reported in Table 6.
The findings for agency size, technical support staff, training, and demand for college
are consistent with the baseline.
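As an illustration, such decile-by-year controls could be generated along the following lines (a sketch with hypothetical column names; whether deciles are defined within or across years is our assumption):

    import pandas as pd

    df = pd.read_csv("lemas_panel.csv")  # hypothetical agency-year extract

    # Agency-size deciles computed within each survey year.
    df["size_decile"] = df.groupby("year")["total_employees"].transform(
        lambda s: pd.qcut(s, 10, labels=False, duplicates="drop")
    )

    # Full set of size-decile x year indicators to include as controls.
    decile_year_dummies = pd.get_dummies(
        df["size_decile"].astype(str) + "_" + df["year"].astype(str),
        prefix="sizedec_year",
    )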
The middle columns of Table 6 report additional checks of the hypothesis that our
results are driven by general reorganization associated with population growth. Column
IV includes population decile-year interactions. Column V limits the analysis to only
places in which population fell over the course of the sample. Column VI includes
only agencies in areas in which the population changed by 10% or less. Column VII
builds a sample of agencies in comparable-sized areas by considering only agencies with base population between 80,000 and 120,000. Areas with declining or stable populations yield point estimates of similar magnitude to the universe of agencies. Thus, it does not appear that community growth is driving both IT acquisition and organizational restructuring. The evidence linking IT with improved officer safety is much weaker after accounting for community growth.
Agency resources: A final concern is that agencies with greater financial resources
may both purchase IT and implement superior organizational practices, such as hiring
more workers and providing better training to existing workers. Column VIII of Table
6 reports regressions that attempt to flexibly control for differences in resource avail-
ability by interacting the log of equipment expenditures[27] with indicators for deciles
of equipment expenditure. Although these controls do not perfectly capture resource
differences across agencies, encouragingly, their inclusion does not substantively alter
our results.
Other Forms of Endogeneity: We attempt to deal with other potential sources
of endogeneity using instrumental variables. We consider two instruments for the
computer index. The first is the percentage of workers in an agency’s state and year
who use computers at work. The work computer-use data was obtained from CPS
Computer Use Supplement surveys and has been used in a number of past studies of
computerization including Krueger (1993).[28] Areas with many work computer users
likely had greater availability of computer hardware and expertise as well as increased
exposure to computers among the general population; both phenomena would likely
increase the likelihood of computer adoption. Conceptually, this instrument attempts
to capture exogenous variations in IT supply.
The second instrument we consider is an indicator variable for whether or not an
agency provides body armor to officers. Body armor represents a substantial expense
for police agencies[29] and is thus more likely to be available in jurisdictions with more
funds available for capital expenditures. Such agencies with larger capital resources are
more likely to adopt information technology as well. Here we thus assume that there
are exogenous factors determining the availability of resources that affect demand for
IT. Although neither of these instruments is ideal, our confidence in the results is
enhanced if taken together they provide comparable estimates to the baseline.
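To illustrate how such a specification could be estimated, here is a sketch of a two-stage least squares regression using the linearmodels package (the variable names, the clustering choice, and the use of this package are our assumptions, not the authors' code):

    import pandas as pd
    from linearmodels.iv import IV2SLS

    df = pd.read_csv("lemas_panel.csv")  # hypothetical agency-year extract

    # Instrument the computer index with state-year workplace computer use and the
    # body-armor indicator; year dummies enter as exogenous controls (the Table 7
    # specifications include year but not agency fixed effects).
    model = IV2SLS.from_formula(
        "log_employees ~ 1 + C(year) + log_population"
        " + [computer_index ~ pct_workers_using_computers + provides_body_armor]",
        data=df,
    )
    result = model.fit(cov_type="clustered", clusters=df["agency_id"])
    print(result.summary)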
Table 7 reports the instrumental variables estimates.[30] Both instruments provide evidence of a large effect of IT on agency size. The IV estimates for special units are consistent with the baseline in sign and significance but somewhat larger, while IV gives much larger and statistically significant estimates of the effect of IT on written directives.

The IV results for the departmental organization and worker skill and training measures are comparable in sign to our initial results. Relative to OLS, IV suggests a stronger link between IT and technical support staff and educational demand. One possibility is that agencies are particularly likely to employ additional technical staff when there is a large supply of available IT workers or large agency budgets, which are factors captured by the instruments. The IV point estimates for training hours, while smaller than the initial estimates, are still large relative to the average training level. For offense rates, the IV estimates remain positive and statistically significant.

[27] Unfortunately, information on annual expenditures on equipment is available only from 1987 to 1997 and the value was estimated by some agencies.
[28] Because the supplement is available in only selected years, we use linear interpolation across years where necessary to generate computer use measures for the appropriate years.
[29] A typical bulletproof vest for a single officer costs around $500.
[30] The specifications reported in Table 7 include year but not agency fixed effects. The two instruments appear generally strong, with average first-stage chi-squared statistics on the instruments of 25.20 and 45.32. Because the work computer instrument is available only at the state level, including a full set of agency fixed effects absorbs much of the variation in this instrument. For the body armor instrument, estimating the first stage using a logit and including agency fixed effects yields generally strong first-stage relationships and overall results similar to those reported in Table 7.
5 Why Didn’t IT Improve Enforcement?
5.1 Heterogeneous Effects of Information Technologies
Given the robustness of our results that general IT has no net effect on clearances
and is actually associated with an increase in offense rates, it appears puzzling that
so many agencies adopted various forms of information technology during the 1990’s.
One potential explanation for the results is that our index, which provides a fairly
general measure of IT intensity, could mask heterogeneous effects of different information
technologies on our enforcement outcomes. For example, offenses might be higher in
places with more computers simply because officers may be more willing to file police
incident reports when filing can be done using computers instead of by hand. Other
technologies, such as technologies that improve officer deployment, may at the same
time have a deterrent effect on crime.
To disentangle these potentially competing effects of IT, we exploit the detailed
use questions in the LEMAS survey. We measure IT use for report writing using an
indicator for whether computers are used for record keeping. To measure IT-enhanced
deployment, we use the average of indicators for computer use in dispatch, fleet man-
agement, and manpower allocation.[31]
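As we read this description, the Table 8 specification takes roughly the form (our notation)

    \mathrm{OffenseRate}_{it} = \beta_1 \, \mathrm{Records}_{it} + \beta_2 \, \mathrm{Deploy}_{it} + X_{it}' \gamma + \alpha_i + \delta_t + \varepsilon_{it},

where Records_{it} is the record-keeping indicator and Deploy_{it} the deployment index; a positive \beta_1 together with a negative \beta_2 is the pattern consistent with better crime measurement coexisting with a deterrent effect.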
Table 8 reports regressions entering the record keeping and deployment variables
simultaneously as explanatory variables for offenses. The results are striking. For total
arrests, the coefficient on the record keeping computer use indicator is positive and sig-
nificant, suggesting that offense reports increase by 10% once computers are available
for record keeping. Computer use to enhance officer deployment, in contrast, enters
negatively and marginally significantly, consistent with a deterrent effect of improved
deployment capabilities. The additional columns replicate the analysis on different
individual crimes to provide an additional test of our interpretation. Given the se-
riousness and rarity of homicide, homicide is almost universally reported. Similarly,
because auto insurance companies typically require police reports in the case of motor
vehicle theft, computerization is less likely to affect reporting for this crime. For these
two crimes the estimated coefficients on the record keeping variable are statistically
indistinguishable from zero. It seems unlikely that improved police deployment could
exert a deterrent effect on rape,[32] and indeed the coefficient on the deployment index is
small and statistically insignificant. Thus it appears that one explanation for the pos-
itive association between general IT and crime is that reporting technologies increase
crime reports even while other technologies may deter crime.
[31] Our results are not sensitive to separately or simultaneously entering in individual deployment measures.
[32] Braga (2001) reviews studies documenting the effects of police patrols on various types of crime and disorder. None of the major studies includes rape as an outcome of interest.

5.2 Complementarities

Another reason for the small effect of IT on police effectiveness may be that IT by itself does not increase police department productivity, but does so only in conjunction with a
broader set of practices. Milgrom and Roberts (1990) proposed such a complementarity
hypothesis in the manufacturing context, contrasting two basic systems, modern ver-
sus traditional manufacturing. Subsequent work, such as Bresnahan, Brynjolfsson, and
Hitt (2002), Brynjolfsson and Hitt (2003), and Bartel, Ichniowski, and Shaw (2005) has
identified empirical examples of complementarities between IT use and management
practices.
In the law enforcement context, modern policing has been most closely identified
with the Compstat system first introduced by the New York Police Department in
1994 by Commissioner William Bratton. In the popular imagination, Compstat is
characterized by two elements: the real-time mapping of crime, and the notorious early-
morning meetings where commanders must show they understand and are reacting to
the crime patterns projected on overhead screens. The program, in actuality, has
additional elements. Weisburd, Mastrofski, McNally, Greenspan, and Willis (2003,
427) argue that Compstat is composed of the following six elements: (1) statement of
the measurable goals of the department; (2) internal accountability, particularly through Compstat meetings, where managers are accountable for understanding crime patterns and reacting to them; (3) geographic organization of command, where district commanders have authority and resources to accomplish their goals over their areas; (4) empowerment of middle managers; (5) data-driven problem identification and assessment; and (6) innovative problem solving tactics.[33]
Our data allow us to identify four management techniques corresponding to elements identified by Weisburd, Mastrofski, McNally, Greenspan, and Willis (2003). In particular, we consider (1) use of information technology for crime data collection and analysis (5 above), (2) a problem-solving paradigm (6 above), (3) use of feedback for priority-setting and evaluation (relating to 1, 2, and 5 above), and (4) a geographic-based deployment structure (3 above). Following the economics literature on skill complementarities, we include a fifth management practice: the use of more highly educated officers.

[33] Other accounts coincide with the broad elements although not on all the details. For example, the New York Times summarizes the program thus: "specialized units, statistics-driven deployment, and a startling degree of hands-on leadership" (Dewan 2004).
Ideally, to test the complementarity explanation we might use regressions similar
to those in Section 3 with separate and interacted measures of IT and management
practices as explanatory variables. Unfortunately, this approach is limited by our
available data in two important ways. Most of the questions regarding management
practices identified by Weisburd, Mastrofski, McNally, Greenspan, and Willis (2003)
as key to improving police performance were not asked in the LEMAS survey prior
to 1997, a time by which many larger agencies had already implemented Compstat or
similar programs. The other major data limitation is that because of the subjective
nature of some of the questions regarding management practices, there appears to be
some inconsistency across years in reported practices, potentially rendering inferences
based upon within-agency time-series variation in practices misleading.[34] To overcome these limitations, we average responses to individual survey questions across years to develop an agency-level indicator for each of the aforementioned management practices covering the period 1997-2003. We then define Compstat agencies as those which simultaneously had elements of all five key management techniques in at least half of the sample years.[35] Our regressions examining complementarities thus exploit cross-sectional variation in IT and management practices.
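One plausible reading of this construction, as a sketch (the column names and exact thresholds are our assumptions; the Appendix referenced in footnote [35] contains the authors' details):

    import pandas as pd

    # Hypothetical 0/1 indicators for the five practices, one row per agency-survey wave.
    PRACTICE_COLS = ["it_crime_analysis", "problem_solving", "feedback_evaluation",
                     "geographic_deployment", "college_requirement"]

    def compstat_indicator(panel: pd.DataFrame) -> pd.Series:
        """Flag agencies reporting every practice in at least half of their 1997-2003 waves."""
        waves = panel[panel["year"].between(1997, 2003)]
        # Average each practice across an agency's waves (the agency-level indicators).
        shares = waves.groupby("agency_id")[PRACTICE_COLS].mean()
        # Compstat agency: all five practices present in at least half of the observed waves.
        return (shares >= 0.5).all(axis=1)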
Table 9 reports regressions of the average agency clearance rate over 1997-2003 on
agency-level indicators for Compstat use and separate indicators for each of the five
components comprising a Compstat system. Separate specifications are reported for
total, violent, and property crimes with and without demographic controls. The results
are striking. Whereas the estimates on each of the individual management practices
are of negligible magnitude and generally statistically indistinguishable from zero, the
combination of practices into a Compstat system yields positive and significant effects
on clearance rates. The coefficient estimates of around 2 percentage points imply a roughly 10% gain relative to the average clearance rate of 22%. For violent crimes, which in many cases are an area of particular investigative emphasis, the point gains are even greater.

[34] For example, one survey item asks whether officers use a particular police problem-solving methodology known as SARA (scanning, analysis, response, and assessment). The Little Rock PD reported using SARA in 1997, 1999, and 2003 but not 2000, while the Bakersfield PD reported using SARA in 2000 but not 1999, 1997, or 2003. The police in Hillsborough, CA reported using SARA in 1997 and 2000 but not 1999 and 2003.
[35] The construction of these variables is described in greater detail in the Appendix.
Table 10 reports comparable regressions with offense rates as the dependent vari-
able. Whereas some of the individual management practices appear positively associated with offense rates, consistent with our finding in Section 5.1 regarding reporting, use of a Compstat system is negatively and significantly related to total and property offending. The estimated effects of a Compstat system on crime rates are small to moderate: the estimates imply, for example, that having a Compstat system has an effect on crime rates equivalent in magnitude to a 4.4 percentage point increase in the poverty rate.[36]
Given that we only have approximate measures of the Compstat practices and are
forced by data limitations to rely on cross-sectional variation, our results on the role
of complementarities are only suggestive. Taken together, however, the regressions
in Tables 9 and 10 support the hypothesis that an additional reason for the weak
aggregate relationship between general IT and policing outcomes may be that while
many agencies utilize some type of IT, relatively few have yet implemented all of the
complementary management practices that allow IT to impact police effectiveness.[37]
6 Conclusions
The introduction of information technology resulted in a widespread reorganization
of police departments. In interpreting the specific form of this reorganization, it is
useful to refer to the distinction between communication costs and processing costs
drawn in Garicano (2000) and Garicano and Rossi-Hansberg (2006). While drops in
the cost of communicating problems allow asking for directions to substitute for learning
and acting autonomously, so that workers in the field make fewer decisions
and specialized problem solvers deal with more of them, drops in the cost of acquiring
information are ‘decentralizing’ changes, which allow those in the field to make more
decisions and rely less on the center’s information. The organizational changes that we
observe (more layers, more specialization, more white-collar/support workers) are not
consistent with drops in the cost of processing information and acquiring knowledge, but are instead
consistent with drops in the cost of communication between the field and the back office,
allowing a ‘recentralization’ of decision making. That is, what we observe (contrary
to the ‘empowerment’ theme in much of the literature) is that for police, IT is more of a
‘centralizing’ force than a ‘decentralizing’ one. This interpretation is consistent with
the image of police commanders discussing crime patterns in Compstat sessions.
We also find that, while the effects of general information technology on crime
fighting and deterrence are statistically insignificant (in spite of our large samples),
these effects become relatively large when IT adoption is undertaken as part of a whole
package of organizational changes. That is, our results clearly support
what we have called here the complementarity hypothesis, and suggest that police
departments, like firms, are likely to enjoy the benefits of computerization only when
they identify the specific ways in which newly available information and data interact with
existing organizational practices and make adjustments accordingly.
Finally, our findings on the specific uses of computers add a caveat to researchers’
efforts to map the impact of IT on productivity. Computers are general-purpose technologies,
making it challenging to understand their impact without characterizing their
specific uses within a firm. In our case, some computing technologies increased recorded
crime, which appears as lower productivity. In more general cases, computers may
themselves alter the quality and type of data available to researchers in ways that
obscure productivity computation.
References
Acemoglu, D., P. Aghion, C. Lelarge, J. V. Reenen, and F. Zilibotti
(2006): “Technology, Information and the Decentralization of the Firm,” Discussion
paper, MIT.
Arrow, K. (1974): The Limits of Organization. Norton.
Ashenfelter, O. C. (1978): “Estimating the Effect of Training Programs on Earn-
ings,” The Review of Economics and Statistics, 60(1), 47–57.
Autor, D. H., L. F. Katz, and A. B. Krueger (1998): “Computing Inequal-
ity: Have Computers Changed The Labor Market?,” The Quarterly Journal of
Economics, 113(4), 1169–1213.
Bartel, A. P., C. Ichniowski, and K. L. Shaw (2005): “How Does Informa-
tion Technology Really Affect Productivity? Plant-Level Comparisons of Product
Innovation, Process Improvement and Worker Skills,” NBER Working Paper 11773,
National Bureau of Economic Research, Inc.
Berman, E., J. Bound, and Z. Griliches (1994): “Changes in the Demand for
Skilled Labor within U.S. Manufacturing: Evidence from the Annual Survey of Man-
ufactures,” The Quarterly Journal of Economics, 109(2), 367–97.
Bolton, P., and M. Dewatripont (1994): “The Firm as a Communication Net-
work,” The Quarterly Journal of Economics, 109(4), 809–39.
Bound, J., and G. Johnson (1992): “Changes in the Structure of Wages in the
1980’s: An Evaluation of Alternative Explanations,” American Economic Review,
82(3), 371–92.
Braga, A. A. (2001): “The Effects of Hot Spots Policing on Crime,” ANNALS of
the American Academy of Political and Social Science, 578(1), 104–125.
Bresnahan, T. F., E. Brynjolfsson, and L. M. Hitt (2002): “Information
Technology, Workplace Organization, And The Demand For Skilled Labor: Firm-
Level Evidence,” The Quarterly Journal of Economics, 117(1), 339–376.
Brown, M. M., and J. L. Brudney (2003): “Learning Organizations in the Pub-
lic Sector? A Study of Police Agencies Employing Information and Technology to
Advance Knowledge,” Public Administration Review, 63(14), 30–43.
Brynjolfsson, E., and L. M. Hitt (2003): “Computing Productivity: Firm-Level
Evidence,” Review of Economics and Statistics, 85(4), 793–808.
Caroli, E., and J. V. Reenen (2001): “Skill-Biased Organizational Change? Evi-
dence From A Panel Of British And French Establishments,” The Quarterly Journal
of Economics, 116(4), 1449–1492.
Dewan, S. K. (2004): “New York’s Gospel Of Policing by Data Spreads Across U.S.,”
New York Times, April 28, pg. A1.
Doms, M., T. Dunne, and K. R. Troske (1997): “Workers, Wages, and Technol-
ogy,” The Quarterly Journal of Economics, 112(1), 253–90.
Garicano, L. (2000): “Hierarchies and the Organization of Knowledge in Produc-
tion,” Journal of Political Economy, 108(5), 874–904.
Garicano, L., and E. Rossi-Hansberg (2006): “Organization and Inequality in a
Knowledge Economy,” The Quarterly Journal of Economics, Forthcoming.
Goolsbee, A., and J. Guryan (2006): “The Impact of Internet Subsidies in Public
Schools,” The Review of Economics and Statistics, 88(2), 336–347.
Ichniowski, C., and K. Shaw (Forthcoming): “Insider Econometrics,” in Handbook
of Organizational Economics, ed. by R. Gibbons, and J. Roberts.
Ichniowski, C., K. Shaw, and G. Prennushi (1997): “The Effects of Human
Resource Management Practices on Productivity: A Study of Steel Finishing Lines,”
American Economic Review, 87(3), 291–313.
Krueger, A. B. (1993): “How Computers Have Changed the Wage Structure: Ev-
idence from Microdata, 1984-1989,” The Quarterly Journal of Economics, 108(1),
33–60.
Lehr, W., and F. Lichtenberg (1999): “Information Technology and Its Impact
on Firm-Level Productivity: Evidence from Government and Private Data Sources,
1977-1993,” Canadian Journal of Economics, 32(2), 335–362.
Levitt, S. D. (2004): “Understanding Why Crime Fell in the 1990s: Four Factors
That Explain the Decline and Six That Do Not,” Journal of Economic Perspectives,
18(1), 163–190.
Milgrom, P., and J. Roberts (1990): “The Economics of Modern Manufacturing:
Technology, Strategy, and Organization,” American Economic Review, 80(3), 511–
28.
Mincer, J. (1991): “Human Capital, Technology, and the Wage Structure: What Do
Time Series Show?,” NBER Working Papers 3581, National Bureau of Economic
Research, Inc.
Office of Community Oriented Policing Services (2002): “COPS Fact
Sheet,” Washington D.C., U.S. Department of Justice.
Radner, R., and T. V. Zandt (1992): “Information Processing in Firms and Re-
turns to Scale,” Annales d’Economie et de Statistique, 25(26), 265–298.
Rajan, R., and J. Wulf (2006): “The Flattening Firm: Evidence from Panel
Data on the Changing Nature of Corporate Hierarchies,” Review of Economics and
Statistics, 88(4), 759–773.
Skogan, W., and K. Frydl (eds.) (2004): Fairness and Effectiveness in Policing:
The Evidence. National Academies Press.
Stiroh, K. J. (2002): “Information Technology and the U.S. Productivity Revival:
What Do the Industry Data Say?,” American Economic Review, 92(5), 1559–1576.
Weisburd, D., S. D. Mastrofski, A. M. McNally, R. Greenspan, and J. J.
Willis (2003): “Reforming to Preserve: Compstat and Strategic Problem Solving
in American Policing,” Criminology and Public Policy, 2(3), 421–456.
Table 1: Summary Statistics
Measure N Mean SD Min Max
Computerization Measures
Computer index 19461 0.509 0.362 0 1
PC use 13720 0.638 0.481 0 1
Computerized record keeping 16294 0.632 0.482 0 1
Departmental Size and Complexity
Number of employees 19893 197 959 0.5 55929
Number of special units 19461 1.12 2.26 0 8
Organizational levels 15216 2.82 0.50 1 3
Total written directives 14639 3.95 2.03 0 6
Departmental Organization
% officers with arrest powers 19893 0.785 0.171 0 1
% field operations staff 5549 0.606 0.193 0.003 0.995
% technical support staff 7764 0.120 0.117 0 0.877
Worker Skill and Training
College requirement for new officers 16164 0.128 0.334 0 1
Hours of training for new officers 15712 669 472 0 2080
Arrests, Offenses, and Officer Injury
Total crime clearance rate 14793 0.229 0.164 0 1.083
Violent crime clearance rate 13920 0.559 0.272 0 1.095
Property crime clearance rate 14525 0.188 0.147 0 1.083
Total offense rate 14773 0.034 0.033 0 0.496
Violent offense rate 14781 0.004 0.006 0 0.203
Property offense rate 14777 0.030 0.030 0 0.495
Officers assaulted 9458 63.7 290 0 9024
Officers killed 9458 0.019 0.159 0 4
Demographic Characteristics
Total population 19529 225608 1282226 0 35484453
% Black 19362 0.103 0.159 0 1
% Hispanic 19362 0.066 0.133 0 1
Poverty rate 19362 0.137 0.082 0 0.657
Median household income 19362 39173 15743 4208 202242
% high school graduate 19362 0.763 0.112 0.203 1.000
% college graduate 19362 0.200 0.124 0 0.883
Note: Computerization and other organizational measures are taken from the LEMAS survey. Arrests
are from FBI UCR data. Demographic information corresponds to the area covered by each
agency and is taken from the 1990 and 2000 Census with linear interpolation across years.
Table 2: Relationship Between IT Use and Organizational Structure
Dependent Variable (I) (II) (III) (IV) (V)
Departmental Size and Complexity
Log(Number of employees) .514** .0842** .0872** .0560** .0637**
(.0355) (.0100) (.00955) (.0107) (.0103)
Number of special units 2.78** .445** .451** .333* .299*
(.160) (.146) (.149) (.154) (.138)
Organizational levels .0410** .0214 .0172 -.0159 -.00219
(.00997) (.0129) (.0129) (.0147) (.0111)
Total written directives 1.54** 1.71** 1.68** .989** .927**
(.0494) (.0711) (.0715) (.0743) (.0661)
Departmental Organization
% officers with arrest powers -.00591 .00557 .00681 -.00287 .00443
(.00544) (.00475) (.00475) (.00521) (.00434)
% field operations staff .0720** -.00709 -.00893 .00212 -.00901
(.0148) (.00790) (.00772) (.00827) (.00774)
% technical support staff .0642** .0727** .0729** .0384** .0361**
(.00605) (.00610) (.00609) (.00716) (.00634)
Worker Skill and Training
College requirement for new officers .0549** .0338** .0308* .0470** .0241
(.0115) (.0129) (.0122) (.0129) (.0125)
Hours of training for new officers 230** 201** 203** 143** 102**
(12.0) (14.2) (14.1) (15.7) (14.2)
Include year fixed effects? Yes Yes Yes Yes Yes
Include agency fixed effects? No Yes Yes Yes Yes
Include state trends? No No Yes No No
Include computer index trends? No No No Yes No
Include initial level trends? No No No No Yes
Note: This table reports regressions of measures of organizational structure on a computerization index.
Each table entry represents a coefficient estimate from a separate regression where “Dependent Variable”
is the left-hand side variable and controls are included as specified in the bottom rows of the table. The
employees regression includes indicators for deciles of total offenses interacted with the log number of
offenses as additional controls. The special units, organizational levels, directives, and percentage workers
regressions include indicators for deciles of number of employees interacted with the log number of employees
as additional controls. The college requirements regressions include controls for the log number of agency
employees as well as the per capita income and percent of residents 25 and over with educational attainment
of below high school, some high school, high school, some college, associate’s degrees, bachelor’s degrees,
and advanced degrees in each agency’s geographic area. The training regressions include controls for log
number of agency employees and whether new entrants are required to have college experience. Column
III includes state-specific yearly trends, column IV includes indicators for initial levels of the computer
index interacted with year trends, and column V indicators for deciles of the initial value of the dependent
variable interacted with year trends as additional controls. Standard errors clustered on agency are reported
in parentheses. * denotes significance at the two-tailed 5% level and ** the 1% level.
Table 3: Effects of General and Specific IT Use on
Training
Explanatory Variable Estimate
PC indicator 43**
(9.05)
Mobile terminal indicator 110**
(14.2)
Log(Employees) 95**
(3.37)
College requirement indicator 88**
(15.6)
Number of departmental functions -15.739**
(2.42)
N 7552
R² 0.412
Note: This table reports coefficients from a regression
of the total hours of training required of new recruits on
indicator variables for general and specific IT and other
controls. The regression includes year and state fixed
effects and incorporates data from 1990-1999. Standard
errors clustered on agency are reported in parentheses.
* denotes significance at the two-tailed 5% level and **
the 1% level.
Table 4: Relationship Between IT Use and Arrests, Offenses, and Officer Injury
Dependent Variable (I) (II) (III) (IV) (V)
Total crime clearance rate .00719 -3.73E-4 -.00141 -.00532 3.61E-4
(.00674) (.00603) (.00595) (.00683) (.00562)
Violent crime clearance rate -2.19E-4 .0109 .0126 -.0102 .0181
(.00983) (.0105) (.0103) (.0122) (.00944)
Property crime clearance rate .0197** .00289 .00210 -.00194 .00377
(.00596) (.00519) (.00514) (.00571) (.00492)
Total crime offense rate .0134** .00326** .00264** 9.21E-4 .00101
(9.93E-4) (7.53E-4) (7.15E-4) (7.64E-4) (6.77E-4)
Violent crime offense rate 9.87E-4** 1.47E-4 4.91E-5 -1.70E-4 -2.42E-4
(1.54E-4) (1.37E-4) (1.38E-4) (1.73E-4) (1.40E-4)
Property crime offense rate .0124** .00312** .00260** .00110 .00134*
(9.03E-4) (6.88E-4) (6.52E-4) (6.98E-4) (6.26E-4)
Officers assaulted -72.8** -50.8** -27.0* .845 -11.3
(15.9) (16.7) (11.8) (16.1) (15.2)
Officers killed -.0146* -.0143 -.0122 -.0116 -.0219*
(.00740) (.0128) (.0124) (.0132) (.0102)
Include year fixed effects? Yes Yes Yes Yes Yes
Include agency fixed effects? No Yes Yes Yes Yes
Include state trends? No No Yes No No
Include computer index trends? No No No Yes No
Include initial level trends? No No No No Yes
Note: This table reports regressions of the clearance rate (arrests/offenses), offense rate (of-
fenses/population), and officer injuries reported by an agency on a computerization index. Each table
entry represents a coefficient estimate from a separate regression where “Dependent Variable” is the left-
hand side variable and controls are included as specified in the bottom rows of the table. Arrest and
offending figures represent index crimes. All regressions include the percent Black, percent Hispanic, and
per capita income of the area covered by the agency as additional controls. The arrest regressions also
control for state fixed effects and agency size deciles interacted with log number of agency employees. The
offense regressions include indicators for deciles of resident population interacted with log population, log
agency employees, and state fixed effects as additional controls. The assaults and officers killed regressions
also include controls for log violent and property arrests, log population, and log employees. Column III
includes state-specific yearly trends, column IV includes indicators for initial levels of the computer index
interacted with year trends, and column V indicators for deciles of the initial value of the dependent vari-
able interacted with year trends as additional controls. Standard errors clustered on agency are reported
in parentheses. * denotes significance at the two-tailed 5% level and ** the 1% level.
Table 5: Specifications With Leads and Lags of the Computer Index
Lead Specification Lag Specification
Computer Lead of Computer Lag of
Dependent Variable Index Index Index Index
Departmental Size and Complexity
Number of employees .0681** .0217 .0406** .0238
Number of special units .425* -.0162 .499** .442**
Organizational levels .0116 .00297 -.0193 -.00686
Total written directives 1.40** .499** .286** 6.48E-4
Departmental Organization
% officers with arrest powers .00187 .00642 .00136 -.00632
% field operations staff -.00367 -.00339 -.00862 .0113
% technical support staff .0572** .0133 .0107 -.00181
Worker Skill and Training
College requirement for new officers .0317 -.0164 .00785 -.00380
Hours of training for new officers 197** -24.2 58.8* 16.8
Arrests, Offenses, and Officer Injury
Total crime clearance rate -.0136 -9.20E-4 .00140 .00151
Violent crime clearance rate .00574 .0267 -.00960 -.0197
Property crime clearance rate -.00921 -.00188 .00140 .00138
Total offending rate .00414** .00230* .00182 .00123
Violent offending rate -9.70E-5 1.28E-6 1.30E-4 2.74E-4
Property offending rate .00424** .00229* .00169 9.57E-4
Officers assaulted -33.1 -25.8 -15.8 -39.5
Officers killed -.0251 -.0123 -.0139 .0156
Note: This table replicates the regressions from Column IV of Tables 2 and Table 4 including leads or
lags in the computer index as additional explanatory variables. Within a specification, each row reports
coefficient estimates from a separate regression. The first specification includes one additional lead of
the index and the second specification a single additional lag. denotes significance at the two-tailed
10% level, * the 5% level, and ** the 1% level.
Table 6: Additional Specifications
Column: (I) Size Interaction, (II) Small Agencies, (III) Large Agencies, (IV) Population Interaction,
(V) Shrinking, (VI) Low Growth, (VII) Mid-Size, (VIII) Budget Controls
Dependent Variable
Departmental Size and Complexity
Number of employees .0315** .0511* .101** .0968** .0562** .0683** .111** .0995**
Number of special units .440** N/A .441** .405** .564 .668* .200 .184
Organizational levels .00178 -.00425 -.00222 -.00355 .0510 .0153 .0247 .0336
Total written directives .825** .848** 1.73** 1.29** 1.56** 1.37** 1.18** 2.62**
Departmental Organization
% officers with arrest powers .00801 -.00157 .0129* .0101* -7.04E-5 -.00384 -.00930 -.00180
% field operations staff -.00852 N/A -.00734 -.00955 -.0135 -.0193 -.00362 -.0107
% technical support staff .0367** N/A .0592** .0540** .0858** .0849** .0500* .0631**
Worker Skill and Training
College requirement for new officers .0246 .0507** .0280 .0318* .0242 .0246 3.31E-5 .0648**
Hours of training for new officers 116** 116** 243** 166** 172** 167** 106 316**
Arrests, Offenses, and Officer Injury
Total crime clearance rate .00162 .00737 .00315 -.00125 -8.47E-4 -.0105 4.95E-4 -.0111
Violent crime clearance rate .00297 -.0184 .0104 .00833 .0129 .00798 .0124 .00814
Property crime clearance rate .00367 .0134 .00489 .00345 .0104 -.00656 .00595 4.27E-4
Total offending rate 7.00E-4 -.00179 .00274* .00225** .00316 .00310** .00792** .00591**
Violent offending rate -2.16E-4 1.17E-4 3.97E-5 -1.03E-4 2.00E-4 3.17E-4 .00110** 4.37E-4
Property offending rate 9.29E-4 -.00190 .00270** .00236** .00296* .00278** .00682** .00547**
Officers assaulted -.655 -3.90* -52.6* -21.3 11.7 -12.9 -.424 -8.80
Officers killed -.00810 .0146 -.0149 -.00888 .0398 -.0131 -.00444 -.0345
Note: This table replicates the regressions from Column II of Tables 2 and Table 4 including additional controls. Each table entry reports a
coefficient estimate from a separate regression. Column I includes as additional controls a full set of interactions between deciles of the agency size
distribution (measured by number of employees) and year indicators. Column II includes only agencies with fewer than 25 employees and Column
III with more than 100 employees. Column IV includes a full set of interactions between deciles of the population size and year indicators. Column V
limits the analysis to agencies with population decreases over the course of the sample, column VI to agencies with changes in population of less than
10%, and column VII to agencies with populations between 80,000 and 100,000. Column VIII includes interactions between the log annual equipment
budget and decile indicators for equipment budget as additional controls. An insufficient number of agencies with fewer than 25 employees reported
having special units and employee function categories to permit estimation. denotes significance at the two-tailed 10% level, * the 5% level, and
** the 1% level.
Table 7: Instrumental Variables Estimates of the Effect
of IT Use
Original
Measure Result (IV1) (IV2)
Departmental Size and Complexity
Log(Number of employees) + .203* .402**
(.0953) (.0905)
Number of special units + 6.60** 5.70**
(1.55) (.891)
Organizational levels 0 .338** .363**
(.0419) (.0449)
Total written directives + 5.74** 5.88**
(.177) (.184)
Departmental Organization
% officers with arrest powers 0 -.0507** -.0202
(.0156) (.0152)
% field operations staff 0 -.0389 -.133
(.0886) (.176)
% technical support staff + .323** .216**
(.0137) (.0190)
Worker Skill and Training
College requirement for new officers + .270** .274**
(.0711) (.0711)
Hours of training for new officers + 135** 126**
(33.2) (33.6)
Arrests, Offenses, and Officer Injury
Total crime clearance rate 0 .0234 .0250
(.0211) (.0203)
Violent crime clearance rate 0 .127** .0936**
(.0336) (.0310)
Property crime clearance rate 0 .0226 .0231
(.0184) (.0178)
Total crime offense rate 0 .0364** .0209**
(.00430) (.00375)
Violent crime offense rate 0 .00264** 7.83E-4
(6.51E-4) (5.43E-4)
Property crime offense rate + .0345** .0205**
(.00398) (.00348)
Assaults on officers 0 -523** -473**
(89.0) (77.7)
Officers killed 0 -.0921** -.117**
(.0329) (.0370)
Note: This table reports instrumental variables estimates of the effect of IT on organizational and
outcome measures. Each table entry reports a coefficient estimate from a separate IV regression.
Column IV1 uses the same-state percentage of individuals using a computer at work as an instrument
for the computer index. Column IV2 uses an indicator variable equal to 1 if an agency provides body
armor for officers as an instrument. The IV first-stage is estimated using a probit model. For the
college entry requirement, the IV estimation employs a bivariate probit model. The controls for each
specification are the same as those in column I of Tables 2 and 4, except that for the arrest and
offense outcomes the state fixed effects were omitted as controls in specification IV1. Standard errors
clustered on agency are reported in parentheses. * denotes significance at the two-tailed 5% level and
** the 1% level.
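The two-step logic described in this note might be sketched as follows. This is an illustrative reconstruction, not the authors' code: it uses the column IV2 body-armor instrument, assumes a binary computerization measure and hypothetical variable names, and follows the common fitted-probability-as-instrument approach, whereas the paper's handling of the fractional computer index may differ.

```python
import pandas as pd
import statsmodels.api as sm
from linearmodels.iv import IV2SLS

# Hypothetical agency-year data frame `df` with columns:
#   clearance   - total crime clearance rate (outcome)
#   computer    - 0/1 computerization measure (endogenous regressor)
#   body_armor  - 1 if the agency provides body armor (instrument, column IV2)
#   pct_black, pct_hispanic, income - controls; agency_id - cluster variable
controls = ["pct_black", "pct_hispanic", "income"]

# Step 1: probit "first stage" of the computerization measure on the
# instrument and controls; keep the fitted probabilities.
first_stage = sm.Probit(
    df["computer"],
    sm.add_constant(df[["body_armor"] + controls])
).fit(disp=0)
df["computer_hat"] = first_stage.predict()

# Step 2: ordinary 2SLS, using the fitted probability as the instrument for
# the computerization measure, with standard errors clustered on agency.
second_stage = IV2SLS(
    df["clearance"],
    sm.add_constant(df[controls]),
    df["computer"],
    df[["computer_hat"]]
).fit(cov_type="clustered", clusters=df["agency_id"])
print(second_stage.summary)
```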
Table 8: Relationship Between Offenses and Specialized Types of IT
Vehicle
Explanatory Variable Total Homicide Rape Larceny Theft
Computer record keeping .0977** .0310 .0476 .0513* -.0246
(.0235) (.0321) (.0276) (.0201) (.0222)
Computer deployment -.0494 -.0242 .0470 -.0106 -.00704
(.0286) (.0348) (.0328) (.0215) (.0260)
N 12077 5373 8294 11728 11085
R² 0.966 0.911 0.935 0.980 0.974
Note: This table reports regressions of the log number of offenses reported by an agency
on measures of computerization of record keeping and computerization of officer deployment.
Each column reports results of a separate regression with the log offense measure used as
dependent variable indicated at the top of the column. The computer record keeping measure
is a 0-1 indicator of whether an agency uses computers for record keeping. The deployment
measure is an index ranging from 0 to 1 of the use of computers for three different deployment
functions. The regressions include the percent Black, percent Hispanic, per capita income,
poverty rate, log number of agency employees, and population decile indicators interacted
with log population as additional controls. The regressions also include agency and year fixed
effects. Standard errors clustered on agency are reported in parentheses. * denotes significance
at the two-tailed 5% level and ** the 1% level.
Table 9: Complementarities Between IT and Management Practices in Solving Crimes
Clearance Rate For:
Explanatory Indicator All Crimes Violent Crimes Property Crimes
Compstat .0195* .0229** .0371** .0317* .0132 .0157
(.00823) (.00827) (.0139) (.0135) (.00804) (.00802)
Computer use .00158 .00965 .0196 .0194 .00784 .0146
(.00941) (.00953) (.0166) (.0166) (.00926) (.00934)
High-skilled workers -5.63E-4 .00676 .00150 1.25E-4 .00896 .0144*
(.00642) (.00655) (.0123) (.0120) (.00632) (.00640)
Problem-solving emphasis -.0114* -.00486 -.0353** -.0208* -.0110* -.00400
(.00560) (.00559) (.0106) (.0102) (.00548) (.00543)
Geographic awareness -.00552 .00735 -.0241* .00663 -.00825 .00656
(.00605) (.00657) (.0110) (.0112) (.00593) (.00645)
Evaluation -.00477 -.00319 -.0159 -.00529 -.00377 -.00110
(.00583) (.00577) (.0110) (.0106) (.00563) (.00553)
N 1768 1768 1765 1765 1768 1768
R² 0.005 0.039 0.015 0.086 0.006 0.048
Include demographic controls? No Yes No Yes No Yes
Note: This table reports agency-level regressions of the 1997-2003 average clearance rate (arrest/offenses) on
indicators for a Compstat system as well as individual modern police management practices. Each column
entry reports coefficient estimates from a separate regression with inclusion of controls as specified in the
bottom row of the table. Agencies with a Compstat system simultaneously implemented elements of all five of
the listed management practices in more than half of the sample years between 1997-2003. The demographic
controls are the average percent Black, percent Hispanic, per capita income, poverty rate, and log population
of the area covered by the agency over 1997-2003. Heteroskedasticity-robust standard errors are reported in
parentheses. * denotes significance at the two-tailed 5% level and ** the 1% level.
Table 10: Complementarities Between IT and Management Practices in Deterring Crimes
Offense Rate For:
Explanatory Indicator All Crimes Violent Crimes Property Crimes
Compstat -.00851** -.00473* -8.72E-4* -1.31E-4 -.00764** -.00460*
(.00247) (.00215) (3.85E-4) (2.95E-4) (.00218) (.00196)
Computer use .00108 .00674** -5.05E-4 4.03E-4 .00159 .00634**
(.00206) (.00199) (3.49E-4) (2.97E-4) (.00181) (.00178)
High-skilled workers .00133 .00658** -5.09E-4 4.08E-4* .00184 .00617**
(.00185) (.00161) (2.88E-4) (2.08E-4) (.00164) (.00147)
Problem-solving emphasis .00944** .00830** .00121** 8.52E-4** .00823** .00745**
(.00181) (.00160) (2.74E-4) (2.10E-4) (.00160) (.00145)
Geographic awareness .00455** .00216 .00108** 1.70E-4 .00348* .00199
(.00158) (.00148) (2.45E-4) (2.06E-4) (.00140) (.00134)
Evaluation .00734** .00463** 8.65E-4** 2.30E-4 .00647** .00440**
(.00176) (.00150) (2.82E-4) (2.06E-4) (.00154) (.00136)
N 1768 1768 1768 1768 1768 1768
R² 0.057 0.253 0.046 0.402 0.055 0.216
Include demographic controls? No Yes No Yes No Yes
Note: This table reports agency-level regressions of the 1997-2003 average offense rate (offenses/population) on indi-
cators for a Compstat system as well as individual modern police management practices. Each column entry reports
coefficient estimates from a separate regression with inclusion of controls as specified in the bottom row of the table.
Agencies with a Compstat system simultaneously implemented elements of all five of the listed management practices
in more than half of the sample years between 1997-2003. The controls are the average percent Black, percent His-
panic, per capita income, poverty rate, and log number of agency employees over 1997-2003. Heteroskedasticity-robust
standard errors are reported in parentheses. * denotes significance at the two-tailed 5% level and ** the 1% level.
Figure 1: Trends in Technology Use By Police Agencies
[Three panels plotting 1985-2005 trends: "IT Use By Computer Type" (% of agencies using any computer, PC, mobile computer, or mainframe/server); "Data Availability by Type" (% of agencies with computerized data files on arrests, service calls, and stolen property); "IT Use By Agency Size" (% of agencies with any computers, for agencies with ten or fewer, between 11 and 100, and over 100 employees).]
Appendix
Construction of Management Practices Measures
To examine the role of complementarities in crime reduction, we require separate
agency-level measures of computerization along with relevant modern police man-
agement practices. Following Weisburd, Mastrofski, McNally, Greenspan, and Willis
(2003), we identify five components of a Compstat system: 1) information technology
for crime data collection and analysis, 2) use of skilled officers, 3) a problem-solving
paradigm, 4) feedback-based evaluation, and 5) a geographic-based deployment structure.
We code individual survey items 0-1 (No/Yes) to construct each of the five practice
measures. The constituent survey questions corresponding to each practice measure
are:
1. Information Technology (3)
Does the department use computers for crime analysis?
Does the department use computers for crime mapping?
Does the department use computers for investigation?
Does the department maintain computerized data on criminal histories?
Does the department maintain computerized data on crime incidents?
Does the department maintain computerized data on stolen property?
2. Skilled Officers (1)
Are more than 6 months (1040 hours) of training provided for new officers?
Are new officers required to have previous college experience?
3. Problem Solving (1.5)
Are officers encouraged to use SARA-type problem solving?
Are collaborative problem solving criteria used in officer evaluations?
Does the agency engage in problem solving projects with community groups
or government agencies?
4. Feedback-Based Evaluation (1)
Is citizen survey information collected and provided to patrol officers?
Is citizen survey information collected and used for allocating resources?
Is citizen survey information collected and used for prioritizing crime/disorder
problems?
Is citizen survey information collected and used for redistricting patrol ar-
eas?
5. Geographic Deployment (1)
Are officers assigned to geographic areas?
Are detectives assigned cases based on geographic areas?
The numbers in parentheses above correspond to the average number of annual survey
questions that must be answered positively in order for an agency to be classified as
employing a particular management practice. For example, an agency which answered
yes to 2 of the problem-solving questions in 1997 and 1999, 1 in 2000, and 3 in 2003
would have an average problem-solving response of 2, which exceeds the 1.5 threshold,
so its problem-solving practice indicator would be coded as 1. We consider a department as having a Compstat
system in a given year if it answered yes to at least one of the constituent survey items
for each of the five practices in that year. Compstat agencies were agencies with
Compstat systems in at least half of the available survey years. Of the 1768 agencies
in our pooled sample, 11.4% used Compstat, 85.6% information technology for crime
analysis, 13.8% high-skill officers, 38.5% problem-solving practices, 63.6% geographic
deployment, and 41.3% feedback-based evaluation.
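A minimal sketch of this coding rule follows, under the caveat that the column names and data layout are assumptions for exposition rather than the actual LEMAS file structure.

```python
import pandas as pd

# Average number of 'yes' answers per year required for each practice
# indicator, as listed above (e.g., 3 for information technology and
# 1.5 for problem solving).
THRESHOLDS = {"info_tech": 3, "skilled_officers": 1, "problem_solving": 1.5,
              "evaluation": 1, "geographic_deployment": 1}

def code_practice_indicators(yes_counts: pd.DataFrame) -> pd.DataFrame:
    """Code 0/1 agency-level practice indicators.

    `yes_counts` is a hypothetical agency-year table with an 'agency_id'
    column and, for each practice, the number of constituent survey
    questions answered positively in that year.
    """
    yearly_avg = yes_counts.groupby("agency_id")[list(THRESHOLDS)].mean()
    return pd.DataFrame({practice: (yearly_avg[practice] >= cutoff).astype(int)
                         for practice, cutoff in THRESHOLDS.items()})

# The SARA example above: yearly counts of 2, 2, 1, and 3 average to 2 >= 1.5,
# so the problem_solving indicator is coded 1 for that agency.
```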
Table A-1: Alternative Samples and Specifications
Alternative Specification
(I) (II) (III) (IV) (V)
Crime Any Better Omit Always
Measure Analysis Computer Panel 1987 Had IT
Departmental Size and Complexity
Log(Number of employees) .0236** .0706** .0811** .0664** .0558**
Number of special units .224** -.126 .430** .525** .562**
Organizational levels .00259 .0507** .0328* .0153 -.0153
Total written directives .524** 2.47** 1.81** .430** .342**
Departmental Organization
% officers with arrest powers 8.83E-4 .0114* .0109 .00747 1.12E-4
% field operations staff .00246 -.0212 -.00744 -.00435 -.00427
% technical support staff .0321** .120** .0745** .00129 .00205
Worker Skill and Training
College requirement for new officers .0225** .0193 .0258 .00297 .0179
Hours of training for new officers 67.5** 216** 205** 40.8* 60.9**
Arrests, Offenses, and Officer Injury
Total crime clearance rate -.00116 -.00210 -.00337 -.00698 -.00202
Violent crime clearance rate -.00982 .0223 .00806 -.00523 .0126
Property crime clearance rate 9.60E-4 .00223 -7.05E-4 -.00126 -3.31E-4
Total offenses rate .00103** .00484** .00374** .00286** .00124
Violent offense rate 1.74E-4* 4.71E-4** 7.93E-5 -5.55E-5 -1.21E-4
Property offense rate 8.58E-4* .00435** .00366** .00292** .00137
Assaults on officers -15.5* -92.9** -54.9** -15.1 -12.3
Officers killed -.00191 -.0242 -.0131 -.00674 -.00940
Note: This table reports robustness checks of the estimated effect of IT use on organizational and arrest outcomes.
Each table entry reports the results of a separate regression. The controls are the same as those reported for
column IV of Tables 2 and 4. Specifications I and II respectively replace the computer index with an indicator
for computerized crime analysis and an indicator for use of any computing technology. Specification III limits
the sample to agencies with available data in 4 or more years. Specification IV omits observations from 1987.
Specification V limits the sample to agencies with a non-zero computer index in their earliest year of reporting.
Each table entry reports a coefficient estimate from a separate regression. Standard errors clustered on agency are
reported in parentheses. * denotes significance at the two-tailed 5% level and ** the 1% level.
Table A-2: Initial Litigation and IT Use
Avg. Litigation
Group Cases Per 100 (I) (II)
Employees
0-10 percentile .435 .0129 -.0312
(.0563) (.0584)
10-20 percentile .964 -.0682 -.0532
(.0685) (.0665)
20-30 percentile 1.35 -.0831 -.0986
(.0577) (.0576)
30-40 percentile 1.73 -.138* -.174**
(.0611) (.0615)
40-50 percentile 2.11 -.0473 -.0789
(.0590) (.0597)
50-60 percentile 2.58 -.0293 -.0499
(.0536) (.0544)
60-70 percentile 3.42 -.0698 -.115
(.0636) (.0638)
70-80 percentile 4.60 -.0545 -.0727
(.0654) (.0643)
80-90 percentile 6.69 -.0844 -.128*
(.0523) (.0526)
90-100 percentile 15.3 .0152 -2.15E-6
(.0529) (.0542)
N 709 2278
R² 0.025 0.659
Note: This table reports regressions of the final level of the IT index
on indicators for deciles for the amount of litigation experienced
by an agency in 1987. Each column reports coefficient estimates
from a separate regression. Coefficients are measured relative to
agencies with no reported litigation cases in 1987. Specification
I limits the analysis to agencies for which IT data was available
in 1999 while specification II includes all agencies with at least
one observation on IT after 1987. Both regressions include the
initial level of the computer index as a control and specification II
includes indicators for the final year in which IT data was available
as additional controls. Heteroskedasticity-robust standard errors
are reported in parentheses. * denotes significance at the two-tailed
5% level and ** the 1% level.
    • "The two models were meant to enhance police decision-making when designing crime prevention programs and operational strategies. However, at this time impacts on crime reduction are far less than obvious and have become the subject of scholarly debates (Garicano and Heaton, 2006; Manning, 2003). More recently, several Anglo-Saxon law enforcement agencies have adopted Intelligence- Led Policing (ILP),a British police model based on a doctrine heavily influenced by a corporate and commercial vision in which police organizations adopt a " client-centred approach " by targeting serious crimes and repeat offenders. "
    Full-text · Chapter · Jan 2008 · Labour Economics
  • Article · · Labour Economics
  • [Show abstract] [Hide abstract] ABSTRACT: For the past twenty five years, economists have been building theories of the optimal management of firms. For example, economic models suggest that under some conditions, piece rate pay raises performance, and under other conditions, promotions tournaments raise performance. Some of these theories have been tested, others have not. Economists are now using new empirical research tools, that we label “insider econometrics,” to test the impact of management practices on productivity: to model how much productivity changes; to model why management practices raise productivity; and to examine which firms benefit and why from alternative management practices. The methodology we describe is “insider” because it uses inside knowledge and data from within firms. It is “econometrics” because the methodology is often the application of treatment effects methods to modeling changing management practices within firms. However, the methods and challenges of insider econometrics are unique, and we identify several key features that are important in undertaking empirical studies of workers' productivity. Now that more firms are keeping data on employees, it is time to improve our analysis of the empirical study of the productivity of workers within firms.
    Article · Dec 2009