Chapter

Governance Models Preferences for Security Information Sharing: An Institutional Economics Perspective for Critical Infrastructure Protection

Authors:
  • University of Applied Sciences of Western Switzerland

Abstract

Empirical studies have analyzed the incentive mechanisms that support security information sharing between human agents, a key activity for critical infrastructure protection. However, recent research shows that most Information Sharing and Analysis Centers – the most common institution organizing security information sharing – do not perform at a Pareto-optimal level, even when properly regulated. Using a meso-level of analysis, we close an important research gap by presenting a theoretical framework that links institutional economics and security information sharing. We illustrate this framework with a dataset collected through an online questionnaire addressed to all critical infrastructures (N = 262) operating at the Swiss Reporting and Analysis Centre for Information Security (MELANI). Using descriptive statistics, we investigate how institutional rules offer human agents the institutional freedom to self-design an efficient security information sharing artifact. Our results show that a properly designed artifact can positively reinforce human agents to share security information and find the right balance between three governance models: (A) public–private partnership, (B) private, and (C) government-based. Overall, our work lends support to a better institutional design of security information sharing and the formulation of policies that can avoid the non-cooperative and free-riding behaviors that plague the cybersecurity public good.


... In fact, studies have shown that cyber information sharing is primarily a problem of (bad) human behavior and misaligned incentives [8,7,14,11]. Existing research has shown that the human motivation to engage in cyber information sharing would be higher if intelligence production is realized in a decentralized and privacy-preserving way [26] and under a public-private governance [28] institutional design. This theoretical ideal is, however, difficult to implement in practice because of diverging interests and incentives between institutions [3,2]. ...
Preprint
Full-text available
Cyber Threat Intelligence (CTI) sharing is an important activity to reduce information asymmetries between attackers and defenders. However, this activity presents challenges due to the tension between data sharing and confidentiality, which results in information retention and often leads to a free-rider problem. Therefore, the information that is shared represents only the tip of the iceberg. Current literature assumes access to centralized databases containing all the information, but this is not always feasible due to the aforementioned tension. This results in unbalanced or incomplete datasets, requiring the use of techniques to expand them; we show how these techniques lead to biased results and misleading performance expectations. We propose a novel framework for extracting CTI from distributed data on incidents, vulnerabilities and indicators of compromise, and demonstrate its use in several practical scenarios, in conjunction with the Malware Information Sharing Platform (MISP). Policy implications for CTI sharing are presented and discussed. The proposed system relies on an efficient combination of privacy enhancing technologies and federated processing. This lets organizations stay in control of their CTI and minimize the risks of exposure or leakage, while enabling the benefits of sharing, more accurate and representative results, and more effective predictive and preventive defenses.
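As a deliberately simplified stand-in for the federated-processing idea (real deployments with MISP and privacy enhancing technologies are far more involved), the sketch below shows organizations contributing only local aggregate counts, so the coordinator never sees individual incident records. All names and data are invented.

```python
# Simplified federated aggregation sketch: raw incident data stays local;
# only per-indicator counts are shared with the coordinator.
from collections import Counter

# Raw data never leaves each organization; only the Counter below is shared.
local_incident_logs = {
    "org-alpha": ["emotet", "emotet", "phishing-kit-x"],
    "org-beta":  ["phishing-kit-x", "cobaltstrike-beacon"],
    "org-gamma": ["emotet", "cobaltstrike-beacon", "cobaltstrike-beacon"],
}

def local_summary(incidents):
    """Computed inside each organization's own perimeter."""
    return Counter(incidents)

def federated_aggregate(summaries):
    """Run by the coordinator: sums per-indicator counts across members."""
    total = Counter()
    for summary in summaries:
        total.update(summary)
    return total

summaries = [local_summary(log) for log in local_incident_logs.values()]
print(federated_aggregate(summaries).most_common())
```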
Thesis
Full-text available
The operational continuity of critical infrastructures (CIs) is vital for the functioning of modern societies. Yet these CIs are monitored and managed by an interdependent ecosystem of information systems, exposing CIs to the systemic risk of cascading failures. Consequently, CIs require an information-systems defense capability – i.e., the ability to prevent, detect and respond to information systems' failure. In order to ensure such a capability, the field of computer & information security develops a myriad of technologies. However, security incidents are caused by inappropriate organizational design and/or human-behavior aspects at least as often as by inefficient IT design. Following this logic, information systems are understood as socio-technical systems constituted by a nexus of technologies (material resources) and human agents (human and knowledge resources) who employ such technologies. Building on prior research on organizational capabilities and security economics, I explore the organizational design and human behavior aspects that are necessary for CIs to build an information-systems defense capability. Investigating the case of three specific critical infrastructures and their context, I deconstruct this capability into material, human, and knowledge resources, and I explore how they should be acquired to build such a capability. My contributions are threefold: 1) a model that helps to preempt the effect of disruptive technologies on the optimal level of investment in cyber-security, providing a framework to select and invest in the most effective technologies; 2) a recruitment framework to attract scarce IT specialists; 3) a model to foster tacit-knowledge absorption related to cyber-security. Policy recommendations and a research agenda for future work are presented.
Conference Paper
Full-text available
Cybersecurity information sharing among participating organizations proactively helps defend against attackers. However, such sharing also exposes potentially sensitive organizational information. We attack the problem of finding sharing incentives and penalties that maximize sharing utility while minimizing risk by modeling cybersecurity information sharing as an iterated prisoner's dilemma-like game. A genetic algorithm then evolves potential strategies that lead to high utility for an organization participating in a Cybersecurity Information Exchange. Preliminary results indicate that the genetic algorithm finds strategies that perform better than random or tit-for-tat strategies.
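For readers who want to see the mechanics, here is a compact, self-contained sketch in the spirit of that study: a genetic algorithm evolving memory-one strategies for an iterated prisoner's dilemma-like sharing game, scored against tit-for-tat and a random opponent. The encoding, payoffs, and GA parameters are illustrative choices, not those of the paper.

```python
# Evolve memory-one strategies for an iterated PD with a simple GA (illustrative).
import random

R, S, T, P = 3, 0, 5, 1            # standard PD payoffs (C = cooperate/share)
ROUNDS, POP, GENS, MUT = 50, 40, 60, 0.05
random.seed(1)

# A genome has 5 bits: first move, then the response to each of the four
# possible (my last move, opponent's last move) histories: CC, CD, DC, DD.
HIST = {("C", "C"): 1, ("C", "D"): 2, ("D", "C"): 3, ("D", "D"): 4}

def move(genome, history):
    return "C" if genome[0 if history is None else HIST[history]] else "D"

def play(genome, opponent):
    """Total payoff of `genome` against an opponent function over ROUNDS."""
    score, hist_mine, hist_opp = 0, None, None
    for _ in range(ROUNDS):
        mine, theirs = move(genome, hist_mine), opponent(hist_opp)
        score += {"CC": R, "CD": S, "DC": T, "DD": P}[mine + theirs]
        hist_mine, hist_opp = (mine, theirs), (theirs, mine)
    return score

def tit_for_tat(history):
    return "C" if history is None else history[1]

def random_player(history):
    return random.choice("CD")

def fitness(genome):
    return play(genome, tit_for_tat) + play(genome, random_player)

def evolve():
    pop = [[random.randint(0, 1) for _ in range(5)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP // 2]                      # truncation selection
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randint(1, 4)                 # one-point crossover
            child = [bit ^ (random.random() < MUT) for bit in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best genome   :", best)
print("vs tit-for-tat:", play(best, tit_for_tat), " vs random:", play(best, random_player))
```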
Article
Full-text available
Legislators in many countries enact security breach notification regulation to address a lack of information security. The laws designate authorities to collect breach reports and advise firms. We devise a principal–agent model to analyze the economic effect of mandatory security breach reporting to authorities. The model assumes that firms (agents) have few incentives to unilaterally report breaches. To enforce the law, regulators (principals) can introduce security audits and sanction noncompliance. However, audits cannot differentiate between concealment and nescience of the agents. Even under optimistic assumptions regarding the effectiveness of mandatory security breach reporting to authorities in reducing individual losses, our model predicts that it may be difficult to adjust the sanction level such that breach notification laws generate social benefit.
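A back-of-the-envelope sketch of the reporting trade-off (not the paper's formal principal-agent model) shows why calibrating the sanction is delicate; the reporting cost, audit probability, and sanction values below are hypothetical.

```python
# Illustrative comparison: report a breach (cost c) vs. conceal it and risk an
# audit (probability q) with sanction s.  Because audits cannot tell
# concealment from nescience, unaware firms face the same expected penalty.
def expected_cost(report: bool, c=50_000, q=0.2, s=150_000) -> float:
    return c if report else q * s

print("cost if reporting :", expected_cost(True))    # 50,000
print("cost if concealing:", expected_cost(False))   # 30,000
print("aware firm reports?", expected_cost(True) <= expected_cost(False))  # False
# Raising s until q*s >= c induces reporting, but the same expected penalty
# then also falls on firms that were genuinely unaware of a breach, which is
# the calibration problem the paper highlights.
print("expected penalty on an unaware, audited firm:", 0.2 * 150_000)
```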
Article
Full-text available
We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm’s vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker’s attack and information sharing are strategic complements while one hacker’s attack and the other hacker’s information sharing are strategic substitutes. As the firm’s unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm’s defense and profit decrease, and the hackers’ information sharing and profit increase. The firm’s profit increases in the hackers’ unit cost of attack, while the hackers’ information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm’s investment in cyber security defense and profit are constant, the hackers’ investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm’s investment in cyber security defense and profit remain constant, the hackers’ investments in attacks increase, and the hackers’ information sharing levels and profits decrease. Increasing the firm’s asset causes all the variables to increase linearly, except information sharing which is constant. We extend our analysis to endogenize the firm’s asset and this analysis largely confirms the preceding analysis with a fixed asset. We use the software Mathematica 10.1 (www.wolfram.com) to program the model mathematically with equilibrium constraints, and perform numerical analysis illustrated graphically.
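The following numeric toy conveys the flavor of such contest models. It assumes a ratio-form (Tullock) contest success function and arbitrary parameter values, which may differ from the paper's exact specification; information sharing between the hackers is mimicked as a multiplier on the effectiveness of their attack effort.

```python
# Toy contest: the firm's asset is split between firm and two hackers
# according to a ratio-form contest success function (an assumed form).
ASSET = 100.0              # contested asset value
C_DEF = C_ATT = 1.0        # unit costs of defense and attack effort

def shares(defense: float, attack1: float, attack2: float):
    """Each player's share of the asset under a ratio-form contest."""
    total = defense + attack1 + attack2
    return defense / total, attack1 / total, attack2 / total

d, a1, a2 = 20.0, 10.0, 10.0
for boost in (1.0, 1.2, 1.5):   # sharing makes attack effort more effective
    s_firm, s_h1, s_h2 = shares(d, a1 * boost, a2 * boost)
    firm_profit = s_firm * ASSET - C_DEF * d
    h1_profit = s_h1 * ASSET - C_ATT * a1
    h2_profit = s_h2 * ASSET - C_ATT * a2
    print(f"sharing effectiveness x{boost:.1f}: "
          f"firm {firm_profit:5.1f}, hackers {h1_profit:4.1f} / {h2_profit:4.1f}")
```

In this toy, greater sharing effectiveness shifts asset shares toward the hackers and away from the firm, which is the qualitative tension the paper analyzes in much finer detail.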
Article
Full-text available
Nowadays, both the amount of cyberattacks and their sophistication have considerably increased, and their prevention is a concern for most organizations. Cooperation by means of information sharing is a promising strategy to address this problem, but unfortunately it poses many challenges. Indeed, looking for a win-win environment is not straightforward and organizations are not properly motivated to share information. This work presents a model to analyse the benefits and drawbacks of information sharing among organizations that present a certain level of dependency. The proposed model applies functional dependency network analysis to emulate attack propagation and game theory for information sharing management. We present a simulation framework implementing the model that allows for testing different sharing strategies under several network and attack settings. Experiments using simulated environments show how the proposed model provides insights on which conditions and scenarios are beneficial for information sharing.
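A minimal Monte Carlo sketch of the same idea follows: compromises propagate along a small hypothetical dependency network, and information sharing is assumed to halve the propagation probability. The network, probabilities, and the halving assumption are invented for illustration.

```python
# Toy simulation: attack propagation over a dependency network, with and
# without information sharing dampening the cascade.
import random

# dependency[b] = [(a, strength), ...] means b depends on a; a compromise of a
# propagates to b with probability `strength`.
DEPENDENCIES = {
    "bank":     [("telecom", 0.6), ("energy", 0.4)],
    "telecom":  [("energy", 0.7)],
    "hospital": [("energy", 0.5), ("telecom", 0.3)],
    "energy":   [],
}

def simulate(initial: str, sharing: bool, trials: int = 20_000) -> float:
    """Average number of compromised organizations per attack on `initial`."""
    factor = 0.5 if sharing else 1.0          # sharing dampens propagation
    total = 0
    for _ in range(trials):
        compromised, frontier = {initial}, [initial]
        while frontier:
            source = frontier.pop()
            for org, deps in DEPENDENCIES.items():
                for dep, strength in deps:
                    if dep == source and org not in compromised \
                            and random.random() < strength * factor:
                        compromised.add(org)
                        frontier.append(org)
        total += len(compromised)
    return total / trials

random.seed(0)
print("avg. compromised, no sharing  :", round(simulate("energy", sharing=False), 2))
print("avg. compromised, with sharing:", round(simulate("energy", sharing=True), 2))
```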
Article
Full-text available
Given the importance of cybersecurity to the survival of an organization, a fundamental economics-based question that must be addressed by all organizations is: How much should be invested in cybersecurity related activities? Gordon and Loeb [1] presented a model to address this question, and that model has received a significant amount of attention in the academic and practitioner literature. The primary objective of this paper is to discuss the Gordon-Loeb Model with a focus on gaining insights for the model's use in a practical setting.
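To make the Gordon-Loeb logic concrete, the short sketch below computes the investment that maximizes the expected net benefit for the commonly used class-I breach probability function S(z, v) = v / (alpha*z + 1)**beta. The functional form and all parameter values are illustrative assumptions, not figures from the paper above.

```python
# Gordon-Loeb sketch: choose z to maximize ENBIS(z) = [v - S(z, v)] * L - z,
# with v = vulnerability, L = potential loss, S(z, v) = breach probability.
def breach_probability(z: float, v: float, alpha: float, beta: float) -> float:
    """Class-I Gordon-Loeb security breach probability function."""
    return v / (alpha * z + 1) ** beta

def optimal_investment(v: float, loss: float, alpha: float, beta: float) -> float:
    """Closed-form maximizer of ENBIS for the class-I function (0 if negative)."""
    z_star = ((alpha * beta * v * loss) ** (1 / (beta + 1)) - 1) / alpha
    return max(z_star, 0.0)

if __name__ == "__main__":
    v, loss, alpha, beta = 0.6, 10_000_000, 1e-6, 1.0   # hypothetical parameters
    z = optimal_investment(v, loss, alpha, beta)
    enbis = (v - breach_probability(z, v, alpha, beta)) * loss - z
    print(f"optimal investment z* = {z:,.0f}")
    print(f"expected net benefit  = {enbis:,.0f}")
    # Gordon-Loeb's headline result: z* never exceeds (1/e) of the expected loss v*L.
    print(f"(1/e) * v * L         = {v * loss / 2.718281828:,.0f}")
```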
Article
Full-text available
The primary objective of this article is to develop an economics-based analytical framework for assessing the impact of government incentives/regulations designed to offset the tendency to under-invest in cybersecurity related activities by private sector firms. The analysis provided in the article shows that the potential for government incentives/regulations to increase cybersecurity investments by private sector firms is dependent on the following two fundamental issues: (i) whether or not firms are utilizing the optimal mix of inputs to cybersecurity, and (ii) whether or not firms are able, and willing, to increase their investments in cybersecurity activities. The implications of these findings are also discussed in this article, as well as a formal analysis of these implications. In addition, this article provides a discussion of existing actions by the US federal government that should be more effectively utilized before, or at least in conjunction with, considering new government incentives/regulations for increasing cybersecurity investments by private sector firms.
Conference Paper
Full-text available
The sharing of cyber security information between organizations, both public and private, and across sectors and borders is required to increase situational awareness, reduce vulnerabilities, manage risk and enhance cyber resilience. However, information sharing is a broad and multi-faceted concept. This chapter describes an analytic framework for sharing cyber security information. A decomposition of the information sharing needs with regard to information exchange elements is mapped to a grid whose vertical dimension spans the strategic/policy, tactical and operational/technical levels and whose horizontal dimension spans the incident response cycle. The framework facilitates organizational and legal discussions about the types of cyber security information that can be shared with other entities, along with the terms and conditions of information sharing. Moreover, the framework helps identify important aspects that are missing in existing information exchange standards.
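As a purely illustrative rendering of that grid, the snippet below encodes the two dimensions as a small lookup table. The phase names follow the incident response cycle as listed in the related good-practice entry further down; the example information types placed in the cells are guesses, not content from the chapter.

```python
# Minimal data-structure sketch of the sharing grid: levels x response phases.
LEVELS = ["strategic/policy", "tactical", "operational/technical"]
PHASES = ["proactive", "prevention", "preparation",
          "incident response", "recovery", "aftercare/follow-up"]

# grid[(level, phase)] -> example of information that might be exchanged there
grid = {
    ("strategic/policy", "proactive"): "threat landscape and risk assessments",
    ("tactical", "preparation"): "attacker tactics, techniques and procedures",
    ("operational/technical", "incident response"): "indicators of compromise",
    ("operational/technical", "recovery"): "patch and remediation guidance",
}

for (level, phase), example in grid.items():
    print(f"{level:24s} | {phase:20s} | {example}")
```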
Article
Full-text available
We contribute to the microfoundations of organizational performance by proffering the construct of joint production motivation. Under such motivational conditions individuals see themselves as part of a joint endeavor, each with his or her own roles and responsibilities; generate shared representations of actions and tasks; cognitively coordinate cooperation; and choose their own behaviors in terms of joint goals. Using goal-framing theory, we explain how motivation for joint production can be managed by cognitive/symbolic management and organizational design.
Book
Full-text available
The failure of a national critical infrastructure may seriously impact the health and well-being of citizens, the economy, the environment, and the functioning of the government. Moreover, critical infrastructures increasingly depend on information and communication technologies (ICT) or, in short, cyber. Cyber security and resilience are therefore seen as increasingly important governance topics and major challenges for today’s societies, as the threat landscape is continuously changing. Sharing cyber security related information between organisations – in a critical sector, cross-sector, nationally and internationally – is widely perceived as an effective measure in support of managing the cyber security challenges of organisations. Information sharing, however, is not an easy topic. It comes with many facets. For example, information sharing spans strategic, tactical, operational and technical levels; spans all phases of the cyber incident response cycle (proactive, pre-emption, prevention, preparation, incident response, recovery, aftercare/follow-up); is highly dynamic; crosses the boundary of public and private domains; and concerns sensitive information which can be potentially harmful for one organisation on the one hand, while being very useful to others. This Good Practice on information sharing discusses many of these facets. Its aim is to assist you as public and private policy-makers, middle management, researchers, and cyber security practitioners, and to steer you away from pitfalls. Reflect on the earlier lessons identified to find your own effective and efficient arrangements for information sharing which fit your specific situation.
Article
Full-text available
Cyber security breaches inflict costs on consumers and businesses. The possibility also exists that a cyber security breach may shut down an entire critical infrastructure industry, putting a nation’s whole economy and national defense at risk. Hence, the issue of cyber security investment has risen to the top of the agenda of business and government executives. This paper examines how the existence of well-recognized externalities changes the maximum a firm should, from a social welfare perspective, invest in cyber security activities. By extending the cyber security investment model of Gordon and Loeb [1] to incorporate externalities, we show that the firm’s social optimal investment in cyber security increases by no more than 37% of the expected externality loss.
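The same toy setup used for the Gordon-Loeb sketch above can illustrate this bound: a social planner who internalizes an additional externality loss raises the optimal investment, but by no more than (1/e) of the expected externality loss. The class-I functional form and all numbers are again assumptions, not the paper's data.

```python
# Externality extension sketch, reusing the (assumed) class-I Gordon-Loeb form.
import math

def optimal_investment(v, loss, alpha=1e-6, beta=1.0):
    return max(((alpha * beta * v * loss) ** (1 / (beta + 1)) - 1) / alpha, 0.0)

v, L_priv, L_ext = 0.6, 10_000_000, 5_000_000         # hypothetical parameters
z_private = optimal_investment(v, L_priv)              # firm ignores the externality
z_social  = optimal_investment(v, L_priv + L_ext)      # planner internalizes it
print(f"private optimum  : {z_private:,.0f}")
print(f"social optimum   : {z_social:,.0f}")
print(f"increase         : {z_social - z_private:,.0f}")
print(f"bound (v*L_ext/e): {v * L_ext / math.e:,.0f}")
```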
Article
Full-text available
Research on the science enterprise ordinarily assumes that any hierarchy among academic departments reflects an egalitarian system based on merit. This assumption is predicated on the view that departmental prestige is primarily a function of faculty scholarship. In this article, the association between prestige and scholarship is examined for the discipline of sociology using evaluative ratings from three national studies and objective data on publications. Scholarship is found to be far less important in determining prestige ratings than either the past reputations of departments or their affiliated universities. Publishing is not necessarily a straightforward means of securing a department's prestige. Instead, past reputation reflective of an institutional context, rather than scholarly productivity, appears to be the critical property bearing on how departments are viewed.
Article
Full-text available
This chapter outlines the two basic routes to persuasion. One route is based on the thoughtful consideration of arguments central to the issue, whereas the other is based on the affective associations or simple inferences tied to peripheral cues in the persuasion context. This chapter discusses a wide variety of variables that proved instrumental in affecting the elaboration likelihood, and thus the route to persuasion. One of the basic postulates of the Elaboration Likelihood Model—that variables may affect persuasion by increasing or decreasing scrutiny of message arguments—has been highly useful in accounting for the effects of a seemingly diverse list of variables. The reviewers of the attitude change literature have been disappointed with the many conflicting effects observed, even for ostensibly simple variables. The Elaboration Likelihood Model (ELM) attempts to place these many conflicting results and theories under one conceptual umbrella by specifying the major processes underlying persuasion and indicating the way many of the traditionally studied variables and theories relate to these basic processes. The ELM may prove useful in providing a guiding set of postulates from which to interpret previous work and in suggesting new hypotheses to be explored in future research. Copyright © 1986 Academic Press Inc. Published by Elsevier Inc. All rights reserved.
Article
Full-text available
Design science research (DSR) has staked its rightful ground as an important and legitimate Information Systems (IS) research paradigm. We contend that DSR has yet to attain its full potential impact on the development and use of information systems due to gaps in the understanding and application of DSR concepts and methods. This essay aims to help researchers (1) appreciate the levels of artifact abstractions that may be DSR contributions, (2) identify appropriate ways of consuming and producing knowledge when they are preparing journal articles or other scholarly works, (3) understand and position the knowledge contributions of their research projects, and (4) structure a DSR article so that it emphasizes significant contributions to the knowledge base. Our focal contribution is the DSR knowledge contribution framework with two dimensions based on the existing state of knowledge in both the problem and solution domains for the research opportunity under study. In addition, we propose a DSR communication schema with similarities to more conventional publication patterns, but which substitutes the description of the DSR artifact in place of a traditional results section. We evaluate the DSR contribution framework and the DSR communication schema via examinations of DSR exemplar publications.
Article
Full-text available
The threat of cascading failures across critical infrastructures has been identified as a key challenge for governments. Cascading failure is seen as potentially catastrophic, extremely difficult to predict and increasingly likely to happen. Infrastructures are largely privately operated and private actors are thought to under-invest in mitigating this threat. Consequently, experts have demanded a more dominant role for government, including the use of regulation. Empirical evidence on cascading failure is, however, extremely scarce. This paper analyzes new data drawn from news reports on incidents. We find that, contrary to current thinking, cascades are not rare. Neither do they indicate a wide array of unknown dependencies across infrastructures. Rather, we find a small number of focused, unidirectional pathways around energy and telecommunications. We also found that most cascades were stopped quickly, in contrast to the oft-cited ‘domino effect’. These findings undermine the case for more intrusive public oversight of critical infrastructures.
Article
Full-text available
Individuals' knowledge does not transform easily into organizational knowledge even with the implementation of knowledge repositories. Rather, individuals tend to hoard knowledge for various reasons. The aim of this study is to develop an integrative understanding of the factors supporting or inhibiting individuals' knowledge-sharing intentions. We employ as our theoretical framework the theory of reasoned action (TRA), and augment it with extrinsic motivators, social-psychological forces and organizational climate factors that are believed to influence individuals' knowledge-sharing intentions. Through a field survey of 154 managers from 27 Korean organizations, we confirm our hypothesis that attitudes toward and subjective norms with regard to knowledge sharing as well as organizational climate affect individuals' intentions to share knowledge. Additionally, we find that anticipated reciprocal relationships affect individuals' attitudes toward knowledge sharing while both sense of self-worth and organizational climate affect subjective norms. Contrary to common belief, we find anticipated extrinsic rewards exert a negative effect on individuals' knowledge-sharing attitudes.
Article
Full-text available
Although research on trust in an organizational context has advanced considerably in recent years, the literature has yet to produce a set of generalizable propositions that inform our understanding of the organization and coordination of work. We propose that conceptualizing trust as an organizing principle is a powerful way of integrating the diverse trust literature and distilling generalizable implications for how trust affects organizing. We develop the notion of trust as an organizing principle by specifying structuring and mobilizing as two sets of causal pathways through which trust influences several important properties of organizations. We further describe specific mechanisms within structuring and mobilizing that influence interaction patterns and organizational processes. The principal aim of the framework is to advance the literature by connecting the psychological and sociological micro-foundations of trust with the macro-bases of organizing. The paper concludes by demonstrating how the framework can be applied to yield novel insights into traditional views of organizations and to stimulate original and innovative avenues of organizational research that consider both the benefits and downsides of trust.
Article
Full-text available
This second edition assesses some of the major refinements, extensions, and useful applications that have developed in neoinstitutionalist thought in recent years. More attention is given to the overlap between the New Institutional Economics and developments in economic history and political science. In addition to updated references, new material includes analysis of parallel developments in the field of economic sociology and its attacks on representatives of the NIE as well as an explanation of the institution-as-an-equilibrium-of-game approach. Already an international best seller, Institutions and Economic Theory is essential reading for economists and students attracted to the NIE approach. Scholars from such disciplines as political science, sociology, and law will find the work useful as the NIE continues to gain wide academic acceptance. A useful glossary for students is included. Eirik Furubotn is Honorary Professor of Economics, Co-Director of the Center for New Institutional Economics, University of Saarland, Germany and Research Fellow, Private Enterprise Research Center, Texas A&M University. Rudolf Richter is Professor Emeritus of Economics and Director of the Center for New Institutional Economics, University of Saarland, Germany. Copyright © Eirik G. Furubotn and Rudolf Richter 1998, 2005. All rights reserved.
Article
Full-text available
Contrasting views exist on how network characteristics predict knowledge sharing. Although large, open egocentric networks foster network positions that provide access to nonredundant knowledge, critics have noted that such networks impair knowledge sharing, because trust and reciprocity do not thrive in them. This problem may, however, be resolved by including motivation and ability to share knowledge as moderators of the association between network position and knowledge sharing. Our analysis of 705 employees in a consultancy shows that employees' knowledge acquisition and provision are highest when network centrality, autonomous motivation, and ability are all high, thus supporting the proposed three-way interaction.
Conference Paper
Empirical studies have analyzed the incentive mechanisms for sharing security information between human agents, a key activity for critical infrastructure protection. However, recent research shows that most Information Sharing and Analysis Centers do not perform optimally, even when properly regulated. Using a meso-level of analysis, we close an important research gap by presenting a theoretical framework that links institutional economics and security information sharing. We illustrate this framework with a dataset collected through an online questionnaire addressed to all critical infrastructures (N = 262) operating at the Swiss Reporting and Analysis Centre for Information Security (MELANI). Using descriptive statistics, we investigate how institutional rules offer human agents an institutional freedom to self-design an efficient security information sharing artifact. Our results show that a properly designed artifact can positively reinforce human agents to share security information and find the right balance between three governance models: (A) public-private partnership, (B) private, and (C) government-based. Overall, our work lends support to a better institutional design of security information sharing and the formulation of policies that can avoid non-cooperative and free-riding behaviors that plague cybersecurity.
Article
In today's interconnected digital world, cybersecurity risks and resulting breaches are a fundamental concern to organizations and public policy setters. Accounting firms, as well as other firms providing risk advisory services, are concerned about their clients’ potential and actual breaches. Organizations cannot, however, eliminate all cybersecurity risks so as to achieve 100% security. Furthermore, at some point additional cybersecurity measures become more costly than the benefits from the incremental security. Thus, those responsible for preventing cybersecurity breaches within their organizations, as well as those providing risk advisory services to those organizations, need to think in terms of the cost-benefit aspects of cybersecurity investments. Besides investing in activities that prevent or mitigate the negative effects of cybersecurity breaches, organizations can invest in cybersecurity insurance as a means of transferring some of the cybersecurity risks associated with potential future breaches. This paper provides a model for selecting the optimal set of cybersecurity insurance policies by a firm, given a finite number of policies being offered by one or more insurance companies. The optimal set of policies for the firm determined by this selection model can (and often does) contain at least three areas of possible losses not covered by the selected policies (called the Non-Coverage areas in this paper). By considering sets of insurance policies with three or more Non-Coverage areas, we show that a firm is often better able to address the frequently cited problems of high deductibles and low ceilings common in today's cybersecurity insurance marketplace. Our selection model facilitates improved risk-sharing among cybersecurity insurance purchasers and sellers. As such, our model provides a basis for a more efficient cybersecurity insurance marketplace than currently exists. Our model is developed from the perspective of a firm purchasing the insurance policies (or the risk advisors guiding the firm) and assumes the firm's objective in purchasing cybersecurity insurance is to minimize the sum of the costs of the premiums associated with the cybersecurity insurance policies selected and the sum of the expected losses not covered by the insurance policies.
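As a rough illustration of the selection problem (and only that; the paper's formal model is richer), the sketch below enumerates subsets of a small hypothetical policy menu, each policy covering one layer of loss, and picks the subset that minimizes premiums plus expected uncovered loss. Layer structure, premiums, and the loss distribution are all invented.

```python
# Toy policy-selection sketch: brute-force the subset minimizing
# total premium + expected loss left uncovered.
from itertools import combinations

# (attachment, exhaustion, premium): each policy pays the part of the loss
# that falls inside its layer.
POLICIES = [
    (0,       100_000,    9_000),
    (100_000, 500_000,   22_000),
    (500_000, 2_000_000, 35_000),
]

# Discrete loss scenarios: (loss amount, probability).
LOSS_DISTRIBUTION = [(0, 0.80), (50_000, 0.10), (400_000, 0.07), (1_500_000, 0.03)]

def uncovered(loss: float, layers) -> float:
    """Part of the loss that no selected layer pays for."""
    covered = sum(max(0.0, min(loss, hi) - lo) for lo, hi, _ in layers)
    return loss - covered

def expected_total_cost(layers) -> float:
    premiums = sum(p for _, _, p in layers)
    exp_uncovered = sum(prob * uncovered(loss, layers)
                        for loss, prob in LOSS_DISTRIBUTION)
    return premiums + exp_uncovered

best = min((subset for r in range(len(POLICIES) + 1)
            for subset in combinations(POLICIES, r)),
           key=expected_total_cost)
print("selected layers :", [(lo, hi) for lo, hi, _ in best])
print("expected cost   :", round(expected_total_cost(best)))
```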
Article
Users' acceptance of new technologies in schools or companies is a critical challenge in modern society. In this study, we used self-reported intentional measures, an objective measure of actual behavior, namely university students' use of a new virtual learning environment platform, and a measure of their implicit attitude towards technology, which we compared with two classic acceptance models (TPB, Theory of Planned Behavior; and TAM, Technology Acceptance Model). Our findings indicated that TPB was a better predictor of the intention to use the platform than TAM, but both models failed to predict actual behavior. In contrast, implicit measures were linked to the degree to which the students explored the platform but not to their reported intention to use it. Taken together, these results provide a first argument for considering implicit measures in the field of technology acceptance.
Chapter
We propose an extension of the Gordon-Loeb model by considering multiple periods and relaxing the assumption of a continuous security breach probability function. Such adaptations allow capturing dynamic aspects of information security investment, such as the advent of a disruptive technology and its consequences. In this paper, the case of big data analytics (BDA) and its disruptive effects on information security investment is theoretically investigated. Our analysis suggests a substantive decrease in such investment due to a technological shift. While we believe this case should be generalizable across the information security milieu, we illustrate our approach in the context of critical infrastructure protection (CIP), in which security cost reduction is of prime importance since potential losses reach unaffordable dimensions. Moreover, although BDA has been considered a promising method for CIP, its concrete effects have received little discussion.
Article
Cyber risk management largely reduces to a race for information between defenders of ICT systems and attackers. Defenders can gain advantage in this race by sharing cyber risk information with each other. Yet, they often exchange less information than is socially desirable, because sharing decisions are guided by selfish rather than altruistic reasons. A growing line of research studies these strategic aspects that drive defenders’ sharing decisions. The present survey systematizes these works in a novel framework. It provides a consolidated understanding of defenders’ strategies to privately or publicly share information and enables us to distill trends in the literature and identify future research directions. We reveal that many theoretical works assume cyber risk information sharing to be beneficial, while empirical validations are often missing.
Article
Sharing cyber security information helps firms to decrease cyber security risks, prevent attacks, and increase their overall resilience, and thereby contributes to reducing the social cost of security incidents. Although cyber security information sharing was previously performed in an informal and ad hoc manner, the development of information sharing and analysis centers (ISACs) has made it more structured, regular, and frequent. However, privacy risks and information disclosure concerns remain major challenges for ISACs, acting as barriers that prevent their potential impact from being fully realized.
Article
Currently, two models of innovation are prevalent in organization science. The “private investment” model assumes returns to the innovator result from private goods and efficient regimes of intellectual property protection. The “collective action” model assumes that under conditions of market failure, innovators collaborate in order to produce a public good. The phenomenon of open source software development shows that users program to solve their own as well as shared technical problems, and freely reveal their innovations without appropriating private returns from selling the software. In this paper, we propose that open source software development is an exemplar of a compound “private-collective” model of innovation that contains elements of both the private investment and the collective action models and can offer society the “best of both worlds” under many conditions. We describe a new set of research questions this model raises for scholars in organization science. We offer some details regarding the types of data available for open source projects in order to ease access for researchers who are unfamiliar with these, and also offer some advice on conducting empirical studies on open source software development processes.
Conference Paper
Sharing incident data among Internet operators is widely seen as an important strategy in combating cybercrime. However, little work has been done to quantify the positive benefits of such sharing. To that end, we report on an observational study of URLs blacklisted for distributing malware that the non-profit anti-malware organization StopBadware shared with requesting web hosting providers. Our dataset comprises over 28,000 URLs shared with 41 organizations between 2010 and 2015. We show that sharing has an immediate effect of cleaning the reported URLs and reducing the likelihood that they will be recompromised; despite this, we find that long-lived malware takes much longer to clean, even after being reported. Furthermore, we find limited evidence that one-time sharing of malware data improves the malware cleanup response of all providers over the long term. Instead, some providers improve while others worsen.
Conference Paper
In a world where cybersecurity can be reduced to a race for information, defenders with different information sets can benefit from sharing what they observe and know. However, defenders supposedly share less than what is socially desirable, thereby leaving parts of society insufficiently protected. The reason for this behavior is almost certainly not a lack of technology to facilitate information exchange. Even an occasional mismatch between information needs and information supply cannot fully explain why defenders share so little information. The key obstacle is economics: defenders often have few incentives to share security information. In the 1980s, economists extensively studied general information sharing between firms, often with the involvement of intermediaries, such as trade associations. These models were taken up in the 2000s, when information security emerged as a new application domain for economic reasoning. The bottom line of most economic models is that information sharing is very fragile, which concurs with our perception of reality. As current policy initiatives seek to improve information sharing, and the topic has gained enough attention to merit a specialized workshop in its third year, I take the opportunity and revisit the "old" models, their assumptions and implications, in order to derive possible new directions for future research. After all, security information is a very special good, which might call for tailored models: maybe there are better ways to conceptualize Information Sharing Analysis Centers (ISACs) than reducing them to trade associations?
Conference Paper
The Sicherheitsverbundsübung Schweiz 2014 (Swiss Security Network Exercise 2014) tested the availability of critical infrastructures. The present case study examines the extent to which one of the largest retailers in Switzerland would be able to continue operations under this scenario. Various interviews, simulations and process categorizations were undertaken to this end. The findings clearly demonstrate that even with a short-term blackout, damages incurred by the retailer focused on here quickly exceed the 100 million Swiss franc mark, while a longer power shortage would probably pose an existential threat. Under these preconditions, it becomes obvious that the security of the food supply for the Swiss population via organised channels cannot be guaranteed. The problem analysis clearly conveys that awareness of critical infrastructure issues has far from reached the decision-maker level and that the resilience of processes has been further weakened by strategic decisions.
Conference Paper
With the growing threat from overseas and domestic cyber attacks inter-organization cyber-security information sharing is an essential contributor to helping governments and industry to protect and defend their critical network infrastructure from attack. Encouraging collaboration directly impacts the defensive capabilities of all organizations involved in any cyber-information sharing community. A barrier to successful collaboration is the conflicting needs of collaborators to be able to both protect the source of their information for sensitivity, legal, or public relations reasons, but also to validate and trust the information shared with them. This paper uses as an example the UK government's Cyber-Security Information Sharing Partnership (CiSP), an online collaboration environment created by Surevine for sharing and collaborating on cyber-security information across UK industry and government. We discuss the organization and operating principles of the collaboration environment, how the community is structured, and the barriers to participation caused by the conflict between the need for anonymity versus the need to trust the information shared.
Chapter
The terrorist attacks of September 11, 2001, fundamentally challenged two key aspects of U.S. national security thinking. First, it altered the relationship between the private sector and the federal government by squarely thrusting the private sector into a new and unprecedented national security role. Second, it challenged long-standing priorities regarding the treatment of national security information, increasing the importance of sharing information and making it more widely available at the expense of traditional limitations on access to and dissemination of classified and other sensitive information. This chapter addresses the confluence of these challenges – information sharing by the federal government with the private sector to enhance national and homeland security. It provides a brief history of public–private information sharing efforts before 9/11, describes reforms and initiatives since 9/11, and assesses problems and prospects for improved information sharing in the future.
Article
The cyber threat intelligence information exchange ecosystem is a holistic approach to the automated sharing of threat intelligence. For automation to succeed, it must handle tomorrow's attacks, not just yesterday's. There are numerous ontologies that attempt to enable the sharing of cyber threats, such as OpenIOC, STIX, and IODEF. To date, most ontologies are based on various use cases. Ontology developers collect threat indicators that through experience seem to be useful for exchange. This approach is pragmatic and offers a collection of useful threat indicators in real-world scenarios. However, such a selection method is episodic. What is useful today may not be useful tomorrow. What we consider to be chaff or too hard to share today might become a critically important piece of information. Therefore, in addition to use case-based ontologies, ontologies need to be based on first principles. In this document we propose a taxonomy for classifying threat-sharing technologies. The purpose of this taxonomy is to classify existing technologies using an agnostic framework, identify gaps in existing technologies, and explain their differences from a scientific perspective. We are currently working on a thesaurus that will describe, compare, and classify detailed cyber security terms. This paper focuses on the classification of the ontologies themselves.
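For concreteness, here is what a minimal indicator looks like when expressed in the shape of one of the ontologies named above (a STIX 2.1 indicator object). The field values are invented for illustration; the STIX 2.1 specification remains the authoritative reference for the schema.

```python
# Minimal indicator in the shape of a STIX 2.1 "indicator" object (values invented).
import json

indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": "indicator--d81f86b9-975b-4c0b-875e-810c5ad45a4f",   # example UUID
    "created": "2024-01-15T08:00:00.000Z",
    "modified": "2024-01-15T08:00:00.000Z",
    "name": "Known C2 address",
    "pattern": "[ipv4-addr:value = '198.51.100.7']",
    "pattern_type": "stix",
    "valid_from": "2024-01-15T08:00:00Z",
}

print(json.dumps(indicator, indent=2))
```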
Article
This study examines a critical incentive alignment issue facing FS-ISAC (the information sharing alliance in the financial services industry). Failure to encourage members to share their IT security-related information has seriously undermined the founding rationale of FS-ISAC. Our analysis shows that many information sharing alliances' membership policies are plagued with the incentive misalignment issue and may result in a "free-riding" or "no information sharing" equilibrium. To address this issue, we propose a new information sharing membership policy that incorporates an insurance option and show that the proposed policy can align members' incentives and lead to a socially optimal outcome. Moreover, when a transfer payment mechanism is implemented, all member firms will be better off joining the insurance network. These results are demonstrated in a simulation in which IT security breach losses are compared both with and without participating in the proposed information sharing insurance plan.
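The free-riding equilibrium mentioned above can be reproduced with a deliberately simple two-firm sharing game; the payoff numbers below are hypothetical and only encode that sharing is privately costly but collectively beneficial.

```python
# Stylized two-firm sharing game: Withhold is a dominant strategy even though
# mutual sharing would leave both members better off.
BENEFIT = 5   # value a firm gets from information the *other* firm shares
COST    = 2   # private cost of sharing (exposure / disclosure risk)
ACTIONS = ("Share", "Withhold")

def payoff(my_action: str, other_action: str) -> int:
    gain = BENEFIT if other_action == "Share" else 0
    cost = COST if my_action == "Share" else 0
    return gain - cost

def best_response(other_action: str) -> str:
    return max(ACTIONS, key=lambda a: payoff(a, other_action))

# A profile is a Nash equilibrium if both actions are mutual best responses.
equilibria = [(a1, a2) for a1 in ACTIONS for a2 in ACTIONS
              if a1 == best_response(a2) and a2 == best_response(a1)]
print("Nash equilibria        :", equilibria)                     # [('Withhold', 'Withhold')]
print("payoff at equilibrium  :", payoff("Withhold", "Withhold")) # 0
print("payoff if both shared  :", payoff("Share", "Share"))       # 3 each
```

The membership policy proposed in the study can be read as an instrument that reshapes such payoffs (through an insurance option and transfer payments) so that mutual sharing becomes the equilibrium outcome.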
Article
This study examines the influence of the tertius iungens orientation on knowledge-sharing activities and individual job performance within enterprise social media environments. The empirical analysis reveals that knowledge self-efficacy, social interaction ties, and the norm of reciprocity positively influence the tertius iungens orientation and knowledge-sharing activities in social media, while enjoyment of helping does not have a significant influence. In addition, the tertius iungens orientation has a significant impact on knowledge-sharing activities in social media, which in turn influences individual job performance. Based on the results of this analysis, this study discusses the research findings and proposes theoretical and practical implications of the study as well as the research's limitations.
Book
This collection of essays comprises some of Rudolf Richter’s important contributions to research on New Institutional Economics (NIE). It deals with the central idea, principles, and methodology of New Institutional Economics and explores its relation to sociology and law. Other chapters examine applications of NIE to various microeconomic and macroeconomic issues in the face of uncertainty, from entrepreneurship to the euro crisis.
Article
Maintaining adequate cybersecurity is crucial for a firm to maintain the integrity of its external and internal financial reports, as well as to protect the firm’s strategic proprietary information. This paper demonstrates how information sharing could encourage firms to take a more proactive, as compared to a reactive, approach toward cybersecurity investments. In particular, information sharing could reduce the tendency by firms to defer cybersecurity investments. The basic argument presented in this paper is grounded in the real options perspective of cybersecurity investments. More to the point, the value of an option to defer an investment in cybersecurity activities increases as the uncertainty associated with the investment increases. To the extent that information sharing reduces a firm’s uncertainty concerning a cybersecurity investment, it decreases the value of the deferment option associated with the investment. As a result of this decrease in the deferment option value, it may well make economic sense for the firm to make the cybersecurity investment sooner than otherwise would be the case.
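The deferment argument can be seen in a stylized two-period example: with wide uncertainty about the avoided loss, waiting has value; once information sharing narrows that uncertainty around the same mean, the deferment option loses its value and investing now becomes optimal. The two-point uncertainty and all numbers are hypothetical.

```python
# Real-options toy: invest now vs. defer one period and invest only if worthwhile.
def invest_now(b_high, b_low, p, cost):
    return p * b_high + (1 - p) * b_low - cost

def defer(b_high, b_low, p, cost, discount=0.95):
    # After waiting, the firm invests only in the state where it pays off.
    return discount * (p * max(b_high - cost, 0) + (1 - p) * max(b_low - cost, 0))

COST, P = 100, 0.5
for label, (hi, lo) in {"no sharing (high uncertainty)": (260, 20),
                        "with sharing (low uncertainty)": (170, 110)}.items():
    now, wait = invest_now(hi, lo, P, COST), defer(hi, lo, P, COST)
    print(f"{label:32s} invest now: {now:6.1f}  defer: {wait:6.1f}  "
          f"deferment option value: {max(wait - now, 0):5.1f}")
```

Both scenarios have the same expected avoided loss; only the spread differs, which is exactly the uncertainty that information sharing is argued to reduce.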
Article
Since the introduction of the TAM by [1], user attitudes have been considered an important determinant of IS adoption. Their impact on behavior is rooted in behavioral frameworks such as the theory of reasoned action [2] and the theory of planned behavior [3]. However, in IS adoption models, the predictive power of attitudes for behavior as an endogenous variable has proven strong in some circumstances and weak in others [4] and [5]. These inconsistencies resulted in an exclusion of the construct attitude from further modified versions of the TAM and other models explaining individual IS adoption and use. To better understand the prevalence of inconsistent attitude-behavior relationships in IS adoption models, this research follows social psychological research practices focusing on situational factors prone to result in insignificant attitude-behavior relationships. To test for the impact of these factors on the attitude-behavior relationship, the article raises the level of analysis to a higher level of abstraction. A scientometric literature review and a meta-analysis of fourteen top IS journals from 1989 to 2014 find that three out of four situational factors positively influence the attitude-behavior relationship in the IS field: voluntariness, technology type and adoption context. These factors constitute “the attitude cube”, which provides conceptual guidance for researchers in assessing the conditions under which the attitude-behavior relationship is likely to be weak or strong.
Article
Prior studies on knowledge-sharing motivations mostly concentrate on discussing motivation in terms of level or amount, and thus, discussions regarding the quality of motivations, in terms of their levels of autonomy, are scarce. Additionally, while researchers have addressed the significant relationships among different types of motivations, there is still controversy concerning these relationships in a knowledge-sharing context. With reference to self-determination theory, this study examines a model that depicts the influence of various types of motivations on employees’ knowledge sharing behaviors (KSBs). Based on the data collected from 259 employees in 34 organizations, hard reward, soft reward, and altruism for organizational benefits are significant influencing factors of KSBs, while altruism for personal satisfaction is not. Additionally, soft reward has a significant positive effect on both altruism for organizational benefits and altruism for personal satisfaction. The theoretical and practical implications and suggestions for future research are discussed.
Article
Purpose: The purpose of this paper is to investigate the factors that affect knowledge sharing in a public sector organization.
Design/methodology/approach: The paper is based on quantitative research. The data were gathered through questionnaires and analyzed using multiple regression.
Findings: Community-related considerations, normative considerations and personal benefits were three motivators found to have a unique contribution to the variance in knowledge sharing. The following enablers had a significant main effect on knowledge sharing: social interaction, rewards, and organizational support. Two barriers, degree of courage and degree of empathy, which measured organizational climate, were found to have a significant main effect on knowledge sharing. The interaction of normative considerations with social interaction, personal benefit with organizational support, and normative considerations with degree of courage had a moderating effect on the relationship between motivating factors and knowledge sharing.
Research limitations/implications: The study was conducted in a single public sector organization, which limits the generalizability of the findings to other settings. Another limitation is that attitudes toward knowledge sharing, and knowledge-sharing behaviors, vary across cultures. Finally, self-reported data are subject to response bias.
Practical implications: Identifying factors that influence knowledge sharing could help practitioners create a knowledge-sharing culture that is needed to support knowledge sharing and knowledge management within public sector organizations.
Originality/value: This empirical study will contribute to the theoretical knowledge on knowledge sharing in the public sector, which has been neglected in knowledge-sharing research.
Conference Paper
Communities, and the critical infrastructure that they rely upon, are becoming ever increasingly integrated into cyberspace. At the same time, communities are experiencing increasing activity and sophistication from a variety of threat agents. The effect of cyber attacks on communities has been observed, and the frequency and devastation of these attacks can only increase in the foreseeable future. Early detection of these attacks is critical for a fast and effective response. We propose detecting community cyber incidents by comparing indicators from community members across space and time. Performing spatiotemporal differentiation on these indicators requires that community members, such as private and governmental organizations, share information about these indicators. However, community members are, for good reasons, reluctant to share sensitive security related information. Additionally, sharing large amounts of information with a trusted, centralized location introduces scalability and reliability problems. In this paper we define the information sharing requirements necessary for fast, effective community cyber incident detection and response, while addressing both privacy and scalability concerns. Furthermore, we introduce a framework to meet these requirements, and analyze a proof of concept implementation.
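A minimal sketch of the spatiotemporal-differentiation idea follows: an indicator reported by several distinct community members within the same time window is flagged as a possible community-level incident. Member names, timestamps, and thresholds are invented, and real systems would of course add the privacy and scalability safeguards the paper discusses.

```python
# Flag indicators seen by several distinct members within the same time bucket.
from collections import defaultdict

WINDOW = 3600          # seconds: one-hour buckets
MIN_MEMBERS = 3        # how many distinct members must see the indicator

# (member, unix timestamp, indicator) -- e.g. a suspicious IP address
reports = [
    ("utility-A", 1_700_000_100, "203.0.113.9"),
    ("bank-B",    1_700_000_900, "203.0.113.9"),
    ("telecom-C", 1_700_002_500, "203.0.113.9"),
    ("bank-B",    1_700_001_200, "198.51.100.4"),
    ("utility-A", 1_700_050_000, "203.0.113.9"),   # same IP, much later
]

sightings = defaultdict(set)                       # (bucket, indicator) -> members
for member, ts, indicator in reports:
    sightings[(ts // WINDOW, indicator)].add(member)

alerts = [(bucket, ind, members) for (bucket, ind), members in sightings.items()
          if len(members) >= MIN_MEMBERS]
print("community-level alerts:", alerts)
```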
Conference Paper
Information and Communication Technologies are increasingly intertwined across the economies and societies of developed countries. Protecting these technologies from cyberthreats requires collaborative relationships for exchanging cyber defense data and an ability to establish trusted relationships. The fact that Communication and Information Systems (CIS) security is an international issue increases the complexity of these relationships. Cyber defense collaboration presents specific challenges since most entities would like to share cyber-related data but lack a successful model to do so. We will explore four aspects of cyber defense collaboration to identify approaches for improving cyber defense information sharing. First, incentives and barriers for information sharing, which includes the type of information that may be of interest to share and the motivations that cause social networks to be used or stagnate. Second, collaborative risk management and information value perception. This includes risk management approaches that have built-in mechanisms for sharing and receiving information, increasing transparency, and improving entity peering relationships. Third, we explore procedural models for improving data exchange, with a focus on inter-governmental collaborative challenges. Fourth, we explore automation of sharing mechanisms for commonly shared cyber defense data (e.g., vulnerabilities, threat actors, black/white lists). In order to reach a common understanding of terminology in this paper, we leverage the NATO CIS Security Capability Breakdown [19], published in November 2011, which is designed to identify and describe CIS security and cyber defense terminology and definitions to facilitate NATO, national, and multi-national discussion, coordination, and capability development.
Article
A proposed theory of planned behavior, an extension of Ajzen and Fishbein's (1980, Understanding attitudes and predicting social behavior. Englewood-Cliffs, NJ: Prentice-Hall) theory of reasoned action, was tested in two experiments. The extended theory incorporates perceived control over behavioral achievement as a determinant of intention (Version 1) as well as behavior (Version 2). In Experiment 1, college students' attendance of class lectures was recorded over a 6-week period; in Experiment 2, the behavioral goal was getting an “A” in a course. Attitudes, subjective norms, perceived behavioral control, and intentions were assessed halfway through the period of observation in the first experiment, and at two points in time in the second experiment. The results were evaluated by means of hierarchical regression analyses. As expected, the theory of planned behavior permitted more accurate prediction of intentions and goal attainment than did the theory of reasoned action. In both experiments, perceived behavioral control added significantly to the prediction of intentions. Its contribution to the prediction of behavior was significant in the second wave of Experiment 2, at which time the students' perceptions of behavioral control had become quite accurate. Contrary to expectations, there was little evidence for interactions between perceived behavioral control and the theory's other independent variables.
Article
Positive and negative selective incentives are shown analytically to have different structural implications when used to induce collective action. Positive selective incentives are effective for motivating small numbers of cooperators and generate pressures toward smaller, more "elite" actions, unless the incentives have jointness of supply. Negative selective incentives are effective for motivating unanimous cooperation, but their use is often uneven and cyclical and may generate hostilities which disrupt the cooperation they enforce. Examples of these dynamics are found in many arenas of collective action and social movements. One important feature of collective action is the use of selective incentives to reward those who cooperate in the action or punish those who do not. An arts fund may reward contributors by giving a lavish party or by printing their names in a program. Workers ensure cooperation with a strike by threatening to ostracize or beat up strikebreakers. In the 1960s, famous folksingers rewarded antiwar demonstrators by singing at protest rallies. In the 1970s, Louisville antibusing protesters threatened violence against other whites to induce them to keep their children out of school. This paper considers relations among potential cooperators, not their relations with any "enemy." It discusses the processes that arise when actors reward and punish each other to motivate or sustain cooperation in some form of collective action. The first half of the paper provides a formal analysis which reviews the work of Mancur Olson and his critics, formalizes the decision to participate in collective action, and then formalizes and examines the decision to use a resource as a selective incentive to induce others to act collectively. The second half of the paper draws out the implications of this analysis. The most important implication is the difference between rewards and punishments when they are used as selective incentives.