Computers & Security

Published by Elsevier
Online ISSN: 0167-4048
Conference Paper
Traditional security and access control systems, such as MLS/Bell-LaPadula and RBAC, are rigid and contain no automatic mechanism through which a system can increase or decrease users' access to classified information. In this paper, we therefore propose a risk-based decision method for an access control system. First, we dynamically calculate trust and risk values for each subject-object pair. Both values are adaptive, reflecting each user's past behavior with particular objects. Past behavior is evaluated from a history of reward and penalty points, assigned by the system after the completion of every transaction. Second, an access decision is made based on the trust and risk values.
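The reward/penalty scheme summarized above can be sketched as follows. This is an illustrative toy, not the paper's exact formulas: the trust measure, the risk function, and the 0.5 default and threshold are assumptions chosen for the example.

```python
# Illustrative sketch of risk-based access control: each subject-object
# pair carries a history of reward (+1) and penalty (-1) points assigned
# after every transaction, from which trust and risk are derived.

class RiskBasedAccessControl:
    def __init__(self, risk_threshold=0.5):
        self.history = {}              # (subject, object) -> list of +1/-1 points
        self.risk_threshold = risk_threshold

    def record_outcome(self, subject, obj, reward):
        """After a transaction completes, assign a reward (+1) or penalty (-1)
        point to the subject-object pair."""
        self.history.setdefault((subject, obj), []).append(1 if reward else -1)

    def trust(self, subject, obj):
        """Adaptive trust in [0, 1]: fraction of rewarded transactions,
        defaulting to 0.5 when there is no history yet."""
        points = self.history.get((subject, obj), [])
        if not points:
            return 0.5
        return points.count(1) / len(points)

    def decide(self, subject, obj, sensitivity):
        """Grant access when the risk (object sensitivity discounted by
        trust) stays below the configured threshold."""
        risk = sensitivity * (1.0 - self.trust(subject, obj))
        return risk < self.risk_threshold
```

A subject with a clean history thus gradually earns access to more sensitive objects, while accumulated penalties shrink its reach.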
 
Article
Linked together, organisations can exchange information and engage in transactions in ways unanticipated before, the emphasis being on information, which has become core to most business activities and without which business would fail to operate [Owens S. Information security management: an introduction. London: British Standards Institution; 1998. pp. 1–2]. Consequently, to help ensure business continuity, the protection of information resources had to be pursued. Risk analysis was traditionally used to analyse risks posing a threat mostly to IT assets [Jung C, Han I, Suh B. Risk analysis for electronic commerce using case-based reasoning. International Journal of Intelligent Systems in Accounting, Finance & Management 1999;8:61–73. John Wiley & Sons, Ltd., p. 62], resulting in recommendations to implement appropriate security measures that reduce the identified high-priority risks to an acceptable level. However, Bandyopadhyay et al. [Bandyopadhyay K, Mykytyn PP, Mykytyn K. A framework for integrated risk management in information technology. Management Decision 1999;37(5):437–44. MCB Press, p. 440] state that evaluating risk related to IT alone is unrealistic. A holistic view of assessing risks should instead be adopted, moving away from the isolated and partial view of today's “closed world assumption” (searching only within a specific domain to evaluate the risks associated with IT) to consider the entire spectrum of the IT environment. Thus an alternative approach to risk analysis may have to be developed to assist in analysing risks to information-specific resources.
 
Article
In our last column we discussed the evidence acquisition process, stressing the chain of evidence and the importance of following a documented method. These two issues are important to ensure that the evidence captured will be acceptable in a court of law, provided that it is relevant to the case at hand.
 
Article
This paper investigates the co-existence of and complementary use of COBIT and ISO 17799 as reference frameworks for Information Security governance. The investigation is based on a mapping between COBIT and ISO 17799 which became available in 2004, and provides a level of ‘synchronization’ between these two frameworks.
 
Article
This article will consider the social and ethical factors involved in the transmission of computer viruses and other malicious software. In addition to the people, we will consider the part that systems and technology play in the spread of this sort of data. We will draw parallels with one of the better-known scientific paradigms, the medical one, and note the similarities with the problems we now face. We will describe the evolution of methods of virus distribution: virus exchange bulletin boards, virus exchange networks, distribution sites, robot servers, and books. The article will discuss viruses for sale and make some comparisons between the distribution of computer viruses and the distribution methods of ‘hacking tools’. Other issues examined in this article include the characteristics of individuals involved in the distribution of these types of programmes, and problems of legal redress, as well as possible solutions based on ethics and ethical theory.
 
Article
Two-dimensional (2D) barcode symbology is an emerging technology for compactly storing and retrieving information. These barcodes can be found on the back of drivers' licenses, encoded with secure text data. Standard 2D barcodes such as PDF417 use upper- and lowercase letters, numeric digits and special characters for encoding. Some barcodes also include a compressed photo of the individual; the visual quality of the compressed image is usually poor, and it occupies a large amount of space, greatly reducing the capacity available for encoding text. This paper presents a novel approach for embedding uncompressed images in a standard PDF417 2D barcode using a blind digital watermarking technique. The text is encoded in the standard PDF417 format with error correction, while the face and fingerprint images are watermarked into the encoded 2D barcode. Experimental results show that the proposed technique effectively increases the standard capacity of the PDF417 2D barcode without altering the contents of the encoded data. The results also show that the visual quality of the extracted photo image is high, and that the extracted fingerprint image, when compared with the original fingerprint using an AFIS system, yields a high matching score.
 
Article
In many cryptographic protocols, double-exponentiation is a key arithmetic operation. In this study, we present a multiplexer-based algorithm for double-exponentiation in GF(2m). The proposed algorithm utilizes the concept of the modified Booth's algorithm, and multiplexers are employed for its implementation. The proposed double-exponentiation algorithm requires only m multiplications and reduces time complexity by about 66% compared with the ordinary binary method.
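The savings over two separate exponentiations come from sharing one pass over the exponent bits. As a rough illustration of that idea (not the paper's multiplexer-based GF(2m) construction), here is the classic simultaneous method, sketched over integers modulo m: one squaring per bit position, plus at most one multiplication selected by the current bit pair.

```python
def double_exp(a, b, x, y, m):
    """Compute (a**x * b**y) mod m with a single left-to-right pass over
    the bits of x and y (simultaneous exponentiation), instead of two
    separate square-and-multiply exponentiations."""
    ab = (a * b) % m                     # precomputed product for the '11' bit pair
    result = 1
    for i in range(max(x.bit_length(), y.bit_length()) - 1, -1, -1):
        result = (result * result) % m   # one squaring per bit position
        xi, yi = (x >> i) & 1, (y >> i) & 1
        if xi and yi:
            result = (result * ab) % m   # both bits set: multiply by a*b
        elif xi:
            result = (result * a) % m
        elif yi:
            result = (result * b) % m
    return result
```

The hardware algorithm in the paper differs (Booth recoding, multiplexer selection, GF(2m) arithmetic), but the operation count intuition is the same: the two exponentiations cost little more than one.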
 
Article
This paper proposes a protocol (IDM3G) for implementing identity management for Internet applications over 3G mobile networks. IDM3G combines the identity management principles of the Liberty Alliance specifications, elements of OASIS's SAML, and the 3GPP UMTS security specifications, targeting a more effective and lightweight identity management solution than existing ones. Instead of establishing new authentication and authorization mechanisms, IDM3G utilizes the latest security features of 3G mobile networks to implement trust relationships, focusing on mutual authentication and authorization while avoiding submission of the user identity itself.
 
Article
IEEE 802.11 wireless LAN has become one of the hot topics in the design and development of network access technologies. In particular, its authentication and key exchange (AKE) aspects, which form a vital building block for modern security mechanisms, deserve further investigation. In this paper we first identify the general requirements for WLAN authentication and key exchange (AKE) methods, and classify them into three levels (mandatory, recommended, and additional operational requirements). We present a review of issues and proposed solutions for AKE in 802.11 WLANs. Three types of existing methods for addressing AKE issues are identified, namely the legacy, layered and access control-based AKE methods. We then compare these methods against the identified requirements. Based on the analysis, a multi-layer AKE framework is proposed, together with a set of design guidelines, aiming at flexible, extensible and efficient security as well as easy deployment.
 
Article
The unification of Europe at the end of 1992 will create information security challenges. The Single European Market will serve as a major landmark for the restructuring of Continent-wide institutions and services, and there will be major changes in European finance, governance, and technology. Unification decisions will lessen existing controls and restrictions over financial processing as well as create new conditions where controls and restrictions have not been anticipated. In either case, computer crimes may well increase as a consequence of the Single European Market. This article will discuss the impact of Unification '92 on the protection of information. While much attention has been paid by security experts to the development of the information technology security evaluation criteria (ITSEC), other important but less direct decisions are being made that are related to information security. The Unification '92 decisions outlined here could turn out to be of great importance in determining the nature of information protection in the post-1992 era. Many EC directives and decisions, some of which are not specifically treated or labeled as information security, will substantially affect what protections will be possible as well as necessary. The major categories of these directives and decisions are: (1) technical decisions on the use of computer and communications technologies; (2) legal decisions on how financial errors, crimes, and disagreements are defined and how they will be resolved; and (3) political and public policy decisions on the general economy and the regulatory constraints applied to financial services operations, services, and products. Illustrations of these categories will be drawn from important EC decisions, including decisions on stimulating European information services, auditing standards and requirements, money-laundering controls, open borders, and electronic data interchange (EDI).
 
Article
We describe our approach to multilevel database security and show that we can support element-level labeling in a Class A1 database system without the need to verify the entire database system, or even most of it. We achieve both the high degree of assurance required for Class A1 and the flexibility of element-level labeling by layering the TCB, where the lowest TCB layer is a reference monitor enforcing mandatory security; and by decomposing multilevel relations into single-level relations that are managed by the reference monitor. This decomposition means that multilevel relations are actually views over single-level base relations, which suggests that our multilevel relational system could be implemented on a standard (untrusted) relational system running on a reference monitor.
 
Article
In this article, we argue that traditional approaches for authorization and access control in computer systems (i.e., discretionary, mandatory, and role-based access controls) are not appropriate to address the requirements of networked or distributed systems, and that proper authorization and access control requires infrastructural support in one way or another. This support can be provided, for example, by an authentication and authorization infrastructure (AAI). Against this background, we overview, analyze, discuss, and put into perspective some technologies that can be used to build and operate AAIs. More specifically, we address Microsoft .NET Passport and some related activities (e.g. the Liberty Alliance Project), Kerberos-based solutions, and AAIs that are based on digital certificates and public key infrastructures (PKIs). We conclude with the observation that there is no single best approach for providing an AAI, that every approach has specific advantages and disadvantages, and that a comprehensive AAI must combine various technologies and approaches.
 
Article
Law on cracking security codes toughened, Amy Harmon. The US Copyright Office has passed a new law making it illegal to break the security methods in place to prevent the copying of digital music, books and movies. The law will update the existing 1998 Digital Millennium Copyright Act and will come into effect immediately. The decision was opposed by groups such as universities, programmers and libraries, who argue that copyrighted work should be archived and lent out. Programmers also claimed that reverse-engineering is a valid technique for learning how technology works. The Association of American Universities has stated that in order to maintain civil liberties, there must be broad exemptions to allow for “fair use” of purchased copyrighted digital goods. The Toronto Star, 30 October, E3.
 
Article
America's romance with the Internet continues unabated, as more corporations turn to it as a vehicle for their daily business. However, the Internet has also spawned a proliferation of employee abuses. Regardless of one's position on workplace monitoring and the Internet, two key considerations have emerged: employee abuses of cyberspace can expose their companies to costly litigation, and management needs to play a proactive role in addressing the problem. Even the best Internet security policies and measures are not sufficient to eliminate all potential legal exposure connected to the Internet. Nevertheless, it is crucial that employers both recognize the problem and make efforts to address it.
 
Conference Paper
The ability to monetize domain names through resale or serving ad content has contributed to the rise of questionable practices in acquiring them, including domain-name speculation, tasting, and front running. In this paper, we perform one of the first comprehensive studies of these domain registration practices. In order to characterize the prevalence of domain-name speculation, we derive rules describing “hot” topics from popular Google search queries and apply these rules to a dataset containing all .com registrations for an eight-month period in 2008. We also study the extent of domain tasting throughout this time period and analyze the efficacy of ICANN policies intended to limit tasting activity. Finally, we automatically generate high-quality domain names related to current events in order to measure domain front running by registrars. The results of our experiments shed light on the methods and motivations behind these domain registration practices and in some cases underscore the difficulty in definitively measuring these questionable behaviors.
 
Article
Security awareness is important to reduce human error, theft, fraud, and misuse of computer assets. A strong ICT security culture cannot develop and grow in a company without awareness programmes. This paper focuses on ICT security awareness and how to identify key areas of concern to address in ICT security awareness programmes by making use of the value-focused approach. The result of this approach is a network of objectives where the fundamental objectives are the key areas of concern that can be used in decision making in security planning. The fundamental objectives were found to be in line with the acknowledged goals of ICT security, e.g. confidentiality, integrity and availability. Other objectives that emerged were more on the social and management side, e.g. responsibility for actions and effective use of resources.
 
Article
The present paper examines the perceived acceptability of biometric security systems by a sample of banking and university staff. Results from 76 respondents indicated that all biometric systems were perceived as less acceptable than the traditional password approach. Contrary to expectations, it was found that behaviourally based biometric systems were perceived as less acceptable than physiologically based systems. The acceptability of several methods increased as sensitivity of information increased. Conversely, for the password method a negative relationship between acceptability and sensitivity was found. Results are discussed in relation to the potential for some behaviourally based biometric systems to be perceived as capable of electronic performance monitoring (EPM). The need for more research addressing perceptions of security systems amongst employees is highlighted.
 
Article
Access control is the granting and enforcement of privileges to access information in a system. The design of access control for knowledge base systems is based on access control concepts found in database systems. Additional characteristics found in knowledge base systems include object-oriented features as well as temporal information and predicative assertion languages to describe rules and constraints. Of particular complexity are the implications of access control for inherited objects, rules and constraints. The Group Security model provides users with discretionary access control to objects in a knowledge base system according to their requirements for a task. Further benefits and opportunities are achieved as a result of implementing access control in knowledge base systems.
 
Article
A fundamental requirement for the healthcare industry is that the delivery of care comes first and nothing should interfere with it. As a consequence, the access control mechanisms used in healthcare to regulate and restrict the disclosure of data are often bypassed in case of emergencies. This phenomenon, called “break the glass”, is a common pattern in healthcare organizations and, though quite useful and mandatory in emergency situations, represents a serious system weakness from a security perspective. Malicious users, in fact, can abuse the system by exploiting the break-the-glass principle to gain unauthorized privileges and accesses. In this paper, we propose an access control solution aimed at better regulating break-the-glass exceptions that occur in healthcare systems. Our solution is based on the definition of different policy spaces, a language, and a composition algebra to regulate access to patient data and to balance the rigorous nature of traditional access control systems with the “delivery of care comes first” principle.
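The core of the break-the-glass pattern can be sketched in a few lines. This is a hypothetical simplification, not the paper's policy-space algebra: the regular policy, the record layout, and the audit mechanism here are invented for illustration.

```python
# Sketch of break-the-glass: alongside the regular policy, an emergency
# path grants otherwise-denied access, but only with an explicit override
# that is logged so abuse can be detected after the fact.

audit_log = []

def regular_policy(user, patient_record):
    # Placeholder regular policy: only members of the care team may read.
    return user in patient_record["care_team"]

def access(user, patient_record, break_glass=False):
    if regular_policy(user, patient_record):
        return True
    if break_glass:
        # Exception space: access is granted, and the override is recorded
        # for later auditing and accountability.
        audit_log.append((user, patient_record["id"], "BREAK-THE-GLASS"))
        return True
    return False
```

The paper's contribution is precisely the layer this sketch omits: a language and composition algebra for deciding which exceptions are legitimate, rather than a single boolean override.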
 
Article
Access controls are difficult to implement and evidently deficient under certain conditions. Traditional controls offer no protection for unclassified information, such as a telephone list of employees that is unrestricted, yet available only to members of the company. On the opposing side of the continuum, organizations such as hospitals that manage highly sensitive information require stricter access control measures. Yet, traditional access control may well have inadvertent consequences in such a context. Often, in unpredictable circumstances, users that are denied access could have prevented a calamity had they been allowed access. It has been proposed that controls such as auditing and accountability policies be enforced to deter rather than prevent unauthorized usage. In dynamic environments preconfigured access control policies may change dramatically depending on the context. Moreover, the cost of implementing and maintaining complex preconfigured access control policies sometimes far outweighs the benefits. This paper considers an adaptation of usage control as a proactive means of deterrence control to protect information that cannot be adequately or reasonably protected by access control.
 
Article
Automated web tools are used to achieve a wide range of different tasks, some of which are legal activities, whilst others are considered attacks on the security and data integrity of online services. Effective solutions to counter the threat represented by such programs are therefore required. In this work, we present MosaHIP, a Mosaic-based Human Interactive Proof (HIP), which is able to prevent massive automated access to web resources. Properties of the proposed solution grant improved security over the usual text-based and image-based HIPs, whereas the user-friendliness of the system spares the user the discomfort of typing any text before accessing web content. Experimental evidence of the effectiveness of the proposed technique is given by submitting our system to a series of tests simulating possible bot attacks.
 
Article
XPath is a standard for specifying parts of XML documents and a suitable language for both query processing and access control of XML. In this paper, we use XPath expressions to represent both user queries and access control rules for XML, and we propose an access-control method in which accesses to XML documents are controlled by filtering query XPath expressions through access-control XPath expressions. For filtering the access-denied parts out of query XPath expressions, set operations (such as intersection and difference) between the XPath expressions are essential. However, the containment problem for two XPath expressions is known to be coNP-hard when the expressions contain predicates (branches), wildcards and descendant axes. To solve the problem, we directly search the XACT (XML Access Control Tree) for a query XPath expression and extract the access-granted parts. The XACT is our proposed structure, whose edges are a structural summary of XML elements and whose nodes contain access-control information. We show that query XPath expressions are successfully filtered through the XACT by our proposed method, and also show the performance improvement by comparing the proposed method with previous work.
 
Article
Workflow Systems are increasingly being used to streamline organizations’ business processes. During the execution of business processes, information often traverses organizations’ networks as documents. With the proliferation of the Internet, documents travel across open networks. These documents can, however, contain potentially sensitive information. The documents used in Workflow Systems must therefore be protected from unauthorized access. This paper enumerates three access control requirements of workflow environments, including the well-known principle of separation of duty. Thereafter the CSAC (Context-sensitive Access Control) model is presented to address these requirements. In conclusion, it is demonstrated how this model can be implemented in an agent-based architecture.
 
Article
Role-based access control (RBAC) is a widely used access control paradigm. In large organizations, the RBAC policy is managed by multiple administrators. An administrative role-based access control (ARBAC) policy specifies how each administrator may change the RBAC policy. It is often difficult to fully understand the effect of an ARBAC policy by simple inspection, because sequences of changes by different administrators may interact in unexpected ways. ARBAC policy analysis algorithms can help by answering questions such as user-role reachability, which asks whether a given user can be assigned to given roles by given administrators. Allowing roles and permissions to have parameters significantly enhances the scalability, flexibility, and expressiveness of ARBAC policies. This paper defines PARBAC, which extends the classic ARBAC97 model to support parameters, proves that user-role reachability analysis for PARBAC is undecidable when parameters may range over infinite types, and presents a semi-decision procedure for reachability analysis of PARBAC. To the best of our knowledge, this is the first analysis algorithm specifically for parameterized ARBAC policies. We evaluate its efficiency by analyzing its parameterized complexity and benchmarking it on case studies and synthetic policies. We also experimentally evaluate the effectiveness of several optimizations.
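To make the user-role reachability question concrete, here is a heavily simplified, unparameterized sketch (assumed for illustration; it ignores revocation, negative preconditions, and the parameters that make PARBAC analysis undecidable): a search over the role sets an administrator can drive a user into via can_assign rules.

```python
# Toy user-role reachability: can_assign is a list of
# (precondition role set, assignable role) pairs; a role may be added to a
# user's role set once all its preconditions hold. We search the reachable
# role sets for one containing the goal role.

def reachable(initial_roles, can_assign, goal):
    frontier = [frozenset(initial_roles)]
    seen = set(frontier)
    while frontier:
        roles = frontier.pop()
        if goal in roles:
            return True
        for pre, role in can_assign:
            if role not in roles and pre <= roles:   # preconditions satisfied
                nxt = roles | {role}
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return False
```

Even this toy shows why inspection is hard: the goal may only be reachable through chains of intermediate assignments by different administrators. With parameters over infinite types, the state space above becomes infinite, which is the source of the undecidability result.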
 
Article
Recently, Wu and Chang, and Shen and Chen, separately proposed cryptographic key assignment schemes, based on polynomial interpolation, for solving the access control problem in a partially ordered user hierarchy. However, this paper shows the security leaks inherent in both schemes: users can gain access to the information items held by others without following the predefined partial order. Finally, we propose two improvements to eliminate these security flaws.
 
Article
The universal adoption of the Internet and the emerging web services technologies constitutes the infrastructure that enables the provision of a new generation of e-services and applications. However, the provision of e-services through the Internet imposes increased risks, since it exposes data and sensitive information outside the client premises. Thus, an advanced security mechanism has to be incorporated in order to protect this information against unauthorized access. In this paper, we present a context-aware access control architecture to support fine-grained authorizations for the provision of e-services, based on an end-to-end web services infrastructure. Access permissions to distributed web services are controlled through an intermediary server, in a way completely transparent to both clients and protected resources. The access control mechanism is based on a Role-Based Access Control (RBAC) model, which incorporates dynamic context information in the form of context constraints. Context is dynamically updated and provides a high level of abstraction of the physical environment by using the concepts of simple and composite context conditions. The paper also deals with implementation issues and presents a system that incorporates the proposed access control mechanism in a web services infrastructure conforming to the OPC XML-DA specification.
 
Article
A rogue access point (AP) is an unauthorized AP plugged into a network, and it poses a serious security threat. To detect such an AP, a network manager traditionally carries a wireless signal sensor across the entire protected area, a task that is labor-intensive and inefficient. This study presents a new AP detection method that requires neither extra hardware nor manual sweeps. The method determines whether the network packets of an IP address are routed through an AP according to the client-side bottleneck bandwidth. The network manager can perform the job from the office by monitoring the packets passing through the core switch. According to experimental results, accuracy remains above 99% when the algorithm's sliding-window parameter is larger than 20. The proposed method effectively reduces the network manager's workload and increases network security.
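The sliding-window classification step can be sketched as follows. This is an assumed illustration, not the paper's algorithm: the use of the median as the window statistic and the 54 Mbps wired/wireless threshold are placeholders chosen for the example (only the 20-sample window size comes from the abstract).

```python
from statistics import median

# Toy rogue-AP detector: flag an IP as behind a wireless AP when the
# median client-side bottleneck bandwidth in some full sliding window of
# measurements falls below a wired/wireless threshold.

def is_wireless(bandwidth_samples, window=20, threshold_mbps=54.0):
    if len(bandwidth_samples) < window:
        return False  # not enough evidence yet; stay undecided
    for i in range(len(bandwidth_samples) - window + 1):
        if median(bandwidth_samples[i:i + window]) < threshold_mbps:
            return True
    return False
```

The appeal of the approach is visible even in the toy: all measurements can be taken passively at the core switch, so no one has to walk the building with a sensor.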
 
Article
The popularity of online social networks makes the protection of users' private information an important but scientifically challenging problem. In the literature, relationship-based access control schemes have been proposed to address this problem. However, with the dynamic developments of social networks, we identify new access control requirements which cannot be fully captured by the current schemes. In this paper, we focus on public information in social networks and treat it as a new dimension which users can use to regulate access to their resources. We define a new social network model containing users and their relationships as well as public information. Based on the model, we introduce a variant of hybrid logic for formulating access control policies. In addition, we exploit a type of category relations among public information to further improve our logic for its usage in practice.
 
Article
In modern society, where a welfare recipient signs up for benefits under six identities, a child is released to a stranger from a day care centre, a hacker accesses sensitive databases, and a counterfeiter makes copies of bank cards, a trustworthy means of identifying employees or customers is urgently in demand. Biometrics is emerging as the most foolproof method of automated personal identification in today's highly computer dependent world. Biometric systems are automated methods of verifying or recognizing the identity of a living person on the basis of some physiological characteristics or some behavioural aspects. No one biometric system is likely to dominate the marketplace since all have limitations that compromise possible system solutions. The trade-offs in developing such systems invoke hardware cost, reliability, discomfort in using a device, the amount of data needed, and other factors. This article looks at different types of biometrics systems and discusses in detail the various security implications associated with their application.
 
Article
The problem of how to control access in a user hierarchy by cryptographic means was explored recently. However, this research focuses on centralized environments. This paper presents a new distributed approach to assigning cryptographic keys that enables a member of the organization to derive, from his own key, the keys of members below him in the hierarchy, and consequently to access information enciphered under those keys. The proposed scheme also retains the ability to add new security classes into a partially ordered hierarchy without affecting existing keys.
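The derive-from-your-own-key property can be illustrated with a simple hash chain. This hypothetical sketch is not the paper's construction (and a plain hash chain handles only tree-shaped hierarchies, not general partial orders); it only shows how a superior derives subordinates' keys without any key ever being transmitted downward.

```python
import hashlib

# Hypothetical top-down key derivation: a member holding the key of a
# class derives a child class's key from its own key and the child's
# public identifier. Anyone below cannot invert the hash to go upward.

def derive_key(parent_key: bytes, child_id: str) -> bytes:
    return hashlib.sha256(parent_key + child_id.encode()).digest()

def key_for_path(holder_key: bytes, path):
    """Walk from the holder's class down to the target class,
    deriving one key per hierarchy level."""
    key = holder_key
    for node in path:
        key = derive_key(key, node)
    return key
```

Because derivation uses only the holder's own key and public identifiers, the scheme is naturally distributed: no central authority needs to hand out subordinate keys.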
 
Article
The process of improving network security begins with documenting the network configuration and its components. Network management must know who the system users are, and what they are accessing, when, where, how, and why. Once management has this information, steps can be taken to ensure that only authorized users are accessing the network. To manage a successful telecommunications environment, there must be a balance between cost, security, risk, performance, and connectivity. The controls discussed here, when applied in the proper combination, can help to achieve this balance. A company must already have sound business practices such as adequate separation of duties, management approvals, and a development methodology. The controls contained here focus primarily on data communications and are environment-independent. Specific hardware, software products, and services are mentioned as examples and should not be considered an endorsement. This article is based upon a white paper developed and published by the members of NASIG, the Network Access Special Interest Group, organized by the San Francisco Bay Area Chapter of the Information Systems Security Association (ISSA), Inc.
 
Article
Computer security is a balance between protecting information and enabling authorized access. Tightening security by making systems more inaccessible can hinder employees and make them less productive. It can also result in lower security as workers struggle to find ways around the security conditions to enable them to do their jobs. This study analyzes an information systems user survey to evaluate the tradeoffs between protection and accessibility. Over one-third of the respondents report problems with interference from security provisions. A structural equation model explores the impact of these effects on eventual security levels.
 
Article
The identification of the major information technology (IT) access control policies is required to direct “best practice” approaches within the IT security program of an organisation. In demonstrating the need for security access control policies in the IT security program, the paper highlights the significant shift away from centralised mainframes towards distributed networked computing environments. The study showed that the traditional and proven security control mechanisms used in mainframe environments were not applicable to distributed systems; as a result, a number of inherent risks were identified in the new technologies. Because of the critical nature of organisations' information assets, appropriate risk management strategies should be applied through access control policies for IT systems. The changing technology has rendered centralised mainframe security solutions ineffective in providing controls on distributed network systems. This investigation revealed that policies for access control of an information system, derived from corporate governance guidelines and risk management strategies, are required to protect the information assets of an organisation. The paper proposes a high-level approach to implementing security policies through information security responsibilities, a management accountability policy, and other baseline access control security policies for individual and distributed systems.
 
Article
Discretionary access control policies are not so simple as most assume, especially in light of the requirements in the Trusted Computer System Evaluation Criteria for group authorizations and specific denial of authorizations at the higher evaluation classes. Many interpretations of these requirements can be made. We investigate the possible interpretations of specific denial of authorization and also how to reconcile conflicting user and group authorizations and denials. We discuss several alternatives for implementing such policies for complex systems, such as database systems.
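One of the interpretations the paragraph alludes to can be made concrete. The resolution order below (explicit user entry overrides group entries; among group entries, denial takes precedence; default deny) is just one of the possible interpretations discussed, assumed here for illustration.

```python
# Toy reconciliation of user and group authorizations and denials.
# acl maps a principal (user or group name) to "grant" or "deny".

def check_access(user, groups, acl):
    if user in acl:                      # a user-specific entry wins outright
        return acl[user] == "grant"
    decisions = [acl[g] for g in groups if g in acl]
    if "deny" in decisions:              # among groups, specific denial wins
        return False
    return "grant" in decisions          # otherwise deny by default
```

Swapping the order of these three steps yields a different, equally defensible policy, which is exactly why the paper argues the TCSEC requirements admit many interpretations.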
 
Article
With the growing use of wireless networks and mobile devices, we are moving towards an era where location information will be necessary for access control. Location information can be used to enhance the security of an application, but it can also be exploited to launch attacks. For critical applications, such as the military, a formal model for location-based access control is needed that increases the security of the application and ensures that the location information cannot be exploited to cause harm. In this paper, we show how the mandatory access control (MAC) model can be extended to incorporate the notion of location. We also show how the different components in the MAC model are related to location and how this location information can be used to determine whether a subject has access to a given object. This model is suitable for military applications consisting of static and dynamic objects, where the locations of a subject and an object must be considered before granting access.
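The core idea of combining a MAC dominance check with a location check can be sketched as follows. This is a minimal illustration under assumed names and levels, not the paper's formal model: read access requires both that the subject's clearance dominates the object's classification and that the subject is in a location from which the object may be accessed.

```python
# Illustrative sketch: MAC read check extended with a location
# constraint. Level names, locations, and the lattice are assumptions.

LEVELS = {"unclassified": 0, "secret": 1, "top_secret": 2}

def can_read(subject_clearance, subject_location,
             object_classification, object_allowed_locations):
    # usual MAC dominance: clearance must be at least the classification
    dominates = LEVELS[subject_clearance] >= LEVELS[object_classification]
    # location extension: subject must be somewhere the object is readable
    in_place = subject_location in object_allowed_locations
    return dominates and in_place

print(can_read("top_secret", "base_A", "secret", {"base_A", "base_B"}))  # True
print(can_read("top_secret", "field",  "secret", {"base_A", "base_B"}))  # False
```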
 
Article
A data processing environment allows users to create data objects of very different types. As a part of the management of these objects, the data processing system must support user-defined access control. In this paper we propose a general concept and user interface for user-defined access control and discuss a number of applications. The concept regards access control information as valuable information in its own right and consequently places this information, nowadays mostly specified within and as an integral part of meta-information either of objects or subjects, in objects of its own, called usage conditions. Each of these objects contains access control information corresponding to a task of the real world and formulated with general attributes. Access control for an object is implemented by references to appropriate usage conditions. Access to the object is granted if granted by at least one usage condition. Thus, by setting such references in an m-to-n manner, a restricted set of usage conditions is sufficient to implement even complex access control conditions.
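The usage-condition idea above can be sketched briefly. In this hedged illustration (attribute names and predicates are assumptions, not from the paper), access control information lives in standalone usage-condition objects, objects reference any number of them in an m-to-n manner, and access is granted if at least one referenced condition grants it.

```python
# Sketch of usage conditions as first-class objects. Each condition is
# a predicate over subject attributes; an object references any number
# of conditions, and one satisfied condition suffices for access.

usage_conditions = {
    "uc_project_x": lambda s: s.get("project") == "X",
    "uc_managers":  lambda s: s.get("role") == "manager",
}

# m-to-n references from objects to shared usage conditions
object_refs = {"design.doc": ["uc_project_x", "uc_managers"]}

def access(subject, obj):
    return any(usage_conditions[uc](subject)
               for uc in object_refs.get(obj, []))

print(access({"project": "X", "role": "engineer"}, "design.doc"))  # True
print(access({"project": "Y", "role": "clerk"},    "design.doc"))  # False
```

Because many objects can reference the same small set of conditions, a task-level change (say, tightening "uc_project_x") propagates to every object referencing it.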
 
Article
Increasingly, new regulations are governing organizations and their information systems. Individuals responsible for ensuring legal compliance and accountability currently lack sufficient guidance and support to manage their legal obligations within relevant information systems. While software controls provide assurances that business processes adhere to specific requirements, such as those derived from government regulations, there is little support to manage these requirements and their relationships to various policies and regulations. We propose a requirements management framework that enables executives, business managers, software developers and auditors to distribute legal obligations across business units and/or personnel with different roles and technical capabilities. This framework improves accountability by integrating traceability throughout the policy and requirements lifecycle. We illustrate the framework within the context of a concrete healthcare scenario in which obligations incurred from the Health Insurance Portability and Accountability Act (HIPAA) are delegated and refined into software requirements. Additionally, we show how auditing mechanisms can be integrated into the framework and how auditors can certify that specific chains of delegation and refinement decisions comply with government regulations.
 
Article
In this paper, we propose the Simplified Regular Expression (SRE) signature, which uses multiple sequence alignment techniques, drawn from bioinformatics, in a novel approach to generating more accurate exploit-based signatures. We also provide formal definitions of what is “a more specific” and what is “the most specific” signature for a polymorphic worm, and show that most-specific exploit-based signature generation is NP-hard. The approach involves three steps: multiple sequence alignment to reward consecutive substring extractions, noise elimination to remove noise effects, and signature transformation to make the SRE signature compatible with current IDSs. Experiments on a range of polymorphic worms and real-world polymorphic shellcodes show that our bioinformatics approach is noise-tolerant and that, because it extracts more characteristics of polymorphic worms, such as one-byte invariants and distance restrictions between invariant bytes, the signatures it generates are more accurate and precise than those generated by some other exploit-based signature generation schemes.
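The underlying intuition can be shown with a much-simplified sketch: find invariant substrings shared by all worm samples and join them with wildcards, so the polymorphic "noise" between invariants is matched generically. The paper uses multiple sequence alignment; `difflib` below is only a stand-in for it, and the samples are made up.

```python
# Toy illustration of invariant-based signature extraction.
# difflib's matching blocks approximate (very roughly) what a proper
# multiple sequence alignment would find across samples.
from difflib import SequenceMatcher
import re

def sre_signature(samples, min_len=3):
    a, b = samples[0], samples[1]
    blocks = SequenceMatcher(None, a, b).get_matching_blocks()
    parts = [a[m.a:m.a + m.size] for m in blocks if m.size >= min_len]
    # keep only substrings invariant across every sample
    parts = [p for p in parts if all(p in s for s in samples)]
    return ".*".join(re.escape(p) for p in parts)

samples = ["\x90\x90GETpwn/exploitAAA",
           "XXGETqq/exploitBBBB",
           "GETzzzz/exploitC"]
sig = sre_signature(samples)
print(sig)                                      # GET.*/exploit
print(all(re.search(sig, s) for s in samples))  # True
```

A real SRE signature additionally encodes one-byte invariants and distance restrictions between invariants, which this sketch omits.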
 
Article
To remain competitive, and to achieve our goals, we must refine our approaches to strategic planning and incorporate them into management activities. One important refinement of strategic planning involves the preparation and maintenance of an information security plan to help resolve a serious problem (exemplified by the “tragedy of the commons”) found in data communications and multi-user computer environments. This article addresses the factors compelling us to be more serious about planning, such as the increasing amplitude and frequency of technological change. It discusses ways of using a plan to create an appropriate information systems security environment. These include using a plan to generate a sense of urgency about information security, and using a plan as the foundation for a conversation with management about appropriate security measures. The article considers the specific problems encountered when information security planning is inadequate and proposes that a target secure computing environment be defined as the starting point of an information security plan. Ways of incorporating flexibility into a security plan are also described. Twenty-five specific examples of data communications controls are provided to assist readers in developing control measures suitable for inclusion in their own information security plans.
 
Conference Paper
We present Ack-storm DoS attacks, a new family of DoS attacks exploiting a subtle design flaw in the core TCP specifications. The attacks can be launched by a very weak MitM attacker, which can only eavesdrop occasionally and spoof packets (a Weakling in the Middle (WitM)). The attacks can reach theoretically unlimited amplification; we measured amplification of over 400,000 against popular websites before aborting our trial attack. Ack-storm DoS attacks are practical. In fact, they are easy to deploy at large scale, especially considering the widespread availability of open wireless networks, which allows an attacker easy WitM abilities over thousands of connections. Storm attacks can be launched against the access network, e.g. blocking access to a proxy web server, against web sites, or against the Internet backbone. Storm attacks work against TLS/SSL connections just as well as against unprotected TCP connections, but fail against IPsec or link-layer encrypted connections. We show that Ack-storm DoS attacks can be easily prevented by a simple fix to TCP, in either client or server, or by using a packet-filtering firewall.
 
Article
This paper describes ACTEN, a conceptual model for the design of security systems. Security information is represented by action-entity pairs and organized into a framework composed of graphs and tables. The rules permitting the building and management of this framework are introduced. The model describes both static and dynamic aspects of the security system; in fact, it shows the access modalities between objects in the system and the evolution of such modalities due to grant and revocation of rights within the security system. ACTEN also allows the identification of the authority and protection level of each component of the system. The tools for this analysis are introduced and an example is given.
 
Article
As the advantages of being connected to the Internet multiply, so do the risks. Various techniques have, for example, been devised to infiltrate networks connected to the Internet. New vulnerabilities are created and exploited daily. By analysing a communication session in real time, it is possible to detect attacks as they are being launched. This is achieved by analysing the characteristics of the protocol being used. By determining the risk values for the various components, it is possible to consolidate them into a single risk value. The latter risk value can then be applied to determine which countermeasures need to be activated to reduce the risk level of the communication session to an acceptable level.
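The consolidation step described above can be sketched as follows. This is a hedged illustration only: the weighting scheme, risk scale, threshold, and countermeasure effect are all assumptions, not the paper's values.

```python
# Sketch: consolidate per-component risk values into one session risk
# (here, a weighted average), then apply a countermeasure until the
# consolidated value is acceptable. All numbers are illustrative.

def session_risk(component_risks, weights):
    total_w = sum(weights.values())
    return sum(component_risks[c] * w for c, w in weights.items()) / total_w

risks   = {"protocol": 0.8, "payload": 0.4, "source": 0.6}
weights = {"protocol": 3,   "payload": 2,   "source": 1}

r = session_risk(risks, weights)
print(round(r, 2))  # 0.63

ACCEPTABLE = 0.5
if r > ACCEPTABLE:
    # e.g. activate stricter protocol filtering, halving that component
    risks["protocol"] *= 0.5
    print(round(session_risk(risks, weights), 2))  # 0.43
```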
 
Article
Security concerns should be an integral part of the entire planning, development, and operation of a computer application. Inadequacies in the design and operation of computer applications are a very frequent source of security vulnerabilities associated with computers. In most cases, the effort to improve security should concentrate on the application software. The system development life cycle (SDLC) technique provides the structure to assure that security safeguards are planned, designed, developed and tested in a manner that is consistent with the sensitivity of the data and/or the application. The software quality assurance process provides the reviews and audits to assure that the activities accomplished during the SDLC produce operationally effective safeguards.
 
Article
We analyze the “sinkhole” problem in the context of the Dynamic Source Routing (DSR) protocol for wireless mobile ad hoc networks (MANETs). The sinkhole effect is caused by attempts to draw all network traffic to malicious nodes that broadcast fake shortest-path routing information. Two reliable indicators of sinkhole intrusion are proposed and analyzed; we study how such sinkholes may be detected, and term this sinkhole intrusion detection. One indicator is based on the sequence number in the routing message and the other is related to the proportion of routes that travel through a suspected node. Threshold values that imply possible sinkhole intrusion are derived for these two indicators. The simulation results show that the indicators are consistent and reliable for detecting sinkhole intrusion.
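The two indicators can be sketched as simple threshold checks. This is an illustrative simplification: the threshold values below are assumptions, not the ones derived in the paper.

```python
# Sketch of the two sinkhole indicators: (i) a node advertising an
# implausibly high sequence number, and (ii) a node appearing in a
# disproportionate share of discovered routes. Thresholds are made up.

def seq_num_suspicious(advertised_seq, network_max_seq, margin=50):
    # fake shortest-path info tends to carry inflated freshness numbers
    return advertised_seq > network_max_seq + margin

def route_share_suspicious(routes, node, threshold=0.7):
    through = sum(1 for r in routes if node in r)
    return through / len(routes) > threshold

routes = [["A", "M", "D"], ["B", "M", "D"], ["C", "M", "D"], ["A", "E", "D"]]
print(seq_num_suspicious(500, 120))         # True
print(route_share_suspicious(routes, "M"))  # True  (3/4 = 0.75 > 0.7)
```

In practice the paper combines both indicators, since either one alone can fire on legitimately well-connected nodes.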
 
Article
When security of a given network architecture is not properly designed from the beginning, it is difficult to preserve confidentiality, authenticity, integrity and non-repudiation in practical networks. Unlike traditional mobile wireless networks, ad hoc networks rely on individual nodes to keep all the necessary interconnections alive. In this article we investigate the principal security issues for protecting mobile ad hoc networks at the data link and network layers. The security requirements for these two layers are identified and the design criteria for creating secure ad hoc networks using multiple lines of defence against malicious attacks are discussed.
 
Article
An ad hoc network is a collection of nodes that do not need to rely on a predefined infrastructure to keep the network connected. Nodes communicate amongst each other using wireless radios and operate by following a peer-to-peer network model. In this article we investigate authentication in a layered approach, which results in multiple lines of defense for mobile ad hoc networks. The layered security approach is described and design criteria for creating secure ad hoc networks using multiple authentication protocols are analyzed. The performance of several such known protocols, which are based on challenge–response techniques, is presented through simulation results.
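A minimal example of the challenge-response pattern such protocols build on is sketched below. This is a generic HMAC-based sketch under an assumed pre-shared key, not one of the specific protocols simulated in the article; key establishment and transport are omitted.

```python
# Generic challenge-response sketch: the verifier sends a fresh random
# nonce; the prover returns an HMAC of the nonce under a pre-shared
# key, proving key possession without revealing the key.
import hmac, hashlib, os

KEY = b"pre-shared-key"  # assumed established out of band

def challenge():
    return os.urandom(16)  # fresh nonce defeats replay of old responses

def respond(key, nonce):
    return hmac.new(key, nonce, hashlib.sha256).digest()

def verify(key, nonce, response):
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(respond(key, nonce), response)

nonce = challenge()
print(verify(KEY, nonce, respond(KEY, nonce)))       # True
print(verify(KEY, nonce, respond(b"wrong", nonce)))  # False
```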
 
Article
A mobile ad hoc network (MANET) is a wireless communication network which does not rely on a pre-existing infrastructure or any centralized management. Securing the exchanges in MANETs is compulsory to guarantee a widespread development of services for this kind of network. The deployment of any security policy requires the definition of a trust model that defines who trusts whom and how. Our work aims to provide a fully distributed trust model for mobile ad hoc networks. In this paper, we propose a fully distributed public key certificate management system based on trust graphs and threshold cryptography. It permits users to issue public key certificates and to perform authentication via certificate chains without any centralized management or trusted authorities. Moreover, thanks to the use of threshold cryptography, our system resists false public key certification. We perform an overall evaluation of our proposed approach through simulations. The results show that our approach performs well while providing effective security.
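The certificate-chain part of such a trust model can be sketched as path finding in a directed trust graph. This is an illustrative sketch only (the graph, names, and search strategy are assumptions); the paper's threshold-cryptography component is not shown.

```python
# Sketch: an edge u -> v means u has issued a certificate for v's
# public key. u can authenticate w if a directed chain of certificates
# leads from u to w; BFS finds a shortest such chain.
from collections import deque

def cert_chain(trust, src, dst):
    prev, queue = {src: None}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in trust.get(u, []):
            if v not in prev:
                prev[v] = u
                queue.append(v)
    return None  # no chain: authentication fails

trust = {"alice": ["bob"], "bob": ["carol"], "carol": ["dave"]}
print(cert_chain(trust, "alice", "dave"))  # ['alice', 'bob', 'carol', 'dave']
print(cert_chain(trust, "dave", "alice"))  # None
```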
 
Article
Mobile Ad Hoc Networks (MANETs) are susceptible to a variety of attacks that threaten their operation and the provided services. Intrusion Detection Systems (IDSs) may act as defensive mechanisms, since they monitor network activities in order to detect malicious actions performed by intruders and then initiate the appropriate countermeasures. IDSs for MANETs have attracted much attention recently and thus there are many publications that propose new IDS solutions or improvements to existing ones. This paper evaluates and compares the most prominent IDS architectures for MANETs, where IDS architectures are defined as the operational structures of IDSs. For each IDS, the architecture and the related functionality are briefly presented and analyzed, focusing on both the operational strengths and weaknesses. Moreover, methods/techniques that have been proposed to improve the performance and the provided security services of those IDSs are evaluated and their shortcomings or weaknesses are presented. A comparison of the studied IDS architectures is carried out using a set of critical evaluation metrics, which derive from: (i) the deployment, architectural, and operational characteristics of MANETs; (ii) the special requirements of intrusion detection in MANETs; and (iii) the analysis that reveals the most important strengths and weaknesses of the existing IDS architectures. The evaluation metrics are divided into two groups: the first is related to performance and the second to security. Finally, based on this evaluation and comparison, a set of design features and principles is presented, which must be addressed and satisfied in future research on designing and implementing IDSs for MANETs.
 
Article
Electronic services in dynamic environments (e.g. e-government, e-banking, e-commerce, etc.) face many different barriers that reduce their efficient applicability. One of them is the requirement of information security when information is transmitted, transformed, and stored in an electronic service. It is possible to provide the appropriate level of security by applying present-day information technology. However, the level of protection of information is often much higher than necessary to meet potential threats. Since the level of security strongly affects the performance of the whole system, excessive protection decreases its reliability and availability and, as a result, its global security. In this paper we present a mechanism of adaptable security for digital information transmission systems (usually the crucial part of an e-service). It makes it possible to guarantee an adequate level of protection for the actual level of threats dynamically changing in the environment. In our model the basic element of the security, the Public Key Infrastructure (PKI), is enriched with specific cryptographic modules.
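The adaptability idea can be sketched as selecting protection parameters from the current threat level rather than fixing them at the maximum. The profiles below are purely illustrative assumptions, not the paper's configuration.

```python
# Sketch: protection parameters chosen per threat level, so that low
# threat levels avoid paying the performance cost of maximum security.
# Profile contents are made-up examples.

PROFILES = {
    "low":    {"cipher": "AES-128", "rekey_interval_s": 3600},
    "medium": {"cipher": "AES-192", "rekey_interval_s": 600},
    "high":   {"cipher": "AES-256", "rekey_interval_s": 60},
}

def select_profile(threat_level):
    return PROFILES[threat_level]

print(select_profile("low")["cipher"])   # AES-128
print(select_profile("high")["cipher"])  # AES-256
```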
 
Article
Ordinary techniques of text compression provide some degree of privacy for messages being stored or transmitted. First, by recoding messages, compression protects them from the casual observer. Secondly, by removing redundancy it denies a cryptanalyst the leverage of the normal statistical regularities in natural language. Thirdly, and most important, the best text compression systems use adaptive modeling so that they can take advantage of the characteristics of the text being transmitted. The model acts as a very large key, without which decryption is impossible. Adaptive modeling means that the key depends on the entire text that has been transmitted since the time the encoder/decoder system was initialized. This paper introduces the modern approach to text compression and describes a highly effective adaptive method, with particular emphasis on its potential for protecting messages from eavesdroppers. The technique is potentially fast and provides both encryption and data compression.
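The "model as key" idea can be illustrated with a toy adaptive coder. This sketches the principle only, not the paper's compression scheme: each symbol is sent as its rank in a frequency-ordered table, both ends update their table after every symbol, and so decoding a given code correctly requires the exact model state produced by the entire preceding history.

```python
# Toy adaptive rank coder: the "code" for a character depends on all
# previously transmitted text, so the evolving model acts as a key.

def ranks(model):
    # deterministic order: descending count, ties broken alphabetically
    return sorted(model, key=lambda s: (-model[s], s))

def encode(text):
    model, out = {}, []
    for ch in text:
        order = ranks(model)
        if ch in model:
            out.append(order.index(ch))
        else:
            out.append(len(order))  # escape code for a new symbol...
            out.append(ch)          # ...followed by the literal
        model[ch] = model.get(ch, 0) + 1
    return out

def decode(codes):
    model, text, i = {}, [], 0
    while i < len(codes):
        order = ranks(model)
        if codes[i] < len(order):
            ch = order[codes[i]]
            i += 1
        else:
            ch = codes[i + 1]  # escape: next entry is the literal
            i += 2
        text.append(ch)
        model[ch] = model.get(ch, 0) + 1
    return "".join(text)

msg = "banana band"
print(decode(encode(msg)) == msg)  # True
```

A real system would pair such a model with arithmetic coding for compression; the key property, however, is already visible: a decoder whose model state diverges from the encoder's at any point produces garbage from then on.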
 
Top-cited authors
Steven Furnell
  • University of Nottingham
Pedro García-Teodoro
  • University of Granada
Fred Cohen
  • Management Analytics
Gabriel Maciá-Fernández
  • University of Granada
Jan H. P. Eloff
  • University of Pretoria