Experience-based access management (EBAM) is a life-cycle model for identity and access management. It incorporates models, techniques, and tools to reconcile differences between the ideal access model, as judged by professional and legal standards, and the enforced access control, specific to the operational system. EBAM's principal component is an expected-access model that represents differences between the ideal and enforced models on the basis of access logs and other operational information. A technique called access rules informed by probabilities (ARIP) can aid EBAM in the context of healthcare organizations.
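The abstract doesn't spell out ARIP's mechanics, but the underlying intuition of probability-informed access review, estimating from logs how likely each access is and flagging enforced rules the logs don't support, can be sketched in a few lines. The log format, policy representation, and threshold below are illustrative assumptions, not ARIP itself.

```python
# Hedged sketch of probability-informed access review: estimate how
# often each (role, resource) pair actually occurs in the access logs,
# then flag enforced-policy rules the logs rarely or never support.
from collections import Counter

access_log = [            # hypothetical (role, resource) log entries
    ("nurse", "patient_chart"),
    ("nurse", "patient_chart"),
    ("billing", "patient_chart"),
]
enforced_policy = {       # hypothetical enforced access rules
    ("nurse", "patient_chart"),
    ("billing", "patient_chart"),
    ("intern", "patient_chart"),
}

counts = Counter(access_log)
total = sum(counts.values())

for rule in sorted(enforced_policy):
    prob = counts[rule] / total
    if prob < 0.1:        # illustrative review threshold
        print(f"review candidate: {rule}, observed probability {prob:.2f}")
```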
Modern network security rests on the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols. Distributed systems, mobile and desktop applications, embedded devices, and the entire secure Web rely on SSL/TLS for protection against network attacks. This protection critically depends on whether SSL/TLS clients correctly validate X.509 certificates presented by servers during the SSL/TLS handshake. We design, implement, and apply the first methodology for large-scale testing of certificate validation logic in SSL/TLS implementations. Our first ingredient is "frankencerts," synthetic certificates randomly mutated from parts of real certificates, which thus include unusual combinations of extensions and constraints. Our second ingredient is differential testing: if one SSL/TLS implementation accepts a certificate while another rejects the same certificate, we use the discrepancy as an oracle for finding flaws in individual implementations. Differential testing with frankencerts uncovered 208 discrepancies between popular SSL/TLS implementations, including OpenSSL, NSS, CyaSSL, GnuTLS, PolarSSL, and MatrixSSL. Many of them are caused by serious security vulnerabilities. For example, any server with a valid X.509 version 1 certificate can act as a rogue certificate authority and issue fake certificates for any domain, enabling man-in-the-middle attacks against MatrixSSL and GnuTLS. Several implementations also accept certificate authorities created by unauthorized issuers, as well as certificates not intended for server authentication. We also found serious vulnerabilities in how users are warned about certificate validation errors. When presented with an expired, self-signed certificate, NSS, Safari, and Chrome (on Linux) report that the certificate has expired (a low-risk, often ignored error) but not that the connection is insecure against a man-in-the-middle attack. These results demonstrate that automated adversarial testing with frankencerts is a powerful methodology for discovering security flaws in SSL/TLS implementations.
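The differential-testing idea described above fits in a few lines: run every candidate certificate through several validators and treat any disagreement as a bug candidate. A minimal sketch follows, assuming a directory of candidate certificates and command-line validators; `openssl verify` is a real command, while the second validator entry, the CA file name, and the directory name are placeholders to adapt to whatever implementations are installed.

```python
# Minimal differential-testing harness over a corpus of candidate
# certificates. Any disagreement between validators is a discrepancy
# worth investigating.
import subprocess
from pathlib import Path

VALIDATORS = {
    "openssl": ["openssl", "verify", "-CAfile", "ca.pem"],
    # Placeholder: swap in another implementation's checker (GnuTLS,
    # NSS, ...) with its real invocation.
    "other": ["./other-validator", "--ca", "ca.pem"],
}

def verdicts(cert: Path) -> dict[str, bool]:
    """Run every validator on one certificate; True means 'accepted'."""
    return {
        name: subprocess.run(cmd + [str(cert)],
                             capture_output=True).returncode == 0
        for name, cmd in VALIDATORS.items()
    }

for cert in Path("frankencerts").glob("*.pem"):
    v = verdicts(cert)
    if len(set(v.values())) > 1:  # implementations disagree
        print(f"DISCREPANCY {cert.name}: {v}")
```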
Interoperable medical devices (IMDs) face threats due to the increased attack surface presented by interoperability and the corresponding infrastructure. Introducing networking and coordination functionalities fundamentally alters medical systems' security properties. Understanding the threats is an important first step in eventually designing security solutions for such systems. Part 2 of this two-part article defines a failure model, or the specific ways in which IMD environments might fail when attacked. An attack-consequences model expresses the combination of failures experienced by IMD environments for each attack vector. This analysis leads to interesting conclusions about regulatory classes of medical devices in IMD environments subject to attacks. Part 1 can be found here: http://doi.ieeecomputersociety.org/10.1109/MSP.2012.128.
Interoperable medical devices (IMDs) face threats due to the increased attack surface presented by interoperability and the corresponding infrastructure. Introducing networking and coordination functionalities fundamentally alters medical systems' security properties. Understanding the threats is an important first step in eventually designing security solutions for such systems. Part 1 of this two-part article provides an overview of the IMD environment and the attacks that can be mounted on it.
Nature takes a variety of approaches regarding risk concentration. Stationary life tends to bend but not break, whereas mobile life tends toward risk concentration with stout border protection. Client and network devices tend to follow the latter model.
It's a difficult mental exercise to simultaneously envision how a system could be forced to fail while you're busy designing how it's meant to work. At George Mason University, instructors give their students practice at this skill by requiring them to write attack scripts for all their assignments. Creating an attack script requires students to adopt an attacker's perspective and formulate a structured plan of attack: a series of tasks and experiments that gain information about the internal state of the probed system. The exercise helps students nurture a mindset in which they can appreciate how systems might be attacked in all their aspects, from design and implementation to runtime configuration.
'Tis the season when we musically celebrate the 12 days of Christmas while quietly rejoicing that there are fewer days of Christmas than there are bottles of beer on the wall. 'Tis also the season for us to visit the Owned Price Index (OPI), our index of underground economy prices. The OPI mimics the PNC Christmas Price Index, the price index of the 12 days of Christmas items. That index, the market price of 12 drummers drumming down to a partridge in a pear tree, rose an impressive 10.1% in 2008 to a record US$86,609, dramatically outperforming most major market indices. Although we sheepishly concede that golden rings have outperformed Goldman Sachs and swans a-swimming have outperformed Swanson Foods, it does serve to further distinguish between our true love and our financial advisor. In 2008, true love is Pareto dominant.
For efficiency, we should implement cryptographic subsystems with short keys, but reliably estimating minimal secure key lengths is an involved and complicated process, especially for systems with long life cycles and limited update capabilities. In symmetric cryptography, experts consider 56-bit DES (Data Encryption Standard) keys inadequate for most applications: new devices can efficiently derive a DES key from known plaintext-ciphertext pairs. Discussion in asymmetric cryptography circles currently focuses on 1,024-bit RSA key security. Interestingly, a major argument put forward in this discussion for the insecurity of 1,024-bit RSA isn't fundamental theoretical progress but hypothetical hardware devices for factoring large numbers. Unlike quantum computers, these special-purpose designs try to work within the bounds of existing technology; in this article, we look at the ideas underlying some of these designs and their potential.
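To see why 56-bit keys are considered inadequate, a back-of-the-envelope calculation suffices: an attacker expects to search half the keyspace before hitting the right key. The throughput figures below are illustrative assumptions, not measurements of any particular device.

```python
# Expected brute-force effort against a 56-bit DES key.
# Keys-per-second rates are assumed for illustration only.
KEYSPACE = 2 ** 56
SECONDS_PER_YEAR = 365 * 24 * 3600

for name, keys_per_sec in [
    ("single CPU core (assumed 1e7 keys/s)", 1e7),
    ("special-purpose search machine (assumed 1e12 keys/s)", 1e12),
]:
    expected_s = (KEYSPACE / 2) / keys_per_sec  # half the keyspace on average
    print(f"{name}: ~{expected_s / SECONDS_PER_YEAR:.3g} years "
          f"({expected_s / 3600:.3g} hours)")
```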
In role-based access control (RBAC), permissions are assigned to roles rather than directly to users, which can provide simpler security administration and finer-grained access control policy. RBAC has provided a widely used model for security administration in large networks of applications and other IT resources. INCITS 359 contains an RBAC reference model as well as system and administrative functional specifications. RIIS (the RBAC Implementation and Interoperability Standard) describes a framework of components, use-case scenarios, management interaction functions, data-exchange models, operational definitions, and interoperability requirements. The RBAC data-exchange model provides the bridge for exchanging role information between security domains, and RIIS defines technical interaction functions as specific mechanisms for exchanging operational and management data. The CS1.1 RBAC task group is soliciting industry use cases in the area of system-to-system RBAC information exchange.
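As a concrete illustration of the core RBAC idea (not of the INCITS 359 data formats, which the standard itself defines), the following minimal sketch attaches permissions to roles and grants users permissions only through role membership; all names are hypothetical.

```python
# Minimal RBAC sketch: permissions attach to roles, and users gain
# permissions only through role membership. All names are hypothetical.
role_perms = {
    "nurse":  {"read_chart"},
    "doctor": {"read_chart", "write_prescription"},
}
user_roles = {
    "alice": {"doctor"},
    "bob":   {"nurse"},
}

def authorized(user: str, perm: str) -> bool:
    """A user holds a permission iff one of their roles grants it."""
    return any(perm in role_perms.get(role, set())
               for role in user_roles.get(user, set()))

print(authorized("alice", "write_prescription"))  # True
print(authorized("bob", "write_prescription"))    # False
```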
This article discusses the effect on the Internet of the power blackout that took place at 4:11 p.m. Eastern Daylight Time (EDT) on 14 August 2003. Normal Border Gateway Protocol (BGP) chatter jumped several notches as border routers across the globe relayed the news of unreachable networks. As the power-grid failure spread, more than 1 percent of the Internet became unreachable. In addition, as the blackout continued, routing tables shrank as hundreds and then thousands of networks went offline.
As governments attempt to prevent, investigate, or prosecute crimes by persons who use the Internet to plan and carry out terrorist acts, the protection of private, personal information stored on computers becomes the subject of controversy. It is inevitable that the home computer will become a target for surveillance, search, and seizure by government agents. As a result, courts will be asked to determine whether such agents have complied with applicable laws that condition such intrusions on meeting standards set by constitutions or laws that did not anticipate the home computer as a focal point for such controversies. Courts are more accustomed to addressing similar controversies in the context of a house and its material contents, rather than a computer and its digital files. It is likely that the judicial system will use analogies to the house when deciding controversies concerning the reasonable expectations of privacy in a home computer's contents. In apparent anticipation, the US Justice Department's policy on the search and seizure of computers in investigations uses such an analogy to justify its position that when several people share a computer, any one of them can grant the police permission to search and seize its contents.
The authors look at the recent Metricon 2.0 conference and discuss its highlights. In particular, the conference focused on the importance of metrics, especially as they apply to security.
Internet privacy is the topic of this article. A 2008 survey revealed that US Internet users' top three privacy concerns hadn't changed since 2002, but privacy-related events might have influenced their level of concern within certain categories. The authors describe their results as well as the differences in privacy concerns between US and international respondents. They also note that individuals have become more concerned about personalization in customized browsing experiences, monitored purchasing patterns, and targeted marketing and research.
We can find considerable information security debris in the wake of 2003's attack trends and new security flaws. New and serious vulnerabilities were discovered, disclosed, and subsequently exploited in many ways, from simple, straightforward methods to more advanced and innovative exploitation techniques. This article examines a handful of the more than 3,000 unique vulnerabilities and 115,000 security incidents reported in 2003 (according to the CERT Coordination Center's report for quarters one through three) and predicts information security woes for 2004. The author's analysis focuses on the distinguishing characteristics of the 2003 attack trends rather than on specific vulnerabilities or a precisely defined taxonomy of security bugs.
The International Association for Cryptologic Research (IACR; www.iacr.org) held its 24th annual International Cryptology Conference 15-19 August 2004 in Santa Barbara, California. The conference consisted of short sessions, invited talks, and presentations of conference papers.
The European Institute for Computer Antivirus Research (EICAR; www.eicar.org) held its 14th annual conference, attracting about 100 researchers, vendors, users, and government representatives interested in discussing the field's latest developments. This report presents a brief overview of the invited talks, paper presentations, and industry sessions.
The first Symposium on Usable Privacy and Security (SOUPS 2005) brought together a variety of academic and industry researchers from the emerging field of human-computer interaction and security (HCISEC). The symposium helped define the challenges that HCISEC researchers face and provided an overview of many of today's most important usable security and privacy problems.
A report on SecureWorld Expo 2005, held 21 to 22 September 2005 in Dearborn, Michigan. The SecureWorld Expo targets business and IT professionals with security concerns and provides them with an industry-wide agenda to help solve those concerns through a partnership with government agencies.
Information assurance experts at Technology Forecasts 2005 offered insights into how the evolving nature of threats, the current information technology environment, and various market forces are combining to yield new security challenges and likely new technology paths for the future. The integration of computation into the environment, rather than having computers as distinct objects, opens up a whole new class of information assurance problems. The advent of quantum computing, if it happens, could have a profound effect on the information assurance landscape. The widespread use of computers to monitor and control safety-critical processes raises the stakes still further.
Reading Andrew Jaquith's Security Metrics: Replacing Fear, Uncertainty, and Doubt will inspire you to start identifying and measuring meaningful computer security performance factors and begin the process of transforming our shaman-like discipline into a science.
Distinguished information assurance experts provide insights into how the evolving nature of threats, the current information technology environment, and various market forces are combining to yield new security challenges and new technology paths for the future. Terry V. Benzel expects that the future will see the commoditization of ubiquitous computing: computing will move from a stand-alone, conscious activity to a fully integrated aspect of daily life and, technically, from workstations and laptops connected to networks and servers to embedded computational nodes and widespread sensor-based systems. Jeremy Epstein believes that the availability of low-cost hardware will make new information assurance technologies feasible: with networks of dedicated processors feasible within a single computer or device, computing will see the partitioning of applications with well-defined boundaries, which will help achieve information assurance.
In Windows Server 2008 and Windows Vista, the system access control list (SACL) has been extended to carry integrity-level information and is in the process of being converted into something like a mandatory access control (MAC) label. These changes form the basis for Microsoft's steps toward MAC-like functionality in Windows Server 2008 and Windows Vista, in which a no-read-down policy has been introduced. The integrity label is used to establish the low label that marks the Internet Explorer process used in low-rights Internet Explorer. An object has only one owner and one primary group, but the DACL and SACL can have many subcomponents declaring the appropriate access, restrictions, and system settings for various principals. Security issues arising from an owner's implicit access rights are now mitigated by the owner access restriction ACL introduced with Windows Vista and Server 2008. The capabilities of Microsoft's ACL model in modern Windows systems allow fine-grained control of object permissions.
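To make the structure described above concrete, here is a minimal sketch that models a security descriptor with one owner, one primary group, and multi-entry DACL/SACL lists. It is an illustrative data-structure model, not the Win32 API; all names and fields are assumptions for the example.

```python
# Illustrative model (not the Win32 API) of a Windows security
# descriptor: one owner, one primary group, and DACL/SACL entry lists.
from dataclasses import dataclass, field

@dataclass
class Ace:
    principal: str
    access: str           # e.g. "read", "write", "integrity-label"
    allow: bool = True

@dataclass
class SecurityDescriptor:
    owner: str            # exactly one owner
    primary_group: str    # exactly one primary group
    dacl: list[Ace] = field(default_factory=list)  # discretionary access
    sacl: list[Ace] = field(default_factory=list)  # audit + integrity label

sd = SecurityDescriptor(
    owner="alice", primary_group="users",
    dacl=[Ace("alice", "write"), Ace("everyone", "read")],
    sacl=[Ace("ML_LOW", "integrity-label")],  # low-integrity mark, as for
)                                             # low-rights Internet Explorer
print(sd)
```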