Tabitha James

Virginia Polytechnic Institute and State University, Blacksburg, Virginia, United States

Publications (26) · 24.79 Total impact

  • Olga Bruyaka · Tabitha James · Deborah F. Cook · Reza Barkhi
    ABSTRACT: The information age has increased our dependency on data, and consequently the economic value of information retrieval services (IRS) companies. While mergers and acquisitions (M&As) are a popular means to sustain growth for these companies, they often fail to fulfill the promise of shareholder value creation. This makes the inquiry into market valuation of M&As in the IRS industry timely and important. Using the concept of strategic complementarity, which is relatively new in the M&A literature, we study industry and geographic complementarities between acquirers and targets, as well as acquirer- and market-specific contingency factors, to better understand market valuation of M&As. In an empirical study of 821 M&As by 150 firms in the US IRS industry between 1993 and 2006, we show that the two types of complementarities have contrasting effects on market valuation of M&As. While the effect is positive for geographic complementarity at both state and division levels, the effect of industry complementarity is found to be negative except for acquirers in the Internet software and services mid-industry. Additionally, our findings provide insights into the role of three contingent factors (acquirers' age, size, and stock market growth) that can help better understand the diverging effects of industry and geographic complementarities.
    Information Technology and Management 08/2014; DOI:10.1007/s10799-014-0194-0 · 0.14 Impact Factor
  • Tabitha L. James · Lara Khansa · Deborah F. Cook · Olga Bruyaka
    ABSTRACT: As the use of networked computers and digital data has increased, so have reports of data compromise and malicious cyber-attacks. Increased use of and reliance on technologies complicate the process of providing information security. This expanding complexity in meeting data security requirements, coupled with the increased recognition of the value of information, has led to the need to quickly advance the information security area. In this paper, we examine the maturation of the information security area by analyzing the innovation activity of one of the largest and most ubiquitous information technology companies, Microsoft. We conduct a textual analysis of Microsoft's patent application activity in the information security domain since the early 2000s, using a novel text analysis approach based on concepts from social network analysis and algorithmic classification. We map our analysis to focal areas in information security and examine it against Microsoft's own history to determine the depth and breadth of Microsoft's innovations. Our analysis shows the relevance of using a network-based text analysis. Specifically, we find that Microsoft has increasingly emphasized topics that fall into the identity and access management area. We also show that Microsoft's innovations in information security grew tremendously after the Trustworthy Computing Initiative was announced. In addition, we are able to identify areas of focus that correspond to Microsoft's major vulnerabilities. These findings indicate that while Microsoft is still actively, albeit not always successfully, fighting vulnerabilities in its products, it is quite vigorously and broadly innovating in the information security area.
    Computers & Security 07/2013; 36:49–67. DOI:10.1016/j.cose.2013.02.004 · 1.17 Impact Factor
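
A minimal sketch of the general idea behind such a network-based text analysis, assuming a toy corpus and naive tokenization (this is illustrative only, not the paper's actual pipeline or data): terms that co-occur in the same document become linked nodes, and high-centrality terms suggest innovation emphases.

```python
# Term co-occurrence network from document text; central terms indicate
# topical emphasis. Toy data and stop list -- illustrative only.
from itertools import combinations
from collections import Counter

docs = [  # hypothetical patent-abstract fragments
    "access control token for identity management",
    "encrypted token exchange for access management",
    "intrusion detection using anomaly signatures",
]
stop = {"for", "using", "the", "a", "of"}

edges = Counter()
for doc in docs:
    terms = sorted({w for w in doc.lower().split() if w not in stop})
    for u, v in combinations(terms, 2):
        edges[(u, v)] += 1            # link terms co-occurring in a document

# Weighted degree centrality per term, without external libraries.
degree = Counter()
for (u, v), w in edges.items():
    degree[u] += w
    degree[v] += w

for term, score in degree.most_common(5):
    print(term, score)
```
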
  • Tabitha James · Quinton Nottingham · Byung Cho Kim
    ABSTRACT: Our increased reliance on digital information and our expansive use of the Internet for a steadily rising number of tasks require that more emphasis be placed on digital information security. The importance of securing digital information is apparent, but persuading individual users to adopt and utilize tools to improve security has arguably been more difficult. In this study, we propose a number of factors that may influence individual security practices. These constructs are developed by adapting existing theory from information security and privacy research to examine information security behaviors in the general public dimension. The influence of these factors on perceived need and actual behavior is then examined. The resulting model is shown to fit well, and support is found for many of the proposed relationships. Determining the antecedents of individual digital security practices may provide useful insight for tailoring programs for the adoption and utilization of security tools by individuals in the general public dimension.
    Information Technology and Management 06/2013; 14(2). DOI:10.1007/s10799-012-0147-4 · 0.14 Impact Factor
  • Lara Khansa · Deborah F. Cook · Tabitha James · Olga Bruyaka
    ABSTRACT: Title I of the Health Insurance Portability and Accountability Act (HIPAA) was enacted to improve the portability of healthcare insurance coverage, and Title II was intended to alleviate fraud and abuse. The development of a health information system was suggested in Title II of HIPAA as a means of promoting standardization to improve the efficiency of the healthcare system and to ensure that electronic healthcare information is transferred securely and kept private. Since the legislation places the onus of providing the described improvements on healthcare institutions, and part of these requirements relate to information technology (IT) and information security (IS), the process of complying with the legislation will necessitate acquiring products and services from IT/IS firms. From the viewpoint of stock market analysts, this increase in demand for IT/IS products and services has the potential to boost the profitability of public IT/IS firms, in turn positively enhancing their stock market valuation. Following the same logic, the legislation's compliance burdens borne by healthcare firms are expected to carry hefty costs, thus potentially reducing the profitability of healthcare firms and reflecting negatively on their stock price. The intent of this paper is to evaluate the stock market reaction to the introduction of HIPAA legislation by evaluating the abnormal movement in the price of the stock of public healthcare institutions, IT, and IS firms. We conduct event-study analyses around the announcement dates of the various provisions of HIPAA. An event study is a standard statistical methodology used to determine whether the occurrence of a specific event or events results in a statistically significant reaction in financial markets. The advantage of the event study methodology for policy analysis is that it provides an anchor for determining value, which eliminates reliance on ad hoc judgments about the impact of specific events or policies on stock prices. While event studies have been conducted that examine the market effect of security and privacy breaches on firms, none has attempted to determine the impact, in terms of resulting market reaction, of the HIPAA legislation itself. The results of the study confirm the logic above, while also providing insight into specific stages of the legislative path of HIPAA.
    Computers & Security 09/2012; 31(6):750-770. DOI:10.1016/j.cose.2012.06.007 · 1.17 Impact Factor
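
As a rough illustration of the event-study mechanics described above, a market-model sketch on synthetic returns: alpha and beta are estimated over a pre-event window, then abnormal returns are cumulated over the event window. The windows, return series, and parameters here are illustrative assumptions, not the paper's.

```python
# Market-model event study on synthetic data: AR_t = R_t - (alpha + beta*Rm_t),
# cumulated over the event window to form a CAR. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 120)                # daily market returns
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.01, 120)

est, event = slice(0, 100), slice(100, 120)           # estimation / event windows
beta, alpha = np.polyfit(market[est], stock[est], 1)  # slope, intercept

abnormal = stock[event] - (alpha + beta * market[event])
car = abnormal.sum()                                  # cumulative abnormal return
print(f"alpha={alpha:.5f} beta={beta:.3f} CAR={car:.4f}")
```
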
  • 10/2011; 1(4):1-20. DOI:10.4018/jvcsn.2009092201
  • Byung Cho Kim · Lara Khansa · Tabitha James
    ABSTRACT: This paper examines the relationship between trust and risk perceptions of online activities. Specifically, we study the impact of an individual's trust of other people on the severity and certainty of risk, which in turn influence an individual's risk perception. We administer a 23-item survey to 386 participants at a large southeastern university, and test our model using structural equation modeling. We find evidence that supports the proposed relationship, implying that perceived certainty of risk is negatively associated with trust of individuals and that both certainty and severity of risk have a positive impact on an individual's risk perception. Our results indicate that users may underestimate risks when they interact with people they trust, suggesting the need for a higher level of protection for transactions between individuals who are familiar with each other. From a modeling perspective, our straightforward model of trust and risk could be used in future studies that examine specific online activities.
    07/2011; 7(3):3-22. DOI:10.1080/15536548.2011.10855915
  • ABSTRACT: The 2008 United States presidential election was a historic event in several ways. Among numerous other firsts, the 2008 election saw unprecedented use of technology to reach and influence the population. The 2008 election illustrated the many advantages of intelligent use of technologies to encourage involvement and to disseminate information. The use of technology in the election demonstrated how effective current technologies could be in organizing political efforts. This is especially true for young people who came into adulthood surrounded by technological advances and have become increasingly comfortable with their use. In this study, we empirically examine the influence of technology on youthful voters in the 2008 U.S. presidential election and draw insights into the impacts of technology usage on political interest and activism. The successes using technology to encourage involvement and the free exchange of information in politics could be recreated in other areas that require energizing and informing large numbers of individuals.
    IEEE Technology and Society Magazine 03/2011; 30(1):20-27. DOI:10.1109/MTS.2011.940292 · 0.49 Impact Factor
  • Tabitha James · César Rego
    ABSTRACT: This paper introduces a new path relinking algorithm for the well-known quadratic assignment problem (QAP) in combinatorial optimization. The QAP has attracted considerable attention in research because of its complexity and its applicability to many domains. The algorithm presented in this study employs path relinking as a solution combination method, incorporating a multistart tabu search algorithm as an improvement method. The resulting algorithm has interesting similarities and contrasts with particle swarm optimization methods. Computational testing indicates that this algorithm produces results that rival the best QAP algorithms. The authors additionally conduct an analysis disclosing how different strategies prove more or less effective depending on the landscapes of the problems to which they are applied. This analysis lays a foundation for developing more effective future QAP algorithms, both for methods based on path relinking and tabu search, and for hybrids of such methods with related processes found in particle swarm optimization.
    01/2011; 2:52-70. DOI:10.4018/jsir.2011040104
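
A hedged sketch of the path relinking mechanism on the QAP, under the standard cost definition cost(p) = sum over i,j of flow[i][j] * dist[p[i]][p[j]]: walk from an initiating permutation toward a guiding one, keeping the best intermediate solution. The stepping rule and data are illustrative, not the paper's algorithm.

```python
# Path relinking between two QAP permutations: each step repairs one position
# to match the guiding solution; the best point on the path is retained.
import numpy as np

def qap_cost(p, flow, dist):
    # cost = sum_{i,j} flow[i][j] * dist[p[i]][p[j]]
    return (flow * dist[np.ix_(p, p)]).sum()

def path_relink(start, guide, flow, dist):
    p = list(start)
    best, best_cost = list(p), qap_cost(p, flow, dist)
    for i in range(len(p)):
        if p[i] != guide[i]:                 # repair one position per step
            j = p.index(guide[i])
            p[i], p[j] = p[j], p[i]
            c = qap_cost(p, flow, dist)
            if c < best_cost:
                best, best_cost = list(p), c
    return best, best_cost

rng = np.random.default_rng(1)
n = 8
flow = rng.integers(0, 10, (n, n)); dist = rng.integers(0, 10, (n, n))
print(path_relink(rng.permutation(n), rng.permutation(n), flow, dist))
```
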
  • César Rego · Tabitha James · Fred Glover
    ABSTRACT: In this study we present a new tabu search algorithm for the quadratic assignment problem (QAP) that utilizes an embedded neighborhood construction called an ejection chain. The QAP is a well-known combinatorial optimization problem most commonly used to model a facility location problem. The acknowledged difficulty of the QAP has made it the focus of many metaheuristic approaches. A key component of any metaheuristic approach is the neighborhood definition. The most common neighborhood applied to the QAP is a 2-exchange (or swap) neighborhood. Ejection chains provide the ability to constructively create larger embedded neighborhood structures. We propose a move generation process that provides a combinatorial leverage effect, where the size of the neighborhood grows multiplicatively while the effort of finding a best move in the neighborhood grows only additively. Our results illustrate that significant improvement in solution quality over the traditional swap neighborhood can be obtained by the more complex moves possible with the ejection chain approach. We also develop two multi-start tabu search algorithms, utilizing the ejection chain approach, to demonstrate the power of embedding this neighborhood construction.
    Networks 10/2010; 56(3):188-206. DOI:10.1002/net.20360 · 0.74 Impact Factor
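
To make the ejection chain idea concrete, a small sketch of a single cyclic relocation move on a permutation. This illustrates the neighborhood structure only; the paper's move generation and evaluation scheme is more involved.

```python
# One ejection-chain move: the facility at chain[0] is displaced, each chained
# position inherits the facility of the next one, and the chain is closed by
# sending the first facility to the last vacated slot.
def ejection_chain_move(p, chain):
    """Apply a cyclic relocation along `chain` (a list of positions)."""
    q = list(p)
    first = q[chain[0]]
    for k in range(len(chain) - 1):
        q[chain[k]] = q[chain[k + 1]]   # inherit the next position's facility
    q[chain[-1]] = first                # close the chain
    return q

p = [0, 1, 2, 3, 4]
print(ejection_chain_move(p, [0, 2, 4]))   # -> [2, 1, 4, 3, 0]
```

Note that a chain over two positions reduces to the ordinary swap, which is one way to see how ejection chains generalize the 2-exchange neighborhood.
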
  • T. James · E. Brown · C.T. Ragsdale
    ABSTRACT: Many areas of research examine the relationships between objects. A subset of these research areas focuses on methods for creating groups whose members are similar based on some specific attribute(s). The objective of the blockmodel problem is to group objects so as to obtain a small number of large groups of similar nodes. In this paper, a grouping genetic algorithm (GGA) is applied to the blockmodel problem. Testing on numerous examples from the literature indicates that a GGA is an appropriate tool for solving this type of problem. Specifically, our GGA provides good solutions, even for large problems, in reasonable computational time.
    IEEE Transactions on Evolutionary Computation 03/2010; 14(1):103-111. DOI:10.1109/TEVC.2009.2023793 · 5.55 Impact Factor
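
A compact sketch of the grouping-GA idea the entry refers to: chromosomes encode whole groups rather than item positions, and crossover injects a group from one parent into the other, followed by a repair step. The operators in the paper are more elaborate; this is illustrative only.

```python
# Grouping crossover: inherit one whole group from parent B, then remove its
# items from parent A's groups so the child remains a valid partition.
import random

def grouping_crossover(parent_a, parent_b):
    """Parents are lists of sets partitioning the same items."""
    injected = random.choice(parent_b)              # inherit one whole group
    child = [g - injected for g in parent_a]        # strip duplicated items
    child = [g for g in child if g] + [set(injected)]
    return child

random.seed(3)
a = [{1, 2}, {3, 4}, {5, 6}]
b = [{1, 4}, {2, 5}, {3, 6}]
print(grouping_crossover(a, b))
```
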
  • Lara Khansa · Tabitha James · Deborah F. Cook
    01/2010; 1(4):1-21. DOI:10.4018/jep.2010100101
  • Tabitha James · César Rego · Fred Glover
    ABSTRACT: In this study, we introduce a cooperative parallel tabu search algorithm (CPTS) for the quadratic assignment problem (QAP). The QAP is an NP-hard combinatorial optimization problem that is widely acknowledged to be computationally demanding. These characteristics make the QAP an ideal candidate for parallel solution techniques. CPTS is a cooperative parallel algorithm in which the processors exchange information throughout the run of the algorithm, as opposed to independent concurrent search strategies that aggregate data only at the end of execution. CPTS accomplishes this cooperation by maintaining a global reference set that uses the information exchange to promote both intensification and strategic diversification in a parallel environment. This study demonstrates the benefits that may be obtained from parallel computing in terms of solution quality, computational time and algorithmic flexibility. A set of 41 test problems obtained from QAPLIB was used to analyze the quality of the CPTS algorithm. Additionally, we report results for 60 difficult new test instances. CPTS is shown to provide good solution quality for all problems in acceptable computational times. Out of the 41 test instances obtained from QAPLIB, CPTS meets or exceeds the average solution quality of many of the best sequential and parallel approaches from the literature on all but six problems, whereas no other leading method exhibits uniformly superior performance.
    European Journal of Operational Research 06/2009; 195(3):810-826. DOI:10.1016/j.ejor.2007.06.061 · 1.84 Impact Factor
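
A sequential toy simulation of the cooperative scheme, assuming a simple swap-based improvement step and a toy objective: each "processor" improves a solution drawn from a shared reference set, and improved solutions flow back into the set. CPTS itself runs these workers as true parallel processes with far richer intensification and diversification strategies.

```python
# Cooperative rounds over a shared reference set (parallel workers simulated
# sequentially). Toy objective and improvement step -- illustrative only.
import random

def local_improve(perm, cost, tries=50):
    best = perm[:]
    for _ in range(tries):
        i, j = random.sample(range(len(best)), 2)
        cand = best[:]
        cand[i], cand[j] = cand[j], cand[i]       # swap move
        if cost(cand) < cost(best):
            best = cand
    return best

def cooperative_rounds(cost, n, workers=4, rounds=5, ref_size=6):
    ref = [random.sample(range(n), n) for _ in range(ref_size)]
    for _ in range(rounds):
        results = [local_improve(random.choice(ref), cost) for _ in range(workers)]
        ref = sorted(ref + results, key=cost)[:ref_size]   # keep best, drop worst
    return ref[0]

random.seed(7)
target = list(range(10))
cost = lambda p: sum(abs(a - b) for a, b in zip(p, target))  # toy objective
print(cooperative_rounds(cost, 10))
```
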
  • Tabitha James · César Rego · Fred Glover
    ABSTRACT: The quadratic assignment problem (QAP) is a well-known combinatorial optimization problem with a wide variety of applications, prominently including the facility location problem. The acknowledged difficulty of the QAP has made it the focus of many metaheuristic solution approaches. In this paper, we show the benefit of utilizing strategic diversification within the tabu search (TS) framework for the QAP, by incorporating several diversification and multistart TS variants. Computational results for an extensive and challenging set of QAP benchmark test problems demonstrate the ability of our TS variants to improve on a classic TS approach that is one of the principal and most extensively used methods for the QAP. We also show that our new procedures are highly competitive with the best recently introduced methods from the literature, including more complex hybrid approaches that incorporate the classic TS method as a subroutine.
    IEEE Transactions on Systems Man and Cybernetics - Part A Systems and Humans 06/2009; 39(3):579-596. DOI:10.1109/TSMCA.2009.2014556 · 2.18 Impact Factor
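
A hedged sketch of tabu search on the QAP with a swap neighborhood and a restart-style diversification trigger. The tenure, stall limit, and full recomputation of move costs are illustrative simplifications (efficient implementations use incremental delta evaluation), not the paper's procedure.

```python
# Tabu search for the QAP: best non-tabu swap each iteration; recently used
# swaps are tabu for `tenure` iterations; a random restart diversifies the
# search when no improvement is seen for `stall_limit` iterations.
import numpy as np

def qap_cost(p, F, D):
    return (F * D[np.ix_(p, p)]).sum()

def tabu_qap(F, D, iters=300, tenure=7, stall_limit=60, seed=5):
    rng = np.random.default_rng(seed)
    n = len(F)
    p = list(rng.permutation(n))
    best, best_c = p[:], qap_cost(p, F, D)
    tabu, stall = {}, 0
    for t in range(iters):
        moves = [(i, j) for i in range(n) for j in range(i + 1, n)
                 if tabu.get((i, j), -1) < t]       # non-tabu swaps only
        def after(m):
            q = p[:]; q[m[0]], q[m[1]] = q[m[1]], q[m[0]]; return q
        i, j = min(moves, key=lambda m: qap_cost(after(m), F, D))
        p = after((i, j)); tabu[(i, j)] = t + tenure
        c = qap_cost(p, F, D)
        if c < best_c:
            best, best_c, stall = p[:], c, 0
        else:
            stall += 1
        if stall > stall_limit:                     # diversify via restart
            p = list(rng.permutation(n)); tabu, stall = {}, 0
    return best, best_c

rng = np.random.default_rng(2)
F = rng.integers(0, 10, (6, 6)); D = rng.integers(0, 10, (6, 6))
print(tabu_qap(F, D))
```
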
  • Robert Gilles · Tabitha L. James · Reza Barkhi · Dimitrios Diamantaras
    ABSTRACT: Social networks depict complex systems as graph-theoretic models. The study of the formation of such systems (or networks) and the subsequent analysis of the network structures are of great interest. In the business disciplines, the ability to model and simulate a system of individuals interacting to achieve a certain socio-economic goal holds much promise for gathering beneficial insights. We use case-based decision theory to formulate a customizable model of information gathering in a social network. In this model, the agents in the network have limited awareness of the social network in which they operate and of the fixed, underlying payoff structure. Agents collect payoff information from neighbors within the prevailing social network, and they base their networking decisions on this information. Along with the introduction of the decision-theoretic model, we develop software to simulate the formation of such networks in a customizable context, to examine how the network structure can be influenced by the parameters that define social relationships. We present computational experiments that illustrate the growth and stability of the simulated social networks ensuing from the proposed model. Together, the model and the simulation capability allow for the customizable generation of social networks to be used as aids in studying various socio-economic phenomena.
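
A toy sketch of the limited-awareness idea: agents see payoff reports only from their current neighborhood and rewire toward the best-reported agent. This is an illustrative stand-in, not the paper's case-based decision-theoretic model.

```python
# Payoff-driven network formation: each step, a random agent surveys its
# one- and two-hop neighborhood and links to the agent with the best payoff.
import random

random.seed(11)
n = 12
payoff = [random.random() for _ in range(n)]          # fixed underlying payoffs
links = {i: {(i + 1) % n} for i in range(n)}          # start on a ring

for step in range(50):
    i = random.randrange(n)
    known = (links[i] | {j for nb in links[i] for j in links[nb]}) - {i}
    if known:
        best = max(known, key=lambda j: payoff[j])    # act on gathered reports
        links[i].add(best); links[best].add(i)

degrees = sorted(len(v) for v in links.values())
print("degree distribution:", degrees)
```
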
  • Journal of Organizational and End User Computing 09/2008; 18(3):1-24. DOI:10.4018/joeuc.2006070101 · 0.42 Impact Factor
  • 01/2008; 2(1):42-53. DOI:10.4018/jisp.2008010103
  • Tabitha L. James · Evelyn C. Brown · Kellie B. Keeling
    ABSTRACT: The machine-part cell formation problem consists of constructing a set of machine cells and their corresponding product families with the objective of minimizing the inter-cell movement of the products while maximizing machine utilization. This paper presents a hybrid grouping genetic algorithm for the cell formation problem that combines a local search with a standard grouping genetic algorithm to form machine-part cells. Computational results using the grouping efficacy measure for a set of cell formation problems from the literature are presented. The hybrid grouping genetic algorithm is shown to outperform the standard grouping genetic algorithm by exceeding the solution quality on all test problems and by reducing the variability among the solutions found. The algorithm developed performs well on all test problems, exceeding or matching the solution quality of the results presented in previous literature for most problems.
    Computers & Operations Research 07/2007; 34(7):2059-2079. DOI:10.1016/j.cor.2005.08.010 · 1.72 Impact Factor
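
For reference, the grouping efficacy measure mentioned above is commonly computed as (e - e_out) / (e + e_void), where e is the total number of 1s in the machine-part incidence matrix, e_out counts 1s that fall outside the diagonal blocks, and e_void counts 0s inside them. A small sketch with an illustrative matrix and cell assignment:

```python
# Grouping efficacy of a machine-part cell formation:
# (1s inside cells) / (total 1s + voids inside cells).
import numpy as np

def grouping_efficacy(A, machine_cell, part_cell):
    e = A.sum()                                       # total 1s in the matrix
    inside = sum(A[m, p] for m in range(A.shape[0])
                 for p in range(A.shape[1])
                 if machine_cell[m] == part_cell[p])
    voids = sum(1 - A[m, p] for m in range(A.shape[0])
                for p in range(A.shape[1])
                if machine_cell[m] == part_cell[p])
    return (e - (e - inside)) / (e + voids)           # = inside / (e + voids)

A = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 1, 1]])
print(grouping_efficacy(A, [0, 0, 1], [0, 0, 1, 1]))  # perfect blocks -> 1.0
```
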
  • Tabitha James · Mark Vroblefski · Quinton Nottingham
    ABSTRACT: With the growing use of mobile communication devices, the management of such technologies is of increasing importance. The registration area planning (RAP) problem examines the grouping of cells comprising a personal communication services (PCS) network into contiguous blocks in an effort to reduce the cost of managing the location of the devices operating on the network, in terms of bandwidth. This study introduces a hybridized grouping genetic algorithm (HGGA) to obtain cell formations for the RAP problem. The hybridization is accomplished by adding a tabu search-based improvement operator to a traditional grouping genetic algorithm (GGA). Results indicate that significant performance gains can be realized by hybridizing the algorithm, especially for larger problem instances. The HGGA is shown to consistently outperform the traditional GGA on problems of size greater than 19 cells.
    Computer Communications 07/2007; 30(10):2180-2190. DOI:10.1016/j.comcom.2007.04.018 · 1.35 Impact Factor
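
A hedged sketch of a typical RAP cost evaluation, assuming the common two-term model: location-update cost for handoffs crossing registration area boundaries, plus paging cost proportional to each area's size. The weights and data are illustrative, the exact cost model in the paper may differ, and the contiguity constraint is not enforced here.

```python
# Registration area planning objective (common two-term model):
# update cost for inter-area handoffs + paging cost scaled by area size.
import numpy as np

def rap_cost(assign, handoff, arrivals, w_update=10.0, w_page=1.0):
    n = len(assign)
    update = sum(handoff[i, j] for i in range(n) for j in range(n)
                 if assign[i] != assign[j])            # boundary crossings
    ra_size = {ra: sum(1 for a in assign if a == ra) for ra in set(assign)}
    paging = sum(arrivals[i] * ra_size[assign[i]] for i in range(n))
    return w_update * update + w_page * paging

rng = np.random.default_rng(4)
handoff = rng.integers(0, 5, (6, 6)); arrivals = rng.integers(1, 10, 6)
print(rap_cost([0, 0, 0, 1, 1, 1], handoff, arrivals))
```
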
  • Kellie B. Keeling · Evelyn C. Brown · Tabitha L. James
    ABSTRACT: Over the past 25 years, the machine-part cell formation problem has been the subject of numerous studies. Researchers have applied various methodologies to the problem in an effort to determine optimal clusterings of machines and optimal groupings of parts into families. The quality of these machine and part groupings has been evaluated using various objective functions, including grouping efficacy, grouping index, grouping capability index, and doubly weighted grouping efficiency, among others. In this study, we investigate how appropriate these grouping quality measures are for determining cell formations that optimize factory performance. Through the application of a grouping genetic algorithm, we determine machine/part cell formations for several problems from the literature. These cell formations are then simulated to determine their impact on various factory measures, such as flow time, wait time, throughput, and machine utilization. Results indicate that a “more efficient” machine/part cell formation does not always lead to significant changes or improvements in factory measures over a “less efficient” cell formation. In other words, although researchers are working to optimize cell formations using efficiency measures, cells formed this way do not always demonstrate optimized factory measures.
    Engineering Applications of Artificial Intelligence 02/2007; 20(1):63-78. DOI:10.1016/j.engappai.2006.04.001 · 1.96 Impact Factor
  • Tabitha L. James · Reza Barkhi · John D. Johnson
    ABSTRACT: Many problems in the operations research field cannot be solved to optimality within reasonable amounts of time with current computational resources. In order to find acceptable solutions to these computationally demanding problems, heuristic methods such as genetic algorithms are often developed. Parallel computing provides alternative design options for heuristic algorithms, as well as the opportunity to obtain performance benefits in both the computational time and the solution quality of these heuristics. Heuristic algorithms may be designed to benefit from parallelism by taking advantage of the parallel architecture. This study investigates the performance of the same global parallel genetic algorithm on two popular parallel architectures to examine the interaction of parallel platform choice and genetic algorithm design. The computational results of the study illustrate the impact of platform choice on parallel heuristic methods. The paper develops computational experiments to compare algorithm development on a shared memory architecture and a distributed memory architecture. The results suggest that the performance of a parallel heuristic can be increased by considering the desired outcome and tailoring the development of the parallel heuristic to a specific platform based on the hardware and software characteristics of that platform.
    Engineering Applications of Artificial Intelligence 12/2006; 19(8):843-856. DOI:10.1016/j.engappai.2006.02.004 · 1.96 Impact Factor
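
A minimal sketch of the global (master-slave) parallel GA design discussed above, using Python's multiprocessing to farm out the fitness evaluation step while a single master maintains one population. The toy one-max objective and GA parameters are illustrative assumptions.

```python
# Global parallel GA: one population; fitness evaluation is distributed to
# worker processes each generation. Toy one-max objective -- illustrative only.
import random
from multiprocessing import Pool

def fitness(bits):                     # toy objective: count of ones
    return sum(bits)

def evolve(pop_size=40, length=30, gens=20, seed=9):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    with Pool() as pool:
        for _ in range(gens):
            scores = pool.map(fitness, pop)          # parallel evaluation step
            ranked = [p for _, p in sorted(zip(scores, pop), reverse=True)]
            parents = ranked[: pop_size // 2]        # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, length)
                child = a[:cut] + b[cut:]            # one-point crossover
                if random.random() < 0.1:            # bit-flip mutation
                    k = random.randrange(length); child[k] ^= 1
                children.append(child)
            pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    print(sum(evolve()))               # best one-count found
```
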