SSRN Electronic Journal

Online ISSN: 1556-5068
Publications
Article
Medicare beneficiaries have a distinctive combination of needs and circumstances that make providing appropriate health care services for them a greater challenge than for the working-age population. Medicare beneficiaries generally have poorer health status and lower literacy levels, and suffer from more chronic conditions, than the general population. More than half have arthritis and high blood pressure. Many Medicare beneficiaries are not totally independent, with forty-five percent needing help with at least one of the key activities of daily living, such as eating, bathing, shopping, using the telephone, or balancing a checkbook. Medicare beneficiaries also have an increased likelihood of financial insecurity; nearly sixty percent have annual incomes below $20,000, compared to only fourteen percent of the working-age population. Given the complex needs and circumstances of many Medicare beneficiaries, managing their health and navigating the health care system can be challenging. Choosing physicians, understanding doctors' orders, deciding among complex treatment options, or choosing a drug discount card are more difficult for them than for the general population. Decreased mobility hampers some beneficiaries' ability to travel to health care facilities. Others do not use the telephone easily because they cannot hear or see well. Many do not have computers or know how to use the internet to find valuable information. For all these reasons, many beneficiaries need appropriate services and supports to ensure that Medicare is meeting their needs.
 
Article
Organized provider networks have been developed as a method of achieving efficiencies in the delivery of health care, and to reduce problems such as limited access to specialty and tertiary care, fragmentation and duplication of services, low-quality care and poor patient outcomes. This policy brief examines the experience of ten California counties participating in the Health Care Coverage Initiative (HCCI), a demonstration project to expand coverage to low-income and indigent residents, in overcoming these barriers and creating provider networks based on existing safety-net systems. These interim findings should provide valuable information for future efforts to develop effective networks based on safety-net providers. Specifically, the brief examines the structure of the networks built, how they were implemented, the types of services and reimbursements offered, the health information technologies employed in the effort, as well as plans to further enhance the networks in the future. The counties involved in HCCI are: Alameda, Contra Costa, Kern, Los Angeles, Orange, San Diego, San Francisco, San Mateo, Santa Clara and Ventura.
 
Article
Choice can be an extraordinary benefit or an immense burden. In some contexts, people choose not to choose, or would do so if they were asked. In part because of limitations of "bandwidth," and in part because of awareness of their own lack of information and potential biases, people sometimes want other people to choose for them. For example, many people prefer not to make choices about their health or retirement plans; they want to delegate those choices to a private or public institution that they trust (and may well be willing to pay a considerable amount to those who are willing to accept such delegations). This point suggests that however well accepted, the line between active choosing and paternalism is often illusory. When private or public institutions override people's desire not to choose and insist on active choosing, they may well be behaving paternalistically, through a form of choice-requiring paternalism. Active choosing can be seen as a form of libertarian paternalism, and a frequently attractive one, if people are permitted to opt out of choosing in favor of a default (and in that sense permitted not to choose); it is a form of nonlibertarian paternalism insofar as people are required to choose. For both ordinary people and private or public institutions, the ultimate judgment in favor of active choosing, or in favor of choosing not to choose, depends largely on the costs of decisions and the costs of errors.
 
Article
Medicare and Communities of Color originates from topics in The Role of Private Health Plans in Medicare: Lessons from the Past, Looking to the Future, the final report of the Study Panel on Medicare and Markets convened by the National Academy of Social Insurance. The brief also updates information from the Kaiser Family Foundation's 1999 brief, Faces of Medicare: Medicare and Minority Americans. Medicare and Communities of Color is a factual presentation highlighting principal issues in Medicare's interaction with people of color. A National Academy of Social Insurance study panel is examining how Medicare can be a leader in reducing racial and ethnic health disparities among its beneficiaries and the rest of the health system.
 
Article
This Medicare brief highlights the findings and recommendations of the National Academy of Social Insurance (NASI) Medicare study panels. These reports grapple with the most important policy challenges facing Medicare and its future, including financing, delivery of health services, and the administration of Medicare. Although each study panel had a specific charge, unifying themes emerge. This brief summarizes the findings of each study panel and identifies the common themes found among them.
 
Article
The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA, Public Law 108-173) provides about $14 billion over 10 years in new federal funding to encourage private plans to participate in Medicare Advantage. Historically, private plan participation in Medicare has fluctuated. Continuing changes in Medicare's funding policies and program requirements have made it difficult for private health plans to meet conflicting expectations. Over time, Congress established multiple goals for private plans: containing costs, improving benefits, and increasing plan participation and beneficiary enrollment in an effort to increase health care choices. Proponents of private plans tout the recent funding increase as the needed jumpstart to the program; skeptics claim that these new payments are excessive. Despite polarizing views about their potential, the history of Medicare's private health plans indicates that rising health care costs and constraints in Medicare payments result in private health plan withdrawal from Medicare. Early signs indicate that private health plans are interested in participating in Medicare Advantage, but only time will tell whether Congress will continue supporting higher payment levels to plans, and whether the plans' interest will be sustained. It also remains to be seen if the competitive bidding model adopted by the MMA will provide beneficiaries with more benefits at lower premiums.
 
Conference Paper
The development of a distributed testing environment would have to comply with recent Department of Defense (DoD) mandates requiring that the DoD Architecture Framework (DoDAF) be adopted to express high-level system and operational requirements and architectures. Unfortunately, DoDAF and DoD net-centric mandates pose significant challenges to testing and evaluation, since DoDAF specifications must be evaluated to see whether they meet requirements and objectives, yet they are not expressed in a form that is amenable to such evaluation. DoDAF is the basis for integrated architectures and provides broad levels of specification related to operational, system, and technical views. In our earlier work, we described an approach to support specification of DoDAF architectures within a development environment based on DEVS (discrete event system specification) for semi-automated construction of the needed simulation models. The result is an enhanced system lifecycle development process that includes both development and testing in an integral manner. We also developed automated model generation using XML, which paves the way for operational views (OVs) to become service-providing components in the Web services architecture. In this paper we present the semantic structure for one of the operational view documents, OV-6a, that would aid the development of these semi-automated models. We describe how OV-6a can be structured in a more generalized meta-model framework such that every rule is reducible to meaningful code, constructed automatically through natural language processing (NLP) methods, and further reduced to DEVS-based models. The paper also presents an overview of the lifecycle development methodology for these enterprise architectures and how a common enterprise domain model can be used in customized business/domain-specific rules and policy structures.
 
Cost savings for the roll-out of a UMTS900 instead of a UMTS2100 in the brownfield scenario  
Conference Paper
The roll-out of UMTS/HSPA networks in the 900 MHz band offers the possibility of bringing 3G and mobile broadband services to new areas with fewer base station sites and of improving indoor coverage in existing UMTS coverage areas. This paper presents the benefits, in terms of savings in site numbers and in network-related costs, provided by the roll-out of a mixed-frequency UMTS network (a network with carriers operating at 900 and 2100 MHz) instead of a single UMTS2100 network. These results are presented for dense urban, urban, suburban and rural areas and for three demand scenarios. Although the largest reductions in site numbers are achieved in the lower demand scenarios, savings higher than 30% are found even in the higher demand scenarios. Regarding cost savings, higher savings are obtained in the greenfield roll-out scenario. Lower cost savings are achieved in the brownfield scenario, but the introduction of a UMTS900 layer still results in a more cost-efficient solution than maintaining the single-frequency UMTS2100 network.
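As a back-of-the-envelope illustration of why a 900 MHz layer needs fewer coverage sites, the sketch below compares coverage-limited cell radii at 900 and 2100 MHz under a generic path-loss model with a 20*log10(f) frequency term and a propagation exponent n; both the model form and n = 3.5 are assumptions for illustration, not the dimensioning method used in the paper.

    def relative_site_count(f_low_mhz: float = 900.0, f_high_mhz: float = 2100.0, n: float = 3.5) -> float:
        # Equal link budget: PL0 + 20*log10(f) + 10*n*log10(r) is the same at both
        # frequencies, so the low-band cell radius is larger by (f_high/f_low)^(2/n),
        # and the coverage-limited site count shrinks roughly with 1/radius^2.
        radius_ratio = (f_high_mhz / f_low_mhz) ** (2.0 / n)
        return 1.0 / radius_ratio ** 2

    print(f"UMTS900 needs roughly {relative_site_count():.0%} of the UMTS2100 site count")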
 
Conference Paper
Software cost and time estimation is the process of estimating the cost and time required to develop a software system. Software cost and time estimation supports the planning and tracking of software projects. Effectively controlling the expensive investment of software development is one of the important issues in software project management. Estimating software development cost with high precision remains a great challenge for project managers, yet it is essential because it enables sound financial and strategic planning. Software cost estimation refers to predictions of the likely amount of effort, time, and staffing levels required to build a software system. A very helpful form of cost estimation is the one made at an early stage of a project, when the costing of the project is proposed for approval. However, estimates at the early stages of development are the most difficult to obtain. In this paper a novel Constructive Cost Model (COCOMO) based on a soft computing approach is proposed for software cost estimation. This model carries some of the desirable features of the neural networks approach, such as learning ability and good interpretability, while maintaining the merits of the COCOMO model. Unlike the standard neural networks approach, the proposed model can be interpreted and validated by experts, and has good generalisation capability. The model deals effectively with imprecise and uncertain input and enhances the reliability of software cost estimates. From the experimental results, it was concluded that the proposed model improves the accuracy of cost estimation and produces estimates that are very close to actual costs.
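For readers unfamiliar with the baseline model being extended, the sketch below shows the classic intermediate COCOMO effort equation (Effort = a * KLOC^b * EAF) with Boehm's published nominal coefficients; the paper's soft-computing calibration of these parameters is not reproduced here.

    # Baseline intermediate COCOMO effort equation; coefficients are Boehm's
    # nominal values, not the calibrated parameters learned by the proposed model.
    COCOMO_MODES = {
        "organic":      (3.2, 1.05),
        "semidetached": (3.0, 1.12),
        "embedded":     (2.8, 1.20),
    }

    def cocomo_effort(kloc: float, mode: str = "organic", eaf: float = 1.0) -> float:
        """Effort in person-months: a * KLOC^b * EAF (EAF = product of cost drivers)."""
        a, b = COCOMO_MODES[mode]
        return a * (kloc ** b) * eaf

    # Example: a 40 KLOC semi-detached project with a slightly unfavourable EAF.
    print(round(cocomo_effort(40, "semidetached", eaf=1.15), 1), "person-months")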
 
A snapshot of the Collaboratorium.
Conference Paper
While argumentation tools represent a promising avenue for exploiting the Internet's vast potential for enabling collective intelligence, they have to date been applied almost exclusively to small-scale, usually centrally facilitated, groups. This paper explores some of the issues and design options involved in supporting large-scale on-line argumentation.
 
Conference Paper
International outsourcing of services has come to occupy an increasingly important part of international trade. Analytically, services outsourcing is the export of services by a country to the outsourcing nation. Given this, the pattern of trade in these services is decided in line with country-specific comparative advantage. Since services are typically intensive in skilled labour and educated manpower, what matters for a country's comparative advantage in services is its resource base of skilled and educated manpower. This is a cross-country study that uses the intuitive logic of the Heckscher-Ohlin model to determine comparative advantage for a sample of developed countries, such as the US, that essentially constitute the outsourcers, and developing nations, such as India and China, that carry out the outsourced work. Different definitions of human capital are compared to identify comparative advantage. A key conclusion is that for services outsourcing, the definition of human capital needs to be restricted to secondary and particularly tertiary students rather than literacy. This is further validated by the significance of the number of tertiary students in cross-country equations estimating business service exports. Secondly, what matters for comparative advantage is the absolute size of the human capital base rather than its size in percentage terms.
 
Conference Paper
This paper reports the effort of using an agent-based mix-game model to predict financial time series. It introduces a simple genetic algorithm into the prediction methodology and gives an example of its application to forecasting the Shanghai Index. The results show that this prediction methodology is effective and that the agent-based mix-game model is a potentially good model for predicting time series of financial markets.
 
Conference Paper
A great deal of the psychological literature shows that individuals always appear to exhibit overconfidence in their economic lives. Studying the influence of an agent's overconfidence on the principal-agent relationship therefore has profound theoretical and practical significance. Drawing for reference on the representations of overconfidence provided by Daniel, Hirshleifer, Subrahmanyam, Gervais, Odean, and others, this paper studies the principal-agent relationship under the condition that the agent is overconfident, and the mechanism through which the agent's overconfidence affects that relationship, by setting up an appropriate mathematical model. The results show that when the agent's active wage and the principal's monitoring cost are not zero, the optimal level of effort exerted by the agent in equilibrium always increases in the agent's overconfidence level and the principal's optimal intensity of monitoring in equilibrium always decreases in the agent's overconfidence level; when the agent's active wage is zero, the agent's overconfidence does not affect the principal-agent relationship. In addition, the optimal level of effort exerted by the agent and the principal's optimal intensity of monitoring in equilibrium always decrease in the principal's monitoring cost; the former increases in the agent's fixed wage and active wage, while the latter also increases in the agent's fixed wage but does not always increase in the agent's active wage.
 
Conference Paper
This paper is related to two empirical research projects: one was an investigation of users of a university distance learning service; the second was an ethnographic study aimed at identifying the pedagogical models underpinning new professional degree courses (2004-2006). In education and training, the relationship which users of ICTE have with such technologies can be studied in part as a relationship to specific knowledge which, when put into other contexts, opens up other, much wider, theoretical questions at three differing levels: a generalized view of ICTE; the cultural experience of ICTE; and the systematization of ICTE. To each level, we have attributed a core concept: 'gathering' (butinage), 'poaching' (braconnage), and 'odd-jobbing' (bricolage). In the generalized view of ICTE, 'gathering' relates dialectically to notions of navigation. Within the cultural experience of ICTE, 'poaching' relates dialectically to notions of programming, and in the systematization of ICTE, 'odd-jobbing' relates dialectically to notions of systems development.
 
Panel c) zooms in to show that spikes and antispikes seem to follow similar dynamics, but in reverse directions: whereas spikes appear only during demand crests, antispikes appear only during demand troughs, and only occasionally. The overall dynamics seems to switch occasionally from a normal regime to two other (opposite) spiky regimes.
Conference Paper
Two electricity price models are presented in the frame of switching and threshold autoregressive exogenous (TARX) models, both in continuous time and in discrete time. The first model is based on the biologically inspired McKean model for spiking neurons; the second is its extension, crafted in such a way as to easily display a spike-and-antispike phenomenology, showing random spikes only in price crest phases (daytime) and random antispikes only during price trough phases. The spiking behavior can be explained by the presence of a mathematical threshold mechanism that can be related to power grid congestion.
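As a toy illustration of the kind of threshold-driven spike/antispike behaviour described here, the sketch below simulates an AR(1) log-price driven by a stylized daily demand cycle, with occasional jumps added only when demand crosses an upper or lower threshold; the structure and all parameter values are illustrative assumptions, not the estimated McKean/TARX model from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    hours = 24 * 21                                      # three weeks of hourly prices
    demand = np.sin(2 * np.pi * np.arange(hours) / 24)   # stylized daily demand cycle
    x = np.zeros(hours)                                  # log-price deviation from trend
    phi, sigma = 0.8, 0.05                               # AR(1) persistence and noise level
    up_thr, dn_thr = 0.9, -0.9                           # thresholds on the demand cycle

    for t in range(1, hours):
        x[t] = phi * x[t - 1] + 0.2 * demand[t] + sigma * rng.standard_normal()
        if demand[t] > up_thr and rng.random() < 0.10:
            x[t] += rng.exponential(1.5)                 # occasional spike during demand crests
        elif demand[t] < dn_thr and rng.random() < 0.05:
            x[t] -= rng.exponential(1.5)                 # occasional antispike during troughs

    prices = np.exp(x)                                   # back to (arbitrary-unit) price levels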
 
Conference Paper
Needs and expectations of customers in the pre- and after-sales stages of the e-commerce purchase process in the luxury product field are not well known or identified. We were interested in discovering the opinions of customers concerning the role of the internet in the pre- and after-sales stages of the purchasing process in this industry. As we are just at the early stages of the e-commerce era in the luxury sector, we chose to focus on blog content. After-sales service on the internet in the luxury sector is considered with circumspection: practitioners feel that consumers are still very much attached to the 'physical' experience. On the contrary, we assumed that Netsurfers belonging to net communities are also consumers of luxury goods and services. Following the netnography methodology [1,4], we analyzed Netsurfers' comments about pre- and after-sales services. We identified and selected blogs specific to the Web 2.0 generation; these blogs discuss luxury watches, cars, travel and art objects. The results allowed us to identify three categories of service needs: 1) the need for service in broad terms, especially service that should be linked to a product; 2) the need for a specific or very specific service; and 3) specific complaints about service experiences. We discuss how these insights can be fully integrated into a broader e-commerce strategy.
 
A simple model of neural networks
The proposed COCOMO II model based on neural networks 
Conference Paper
The precision of software project estimation, such as project cost estimation, project quality estimation and project risk analysis, is an important issue in software project management. The ability to accurately estimate software development costs is required by project managers in planning and conducting software development activities. Since software effort drivers are vague and uncertain, software effort estimates are unreliable, especially in the early stages of the development life cycle; estimates at the beginning are often the least accurate, because very little detail is known about the project and the product at that point. The need for reliable and accurate cost predictions in software engineering is an ongoing challenge for software engineers. In this paper a novel neural-network Constructive Cost Model (COCOMO) is proposed for software cost estimation. This model carries some of the desirable features of the neural networks approach, such as learning ability and good interpretability, while maintaining the merits of the COCOMO model. Unlike the standard neural networks approach, the proposed model can be interpreted and validated by experts, and has good generalisation capability. The model deals effectively with imprecise and uncertain input and enhances the reliability of software cost estimates. From the experimental results, it was concluded that the proposed model improves the accuracy of cost estimation and produces estimates that are very close to actual costs.
 
A generated taxonomy for "Blood". The fat links are correct with respect to the MeSH benchmark; semi-fat links are in a grand- or great-grandchild relation in MeSH.
The figure shows the F0.5-measure (solid) and the precision (dashed lines) for three MeSH benchmarks. Higher entropy of similarities expresses lower confidence in a taxonomy link. Not filtering by entropy at all yields precision and F-measure equal to the rightmost data point of each curve, indicated by horizontal dotted lines. The figure shows that high-entropy links are indeed often wrong, and precision decreases for all benchmark sets. Therefore, by filtering these low-confidence links, the algorithm improves in terms of precision while maintaining or slightly improving the F-measure. Any threshold above 0.7 increases precision without worsening the F-measure.
Conference Paper
We compare a family of algorithms for the automatic generation of taxonomies by adapting the Heymann algorithm in various ways. The core algorithm determines the generality of terms and iteratively inserts them into a growing taxonomy. Variants of the algorithm are created by altering the way, and the frequency with which, the generality of terms is calculated. We analyse the performance and the complexity of the variants, combined with a systematic threshold evaluation, on seven manually created benchmark sets. As a result, betweenness centrality calculated on unweighted similarity graphs often performs best but requires threshold fine-tuning and is computationally more expensive than closeness centrality. Finally, we show how an entropy-based filter can lead to more precise taxonomies.
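A minimal sketch of the Heymann-style core loop described above is given below, assuming a precomputed term-term similarity matrix sim and using closeness centrality on a thresholded, unweighted similarity graph as the generality measure; the function name, thresholds and data handling are illustrative assumptions, not the authors' implementation.

    import networkx as nx

    def build_taxonomy(terms, sim, edge_thr=0.3, attach_thr=0.1, root="ROOT"):
        # 1) Generality: closeness centrality in the unweighted similarity graph.
        g = nx.Graph()
        g.add_nodes_from(range(len(terms)))
        for i in range(len(terms)):
            for j in range(i + 1, len(terms)):
                if sim[i][j] >= edge_thr:
                    g.add_edge(i, j)
        generality = nx.closeness_centrality(g)

        # 2) Insert terms from most to least general, each under its most similar
        #    already-placed term (or under the root if similarity is too low).
        taxonomy = nx.DiGraph()
        taxonomy.add_node(root)
        placed = []
        for i in sorted(range(len(terms)), key=lambda k: -generality[k]):
            parent = root
            if placed:
                best = max(placed, key=lambda k: sim[i][k])
                if sim[i][best] >= attach_thr:
                    parent = terms[best]
            taxonomy.add_edge(parent, terms[i])
            placed.append(i)
        return taxonomy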
 
Conference Paper
Modern project management is a well-understood discipline that can produce predictable, repeatable results. The methodologies of modern project management are highly analytic, usually requiring automated tools to support them on large projects. Like most other disciplines, it is learned through both practice and study. Project management encompasses many different skills, such as understanding the interdependencies among people, technologies, budgets, and expectations; planning the project to maximize productivity; motivating others to execute the plan; analyzing the actual results; and reworking and tuning the plan to deal with the realities of what really happens as the project is executed. In order to manage a project and bring it to a successful completion, its project manager must have a complete understanding of the methodologies being used for the management of different parts of the project. Managers tend to prefer a specific project methodology, and they resist, or face difficulties with, the opportunity to manage another project under a different methodology because they do not know how much commonality exists between the preferred methodology and the newly required one.
 
Conference Paper
China provides an excellent empirical case study of the efficiency of information transmission in the capital market. Our paper extends the existing investigation with much more recent data and tests the efficiency of information transmission using individual, sub-sampling and wild-bootstrap variance ratio methods. The empirical results show that gold prices reflect all past information efficiently and that no one can earn abnormal returns continuously. The Chinese gold spot price follows a random walk, and the gold spot market is weak-form efficient.
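The sketch below shows the basic homoskedasticity-consistent Lo-MacKinlay variance ratio statistic underlying this family of tests; it is a simplified illustration of the idea (VR(q) close to 1 under a random walk), not the individual, sub-sampling or wild-bootstrap procedures actually applied in the paper, and the data are simulated.

    import numpy as np

    def variance_ratio(log_prices: np.ndarray, q: int):
        """Simplified Lo-MacKinlay VR(q) and its z-statistic under i.i.d. increments."""
        r = np.diff(log_prices)                    # 1-period log returns
        T = len(r)
        mu = r.mean()
        var_1 = np.sum((r - mu) ** 2) / T
        rq = log_prices[q:] - log_prices[:-q]      # overlapping q-period returns
        var_q = np.sum((rq - q * mu) ** 2) / (T * q)
        vr = var_q / var_1
        se = np.sqrt(2 * (2 * q - 1) * (q - 1) / (3 * q * T))
        return vr, (vr - 1) / se

    # Example on a simulated random walk: VR should be close to 1 and |z| small.
    log_prices = np.cumsum(np.random.default_rng(1).normal(0.0, 0.01, 2000))
    print(variance_ratio(log_prices, q=5))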
 
Difference in Means Tests
Startup Size and Liquidation Event Regressions
Conference Paper
Prior work on the commercialization of innovation is motivated by competitive dynamics between startup and incumbent firms and has looked at the determinants of innovator commercialization mode. We contribute to the literature by examining variation in performance resulting from the choice of whether to innovate technologically in new ventures. The institutional and business environment determines which strategies lead to higher performance for start-up innovators. In contrast to the factors shown to determine commercialization mode, we show that team characteristics and the economic environment play a stronger role in predicting an innovator's entrepreneurial firm performance. Using unique data from a novel survey of entrepreneurial firms founded over five decades and across diverse industries, we show the conditions under which technology-focused vs. functionally diverse founding teams outperform. In addition, we show that in certain environments firms with more highly original innovations, and innovating firms founded during a recession, have higher performance.
 
Conference Paper
Because of the ever-growing use of wireless applications and inflexibilities in the way spectrum is currently allocated, spectrum is becoming more and more scarce. One of the methods to overcome this is spectrum sensing. In spectrum sensing research, use case analysis is often used to determine challenges and opportunities for this technology. However, such use cases vary widely in terms of business context, so that an assessment of viability for one cannot be a basis for inferring the viability of another. Therefore, this paper proposes a number of basic parameters that set apart fundamentally different business scenarios for sensing technology, i.e. ownership, exclusivity, tradability and neutrality. Based on these parameters, a classification is outlined which sheds light on the particular business and regulatory opportunities and constraints related to spectrum sensing, and could therefore help business actors as well as regulators to fine-tune their decision-making vis-à-vis this technology.
 
Alberta power market prices; a) one year of hourly system prices p in c$, from hour 1 of Apr-07-2006 to hour 24 of Apr-07-2007, time ticks in months; b) hourly system logprices x, same scale; c) detail of b): 3 weeks of logprices x from hour 1 of Aug-14 to hour 24 of Sep-3, time ticks in week days.
Residuals statistics for synthetic (FNS) and market data; a) FNS data ACF, lags in days, upper and lower confidence bounds indicated by horizontal lines; b) FNS data Q-Q plot, the 45° broken line is a guide for the eye; c) market data ACF, lags in days, confidence bounds as in a), stems used to highlight data points; d) market data Q-Q plot, 45° line as in b).
Conference Paper
This paper describes a simple way to estimate a recently introduced nonlinear stochastic power market model based on the stochastically resonating spiking mechanism.
 
A generic dairy value chain adapted from [11, 18-20]
Sources of papers and the number of publications per source
Conference Paper
RFID technology is currently considered as a key enabler of supply chain transformation. However, very little has been written about the deployment and use of RFID in the dairy industry. Drawing on an extensive literature review and a case example, this exploratory study seeks to present current applications and issues related to RFID's adoption in the dairy industry and discuss future research directions.
 
Conference Paper
The dominant models of supply chain management have a strong product orientation, with service supply chain models only starting to appear. The technologies used in existing supply chains generally aim at improving production and logistical operations by increasing their efficiencies and by linking and integrating collaboration between individual supply chain partners. This paper focuses on service supply chains and argues that, despite their increasing importance, current information and communication technologies (ICTs) fail to fully support the specific needs of service co-production. The paper reviews current technology use from a service perspective and concludes that even the most advanced technologies that have huge potential to support services (such as RFID) remain product-oriented. It argues that such technologies should be deployed in ways that can support the service logic, and identifies the danger that unreflective ICT use may pose for the pragmatic utility of ICT in service supply chains.
 
Conference Paper
This exploratory paper applies an existing framework for the analysis of platform types in mobile services to the urban context and explores the role(s) cities may take up in the development and deployment of mobile services. It first presents the platform typology that will be used, provides some contextual information on the concept of Smart Cities, and then validates the typology against several divergent cases. The paper concludes that having control over value-adding assets, such as datasets pertaining to the city, is a more important factor for governments to take into account than solely developing a customer relationship with their inhabitants.
 
Evolution of ConSors' share price
Copeland's forecasts and valuation for Amazon (billion dollars)
Conference Paper
The author describes selected problems related to the valuation and value creation of Internet-related companies. Relationships among brand, loyalty, trust and valuation are examined. A method of valuing companies based on Customer Lifetime Value is described. Two kinds of Internet clients are considered: one-time clients and clients oriented toward creating relationships with the supplier. Based on this partition, a valuation model is proposed.
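As a reminder of the building block such customer-based valuation rests on, the sketch below computes a textbook Customer Lifetime Value as the discounted sum of expected per-period margins under a constant retention rate; the formula convention and all numbers are illustrative assumptions, not the valuation model proposed in the paper.

    def customer_lifetime_value(margin: float, retention: float, discount: float, periods: int) -> float:
        """Discounted sum of expected margins: sum_t margin * retention^t / (1 + discount)^t."""
        return sum(margin * retention ** t / (1 + discount) ** t for t in range(1, periods + 1))

    # Example: 80 EUR annual margin, 70% retention (relationship-oriented client),
    # 10% discount rate, 10-year horizon.
    relationship_clv = customer_lifetime_value(80.0, 0.70, 0.10, 10)
    one_time_clv = 80.0                 # a one-time client contributes a single margin
    print(round(relationship_clv, 2), round(one_time_clv, 2))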
 
Conference Paper
This paper introduces and analyses four potential business models for emerging mobile service platforms. For each model, a real-life case is described and briefly analyzed. It is found that 'platform leadership' strategies, focused among other things on control of core assets on the one hand and on open interfaces on the other, permeate the strategies and models introduced by the stakeholders vying for dominance in this market. In particular, these strategies and models differ strongly according to the specific set-up and control of a number of crucial 'gatekeeper roles'.
 
Defining Radio Operating Variables  
Permission and Protection Profiles  
Conference Paper
The increasingly intensive coexistence of diverse radio systems and the inability of existing institutions to resolve conflicts in a timely manner requires a change in the way operating rights are defined, assigned and enforced. We propose that operating rights should be articulated using transmission permissions and reception protections, defined probabilistically (the Three Ps). Transmission permissions should be based on resulting field strength over space and frequency, rather than radiated power at a transmitter. Reception protections should state the maximum electromagnetic energy an operator can expect from other operations. This formulation does not require a definition of harmful interference, which is delegated to operators and adjudicators should the need for it arise, and does not entail receiver standards. We also recommend ways to facilitate the adjustment of rights in the market place. The remedy (injunction vs. damages) that attaches to an operating right should be stipulated when it is issued. The regulator should separate its rule making and adjudication roles by not changing rights during dispute resolution. The regulator should leave parameter values unchanged after defining an entitlement, though it may add new parameters at license renewal. Values may be adjusted by negotiation between operators at any time, though any changes should be recorded in a public registry of entitlements. The number of parties to a negotiation should be limited by minimizing the number of rights recipients and by enabling direct bargaining between the parties through effective delegation, obviating the regulator's participation.
 
Overview of the Conflict Model. More specification of the logic is required in order to formulate the key equations; Figures 2 and 3 then unbundle the dynamics into more detailed key feedback relationships.
Six core loops for impacts of master variables  
Conference Paper
In its preface, The 9/11 Commission Report states: "We learned that the institutions charged with protecting...national security did not understand how grave this threat can be, and did not adjust their policies, plans, and practices to deter or defeat it" (2004: xvi). Given current realities and uncertainties, "better preparedness" can be achieved by identifying, controlling and managing the elusive linkages and situational factors that fuel hostilities. This paper focuses on new opportunities and capabilities provided by anticipatory technologies that help understand, measure and model the complex dynamics shaping and precipitating conflict in specific settings worldwide. We introduce a research initiative focused on linking pre- and post-conflict phases by drawing upon the power of system dynamics, augmented by new technologies for integrated information analysis, in conjunction with the development of conceptual and computational ontologies capturing the diversity, intensity, and dynamics of the conflict domain.
 
Historical stock prices for DaimlerChrysler. Top: New York; bottom: Frankfurt. These unusual observations result from unresolved semantic conflicts between the data sources and the data receiver. In this case, not only are the currencies for stock prices different at the two exchanges, but the currency at Frankfurt also changed from German Marks to Euros at the beginning of 1999. Once the data is normalized using this knowledge, it can be seen that there is neither a significant arbitraging opportunity nor an abrupt price plunge for this stock. We call metadata knowledge such as the currency for price "context knowledge", and the history of time-varying metadata "temporal context".
Company time series data
Examples of temporal context knowledge
Conference Paper
The change in meaning of data over time poses significant challenges for the use of that data. These challenges exist in the use of an individual data source and are further compounded with the integration of multiple sources. In this paper, we identify three types of temporal semantic heterogeneities. We propose a solution based on extensions to the context interchange framework, which has mechanisms for capturing semantics using ontology and temporal context. It also provides a mediation service that automatically resolves semantic conflicts. We show the feasibility of this approach with a prototype that implements a subset of the proposed extensions.
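As a small hedged illustration of the temporal-context idea (using the DaimlerChrysler example from the figure above), the sketch below normalizes Frankfurt quotes to euros by selecting the conversion from the date of each record; the function and field names are illustrative, while 1.95583 DEM per EUR is the official fixed conversion rate.

    from datetime import date

    EUR_PER_DEM = 1 / 1.95583            # official fixed DEM/EUR conversion rate
    CUTOVER = date(1999, 1, 1)           # temporal context: reporting currency changes here

    def frankfurt_price_in_eur(quote_date: date, price: float) -> float:
        """Normalize a Frankfurt quote to EUR using the temporal context of its date."""
        if quote_date < CUTOVER:
            return price * EUR_PER_DEM   # quoted in German Marks before the cutover
        return price                     # already quoted in Euros afterwards

    # The same nominal number means very different prices across the cutover.
    print(frankfurt_price_in_eur(date(1998, 12, 30), 150.0))   # DEM quote -> ~76.69 EUR
    print(frankfurt_price_in_eur(date(1999, 1, 4), 150.0))     # EUR quote -> 150.0 EUR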
 
Conference Paper
In today's competitive markets, time has become an important concern on both the demand and supply sides. With time-sensitive customers and additional benefits from intertemporal demand shifts, we find that sellers can turn to a time-differentiation-based strategy as an effective revenue management tool. Motivated by real-life business issues of Toyota China dealerships, we consider inventory-control models with delivery upgrades, in which the seller allocates its on-hand inventory to price- and delivery-time-sensitive customers. The seller has two decisions: inventory commitment and inventory replenishment. The former addresses, within an inventory cycle, how on-hand inventories are allocated between the two classes of customers. The latter addresses, between inventory cycles, how the inventory is replenished. In this paper, we develop the optimal inventory allocation, upgrade, and replenishment policies, and demonstrate that the optimal control can be characterized by a switching curve.
 
The Potential Effects of Increased Removal Effectiveness (Intelligence Sharing) versus Weakening the Message Strength (Moderate Rhetoric) 
Increasing the Regime Voice vs. Increased Removal Effectiveness 
Conference Paper
The potential loss of state stability in various parts of the world is a source of threat to U.S. national security. Every case is unique, but there are common processes. Accordingly, we develop a system dynamics model of state stability by representing the nature and dynamics of 'loads' generated by insurgency activities, on the one hand, and by articulating the core features of state resilience and its 'capacity' to withstand these 'loads', on the other. The problem is to determine and 'predict' when threats to stability override the resilience of the state and, more important, to anticipate propensities for 'tipping points', namely conditions under which small changes in anti-regime activity can generate major disruptions. On this basis, we then identify appropriate actionable mitigation factors to decrease the likelihood of 'tipping' and enhance prospects for stability.
 
Conference Paper
The mechanism behind price formation in electricity futures markets is still under discussion. Theory suggests that hedging pressure caused by deviating risk preferences is the most promising approach. This paper contributes to the discussion through an empirical investigation of electricity futures for delivery in Germany traded at the European Energy Exchange (EEX). We analyse the futures from an ex post perspective and find evidence for significant positive risk premia at the short-end. Furthermore, we detect the existence of a term structure of risk premia and the existence of seasonality in the risk premia. When testing for factors influencing the risk premia the results suggest that risk premia are directly related to factors linked to risk considerations.
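A minimal sketch of the ex-post perspective used above is given below: the realized premium is taken as the futures price observed before delivery minus the average spot price realized over the delivery period (sign conventions differ across studies); all numbers are made up rather than EEX data.

    import numpy as np

    def ex_post_risk_premium(futures_price: float, realized_spot: np.ndarray) -> float:
        """Realized premium of a delivery-period futures versus the average realized spot price."""
        return futures_price - realized_spot.mean()

    # Example: a monthly base futures bought at 62 EUR/MWh against simulated hourly spot prices.
    hourly_spot = np.random.default_rng(7).normal(58.0, 12.0, 24 * 30)
    print(round(ex_post_risk_premium(62.0, hourly_spot), 2), "EUR/MWh")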
 
Conference Paper
A credit derivative is a path dependent contingent claim on the aggregate loss in a portfolio of credit sensitive securities. We estimate the value of a credit derivative by Monte Carlo simulation of the affine point process that models the loss. We consider two algorithms that exploit the direct specification of the loss process in terms of an intensity. One algorithm is based on the simulation of intensity paths. Here discretization introduces bias into the results. The other algorithm facilitates exact simulation of default times and generates an unbiased estimator of the derivative price. We implement the algorithms to value index and tranche swaps, and we calibrate the loss process to quotes on the CDX North America High Yield index.
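To make the intensity-based simulation idea concrete, the sketch below draws default times from a simple self-exciting intensity lambda(t) = lambda0 + sum_k alpha * exp(-beta * (t - t_k)) by Ogata-style thinning; this is a generic illustration under assumed parameters, not the affine point process calibrated to CDX quotes in the paper.

    import numpy as np

    def simulate_default_times(lambda0=0.5, alpha=0.4, beta=2.0, horizon=5.0, seed=0):
        """Draw event (default) times of a self-exciting point process by thinning."""
        rng = np.random.default_rng(seed)
        t, events = 0.0, []
        while True:
            # Between events the intensity only decays, so its current value is a
            # valid upper bound for the thinning step.
            lam_bar = lambda0 + sum(alpha * np.exp(-beta * (t - tk)) for tk in events)
            t += rng.exponential(1.0 / lam_bar)
            if t >= horizon:
                return events
            lam_t = lambda0 + sum(alpha * np.exp(-beta * (t - tk)) for tk in events)
            if rng.random() <= lam_t / lam_bar:   # accept with probability lambda(t)/bound
                events.append(t)

    # Monte Carlo estimate of the expected number of defaults over the horizon.
    print(np.mean([len(simulate_default_times(seed=s)) for s in range(1000)]))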
 
Conference Paper
Often, software managers have to monitor and manage many projects concurrently. Unfortunately, while some projects are completed successfully, others are not completed on time, run over budget, or are cancelled. Some of the reasons for such project failures are lack of user involvement, lack of planning, incomplete requirements, lack of resources, incorrect cost estimation, and so on. There are many project planning and scheduling techniques to manage projects and help ensure project success. Some of these techniques, however, may not be suitable for specific types of projects and thus cause projects to fail. This paper discusses the issues involved in project success and failure, and presents the results of seven projects undertaken by undergraduate students taking a project management course.
 
Elements Involved in a Manipulation.
Conference Paper
In this paper, a systematic framework of market monitoring is introduced. The paper discusses the need for the development of systematic approaches to market monitoring and reviews existing work. Furthermore, the paper articulates the current and potential characteristics of market monitoring systems, their components, information management flows and their interactions. The proposed framework is exemplified using an insider trading manipulation case study from the NASDAQ market.
 
Proposed framework: overall structure 
Early growth of technological development 
Subtree for node "Biological materials" 
Conference Paper
This paper presents a novel framework for supporting the development of well-informed research policies and plans. The proposed methodology is based on the use of bibliometrics; i.e., analysis is conducted using information regarding trends and patterns of publication. While using bibliometric techniques in this way is not a new idea, the proposed approach extends previous studies in a number of important ways. Firstly, instead of being purely exploratory, the focus of our research has been on developing techniques for detecting technologies that are in the early growth phase, characterized by a rapid increase in the number of relevant publications. Secondly, to increase the reliability of the forecasting effort, we propose the use of automatically generated keyword taxonomies, allowing the growth potentials of subordinate technologies to be aggregated into the overall potential of larger technology categories. A proof-of-concept implementation of each component of the framework is presented, and is used to study the domain of renewable energy technologies. Results from this analysis are presented and discussed.
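A minimal, assumption-laden sketch of the early-growth detection step is shown below: fit a log-linear trend to annual publication counts per keyword and flag keywords whose estimated growth rate exceeds a threshold; the keyword counts and the 25%-per-year threshold are made up for illustration and are not results from the study.

    import numpy as np

    def growth_rate(years, counts):
        """Annual growth rate implied by a log-linear fit to publication counts."""
        slope, _ = np.polyfit(years, np.log(np.asarray(counts, dtype=float) + 1.0), 1)
        return float(np.expm1(slope))

    # Hypothetical annual publication counts, 2003-2008.
    counts_by_keyword = {
        "thin-film photovoltaics": [4, 6, 9, 15, 24, 40],
        "steam turbines":          [55, 54, 57, 56, 58, 57],
    }
    years = np.arange(2003, 2009)
    emerging = {k: round(growth_rate(years, c), 2)
                for k, c in counts_by_keyword.items()
                if growth_rate(years, c) > 0.25}      # flag >25%/year as early growth
    print(emerging)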
 
Conference Paper
Landmark recognition is identified as one important research area in robot navigation systems. It is a key feature for building robots capable of navigating and performing tasks in human environments. However, current object recognition research largely ignores the problems that the mobile robot context introduces. We developed a landmark recognition system which is used by a humanoid robot to identify landmarks during its navigation. The humanoid landmark recognition system is based on a two-step classification stage which is robust and invariant towards scaling and translations. Also, it provides a good balance between fast processing time and high detection accuracy. An appearance-based classification method is initially used to provide the rough initial estimate of the landmark. It is followed by a refinement step using a model-based method to estimate an accurate classification of the object. The goal of our work is to develop a rapid, robust object recognition system with a high detection rate that can actually be used by a humanoid robot to recognize landmarks during its navigation.
 
Conference Paper
Researchers are increasingly turning to live, 'in the wild' phishing studies of users, who unknowingly participate without giving informed consent. Such studies can expose researchers to a number of unique, and fairly significant, legal risks. This paper presents four case studies highlighting the steps that researchers have taken to avoid legal problems, and highlights the legal risks that they were unable to avoid. It then provides a high-level introduction to a few particularly dangerous areas of American law. Finally, it concludes with a series of best practices that may help researchers avoid legal trouble; however, this information should not be taken as legal advice.
 
Conference Paper
This paper proposes alternative mental models for wireless communication that are structurally similar to the prevailing spectrum-as-territory metaphor, but that use different referents. It explores several non-spatial metaphors, and then focuses in on trademark as a guiding analogy. In this new approach, a wireless communication system is described by a set of operating parameters that differentiate it from other systems, much as all the elements of a trademark are used to differentiate a particular product from others. Radio interference is likened to the unauthorized use of a trademark and its attendant potential to confuse a customer. The frequencies used in a communication are like the letters of a trade name. The regulator is like the trademark office. This approach inspires a system of rights different from those prompted by the spectrum-as-territory analogy. Rights and protections are obtained by registering system operating parameters such as transmitter and receiver masks, just as a trademark registration covers a variety of brand attributes. Any set of operating parameters that does not cause harmful interference with an existing operation can be registered. Auctions are possible, but are not essential. The definition of rights is driven by operators, not regulators. Importantly, rights can have an indefinite term, but also can be lost if not used. Non-spatial metaphors offer a way around the limitations of the existing approach, such as a fixation on frequency, and vague criteria for harmful interference. It restates wireless policy in terms of system operations rather than spectrum, and opens up new opportunities for innovation and use.
 
Conference Paper
Teacher students play an important role in improving their own perception of what science should look like in the classroom. Current studies of science learning also rely on an understanding of the nature of science. This research aimed to study teacher students' perceptions of the nature of science. One hundred and one junior teacher students were studied through interviews and a questionnaire. Data were collected, categorized, and analyzed in terms of the students' perceptions of the nature of science. The results can be presented in three respects. First, teacher students perceived the nature of science in terms of knowledge construction. This can be divided into five criteria: science as a subject that studies natural phenomena, science as a body of knowledge, science as an inquiry process, science as a thinking process, and science as a description of morals and ethics. Second, teacher students perceived the nature of science in terms of the process of knowledge construction; they expressed the opinion that without observation and experiment there is no science. Finally, teacher students perceived the nature of science in terms of the scientific enterprise. They expressed their opinions by means of the relationship between science, technology, and society (STS): science is a part of society, and its role is important for social development and self-actualization. Some of them viewed science from a negative point of view.
 
Article
Many environmental problems are large-scale in terms of geographical units and long-term with regard to time. We therefore find a coincidence of different causes and impacts that makes the interplay between humans and nature highly uncertain (the “transparency challenge”). In consequence, we see a need for innovative analytical methods and modelling approaches to supplement the traditional monitoring-based approach in environmental policy. This should allow capturing different degrees of uncertainty, which is generally beyond the power of any monitoring activity. Moreover, with regard to the design of monitoring approaches, it requires collecting and connecting data from different fields of social activity, given the divergence of natural and social systems’ boundaries. This requires the provision of sufficient, frequently huge, data sets (the “availability challenge”) that need to fit with each other (the “compatibility challenge”). Even if these challenges are met, data processing remains a very complex and time-consuming task, which should be supported by a user-friendly infrastructure. Here we see a comparative advantage in using GIS technology and a nested structure for data provision, supporting the up- and down-scaling of information and the access of data from different perspectives (the “connectivity challenge”): a polluter’s, a victim’s, and a regulator’s point of view.
 
Article
Is it better to live in a big country than a small one? In this paper, I examine whether economic and social conditions vary systematically with the population of a country. Economics provides a number of theoretical reasons why country size should matter, for instance, because of increasing returns to scale or because it is easier to provide public goods to a larger populace. However, there is little empirical evidence linking the scale of country size to any of a multitude of indicators of economic and social welfare.
 
Article
We provide a concise overview of time series analysis in the time and frequency domains, with lots of references for further reading.
 
Article
This paper provides a survey of recent experimental work in economics focussing on social and economic networks. The experiments consider networks of coordination and cooperation, buyer-seller networks, and network formation.
 
Article
Risk management information systems are designed to overcome the problem of aggregating data across diverse trading units. The design of an information system depends on the risk measurement methodology that a firm chooses. Inherent in the design of both a risk management information system and a risk measurement methodology is a tradeoff between the accuracy of the resulting measures of risk and the burden of computing them. Technical progress will make this tradeoff more favorable over time, leading firms to implement more accurate methodologies, such as full revaluation of nonlinear positions. The current and likely future improvements in risk management information systems make feasible new ways of collecting aggregate data on firms' risk-taking activities.
 
Article
The fact that crossing a political border dramatically reduces trade flows has been widely documented in the literature. The increasing number of borders has surprisingly attracted much less attention. The number of independent countries has indeed risen from 72 in 1948 to 192 today. This paper estimates the effect of political disintegration since World War II on the measured growth in world trade. We first show that trade statistics should be considered carefully when assessing globalization over time, since the definition of trade partners varies over time. We document a sizeable resulting accounting artefact, which accounts for 17% of the growth in world trade since 1948. Second, we estimate that political disintegration alone since World War II has raised measured international trade flows by 9% but decreased actual trade flows (including inter-regional trade) by 4%...
 
Article
Politics, science, research and the public base their discussions on comprehensive, valid and systematic health care data. Data-based information is needed in order to measure the results or success of a policy or procedure. In Germany, the Information System of the Federal Health Monitoring (IS-GBE) comprises a comprehensive collection of health system data. Currently, the information system contains health data and information from over 100 different sources, for instance surveys done by the statistical offices of the federation or the Länder, as well as many other surveys done within the health care system. Apart from the IS-GBE, additional official and non-official health care data sources exist in Germany. This expert report presents changes made since 2001 to existing health data sources as well as newly established data sources; data sources dealt with in other expert reports are not considered. One of the more recent data sources is the insuree sample collected in 2001, so far a one-time effort, which was collected in the course of the reform of the risk structure adjustment within the statutory health insurance. From our point of view, we highly recommend an update of this representative sample.
 
Top-cited authors
Michael C. Jensen
  • Harvard University
Campbell Harvey
  • Duke University
John Graham
  • Duke University
Ernst Fehr
  • University of Zurich
Ross Levine
  • University of California, Berkeley