About
45 Publications
11,110 Reads
388 Citations
Publications (45)
Risk analytics is developing rapidly, and analysts in the field need material that is theoretically sound as well as practical and straightforward. A one-stop resource for quantitative risk analysis, Practical Spreadsheet Risk Modeling for Management dispenses with the use of complex mathematics, concentrating on how powerful techniques and methods...
This study examines the relationship between competition and investment by Incumbent Local Exchange Carriers (ILECs) in US telecommunications markets. A panel data model and a dynamic panel data model that have not been used in previous studies are applied in this analysis. Results of the panel data model suggest that investment by ILECs is positiv...
Explores how in the USA the Telecommunications Act of 1996 was designed to usher in an era of competition in all telecommunications markets. In practice, the initial explosion of competitive local exchange carriers (CLECs) and an $87 billion publicly traded valuation has fallen instead to less than 200 CLECs and a value of only $4 billion; and has...
The Supreme Court decision in Verizon et al v. FCC et al has finally settled the legality of the FCC's methodology for setting prices for wholesale services that are "based on cost", as required by the Telecommunications Act of 1996. The Court's decision reveals unanimous agreement that forward-looking costs may be used as the measure of cost. It a...
The last decade has witnessed a dramatic substitution of price cap regulation for rate-of-return regulation in the telecommunications industry. The 1996 Telecommunications Act empowers state regulators to set the terms of competitive entry in local telephone markets. We investigate whether the form of regulation endogenously affects the regulator's behav...
The purpose of this chapter is to motivate the need for a jurisdictional analysis of the implementation of the Act. We adopt a jurisdictional approach for three reasons: structural, comparative, and empirical. The structural motivation derives from the fact that the telecommunications industry is fragmented both in terms of market segments and in t...
Almost five years ago, amidst much fanfare and even more promise, President Clinton signed into law The Telecommunications Act of 1996, the first comprehensive telecommunications reform legislation since 1934. The Act was heralded as a watershed event for the information economy—calling for a competitive free-for-all in the U.S. telecommunications...
We have seen that jurisdictional fragmentation could cause the FCC to set wholesale prices below cost. We have also seen that the FCC’s “efficient-firm” standard rests on a weak theoretical foundation but is capable of producing low wholesale prices. We have presented evidence that the wholesale prices set as a result of the Act are below forward-l...
A key aspect of the Telecommunications Act of 1996 is the unbundling of ILEC networks and the pricing of UNEs. The FCC (and the Federal-State Joint Board on Universal Service) specified that UNE costs should be based on forward-looking efficient economic costs. That part of the FCC Order (Docket 96-98, August 8, 1996) was stayed, subsequently vacated by...
The primary objective of this chapter is to provide an overview of emerging trends in the U.S. telecommunications industry following the passage of the Telecommunications Act of 1996. No attempt is made to supply a comprehensive analysis and description of the regulatory and institutional events that precipitated these trends, as these are describe...
To explore the importance of jurisdiction in setting wholesale prices, we develop a model of how the FCC might set particular prices depending upon whether it has complete or only partial jurisdiction. The model demonstrates that even if the FCC’s objective is to promote economic efficiency, it would depart from first best efficient prices as a rea...
The Telecommunications Act of 1996 has a distinctly pro-competitive bent—relying upon the marketplace rather than regulation to unleash the full potential of the high-growth telecommunications industry. As stated in the introductory passage of the Act, the express purpose of this legislation is:
To promote competition and reduce regulation in order...
Acknowledgements. 1. Introduction and Overview. 2. Industry Trends and Market Structure. 3. The Stock Market Reacts. 4. A Jurisdictional Model. 5. The FCC's Efficient-Firm Standard: TELRIC. 6. Back to the Future. 7. Explaining State Regulatory Actions. 8. Conclusions. References. Name Index. Subject Index.
The issue of efficient pricing of unbundled network elements in telecommunications markets is being debated as competition is introduced into local telecommunications markets. There is currently no consensus among economists regarding the optimal pricing rule, nor does it seem likely that such a consensus will form any time soon. The problem is tha...
The determination of interconnection charges to essential facilities is an important problem in regulated telecommunications markets in the wake of the Telecommunications Act of 1996. This paper defines mathematical conditions for the essentiality of upstream productive inputs and examines their implications for efficient interconnection pricing. T...
We identify the incentive structure for a firm that participates in a pooling arrangement. These pooling arrangements have been common in the telecommunications industry, both in the United States and in Canada. We identify alternative mechanisms, including cost caps, yardstick competition, transfer prices, and fixed revenue allocators. A pooling f...
This chapter proceeds as follows. Section II examines the present tariff structure and how it may provide for overdevelopment of, and oversubscription to, private networks. Overdevelopment is defined in terms of economic efficiency. Oversubscription may well be the more critical policy issue, since much of the private network capacity is already a...
Demand estimation and welfare analysis for telecommunications services have often been plagued by apparent inconsistencies between actual consumer behavior and standard economic theory. The latter posits that consumers will subscribe to services when their consumer's surplus exceeds the subscription price. This paper presents an alternative model o...
One of the intriguing aspects of current telecommunications demand theory is that it is nearly identical to the theory of demand for other types of commodities or services. This is curious because telecommunications exhibits a number of unique characteristics that serve to differentiate it from all other commodities. First, telecommunications consu...
This thesis was written to answer the following question: can the information provided by commodity futures markets, and the hedging techniques they offer, be used to value an oil field and to decide on its date of exploitation? The work was therefore centered on the valuation of a barrel of oil...
This paper investigates how individual choice is affected by increases in risk when the choice variable (instrument) affects the distribution of the random variable as well as the objective function. The effect of increased risk on optimal choice is shown to depend on attitudes towards risk and the interaction between exogenous uncertainty and the...
The economics literature on new product introduction has recently focused on the life cycle pricing of products where consumers are uncertain about product quality. This paper explores product introductions where product quality is known with certainty, but consumers are uncertain as to their own preferences, and experience is required for these to...
Two approaches are often used to analyse the effects of uncertainty upon individual decision making. The comparative equilibrium approach compares an uncertain model with a certainty model, usually with the random variable replaced by its mean. The comparative statics approach introduces more uncertainty into an already uncertain model and consider...
It is shown that most exhaustible resource depletion models are 'disaster-proof' in that survival is never an issue: consumption can be arbitrarily close to zero. Consequently, the possibility of substitute technologies becoming available allows for a less conservative optimal depletion policy. It is shown that introduction of a subsistence level f...
Questions
Question (1)
I am interested in examining the accuracy of predictive classification models (e.g., logistic regression, neural networks, random forests, boosted trees, etc.). I recently came across the technique of conformal prediction. It seems promising, but I don't quite understand it. It is based on determining a p-value (not the statistical p-value) for each observation. It appears that the p-value measures the extent to which a classification label is consistent with the other observations in the data.
Using this loose interpretation, and considering the binary case where there are only 2 classes (e.g., malignant or not, credit default or not, etc.), I would think that the p-value is the probability of the observation belonging to class 1 (I will label the presence of the characteristic as class 1 and its absence as class 0). From the description of conformal prediction, the confidence of the predicted label is equal to 1 minus the p-value of the second-highest class.
Given my assumption that there are only 2 classes, if p is the probability that an observation is in class 1, then 1-p is the probability that it is in class 0, and the confidence of predicting class 1 would be 1-(1-p) = p.
Then, conformal prediction provides the classification for each observation, depending on the level of confidence you wish to set. It seems to me that this is equivalent (in the binary case) to setting the threshold probability for assigning an observation to class 1 rather than class 0. If that is the case, then isn't this what the ROC curve already does? How is conformal prediction different from using the AUC as the measure of accuracy of a model? And how does this measure the uncertainty of our prediction rather than its accuracy? (Compare this to the difference between measuring the accuracy of a regression model and computing a prediction interval for an individual observation from that model.)
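To make my reading concrete, here is a rough sketch (in Python) of how I understand split (inductive) conformal prediction would compute these p-values for a binary classifier. The data split, the choice of 1 minus the predicted probability as the nonconformity score, and all names here are my own assumptions for illustration, not taken from any particular implementation:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical toy data, standing in for e.g. a credit-default problem.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_hold, y_train, y_hold = train_test_split(X, y, test_size=0.4, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_hold, y_hold, test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Nonconformity score (one common choice, as I understand it): 1 minus the
# predicted probability of a label, so labels the model finds implausible get
# high scores. Scores are computed on the calibration set at the TRUE labels.
cal_proba = model.predict_proba(X_cal)
cal_scores = 1.0 - cal_proba[np.arange(len(y_cal)), y_cal]

def conformal_p_values(x):
    """p-value for each candidate label: the fraction of calibration
    examples at least as nonconforming as this observation would be
    if it carried that label."""
    proba = model.predict_proba(x.reshape(1, -1))[0]
    pvals = np.empty(2)
    for label in (0, 1):
        score = 1.0 - proba[label]
        pvals[label] = (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)
    return pvals

pv = conformal_p_values(X_test[0])
predicted = int(np.argmax(pv))        # label with the largest p-value
confidence = 1.0 - np.sort(pv)[0]     # 1 minus the second-largest p-value
epsilon = 0.05                        # significance level I choose
prediction_set = [c for c in (0, 1) if pv[c] > epsilon]
print(predicted, confidence, prediction_set)
```

If this sketch is faithful, the p-value is the rank of a nonconformity score within a calibration set, rather than the model's class-1 probability itself, and what I am really asking is whether, in the binary case, thresholding these p-values differs in any meaningful way from thresholding the predicted probability as an ROC analysis would.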