A Toolbox for ab initio 3-D reconstructions in single-particle electron microscopy

National Resource for Automated Molecular Microscopy and Department of Cell Biology, The Scripps Research Institute, La Jolla, CA 92037, USA.
Journal of Structural Biology. 12/2009; 169(3):389-98. DOI: 10.1016/j.jsb.2009.12.005
Source: PubMed


Structure determination of a novel macromolecular complex via single-particle electron microscopy depends upon overcoming the challenge of establishing a reliable 3-D reconstruction using only 2-D images. There are a variety of strategies that deal with this issue, but not all of them are readily accessible and straightforward to use. We have developed a "toolbox" of ab initio reconstruction techniques that provide several options for calculating 3-D volumes in an easily managed and tightly controlled workflow that adheres to standard conventions and formats. This toolbox is designed to streamline the reconstruction process by removing the necessity for bookkeeping, while facilitating transparent data transfer between different software packages. It currently includes procedures for calculating ab initio reconstructions via random conical or orthogonal tilt geometry, tomograms, and common lines, all of which have been tested using the 50S ribosomal subunit. Our goal is that the accessibility of multiple independent reconstruction algorithms via this toolbox will improve the ease with which models can be generated, and provide a means of evaluating the confidence and reliability of the final reconstructed map.
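At the heart of the tilt-based methods named above (random conical tilt, orthogonal tilt, tomography) is back-projection: each 2-D image is smeared back into the 3-D volume along its known viewing direction, and the contributions are summed. A minimal toy sketch of the principle, reduced to recovering a 2-D "volume" from two axis-aligned 1-D projections (this is illustrative only and is not the toolbox's actual code, which handles many tilted 2-D views):

```python
# Toy back-projection: reconstruct a 2-D grid from two 1-D projections.
# Real tilt reconstructions back-project many 2-D images into 3-D, but
# the smear-and-sum operation is the same idea.

def project_rows(grid):
    """Projection along x: one summed value per row."""
    return [sum(row) for row in grid]

def project_cols(grid):
    """Projection along y: one summed value per column."""
    return [sum(col) for col in zip(*grid)]

def back_project(proj_r, proj_c):
    """Unfiltered back-projection: smear each 1-D projection back
    across the grid and sum the two contributions."""
    return [[pr + pc for pc in proj_c] for pr in proj_r]

# A 5x5 phantom with a single bright pixel at row 1, column 3.
phantom = [[0.0] * 5 for _ in range(5)]
phantom[1][3] = 1.0

recon = back_project(project_rows(phantom), project_cols(phantom))

# The brightest reconstructed pixel coincides with the phantom feature,
# even though each projection alone fixes only one coordinate.
peak = max(((i, j) for i in range(5) for j in range(5)),
           key=lambda ij: recon[ij[0]][ij[1]])
print(peak)  # (1, 3)
```

The cross-shaped smearing artifacts around the peak are why practical implementations weight (filter) the projections before summing; with many views the artifacts average out.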



Available from: Neil R Voss
    • "This interface aims to streamline cryo-EM data processing by facilitating the use of flexible image processing workflows that use multiple programs from various software packages; for more information see the chapter by Carragher in this issue. Typical applications of ML2D inside this pipeline include the generation of templates for automated particle picking, data cleaning (by discarding images that give rise to bad class averages or with relatively flat probability distributions), and the generation of class averages for subsequent random conical tilt or common lines reconstructions (Voss et al., 2010). "
    ABSTRACT: With the advent of computationally feasible approaches to maximum-likelihood (ML) image processing for cryo-electron microscopy, these methods have proven particularly useful in the classification of structurally heterogeneous single-particle data. A growing number of experimental studies have applied these algorithms to study macromolecular complexes with a wide range of structural variability, including nonstoichiometric complex formation, large conformational changes, and combinations of both. This chapter aims to share the practical experience that has been gained from the application of these novel approaches. Current insights on how to prepare the data and how to perform two- or three-dimensional classifications are discussed together with the aspects related to high-performance computing. Thereby, this chapter will hopefully be of practical use for those microscopists wishing to apply ML methods in their own investigations.
    Preview · Article · Jan 2010 · Methods in Enzymology
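The ML2D classification described in this chapter assigns each image a soft probability over class references rather than a hard best match; "relatively flat probability distributions" (used above as a data-cleaning criterion) are images that fit no class well. A hedged sketch of that weighting step under a Gaussian white-noise model, with toy 1-D "images" and an invented noise level sigma (not the chapter's actual implementation):

```python
import math

def responsibilities(image, refs, sigma=1.0):
    """Soft class assignment: P(class k | image) is proportional to
    exp(-||image - ref_k||^2 / (2 sigma^2)) under Gaussian noise."""
    sq = [sum((x - r) ** 2 for x, r in zip(image, ref)) for ref in refs]
    m = min(sq)  # subtract the minimum for numerical stability
    w = [math.exp(-(s - m) / (2 * sigma ** 2)) for s in sq]
    total = sum(w)
    return [x / total for x in w]

refs = [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]   # two toy class averages
image = [0.9, 1.1, 1.0]                      # clearly closer to refs[1]

probs = responsibilities(image, refs)
print(probs[1] > probs[0])   # True: most weight on the matching class
```

A noisy image far from every reference yields near-equal probabilities across classes, which is exactly the "flat distribution" signature used to discard bad particles.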