Emre Brookes

University of Texas Health Science Center at San Antonio, San Antonio, Texas, United States

Publications (22) · 28.03 Total Impact Points

  • ABSTRACT: A critical problem in materials science is the accurate characterization of the size-dependent properties of colloidal inorganic nanocrystals. Due to the intrinsic polydispersity present during synthesis, dispersions of such materials exhibit simultaneous heterogeneity in density ρ, molar mass M, and particle diameter d. The density increments ∂ρ/∂d and ∂ρ/∂M of these nanoparticles, if known, can then provide important information about crystal growth and particle size distributions. For most classes of nanocrystals, a mixture of surfactants is added during synthesis to control their shape, size and optical properties. However, it remains a challenge to accurately determine the amount of passivating ligand bound to the particle surface post synthesis. The presence of the ligand shell hampers an accurate determination of the nanocrystal diameter. Using CdSe and PbS semiconductor nanocrystals, and the ultrastable silver nanoparticle (M4Ag44(p-MBA)30) as model systems, we describe a Custom Grid method implemented in UltraScan-III for the characterization of nanoparticles and macromolecules using sedimentation velocity analytical ultracentrifugation. We show that multiple parametrizations are possible, and that the Custom Grid method can be generalized to provide high-resolution composition information for mixtures of solutes that are heterogeneous in two out of three parameters. For such cases, our method can simultaneously resolve arbitrary two-dimensional distributions of hydrodynamic parameters when a third property can be held constant. For example, this method extracts partial specific volume and molar mass from sedimentation velocity data for cases where the anisotropy can be held constant, or provides anisotropy and partial specific volume if the molar mass is known. (A schematic sketch of this kind of grid parametrization appears after the publication list.)
    Analytical Chemistry 07/2014;
  • ABSTRACT: A method for fitting sedimentation velocity experiments using whole-boundary Lamm equation solutions is presented. The method, termed parametrically constrained spectrum analysis (PCSA), provides an optimized approach for simultaneously modeling heterogeneity in size and anisotropy of macromolecular mixtures. The solutions produced by PCSA are particularly useful for modeling polymerizing systems, where a single-valued relationship exists between the molar mass of the growing polymer chain and its corresponding anisotropy. PCSA uses functional constraints to identify this relationship and, unlike other multidimensional grid methods, ensures that only a single molar mass can be associated with a given anisotropy measurement. A description of the PCSA algorithm is presented, as well as several experimental and simulated examples that illustrate its utility and capabilities. The performance advantages of the PCSA method in comparison to other methods are documented. The method has been added to the UltraScan-III software suite, which is available for free download from http://www.ultrascan.uthscsa.edu. (A schematic sketch of this constrained-fit idea appears after the publication list.)
    Biophysical Journal 04/2014; 106(8):1741-50. · 3.67 Impact Factor
  • ABSTRACT: Fibrinogen, whose structure is not completely resolved, is a large, heterogeneous, aggregation/degradation-prone protein that plays a central role in blood coagulation and associated pathologies. When a high-molecular-weight fraction was analyzed by size-exclusion high-performance liquid chromatography/small-angle X-ray scattering (HPLC-SAXS), several composite peaks were apparent, and because of the stickiness of fibrinogen the analysis was complicated by severe capillary fouling. Novel SAS analysis tools developed as part of the UltraScan Solution Modeler (US-SOMO; http://somo.uthscsa.edu/), an open-source suite of utilities with advanced graphical user interfaces whose initial goal was the hydrodynamic modeling of biomacromolecules, were implemented and applied to this problem. They include the correction of baseline drift due to the accumulation of material on the SAXS capillary walls, and the Gaussian decomposition of non-baseline-resolved HPLC-SAXS elution peaks. It was thus possible to resolve at least two species co-eluting under the main fibrinogen monomer peak, probably resulting from in-column degradation, and two others under an oligomers peak. The overall and cross-sectional radii of gyration, molecular mass and mass/length ratio of all species were determined using the manual or semi-automated procedures available within the US-SOMO SAS module. Differences between monomeric species and linear and sideways oligomers were thus identified and rationalized. This new US-SOMO version additionally contains several computational and graphical tools, implementing functionalities such as the mapping of residues contributing to particular regions of P(r), and an advanced module for the comparison of primary I(q) versus q data with model curves computed from atomic-level structures or bead models. It should be of great help in multi-resolution studies involving hydrodynamics, solution scattering and crystallographic/NMR data. (A schematic sketch of this Gaussian decomposition appears after the publication list.)
    Journal of Applied Crystallography 12/2013; 46(Pt 6):1823-1833. · 3.34 Impact Factor
  • ABSTRACT: UltraScan Solution Modeler (US-SOMO) computes hydrodynamic parameters and small-angle scattering data from biological macromolecular structural representations and compares them with experimental data for structural determination and validation. At XSEDE 12, a GUI-integrated gateway was introduced to offload large computations to various HPC resources. The gateway was directly integrated into the Qt/GUI-based software to give users a seamless experience. The software is available as source code or precompiled for Apple Mac OS X, MS-Windows and Linux. Current cluster resources include TACC Lonestar and Stampede, SDSC Trestles and a 256-CPU cluster local to the University of Texas Health Science Center at San Antonio. The simplicity of the design allowed the implementation of a new method of modeling small-angle scattering data that provided new scientific insights and was presented at the 2012 international small-angle scattering conference. Since its introduction, multiple workshops have been taught and users are beginning to utilize the gateway in their biological research.
    Proceedings of the Conference on Extreme Science and Engineering Discovery Environment: Gateway to Discovery, San Diego, California; 01/2013
  • ABSTRACT: UltraScan Solution Modeler (US-SOMO) processes atomic and lower-resolution bead model representations of biological and other macromolecules to compute various hydrodynamic parameters, such as the sedimentation and diffusion coefficients, relaxation times and intrinsic viscosity, and small-angle scattering curves, that contribute to our understanding of molecular structure in solution. Knowledge of biological macromolecules' structure aids researchers in understanding their function as a path to disease prevention and therapeutics for conditions such as cancer, thrombosis, Alzheimer's disease and others. US-SOMO provides a convergence of experimental, computational, and modeling techniques, in which detailed molecular structure and properties are determined from data obtained with a range of experimental techniques that, by themselves, give incomplete information. Our goal in this work is to develop the infrastructure and user interfaces that will enable a wide range of scientists to carry out complicated experimental data analysis techniques on XSEDE. Our user community predominantly consists of biophysics and structural biology researchers. A recent search on PubMed reports 9,205 papers in the past decade referencing the techniques we support. We believe our software will provide these researchers a convenient and unique framework to refine structures, thus advancing their research. The computed hydrodynamic parameters and scattering curves are screened against experimental data, effectively pruning potential structures into equivalence classes. Experimental methods may include analytical ultracentrifugation, dynamic light scattering, small-angle X-ray and neutron scattering, NMR, fluorescence spectroscopy, and others. One source of macromolecular models is X-ray crystallography. However, the conformation in solution may not match that observed in the crystal form. Using computational techniques, an initial fixed model can be expanded into a search space utilizing high-temperature molecular dynamics approaches or stochastic methods such as Brownian dynamics. The number of structures produced can vary greatly, ranging from hundreds to tens of thousands or more. This introduces a number of cyberinfrastructure challenges. Computing hydrodynamic parameters and small-angle scattering curves can be computationally intensive for each structure, and therefore cluster compute resources are essential for timely results. Input and output data sizes can vary greatly, from less than 1 MB to 2 GB or more. Although the parallelization is trivial, along with the data size variability there is a large range of compute sizes, from one to potentially thousands of cores with compute times of minutes to hours. In addition to the distributed computing infrastructure challenges, an important concern was how to allow a user to conveniently submit, monitor and retrieve results from within the C++/Qt GUI application while maintaining a method for authentication, approval and registered publication usage throttling. Middleware supporting these design goals has been integrated into the application with assistance from the Open Gateway Computing Environments (OGCE) collaboration team. The approach was tested on various XSEDE clusters and local compute resources. This paper reviews current US-SOMO functionality and implementation with a focus on the newly deployed cluster integration.
    07/2012;
  • ABSTRACT: The UltraScan gateway provides a user-friendly web interface for the evaluation of experimental analytical ultracentrifuge data using the UltraScan modeling software. The analysis tasks are executed on the TeraGrid and campus computational resources. The gateway is highly successful in providing this service to end users and is consistently ranked among the top five gateways by community account usage. This continued growth, and the challenge of sustaining it, required additional support to revisit the job management architecture. In this paper we describe the enhancements to the UltraScan gateway middleware infrastructure provided through the TeraGrid Advanced User Support program. The advanced support efforts primarily focused on a) expanding the TeraGrid resources to incorporate new machines; b) upgrading UltraScan's job management interfaces to use GRAM5 in place of the deprecated WS-GRAM; c) providing realistic usage scenarios to the GRAM5 and INCA resource testing and monitoring teams; d) creating general-purpose, resource-specific, and UltraScan-specific error handling and fault tolerance strategies; and e) providing forward and backward compatibility for the job management system between UltraScan's version 2 (currently in production) and version 3 (expected to be released mid-2011).
    Proceedings of the 2011 TeraGrid Conference: Extreme Digital Discovery; 01/2011
  • Emre Brookes, Borries Demeler, Mattia Rocco
    ABSTRACT: The US-SOMO suite provides a flexible interface for accurately computing solution parameters from 3D structures of biomacromolecules through bead-modeling approaches. We present an extended analysis of the influence of accessible surface area screening, overlap reduction routines, and approximations for non-coded residues and missing atoms on the computed parameters for models built by the residue-to-bead direct correspondence and the cubic grid methods. Importantly, by taking the theoretical hydration into account at the atomic level, the performance of the grid-type models becomes comparable to, or exceeds, that of the corresponding hydrated residue-to-bead models.
    Macromolecular Bioscience 07/2010; 10(7):746-53. · 3.74 Impact Factor
  • ABSTRACT: We compare here the utility of sedimentation velocity (SV) analysis with that of sedimentation equilibrium (SE) analysis for the characterization of reversible systems. Genetic algorithm optimization in UltraScan is used to optimize the model and to obtain solution properties of all components present in the system. We apply our method to synthetic and experimental data, and suggest limits for the accessible kinetic range. We conclude that equilibrium constants obtained from SV and SE analysis are equivalent, but that SV experiments provide better confidence for the Kd, can better account for the presence of contaminants, and provide additional information, including rate constants and shape parameters.
    Macromolecular Bioscience 07/2010; 10(7):775-82. · 3.74 Impact Factor
  • Emre H. Brookes, Borries Demeler
    ABSTRACT: Solving large non-negatively constrained least squares systems is frequently used in the physical sciences to estimate model parameters which best fit experimental data. Analytical Ultracentrifugation (AUC) is an important hydrodynamic experimental technique used in biophysics to characterize macromolecules and to determine parameters such as molecular weight and shape. We previously developed a parallel divide and conquer method to facilitate solving the large systems obtained from AUC experiments. New AUC instruments equipped with multi-wavelength (MWL) detectors have recently increased the data sizes by three orders of magnitude. Analyzing the MWL data requires significant compute resources. To better utilize these resources, we introduce a procedure allowing the researcher to optimize the divide and conquer scheme along a continuum from minimum wall time to minimum compute service units. We achieve our results by implementing a preprocessing stage performed on a local workstation before job submission.
    01/2010;
  • ABSTRACT: Progress in analytical ultracentrifugation (AUC) has been hindered by obstructions to hardware innovation and by software incompatibility. In this paper, we announce and outline the Open AUC Project. The goals of the Open AUC Project are to stimulate AUC innovation by improving instrumentation, detectors, acquisition and analysis software, and collaborative tools. These improvements are needed for the next generation of AUC-based research. The Open AUC Project combines ongoing work from several different groups. A new base instrument is described, one that is designed from the ground up to be an analytical ultracentrifuge. This machine offers an open architecture, hardware standards, and application programming interfaces for detector developers. All software will use the GNU Public License to ensure that intellectual property is available in open-source format. The Open AUC strategy facilitates collaborations, encourages sharing, and eliminates the chronic impediments that have plagued AUC innovation for the last 20 years. This ultracentrifuge will be equipped with multiple and interchangeable optical tracks so that state-of-the-art electronics and improved detectors will be available for a variety of optical systems. The instrument will be complemented by a new rotor, enhanced data acquisition and analysis software, as well as collaboration software. Described here are the instrument, the modular software components, and a standardized database that will encourage and ease the integration of data analysis and interpretation software.
    Biophysics of Structure and Mechanism 04/2009; 39(3):347-59. · 2.44 Impact Factor
  • Emre Brookes, Weiming Cao, Borries Demeler
    ABSTRACT: We report a model-independent analysis approach for fitting sedimentation velocity data which permits the simultaneous determination of shape and molecular weight distributions for mono- and polydisperse solutions of macromolecules. Our approach allows for heterogeneity in the frictional domain, providing a more faithful description of the experimental data for cases where frictional ratios are not identical for all components. Because of the increased accuracy in the frictional properties of each component, our method also provides more reliable molecular weight distributions in the general case. The method is based on a fine-grained two-dimensional grid search over s and f/f0, where the grid is a linear combination of whole-boundary models represented by finite element solutions of the Lamm equation with sedimentation and diffusion parameters corresponding to the grid points. A Monte Carlo approach is used to characterize confidence limits for the determined solutes. Computational algorithms addressing the very large memory needs of a fine-grained search are discussed. The method is suitable for globally fitting multi-speed experiments, and constraints based on prior knowledge about the experimental system can be imposed. Time- and radially invariant noise can be eliminated. Serial and parallel implementations of the method are presented. We demonstrate with simulated and experimental data of known composition that our method provides superior accuracy and lower-variance fits to experimental data compared to other methods in use today, and show that it can be used to identify modes of aggregation and slow polymerization. (A schematic sketch of the underlying grid-fitting step appears after the publication list.)
    Biophysics of Structure and Mechanism 03/2009; 39(3):405-14. · 2.44 Impact Factor
  • ABSTRACT: The interpretation of solution hydrodynamic data in terms of macromolecular structural parameters is not a straightforward task. Over the years, several approaches have been developed to cope with this problem, the most widely used being bead modeling in various flavors. We report here the implementation of the SOMO (SOlution MOdeller; Rai et al. in Structure 13:723-734, 2005) bead modeling suite within one of the most widely used analytical ultracentrifugation data analysis software packages, UltraScan (Demeler in Modern analytical ultracentrifugation: techniques and methods, Royal Society of Chemistry, UK, 2005). The US-SOMO version is now under complete graphical interface control, and has been freed from several constraints present in the original implementation. In the direct beads-per-atoms method, virtually any kind of residue as defined in the Protein Data Bank (e.g., proteins, nucleic acids, carbohydrates, prosthetic groups, detergents, etc.) can now be represented with beads whose number, size and position are all defined in user-editable tables. For large structures, a cubic grid method based on the original AtoB program (Byron in Biophys J 72:408-415, 1997) can be applied either directly to the atomic structure, or to a previously generated bead model. The hydrodynamic parameters are then computed in the rigid-body approximation. An extensive set of tests was conducted to further validate the method, and the results are presented here. Owing to its accuracy, speed, and versatility, US-SOMO should allow users to take full advantage of the potential of solution hydrodynamics as a complement to higher-resolution techniques in biomacromolecular modeling. (A schematic sketch of cubic-grid bead generation appears after the publication list.)
    Biophysics of Structure and Mechanism 03/2009; 39(3):423-35. · 2.44 Impact Factor
  • ABSTRACT: A computational approach for fitting sedimentation velocity experiments from an analytical ultracentrifuge in a model-independent fashion is presented. This chapter offers a recipe for obtaining high-resolution information for both the shape and the molecular weight distributions of complex mixtures that are heterogeneous in shape and molecular weight, and provides suggestions for experimental design to optimize information content. A combination of three methods is used to find the solution most parsimonious in parameters and to verify the statistical confidence intervals of the determined parameters. A supercomputer implementation with a MySQL database back end is integrated into the UltraScan analysis software. The UltraScan LIMS Web portal is used to perform the calculations through a Web interface. The performance and limitations of the method when employed for the analysis of complex mixtures are demonstrated using both simulated data and experimental data characterizing amyloid aggregation.
    Methods in enzymology 02/2009; 454:87-113. · 1.90 Impact Factor
  • Emre H. Brookes, Borries Demeler
    ABSTRACT: Frequently in the physical sciences, experimental data are analyzed to determine model parameters using techniques known as parameter estimation. Eliminating the effects of noise from experimental data often involves Tikhonov or Maximum-Entropy regularization. These methods introduce a bias which smoothes the solution. In the problems considered here, the exact answer is sharp, containing a sparse set of parameters. Therefore, it is desirable to find the simplest set of model parameters for the data with an equivalent goodness-of-fit. This paper explains how to bias the solution towards a parsimonious model with a careful application of Genetic Algorithms. A method of representation, initialization and mutation is introduced to efficiently find this model. The results are compared with results from two other methods on simulated data with known content. Our method is shown to be the only one to achieve the desired results. Analysis of Analytical Ultracentrifugation sedimentation velocity experimental data is the primary example application. (A schematic sketch of such a parsimony-biased genetic algorithm appears after the publication list.)
    Genetic and Evolutionary Computation Conference, GECCO 2007, Proceedings, London, England, UK, July 7-11, 2007; 01/2007
  • Emre Brookes, Borries Demeler
    ABSTRACT: Sedimentation experiments can provide a large amount of information about the composition of a sample, and the properties of each component contained in the sample. To extract the details of the composition and the component properties, experimental data can be described by a mathematical model, which can then be fitted to the data. If the model is nonlinear in the parameters, the parameter adjustments are typically performed by a nonlinear least squares optimization algorithm. For models with many parameters, the error surface of this optimization often becomes very complex, the parameter solution tends to become trapped in a local minimum and the method may fail to converge. We introduce here a stochastic optimization approach for sedimentation velocity experiments utilizing genetic algorithms which is immune to such convergence traps and allows high-resolution fitting of nonlinear multi-component sedimentation models to yield distributions for sedimentation and diffusion coefficients, molecular weights, and partial concentrations.
    02/2006: pages 33-40;
  • E.H. Brookes, R.V. Boppana, B. Demeler
    ABSTRACT: We present a novel divide-and-conquer method for parallelizing a large-scale multivariate linear optimization problem, which is commonly solved using a sequential algorithm with the entire parameter space as the input. The optimization solves a large parameter estimation problem where the result is sparse in the parameters. By partitioning the parameters and the associated computations, our technique overcomes memory constraints when used in the context of a single workstation and achieves high processor utilization when large workstation clusters are used. We implemented this technique in a widely used software package for the analysis of a biophysics problem, which is representative of a large class of problems in the physical sciences. We evaluate the performance of the proposed method on a 512-processor cluster and offer an analytical model for predicting the performance of the algorithm. (A schematic sketch of this divide-and-conquer scheme appears after the publication list.)
    Proceedings of the ACM/IEEE SC2006 Conference on High Performance Networking and Computing, November 11-17, 2006, Tampa, FL, USA; 01/2006
  • Emre Brookes, Borries Demeler
    ABSTRACT: The advent of parallel computing technology and low-cost computing hardware has facilitated the adoption of high-performance computing tools for the analysis of sedimentation data. Over the past 15 years, we have developed the UltraScan software (Demeler et al., http://ultrascan.uthscsa.edu) to support sedimentation analysis, experimental design, and data management. We describe here recent extensions and advances in methodology that have been adapted in UltraScan. High-performance computing methods implemented on parallel supercomputers utilizing grid computing technology are used to analyze sedimentation experiments at much higher resolution than was previously possible. We discuss the implementation of parallel computing in three novel algorithms used in UltraScan for modeling of sedimentation velocity experiments and provide guidelines for effective data analysis.
    Colloid and Polymer Science 286(2):139-148. · 2.16 Impact Factor
  • Borries Demeler, Emre Brookes
    ABSTRACT: High-resolution analysis approaches for sedimentation experiments have recently been developed that promise to provide a detailed description of heterogeneous samples by identifying both shape and molecular weight distributions. In this study, we describe the effect experimental noise has on the accuracy and precision of such determinations and offer a stochastic Monte Carlo approach, which reliably quantifies the effect of noise by determining the confidence intervals for the parameters that describe each solute. As a result, we can now predict reliable confidence intervals for determined parameters. We also explore the effect of various experimental parameters on the confidence intervals and provide suggestions for improving the statistics by applying a few practical rules for the design of sedimentation experiments. (A schematic sketch of such a Monte Carlo confidence procedure appears after the publication list.)
    Colloid and Polymer Science 286(2):129-137. · 2.16 Impact Factor
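
The Custom Grid entry above (Analytical Chemistry, 2014) parametrizes solutes by pairs such as molar mass and partial specific volume while a third property, for example the anisotropy, is held constant. The following is a minimal sketch of how such a grid point can be mapped to sedimentation and diffusion coefficients through the standard Svedberg and Stokes-Einstein relations; the function name, the water-like solvent constants, and the grid ranges are illustrative assumptions, not values taken from UltraScan-III.

```python
import numpy as np

N_A = 6.02214076e23   # Avogadro constant [1/mol]
R   = 8.31446e7       # gas constant [erg/(mol K)], CGS units

def s_and_D(M, vbar, ff0, T=293.15, rho=0.9982, eta=0.01002):
    """Map a grid point (molar mass M [g/mol], partial specific volume vbar [mL/g],
    frictional ratio f/f0) to a sedimentation coefficient s [s] and a diffusion
    coefficient D [cm^2/s], assuming water-like density and viscosity at 20 C."""
    r0 = np.cbrt(3.0 * M * vbar / (4.0 * np.pi * N_A))   # radius of the anhydrous sphere [cm]
    f0 = 6.0 * np.pi * eta * r0                          # Stokes friction of that sphere
    f  = ff0 * f0                                        # actual frictional coefficient
    s  = M * (1.0 - vbar * rho) / (N_A * f)              # Svedberg relation
    D  = R * T / (N_A * f)                               # Einstein relation
    return s, D

# A small custom grid over (M, vbar) with the anisotropy held constant at f/f0 = 1.2
M_grid    = np.linspace(5e3, 5e5, 50)        # g/mol
vbar_grid = np.linspace(0.30, 0.90, 40)      # mL/g, spans dense particles to proteins
grid = [(M, vb, *s_and_D(M, vb, ff0=1.2)) for M in M_grid for vb in vbar_grid]
print(len(grid), "grid points; first point (M, vbar, s, D):", grid[0])
```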
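
The PCSA entry above (Biophysical Journal, 2014) restricts the two-dimensional (s, f/f0) space to a single-valued curve and fits a non-negative spectrum along that curve. Below is a rough sketch of the idea, assuming a straight-line parametrization and a placeholder simulate_boundary function in place of finite-element Lamm-equation solutions; only the use of non-negative least squares (scipy.optimize.nnls) reflects the general approach, everything else is invented for the example.

```python
import numpy as np
from scipy.optimize import nnls

def simulate_boundary(s, ff0, t, r):
    """Placeholder for a whole-boundary model at (s, f/f0): a smooth step whose
    position moves with s*t and whose width grows with f/f0, just so the demo runs."""
    pos = 6.0 + 6e-5 * (s * 1e13) * t
    return 1.0 / (1.0 + np.exp(-(r - pos) / (0.05 * ff0)))

def fit_along_line(data, t, r, s_grid, k_start, k_end):
    """Fit one parametrically constrained spectrum: f/f0 varies linearly with s."""
    ff0_curve = np.linspace(k_start, k_end, s_grid.size)
    A = np.column_stack([simulate_boundary(s, k, t, r).ravel()
                         for s, k in zip(s_grid, ff0_curve)])
    weights, residual = nnls(A, data.ravel())
    return weights, residual

t = np.linspace(300.0, 3600.0, 30)[:, None]   # scan times [s]
r = np.linspace(5.9, 7.2, 200)[None, :]       # radial positions [cm]
s_grid = np.linspace(1e-13, 10e-13, 40)
data = simulate_boundary(3e-13, 1.5, t, r) + 0.5 * simulate_boundary(6e-13, 2.0, t, r)

# Scan a family of straight-line constraints and keep the best-fitting one
best = min((fit_along_line(data, t, r, s_grid, a, b)[1], (a, b))
           for a in np.linspace(1.0, 3.0, 9) for b in np.linspace(1.0, 3.0, 9))
print("best line constraint, f/f0 at (s_min, s_max):", best[1])
```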
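
The HPLC-SAXS entry above (Journal of Applied Crystallography, 2013) describes baseline-drift correction and Gaussian decomposition of overlapping elution peaks. A minimal illustration of that kind of decomposition with scipy.optimize.curve_fit, using a sum of Gaussians plus a linear baseline; the synthetic data, peak count and starting values are invented for the example and are not taken from the paper or from US-SOMO.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussians_plus_baseline(x, *p):
    """Sum of N Gaussians (amp, center, sigma per peak) plus a linear baseline (slope, offset)."""
    y = p[-2] * x + p[-1]
    for amp, cen, sig in zip(p[0:-2:3], p[1:-2:3], p[2:-2:3]):
        y = y + amp * np.exp(-0.5 * ((x - cen) / sig) ** 2)
    return y

# Synthetic elution profile: two overlapping peaks on a drifting baseline
x = np.linspace(0.0, 60.0, 600)                   # elution frame / time
rng = np.random.default_rng(0)
truth = gaussians_plus_baseline(x, 1.0, 25.0, 2.0, 0.6, 30.0, 3.0, 0.002, 0.05)
data = truth + rng.normal(0.0, 0.01, x.size)

p0 = [0.8, 24.0, 2.5, 0.5, 31.0, 2.5, 0.0, 0.0]   # rough initial guesses
popt, pcov = curve_fit(gaussians_plus_baseline, x, data, p0=p0)
print("fitted (amp, center, sigma) per peak:")
print(popt[:-2].reshape(-1, 3))
```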
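
The two-dimensional spectrum analysis entry above (Brookes, Cao and Demeler) fits sedimentation velocity data as a non-negative linear combination of whole-boundary models on a fine (s, f/f0) grid. The sketch below shows only that linear step, again with a placeholder boundary model standing in for finite-element Lamm-equation solutions, and without the Monte Carlo, noise-elimination and memory-management machinery described in the abstract.

```python
import numpy as np
from scipy.optimize import nnls

def boundary_model(s, ff0, t, r):
    """Placeholder whole-boundary model for one (s, f/f0) grid point; the real
    method uses finite-element solutions of the Lamm equation here."""
    pos = 6.0 + 6e-5 * (s * 1e13) * t
    return 1.0 / (1.0 + np.exp(-(r - pos) / (0.05 * ff0)))

t = np.linspace(300.0, 3600.0, 25)[:, None]     # scan times [s]
r = np.linspace(5.9, 7.2, 150)[None, :]         # radial positions [cm]

# Fine two-dimensional grid over s and f/f0
grid = [(s, k) for s in np.linspace(1e-13, 10e-13, 30)
               for k in np.linspace(1.0, 4.0, 10)]

# Design matrix: one column per grid point, one row per data point
A = np.column_stack([boundary_model(s, k, t, r).ravel() for s, k in grid])

# Synthetic "experiment": two solutes plus noise
b = (boundary_model(3e-13, 1.4, t, r) + 0.7 * boundary_model(7e-13, 2.6, t, r)).ravel()
b = b + np.random.default_rng(1).normal(0.0, 0.005, b.size)

concentrations, residual = nnls(A, b)
solutes = [(s, k, c) for (s, k), c in zip(grid, concentrations) if c > 1e-3]
print("recovered solutes (s, f/f0, concentration):", solutes)
```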
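
The US-SOMO bead-modeling entry above (2009) mentions a cubic-grid method, derived from AtoB, for reducing an atomic structure to a bead model before rigid-body hydrodynamic computations. The toy version below shows only the coarse-graining step, assumes plain coordinate/mass/volume arrays instead of a PDB parser, and derives bead radii from summed atomic volumes, which is a simplification rather than a description of what the program actually does.

```python
import numpy as np
from collections import defaultdict

def cubic_grid_beads(coords, masses, volumes, cell=5.0):
    """Coarse-grain atoms into one bead per occupied cubic cell.
    Bead position = mass-weighted centroid; bead radius from total atomic volume."""
    cells = defaultdict(list)
    for i, xyz in enumerate(coords):
        cells[tuple(np.floor(xyz / cell).astype(int))].append(i)
    beads = []
    for idx in cells.values():
        m = masses[idx]
        centroid = (coords[idx] * m[:, None]).sum(axis=0) / m.sum()
        radius = np.cbrt(3.0 * volumes[idx].sum() / (4.0 * np.pi))
        beads.append((centroid, radius, m.sum()))
    return beads

# Example with random "atoms" standing in for a parsed structure
rng = np.random.default_rng(2)
coords  = rng.normal(0.0, 15.0, (2000, 3))       # Angstrom
masses  = rng.uniform(12.0, 16.0, 2000)          # roughly C/N/O masses
volumes = rng.uniform(8.0, 20.0, 2000)           # Angstrom^3 per atom
beads = cubic_grid_beads(coords, masses, volumes, cell=5.0)
print(len(beads), "beads generated from", len(coords), "atoms")
```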
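
Two entries above, the GECCO 2007 paper and the 2006 chapter with Demeler, describe genetic-algorithm fitting that is biased towards parsimonious, sparse models. The compact, generic sketch below shows one way such a bias can be expressed, by adding a penalty per retained component to the fitness; the representation, operators, placeholder model and penalty weight are illustrative choices and not the published ones.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
S_MIN, S_MAX = 1.0, 10.0
x = np.linspace(0.0, 12.0, 400)

def component(s):
    """Placeholder signal for a single solute with parameter s."""
    return 1.0 / (1.0 + np.exp(-(x - s)))

data = component(3.0) + 0.5 * component(7.0) + rng.normal(0.0, 0.01, x.size)

def fitness(individual, penalty=5e-4):
    """RMSD of the best non-negative fit for this candidate set of s-values,
    plus a parsimony penalty for every component that is actually retained."""
    A = np.column_stack([component(s) for s in individual])
    c, _ = nnls(A, data)
    rmsd = np.sqrt(np.mean((A @ c - data) ** 2))
    return rmsd + penalty * np.count_nonzero(c > 1e-4)

def mutate(individual, rate=0.3, step=0.3):
    return [min(S_MAX, max(S_MIN, s + rng.normal(0.0, step))) if rng.random() < rate else s
            for s in individual]

population = [list(rng.uniform(S_MIN, S_MAX, 5)) for _ in range(40)]
for generation in range(60):
    population.sort(key=fitness)
    elite = population[:10]                       # elitism: keep the best candidates
    population = elite + [mutate(elite[rng.integers(len(elite))]) for _ in range(30)]

best = min(population, key=fitness)
print("best candidate s-values:", np.round(sorted(best), 2))
```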
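
The SC 2006 entry above (and the related 2010 multi-wavelength preprocessing entry) describes a divide-and-conquer treatment of very large, sparse non-negative least-squares problems: partition the parameter space, solve each partition, and carry only the surviving non-zero parameters into a combined stage. A serial toy version of that idea is shown below; in the published work the partitions are distributed over cluster nodes and the procedure is tuned for memory limits and job cost, none of which is modeled here.

```python
import numpy as np
from scipy.optimize import nnls

def column(s, x):
    """Placeholder model signal for one parameter value s."""
    return 1.0 / (1.0 + np.exp(-(x - s)))

x = np.linspace(0.0, 12.0, 500)
rng = np.random.default_rng(4)
data = column(3.0, x) + 0.4 * column(8.0, x) + rng.normal(0.0, 0.01, x.size)

params = np.linspace(0.5, 11.5, 2000)            # full (large) parameter grid

def solve_subset(subset):
    """Solve the non-negative fit for one partition and return the surviving parameters."""
    A = np.column_stack([column(s, x) for s in subset])
    c, _ = nnls(A, data)
    return [s for s, w in zip(subset, c) if w > 1e-4]

# Stage 1: solve each partition independently (the parallelizable part)
survivors = []
for chunk in np.array_split(params, 20):
    survivors.extend(solve_subset(chunk))

# Stage 2: combine survivors from all partitions and solve the reduced problem
final = solve_subset(np.array(survivors))
print("final parameter estimates:", np.round(final, 2))
```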
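
The Demeler and Brookes entry above on experimental noise proposes a Monte Carlo procedure to attach confidence intervals to fitted solute parameters. The generic sketch below resamples synthetic noise around a best fit and refits; a simple exponential model and scipy.optimize.curve_fit stand in for the sedimentation models and fitting engine used in UltraScan.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, amplitude, rate):
    return amplitude * np.exp(-rate * t)          # stand-in for a sedimentation model

rng = np.random.default_rng(5)
t = np.linspace(0.0, 10.0, 200)
data = model(t, 2.0, 0.45) + rng.normal(0.0, 0.05, t.size)

# Initial fit and its residuals (an estimate of the experimental noise level)
popt, _ = curve_fit(model, t, data, p0=[1.0, 0.1])
sigma = (data - model(t, *popt)).std()

# Monte Carlo: add synthetic noise of the same magnitude to the best fit and refit
estimates = []
for _ in range(500):
    synthetic = model(t, *popt) + rng.normal(0.0, sigma, t.size)
    p, _ = curve_fit(model, t, synthetic, p0=popt)
    estimates.append(p)
estimates = np.array(estimates)

lo, hi = np.percentile(estimates, [2.5, 97.5], axis=0)
print("95% confidence intervals: amplitude", (lo[0], hi[0]), "rate", (lo[1], hi[1]))
```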

Publication Stats

221 Citations
28.03 Total Impact Points

Institutions

  • 2010–2014
    • University of Texas Health Science Center at San Antonio
      • Department of Biochemistry
      San Antonio, Texas, United States
  • 2013
    • California College San Diego
      San Diego, California, United States
  • 2009
    • Max Planck Institute of Colloids and Interfaces
      • Department of Colloid Chemistry
      Potsdam, Brandenburg, Germany
  • 2006–2007
    • University of Texas at San Antonio
      • Department of Computer Science
      San Antonio, Texas, United States