Tekin Bicer

Argonne National Laboratory · Data Science and Learning Division

Doctor of Philosophy

About

70 Publications · 7,830 Reads
1,222 Citations
Additional affiliations
June 2014 - present
Argonne National Laboratory
  • Position: Postdoc
March 2007 - May 2014
The Ohio State University
  • Position: Research/Teaching Assistant

Publications (70)
Preprint
Ptychography is a scanning coherent diffractive imaging technique that enables imaging nanometer-scale features in extended samples. One main challenge is that widely used iterative image reconstruction methods often require a significant amount of overlap between adjacent scan locations, leading to large data volumes and prolonged acquisition times....
Preprint
Full-text available
Reducing the radiation dose in computed tomography (CT) is crucial, but it often results in sparse-view CT, where the number of available projections is significantly reduced. This reduction in projection data makes it challenging to accurately reconstruct high-quality CT images. Under these conditions, a sinogram, which is a collection of these project...
Article
Full-text available
Surface reconstruction and the associated severe strain propagation have long been reported as the major cause of cathode failure during fast charging and long-term cycling. Despite tremendous attempts, no known strategies can simultaneously address the electro-chemomechanical instability without sacrificing energy and power density. Here we report...
Article
Full-text available
Coherent imaging techniques provide an unparalleled multi-scale view of materials across scientific and technological fields, from structural materials to quantum devices, from integrated circuits to biological cells. Driven by the construction of brighter sources and high-rate detectors, coherent imaging methods like ptychography are poised to rev...
Preprint
Advancements in machine learning have introduced innovative approaches for analyzing and enhancing tomographic datasets. However, many of these neural networks pose challenges for non-technical users, limiting their practical application at high-speed synchrotron tomography instruments. This manuscript introduces TomoSuitePY, a Python-based module...
Preprint
We present an end-to-end automated workflow that uses large-scale remote compute resources and an embedded GPU platform at the edge to enable AI/ML-accelerated real-time analysis of data collected for x-ray ptychography. Ptychography is a lensless method that is being used to image samples through a simultaneous numerical inversion of a large numbe...
Preprint
Full-text available
CNN-based surrogates have become prevalent in scientific applications to replace conventional time-consuming physical approaches. Although these surrogates can yield satisfactory results with significantly lower computation costs over small training datasets, our benchmarking results show that data-loading overhead becomes the major performance bot...
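The bottleneck described here is typically mitigated by overlapping I/O with computation. As an illustrative sketch only (not the paper's implementation; all names are hypothetical), a background thread can prefetch batches into a bounded queue so that training steps rarely wait on storage:

    # Hypothetical sketch: overlap data loading with compute via a bounded queue.
    import queue
    import threading

    def prefetching_batches(load_batch, num_batches, depth=4):
        """Yield batches while a background thread loads ahead of the consumer."""
        q = queue.Queue(maxsize=depth)   # bounds memory used by prefetched batches
        sentinel = object()

        def producer():
            for i in range(num_batches):
                q.put(load_batch(i))     # blocks only when `depth` batches are queued
            q.put(sentinel)

        threading.Thread(target=producer, daemon=True).start()
        while True:
            batch = q.get()
            if batch is sentinel:
                return
            yield batch

    # Usage: for batch in prefetching_batches(read_from_disk, 1000): train_step(batch)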
Article
Full-text available
Powerful detectors at modern experimental facilities routinely collect data at multiple GB/s. Online analysis methods are needed to enable the collection of only interesting subsets of such massive data streams, such as by explicitly discarding some data elements or by directing instruments to relevant areas of experimental space. Thus, methods are...
Preprint
Full-text available
Coherent microscopy techniques provide an unparalleled multi-scale view of materials across scientific and technological fields, from structural materials to quantum devices, from integrated circuits to biological cells. Driven by the construction of brighter sources and high-rate detectors, coherent X-ray microscopy methods like ptychography are p...
Preprint
Full-text available
Powerful detectors at modern experimental facilities routinely collect data at multiple GB/s. Online analysis methods are needed to enable the collection of only interesting subsets of such massive data streams, such as by explicitly discarding some data elements or by directing instruments to relevant areas of experimental space. Such online analy...
Article
Full-text available
While the advances in synchrotron light sources, together with the development of focusing optics and detectors, allow nanoscale ptychographic imaging of materials and biological specimens, the corresponding experiments can yield terabyte-scale volumes of data that can impose a heavy burden on the computing platform. Although graphics processing un...
Chapter
Full-text available
Beamlines at synchrotron light source facilities are powerful scientific instruments used to image samples and observe phenomena at high spatial and temporal resolutions. Typically, these facilities are equipped only with modest compute resources for the analysis of generated experimental datasets. However, high data rate experiments can easily gen...
Article
This work extends our previous research entitled “MemXCT: Memory-centric X-ray CT Reconstruction with Massive Parallelization”, originally published at the SC19 conference (Hidayetoğlu et al., 2019), with a reproducibility study of its computational imaging performance. X-ray computed tomography (XCT) is regularly used at synchrotron light sources to...
Preprint
Beamlines at synchrotron light source facilities are powerful scientific instruments used to image samples and observe phenomena at high spatial and temporal resolutions. Typically, these facilities are equipped only with modest compute resources for the analysis of generated experimental datasets. However, high data rate experiments can easily gen...
Article
Full-text available
Joint ptycho-tomography is a powerful computational imaging framework to recover the refractive properties of a 3D object while relaxing the requirements for probe overlap that is common in conventional phase retrieval. We use an augmented Lagrangian scheme for formulating the constrained optimization problem and employ an alternating direction met...
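For readers unfamiliar with the formulation, the constrained problem and its augmented Lagrangian relaxation described here take roughly the following generic form (notation assumed for this sketch, not taken from the paper):

    \min_{u,\psi}\; F(\psi) \quad \text{s.t.} \quad \psi = \mathcal{H}(u)

    L_\rho(u,\psi,\lambda) = F(\psi) + \langle \lambda,\, \psi - \mathcal{H}(u) \rangle + \frac{\rho}{2}\,\lVert \psi - \mathcal{H}(u) \rVert_2^2

where F measures the misfit between the intermediate images psi and the measured diffraction data, H is the tomographic forward (projection) operator applied to the object u, lambda is the dual variable, and rho > 0 is the penalty parameter. The alternating direction method then minimizes L_rho over psi and u in turn, followed by a dual update.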
Preprint
Full-text available
While the advances in synchrotron light sources, together with the development of focusing optics and detectors, allow nanoscale ptychographic imaging of materials and biological specimens, the corresponding experiments can yield terabyte-scale volumes of data that can impose a heavy burden on the computing platform. While Graphical Processin...
Preprint
Full-text available
Synchrotron-based X-ray computed tomography is widely used for investigating inner structures of specimens at high spatial resolutions. However, potential beam damage to samples often limits the X-ray exposure during tomography experiments. Proposed strategies for eliminating beam damage also decrease reconstruction quality. Here we present a deep...
Preprint
Full-text available
Joint ptycho-tomography is a powerful computational imaging framework to recover the refractive properties of a 3D object while relaxing the requirements for probe overlap that is common in conventional phase retrieval. We use an augmented Lagrangian scheme for formulating the constrained optimization problem and employ an alternating direction met...
Preprint
X-ray computed tomography is a commonly used technique for noninvasive imaging at synchrotron facilities. Iterative tomographic reconstruction algorithms are often preferred for recovering high-quality 3D volumetric images from 2D X-ray images; however, their use has been limited to small/medium datasets due to their computational requirements. In...
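As context for the computational cost mentioned here, iterative reconstruction is commonly posed as a regularized least-squares problem of the generic form (assumed notation, not necessarily this paper's exact model):

    \hat{x} = \arg\min_x \; \frac{1}{2}\,\lVert Ax - b \rVert_2^2 + \beta\, R(x)

where x is the 3D volume, A the discretized forward projection (Radon) operator, b the measured projection data, and R a regularizer weighted by beta. Every iteration applies A and its adjoint to the full dataset, which is what limits these methods to small and medium problem sizes without large-scale parallelization.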
Article
Deep Priors for Ptycho-tomography - Selin Aslan, Zhengchun Liu, Viktor Nikitin, Tekin Bicer, Sven Leyffer, Doga Gursoy
Article
Full-text available
Synchrotron-based x-ray tomography is a noninvasive imaging technique that allows for reconstructing the internal structure of materials at high spatial resolutions from tens of micrometers to a few nanometers. In order to resolve sample features at smaller length scales, however, a higher radiation dose is required. Therefore, the limitation on th...
Conference Paper
Full-text available
X-ray computed tomography (XCT) is used regularly at synchrotron light sources to study the internal morphology of materials at high resolution. However, experimental constraints, such as radiation sensitivity, can result in noisy or undersampled measurements. Further, depending on the resolution, sample size and data acquisition rates, the resultin...
Preprint
Full-text available
Experimental protocols at synchrotron light sources typically process and validate data only after an experiment has completed, which can lead to undetected errors and cannot enable online steering. Real-time data analysis can enable both detection of, and recovery from, errors, and optimization of data acquisition. However, modern scientific instr...
Article
Full-text available
We introduce a Bayesian framework for the ptychotomographic imaging of 3D objects under photon-limited conditions. This approach is significantly more robust to measurement noise by incorporating prior information on the probabilities of the object features and the measurement process into the reconstruction problem, and it can improve both the tem...
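In generic terms, the maximum a posteriori (MAP) estimate behind such a Bayesian framework is (notation assumed for this sketch):

    \hat{u} = \arg\max_u \; p(u \mid d) = \arg\min_u \; \big[ -\log p(d \mid u) - \log p(u) \big]

where the likelihood p(d | u) models the photon-counting measurement process for the observed data d and the prior p(u) encodes the expected object features; under photon-limited conditions the likelihood term is typically Poisson.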
Article
Full-text available
We present the extension of ptychography for three-dimensional object reconstruction in a tomography setting. We describe the alternating direction method of multipliers (ADMM) as a generic reconstruction framework to efficiently solve the nonlinear optimization problem. In this framework, the ADMM breaks the joint reconstruction problem into two w...
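The generic ADMM iteration referenced here, for a split problem min f(x) + g(z) subject to Ax + Bz = c, alternates the following updates (this is the standard textbook scheme; the paper's exact splitting may differ):

    x^{k+1} = \arg\min_x \; L_\rho(x, z^k, \lambda^k)
    z^{k+1} = \arg\min_z \; L_\rho(x^{k+1}, z, \lambda^k)
    \lambda^{k+1} = \lambda^k + \rho\,(A x^{k+1} + B z^{k+1} - c)

so the joint problem decouples into a ptychographic phase-retrieval subproblem and a tomographic subproblem, each of which can be handled by existing solvers.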
Preprint
Full-text available
Synchrotron-based x-ray tomography is a noninvasive imaging technique that allows for reconstructing the internal structure of materials at high spatial resolutions. Here we present TomoGAN, a novel denoising technique based on generative adversarial networks, for improving the quality of reconstructed images for low-dose imaging conditions, as at...
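Denoisers of this kind train a generator G to map a noisy (low-dose) reconstruction to a clean estimate, judged by a discriminator D; a common combined objective (a generic GAN-plus-fidelity form, not necessarily TomoGAN's exact loss) is

    \min_G \max_D \; \mathbb{E}[\log D(x_{\text{clean}})] + \mathbb{E}[\log(1 - D(G(x_{\text{noisy}})))] + \lambda\, \mathbb{E}[\lVert G(x_{\text{noisy}}) - x_{\text{clean}} \rVert_2^2]

with training pairs formed from low-dose inputs and normal-dose targets.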
Conference Paper
Full-text available
Modern parallel architecture design has increasingly turned to throughput-oriented devices to address concerns about energy efficiency and power consumption. However, graph applications cannot tap into the full potential of such architectures because of highly unstructured computations and irregular memory accesses. In this paper, we present GraphP...
Article
Full-text available
We investigate the effects of angular diversity on image-reconstruction quality of scanning-probe x-ray tomography for both fly- and step-mode data collection. We propose probe-coverage maps as a tool for both visualizing and quantifying the distribution of probe interactions with the object. We show that data sampling with more angular diversity y...
Article
Full-text available
Background: Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation,...
Article
Full-text available
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and c...
Conference Paper
Full-text available
In-situ analytics has lately been shown to be an effective approach to reduce both I/O and storage costs for scientific analytics. Developing an efficient in-situ implementation, however, involves many challenges, including parallelization, data movement or sharing, and resource allocation. Based on the premise that MapReduce can be an appropriate...
Article
Quantitative measurements of direct injection fuel spray density and mixing are difficult to achieve using optical diagnostics, due to the substantial scattering of light and high optical density of the droplet field. For multi-hole sprays, the problem is even more challenging, as it is difficult to isolate a single spray plume along a single line...
Conference Paper
Synchrotron (x-ray) light sources permit investigation of the structure of matter at extremely small length and time scales. Advances in detector technologies enable increasingly complex experiments and more rapid data acquisition. However, analysis of the resulting data then becomes a bottleneck—preventing near-real-time error detection or experim...
Technical Report
Recent years have witnessed an increasing performance gap between I/O and compute capabilities. In-situ analytics, which can avoid expensive data movement of simulation output data by co-locating both simulation and analytics programs, has lately been shown to be an effective approach to reduce both I/O and storage costs. Developing an efficient in...
Article
Full-text available
A maximum a posteriori approach is proposed for X-ray diffraction tomography for reconstructing three-dimensional spatial distribution of crystallographic phases and orientations of polycrystalline materials. The approach maximizes the a posteriori density which includes a Poisson log-likelihood and an a priori term that reinforces expected solutio...
Technical Report
Full-text available
Recent years have witnessed an increasing performance gap between I/O and compute capabilities. In-situ analytics, which can avoid expensive data movement of simulation output data by co-locating both simulation and analytics programs, has lately been shown to be an effective approach to reduce both I/O and storage costs. Developing an efficient in...
Article
Full-text available
A penalized maximum-likelihood estimation is proposed to perform hyperspectral (spatio-spectral) image reconstruction for X-ray fluorescence tomography. The approach minimizes a Poisson-based negative log-likelihood of the observed photon counts, and uses a penalty term that has the effect of encouraging local continuity of model parameter estimate...
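Concretely, a penalized Poisson negative log-likelihood of the type described has the generic form (assumed notation):

    \hat{\theta} = \arg\min_\theta \; \sum_i \big[ \bar{y}_i(\theta) - y_i \log \bar{y}_i(\theta) \big] + \beta\, \Phi(\theta)

where y_i are the observed photon counts, \bar{y}_i(\theta) the expected counts under the model parameters theta, and Phi a penalty term that encourages local continuity, weighted by beta.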
Conference Paper
The increasing number of cores in parallel computer systems allows scientific simulations to be executed with increasing spatial and temporal granularity. However, this also implies that increasingly large datasets need to be output, stored, managed, and then visualized and/or analyzed using a variety of methods. In examining the possibility...
Conference Paper
Scientific simulations and instruments can generate tremendous amounts of data in short periods of time. Since the generated data is used for inferring new knowledge, it is important to store it efficiently and make it available to scientific endeavors. Although parallel and distributed systems can help to ease the management of such data, the transmission a...
Conference Paper
Compute cycles in high performance systems are increasing at a much faster pace than both storage and wide-area bandwidths. To continue improving the performance of large-scale data analytics applications, compression has therefore become a promising approach. In this context, this paper makes the following contributions. First, we develop a new comp...
Conference Paper
Full-text available
Purpose-built clusters permeate many of today's organizations, providing both large-scale data storage and computing. Within local clusters, competition for resources complicates applications with deadlines. However, given the emergence of the cloud's pay-as-you-go model, users are increasingly storing portions of their data remotely and allocating...
Conference Paper
Full-text available
Recently, there has been growing interest in using Cloud resources for a variety of high performance and data-intensive applications. While there are currently a number of commercial Cloud service providers, Amazon Web Services (AWS) appears to be the most widely used. One of the main services that AWS offers is the Simple Storage Service (S3) for u...
Conference Paper
Full-text available
In this work, we consider the challenge of data analysis in a scenario where data is stored across a local cluster and cloud resources. We describe a software framework to enable data-intensive computing with cloud bursting, i.e., using a combination of compute resources from a local cluster and a cloud environment to perform Map-Reduce type proces...
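As an illustrative sketch of the cloud-bursting idea (stand-in thread pools play the roles of the local cluster and the cloud here; the 50/50 split and all names are assumptions for this sketch, not the paper's framework):

    # Hypothetical sketch: split map tasks across a "local" and a "cloud" pool,
    # then merge partial results in a single reduce step.
    from concurrent.futures import ThreadPoolExecutor
    from functools import reduce

    def map_reduce_burst(chunks, map_fn, reduce_fn, local_workers=4, cloud_workers=8):
        split = len(chunks) // 2
        with ThreadPoolExecutor(local_workers) as local, \
             ThreadPoolExecutor(cloud_workers) as cloud:
            # Executor.map submits all tasks immediately, so both pools run
            # concurrently; results are gathered afterwards.
            local_results = local.map(map_fn, chunks[:split])
            cloud_results = cloud.map(map_fn, chunks[split:])
            partials = list(local_results) + list(cloud_results)
        return reduce(reduce_fn, partials)

    # e.g. map_reduce_burst(text_chunks, count_words, merge_counts) for word counting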
Conference Paper
Full-text available
For many organizations, one attractive use of cloud resources can be through what is referred to as cloud bursting or the hybrid cloud. These refer to scenarios where an organization acquires and manages in-house resources to meet its base need, but can use additional resources from a cloud provider to maintain an acceptable response time during wo...
Conference Paper
Full-text available
This paper gives an overview of a framework for making existing MPI applications elastic, and executing them with user-specified time and cost constraints in a cloud framework. Considering the limitations of the MPI implementations currently available, we support adaptation by terminating one execution and restarting a new program on a different nu...
Conference Paper
Full-text available
There is a clear trend towards using cloud resources in the scientific or the HPC community, with a key attraction of cloud being the elasticity it offers. In executing HPC applications on a cloud environment, it will clearly be desirable to exploit elasticity of cloud environments, and increase or decrease the number of instances an application is...
Conference Paper
Full-text available
Over the last 2-3 years, the importance of data-intensive computing has increasingly been recognized, closely coupled with the emergence and popularity of map-reduce for developing this class of applications. Besides programmability and ease of parallelization, fault tolerance is clearly important for data-intensive applications, because of their l...
