Roselyne Barreto Tchoua

DePaul University · Computing and Digital Media

Doctor of Philosophy

About

37 Publications · 8,578 Reads · 548 Citations
Introduction
Roselyne Tchoua is an Assistant Professor in the School of Computing at DePaul University, USA. She enjoys working at the intersection of computer science and other scientific fields, extracting insight from data using machine learning and natural language processing techniques.

Publications (37)
Chapter
Extracting scientific facts from unstructured text is difficult due to challenges specific to the complexity of the scientific named entities and relations to be extracted. This problem is well illustrated through the extraction of polymer names and their properties. Even in the cases where the property is a temperature, identifying the polymer nam...
Conference Paper
Computer-aided Diagnosis (CAD) systems have long aimed to be used in clinical practice to help doctors make decisions by providing a second opinion. However, most machine learning-based CAD systems make predictions without explicitly showing how their predictions were generated. Since the cognitive process of the diagnostic imaging interpretation i...
Chapter
The automated extraction of claims from scientific papers via computer is difficult due to the ambiguity and variability inherent in natural language. Even apparently simple tasks, such as isolating reported values for physical quantities (e.g., “the melting point of X is Y”) can be complicated by such factors as domain-specific conventions about h...
Chapter
Scientific Named Entity Referent Extraction is often more complicated than traditional Named Entity Recognition (NER). For example, in polymer science, chemical structure may be encoded in a variety of nonstandard naming conventions, and authors may refer to polymers with conventional names, commonly used names, labels (in lieu of longer names), sy...
Article
Typically, thousands of computationally expensive micro-scale simulations of brittle crack propagation are needed to upscale lower length scale phenomena to the macro-continuum scale. Running such a large number of crack propagation simulations presents a significant computational challenge, making reduced-order models (ROMs) attractive for this ta...
Preprint
Full-text available
In this paper, five different approaches for reduced-order modeling of brittle fracture in geomaterials, specifically concrete, are presented and compared. Four of the five methods rely on machine learning (ML) algorithms to approximate important aspects of the brittle fracture problem. In addition to the ML algorithms, each method incorporates dif...
Article
A wealth of valuable data is locked within the millions of research articles published each year. Reading and extracting pertinent information from those articles has become an unmanageable task for scientists. This problem hinders scientific progress by making it hard to build on results buried in literature. Moreover, these data are loosely struc...
Article
Structured databases of chemical and physical properties play a central role in the everyday research activities of scientists and engineers. In materials science, researchers and engineers turn to these databases to quickly query, compare, and aggregate various properties, thereby allowing for the development or application of new materials. The v...
Article
Applications running on leadership platforms are increasingly bottlenecked by storage input/output (I/O). In an effort to combat the growing disparity between I/O throughput and compute capability, we created the Adaptable IO System (ADIOS) in 2005. Focusing on putting users first with a service-oriented architecture, we combined cutting edge resea...
Conference Paper
Full-text available
Scientific communities have benefitted from a significant increase of available computing and storage resources in the last few decades. For science projects that have access to leadership scale computing resources, the capacity to produce data has been growing exponentially. Teams working on such projects must now include, in addition to the tradi...
Article
Full-text available
Collaboratively monitoring and analyzing large scale simulations from petascale computers is an important area of research and development within the scientific community. This paper addresses these issues when teams of colleagues from different research areas work together to help understand the complex data generated from these simulations. In pa...
Conference Paper
As simulations begin to scale to extreme processor counts trying to understand the mysteries of the universe, collaboration becomes an essential piece of the scientists' daily life as they work to run, analyze, and process their data from these simulations. Most of the teams that we collaborate with work identically to the way they did in the past,...
Article
Full-text available
EFFIS is a set of tools developed for working with large-scale simulations. EFFIS is used by researchers in the Center for Plasma Edge Simulation, as well as many other areas of science. EFFIS is composed of services including adaptable I/O, workflows, dashboards, visualization, code coupling, wide-area data movement, and provenance capturing. One...
Conference Paper
Full-text available
Collaboratively monitoring and analyzing large scale simulations from petascale computers is an important area of research and development within the scientific community. This paper addresses these issues when teams of colleagues from different research areas work together to help understand the complex data generated from these simulations. In pa...
Conference Paper
Full-text available
EFFIS is a set of tools developed for working with large-scale simulations. EFFIS is used by researchers in the Center for Plasma Edge Simulation, as well as many other areas of science. EFFIS is composed of services including adaptable I/O, workflows, dashboards, visualization, code coupling, wide-area data movement, and provenance capturing. One...
Conference Paper
Full-text available
Simulations that require massive amounts of computing power and generate tens of terabytes of data are now part of the daily lives of scientists. Analyzing and visualizing the results of these simulations as they are computed can lead not only to early insights but also to useful knowledge that can be provided as feedback to the simulation, avoidin...
Article
Full-text available
Performance prediction for ITER is based upon the ubiquitous experimental observation that the plasma energy confinement in the device core is strongly coupled to the edge confinement for an unknown reason. The coupling time-scale is much shorter than the plasma transport time-scale. In order to understand this critical observation, a multi-scale t...
Conference Paper
Full-text available
Workflow Management Systems (WFMS), such as Kepler, are proving to be an important tool in scientific problem solving. They can automate and manage complex processes and huge amounts of data produced by petascale simulations. Typically, the produced data need to be properly visualized and analyzed by scientists in order to achieve the desired scien...
Article
Full-text available
The emergence of leadership class computing is creating a tsunami of data from petascale simulations. Results are typically analyzed by dozens of scientists. In order for the scientist to digest the vast amount of data being produced from the simulations and auxiliary programs, it is critical to automate the effort to manage, analyze, visualize, an...
Article
Full-text available
A new predictive computer simulation tool targeting the development of the H-mode pedestal at the plasma edge in tokamaks and the triggering and dynamics of edge localized modes (ELMs) is presented in this report. This tool brings together, in a coordinated and effective manner, several first-principles physics simulation codes, stability analysis...
Article
Full-text available
Performance of the ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to del...
Conference Paper
Full-text available
Petascale computing is approaching quickly, but the task of building simulations on these machines is daunting. These computers contain hundreds of thousands of processors, and the simulations can run for weeks to produce vital results, which then must be analyzed and visualized. As the complexity of the simulations and computers increases, so does...
Article
Full-text available
Comprehensive, end-to-end, data and workflow management solutions are needed to handle the increasing complexity of processes and data volumes associated with modern distributed scientific problem solving, such as ultrascale simulations and high-throughput experiments. The key to the solution is an integrated network-based framework that is functio...
Article
Simulations of edge pressure pedestal buildup and ELM crash in a typical DIII-D H-mode discharge are performed using Kepler, an open-source scientific workflow system that manages complex applications. A Kepler workflow conducts an edge plasma simulation that loosely couples the kinetic code XGC0 with an ideal MHD linear stability analysis code ELI...
