Eric W. Bridgeford
Johns Hopkins University | JHU · Department of Biostatistics

Bachelor of Science in Biomedical Engineering and Computer Science

About

36 Publications · 8,858 Reads · 408 Citations
Additional affiliations
August 2014 - present
Johns Hopkins University
Position
  • Researcher
Description
  • Working with the Open Connectome Project to provide a reliable service for uploading, analyzing, and comparing MR data. My goal is to incorporate fMRI processing techniques into our pipeline so that we can effectively analyze both DTI and fMRI datasets.
May 2014 - February 2016
University of Pennsylvania
Position
  • Researcher
Description
  • Conducted small-world network analysis and tool development. Together with my coauthors, developed the small-world propensity, a quantitative tool for analyzing the basic structure of weighted networks.
Education
August 2018 - August 2022

Publications (36)
Preprint
Why do networks have negative weights at all? The answer is: to learn more functions. We mathematically prove that deep neural networks with all non-negative weights are not universal approximators. Much of the deep learning literature implicitly assumes this fundamental result without proving it or demonstrating its necessity.
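A minimal numerical sketch of the intuition (not the paper's proof; all names below are illustrative): with non-negative weights and a monotone activation such as ReLU, the network output is non-decreasing in every input coordinate, so it cannot approximate a decreasing target such as f(x) = -x.

    import numpy as np

    rng = np.random.default_rng(0)

    # A toy two-layer network with all non-negative weights and ReLU activations.
    W1 = rng.uniform(0, 1, size=(16, 1))   # non-negative first-layer weights
    b1 = rng.uniform(-1, 1, size=16)       # biases may be arbitrary
    W2 = rng.uniform(0, 1, size=(1, 16))   # non-negative second-layer weights
    b2 = rng.uniform(-1, 1, size=1)

    def net(x):
        # ReLU is non-decreasing and non-negative linear maps preserve
        # monotonicity, so net(.) is non-decreasing in its input.
        h = np.maximum(W1 @ np.atleast_1d(x) + b1, 0.0)
        return (W2 @ h + b2)[0]

    xs = np.linspace(-2, 2, 201)
    ys = np.array([net(x) for x in xs])

    # Monotone non-decreasing output: it can never track a decreasing target like -x.
    assert np.all(np.diff(ys) >= -1e-12)
    print("output range:", ys.min(), ys.max())
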
Preprint
Characterizing individual variations is central to interpreting individual differences in neuroscience and clinical studies. While the field has examined multifaceted individual differences in brain functional organization, it is only in recent years that neuroimaging researchers have begun to place a priority on its quantification and optimization...
Preprint
Full-text available
Connectomics, the study of brain networks, provides a unique and valuable opportunity to study the brain. However, research in human connectomics, accomplished via Magnetic Resonance Imaging (MRI), is a resource-intensive practice: typical analysis routines require impactful decision making and significant computational capabilities. Mitigating these...
Article
Full-text available
Replicability, the ability to replicate scientific findings, is a prerequisite for scientific discovery and clinical utility. Troublingly, we are in the midst of a replicability crisis. A key to replicability is that multiple measurements of the same item (e.g., experimental sample or clinical participant) under fixed experimental constraints are r...
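A hedged sketch of one way such stability can be quantified, assuming a discriminability-style index: the fraction of comparisons in which a repeated measurement of an item is closer to that item's other measurements than to measurements of different items. The function name and toy data are illustrative, not the paper's exact estimator.

    import numpy as np

    def discriminability_index(X, item_ids):
        # X: (n_measurements, n_features); item_ids[i] says which item row i measures.
        # Counts how often a within-item distance beats a between-item distance.
        X = np.asarray(X, dtype=float)
        item_ids = np.asarray(item_ids)
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        hits, total = 0, 0
        for i in range(len(item_ids)):
            same = item_ids == item_ids[i]
            same[i] = False                       # exclude the self-distance
            different = item_ids != item_ids[i]
            for j in np.where(same)[0]:
                hits += np.sum(D[i, j] < D[i, different])
                total += np.sum(different)
        return hits / total

    # Toy example: two measurements of each of three items, with small repeat noise.
    rng = np.random.default_rng(1)
    items = np.repeat([0, 1, 2], 2)
    truth = rng.normal(size=(3, 5))
    X = truth[items] + 0.1 * rng.normal(size=(6, 5))
    print(discriminability_index(X, items))       # near 1.0 for replicable measurements
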
Preprint
Full-text available
Batch effects, undesirable sources of variance across multiple experiments, present a substantial hurdle for scientific and clinical discoveries. Specifically, the presence of batch effects can create both spurious discoveries and hide veridical signals, contributing to the ongoing reproducibility crisis. Typical approaches to dealing with batch ef...
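As a toy illustration only (not the approach developed in this work), the sketch below simulates a purely additive batch shift across two batches and removes it by per-batch, per-feature mean centering; real batch effects, and the methods this preprint is concerned with, are considerably more involved.

    import numpy as np

    rng = np.random.default_rng(2)

    # Two batches measuring the same underlying signal, with an additive batch offset.
    n_per_batch, n_features = 50, 4
    signal = rng.normal(size=(2 * n_per_batch, n_features))
    batch = np.repeat([0, 1], n_per_batch)
    offset = np.array([0.0, 3.0])                      # batch 1 is shifted upward
    X = signal + offset[batch][:, None]

    # Naive location adjustment: subtract each batch's feature-wise mean,
    # then add back the grand mean so the overall location is preserved.
    X_adj = X.copy()
    grand_mean = X.mean(axis=0)
    for b in (0, 1):
        X_adj[batch == b] -= X[batch == b].mean(axis=0)
    X_adj += grand_mean

    print("batch mean gap before:", abs(X[batch == 0].mean() - X[batch == 1].mean()))
    print("batch mean gap after: ", abs(X_adj[batch == 0].mean() - X_adj[batch == 1].mean()))
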
Article
Full-text available
The alignment strength of a graph matching is a quantity that gives the practitioner a measure of the correlation of the two graphs, and it can also give the practitioner a sense for whether the graph matching algorithm found the true matching. Unfortunately, when a graph matching algorithm fails to find the truth because of weak signal, there may...
Article
Full-text available
To solve key biomedical problems, experimentalists now routinely measure millions or billions of features (dimensions) per sample, with the hope that data science techniques will be able to build accurate data-driven inferences. Because sample sizes are typically orders of magnitude smaller than the dimensionality of these data, valid inferences re...
Article
Full-text available
Using brain atlases to localize regions of interest is a requirement for making neuroscientifically valid statistical inferences. These atlases, represented in volumetric or surface coordinate spaces, can describe brain topology from a variety of perspectives. Although many human brain atlases have circulated the field over the past fifty years, li...
Article
The data science of networks is a rapidly developing field with myriad applications. In neuroscience, the brain is commonly modeled as a connectome, a network of nodes connected by edges. While there have been thousands of papers on connectomics, the statistics of networks remains limited and poorly understood. Here, we provide an overview from the...
Preprint
Full-text available
The alignment strength of a graph matching is a quantity that gives the practitioner a measure of the correlation of the two graphs, and it can also give the practitioner a sense for whether the graph matching algorithm found the true matching. Unfortunately, when a graph matching algorithm fails to find the truth because of weak signal, there may...
Preprint
Full-text available
A connectome is a map of the structural and/or functional connections in the brain. This information-rich representation has the potential to transform our understanding of the relationship between patterns in brain connectivity and neurological processes, disorders, and diseases. However, existing computational techniques used to analyze connectom...
Preprint
The data science of networks is a rapidly developing field with myriad applications. In neuroscience, the brain is commonly modeled as a connectome, a network of nodes connected by edges. While there have been thousands of papers on connectomics, the statistics of networks remains limited and poorly understood. Here, we provide an overview from the...
Preprint
The advent of modern data collection and processing techniques has seen the size, scale, and complexity of data grow exponentially. A seminal step in leveraging these rich datasets for downstream inference is understanding the characteristics of the data which are repeatable -- the aspects of the data that are able to be identified under a duplicat...
Article
Full-text available
Suppose that one particular block in a stochastic block model is of interest, but block labels are only observed for a few of the vertices in the network. Utilizing a graph realized from the model and the observed block labels, the vertex nomination task is to order the vertices with unobserved block labels into a ranked nomination list with the goa...
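A hedged sketch of the task setup rather than the algorithms studied in the paper: embed a simulated stochastic block model graph spectrally, then rank the unlabeled vertices by their distance to the centroid of the few seed vertices known to belong to the block of interest. Everything below is plain numpy and the names are illustrative.

    import numpy as np

    rng = np.random.default_rng(3)

    # Sample a two-block stochastic block model adjacency matrix.
    sizes = [40, 40]
    P = np.array([[0.5, 0.1], [0.1, 0.3]])
    labels = np.repeat([0, 1], sizes)
    probs = P[labels][:, labels]
    A = np.triu((rng.uniform(size=probs.shape) < probs).astype(float), 1)
    A = A + A.T                                   # symmetric, hollow adjacency

    # Adjacency spectral embedding: top eigenvectors scaled by sqrt(|eigenvalue|).
    vals, vecs = np.linalg.eigh(A)
    top = np.argsort(np.abs(vals))[-2:]
    Xhat = vecs[:, top] * np.sqrt(np.abs(vals[top]))

    # Observe block labels for only a handful of seed vertices from block 0,
    # then rank the remaining vertices by distance to the seeds' centroid.
    seeds = np.where(labels == 0)[0][:5]
    centroid = Xhat[seeds].mean(axis=0)
    unlabeled = np.setdiff1d(np.arange(A.shape[0]), seeds)
    ranking = unlabeled[np.argsort(np.linalg.norm(Xhat[unlabeled] - centroid, axis=1))]

    # Fraction of the top of the nomination list that truly belongs to block 0.
    k = 35
    print("precision at", k, ":", np.mean(labels[ranking[:k]] == 0))
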
Preprint
Full-text available
Using brain atlases to localize regions of interest is required for making neuroscientifically valid statistical inferences. These atlases, represented in volumetric or surface coordinate spaces, can describe brain topology from a variety of perspectives. Although many human brain atlases have circulated the field over the past fifty years, limit...
Preprint
Full-text available
Replicability, the ability to replicate scientific findings, is a prerequisite for scientific discovery and clinical utility. Troublingly, we are in the midst of a replicability crisis. A key to replicability is that multiple measurements of the same item (e.g., experimental sample or clinical participant) under fixed experimental constraints are r...
Article
Full-text available
We introduce graspy, a Python library devoted to statistical inference, machine learning, and visualization of random graphs and graph populations. This package provides flexible and easy-to-use algorithms for analyzing and understanding graphs with a sklearn compliant API. graspy can be downloaded from Python Package Index (PyPi), and is released...
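A minimal usage sketch, assuming the graspy API as documented around the time of this publication (the project was later renamed graspologic, so module and function names may have shifted); this is not an excerpt from the paper.

    # pip install graspy
    from graspy.simulations import sbm
    from graspy.embed import AdjacencySpectralEmbed
    from graspy.plot import heatmap, pairplot

    # Simulate a two-block stochastic block model graph.
    A = sbm(n=[50, 50], p=[[0.5, 0.2], [0.2, 0.5]])

    # Embed the adjacency matrix into a low-dimensional latent space.
    Xhat = AdjacencySpectralEmbed(n_components=2).fit_transform(A)

    # Visualize the graph and its embedding, colored by block.
    heatmap(A, title="two-block SBM")
    pairplot(Xhat, labels=50 * ["block 0"] + 50 * ["block 1"])
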
Preprint
Full-text available
With the increase in the amount of data in many fields, a method to consistently and efficiently decipher relationships within high dimensional data sets is important. Because many modern datasets are high-dimensional, univariate independence tests are not applicable. While many multivariate independence tests have R packages available, the interfa...
Preprint
Full-text available
Cognitive phenotypes characterize our memories, beliefs, skills, and preferences, and arise from our ancestral, developmental, and experiential histories. These histories are written into our brain structure through the building and modification of various brain circuits. Connectal coding, by way of analogy with neural coding, is the art, study, an...
Article
Cognitive phenotypes characterize our memories, beliefs, skills, and preferences, and arise from our ancestral, developmental, and experiential histories. These histories are written into our brain structure through the building and modification of various brain circuits. Connectal coding, by way of analogy with neural coding, is the art, study, an...
Preprint
Full-text available
We introduce GraSPy, a Python library devoted to statistical inference, machine learning, and visualization of random graphs and graph populations. This package provides flexible and easy-to-use algorithms for analyzing and understanding graphs with a scikit-learn compliant API. GraSPy can be downloaded from Python Package Index (PyPi), and is rele...
Article
Full-text available
Clustering is concerned with coherently grouping observations without any explicit concept of true groupings. Spectral graph clustering - clustering the vertices of a graph based on their spectral embedding - is commonly approached via K-means (or, more generally, Gaussian mixture model) clustering composed with either Laplacian or Adjacency spectral e...
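A hedged sketch of the pipeline this abstract describes, using only numpy and scikit-learn and leaving out the paper's actual analysis: an adjacency spectral embedding of a simulated two-block graph followed by Gaussian mixture model clustering of the embedded vertices.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(4)

    # Simulate a two-block stochastic block model adjacency matrix.
    labels = np.repeat([0, 1], 60)
    P = np.array([[0.4, 0.05], [0.05, 0.3]])
    probs = P[labels][:, labels]
    A = np.triu((rng.uniform(size=probs.shape) < probs).astype(float), 1)
    A = A + A.T

    # Adjacency spectral embedding: top eigenvectors scaled by sqrt(|eigenvalue|).
    vals, vecs = np.linalg.eigh(A)
    top = np.argsort(np.abs(vals))[-2:]
    Xhat = vecs[:, top] * np.sqrt(np.abs(vals[top]))

    # Cluster the embedded vertices with a Gaussian mixture model.
    pred = GaussianMixture(n_components=2, random_state=0).fit_predict(Xhat)

    # Agreement with the true blocks, up to label swapping.
    print(max(np.mean(pred == labels), np.mean(pred == 1 - labels)))
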
Article
Full-text available
Understanding the relationships between different properties of data, such as whether a genome or connectome has information about disease status, is increasingly important. While existing approaches can test whether two properties are related, they may require unfeasibly large sample sizes and often are not interpretable. Our approach, 'Multiscale...
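MGC is not re-implemented here. As a simpler, hedged stand-in, the sketch below runs a permutation test on (biased) distance correlation, the single-scale statistic that MGC generalizes across scales, to test whether two properties of the same samples are related; function names are illustrative.

    import numpy as np

    def _centered_distances(X):
        # Pairwise Euclidean distances, double-centered (biased dCov convention).
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        return D - D.mean(axis=0, keepdims=True) - D.mean(axis=1, keepdims=True) + D.mean()

    def distance_correlation(X, Y):
        A, B = _centered_distances(X), _centered_distances(Y)
        dcov2 = (A * B).mean()
        dvarx, dvary = (A * A).mean(), (B * B).mean()
        return 0.0 if dvarx * dvary == 0 else np.sqrt(dcov2 / np.sqrt(dvarx * dvary))

    def permutation_pvalue(X, Y, n_perms=500, seed=0):
        # Permute rows of Y to build the null distribution of the statistic.
        rng = np.random.default_rng(seed)
        observed = distance_correlation(X, Y)
        null = [distance_correlation(X, Y[rng.permutation(len(Y))]) for _ in range(n_perms)]
        return observed, (1 + np.sum(np.array(null) >= observed)) / (1 + n_perms)

    # Toy example: Y is a noisy nonlinear function of X, so the test should reject.
    rng = np.random.default_rng(5)
    X = rng.normal(size=(100, 2))
    Y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(100, 1))
    print(permutation_pvalue(X, Y))
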
Preprint
Full-text available
Classifying samples into categories becomes intractable when a single sample can have millions to billions of features, such as in genetics or imaging data. Principal Components Analysis (PCA) is widely used to identify a low-dimensional representation of such features for further analysis. However, PCA ignores class labels, such as whether or not...
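A hedged, generic illustration of this point rather than the specific method proposed here: on data where the class signal lives in a low-variance direction, PCA's top component keeps the high-variance noise, while a label-aware projection such as LDA keeps the signal.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(6)

    # Classes differ along a low-variance feature; a high-variance feature is pure noise.
    n = 200
    y = np.repeat([0, 1], n // 2)
    signal = np.where(y == 0, -1.0, 1.0) + 0.5 * rng.normal(size=n)   # informative, low variance
    noise = 10.0 * rng.normal(size=n)                                  # uninformative, high variance
    X = np.column_stack([noise, signal])

    for name, reducer in [("PCA", PCA(n_components=1)),
                          ("LDA", LinearDiscriminantAnalysis(n_components=1))]:
        clf = make_pipeline(reducer, LogisticRegression())
        score = cross_val_score(clf, X, y, cv=5).mean()
        print(name, "accuracy:", round(score, 3))   # PCA keeps the noise axis; LDA keeps the signal
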
Article
Full-text available
Big imaging data is becoming more prominent in brain sciences across spatiotemporal scales and phylogenies. We have developed a computational ecosystem that enables storage, visualization, and analysis of these data in the cloud, thus far spanning 20+ publications and 100+ terabytes including nanoscale ultrastructure, microscale synaptogenetic diver...
Preprint
Full-text available
Clustering is concerned with coherently grouping observations without any explicit concept of true groupings. Spectral graph clustering - clustering the vertices of a graph based on their spectral embedding - is commonly approached via K-means (or, more generally, Gaussian mixture model) clustering composed with either Laplacian or Adjacency spectr...
Preprint
Analogous to the neural code, which is a model that characterizes the relationships between brain activity and sensory input and motor output, the connectome code is a model that characterizes the relationship between brain structure and brain or body activities, encoded in either plasticity or development. We describe the basic statistical formul...
Technical Report
In modern scientific discovery, it is becoming increasingly critical to uncover whether one property of a dataset is related to another. The MGC (pronounced magic), or Multiscale Graph Correlation, provides a framework for investigation into the relationships between properties of a dataset and the underlying geometries of the relationships, all wh...
Technical Report
Supervised learning techniques designed for the situation when the dimensionality exceeds the sample size have a tendency to overfit as the dimensionality of the data increases. To remedy this high-dimension, low-sample-size (HDLSS) situation, we attempt to learn a lower-dimensional representation of the data before learning a classifier. That...
Preprint
Full-text available
Determining how certain properties are related to other properties is fundamental to scientific discovery. As data collection rates accelerate, it is becoming increasingly difficult, yet ever more important, to determine whether one property of data (e.g., cloud density) is related to another (e.g., grass wetness). Only if two properties are related...
Preprint
Full-text available
The connectivity of the human brain is fundamental to understanding the principles of cognitive function, and the mechanisms by which it can go awry. To that extent, tools for estimating human brain networks are required for single participant, group level, and cross-study analyses. We have developed an open-source, cloud-enabled, turn-key pipeline...
Article
Full-text available
Neuroscientists are now able to acquire data at staggering rates across spatiotemporal scales. However, our ability to capitalize on existing datasets, tools, and intellectual capacities is hampered by technical challenges. The key barriers to accelerating scientific discovery correspond to the FAIR data principles: findability, global access to da...
Article
Full-text available
Quantitative descriptions of network structure can provide fundamental insights into the function of interconnected complex systems. Small-world structure, diagnosed by high local clustering yet short average path length between any two nodes, promotes information flow in coupled systems, a key function that can differ across conditions or between...
Article
Quantitative descriptions of network structure in big data can provide fundamental insights into the function of interconnected complex systems. Small-world structure, commonly diagnosed by high local clustering yet short average path length between any two nodes, directly enables information flow in coupled systems, a key function that can differ...
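A hedged sketch of the general idea only: it assumes the small-world propensity compares an observed network's clustering coefficient and characteristic path length against lattice and random null networks, and it does not reproduce the paper's null models or exact normalization. The numeric inputs are placeholders.

    import numpy as np

    def small_world_propensity(C_obs, L_obs, C_latt, C_rand, L_latt, L_rand):
        # Hedged sketch: deviation of observed clustering (C) and path length (L)
        # from lattice and random null values, each clipped to [0, 1], combined
        # into a single score in [0, 1] (values near 1 = strongly small-world).
        delta_C = np.clip((C_latt - C_obs) / (C_latt - C_rand), 0.0, 1.0)
        delta_L = np.clip((L_obs - L_rand) / (L_latt - L_rand), 0.0, 1.0)
        return 1.0 - np.sqrt((delta_C ** 2 + delta_L ** 2) / 2.0)

    # Placeholder values: clustering near the lattice's, path length near the random
    # graph's, which is the signature of a small-world network.
    print(small_world_propensity(C_obs=0.42, L_obs=2.3,
                                 C_latt=0.50, C_rand=0.05,
                                 L_latt=9.0, L_rand=2.1))
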

Projects (4)
Project
Development of statistical methodologies for identifying dependent structures at optimal sample sizes.
Project
Identifying optimal linear embedding techniques for high-dimension, low-sample-size (HDLSS) classification problems.