BrainGlobe Atlas API: a common interface for neuroanatomical atlases

Federico Claudi1, Luigi Petrucco*2,3, Adam L. Tyson*1, Tiago Branco1, Troy W. Margrie1, and Ruben Portugues2,3,4

1 Sainsbury Wellcome Centre, University College London, London, U.K.
2 Institute of Neuroscience, Technical University of Munich, Munich, Germany
3 Max Planck Institute of Neurobiology, Research Group of Sensorimotor Control, Martinsried, Germany
4 Munich Cluster for Systems Neurology (SyNergy), Munich, Germany

* Joint first author, ordered alphabetically
DOI: 10.21105/joss.02668

Software: Review, Repository, Archive

Editor: Olivia Guest
Reviewers: @typically, @vitay

Submitted: 04 September 2020
Published: 05 October 2020

License: Authors of papers retain copyright and release the work under a Creative Commons Attribution 4.0 International License (CC BY 4.0).
Summary
Neuroscientists routinely perform experiments aimed at recording or manipulating neural activity, uncovering physiological processes underlying brain function or elucidating aspects of brain anatomy. Understanding how the brain generates behaviour ultimately depends on merging the results of these experiments into a unified picture of brain anatomy and function. Brain atlases are crucial in this endeavour: by outlining the organization of brain regions they provide a reference upon which our understanding of brain function can be anchored. More recently, digital high-resolution 3D atlases have been produced for several model organisms, providing an invaluable resource for the research community. Effective use of these atlases depends on the availability of an application programming interface (API) that enables researchers to develop software to access and query atlas data. However, while some atlases come with an API, these are generally specific to individual atlases, and this hinders the development and adoption of open-source neuroanatomy software. The BrainGlobe atlas API (BG-Atlas API) overcomes this problem by providing a common interface for programmers to download and process data across a variety of model organisms. By adopting the BG-Atlas API, software can then be developed agnostic to the atlas, increasing adoption and interoperability of packages in neuroscience and enabling direct integration of different experimental modalities and even comparisons across model organisms.
Statement of need
To facilitate the study of neural function, a long-standing approach has been to identify neuroanatomically defined brain regions: structures with defined function, connectivity and anatomical location. The study of these brain regions led to the development of a number of brain atlases for various species. Typically these atlases are made up of a reference image of a brain, voxel-wise annotations (e.g. a mapping from each voxel to a brain structure) and additional metadata such as the region hierarchy (region A is a subdivision of region B). These atlases are used throughout neuroscience for teaching, visualisation of data, and registration of imaging data to a common coordinate space.
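To make this structure concrete, the following is a minimal, purely illustrative sketch (plain NumPy arrays and dictionaries, not the BG-Atlas API itself; all names and values are invented) of how a voxel-wise annotation and a region hierarchy together answer the question "which region, and which parent regions, does this voxel belong to?":

```python
import numpy as np

# Toy atlas: a reference image, a voxel-wise annotation and region metadata.
reference = np.random.rand(4, 4, 4)          # structural reference image
annotation = np.zeros((4, 4, 4), dtype=int)  # region id assigned to each voxel
annotation[2:, :, :] = 10                    # these voxels belong to region 10

# Region hierarchy: region 10 ("A") is a subdivision of region 1 ("B").
regions = {
    10: {"acronym": "A", "name": "Region A", "parent": 1},
    1: {"acronym": "B", "name": "Region B", "parent": None},
}

def region_chain(region_id):
    """Walk the hierarchy from a region up to the root."""
    chain = []
    while region_id is not None:
        chain.append(regions[region_id]["acronym"])
        region_id = regions[region_id]["parent"]
    return chain

voxel = (3, 0, 0)
print(region_chain(annotation[voxel]))  # ['A', 'B']
```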
Many excellent and open-access atlases exist, such as the Allen Mouse Brain Common Coordinate Framework (Wang et al., 2020) and the Max Planck Larval Zebrafish Atlas (Kunst et al., 2019), from which the neuroscience community benefits enormously. These atlases provide a valuable resource for individual scientists and have enabled important open-science projects such as Janelia Campus' Mouse Light project (Winnubst et al., 2019). Furthermore, for several atlases stand-alone software is available that can be used to explore the atlas' data and requires no coding experience, thus making the atlases accessible to a broader audience. However, to be used in the context of new software (e.g. new visualization tools or brain registration pipelines) it is necessary that atlases expose their data through an API. Several commonly used atlases come with APIs, but learning how to use each of them is a time-consuming endeavour and can require considerable coding experience. For this reason, developers often produce software that works only with a specific atlas. A single, well-documented API that worked across atlases would thus lower the cost of developing new software, which could then be made available to a larger number of scientists. An effort in this direction has been made in the R ecosystem with the natverse package (Bates et al., 2020), but, to our knowledge, no such option exists in Python, which is emerging as the programming language of choice in neuroscience (Muller et al., 2015).
bg-atlasapi was built to address these issues, with two main design goals in mind. The first was to simplify the use of atlases for neuroscientists by providing a simple, concise and well-documented API. The second was to reduce the burden required to develop tools that can be used across atlases. Most neuroanatomical software tools are currently developed for a single model organism, yet many of these tools could be of great use to neuroscientists working on other species.
Developers can use bg-atlasapi to access data from multiple atlases in common formats. Each atlas can be instantiated by passing the atlas name to the BrainGlobeAtlas class. A number of files are provided as class attributes, including a reference (structural) image, an annotation image (a map of brain regions coded by voxel intensity), meshes for each brain region, and various metadata such as the authors of the atlas and the hierarchy of the brain regions. There are methods for many common tasks such as orienting data and parsing the region hierarchy.
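As a brief usage sketch (the atlas name allen_mouse_25um and the attribute names below follow the bg-atlasapi documentation; exact names may differ between package versions):

```python
from bg_atlasapi import BrainGlobeAtlas, show_atlases

show_atlases()  # list the atlases currently packaged for BG-AtlasAPI

# Download (on first use) and instantiate an atlas by name.
atlas = BrainGlobeAtlas("allen_mouse_25um")

reference = atlas.reference    # reference (structural) image as a numpy array
annotation = atlas.annotation  # voxel-wise brain region annotation
print(atlas.metadata)          # atlas metadata (authors, citation, resolution, ...)

# Region metadata, hierarchy and meshes.
print(atlas.structures["CH"])           # metadata for one region (here: cerebrum)
mesh = atlas.mesh_from_structure("CH")  # 3D mesh of the same region
```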
Currently six atlases across three species (larval zebrash, mouse and human) are available
(Chon, Vanselow, Cheng, & Kim, 2019; Ding et al., 2016; Kunst et al., 2019; Wang et al.,
2020), with work underway to add further atlases (e.g. rat, drosophila). The available atlases
were created by parsing their relative online sources and restructuring the data to a standard
format. The atlases were then made accessible by hosting the data in a GNode respository
(https://gin.g-node.org/brainglobe/atlases). The python code used for generating
the atlases is also made available in a separate repository in the BrainGlobe organization:
bg-atlasgen. The same code can be used for easily developing new atlases in BG-AtlasAPI’s
format and we encourage users to contribute new atlases to the project by submitting new
scripts to bg-atlasgen.
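In rough outline, a contribution script assembles the pieces described above and hands them to bg-atlasgen's packaging step. The sketch below is schematic only: the helper name wrapup_atlas_from_data, its arguments, and all file paths and names are assumptions based on a reading of the bg-atlasgen repository, and the existing scripts there should be treated as the authoritative template.

```python
import numpy as np
# Assumed import and signature; consult the scripts in bg-atlasgen for the real API.
from bg_atlasgen.wrapup import wrapup_atlas_from_data

# Data a new-atlas script has to provide, in BG-AtlasAPI's standard format:
reference_stack = np.load("reference.npy")    # structural image (placeholder path)
annotation_stack = np.load("annotation.npy")  # voxel-wise region ids (placeholder path)
structures = [                                # one entry per region
    {"acronym": "root", "id": 997, "name": "root",
     "structure_id_path": [997], "rgb_triplet": [255, 255, 255]},
    # ... remaining regions ...
]
meshes = {997: "meshes/997.obj"}              # region id -> mesh file

# Argument names below are assumptions, not a documented signature.
wrapup_atlas_from_data(
    atlas_name="example_species_25um",   # hypothetical atlas name
    citation="Example et al., 2020",
    species="Example species",
    resolution=(25, 25, 25),             # microns per axis
    orientation="asr",                   # anatomical orientation of the stacks
    reference_stack=reference_stack,
    annotation_stack=annotation_stack,
    structures_list=structures,
    meshes_dict=meshes,
    working_dir="build",
)
```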
BG-atlasAPI’s exible infrastructure already proved crucial in the development and extension
of two software tools for use in neuroscience: brainreg (Tyson, Rousseau, & Margrie, 2020)
for 3D registration of image data between sample and atlas coordinate space and brainrender
(Claudi, Tyson, & Branco, 2020) for 3D visualisation of both user-generated data and atlas
data. We hope that other developers will use the API, and develop tools that can be used
across neuroscience and other research elds, increasing their reach, and preventing duplication
of eort.
Acknowledgments
We would like to thank Nouwar Mokayes for the assistance in packaging the Max Planck Zebrafish Brain Atlas within BG-atlasAPI. This work was supported by grants from the Gatsby Charitable Foundation (GAT3361, T.W.M. and T.B.), the Wellcome Trust (090843/F/09/Z, T.W.M. and T.B.; 214333/Z/18/Z, T.W.M.; 214352/Z/18/Z, T.B.) and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence
Strategy within the framework of the Munich Cluster for Systems Neurology (EXC 2145 SyNergy – ID 390857198).
References
Bates, A. S., Manton, J. D., Jagannathan, S. R., Costa, M., Schlegel, P., Rohlfing, T., & Jefferis, G. S. (2020). The natverse, a versatile toolbox for combining and analysing neuroanatomical data. eLife, 9, e53350. doi:10.7554/eLife.53350

Chon, U., Vanselow, D. J., Cheng, K. C., & Kim, Y. (2019). Enhanced and unified anatomical labeling for a common mouse brain atlas. Nature Communications, 10(1), 1–12. doi:10.1038/s41467-019-13057-w

Claudi, F., Tyson, A. L., & Branco, T. (2020). Brainrender. A python based software for visualisation of neuroanatomical and morphological data. bioRxiv. doi:10.1101/2020.02.23.961748

Ding, S.-L., Royall, J. J., Sunkin, S. M., Ng, L., Facer, B. A., Lesnar, P., Guillozet-Bongaarts, A., et al. (2016). Comprehensive cellular-resolution atlas of the adult human brain. Journal of Comparative Neurology, 524(16), 3127–3481. doi:10.1002/cne.24080

Kunst, M., Laurell, E., Mokayes, N., Kramer, A., Kubo, F., Fernandes, A. M., Förster, D., et al. (2019). A cellular-resolution atlas of the larval zebrafish brain. Neuron, 103(1), 21–38.e5. doi:10.1016/j.neuron.2019.04.034

Muller, E., Bednar, J. A., Diesmann, M., Gewaltig, M.-O., Hines, M., & Davison, A. P. (2015). Python in neuroscience. Frontiers in Neuroinformatics, 9, 11. doi:10.3389/fninf.2015.00011

Tyson, A. L., Rousseau, C. V., & Margrie, T. W. (2020). brainreg: automated 3D brain registration with support for multiple species and atlases. Zenodo. doi:10.5281/zenodo.3991718

Wang, Q., Ding, S. L., Li, Y., Royall, J., Feng, D., Lesnar, P., Graddis, N., et al. (2020). The Allen Mouse Brain Common Coordinate Framework: A 3D Reference Atlas. Cell, 181(4), 936–953.e20. doi:10.1016/j.cell.2020.04.007

Winnubst, J., Bas, E., Ferreira, T. A., Wu, Z., Economo, M. N., Edson, P., Arthur, B. J., et al. (2019). Reconstruction of 1,000 projection neurons reveals new cell types and organization of long-range connectivity in the mouse brain. Cell, 179(1), 268–281.e13. doi:10.1016/j.cell.2019.07.042