He Chen’s research while affiliated with Peking University and other places


Publications (14)


Figure 1. Neural population geometries in the visual memory pathway. (A) Anatomical depiction of neural populations obtained from the 10 brain regions in nine macaques during the four different behavioral tasks in Exps. 1 to 4. (B-E) Rotational (B), curvy (C), straight (D), and unclear (E) dynamics detected by visual inspection. A single trajectory geometry was obtained in each brain region using neural population data. Numbers of neurons were 116-590 (Table S1). In A-E, the 10 brain regions are numbered as follows: 1. TE, 2. STRt, 3. PRC, 4. CDb, 5. HPC, 6. VS, 7. cOFC, 8. CDh&b, 9. PHC, and 10. mOFC. A 0.05 s time bin was used for the analysis. Time from visual stimulus appearance is shown in seconds. Characters indicate stimulus conditions; Ib: best item; I2 to I7: second-best to seventh-best item; Iw: worst item; D: delay; M: magnitude; P: probability. See also Figures S1-S3 and Table S1.
Figure 2. Quantitative evaluation of geometric structures according to their rotational features. (A) Schematic depiction of the estimation of the accumulated angle difference weighted by the deviance, Σdθ. The accumulated angle difference indicates the degree of geometric change in terms of the rotational force across time. Vector distance (d), rotational speed (θ/0.1 s), and start-to-endpoint distance (d_S-E) were also estimated. (B) Dendrogram estimated from these four parameter values based on bootstrap resampling across the 10 neural populations. (C) Percentage of variance explained by PCA of bootstrap resampling data across the 10 neural populations. (D) Clusters detected among the four parameters based on the PCA. Dots represent 20,000 replicates (1,000 replicates in each of the 10 brain regions × two task conditions). (E) Percentage of the identified clusters in each of the 10 brain regions. Each neural population contained two components of neural information: the best (B) and worst (W) conditions in Exps. 1 and 3, magnitude (M) and delay (D) of rewards in Exp. 2, and magnitude (M) and probability (P) of rewards in Exp. 4. Colors on the atlas indicate geometry types on visual inspection in Figure 1A.
Figure 4. Summary of the observed dynamics and anatomical connections in the visual memory pathway. (A) Geometries depicted at the same arbitrary scale on the PC1-2 plane for the eight neural populations shown in Figures 1B-1D. The start of each trajectory (S) is aligned to describe the trajectory; e indicates the end of the trajectory at 0.6 s. (B) Proportions of the clusters defined in each of the 10 brain regions are shown with the anatomical connections. Reddish: rotational; greenish: curvy; bluish: straight dynamics. Data from CDh&b and CDb are merged (CD).
Formation of brain-wide neural geometry during visual item recognition in monkeys
  • Article
  • Full-text available

January 2025 · 127 Reads · 1 Citation · iScience

He Chen · [...]

Neural dynamics are thought to reflect computations that relay and transform information in the brain. Previous studies have identified the neural population dynamics in many individual brain regions as a trajectory geometry, preserving a common computational motif. However, whether these populations share particular geometric patterns across brain-wide neural populations remains unclear. Here, by mapping neural dynamics widely across temporal/frontal/limbic regions in the cortical and subcortical structures of monkeys, we show that 10 neural populations, including 2,500 neurons, propagate visual item information in a stochastic manner. We found that visual inputs predominantly evoked rotational dynamics in the higher-order visual area, TE, and its downstream striatum tail, while curvy/straight dynamics appeared frequently downstream in the orbitofrontal/hippocampal network. These geometric changes were not deterministic but rather stochastic according to their respective emergence rates. Our meta-analysis results indicate that visual information propagates as a heterogeneous mixture of stochastic neural population signals in the brain.
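The trajectory geometries referred to in the abstract come from a standard pipeline: trial-averaged firing rates (neurons × time bins) are reduced by PCA, and the time course of the top principal components traces the population trajectory. The following is a minimal, hypothetical sketch of that pipeline on simulated data (array sizes, the simulated rotational structure, and all variable names are illustrative, not the paper's data or code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 120 neurons, 12 bins of 0.05 s (0-0.6 s after stimulus)
n_neurons, n_bins = 120, 12
time = np.arange(n_bins) * 0.05

# Simulate rotational structure: each neuron mixes a cos/sin pair (rank-2 signal)
# at a random phase, plus independent noise.
phase = rng.uniform(0, 2 * np.pi, n_neurons)
rates = (np.cos(np.outer(phase, np.ones(n_bins)) + 2 * np.pi * time)
         + 0.1 * rng.standard_normal((n_neurons, n_bins)))

# PCA via SVD on the per-neuron mean-centered (neurons x time) matrix
centered = rates - rates.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
trajectory = (s[:2, None] * vt[:2]).T        # (time, 2): path on the PC1-PC2 plane
var_explained = s**2 / np.sum(s**2)          # fraction of variance per component
```

Because the simulated signal is a phase-shifted sinusoid, the trajectory on the PC1-PC2 plane sweeps out an arc, which is the kind of low-dimensional rotational geometry the abstract describes for TE and the striatum tail.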



Formation of brain-wide neural geometry during visual item recognition in monkeys

August 2024 · 51 Reads

Neural dynamics reflect canonical computations that relay and transform information in the brain. Previous studies have identified the neural population dynamics in many individual brain regions as a trajectory geometry in a low-dimensional neural space. However, whether these populations share particular geometric patterns across brain-wide neural populations remains unclear. Here, by mapping neural dynamics widely across temporal/frontal/limbic regions in the cortical and subcortical structures of monkeys, we show that 10 neural populations, including 2,500 neurons, propagate visual item information in a stochastic manner. We found that the visual inputs predominantly evoked rotational dynamics in the higher-order visual area, the TE and its downstream striatum tail, while curvy/straight dynamics appeared more frequently downstream in the orbitofrontal/hippocampal network. These geometric changes were not deterministic but rather stochastic according to their respective emergence rates. These results indicated that visual information propagates as a heterogeneous mixture of stochastic neural population signals in the brain.
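The four geometric parameters listed in the Figure 2 caption (accumulated angle difference, vector distance, rotational speed, start-to-endpoint distance) can be sketched for a trajectory on the PC1-PC2 plane as follows. This is a minimal illustration with hypothetical arrays; in particular, the deviance weighting of the accumulated angle described in the caption is omitted, and the function name and bin width are assumptions:

```python
import numpy as np

def trajectory_metrics(traj, dt=0.05):
    """Simple geometry metrics for a (T, 2) trajectory on the PC1-PC2 plane.

    Returns total path length (vector distance d), accumulated unsigned angle
    change between successive step vectors (degrees), mean rotational speed
    (degrees per 0.1 s), and the straight start-to-endpoint distance (d_S-E).
    """
    steps = np.diff(traj, axis=0)                   # (T-1, 2) step vectors
    d = np.sum(np.linalg.norm(steps, axis=1))       # total path length
    # heading of each step; unwrap before differencing to avoid +/-pi jumps
    angles = np.arctan2(steps[:, 1], steps[:, 0])
    accumulated = np.degrees(np.sum(np.abs(np.diff(np.unwrap(angles)))))
    speed = accumulated / (dt * (len(traj) - 1)) * 0.1   # deg per 0.1 s
    d_se = np.linalg.norm(traj[-1] - traj[0])       # start-to-endpoint distance
    return d, accumulated, speed, d_se

# A circular arc accumulates large angle change but a short start-to-end
# distance (the signature of rotational dynamics); a straight line accumulates
# no angle change and its path length equals its endpoint distance.
t = np.linspace(0, np.pi, 13)
arc = np.c_[np.cos(t), np.sin(t)]
line = np.c_[t, t]
d_arc, ang_arc, sp_arc, dse_arc = trajectory_metrics(arc)
d_line, ang_line, sp_line, dse_line = trajectory_metrics(line)
```

Clustering bootstrap replicates of such parameter vectors, as in Figure 2B-2D, is then a generic classification step (dendrogram or PCA-based clustering) over the four-dimensional metric space.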




Figure 3. Example activity of neurons during the single-cue and ILR tasks. A, An example activity histogram of a cOFC neuron modulated by the probability and magnitude of rewards during the single-cue task. Activity aligned with cue onset is represented for three different levels of probability (P, 0.1-0.3, 0.4-0.7, 0.8-1.0) and magnitude (M, 0.1-0.3 ml, 0.4-0.7 ml, 0.8-1.0 ml) of rewards. Gray hatched areas indicate the 1 s time window used to estimate the neural firing rates shown in B. Histograms smoothed using a Gaussian kernel (σ = 50 ms). B, An activity plot of the cOFC neuron during the 1 s time window shown in A against the probability and magnitude of rewards. C, The percentage of neural modulation types detected in the 1 s time window shown in A: P, M, Both, and NO. D, Percentages of neural modulation types detected in the 0.02 s time bins during the 1.0 s after cue onset. Calibration: 0.2 s. E, Regression coefficient plots for the probability and magnitude of rewards estimated for all cOFC neurons in Exp. 1. Regression coefficients in the 0.02 s time bin shown every 0.1 s during the 0.6 s after cue onset (0-0.02 s, 0.10-0.12 s, 0.20-0.22 s, 0.30-0.32 s, 0.40-0.42 s, 0.50-0.52 s, and 0.58-0.60 s). Filled gray indicates a significant regression coefficient for either probability or magnitude at p < 0.05. F, An example of an HPC neuron showing sample-triggered sample-location signals and item signals. A 0.08-1.0 s time window after sample onset was used to estimate the neural firing rates shown in G. Histograms are smoothed using a Gaussian kernel (σ = 20 ms). G, An activity plot of the HPC neuron during the time window shown in F against item and location. H, The percentage of neural modulation types detected in the 0.08-1.0 s window shown in F: Item, Location, Both, and NO. I, Percentages of neural modulation types detected in the 0.02 s time bins during the 1.0 s after sample onset. J, Regression coefficient plots for the best and worst items estimated for all HPC neurons in Exp. 2. Filled gray indicates a significant regression coefficient for item at p < 0.05 using ANOVA without an interaction term. The location modulation is not shown because we show changes of neural modulation by the sample stimulus, whereas the location had already been provided to the monkeys. A, B, and D were published previously in the study by Yamada et al. (2021).
Figure 4. Graphic methods for the conventional rate-coding analysis and state-space analysis in the regression subspace. Conventional analysis (top and middle rows): in each single neuron, activity modulations by task variables are detected in the fixed time window (top row) using linear regression and ANOVA for continuous (left, Exp. 1) and categorical (right, Exp. 2) task parameters (Fig. 2, see for the task details), respectively. The same analyses were applied in a fine time resolution in Exp. 1 and Exp. 2 (middle row). The conventional analyses using a general linear model (linear regression and ANOVA) provide the extent of neural
Figure 5. The state-space analysis provides a temporal structure of neural modulation in the cOFC. A, Cumulative variance explained by PCA in the cOFC population. The arrowhead indicates the percentage of variance explained by PC1 and PC2. B, Time series of eigenvectors, PC1 to PC3 in the cOFC population. C, A series of eigenvectors for PC1 to PC3 are plotted against PC1 and PC2, and PC2 and PC3 dimensions in the cOFC population. Plots at the beginning and end of the series of vectors are labeled as start (s) and end (e), respectively. a.u., Arbitrary unit.
Figure 6. Temporal structure of neural modulation in the HPC population. A, Cumulative variance explained by PCA in the HPC population. The arrowhead indicates the percentages of variances explained by PC1 and PC2. B, Time series of eigenvectors for six items in the HPC population. The top three PCs are shown. C, Time series of eigenvectors for four locations. D, A series of eigenvectors for PC1 to PC3 are plotted against PC1 and PC2, and PC2 and PC3 dimensions in the HPC population. a.u., Arbitrary unit. Extended Data Figure 6-1 represents shuffled control results.
Figure 8. Quantitative evaluations of eigenvector properties in the cOFC and HPC populations. A, Time series of vector size estimated in the cOFC population for P and M of rewards. Vector sizes are estimated in the PC1-PC2 plane (top) and PC2-PC3 plane (bottom), respectively. a.u., Arbitrary unit. The solid-colored lines indicate interpolated lines using a cubic spline function to provide a resolution of 0.005 s. B, Time series of vector size estimated in the HPC population for the best and worst items. C, Boxplots of vector size estimated in the cOFC population for probability and magnitude of rewards. D, Boxplots of vector size in the HPC population for the best and worst items and locations. E, F, Boxplots of vector angle estimated in the cOFC (E) and HPC (F) populations. G, H, Boxplots of vector deviance from the mean estimated in the cOFC (G) and HPC (H) populations. In C-H, data after 0.1 s are used. *p < 0.05, ***p < 0.001.
Stable Neural Population Dynamics in the Regression Subspace for Continuous and Categorical Task Parameters in Monkeys

June 2023 · 215 Reads · 2 Citations · eNeuro

Neural population dynamics provide a key computational framework for understanding information processing in the sensory, cognitive, and motor functions of the brain. They systematically depict complex neural population activity, dominated by strong temporal dynamics as trajectory geometry in a low-dimensional neural space. However, neural population dynamics are poorly related to the conventional analytical framework of single-neuron activity, the rate-coding regime that analyzes firing rate modulations using task parameters. To link the rate-coding and dynamic models, we developed a variant of state-space analysis in the regression subspace, which describes the temporal structures of neural modulations using continuous and categorical task parameters. In macaque monkeys, using two neural population datasets containing either of two standard task parameters, continuous and categorical, we revealed that neural modulation structures are reliably captured by these task parameters in the regression subspace as trajectory geometry in a lower dimension. Furthermore, we combined the classical optimal-stimulus response analysis (usually used in rate-coding analysis) with the dynamic model and found that the most prominent modulation dynamics in the lower dimension were derived from these optimal responses. Using those analyses, we successfully extracted geometries for both task parameters that formed a straight geometry, suggesting that their functional relevance is characterized as a unidimensional feature in their neural modulation dynamics. Collectively, our approach bridges neural modulation in the rate-coding model and the dynamic system, and provides researchers with a significant advantage in exploring the temporal structure of neural modulations for pre-existing datasets.
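The regression-subspace approach described above can be sketched in two steps: regress each neuron's binned firing rate on the task parameters to obtain a time series of regression coefficients per neuron, then apply PCA across neurons so that the coefficient time series becomes a low-dimensional trajectory. The following is a hypothetical illustration on simulated data (the parameter names, array sizes, and the exact arrangement of the PCA matrix are assumptions, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_trials, n_bins = 80, 200, 10

# Continuous task parameters per trial (e.g., probability P and magnitude M of
# reward), with an intercept column for ordinary least squares.
X = np.c_[np.ones(n_trials), rng.uniform(0.1, 1.0, (n_trials, 2))]   # [1, P, M]

# Simulated firing rates: each neuron weights P and M, with modulation that
# ramps up over the time bins, plus trial-to-trial noise.
w = rng.standard_normal((n_neurons, 2))
ramp = np.linspace(0.0, 1.0, n_bins)
rates = ((w @ X[:, 1:].T)[:, :, None] * ramp
         + rng.standard_normal((n_neurons, n_trials, n_bins)))

# Step 1: per neuron and time bin, OLS regression of firing rate on [1, P, M];
# keep only the P and M coefficients (drop the intercept).
coef = np.empty((n_neurons, n_bins, 2))
for i in range(n_neurons):
    for t in range(n_bins):
        beta, *_ = np.linalg.lstsq(X, rates[i, :, t], rcond=None)
        coef[i, t] = beta[1:]

# Step 2: PCA across neurons; each row is a neuron's (bins x parameters)
# regression-coefficient vector, so eigenvectors are time series per parameter.
M = coef.reshape(n_neurons, -1)
Mc = M - M.mean(axis=0, keepdims=True)
_, s, vt = np.linalg.svd(Mc, full_matrices=False)
eigvec_series = vt[:3].reshape(3, n_bins, 2)    # PC1-3 time series for P and M
```

For the categorical case (e.g., best/worst items), the regressors in `X` would be dummy-coded condition indicators instead of continuous values; the rest of the pipeline is unchanged.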


Allocentric information represented by self-referenced spatial coding in the primate medial temporal lobe

February 2023 · 39 Reads · 14 Citations · Hippocampus

For living organisms, the ability to acquire information regarding the external space around them is critical for future actions. While the information must be stored in an allocentric frame to facilitate its use in various spatial contexts, each case of use requires the information to be represented in a particular self-referenced frame. Previous studies have explored neural substrates responsible for the linkage between self-referenced and allocentric spatial representations based on findings in rodents. However, the behaviors of rodents are different from those of primates in several aspects; for example, rodents mainly explore their environments through locomotion, while primates use eye movements. In this review, we discuss the brain mechanisms responsible for the linkage in nonhuman primates. Based on recent physiological studies, we propose that two types of neural substrates link the first-person perspective with allocentric coding. The first is the view-center background signal, which represents an image of the background surrounding the current position of fixation on the retina. This perceptual signal is transmitted from the ventral visual pathway to the hippocampus (HPC) via the perirhinal cortex and parahippocampal cortex. Because images that share the same objective-position in the environment tend to appear similar when seen from different self-positions, the view-center background signals are easily associated with one another in the formation of allocentric position coding and storage. The second type of neural substrate is the HPC neurons' dynamic activity that translates the stored location memory to the first-person perspective depending on the current spatial context.


Comparison of neural population dynamics in the regression subspace between continuous and categorical task parameters

January 2022 · 71 Reads · 1 Citation

Neural population dynamics, presumably fundamental computational units in the brain, provide a key framework for understanding information processing in the sensory, cognitive, and motor functions. However, neural population dynamics are not explicitly related to the conventional analytic framework for single-neuron activity, i.e., representational models that analyze neuronal modulations associated with cognitive and motor parameters. In this study, we applied a recently developed state-space analysis to incorporate the representational models into the dynamic model in combination with these parameters. We compared neural population dynamics between continuous and categorical task parameters during two visual recognition tasks, using datasets originally designed for a single-neuron approach. We successfully extracted neural population dynamics in the regression subspace, which represent modulation dynamics for both continuous and categorical task parameters with reasonable temporal characteristics. Furthermore, we combined the classical optimal-stimulus analysis paradigm for the single-neuron approach (i.e., the stimulus identified by maximum neural responses) with the dynamic model, and found that the most prominent modulation dynamics at the lower dimension were derived from these optimal responses. Thus, our approach provides a unified framework for incorporating knowledge acquired with the single-neuron approach into the dynamic model as a standard procedure for describing neural modulation dynamics in the brain.



Reunification of Object and View-Center Background Information in the Primate Medial Temporal Lobe

December 2021 · 87 Reads · 6 Citations

Recent work has shown that the medial temporal lobe (MTL), including the hippocampus (HPC) and its surrounding limbic cortices, plays a role in scene perception in addition to episodic memory. The two basic factors of scene perception are the object (“what”) and location (“where”). In this review, we first briefly summarize the anatomical knowledge related to visual inputs to the MTL and physiological studies examining object-related information processed along the ventral pathway. Thereafter, we discuss space-related information, the processing of which has remained unclear, presumably because of its multiple aspects and a lack of appropriate task paradigms, in contrast to object-related information. Based on recent electrophysiological studies using non-human primates and the existing literature, we propose the “reunification theory,” which explains the brain mechanisms that construct object-location signals at each gaze. In this reunification theory, the ventral pathway signals a large-scale background image of the retina at each gaze position. This view-center background signal reflects the first-person perspective and specifies the allocentric location in the environment by similarity matching between images. The spatially invariant object signal and the view-center background signal, both of which are derived from the same retinal image, are integrated again (i.e., reunification) along the ventral pathway-MTL stream, particularly in the perirhinal cortex. The conjunctive signal, which represents a particular object at a particular location, may play a role in scene perception in the HPC as a key constituent element of an entire scene.


Citations (7)


... These neural properties might be related to the larger changes in carried information as a function of firing rates and dynamic range (Figure 4B, compare FSN and RSN regression slopes; Figure 4C, red). As a result, the output neurons in cortical (9, 10, 12, 13) and subcortical (40-43) structures become active via feedforward inhibition (Figure 4A) during economic behavior. ...

Reference:

Fast-spiking neurons in monkey orbitofrontal cortex underlie economic value computation
Formation of brain-wide neural geometry during visual item recognition in monkeys

iScience

... Indeed, cortical inhibitory dysfunction results in various diseases including mental disorders (6, 7). Since excitatory neurons constitute the majority of neurons at the core cortical center, the orbitofrontal cortex (OFC), they have been well examined in relation to economic behavior to obtain rewards (8-14). ...

Stable Neural Population Dynamics in the Regression Subspace for Continuous and Categorical Task Parameters in Monkeys

eNeuro

... There is increasing interest in the functioning of the human hippocampus in memory, given that the representations in the human and primate hippocampus, and hippocampal connectivity, are very different in humans and other primates from those in rodents (Rolls 2023c, 2023d, 2024c; Rolls and Treves 2024). For example, in primates including humans, the spatial representations are especially of locations in spatial scenes being viewed, as exemplified by hippocampal and parahippocampal spatial view cells (Rolls, Robertson, and Georges-François 1997; Robertson, Rolls, and Georges-François 1998; Rolls et al. 1998; Georges-François, Rolls, and Robertson 1999; Ekstrom et al. 2003; Rolls and Xiang 2005, 2006; Rolls, Xiang, and Franco 2005; Ison, Quian Quiroga, and Fried 2015; Wirth et al. 2017; Rolls and Wirth 2018; Qasim et al. 2019; Tsitsiklis et al. 2020; Mao et al. 2021; Qasim, Fried, and Jacobs 2021; Donoghue et al. 2023; Rolls 2023b, 2023c; Yang, Chen, and Naya 2023; Piza et al. 2024), whereas in rodents the representations are especially of the place where the rodent is located, as exemplified by hippocampal place cells (O'Keefe 1979; Burgess, Recce, and O'Keefe 1994; Hartley et al. 2014; Moser, Moser, and McNaughton 2017). In line with this difference, there is now evidence for a ventromedial visual cortical stream to the human hippocampus (Rolls 2023c; Rolls et al. 2023a; Rolls, Deco, Zhang, et al. 2023; Tullo et al. 2023; Rolls, Yan, et al. 2024) via the parahippocampal place area (Epstein and Julian 2013; Epstein and Baker 2019). ...

Allocentric information represented by self-referenced spatial coding in the primate medial temporal lobe
  • Citing Article
  • February 2023

Hippocampus

... Exp. 2: We used conventional techniques to record single-neuron activity in the STRt, including the caudate and putamen tails. A tungsten microelectrode (1-3 MΩ, FHC; 0.5-1.5 MΩ, Alpha Omega Engineering) was used to record single-neuron activity. ...

Reunification of Object and View-Center Background Information in the Primate Medial Temporal Lobe

... It remains to be tested what spatial view exactly encodes: it could be a memory code for the viewed space or the relationship between the viewed space and an object (that is, a vectorial representation). One recent study demonstrated that view-centered, large-scale background information may be one important input to spatial view representation (47). It would also be interesting to investigate what aspects of the spatial codes in the OFC and RSC are inherited from the hippocampus, and how they are integrated with local information processing. ...

Automatic Encoding of a View-Centered Background Image in the Macaque Temporal Lobe
  • Citing Article
  • July 2020

Cerebral Cortex

... Here we use the term object-location binding to refer to the association of objects with their spatial positions, in the classic sense that successful object recognition requires processing both what an object is and where it is, raising the fundamental challenge of linking "what" and "where" (Treisman, 1996). In addition to the vast research exploring neural mechanisms and computational models of object-location binding (Arcaro et al., 2009; Carlson et al., 2011; Chen & Naya, 2020; Cichy et al., 2011; DiCarlo & Maunsell, 2003; Hannula & Ranganath, 2008; Hemond et al., 2007; Nummenmaa et al., 2017; Roelfsema, 2006; Salehi et al., 2024; Yang et al., 2017), a recent behavioral phenomenon has provided a robust and accessible window into various aspects of object-location binding, via the "spatial congruency bias" (SCB; Golomb et al., 2014). In these studies, two objects are presented sequentially in peripheral locations. ...

Forward Processing of Object-Location Association from the Ventral Stream to Medial Temporal Lobe in Nonhuman Primates
  • Citing Article
  • August 2019

Cerebral Cortex

... Moreover, dvCA1 Calb1− neurons, but not dCA1 Calb1− neurons, displayed susceptibility to phospho-tau accumulation and contributed to tau-induced impairment of temporal discrimination of objects. It has been reported that the hippocampus, the prefrontal cortex [85,86], the entorhinal cortex, and the perirhinal cortex, are critical for binding individual events or items to their temporal context in episodic memory [87]. Here, we revealed for the first time the heterogeneity of hippocampal CA1 under both physiological and pathological conditions, and identified a specific behavior phenotype associated with initial accumulation of phospho-tau in the hippocampal dvCA1 Calb1− neurons. ...

Contributions of primate prefrontal cortex and medial temporal lobe to temporal-order memory
  • Citing Article
  • November 2017

Proceedings of the National Academy of Sciences