Frontiers in Computational Neuroscience Journal Impact Factor & Information

Publisher: Frontiers Research Foundation (Frontiers)

Current impact factor: 2.23

Impact Factor Rankings

2015 Impact Factor: Available summer 2015
2013/2014 Impact Factor: 2.233
2012 Impact Factor: 2.481
2011 Impact Factor: 2.147
2010 Impact Factor: 2.586

Additional details

5-year impact: 2.61
Cited half-life: 2.10 years
Immediacy index: 0.81
Eigenfactor: 0.00
Article influence: 1.17
ISSN: 1662-5188
OCLC: 250614660
Material type: Document, Internet resource
Document type: Internet Resource, Computer File, Journal / Magazine / Newspaper

Publisher details

Frontiers

  • Pre-print
    • Author can archive a pre-print version
  • Post-print
    • Author can archive a post-print version
  • Conditions
    • On open access repositories
    • Authors retain copyright
    • Creative Commons Attribution License
    • Published source must be acknowledged
    • Publisher's version/PDF may be used
    • Set statement to accompany deposit: [This document is protected by copyright and was first published by Frontiers. All rights reserved. It is reproduced with permission.]
    • Articles are placed in PubMed Central immediately on behalf of authors.
    • All titles are open access journals
    • Publisher last contacted on 16/07/2015
  • Classification
    • green

Publications in this journal

  • Source
    ABSTRACT: In order to examine the extracellular potential's influence on network activity and to better understand the dipole properties of the extracellular potential, we present and analyze a three-dimensional formulation of the cable equation that facilitates numerical simulations. When the neuron's intra- and extracellular space is assumed to be purely resistive (i.e., no free charges), the balance law of electric fluxes leads to the Laplace equation for the distribution of the intra- and extracellular potential. Moreover, the flux across the neuron's membrane is continuous. This observation already yields the three-dimensional cable equation. The coupling of the intra- and extracellular potential across the membrane is not trivial. Here, we present a continuous extension of the extracellular potential into the intracellular space and combine the resulting equation with the intracellular problem. This approach makes the system numerically accessible. On the basis of the assumed purely resistive intra- and extracellular spaces, we conclude that a cell's outward flux balances out completely; as a consequence, neurons possess no current monopoles. We present a rigorous analysis with spherical harmonics for the extracellular potential by approximating the neuron's geometry with a sphere. Furthermore, we show with initial numerical simulations under idealized conditions that the extracellular potential can have a decisive effect on network activity through ephaptic interactions.
    Frontiers in Computational Neuroscience 07/2015; 9:94. DOI:10.3389/fncom.2015.00094
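    The formulation summarized above can be stated compactly. The following is a sketch in our own notation (symbols such as the conductivities, membrane capacitance, and ionic current are not taken from the paper): with purely resistive media, the intra- and extracellular potentials each satisfy the Laplace equation, and the transmembrane current density is continuous across the membrane.

    ```latex
    % Sketch (our notation): purely resistive volume-conductor formulation
    \begin{align}
      \nabla \cdot (\sigma_i \nabla \phi_i) &= 0 \quad \text{(intracellular space)}\\
      \nabla \cdot (\sigma_e \nabla \phi_e) &= 0 \quad \text{(extracellular space)}\\
      \sigma_i\,\partial_n \phi_i \;=\; \sigma_e\,\partial_n \phi_e
        &= -\bigl(c_m\,\partial_t V_m + I_{\mathrm{ion}}(V_m)\bigr),
        \qquad V_m = \phi_i - \phi_e \ \text{on the membrane.}
    \end{align}
    ```

    Integrating the flux condition over the closed membrane surface gives zero net outward current, which is the "no current monopoles" conclusion stated in the abstract.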
  • Source
    ABSTRACT: The impressive precision of mammalian limb movements relies on internal feedback pathways that convey information about ongoing motor output to cerebellar circuits. The spino-cerebellar tracts (SCT) in the cervical, thoracic and lumbar spinal cord have long been considered canonical neural substrates for the conveyance of internal feedback signals. Here we consider the distinct features of an indirect spino-cerebellar route, via the brainstem lateral reticular nucleus (LRN), and the implications of this pre-cerebellar "detour" for the execution and evolution of limb motor control. Both direct and indirect spino-cerebellar pathways signal spinal interneuronal activity to the cerebellum during movements, but evidence suggests that direct SCT neurons are mainly modulated by rhythmic activity, whereas the LRN also receives information from systems active during postural adjustment, reaching and grasping. Thus, while direct and indirect spino-cerebellar circuits can both be regarded as internal copy pathways, it seems likely that the direct system is principally dedicated to rhythmic motor acts like locomotion, while the indirect system also provides a means of pre-cerebellar integration relevant to the execution and coordination of dexterous limb movements.
    Frontiers in Computational Neuroscience 07/2015; 9:75. DOI:10.3389/fncom.2015.00075
  • Source
    ABSTRACT: Brain-Computer Interfaces (BCIs) that convert brain-recorded neural signals into intended movement commands could eventually be combined with Functional Electrical Stimulation to allow individuals with Spinal Cord Injury to regain effective and intuitive control of their paralyzed limbs. To accelerate the development of such an approach, we developed a model of closed-loop BCI control of arm movements that (1) generates realistic arm movements (based on experimentally measured, visually-guided movements with real-time error correction), (2) simulates cortical neurons with firing properties consistent with literature reports, and (3) decodes intended movements from the noisy neural ensemble. With this model we explored (1) the relative utility of neurons tuned for different movement parameters (position, velocity, and goal) and (2) the utility of recording from larger numbers of neurons, both critical issues for technology development and for determining appropriate brain areas for recording. We simulated arm movements that could be practically restored to individuals with severe paralysis, i.e., movements from an armrest to a volume in front of the person. Performance was evaluated by calculating the smallest movement endpoint target radius within which the decoded cursor position could dwell for 1 s. Our results show that goal, position, and velocity neurons all contribute to improve performance. However, velocity neurons enabled smaller targets to be reached in shorter amounts of time than goal or position neurons. Increasing the number of neurons also improved performance, although performance saturated at 30-50 neurons for most neuron types. Overall, our work presents a closed-loop BCI simulator that models error corrections and the firing properties of various movement-related neurons, and that can be easily modified to incorporate different neural properties. We anticipate that this kind of tool will be important for the development of future BCIs.
    Frontiers in Computational Neuroscience 07/2015; 9:84. DOI:10.3389/fncom.2015.00084
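    The abstract does not spell out the decoder, so the following Python sketch only illustrates the kind of closed-loop simulation described: cosine-tuned velocity neurons read out with a population vector and fed back into a cursor update. All names and parameter values here are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_neurons = 40                                  # within the 30-50 saturation range noted above
    pref_dirs = rng.normal(size=(n_neurons, 3))     # preferred 3D velocity directions
    pref_dirs /= np.linalg.norm(pref_dirs, axis=1, keepdims=True)

    def firing_rates(velocity, baseline=10.0, gain=20.0, dt=0.05):
        """Cosine-tuned velocity neurons with Poisson spike-count noise (illustrative only)."""
        rates = np.maximum(baseline + gain * pref_dirs @ velocity, 0.0)
        return rng.poisson(rates * dt) / dt         # noisy rate estimate from one 50 ms bin

    def population_vector_decode(rates, baseline=10.0):
        """Weight each preferred direction by the neuron's modulation above baseline."""
        return ((rates - baseline) @ pref_dirs) / n_neurons

    # One closed-loop trial: intended velocity -> noisy rates -> decoded velocity -> cursor update
    cursor = np.zeros(3)
    target = np.array([0.2, 0.1, 0.3])              # an endpoint in front of the body (illustrative)
    for _ in range(100):
        intended = target - cursor                  # simple error-correcting intention
        cursor = cursor + 0.05 * population_vector_decode(firing_rates(intended))
    ```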
  • Source
    ABSTRACT: The landmark experiments by Posner in the late 1970s showed that reaction time (RT) is faster when the stimulus appears in an expected location, as indicated by a cue; since then, the so-called Posner task has been considered a "gold standard" test of spatial attention. It is thus fundamental to understand the neural mechanisms involved in performing it. To this end, we have developed a Bayesian detection system and small integrate-and-fire neural networks, which modeled sensory and motor circuits, respectively, and optimized them to perform the Posner task under different cue type proportions and noise levels. In doing so, we replicated the main findings of experimental research on RT: the relative frequency effect, suboptimal RTs and significant error rates due to noise and invalid cues, slower RTs for choice RT tasks than for simple RT tasks, and the fastest RTs for valid cues and slowest RTs for invalid cues. Analysis of the optimized systems revealed that the employed mechanisms were consistent with related findings in neurophysiology. Our models predict that (1) the results of a Posner task may be affected by the relative frequency of valid and neutral trials, (2) in simple RT tasks, inputs from multiple locations are added together to compose a stronger signal, and (3) the cue affects motor circuits more strongly in choice RT tasks than in simple RT tasks. In discussing the computational demands of the Posner task, attention has often been described as a filter that protects the nervous system, whose capacity is limited, from information overload. Our models, however, reveal that the main problems that must be overcome to perform the Posner task effectively are distinguishing signal from external noise and selecting the appropriate response in the presence of internal noise.
    Frontiers in Computational Neuroscience 07/2015; 9:81. DOI:10.3389/fncom.2015.00081
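    As an illustration of how a cue prior and noisy sensory evidence can be combined in such a task, here is a toy Bayesian detector for a two-location Posner trial. This is our own construction under simple Gaussian-noise assumptions, not the model used in the paper.

    ```python
    import numpy as np

    def posterior_target_side(obs_left, obs_right, cue_validity=0.8, noise_sd=1.0):
        """Return [P(target left), P(target right)] given one noisy observation per location.
        The cue sets the prior over locations; a present target adds mean 1 to its location."""
        prior = np.array([cue_validity, 1.0 - cue_validity])      # cue points left
        def lik(obs, signal_present):
            mean = 1.0 if signal_present else 0.0
            return np.exp(-0.5 * ((obs - mean) / noise_sd) ** 2)
        likelihood = np.array([lik(obs_left, True) * lik(obs_right, False),
                               lik(obs_left, False) * lik(obs_right, True)])
        posterior = prior * likelihood
        return posterior / posterior.sum()

    print(posterior_target_side(obs_left=0.9, obs_right=0.2))     # strongly favors "left"
    ```

    A simple response rule, for example responding once one posterior exceeds a threshold, would then link detection to RT; internal noise and invalid cues slow that choice.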
  • Source
    ABSTRACT: For optimal action planning, the gain/loss associated with actions and the variability in motor output should both be considered. A number of studies make conflicting claims about the optimality of human action planning but cannot be reconciled due to their use of different movements and gain/loss functions; the disagreement possibly arises from differences in experimental design and in the energetic cost of participants' motor effort. We used a coincident timing task, which requires decision making with constant energetic cost, to test the optimality of participants' timing strategies under four configurations of the gain function. We compared participant strategies to an optimal timing strategy calculated from a Bayesian model that maximizes the expected gain. We found suboptimal timing strategies under two configurations of the gain function characterized by asymmetry, in which higher gain is associated with higher risk of zero gain. Participants showed a risk-seeking strategy by responding closer than optimal to the time of onset/offset of zero gain. Meanwhile, there was good agreement of the model with actual performance under two configurations of the gain function characterized by symmetry. Our findings show that the human ability to make decisions that must reflect uncertainty in one's own motor output has limits that depend on the configuration of the gain function.
    Frontiers in Computational Neuroscience 07/2015; 9:88. DOI:10.3389/fncom.2015.00088
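    The optimal strategy referred to above has a compact form. With G(t) the gain for a response at time t and responses scattered around the aimed time t_0 with timing variability sigma (our notation, not the paper's), the expected-gain-maximizing aim point is:

    ```latex
    \begin{equation}
      t_0^{*} \;=\; \arg\max_{t_0}\; \mathbb{E}[G]
            \;=\; \arg\max_{t_0} \int G(t)\,\mathcal{N}\!\left(t;\, t_0,\, \sigma^{2}\right) dt .
    \end{equation}
    ```

    With an asymmetric G (high gain bordering a zero-gain region), the optimum shifts away from the zero-gain boundary by an amount that grows with the timing variability; the risk-seeking behavior reported above corresponds to aiming closer to that boundary than this optimum.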
  • Source
    ABSTRACT: Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson's disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulation of fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 μs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (-1.0 to -1.4 mA). These current amplitudes closely matched the model-predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model-predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS.
    Frontiers in Computational Neuroscience 07/2015; 9:93. DOI:10.3389/fncom.2015.00093
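    For orientation, the simplest idealization of the stimulation field is a point current source I in an infinite homogeneous anisotropic medium with principal conductivities sigma_x, sigma_y, sigma_z, which has the closed-form potential below. The study itself uses a finite element model with reconstructed anatomy, so this expression is only a rough reference solution, not the model used.

    ```latex
    \begin{equation}
      \phi(x, y, z) \;=\; \frac{I}{4\pi \sqrt{\sigma_y \sigma_z\, x^{2}
            \;+\; \sigma_x \sigma_z\, y^{2} \;+\; \sigma_x \sigma_y\, z^{2}}} .
    \end{equation}
    ```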
  • Source
    ABSTRACT: In this paper, we investigate how clustering factors influence the spiking regularity of a neuronal network composed of subnetworks. To do so, we fix the average coupling probability and the average coupling strength, and take the cluster number M, the ratio of intra- to inter-connection probability R, and the ratio of intra- to inter-coupling strength S as control parameters. The simulation results show that the spiking regularity of the networks varies little with R and S when M is fixed. However, increasing the cluster number M can reduce the spiking regularity to a low level even when the corresponding uniform network fires with high regularity. Taken together, these results indicate that clustering factors have little influence on spiking regularity as long as the network's total coupling energy, which is set by the average coupling strength and the average connection probability, is held fixed.
    Frontiers in Computational Neuroscience 07/2015; 9:85. DOI:10.3389/fncom.2015.00085
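    The abstract does not state how spiking regularity is quantified; a common choice is the coefficient of variation (CV) of interspike intervals, sketched below purely as an illustration of such a measure.

    ```python
    import numpy as np

    def isi_cv(spike_times):
        """Coefficient of variation of interspike intervals:
        CV -> 0 for perfectly regular firing, CV ~ 1 for Poisson-like irregular firing."""
        isi = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
        return isi.std() / isi.mean()

    # A network-level regularity index could then be, e.g., the average of 1/CV over all neurons.
    ```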
  • Source
    ABSTRACT: The mammalian auditory system extracts features from the acoustic environment based on the responses of spatially distributed sets of neurons in the subcortical and cortical auditory structures. The characteristic responses of these neurons (linearly approximated by their spectro-temporal receptive fields, or STRFs) suggest that auditory representations are formed, as early as in the inferior colliculi, on the basis of a time, frequency, rate (temporal modulations) and scale (spectral modulations) analysis of sound. However, how these four dimensions are integrated and processed in subsequent neural networks remains unclear. In this work, we present a new methodology to generate computational insights into the functional organization of such processes. We first propose a systematic framework to explore more than a hundred different computational strategies proposed in the literature to process the output of a generic STRF model. We then evaluate these strategies on their ability to compute perceptual distances between pairs of environmental sounds. Finally, we conduct a meta-analysis of the dataset of all these algorithms' accuracies to examine whether certain combinations of dimensions and certain ways to treat such dimensions are, on the whole, more computationally effective than others. We present an application of this methodology to a dataset of ten environmental sound categories, in which the analysis reveals that (1) models are most effective when they organize STRF data into frequency groupings, which is consistent with the known tonotopic organization of receptive fields in auditory structures, and that (2) models that treat STRF data as time series are no more effective than models that rely only on summary statistics along time, which corroborates recent experimental evidence on texture discrimination by summary statistics.
    Frontiers in Computational Neuroscience 07/2015; 9:80. DOI:10.3389/fncom.2015.00080
  • Source
    ABSTRACT: Parkinson's disease (PD) is a neurodegenerative disorder which follows from cell loss of dopaminergic neurons in the substantia nigra pars compacta (SNc), a nucleus in the basal ganglia (BG). Deep brain stimulation (DBS) is an electrical therapy that modulates the pathological activity to treat the motor symptoms of PD. Although this therapy is currently used in clinical practice, the sufficient conditions for therapeutic efficacy are unknown. In this work we develop a model of critical motor circuit structures in the brain using biophysical cell models as the base components and then evaluate performance of different DBS signals in this model to perform comparative studies of their efficacy. Biological models are an important tool for gaining insights into neural function and, in this case, serve as effective tools for investigating innovative new DBS paradigms. Experiments were performed using the hemi-parkinsonian rodent model to test the same set of signals, verifying that the model reproduces physiological trends. We show that antidromic spiking from DBS of the subthalamic nucleus (STN) has a significant impact on cortical neural activity, which is frequency dependent and additionally modulated by the regularity of the stimulus pulse train used. Irregular spacing between stimulus pulses, where the amount of variability added is bounded, is shown to increase diversification of response of basal ganglia neurons and reduce entropic noise in cortical neurons, which may be fundamentally important to restoration of information flow in the motor circuit.
    Frontiers in Computational Neuroscience 06/2015; 9:78. DOI:10.3389/fncom.2015.00078
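    As an illustration of "irregular spacing between stimulus pulses, where the amount of variability added is bounded", the sketch below jitters each inter-pulse interval uniformly within a fixed fraction of the nominal period. The rate, jitter distribution, and bounds are illustrative assumptions, not the study's protocol.

    ```python
    import numpy as np

    def irregular_pulse_train(rate_hz=130.0, duration_s=1.0, jitter_frac=0.2, seed=0):
        """Pulse times whose inter-pulse intervals are the nominal period
        perturbed uniformly by at most +/- jitter_frac of that period."""
        rng = np.random.default_rng(seed)
        period = 1.0 / rate_hz
        times, t = [], 0.0
        while t < duration_s:
            times.append(t)
            t += period * (1.0 + rng.uniform(-jitter_frac, jitter_frac))
        return np.array(times)

    train = irregular_pulse_train()   # mean rate stays near 130 Hz, but spacing is irregular and bounded
    ```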
  • Source
    Frontiers in Computational Neuroscience 06/2015; 9:77. DOI:10.3389/fncom.2015.00077
  • Source
    ABSTRACT: The impact of learning and long-term memory storage on synaptic connectivity is not completely understood. In this study, we examine the effects of associative learning on synaptic connectivity in adult cortical circuits by hypothesizing that these circuits function in a steady-state, in which the memory capacity of a circuit is maximal and learning must be accompanied by forgetting. Steady-state circuits should be characterized by unique connectivity features. To uncover such features we developed a biologically constrained, exactly solvable model of associative memory storage. The model is applicable to networks of multiple excitatory and inhibitory neuron classes and can account for homeostatic constraints on the number and the overall weight of functional connections received by each neuron. The results show that in spite of a large number of neuron classes, functional connections between potentially connected cells are realized with less than 50% probability if the presynaptic cell is excitatory and generally a much greater probability if it is inhibitory. We also find that constraining the overall weight of presynaptic connections leads to Gaussian connection weight distributions that are truncated at zero. In contrast, constraining the total number of functional presynaptic connections leads to non-Gaussian distributions, in which weak connections are absent. These theoretical predictions are compared with a large dataset of published experimental studies reporting amplitudes of unitary postsynaptic potentials and probabilities of connections between various classes of excitatory and inhibitory neurons in the cerebellum, neocortex, and hippocampus.
    Frontiers in Computational Neuroscience 06/2015; 9:74. DOI:10.3389/fncom.2015.00074
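    The two predicted weight distributions can be written schematically as follows (our notation, with p_0 the fraction of potential connections that remain unrealized, Theta the step function, and Phi the standard normal CDF); this is only a sketch of the shapes described in the abstract, not the paper's derivation.

    ```latex
    % Constraint on the overall presynaptic weight: realized weights follow a
    % Gaussian truncated at zero, alongside a fraction p_0 of silent connections.
    \begin{equation}
      P(w) \;=\; p_0\,\delta(w) \;+\; (1 - p_0)\,
                 \frac{\Theta(w)\;\mathcal{N}\!\left(w;\,\mu,\,\sigma^{2}\right)}
                      {1 - \Phi\!\left(-\mu/\sigma\right)} .
    \end{equation}
    % Constraining the number of functional connections instead yields a
    % non-Gaussian realized-weight distribution in which weak connections are absent.
    ```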
  • Source
    ABSTRACT: Noradrenergic modulation from the locus coeruleus is often associated with the regulation of sensory signal-to-noise ratio. In the olfactory system, noradrenergic modulation affects both bulbar and cortical processing, and has been shown to modulate the detection of low-concentration stimuli. Here we implemented a computational model of the olfactory bulb and piriform cortex, based on known experimental results, to explore how noradrenergic modulation in the olfactory bulb and piriform cortex interact to regulate odor processing. We show that, as predicted by behavioral experiments in our lab, norepinephrine can play a critical role in modulating the detection and associative learning of very low odor concentrations. Our simulations show that bulbar norepinephrine serves to pre-process odor representations to facilitate cortical learning, but not recall. We observe the typical non-uniform dose-response functions described for norepinephrine modulation and show that these are imposed mainly by bulbar, but not cortical, processing.
    Frontiers in Computational Neuroscience 06/2015; 9:73. DOI:10.3389/fncom.2015.00073