# American Institute of Mathematical Sciences

doi: 10.3934/era.2021035
Online First



## Enhancement of gamma oscillations in E/I neural networks by increase of difference between external inputs

1. College of Information Science and Technology, Donghua University, Shanghai 201620, China
2. Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Ministry of Education, Fudan University, Shanghai 200433, China

* Corresponding author: Fang Han and Zhijie Wang

Received: December 2020. Revised: March 2021. Early access: April 2021

Experimental observations suggest that gamma oscillations are enhanced when the difference between the components of external stimuli increases. To explain these observations, we first construct a small excitatory/inhibitory (E/I) neural network of integrate-and-fire (IAF) neurons in which the external current input to the E-neuron population differs from that to the I-neuron population. Simulation results show that the greater the difference between the external inputs to the excitatory and inhibitory neurons, the stronger the gamma oscillations in the small E/I network. We then construct a large-scale complicated neural network with multi-layer columns, simulated with a novel CUDA-based algorithm, to explore how gamma oscillations are regulated by external stimuli. We find that gamma oscillations can also be induced and enhanced by the difference between the external inputs in a large-scale network with a complicated structure. These results agree well with the existing experimental findings.

Citation: Xiaochun Gu, Fang Han, Zhijie Wang, Kaleem Kashif, Wenlian Lu. Enhancement of gamma oscillations in E/I neural networks by increase of difference between external inputs. Electronic Research Archive, doi: 10.3934/era.2021035
##### Figures and Tables:
The structure of the small E/I network with external stimuli. Wiring among E- and I-neurons is all-to-all; only three cells of the E-neuron population (red dots) and of the I-neuron population (blue dots) are depicted here. Directed wiring is red for excitatory and blue for inhibitory connections. $w_{\rm{EE}}$, $w_{\rm{II}}$, $w_{\rm{EI}}$ and $w_{\rm{IE}}$ are the synaptic weights ($g_{{\mathrm{max}}}$) of the E-to-E, I-to-I, I-to-E, and E-to-I connections, respectively
Gamma oscillations caused by a typical input with $S1 = 0.6$ and $S2 = 0$ (applied from 50 ms to 450 ms of a 1 s simulation). (a) Raster plot of the spiking times of neurons. (b) Average population activity. (c) Power spectrum of the population activity
Gamma oscillations caused by a typical input with $S2 = 0.2$ and $S1 = 0$ (applied from 50 ms to 450 ms of a 1 s simulation). (a) Raster plot of the spiking times of neurons. (b) Average population activity. (c) Power spectrum of the population activity
Increase of the peak power of gamma oscillations in the small E/I neural network with increasing difference between $S1$ and $S2$. The increase of the peak power in the second case (blue curve) is more evident than in the first case (green curve)
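The power spectra in the figures above are computed from the average population activity. As a minimal sketch (the paper does not give the exact estimator; the FFT-based periodogram and the synthetic 40 Hz test drive below are assumptions), the peak-power measurement can be reproduced as:

```python
import numpy as np

def population_power_spectrum(spike_counts, dt=0.001):
    """Power spectrum of the average population activity.

    spike_counts: (n_steps, n_neurons) array of 0/1 spike indicators.
    dt: time step in seconds. Returns (frequencies in Hz, power).
    """
    activity = spike_counts.mean(axis=1)        # average population activity
    activity = activity - activity.mean()       # remove the DC component
    power = np.abs(np.fft.rfft(activity)) ** 2  # periodogram
    freqs = np.fft.rfftfreq(len(activity), d=dt)
    return freqs, power

# Hypothetical check: a population firing rhythmically at 40 Hz should
# produce a spectral peak in the gamma band.
rng = np.random.default_rng(0)
t = np.arange(0, 1.0, 0.001)
rate = 0.2 * (1 + np.sin(2 * np.pi * 40 * t))   # 40 Hz modulated firing rate
spikes = rng.random((len(t), 100)) < rate[:, None]
freqs, power = population_power_spectrum(spikes.astype(float))
peak = freqs[1:][np.argmax(power[1:])]          # skip the zero-frequency bin
```

The peak frequency recovered this way lands on the 40 Hz modulation, and the height of that peak is the "peak power" plotted against the input difference.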
Intra-layer and inter-layer connections in a single column
The simulation framework for the large-scale complicated network with multi-layer columns. $t_{0}$ is the initial time and $\Delta{t}$ is the time step
The pseudo-code for constructing the multi-layer column structure of the large-scale complicated network. The variable $m\_uiNumCell\_per\_column$ denotes the number of neurons in each column, the variables $idx\_column$ and $jdx\_column$ denote the IDs of columns, and the variables $ii\_idx\_column$ and $jj\_jdx\_column$ denote the IDs of neurons
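The construction described by this pseudo-code can be paraphrased in plain Python (the original is a CUDA-based implementation). The nested loops over column IDs and per-column neuron IDs follow the variable names in the caption; the `intra_rule`/`inter_rule` predicates standing in for the connection tables are assumptions of this sketch:

```python
def build_network(n_columns, m_uiNumCell_per_column, intra_rule, inter_rule):
    """Return a list of (pre, post) global neuron indices.

    Neuron ii_idx_column of column idx_column gets the global index
    idx_column * m_uiNumCell_per_column + ii_idx_column.
    intra_rule/inter_rule are hypothetical predicates (pre_id, post_id) -> bool
    standing in for the within-column and between-column connection tables.
    """
    edges = []
    for idx_column in range(n_columns):            # presynaptic column ID
        for jdx_column in range(n_columns):        # postsynaptic column ID
            rule = intra_rule if idx_column == jdx_column else inter_rule
            for ii_idx_column in range(m_uiNumCell_per_column):      # pre neuron ID
                for jj_jdx_column in range(m_uiNumCell_per_column):  # post neuron ID
                    if rule(ii_idx_column, jj_jdx_column):
                        pre = idx_column * m_uiNumCell_per_column + ii_idx_column
                        post = jdx_column * m_uiNumCell_per_column + jj_jdx_column
                        edges.append((pre, post))
    return edges

# Toy usage: 2 columns of 3 neurons, all-to-all (no self-loops) within a
# column and no connections between columns.
edges = build_network(2, 3, lambda i, j: i != j, lambda i, j: False)
```

On a GPU these four loops are what gets parallelized: each CUDA thread can handle one (pre, post) candidate pair independently, which is why the column/neuron index arithmetic above is kept explicit.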
Gamma oscillations generated in the large-scale complicated neural network. (a) Raster plot of the spiking times of neurons in each layer of columns when $S1 = 0.3, S2 = 0$. (b) Raster plot of the spiking times of neurons in each layer of columns when $S1 = 0, S2 = 0.3$. (c) Power spectra of the average population activities for the two input cases
Increase of the peak power of gamma oscillations in the large-scale complicated network with increasing difference between $S1$ and $S2$
Connection types and parameters of neurons within a column
| Presynaptic neuron | Postsynaptic neurons | $\alpha$ | $\beta$ | $g_{{\mathrm{max}}}$ | $d$ (ms) |
| --- | --- | --- | --- | --- | --- |
| E23 | E23, E4, E5, I23 | 0.9 | 0.003 | 0.004 | 2 |
| I23 | E23, E5, E6, I23, I5, I6 | 0.9 | 0.003 | 0.05 | 1 |
| E4 | E4, E5, E6, I4 | 0.9 | 0.003 | 0.004 | 2 |
| I4 | E4, I4 | 0.9 | 0.003 | 0.05 | 1 |
| E5 | E23, E4, E5, E6, I5 | 0.9 | 0.003 | 0.004 | 2 |
| I5 | E23, E5, E6, I23, I5, I6 | 0.9 | 0.003 | 0.05 | 1 |
| E6 | E6, I6 | 0.9 | 0.003 | 0.004 | 2 |
| I6 | E23, E5, E6, I5, I6 | 0.9 | 0.003 | 0.05 | 1 |
Parameters and counts of different types of neurons within a column
| Type | $N$ | $V_{\rm{th}}$ (mV) | $V_{\rm{reset}}$ (mV) | $E_{\rm{syn}}$ (mV) | $\tau$ (ms) | $R$ (k$\Omega$) | $V_{\rm{L}}$ (mV) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| E23 | 200 | -47 | -65 | 0 | 5 | 10 | -65 |
| I23 | 50 | -45 | -65 | -75 | 1 | 10 | -65 |
| E4 | 200 | -47 | -65 | 0 | 5 | 10 | -65 |
| I4 | 50 | -45 | -65 | -75 | 1 | 10 | -65 |
| E5 | 200 | -47 | -65 | 0 | 5 | 10 | -65 |
| I5 | 50 | -45 | -65 | -75 | 1 | 10 | -65 |
| E6 | 200 | -47 | -65 | 0 | 5 | 10 | -65 |
| I6 | 50 | -45 | -65 | -75 | 1 | 10 | -65 |
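The parameters above describe standard leaky integrate-and-fire dynamics. As a minimal sketch using the E-neuron row of the table ($V_{\rm{th}} = -47$ mV, $V_{\rm{reset}} = -65$ mV, $\tau = 5$ ms, $R = 10$ k$\Omega$, $V_{\rm{L}} = -65$ mV; the constant drive current and the forward-Euler step are assumptions, since the paper's integration scheme is not reproduced here):

```python
def lif_step(V, I, dt=0.1, V_th=-47.0, V_reset=-65.0, tau=5.0, R=10.0, V_L=-65.0):
    """One Euler step of tau * dV/dt = -(V - V_L) + R * I; returns (V, spiked).

    Parameter values are the E-neuron entries of the table above; the
    update rule itself is an assumed standard LIF discretization.
    """
    V = V + dt / tau * (-(V - V_L) + R * I)
    spiked = V >= V_th
    if spiked:
        V = V_reset          # reset after a spike
    return V, spiked

# Drive an E-neuron with an assumed constant suprathreshold current
# (R * I = 20 mV > V_th - V_L = 18 mV) and count spikes over 100 ms.
V, n_spikes = -65.0, 0
for _ in range(1000):        # 1000 steps of 0.1 ms
    V, spiked = lif_step(V, I=2.0)
    n_spikes += spiked
```

With these values the membrane climbs from $V_{\rm{L}}$ toward $V_{\rm{L}} + RI = -45$ mV, crosses threshold roughly every $\tau \ln 10 \approx 11.5$ ms, and resets, giving regular firing around 85 Hz in this toy setting.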
Connection types and parameters of neurons between columns
| Presynaptic neuron | Postsynaptic neuron | $\alpha$ | $\beta$ | $g_{{\mathrm{max}}}$ | $d$ (ms) |
| --- | --- | --- | --- | --- | --- |
| E23 | I23 | 0.8 | 0.001 | 0.98 | 1 |
| E4 | I4 | 0.8 | 0.001 | 0.98 | 1 |
| E5 | E5 | 0.8 | 0.001 | 0.16 | 1 |
| E5 | I5 | 0.8 | 0.001 | 0.98 | 1 |
| E5 | E23 | 0.8 | 0.001 | 0.16 | 1 |
| E5 | I23 | 0.8 | 0.001 | 0.98 | 1 |
| E6 | I6 | 0.8 | 0.001 | 0.98 | 1 |

