Theoretical Mechanics of Biological Neural Networks


Notice that even in the case in which the input is a perfect sinusoid, and therefore non-Gaussian, the mean-field equation for the power spectrum (Eq 8) is still valid. However, since x is then no longer Gaussian either, in order to find the mean-field solution we need to modify our iterative scheme by splitting the activation variable x into its Gaussian part and its oscillatory part [4].

[Fig 4: simulations (solid blue) and theory (dashed blue) are superimposed; note that these quantities depend on the size of the frequency discretization bin. Bottom: graphical interpretation of P_bkg.]

The presence of the input affects the dynamics of the mean-field network, as quantified by the power spectral density (Fig 4b).

Notice that both this shift and the shaping of the chaotic activity are nonlinear effects due to the recurrent dynamics. As an additional nonlinear effect, the network activity also exhibits harmonics of the driving frequency of the external input. To characterize the response to the external stimulus, we split the power spectrum $S_x(f)$ into an oscillatory component and a chaotic component that constitutes the background activity,

$$S_x(f) = S_{\text{bkg}}(f) + \sum_{k \geq 1} b_k\, \delta(f - k f_I), \qquad (14)$$

where the $b_k$ are positive coefficients and we included the multiples of the driving frequency in order to account for the harmonics.
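For concreteness, the modified iterative scheme mentioned above (splitting x into a Gaussian part plus the oscillation) could be sketched as follows. This is a minimal sketch under stated assumptions: function names, default parameters, and the spectral normalization are illustrative, not taken from the paper's implementation.

```python
import numpy as np

def dmft_step(S_bkg, A_osc, f_I, dt, g=1.5, chi2=None, phi=np.tanh, n_avg=100):
    """One iteration of the modified scheme: the activation is split as
    x(t) = x_G(t) + A_osc*cos(2*pi*f_I*t). The Gaussian part x_G is sampled
    from the current background spectrum S_bkg (one-sided, on the
    np.fft.rfftfreq grid), the oscillation is added, and the spectrum of
    phi(x) is measured to give the recurrent input for the next pass."""
    n_f = len(S_bkg)
    n_t = 2 * (n_f - 1)                  # time samples matching the rfft grid
    t = np.arange(n_t) * dt
    S_phi = np.zeros(n_f)
    for _ in range(n_avg):
        # phase-randomized Gaussian surrogate with spectrum S_bkg (schematic normalization)
        phases = np.exp(2j * np.pi * np.random.rand(n_f))
        x_G = np.fft.irfft(np.sqrt(S_bkg * n_t / dt) * phases, n=n_t)
        x = x_G + A_osc * np.cos(2 * np.pi * f_I * t)   # oscillatory part
        S_phi += dt / n_t * np.abs(np.fft.rfft(phi(x))) ** 2
    S_phi /= n_avg
    chi2 = np.ones(n_f) if chi2 is None else chi2       # single-neuron filter |chi(f)|^2
    return chi2 * g**2 * S_phi           # updated spectrum of the recurrent drive
```

In a full solver, the delta peaks at multiples of $f_I$ would be extracted from the returned spectrum and tracked separately from the smooth background, consistent with the split in Eq 14.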

First, we will look at the transmission of the oscillatory signal near the driving frequency, i.e., at the delta peak of the spectrum at $f = f_I$. At the driving frequency $f_I$ we write (see Fig 4c)

$$S_x(f) \simeq P_{\text{bkg}} + P_{\text{osc}}\, \delta(f - f_I) \quad \text{for } f \approx f_I, \qquad (15)$$

i.e., close to the driving frequency the spectrum consists of the background power density $P_{\text{bkg}} = S_{\text{bkg}}(f_I)$ plus a delta peak of power $P_{\text{osc}} = b_1$. The signal-to-noise ratio (SNR) at the driving frequency $f_I$ is then given by

$$\text{SNR} = \frac{P_{\text{osc}}}{P_{\text{bkg}}}. \qquad (16)$$

Finally, we have seen in the example in Fig 4b that the oscillatory input can suppress background activity at frequencies far from $f_I$.
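In practice, the quantities in Eqs 15 and 16 can be estimated from a discretized spectrum. The sketch below is a minimal illustration, assuming the background under the peak can be approximated by interpolating across the peak bin; as noted in the figure caption above, the estimates depend on the bin width.

```python
import numpy as np

def peak_snr(S_x, freqs, f_I):
    """Estimate P_osc and P_bkg at the driving frequency (Eq 15) and the SNR
    (Eq 16) from a power spectrum S_x sampled on the frequency grid `freqs`.
    The background under the delta peak is approximated by interpolating the
    neighboring bins; both estimates depend on the bin width df."""
    df = freqs[1] - freqs[0]
    k = int(np.argmin(np.abs(freqs - f_I)))   # bin containing the drive
    P_bkg = 0.5 * (S_x[k - 1] + S_x[k + 1])   # background spectral density
    P_osc = (S_x[k] - P_bkg) * df             # excess (delta-peak) power
    return P_osc, P_bkg, P_osc / (P_bkg * df)
```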

In order to quantify this chaos-suppression effect, we split the total variance of x into two contributions (Fig 4c),

$$\mathrm{Var}(x) = A_{\text{chaos}} + A_{\text{osc}},$$

where $A_{\text{chaos}}$ is the variance of the chaotic background and $A_{\text{osc}}$ the variance carried by the oscillatory peaks. If the oscillatory input is weak, chaos is not entirely suppressed and acts as internally generated noise on the transmission of the oscillatory input. We now study how the network transmits this oscillatory input signal, and how the transmission quality depends on the signal frequency $f_I$. It is known from linear response theory that the transmission of weak signals through single homogeneous populations with strong intrinsic or external noise does not benefit from adaptation [35, 50].

This is because both the signal and the noise in the SNR are affected in the same way [35].


We wondered whether, in a strongly coupled, large random network, adaptation could have a different effect on the oscillatory signal than on the noise, thereby re-shaping the SNR. A particularly interesting question is how signals are transmitted in the presence of purely intrinsically generated chaotic fluctuations that are shaped by adaptation and recurrent connectivity.

[Fig 5. (a) Top row: response of the network to an oscillatory drive plus independent white noise to each neuron. Bottom row: response of the network to an oscillatory drive plus independent low-frequency noise to each neuron. For each row, from left to right: the power spectrum of the input noise, the background component of the power spectrum, the oscillatory component of the power spectrum, and the SNR, each as a function of the driving frequency. The hat over the symbols $\hat{A}_{\text{bkg}}$ and $\hat{A}_{\text{osc}}$ indicates that, to highlight the network shaping, they are normalized to the same maximum height of one. Since both signal and noise are shaped in the same way in the linear-response framework, the introduction of adaptation does not affect the SNR. (b) Same quantities as in panel a. Due to the nonlinearity of the network, signal and internally generated noise are shaped in different ways, with the signal being subject to a broader effective filter; as a consequence, the introduction of adaptation in the nonlinear network shapes the SNR by favoring low frequencies.]

In the linear-response framework, both the signal and the noise are thus shaped by the same factor that characterizes the network (Fig 5a).

The SNR of the output at the driving frequency, defined as in Eq 16, is then given by

$$\text{SNR}(f_I) = \frac{S_{\text{sig}}(f_I)}{S_{\text{noise}}(f_I)}, \qquad (20)$$

where $S_{\text{sig}}$ and $S_{\text{noise}}$ are the power spectra of the oscillatory input signal and of the external input noise, respectively; i.e., the network-shaping factor cancels between numerator and denominator. Notice that we considered the activation variable x as our output.
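The cancellation behind Eq 20 can be made explicit in one line. Writing the network-level filter as $|\chi_{\text{net}}(f)|^2$ (an illustrative symbol, not notation from the paper), both output spectra carry the same factor:

$$\text{SNR}(f_I) = \frac{|\chi_{\text{net}}(f_I)|^2\, S_{\text{sig}}(f_I)}{|\chi_{\text{net}}(f_I)|^2\, S_{\text{noise}}(f_I)} = \frac{S_{\text{sig}}(f_I)}{S_{\text{noise}}(f_I)},$$

so any mechanism, such as adaptation, that only modifies the filter leaves the SNR untouched.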



Eq 20 implies that the SNR depends only on the power spectra of the signal and of the noise. For example, if we consider low-frequency-dominated noise, high-frequency signals will be transmitted more easily, but once again the introduction of adaptation will not play any role (Fig 5a). While this argument is based on a linear response approximation, we verified using the DMFT solution that the linear approximation is quite accurate.


Deviations are visible very close to criticality, but once again the SNR is almost entirely independent of the neuron parameters. The findings are completely different for a network in the chaotic phase. For clarity, let us assume that there is no external noise, such that the noise is only internally generated by the network.

In this case, the linear response framework cannot be applied; in order to predict how the network shapes both the input and the internally generated noise, we need to solve the DMFT equations (Eq 8) iteratively. As in the previous section, the resulting power spectrum can be split into a chaotic component and an oscillatory component (see Eq 14). How does the introduction of adaptation shape these two components?

The state of resonant chaos survives in the presence of weak input. As the driving frequency changes, the amplitude of the transmitted signal $A_{\text{osc}}(f_I)$ passes through a maximum at the resonance frequency $f_0$ of the network; however, $A_{\text{osc}}(f_I)$ decreases with the distance from the resonance frequency more slowly than the noise amplitude $A_{\text{bkg}}(f_I)$ does (Fig 5b).

As a consequence, the SNR is maximal at very low driving frequencies and goes through a minimum at the resonance frequency $f_0$ before increasing again (Fig 5b). In other words, resonant chaos acts as a notch filter with respect to the transmission of weak signals. The network-shaping factor exhibits a maximum at the resonance frequency, and therefore sharpens $A_{\text{bkg}}(f_I)$. For a weak input signal, the signal power that sticks out of the noise power is the sum of two terms: the first term is the direct influence of the signal on $x(t)$, and the second term represents indirect contributions mediated by the recurrent connections.


If the direct effect dominates, the signal-to-noise ratio is approximately the ratio of the signal power to the power spectrum of the internally generated fluctuations at the driving frequency, $\text{SNR}(f_I) \approx S_{\text{sig}}(f_I)/S_{\text{bkg}}(f_I)$. Importantly, even though this expression for the SNR looks formally similar to Eq 20 for the case of external noise, the effect of single-neuron dynamics is markedly different: as shown above, the power spectrum of the internally generated noise depends on the single-neuron filter, whereas the corresponding power spectrum of external noise in Eq 20 is, by definition, independent of single-neuron dynamics.

Thus, in the case of internally generated fluctuations, single-neuron dynamics strongly affects the SNR through network-enhanced noise shaping. If the strength of the input is increased, the interaction between noise and signal becomes stronger, leading to a deformation of the SNR (Fig 6a). However, even for strong drive we observe a peak of the SNR at frequencies lower than the resonance frequency.


As $A_I$ increases, nonlinear interactions between signal and noise become stronger, leading to a qualitative change in the SNR profile. In the presence of strong input, chaos suppression together with the formation of a sharp peak indicates that, at the microscopic level, the network is driven towards a limit cycle. Similarly to [4], we now study how chaos suppression depends on the driving frequency $f_I$. $A_{\text{osc}}$ depends smoothly on $f_I$, reaching its largest value around $f_0$. On the other hand, $A_{\text{chaos}}$ is zero for input frequencies close to $f_0$, indicating that the network is driven into a limit cycle.

While a network without adaptation also exhibits such a non-monotonic dependence [4], in our case this effect is more pronounced due to the resonant power spectrum of the spontaneous activity in the presence of adaptation. We have seen how adaptation, by changing the response function of single neurons, shapes the chaotic dynamics of a recurrent network and, consequently, the signal-transmission properties of the network.

In biology, several other mechanisms could contribute to the response properties of neurons, such as synaptic filtering, facilitation, or the presence of dendritic compartments [28-40]. We account for several such mechanisms at once by considering a general D-dimensional linear-nonlinear rate model. We assume that the rate is the only signal that unit j uses to communicate with other units.

Conversely, the signals coming from other units only influence the variable that also emits the rate. The choice of having the same variable sending and receiving signals is dictated by simplicity and is not necessary for the development of the theory. Unit i receives input from all the other units via a set of random connections $J_{ij}$, sampled i.i.d. Subscripts in Latin letters indicate the index of the unit in the network and run from 1 to N, while superscripts in Greek letters indicate the index of the variable within the rate model and run from 1 to D.
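As a minimal sketch of such a network, the snippet below assumes (consistent with the description above) that the recurrent input enters only the first component $x^1$ of each unit, which also emits the rate $\phi(x^1)$. The integration scheme, parameter values, and the example matrix are illustrative, not the authors' code.

```python
import numpy as np

def simulate_network(A, N=500, g=1.5, T=200.0, dt=0.01, phi=np.tanh, seed=0):
    """Euler integration of dx_i/dt = A x_i + e1 * sum_j J_ij phi(x_j^1):
    each unit carries D internal variables; only the first one sends the
    rate phi(x^1) and receives the recurrent input. The couplings J_ij are
    i.i.d. Gaussian with variance g^2/N."""
    rng = np.random.default_rng(seed)
    D = A.shape[0]
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 0.1, size=(N, D))      # small random initial condition
    n_steps = int(T / dt)
    trace = np.empty((n_steps, D))
    for s in range(n_steps):
        dx = x @ A.T                           # internal linear dynamics
        dx[:, 0] += J @ phi(x[:, 0])           # recurrent drive into x^1 only
        x += dt * dx
        trace[s] = x[0]                        # record one example unit
    return trace

# Example: D = 2 with complex eigenvalues of A (negative real parts),
# mimicking an adaptation-like resonant unit.
A2 = np.array([[-1.0, -1.0],
               [ 0.5, -0.1]])
trace = simulate_network(A2)
```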


The matrix A is assumed to be non-singular and to have eigenvalues with negative real parts. As in the case of adaptation, the key quantity is the squared modulus of the linear response function of single neurons in the frequency domain (see Methods). By solving the mean-field theory, we find that, similarly to the case of adaptation, for small coupling the power spectrum converges to zero at all frequencies.
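For the linear part of such a unit, $\dot{\mathbf{x}} = A\mathbf{x} + \mathbf{e}_1 u(t)$ with output $x^1$, the response function is $\hat{\chi}(f) = [(2\pi i f\,\mathbb{1} - A)^{-1}]_{11}$. The sketch below computes its squared modulus; the formula follows from the assumed model form above, and the example matrix is the illustrative one from the previous snippet.

```python
import numpy as np

def chi_squared_modulus(A, freqs):
    """|chi(f)|^2 for the single-unit linear dynamics dx/dt = A x + e1*u(t),
    with the first component as output: chi(f) = [(2*pi*i*f*I - A)^(-1)]_11."""
    D = A.shape[0]
    out = np.empty(len(freqs))
    for k, f in enumerate(freqs):
        M = 2j * np.pi * f * np.eye(D) - A
        out[k] = np.abs(np.linalg.inv(M)[0, 0]) ** 2
    return out

A2 = np.array([[-1.0, -1.0],
               [ 0.5, -0.1]])             # complex eigenvalues -> resonance
freqs = np.linspace(0.0, 1.0, 200)
chi2 = chi_squared_modulus(A2, freqs)     # peaks near the resonance frequency
```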


We studied how the dynamics of a random network of rate neurons are shaped by the properties of single neurons, and in particular by the presence of history-dependent mechanisms such as adaptation. To this end, we generalized DMFT, a well-established theoretical tool [1], to the case of multi-dimensional rate units. This allowed us to reduce the high-dimensional, deterministic network model to a low-dimensional system of stochastic differential equations. Standard approaches to solving the mean-field equations [1] were not fruitful in the multi-dimensional setting. However, the mean-field solution could be found efficiently in a semi-analytical way using an iterative approach.
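Schematically, the iteration alternates between (i) sampling Gaussian surrogates from the current spectrum, (ii) measuring the spectrum of the nonlinearly transformed surrogates, and (iii) filtering by the single-neuron response. The sketch below reuses the conventions of the earlier snippets (one-sided spectra on the np.fft.rfftfreq grid, schematic normalization) and adds a damped update for stability; all names and defaults are illustrative.

```python
import numpy as np

def solve_dmft(chi2, g=2.0, dt=0.05, phi=np.tanh,
               n_iter=100, n_avg=200, relax=0.5, seed=1):
    """Semi-analytical fixed-point iteration for the self-consistent power
    spectrum: S_x(f) <- |chi(f)|^2 * g^2 * S_phi(f), where S_phi is measured
    from Gaussian surrogates drawn from the current S_x."""
    rng = np.random.default_rng(seed)
    n_f = len(chi2)
    n_t = 2 * (n_f - 1)
    S_x = np.ones(n_f)                            # initial guess
    for _ in range(n_iter):
        S_phi = np.zeros(n_f)
        for _ in range(n_avg):
            phases = np.exp(2j * np.pi * rng.random(n_f))
            x = np.fft.irfft(np.sqrt(S_x * n_t / dt) * phases, n=n_t)
            S_phi += dt / n_t * np.abs(np.fft.rfft(phi(x))) ** 2
        S_new = chi2 * g**2 * S_phi / n_avg
        S_x = (1 - relax) * S_x + relax * S_new   # damped (relaxed) update
    return S_x
```

With `chi2` computed as in the previous snippet, the returned spectrum is, when the iteration converges, the self-consistent mean-field power spectrum of the activation variable.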

"Grey neurons": statistical mechanics of neural networks

The iterative approach highlights how recurrent connections sharpen the response function of single neurons. Previous studies that considered the role of single-neuron properties in random network dynamics focused only on the role of the gain function [2, 14]. To our knowledge, ours is the first result that relates the single-neuron frequency response function to the spectral properties of random network dynamics.

We studied in detail the important case of neuronal adaptation, using a two-dimensional rate model. We observed that the resonance frequency can be computed from the single-unit properties and is therefore independent of the connection strength g. On the other hand, the presence of recurrent connections increases the coherence of the oscillations and therefore influences the correlation time. Indeed, as is typical of critical behavior, the correlation time in the chaotic phase diverges when approaching criticality.

In the presence of adaptation, this happens because the system approaches a limit cycle. It is interesting to observe that for slow adaptation there are two separate contributions to the correlation time of the network activity: an oscillatory component, related to the resonance frequency, and a long tail that scales with the adaptation timescale.
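Schematically, such an autocorrelation takes a two-component form; the expression below is an illustrative ansatz consistent with this description, not a derived result (the amplitudes $A$, $B$ and the decay time $\tau_{\text{osc}}$ are placeholder symbols):

$$C_x(\tau) \approx A\, \cos(2\pi f_0 \tau)\, e^{-|\tau|/\tau_{\text{osc}}} + B\, e^{-|\tau|/\tau_a},$$

with an oscillatory component at the resonance frequency $f_0$ and a slow tail governed by the adaptation timescale $\tau_a$.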

Such a multi-scale structure of the autocorrelation could be advantageous for network computations that require expressive dynamics over multiple timescales, as is often the case in motor control. Indeed, adaptation has been proposed to play a role in sequential memory retrieval [51], slow activity propagation [52], perceptual bistability [53], and decision making [54].

Moreover, spike-frequency adaptation (SFA) has beneficial consequences both for reservoir-computing approaches [10] and for spiking-neuron-based machine learning architectures [55]. Further work could explore the relation between the long correlation times induced by adaptation and such computational properties.


We found that, when a network in the resonant chaotic state is driven by an oscillatory input, chaos is more easily suppressed when the driving frequency is close to the resonance frequency. In the presence of weak input, in contrast, chaos is not fully suppressed. Interestingly, we found that in the chaotic regime the presence of adaptation shapes the SNR in frequency space. In particular, adaptation increases the SNR for low-frequency signals, a possibly important feature since behaviorally relevant stimuli can have information encoded in slow signal components [56].

It is known that the properties of biological neurons, including adaptation parameters, can be dynamically adjusted by neuromodulators [57, 58]. In view of our results, this would allow the SNR to be shaped dynamically, depending on the requirements imposed by the behavioral context. The authors of [59] used a slightly different network architecture and did not focus on the relation between single-neuron response and spectral properties, but rather on the correlation time of the network activity and on the effect of white-noise input.

One major difference is the conclusion reached regarding the correlation time: using a different definition, the authors of [59] conclude that the correlation time does not scale with the adaptation timescale. Based on our analysis, we infer that the definition of correlation time used in [59] captures only the oscillatory contribution to the correlation time, and not its long tail.

Current mean-field theories for spiking neural networks [61] are self-consistent only with respect to mean activities (firing rates), whereas second-order statistics, such as the autocorrelation function or power spectral density of inputs and outputs, are not treated self-consistently [62].

While iterative numerical procedures are available [62-64], a self-consistent analytical calculation of the autocorrelation or power spectrum via DMFT for networks of spiking neurons is known to be a hard theoretical problem. In the present manuscript, the rate-based modeling framework allowed us to put forward explicit expressions for the map of autocorrelations, in general in the form of an infinite series. For polynomial nonlinearities, however, the series simplifies to a finite sum.
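For instance, for the quadratic nonlinearity $\phi(x) = x^2$ and a zero-mean Gaussian process $x(t)$ with autocorrelation $C_x(\tau)$, Isserlis' theorem gives the map in closed form:

$$C_\phi(\tau) = \langle x^2(t)\, x^2(t+\tau) \rangle = C_x(0)^2 + 2\, C_x(\tau)^2,$$

so that, after subtracting the squared mean $\langle \phi \rangle^2 = C_x(0)^2$, the covariance of the output is simply $2\, C_x(\tau)^2$: a single term rather than an infinite series.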