**Abstract:**

We present fast methods for filtering voltage measurements and performing optimal inference of the location and strength of synaptic connections in large dendritic trees. Given noisy, subsampled voltage observations, we develop fast L1-penalized regression methods for Kalman state-space models of the neuron's voltage dynamics. The value of the L1-penalty parameter is chosen using cross-validation or, at low signal-to-noise ratios, a Mallows' Cp-like criterion. Using low-rank approximations, we reduce the inference runtime from cubic to linear in the number of dendritic compartments. We illustrate our results with simulations on toy and real neuronal geometries, using different sampling schemes. Sampling fixed points in the dendritic tree leads to poor inference of the synaptic weights; much better results are obtained when the sampling points change over time, either in a scanned manner or according to randomized sampling schemes.
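As a rough illustration of the L1-penalized regression step, the sketch below solves a plain lasso problem by coordinate descent on invented toy data. This is not the authors' Kalman-based method, and the fixed penalty value stands in for the cross-validation step described in the abstract:

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """L1-penalized least squares via coordinate descent.
    Solves min_w 0.5*||y - X w||^2 + lam*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]            # partial residual
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return w

# Toy data: sparse "synaptic weights", most entries zero
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[[2, 7]] = [1.5, -2.0]
y = X @ w_true + 0.1 * rng.standard_normal(100)

w_hat = lasso_cd(X, y, lam=5.0)
print(np.flatnonzero(np.abs(w_hat) > 0.1))  # support should recover indices 2 and 7
```

The L1 penalty drives most estimated weights exactly to zero, which is what makes it suitable for localizing a small number of synaptic inputs on a large tree.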


Title: A Neural Network Model of Working Memory

When: Thursday, April 26th, 2012

Time: ???

Where: Simons Center

**Abstract:**

The computational capabilities of the nervous system are the result of nonlinear dynamics emerging from the exchange among neurons of stereotyped messages, the action potentials. This web of cells, sparsely linked through synaptic contacts, is a heterogeneous excitable medium whose collective dynamics can be effectively described by mean-field theories. I will introduce a widely used approach relying on a multi-dimensional Fokker-Planck equation for the time-dependent densities of neuronal membrane potentials. Additional dimensions take into account other activity-dependent variables and non-instantaneous synaptic transmission. I will show how an effective dynamics for the rate of spikes emitted by the whole network can be worked out from this equation, allowing one to predict a rich repertoire of collective dynamical regimes. Finally, I will use this theoretical framework to characterize the neuronal substrate of the so-called “slow rhythms” (< 1 Hz) occurring during slow-wave sleep and under deep anesthesia in intact brains, and in cortical slices of mammals maintained in vitro. In this spontaneous activity regime of the nervous tissue, Up states at relatively high firing rate alternate with almost quiescent periods (Down states). The constructive role of endogenous noise and of heterogeneities in these nonlinear neuronal networks will also be discussed.
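The Up/Down alternation described above can be caricatured with a one-dimensional effective rate equation driven by a bistable sigmoidal gain plus additive noise; all parameter values below are invented for illustration and are not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(7)

def phi(x):
    """Sigmoidal gain function of the effective rate dynamics."""
    return 1.0 / (1.0 + np.exp(-x))

# Bistable rate model: dr/dt = (-r + phi(w*r - theta))/tau, plus noise.
# With w=8, theta=4 there are two stable fixed points (~0.02 and ~0.98);
# endogenous noise kicks the rate between the Down and Up wells.
dt, tau, w, theta, noise = 0.1, 10.0, 8.0, 4.0, 0.35
r = 0.1
trace = []
for _ in range(100_000):
    drift = (-r + phi(w * r - theta)) / tau
    r += drift * dt + noise * np.sqrt(dt / tau) * rng.normal()
    r = min(max(r, 0.0), 1.5)          # keep the rate in a physical range
    trace.append(r)

trace = np.array(trace)
up_frac = np.mean(trace > 0.5)
print(f"fraction of time in the Up state: {up_frac:.2f}")
```

Both wells are visited repeatedly over the run, the noise-driven analogue of the spontaneous Up/Down alternation in the abstract.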


**Abstract:**

Purkinje cells (PCs) of the cerebellar cortex have long been considered to perform similarly to perceptrons: given an input pattern in the granular layer, they should learn to provide an adequate motor output, thanks to plasticity of the parallel fiber (PF) to PC synapses, under the supervision of the climbing fiber input, which is assumed to carry an error signal (Marr 1969, Albus 1971). Supervised learning in the perceptron model has been studied extensively in the case of random uncorrelated input/output associations. In particular, it is known that when synapses are constrained to be positive (to account for the fact that PF-PC synapses are excitatory), the synaptic weight distribution at maximal storage capacity is composed of a large fraction of zero-weight synapses (‘silent’ synapses, Brunel et al. 2004). However, in the case of the cerebellum, the assumption of uncorrelated inputs and outputs is clearly unrealistic, as naturalistic input/motor sequences carry a substantial degree of temporal correlations. We therefore investigated both the capacity and the optimal connectivity of feed-forward networks learning associations between temporally correlated input/output sequences. We then considered a bistable output to mimic the postulated bistability of the PC (Yartsev et al. 2009, Loewenstein et al. 2005, Williams et al. 2002, Oldfield et al. 2010). We show that bistability can increase capacity when the output correlation is larger than the input correlation. Moreover, the weight distribution of the PF-PC synapses consists, in every case, of a large number of silent synapses and does not depend on the level of correlations.
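A toy sketch of the sign-constrained setting mentioned above: a perceptron with non-negative weights trained on uncorrelated random patterns by a projected perceptron rule. The pattern load and learning rate are invented for illustration, and this is not the authors' model of correlated sequences:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 60                              # synapses, random patterns (below capacity)
X = rng.choice([-1.0, 1.0], size=(P, N))    # uncorrelated bipolar input patterns
y = rng.choice([-1.0, 1.0], size=P)         # random desired outputs

w = np.zeros(N)                             # PF-PC synapses are excitatory: w >= 0
for epoch in range(2000):
    errors = 0
    for mu in range(P):
        if y[mu] * (X[mu] @ w) <= 0:        # pattern not yet learned
            w += 0.1 * y[mu] * X[mu]        # perceptron update
            np.maximum(w, 0.0, out=w)       # project back onto the sign constraint
            errors += 1
    if errors == 0:
        break

silent = float(np.mean(w == 0.0))
print(f"epochs used: {epoch + 1}, fraction of silent synapses: {silent:.2f}")
```

Even below capacity, the positivity constraint pins a sizable fraction of weights exactly to zero, the ‘silent synapses’ of Brunel et al. (2004).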

Title: Self-organizing Maps

When: Tuesday, March 20th, 2012

Time: 11:30AM

Where: Simons Center, Room 313

**Abstract:**

Synaptic connectivity of neural circuits may range from completely disordered to very structured. Common hypotheses are that high-level areas such as the prefrontal cortex are disordered, while early sensory areas like the primary visual cortex are structured. Even at the structured end of the spectrum, however, the connectivity matrix is not completely regular and contains significant randomness. Another general aspect of biological connectivity matrices is nonnormality. Nonnormality gives rise to a hidden feedforward structure between orthogonal activity patterns, and to transient amplification: small perturbations of a stable system away from its fixed point can lead to large transient responses. “Balanced amplification” in excitatory-inhibitory neural networks was used [1] to explain the observed similarity of spontaneous activity patterns to orientation maps in cat V1 [2]. However, [1] only considered networks with regular connectivity, without any randomness.

The mathematical study of structured, nonnormal random matrices has been underdeveloped. We recently developed tools for studying various phenomena in networks with connectivity matrices of the general form M + J, with M a deterministic matrix representing structure and J a fully random matrix with zero-mean i.i.d. entries. These include general formulae for the distribution of eigenvalues in the complex plane, the magnitude of transient amplification, and the frequency power spectrum of the response to external input. Here, using these tools, we study specific examples of M that mimic neural connectivity, have various hidden feedforward structures, and exhibit extreme nonnormality. We also extend the results of [3], which studied the eigenvalue distribution of random matrices of the above type with an M describing the connectivity of excitatory-inhibitory neurons in which all excitatory (inhibitory) synapses have equal average strength. [3] showed that, unlike the M = 0 case, whose eigenvalue distribution is the well-known circular law, here the distribution also contains eigenvalues lying significantly outside the traditional circle. We calculate the density of these outliers, and furthermore extend the results to more general M’s, like those mimicking the connectivity of orientation-tuned simple cells in V1.
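The outlier phenomenon can be illustrated numerically for the simplest case of a rank-one M added to an i.i.d. J; the matrix sizes and values below are assumptions for illustration, not the connectivity structures analyzed in the talk:

```python
import numpy as np

rng = np.random.default_rng(2)
N, g = 500, 0.5
# i.i.d. part: entries of std g/sqrt(N), so the bulk spectrum fills a disk of radius g
J = rng.standard_normal((N, N)) * g / np.sqrt(N)

# Rank-one structured part M = u v^T with a single deterministic eigenvalue v.u = 2.0
u = np.ones(N)
v = 2.0 * np.ones(N) / N
M = np.outer(u, v)

eig = np.linalg.eigvals(M + J)
radii = np.abs(eig)
outliers = eig[radii > g + 0.2]           # eigenvalues well outside the bulk disk
print("largest bulk radius:", radii[radii <= g + 0.2].max().round(2))
print("outliers:", outliers)
```

For this M, a single eigenvalue detaches from the circular-law bulk and sits near the deterministic eigenvalue of M, the same mechanism behind the outliers of [3].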

**References:**

[1] Murphy, B. K. and Miller, K. D. (2009). Neuron, 61(4), 635–648.

[2] Kenet, T. et al. (2003). Nature, 425(6961), 954–956.

[3] Rajan, K. and Abbott, L. F. (2006). Physical Review Letters, 97(18).

Title: Neocortical Machinery for Understanding the World

When: Wednesday, September 14, 2011

Time: 11:30 am

Where: Simons Center, Third Floor, Seminar Room 313

Wednesday July 13 at 11:30 in room 313

**Speaker:**

Konstantin K. Likharev

Department of Physics and Astronomy, Stony Brook University

CrossNets: Possible Nanoelectronic Neuromorphic Networks

**Abstract:**

I will review recent work on devices, circuits and architectures for possible hybrid CMOS/nanoelectronic integrated circuits based on nanowire crossbars, with similar, simple, two-terminal devices formed at each crosspoint. Special attention will be given to the application of such circuits in mixed-signal, adaptive neuromorphic networks (“CrossNets”), which may provide unparalleled performance for some important information-processing tasks and may in the future become the first hardware basis for challenging the mammalian cerebral cortex in both density and speed, at manageable power.

The work has been supported by AFOSR, DoD, FCRP via FENA Center, and NSF.

A recent review of the work, with a list of key publications, is available online at http://rsfq1.physics.sunysb.edu/~likharev/nano/SAM11.pdf

We will review, at an elementary level, some statistical methods useful in the analysis of spike train data. No background is needed beyond basic probability.

Wednesday June 15th

11:30-1:00, Simons Center for Geometry and Physics, Room 313

Conditional intensity function. Maximum likelihood estimation. Model validation and the time-rescaling theorem.
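A minimal sketch of these ideas in the simplest case, a homogeneous Poisson spike train with constant conditional intensity (invented numbers; standard textbook material rather than lecture content):

```python
import numpy as np

rng = np.random.default_rng(3)
rate, T = 20.0, 50.0                        # true rate (Hz) and recording length (s)
n = rng.poisson(rate * T)
spikes = np.sort(rng.uniform(0, T, n))      # homogeneous Poisson spike train

rate_mle = n / T                            # ML estimate for a constant intensity
# Time-rescaling: z_k = lambda * ISI_k is Exp(1) if the model is correct,
# so u_k = 1 - exp(-z_k) should look Uniform(0, 1)
z = rate_mle * np.diff(spikes)
u = 1.0 - np.exp(-z)
print(f"rate_mle = {rate_mle:.1f} Hz, mean of rescaled ISIs = {u.mean():.2f}")
```

In practice the rescaled intervals are compared against the uniform distribution (e.g. with a Kolmogorov-Smirnov test) to validate a fitted conditional intensity model.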

Thursday June 16th

11:30-1:00, Simons Center for Geometry and Physics, Room 313

**Lecture 2: Unobserved neural processes**

Models with hidden variables. Probabilistic classification. The Expectation-Maximization algorithm. Probabilistic graphical models.
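As a standard textbook illustration of the Expectation-Maximization algorithm (not material from the lecture itself), EM for a two-component Gaussian mixture recovers hidden class structure from unlabeled data:

```python
import numpy as np

rng = np.random.default_rng(4)
# Two hidden classes with different means; EM must recover them unsupervised
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(4.0, 1.0, 500)])

mu = np.array([-1.0, 1.0])                   # deliberately poor initial means
pi = np.array([0.5, 0.5])
sigma = np.array([1.0, 1.0])
for _ in range(50):
    # E-step: posterior responsibility of each component for each point
    like = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma ** 2)) / sigma
    r = like / like.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means and variances from weighted data
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("estimated means:", np.sort(mu).round(1))   # close to the true means 0 and 4
```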

Wednesday June 22nd

11:30-1:00, Simons Center for Geometry and Physics, Room 313

**Lecture 3: Multi-state neural systems**

Hidden Markov models. Forward and backward probabilities. The Viterbi algorithm for decoding.
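A small worked example of the Viterbi algorithm for a two-state HMM; the ‘quiet’/‘active’ states and all probabilities are invented, in the spirit of the lecture topics:

```python
import numpy as np

# Two-state HMM: a 'quiet' (0) vs 'active' (1) neuron emitting low/high spike counts
A = np.array([[0.9, 0.1], [0.2, 0.8]])     # sticky transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])     # P(obs | state), obs in {low=0, high=1}
pi0 = np.array([0.5, 0.5])
obs = [0, 0, 1, 1, 1, 0, 0]                # observed low/high sequence

# Forward pass in log space: best path score ending in each state
logd = np.log(pi0) + np.log(B[:, obs[0]])
back = []
for o in obs[1:]:
    step = logd[:, None] + np.log(A)       # score of each (prev, next) pair
    back.append(step.argmax(axis=0))       # best predecessor for each state
    logd = step.max(axis=0) + np.log(B[:, o])

# Backtrack the most likely hidden state sequence
path = [int(logd.argmax())]
for bp in reversed(back):
    path.append(int(bp[path[-1]]))
path.reverse()
print(path)  # → [0, 0, 1, 1, 1, 0, 0]
```

The decoded sequence follows the observations here because the emissions are informative relative to the transition stickiness.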

Thursday June 23rd

11:30-1:00, Simons Center for Geometry and Physics, Room 313

**Lecture 4: Sampling**

Rejection and importance sampling. Markov chains. Metropolis-Hastings and Gibbs sampling.
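A minimal random-walk Metropolis-Hastings sampler targeting a standard normal (a standard textbook sketch, not lecture material):

```python
import numpy as np

rng = np.random.default_rng(5)

def log_p(x):
    """Unnormalized log-density of the target: a standard normal."""
    return -0.5 * x ** 2

x, samples = 0.0, []
for _ in range(20_000):
    prop = x + rng.normal(0, 1.0)          # symmetric random-walk proposal
    # Accept with probability min(1, p(prop)/p(x)); log form avoids underflow
    if np.log(rng.uniform()) < log_p(prop) - log_p(x):
        x = prop
    samples.append(x)

s = np.array(samples[2000:])               # discard burn-in
print(f"sample mean {s.mean():.2f}, sample std {s.std():.2f}")   # ~0.00, ~1.00
```

Because the proposal is symmetric, the Hastings correction cancels and only the target density ratio enters the acceptance test.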

Wednesday at 11:30 in room 313

First episode: today’s talk (see below)

May 11th at 11:30am in the Simons Center for Geometry and Physics Room 313


**Speaker:**

Giancarlo La Camera, PhD

Dept. Neurobiology and Behavior

Life Sciences Bldg 513

SUNY Stony Brook

**Title:**

Selected problems in learning and decision making

**Abstract:**

I will give a brief introduction to Neuroscience with a focus on learning and decision-making. I will then describe some recent results on how people and animals make decisions between a safe and a risky option, and what factors influence their preference. Some prominent theories of predictive learning will be tested against these phenomena. Finally, I will discuss current efforts to model reinforcement learning in populations of spiking neurons, and how this could provide a biological solution to the problem of learning how to extract meaningful segments from a sensory stream.
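As a hedged sketch of the safe-versus-risky decision setting mentioned above, a delta-rule learner with softmax choice (invented payoffs and parameters; not the speaker's actual model or data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Safe option (0) always pays 0.5; risky option (1) pays 1.0 with p = 0.4, else 0
def payoff(choice):
    return 0.5 if choice == 0 else float(rng.random() < 0.4)

Q = np.zeros(2)                      # learned action values
alpha, beta = 0.1, 5.0               # learning rate, softmax inverse temperature
picks = []
for t in range(2000):
    # Softmax choice between the two options based on current value estimates
    p_risky = 1.0 / (1.0 + np.exp(-beta * (Q[1] - Q[0])))
    c = int(rng.random() < p_risky)
    Q[c] += alpha * (payoff(c) - Q[c])          # delta-rule (prediction-error) update
    picks.append(c)

print(f"Q = {Q.round(2)}, late risky-choice rate = {np.mean(picks[-500:]):.2f}")
```

With these payoffs the risky option has the lower expected value (0.4 vs 0.5), so the learner comes to choose it less than half of the time, while the softmax temperature keeps some residual exploration.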