Monday, August 25, 2008

Learning and Information Theory

As a system learns, it evolves its knowledge about the world. If we think of this knowledge as being represented by a probabilistic model of the world, we can define the entropy of that model as a measure of the system's missing information about the world.
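In the usual notation, writing p for the model's distribution over events x in the set X, this entropy is

H(p) = -\sum_{x \in X} p(x) \log p(x),

which shrinks toward zero as the model becomes more certain.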

In 1970, E. Pfaffelhuber wrote a paper titled "Learning and Information Theory" and submitted it to the International Journal of Neuroscience [doi]. The introduction is very intriguing.
Intuitively, learning means an accumulation of a system's knowledge or information about a set X of data or events x or, equivalently, a decrease of the system's missing information about these data in the lapse of time. Thus, a quantitative definition of learning seems possible, provided one is able to introduce a measure for a system's missing information. As has been pointed out by various authors, e.g., Leibovic (1969, "Information Processing in the Nervous System"), Shannon's classical information measure is not appropriate to describe behavioral processes of biological systems, the reason being that this measure is based solely upon objective probabilities and cannot, therefore, represent a system's knowledge or beliefs, nor can it discriminate between events which are of great importance and others which are irrelevant for an individual system.
The paper ends up talking about the Kullback-Leibler divergence as a measure of the difference between the actual probability and the system's subjective probability. However, this measure is not readily usable by a learning system, because the actual probability is not known. The concept is extended in Palm's 1981 paper "Evidence, Information, and Surprise" in Biological Cybernetics [doi].
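For reference, with p the actual distribution and q the system's subjective distribution over the events x in X, the divergence in question is the standard

D_{KL}(p \,\|\, q) = \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)},

which vanishes exactly when the subjective model matches reality, and which indeed requires knowing p to evaluate.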

Data analysis tools for computational neuroscience

Mostly from the comp-neuro mailing list
Database

Sunday, August 24, 2008

A step towards Kernel Kalman filter

For an arbitrary continuous nonlinearity in both the state-space and observation models, we came up with an analytical approximation for the extended recursive least squares filter using kernel methods and some linear algebra. Now I need to fill in the details and implement it. I wonder if this is the Kalman filter, because the model and the solution look the same.
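The derivation itself is not written up here, so as a stand-in, below is a minimal sketch of a plain kernel recursive least squares update (growing the kernel matrix and updating its regularized inverse via the Schur complement). This is the generic technique, not the approximation mentioned above, and the Gaussian kernel width and regularization constant are assumptions for illustration.

```python
# Generic kernel recursive least squares: keep (K + lam*I)^{-1} and the
# expansion coefficients up to date as samples arrive one at a time.
import numpy as np

def gauss_kernel(a, b, width=1.0):
    # Gaussian (RBF) kernel between two input vectors.
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * width ** 2))

class KernelRLS:
    def __init__(self, lam=1e-2, width=1.0):
        self.lam = lam        # ridge regularization (assumed value)
        self.width = width    # kernel width (assumed value)
        self.X, self.y = [], []
        self.Ainv = None      # inverse of (K + lam * I)
        self.alpha = None     # expansion coefficients

    def update(self, x, y):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        k_self = gauss_kernel(x, x, self.width) + self.lam
        if self.Ainv is None:
            self.Ainv = np.array([[1.0 / k_self]])
        else:
            k_new = np.array([gauss_kernel(xi, x, self.width) for xi in self.X])
            b = self.Ainv @ k_new
            gamma = k_self - k_new @ b          # Schur complement
            top = np.hstack([self.Ainv + np.outer(b, b) / gamma,
                             (-b / gamma)[:, None]])
            bottom = np.hstack([-b / gamma, [1.0 / gamma]])
            self.Ainv = np.vstack([top, bottom])
        self.X.append(x)
        self.y.append(y)
        self.alpha = self.Ainv @ np.array(self.y)

    def predict(self, x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        k = np.array([gauss_kernel(xi, x, self.width) for xi in self.X])
        return k @ self.alpha
```

Feeding it noisy samples of, say, y = sin(x) one at a time and then calling predict on held-out points reproduces the usual kernel ridge regression fit incrementally.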

This posting is a test from my BlackBerry.

Friday, August 22, 2008

Excellent student project for philosophy of science

Recently on the computational neuroscience mailing list (comp-neuro), renowned scholars started a discussion of why neuroscience is not advancing in proportion to the money, time, and effort we put into the field. Many interesting threads are flying around: the nature of biological science compared to physical science, Kolmogorov-complexity-complete systems, realistic versus simple models for understanding, the reproducibility of experiments, and so on.
It would be a nice class project or even a preliminary topic for a thesis to analyze these hot discussions! Any volunteers?

Monday, August 11, 2008

Input-induced synchrony, and desynchrony due to internal dynamics


This is perhaps the most uninteresting network of neurons: uncoupled (independent) oscillating neurons. The only interesting thing is that they share a common input. The oscillation can be reset by the input, thus inducing synchrony of the population.
The plot on the right shows how they desynchronize over time, which is due to the variability of each neuron's period. The more variability, the faster the desynchronization.
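This behavior is easy to reproduce in a toy simulation. The sketch below uses uncoupled phase oscillators whose periods are jittered around a common mean, resets all phases with one shared input at time zero, and tracks how the phase coherence of the population decays afterwards; the neuron count, mean period, and jitter levels are illustrative assumptions, not the values behind the plot above.

```python
# Toy demonstration: a shared input resets uncoupled oscillators to the
# same phase, and period variability then desynchronizes them.
import numpy as np

def coherence_over_time(period_jitter, n_neurons=100, mean_period=50.0,
                        t_max=2000, seed=0):
    rng = np.random.default_rng(seed)
    # Each neuron's period is the mean period plus Gaussian jitter.
    periods = mean_period * (1.0 + period_jitter * rng.standard_normal(n_neurons))
    t = np.arange(t_max)
    # The common input at t = 0 resets every phase to zero (full synchrony).
    phases = 2.0 * np.pi * t[:, None] / periods[None, :]
    # Kuramoto order parameter: 1 = synchronous, near 0 = desynchronized.
    return np.abs(np.exp(1j * phases).mean(axis=1))

for jitter in (0.01, 0.05, 0.10):
    r = coherence_over_time(jitter)
    print(f"period jitter {jitter:.2f}: coherence after 20 mean periods = {r[1000]:.2f}")
```

Larger jitter makes the coherence drop faster, which is the qualitative point of the plot.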

Tuesday, August 05, 2008

Robustness of Cognitive processes

10:30 am
I was browsing through the TOC alert emails and found the following paper
The nature of the memory trace and its neurocomputational implications
by P. de Vries and K. van Slochteren
Journal of Computational Neuroscience, Vol. 25, No. 1 (2008), pp. 188-202.

The abstract states exactly what I thought about 10 years ago. I am sure the idea is generally known to many people, but here it is stated just the way I like it:
The brain processes underlying cognitive tasks must be very robust. Disruptions such as the destruction of large numbers of neurons, or the impact of alcohol and lack of sleep do not have negative effects except when they occur in an extreme form. This robustness implies that the parameters determining the functioning of networks of individual neurons must have large ranges or there must exist stabilizing mechanisms that keep the functioning of a network within narrow bounds...


The paper builds a high-level model of neural assemblies.

Donation to TortoiseSVN

I share all my research documents and code via Subversion (SVN). For Windows machines I use TortoiseSVN, a wonderful piece of software. Today, being in a good mood, I donated some US dollars to the developers.
Keep up the good work TortoiseSVN!

Monday, August 04, 2008

Living neurons as liquid in LSM, why it makes sense

8am

LCN: living cortical network
LSM: liquid state machine

Advantage of using LCN for LSM

In the original LSM framework, any dynamical system that satisfies the separation property can be used as the liquid. However, how to choose a proper liquid for a specific problem is not yet well established. Although the details are unknown, the LCN has the capability to adapt to the signals it is exposed to and to self-organize. Therefore, information processing through the LCN could be interesting. In fact, well-known phenomenological synaptic plasticity rules, including spike-timing-dependent plasticity, have turned out to have the power of self-organization and mutual information maximization. Being a biological system that is far from fully understood, the LCN has power equivalent to that of the brain: neurons grow axons and dendrites, self-regulate their ion channels, and synapses grow, split, and disappear, among much else. The totality of the LCN cannot be simulated on a computer the way a traditional LSM liquid would be. And even if such a simulation were possible, it would always be computationally cheaper to use the actual physical system than to run the complicated simulation.
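To make the contrast concrete, here is what the conventional, fully simulated LSM pipeline looks like in a few lines: a fixed random recurrent "liquid" expands the input stream, and only a linear readout is trained. Everything here (liquid size, weight scaling, and the toy delayed-recall task) is an illustrative assumption; the point of using the LCN is precisely that the liquid is a living network rather than this kind of simulation.

```python
# Minimal simulated LSM: a fixed random recurrent "liquid" expands the
# input into a high-dimensional state; only a linear readout is trained.
import numpy as np

rng = np.random.default_rng(1)
n_liquid, t_len = 200, 1000

# Random input and recurrent weights; the recurrent matrix is scaled so the
# liquid has fading memory (roughly the separation/echo property).
w_in = rng.standard_normal(n_liquid) * 0.5
w_rec = rng.standard_normal((n_liquid, n_liquid)) / np.sqrt(n_liquid) * 0.9

u = rng.uniform(-1, 1, t_len)          # input stream
target = np.roll(u, 3)                 # toy task: recall the input 3 steps back

x = np.zeros(n_liquid)
states = np.zeros((t_len, n_liquid))
for t in range(t_len):
    x = np.tanh(w_rec @ x + w_in * u[t])   # simple nonlinear liquid update
    states[t] = x

# Linear readout trained by least squares on the liquid states
# (skipping the initial transient).
w_out, *_ = np.linalg.lstsq(states[100:], target[100:], rcond=None)
pred = states[100:] @ w_out
print("readout correlation:", np.corrcoef(pred, target[100:])[0, 1])
```

The correlation printed at the end is just a sanity check that the random liquid carries enough memory of the input for a linear readout to recover a short delay; in the LCN setting, the simulated liquid would be replaced by recordings from the living network.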