Indices for testing neural codes


MeSH Major

  • Information Theory
  • Models, Neurological
  • Neurons


  • One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is smaller than the latter (i.e., if the code cannot account for the behavior), the code can be ruled out. The information-theoretic index most widely used in this context is Shannon's mutual information. The Shannon test, however, is not ideal for this purpose: while the codes it will rule out are truly nonviable, there will be some nonviable codes that it will fail to rule out. Here we describe a wide range of alternative indices that can be used for ruling codes out. The range includes a continuum from Shannon information to measures of the performance of a Bayesian decoder. We analyze the relationship of these indices to each other and their complementary strengths and weaknesses for addressing this problem.
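The ruling-out logic described in the abstract can be sketched in a few lines. The function names below (`mutual_information`, `shannon_test`) are illustrative, not from the paper, and the sketch assumes the joint stimulus/response distributions are already known as probability tables; in practice they must be estimated from finite data, with bias correction.

```python
import numpy as np

def mutual_information(joint):
    """Shannon mutual information (in bits) from a joint probability
    table: rows index the stimulus, columns the response."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()                 # normalize to a distribution
    px = joint.sum(axis=1, keepdims=True)       # stimulus marginal
    py = joint.sum(axis=0, keepdims=True)       # response marginal
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (px * py))
    return float(np.nansum(terms))              # 0 * log 0 contributes 0

def shannon_test(joint_stim_code, joint_stim_behavior):
    """Shannon test: a candidate code is ruled out (returns True) if it
    transmits less stimulus information than the behavioral response."""
    return (mutual_information(joint_stim_code)
            < mutual_information(joint_stim_behavior))

# Toy example: the hypothesized code is independent of the stimulus
# (I = 0 bits), while behavior tracks the stimulus perfectly (I = 1 bit),
# so the code cannot account for the behavior and is ruled out.
code_ruled_out = shannon_test([[0.25, 0.25], [0.25, 0.25]],
                              [[0.5, 0.0], [0.0, 0.5]])
```

As the abstract notes, passing this test is necessary but not sufficient: some nonviable codes will still survive it, which motivates the alternative indices the paper surveys.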

publication date

  • December 2008


type

  • Academic Article


language

  • eng

PubMed Central ID

  • PMC2671011

PubMed ID

  • 18533812

Additional Document Info

start page

  • 2895

end page

  • 2936

volume

  • 20

issue

  • 12