Program & Abstracts / OCNC2015

OCNC2015 Program

*All the lectures take place in the Seminar Room, OIST Seaside House, unless otherwise indicated.
*Tutorials are optional sessions.
 

Faculty Meeting Signup 

*Signup for each meeting will open according to the schedule in the signup sheet
*Meetings with faculty take place in Meeting Room 1
*Meetings with faculty on Tuesday, June 16 take place at the OIST main campus (rooms to be announced)
 

Week 1 (Jun 8-14) : Methods

Monday, June 8

09:30-09:45   Greetings from the organizers

10:00-13:00   Parallel Sessions 

                     Biologists: Kenji Doya  Introduction to numerical methods for ordinary and partial differential equations

                     Theoreticians: Yoko Yazaki-Sugiyama  Neuronal basis for information processing. / Meeting Room 1

14:00-18:00   Student poster presentations (14:00-16:00 Group1 / 16:00-18:00 Group2)

19:00-21:00   Reception & Dinner

  

Tuesday, June 9

09:30-12:30   Erik De Schutter  Modeling biochemical reactions, diffusion and reaction-diffusion systems

14:00-15:00   Introduction of the tutors

15:30-18:00   Tutorial: Python

 

Wednesday, June 10

09:30-12:30   Bernd Kuhn

                  1. Ion channel physiology and the Hodgkin-Huxley model of neuronal activity

                  2. Functional optical imaging

14:00-16:00   Tutorial: Brian Simulator

16:00-18:00   Tutorial: NEURON

 

Thursday, June 11

09:30-12:30   Erik De Schutter  Introduction to modeling neurons and networks

14:00-16:00   Tutorial: NEST

16:00-18:00   Tutorial: MATLAB (parallel sessions – basic/advanced)

 

Friday, June 12

09:30-12:30   Kenji Doya  Introduction to reinforcement learning and Bayesian inference

14:00-16:00   Tutorial: STEPS

16:00-18:00   Tutorial: Other tutorials/Discussion with tutors

 

Saturday, June 13

09:30-12:30   Greg Stephens    TBA

14:00-16:00   Meeting with Dr. Stephens

 

Sunday, June 14 (Day off)

 

 

Week 2 (Jun 15-21) : Neurons, Networks and Behavior I

Monday, June 15

09:30-12:30   Jonathan Rubin    Dynamical Systems Methods for Neuroscience: the Power of Reduced Models

14:00-16:00   Project work or faculty meeting with Dr. Rubin

16:00-18:00   Project work or faculty meeting with Dr. Doya

 

Tuesday, June 16

09:30-12:30   Xiao-Jing Wang   Cognitive-type microcircuits: decision-making and working memory

13:40-16:00   Visit to OIST campus  (14:00-14:20 overview of campus / 14:20-16:00 poster session presented by OIST researchers) 

16:00-18:00   Project work or meeting with Dr. Kuhn or Dr. Yazaki-Sugiyama

 

Wednesday, June 17

09:30-12:30   Taro Toyoizumi  Dynamical Systems Approach to Neuroscience

14:00-16:00   Project work or meeting with Dr. Wang

16:00-18:00   Project work or meeting with Dr. De Schutter

 

Thursday, June 18

09:30-11:00   Peter Latham  Rate vs. temporal coding: what it means, and the role of network dynamics

11:00-12:30   Steve Prescott  Rate vs. temporal coding and the importance of biophysical factors

14:00-16:00   Project work or meeting with Dr. Toyoizumi

16:00-18:00   Project work

 

Friday, June 19

09:30-11:00   Peter Latham  Rate vs. temporal coding: what it means, and the role of network dynamics

11:00-12:30   Steve Prescott  Rate vs. temporal coding and the importance of biophysical factors

14:00-16:00   Project work or meeting with Dr. Latham

16:00-18:00   Project work or meeting with Dr. Prescott

 

Saturday, June 20

09:30-12:30   Project work 

 

Sunday, June 21 (Day off)

 

 

Week 3 (Jun 22 – 25) : Neurons, Networks and Behavior II

Monday, June 22

09:30-12:30   Jackie Schiller   Dendritic processing in cortical pyramidal neurons in-vitro and in-vivo.

14:00-17:00   Yukio Nishimura   Rewiring the damaged neural pathways via a computer interface

17:00-18:00   Project work or meeting with Dr. Nishimura

 

Tuesday, June 23

09:30-12:30   Netta Cohen   Neural and neuromechanical computation in a physical world

14:00-16:00   Project work or meeting with Dr. Schiller

16:00-18:00   Project work or meeting with Dr. Cohen

 

Wednesday, June 24

09:30-12:30   Wako Yoshida  Computational cognitive neuroimaging and social decision-making

14:00-16:00   Project work or meeting with Dr. Yoshida

16:00-18:00   Project work or meeting with Dr. Izhikevich

 

Thursday, June 25

09:30-11:00   Eugene Izhikevich  Spikes

11:15-12:30   Student project presentations

14:00-16:00   Student project presentations

19:00-21:00   Banquet & Dinner


 

Abstracts

Lecturer Kenji Doya
Title

Introduction to numerical methods for ordinary and partial differential equations

Abstract

This tutorial introduces the basic concepts of differential equations and how to solve them, or simulate their behaviors in time, using a computer. Key concepts like eigenvalues and stability are explained while solving simple differential equations. Some examples of Hodgkin-Huxley type neuron models and cable equations are also introduced.
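
For participants who would like to experiment beforehand, here is a minimal sketch (not course material) of the simplest case: forward Euler integration of a leaky membrane equation, whose single eigenvalue -1/tau also determines when the explicit scheme is stable (dt < 2*tau).

    # Minimal sketch (illustrative only): forward Euler for the leaky membrane
    # equation dV/dt = (-(V - E_L) + R*I) / tau. The single eigenvalue -1/tau
    # means the explicit scheme is stable only if the time step dt < 2*tau.
    import numpy as np

    def euler_leaky_membrane(I=1.5, E_L=-70.0, R=10.0, tau=20.0, dt=0.1, T=200.0):
        """Integrate the membrane equation with forward Euler; units are ms and mV."""
        n_steps = int(T / dt)
        V = np.empty(n_steps)
        V[0] = E_L
        for k in range(n_steps - 1):
            dVdt = (-(V[k] - E_L) + R * I) / tau
            V[k + 1] = V[k] + dt * dVdt          # explicit Euler update
        return V

    if __name__ == "__main__":
        V = euler_leaky_membrane()
        print("steady state ~ %.1f mV (analytic: %.1f mV)" % (V[-1], -70.0 + 10.0 * 1.5))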

Suggested Readings

Doya K: Reinforcement learning: Computational theory and biological mechanisms. HFSP Journal, 1(1), 30-40 (2007)

Free on-line access: http://dx.doi.org/10.2976/1.2732246

Doya K, Ishii S: A probability primer. In Doya K, Ishii S, Pouget A, Rao RPN eds. Bayesian Brain: Probabilistic Approaches to Neural Coding, pp. 3-13. MIT Press (2007).

Free on-line access: http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=11106

 

Lecturer Kenji Doya
Title

Introduction to reinforcement learning and Bayesian inference

Abstract

The aim of this tutorial is to present the theoretical core of models of animal and human action and perception. In the first half of the tutorial, we will focus on "reinforcement learning", a theoretical framework in which an adaptive agent learns behaviors from exploratory actions and the resulting rewards or punishments. Reinforcement learning has played an essential role in understanding the neural circuits and neurochemical systems behind adaptive action learning, most notably the basal ganglia and the dopamine system. In the second half, we will familiarize ourselves with the framework of Bayesian inference, which is critical in understanding the process of perception from noisy, incomplete observations.
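
As a rough orientation (illustrative snippets, not taken from the tutorial), the two halves of the tutorial revolve around two very small update rules: the temporal-difference update used in Q-learning, and Bayes' rule for combining a prior with a likelihood.

    # Illustrative sketch only: the two core updates discussed in the tutorial.
    import numpy as np

    # 1) Reinforcement learning: tabular Q-learning update for one transition,
    #    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.95):
        td_error = r + gamma * np.max(Q[s_next]) - Q[s, a]
        Q[s, a] += alpha * td_error
        return Q

    # 2) Bayesian inference: posterior over hypotheses after one noisy observation,
    #    P(h | x) proportional to P(x | h) * P(h)
    def bayes_update(prior, likelihoods):
        posterior = prior * likelihoods
        return posterior / posterior.sum()

    if __name__ == "__main__":
        Q = np.zeros((4, 2))                      # 4 states, 2 actions
        Q = q_update(Q, s=0, a=1, r=1.0, s_next=2)
        print("updated Q(0,1):", Q[0, 1])
        print("posterior:", bayes_update(np.array([0.5, 0.5]), np.array([0.8, 0.3])))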

Suggested Readings

Doya K: Reinforcement learning: Computational theory and biological mechanisms. HFSP Journal, 1(1), 30-40 (2007)

Free on-line access: http://dx.doi.org/10.2976/1.2732246

Doya K, Ishii S: A probability primer. In Doya K, Ishii S, Pouget A, Rao RPN eds. Bayesian Brain: Probabilistic Approaches to Neural Coding, pp. 3-13. MIT Press (2007).

Free on-line access: http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=11106

 

Lecturer

Yoko Yazaki-Sugiyama

Title

Neuronal basis for information processing.

Abstract 

We acquire visual information with the eyes, auditory information with the ears, olfactory information with the nose, and so on; this information is conveyed to the brain, where it is processed and transformed into what we recognize as the senses. The brain also generates and controls complex behavior, and is responsible for aspects of behavior such as feelings and abstract thought.

Neurons are the smallest components of the brain and, by wiring themselves together, are the key players in the signal processing that accomplishes these difficult tasks.

In this lecture we will cover the basic physiological characteristics and mechanisms of neurons to see how such complicated tasks can be performed. We will also try to get an idea of how neurons can compute by connecting to each other in clever ways.

Suggested Readings

The neuron: Cell and Molecular Biology. I.B. Levitan and L.K. Kaczmarek, Oxford University Press

 

Lecturer Erik De Schutter
Title

Modeling biochemical reactions, diffusion and reaction-diffusion systems

Abstract

In my first talk I will use calcium dynamics modeling as a way to introduce deterministic solution methods for reaction-diffusion systems. The talk covers exponentially decaying calcium pools, diffusion, calcium buffers and buffered diffusion, and calcium pumps and exchangers. I will describe properties of buffered diffusion systems and ways to characterize them experimentally. Finally I will compare the different modeling approaches.
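
For orientation, the simplest model in this list, an exponentially decaying calcium pool, fits in a few lines; the sketch below is illustrative only, with assumed units and parameters.

    # Illustrative sketch: an exponentially decaying free-calcium pool,
    # d[Ca]/dt = -([Ca] - Ca_rest)/tau + J(t), driven by a brief influx pulse.
    import numpy as np

    def calcium_pool(ca_rest=0.05, tau=50.0, dt=0.1, T=500.0,
                     pulse=(100.0, 110.0), j_in=0.2):
        """Concentrations in uM, time in ms; j_in is the influx rate during the pulse."""
        n = int(T / dt)
        ca = np.full(n, ca_rest)
        for k in range(n - 1):
            t = k * dt
            J = j_in if pulse[0] <= t < pulse[1] else 0.0
            ca[k + 1] = ca[k] + dt * (-(ca[k] - ca_rest) / tau + J)
        return ca

    if __name__ == "__main__":
        ca = calcium_pool()
        print("peak [Ca] = %.2f uM, value at 500 ms = %.2f uM" % (ca.max(), ca[-1]))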

In the second talk I will turn towards stochastic reaction-diffusion modeling. Two methods will be described: Gillespie's Stochastic Simulation algorithm extended to simulate diffusion, and particle-based methods. I will briefly describe the STEPS software. I will then describe two applications: stochastic reaction modeling of LTD induction in Purkinje cells and stochastic diffusion modeling of anomalous diffusion in spiny dendrites.
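
As a rough illustration of the first of these methods (a bare-bones sketch, not STEPS code), Gillespie's direct method for a single reversible calcium-buffering reaction looks like this:

    # Minimal sketch of Gillespie's direct method (illustrative only, not STEPS):
    # stochastic simulation of reversible buffering, Ca + B <-> CaB, in molecule counts.
    import numpy as np

    def gillespie_buffering(ca=50, b=100, cab=0, k_on=0.005, k_off=0.1,
                            t_end=100.0, seed=0):
        rng = np.random.default_rng(seed)
        t, times, states = 0.0, [0.0], [(ca, b, cab)]
        while t < t_end:
            a_on = k_on * ca * b               # propensity of binding
            a_off = k_off * cab                # propensity of unbinding
            a_total = a_on + a_off
            if a_total == 0.0:
                break
            t += rng.exponential(1.0 / a_total)        # waiting time to next reaction
            if rng.random() < a_on / a_total:          # choose which reaction fires
                ca, b, cab = ca - 1, b - 1, cab + 1
            else:
                ca, b, cab = ca + 1, b + 1, cab - 1
            times.append(t)
            states.append((ca, b, cab))
        return np.array(times), np.array(states)

    if __name__ == "__main__":
        times, states = gillespie_buffering()
        print("final (Ca, B, CaB) counts:", states[-1])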

Suggested Readings

• U.S. Bhalla and S. Wils: Reaction-diffusion modeling. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)

• E. De Schutter: Modeling intracellular calcium dynamics. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)

• G. Antunes and E. De Schutter: A stochastic signaling network mediates the probabilistic induction of cerebellar long-term depression. Journal of Neuroscience 32: 9288–9300. (2012).

• F. Santamaria, S. Wils, E. De Schutter and G.J. Augustine: Anomalous diffusion in Purkinje cell dendrites caused by dendritic spines. Neuron 52: 635–648 (2006).

• Several chapters in Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston (2009).

• V. Steuber et al.: Cerebellar LTD and pattern recognition by Purkinje cells. Neuron 54: 121–136 (2007).

 

Lecturer Erik De Schutter
Title

Introduction to modeling neurons and networks

Abstract

In the first talk I will discuss methods to model morphologically detailed neurons. I will briefly introduce cable-theory, the mathematical description of current flow in dendrites. By discretizing the cable equation we come to compartmental modeling, the standard method to simulate morphologically detailed models of neurons. I will discuss the challenges in fitting compartmental models to experimental data with an emphasis on active properties. The talk will finish with a brief overview of dendritic properties predicted by cable theory and experimental data confirming these predictions.

The second talk will briefly introduce network modeling. I will introduce simpler neuron models like integrate-and-fire neurons and then move on to modeling synaptic currents. I will wrap up with an overview of network connectivity.
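
To give a flavour of the simpler neuron models mentioned above, here is a minimal sketch (illustrative only) of a leaky integrate-and-fire neuron driven by an exponentially decaying synaptic conductance:

    # Illustrative sketch: leaky integrate-and-fire neuron with a single
    # exponential synaptic conductance; units are ms and mV.
    import numpy as np

    def lif_with_synapse(spike_times_in, dt=0.1, T=200.0,
                         tau_m=20.0, tau_syn=5.0, E_L=-70.0, E_syn=0.0,
                         v_th=-54.0, v_reset=-70.0, w=0.2):
        v, g, out_spikes = E_L, 0.0, []
        incoming = set(int(round(t / dt)) for t in spike_times_in)
        for k in range(int(T / dt)):
            if k in incoming:
                g += w                                    # presynaptic spike arrives
            g -= dt * g / tau_syn                         # synaptic conductance decay
            v += dt * (-(v - E_L) - g * (v - E_syn)) / tau_m   # leak + synaptic current
            if v >= v_th:                                 # threshold crossing -> spike
                out_spikes.append(k * dt)
                v = v_reset
        return out_spikes

    if __name__ == "__main__":
        spikes = lif_with_synapse(np.arange(5.0, 200.0, 2.0))
        print("output spike times (ms):", np.round(spikes, 1))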

Suggested Readings

• U.S. Bhalla and S. Wils: Reaction-diffusion modeling. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)

• E. De Schutter: Modeling intracellular calcium dynamics. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)

• G. Antunes and E. De Schutter: A stochastic signaling network mediates the probabilistic induction of cerebellar long-term depression. Journal of Neuroscience 32: 9288–9300. (2012).

• F. Santamaria, S. Wils, E. De Schutter and G.J. Augustine: Anomalous diffusion in Purkinje cell dendrites caused by dendritic spines. Neuron 52: 635–648 (2006).

• Several chapters in Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston (2009).

• V. Steuber et al.: Cerebellar LTD and pattern recognition by Purkinje cells. Neuron 54: 121–136 (2007).

 

Lecturer

Bernd Kuhn 

Title

Ion channel physiology and the Hodgkin-Huxley model of neuronal activity

Abstract

In my first lecture I will talk about electrical activity in neurons. I will start with the basics of ion channels, focusing specifically on voltage-gated channels and their dynamics in response to membrane voltage. Neurons use a combination of different voltage-gated channels to generate fast (about 1 ms), depolarizing action potentials. I will explain the first action potential model by Hodgkin and Huxley. Finally, I will discuss more recent additions to, and fine-tuning of, the time-honored Hodgkin-Huxley model.
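
For reference, the classic Hodgkin-Huxley equations with textbook parameters (a sketch, not the lecturer's code) can be integrated in a few lines with forward Euler:

    # Standard Hodgkin-Huxley model (textbook parameters, modern voltage convention
    # with rest near -65 mV), integrated with forward Euler. Illustrative only.
    import numpy as np

    def hh_step(V, m, h, n, I, dt=0.01, gNa=120.0, gK=36.0, gL=0.3,
                ENa=50.0, EK=-77.0, EL=-54.4, C=1.0):
        # Voltage-dependent opening/closing rates of the gating variables
        a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
        b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
        a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
        a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
        b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
        # Ionic currents and Euler updates
        I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
        V += dt * (I - I_ion) / C
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        return V, m, h, n

    if __name__ == "__main__":
        V, m, h, n = -65.0, 0.05, 0.6, 0.32
        trace = []
        for _ in range(int(50.0 / 0.01)):          # 50 ms with a 10 uA/cm^2 step current
            V, m, h, n = hh_step(V, m, h, n, I=10.0)
            trace.append(V)
        print("peak membrane potential: %.1f mV" % max(trace))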

Suggested Readings

Johnston and Wu: Foundations of Cellular Neurophysiology, MIT Press

Helmchen, Konnerth: Imaging in Neuroscience, 2011

Yuste, Lanni, Konnerth: Imaging Neurons, 2000

 

Lecturer Bernd Kuhn
Title

Functional optical imaging 

Abstract

Functional optical imaging has become one of the key techniques in neuroscience. In my second lecture I will introduce fluorescence and the most important imaging methods. I will explain what we can learn from them but also discuss their limitations.

Suggested Readings

Johnston and Wu: Foundations of Cellular Neurophysiology, MIT Press

Helmchen, Konnerth: Imaging in Neuroscience, 2011

Yuste, Lanni, Konnerth: Imaging Neurons, 2000

 

Lecturer

Greg Stephens

Title TBA
Abstract TBA
Suggested Readings TBA

 

Lecturer Jonathan Rubin
Title Dynamical Systems Methods for Neuroscience: the Power of Reduced Models
Abstract

Reduced models of neurons and neuronal networks offer the potential for efficient simulations and for insights about fundamental mechanisms that contribute to neuronal behavior.  Dynamical systems theory provides the toolbox that we can use to extract such insights.   I will review some basic elements of dynamical systems theory in the context of reduced neuronal models.  I will also use these ideas to discuss some interesting neuronal behaviors such as post-inhibitory rebound and bursting.
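
As a small illustration of the kind of reduced model discussed (a FitzHugh-Nagumo sketch with assumed parameters, not the lecturer's code), the two-variable model below produces a post-inhibitory rebound spike: a hyperpolarizing pulse lowers the slow recovery variable, and on release the trajectory makes a full excursion around the cubic nullcline.

    # Illustrative FitzHugh-Nagumo sketch showing post-inhibitory rebound.
    import numpy as np

    def fitzhugh_nagumo(I_of_t, dt=0.01, a=0.7, b=0.8, eps=0.08, v0=-1.2, w0=-0.625):
        v, w, vs = v0, w0, []
        for I in I_of_t:
            dv = v - v**3 / 3.0 - w + I        # fast voltage-like variable
            dw = eps * (v + a - b * w)         # slow recovery variable
            v += dt * dv
            w += dt * dw
            vs.append(v)
        return np.array(vs)

    if __name__ == "__main__":
        dt, T = 0.01, 200.0
        t = np.arange(0.0, T, dt)
        I = np.where((t > 50.0) & (t < 100.0), -0.5, 0.0)   # hyperpolarizing pulse only
        v = fitzhugh_nagumo(I, dt=dt)
        print("max v after pulse release: %.2f (a rebound spike reaches about 2)" % v[t > 100.0].max())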

 

Lecturer Taro Toyoizumi
Title

Dynamical Systems Approach to Neuroscience

Abstract

Advances in neuronal recording technologies have enabled the acquisition of data at an unprecedented scale and resolution. However, the increase in data complexity poses a challenge to reductionist approaches. Motivated by general principles of dynamical systems, we characterized nonlinear cross-embedding relationships for broad-spectrum neurophysiological data. Unlike conventional approaches, our method extracts dynamical complexity and directional interactions among brain regions. This approach reveals a hierarchical organization of brain areas, in which interactions are directed from low to high complexity, suggesting an emergent dynamical code of inter-area messaging.

Suggested Readings

H. Shimazaki, K. Sadeghi, T. Ishikawa, Y. Ikegaya, and T. Toyoizumi, Scientific Reports in press (2015).

T. Toyoizumi, M. Kaneko, M. P. Stryker, and K. D. Miller, Neuron 84, 497-510 (2014).

T. Toyoizumi, H. Miyamoto, Y. Yazaki-Sugiyama, N. Atapour, T. K. Hensch, and K. D. Miller, Neuron 80, 51-63 (2013).

T. Toyoizumi and L. F. Abbott, Physical Review E 84, 051908 (2011).

 

Lecturer

Xiao-Jing Wang

Title Cognitive-type microcircuits: decision-making and working memory
Abstract Neural circuit mechanisms of higher cognitive functions have become a focus of intense research in the last decade. In this lecture, I will introduce a core circuit model of "the cognitive type" that has emerged from this line of research. I will start with the basics from models of single neurons and synapses, to strongly recurrent network dynamics. I will show how such a cognitive-type microcircuit can subserve both decision-making and working memory, with testable predictions. In the final part of the lecture I will discuss how to go beyond local circuits to large-scale brain circuit modeling.
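
As a toy illustration of the idea (a rate-model caricature with assumed parameters, not the spiking-network model presented in the lecture), two populations with self-excitation and mutual inhibition can make a decision, with the winner set by a small evidence bias, and then hold it as persistent activity after the stimulus is removed:

    # Toy rate model (illustrative only): two populations with recurrent
    # self-excitation and mutual inhibition. The evidence bias decides the winner,
    # whose elevated rate persists after stimulus offset (working memory).
    import numpy as np

    def decision_circuit(bias=0.05, dt=1.0, tau=100.0, w_self=2.0, w_cross=2.0, seed=1):
        rng = np.random.default_rng(seed)
        f = lambda x: 1.0 / (1.0 + np.exp(-6.0 * (x - 0.5)))   # population gain function
        r = np.array([0.05, 0.05])                              # firing rates (arbitrary units)
        for k in range(2000):                                   # 2000 ms in 1 ms steps
            stim = np.array([0.3 + bias, 0.3 - bias]) if k < 1000 else 0.0
            drive = w_self * r - w_cross * r[::-1] + stim + 0.02 * rng.standard_normal(2)
            r += dt * (-r + f(drive)) / tau
        return r

    if __name__ == "__main__":
        print("rates 1 s after stimulus offset:", decision_circuit().round(2))
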
Suggested Readings Wang X-J (2008) Decision making in recurrent neural circuits (invited review). Neuron 60, 215-234. 

Wang X-J (2002) Probabilistic decision making by slow reverberation in neocortical circuits. Neuron 36, 955-968. 

Wang X-J (2001) Synaptic reverberation underlying mnemonic persistent activity. Trends in Neurosci 24, 455-463.

 

Lecturer

Peter Latham

Title

Rate vs. temporal coding: what it means, and the role of network dynamics

Abstract

I'll be giving two 1.5 hour talks.

Talk 1:

Neural coding schemes are often divided according to whether information is represented primarily by the rate or timing of spikes. Quantifying exactly what this means, however, is far from trivial. One possibility is that rate and timing codes lie on a continuum; another is that there is no continuum, but in fact rate and timing codes are qualitatively different. Perhaps surprisingly, even after 20 years of neural coding studies, nobody agrees. Nevertheless, I will attempt to clarify what each of these possibilities means, and I'll review some studies aimed at determining whether rate or timing is most important.
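
Quantifying "how much the spikes tell us about the stimulus" is usually done with mutual information; the small example below (not from the lecture) computes it for a toy joint distribution over a binary stimulus and a spike count.

    # Illustrative sketch: mutual information between stimulus S and response R,
    # I(S;R) = sum_{s,r} p(s,r) * log2[ p(s,r) / (p(s) p(r)) ], in bits.
    import numpy as np

    def mutual_information(joint):
        """joint[s, r] = P(stimulus=s, response=r)."""
        p_s = joint.sum(axis=1, keepdims=True)
        p_r = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_r)[nz])))

    if __name__ == "__main__":
        # Rows: two equiprobable stimuli; columns: spike counts 0, 1, 2.
        joint = np.array([[0.30, 0.15, 0.05],
                          [0.05, 0.15, 0.30]])
        print("I(S;R) = %.3f bits" % mutual_information(joint))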

Talk 2:

One thing both Steve Prescott and I agree on is that synchronous spikes are critical for spike timing codes. However, they are necessary, but not sufficient. I'll address the question: can recurrent networks of neurons maintain spike timing codes? Here the focus is deep in cortex, beyond sensory areas. I'm interested in questions like: when you're thinking about what you had for dinner last night, do your neurons communicate primarily using rate or timing? I'll focus on two studies, one showing that networks in the brain tend to be chaotic ("Sensitivity to perturbations in vivo implies high noise and suggests rate coding in cortex," London et al., Nature 466:123-127; 2010), and the other exploring the effects of chaos on temporal memory ("Randomly connected networks have short temporal memory," Wallace et al., Neural Comput. 25:1408-1439; 2013).

Suggested Readings

It would be nice to know a little information theory (you can check out my Scholarpedia page for an intro: http://www.scholarpedia.org/article/Mutual_information).

And there are two relevant papers:

    1. Sensitivity to perturbations in vivo implies high noise and suggests rate coding in cortex. 

    2. Randomly connected networks have short temporal memory.

Both can be found on my website,

    http://www.gatsby.ucl.ac.uk/~pel/publications.html

 

Lecturer

Steve Prescott

Title

Rate vs. temporal coding and the importance of biophysical factors

Abstract

Neural coding schemes are often divided according to whether information is represented primarily by the rate or timing of spikes. Peter Latham and I will debate the evidence for and against such coding schemes. I will demonstrate that the reliability and precision of spiking is not so easily disrupted by factors like background noise when spikes occur synchronously across sets of neurons; it is precisely this sort of correlated activity that is required to drive precise spike timing in the first place. Therefore, temporal coding is viable under noisy conditions (although Peter might disagree). Indeed, I will explain how spike initiation dynamics, which reflect the nonlinear interaction between different ion channels, are critical for understanding if and when a neuron (and its neighbors) will spike. Capitalizing on these biophysical nuances, the brain might increase its bandwidth by combining rate- and synchrony-based coding to achieve multiplexed coding of slow and fast signals.

Suggested Readings

Prescott SA, De Koninck Y, Sejnowski TJ. Biophysical basis for three distinct dynamical mechanisms of action potential initiation. PLoS Comput. Biol. 2008; 4: e1000198.

Ratté S, Hong S, De Schutter E, Prescott SA. Impact of neuronal properties on network coding: roles of spike initiation dynamics and robust synchrony transfer. Neuron. 2013 Jun 5;78(5):758-72.

 

Lecturer

Jackie Schiller

Title

Dendritic processing in cortical pyramidal neurons in-vitro and in-vivo.

Abstract

In the first part of my lecture I will review dendritic processing in pyramidal neurons, followed by a second part describing recent developments in dendritic processing in vivo. I will concentrate on neocortical neurons, including layer 5 neurons, for which we have a large database, as well as layer 2/3 and layer 4 neurons. I will describe how EPSPs are integrated in the different portions of the dendritic tree and the basic amplification mechanisms in those portions, and I will talk about plasticity mechanisms taking place at the different dendritic locations. Finally, I will describe recent work from my lab and from other labs on the role of active mechanisms in the processing of information in vivo.

 

Lecturer Yukio Nishimura
Title

Rewiring the damaged neural pathways via a computer interface

Abstract

Functional loss of limb control in individuals with spinal cord injury or stroke is attributed to interruption of the descending pathways to the spinal network, although the neural circuits located below and above the lesion retain most of their function. I will show that an artificial neural connection which bridges the supra-spinal system and the spinal network beyond the lesion site can restore lost function. The artificial connection is produced by a brain-computer interface that detects neural activity and converts it in real time into activity-contingent electrical stimuli delivered to the nervous system. A promising application is to bridge impaired biological connections, as demonstrated for cortically controlled electrical stimulation of a spinal site or muscles. Our results document that monkeys utilized the artificial connection instead of physiological connections. Recent work has shown that volitionally controlled walking in individuals with spinal cord lesions can be restored by muscle-controlled magnetic stimulation of the lumbar spinal cord. A second application of the artificial connection is to produce Hebbian synaptic plasticity through cortical spike-triggered stimulation of a related spinal site, which can strengthen physiological mono-synaptic connections between the motor cortex and the spinal cord. These results suggest that artificial neural connections can compensate for interrupted descending pathways. Furthermore, these paradigms have numerous potential applications, depending on the input signals, the computed transform, and the output targets.

 

Lecturer Netta Cohen
Title

Neural and neuromechanical computation in a physical world

Abstract

The neuroscience of motor control tends to focus on understanding how specialized neural circuits drive coordinated and rhythmic patterns of movement. By and large, this view neglects the physics of the body and sensory feedback from the environment. This talk will introduce a more holistic view of the neural control of motor behavior, illustrated through a study of crawling and swimming in C. elegans worms. I will begin by introducing principles and models of neural networks that drive rhythmic patterns of muscle activity. I will then build a model of C. elegans locomotion step by step, verifying and falsifying assumptions and model predictions in an iterative manner to demonstrate the process of model building and its challenges. I will conclude by comparing some of the principles unraveled in this nervous system with other invertebrate and vertebrate nervous systems.
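
As a toy illustration of a rhythm-generating circuit (a chain of phase oscillators with assumed parameters, not the lecturer's neuromechanical model), each segment below tries to lag its anterior neighbour by a fixed phase, producing the head-to-tail travelling wave of body bends characteristic of undulatory locomotion:

    # Illustrative sketch: a chain of phase oscillators producing a travelling wave.
    import numpy as np

    def oscillator_chain(n_seg=12, omega=2.0 * np.pi, k=5.0, lag=np.pi / 6,
                         dt=0.001, T=5.0):
        theta = np.zeros(n_seg)
        for _ in range(int(T / dt)):
            dtheta = np.full(n_seg, omega)
            # each segment is pulled toward (anterior phase - desired lag)
            dtheta[1:] += k * np.sin(theta[:-1] - theta[1:] - lag)
            theta += dt * dtheta
        return theta

    if __name__ == "__main__":
        theta = oscillator_chain()
        print("segment-to-segment phase lags (rad):", np.round(-np.diff(theta), 2))
        print("instantaneous curvature profile:", np.round(np.sin(theta), 2))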

Suggested Readings

Katz, Paul S. and Hooper, Scott L. (2007) "Invertebrate Central Pattern Generators". In "Invertebrate Neurobiology", (Eds. North, G and Greenspan RJ), pp. 251-279, DOI: 10.1101/087969819.49.251. Cold Spring Harbor, New York.

 (PDF also available online: http://cshmonographs.org/index.php/monographs/article/viewFile/4467/3601) 

Stefano Berri , Jordan H. Boyle , Manlio Tassieri , Ian A. Hope & Netta Cohen (2009) Forward locomotion of the nematode C. elegans is achieved through modulation of a single gait, HFSP Journal, 3:3, 186-193, DOI: 10.2976/1.3082260. (Full text available online: http://dx.doi.org/10.2976/1.3082260)

 

Lecturer Wako Yoshida
Title

Computational cognitive neuroimaging and social decision-making

Abstract

An increasing number of studies now use computational models of brain function to analyse fMRI data. In this approach, instead of simply modelling observed brain signals in terms of experimental factors, we try to explain the data in terms of quantities the brain must encode, under simplifying assumptions about how the brain works. Computational neuroimaging is particularly well suited to the study of higher-order cognition, social decision-making, and behavioural psychiatry. In my talk, I will first introduce the basic framework of computational (model-based) neuroimaging and then discuss some recent findings of social decision-making using these methods. 

Suggested Readings

Yoshida W, Seymour B, Friston KJ and Dolan RJ.  Neural mechanisms of belief inference during cooperative games. The Journal of Neuroscience. 30(32), 10744-10751, 2010. 

Yoshida W, Dolan RJ and Friston KJ. Game theory of mind. PLoS Computational Biology. 4(12), 2008.

 

Lecturer

Eugene Izhikevich

Title

Spikes

Abstract

Most communication in the brain is via spikes. While we understand the spike-generation mechanism of individual neurons, we fail to appreciate the spike-timing code and its role in neural computations.

The speaker starts with simple models of neuronal spiking and bursting, describes small neuronal circuits that learn spike-timing code via spike-timing dependent plasticity (STDP), and finishes with large-scale brain models.
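
For reference, the published "simple model" (Izhikevich 2003) is compact enough to implement in a few lines; the sketch below uses standard regular-spiking and chattering parameter sets.

    # The Izhikevich simple model: v' = 0.04*v^2 + 5*v + 140 - u + I, u' = a*(b*v - u),
    # with reset v <- c, u <- u + d whenever v reaches 30 mV. Forward Euler, time in ms.
    def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.25, T=1000.0):
        v, u, spike_times = -65.0, b * -65.0, []
        for k in range(int(T / dt)):
            v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
            u += dt * a * (b * v - u)
            if v >= 30.0:                        # spike cutoff and reset
                spike_times.append(k * dt)
                v, u = c, u + d
        return spike_times

    if __name__ == "__main__":
        rs = izhikevich()                        # regular-spiking parameters
        ch = izhikevich(c=-50.0, d=2.0)          # chattering (bursting) parameters
        print("regular-spiking spike count over 1 s:", len(rs))
        print("chattering spike count over 1 s:", len(ch))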

Suggested Readings

E.M. Izhikevich: Dynamical Systems in Neuroscience. The MIT Press (2007)

http://www.izhikevich.org/publications/dsn/index.htm

Izhikevich E.M. (2006) Polychronization: Computation With Spikes. Neural Computation, 18:245-282

http://www.izhikevich.org/publications/spnet.pdf

Szatmary B. and Izhikevich E. M. (2010) Spike-Timing Theory of Working Memory. PLoS Computational Biology, 6(8): e1000879

http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1000879