Program and Abstracts / OCNC2016

Links to the handy weekly calendar: [Week1] [Week2] [Week3] / Link to the SH floor map

All the lectures take place in the Seminar Room, OIST Seaside House unless otherwise indicated.

Faculty-Student Meeting Signup

  • Signup for each meeting will open according to the schedule in the signup sheet
  • Meetings with faculty take place in Meeting Room 1
  • The meeting with faculty on Thursday, June 23 takes place at the OIST Main Campus (Room C102)

 

Week 1 (Jun 13-19) : Methods

Monday, June 13

09:30-09:45   Greetings from the organizers

10:00-13:00   Parallel Sessions 

  1. For theoreticians: Yoko Yazaki-Sugiyama   Neuronal basis for information processing (Meeting Room 1)

  2. For biologists: Sungho Hong (on behalf of Kenji Doya)   Introduction to numerical methods for ordinary and partial differential equations

14:00-18:00   Student poster presentations (14:00-16:00 Group1 / 16:00-18:00 Group2)

19:00-21:00   Reception & Dinner

  

Tuesday, June 14

09:30-12:30   Michael Häusser   (tentative) Cable theory and Neuron Modeling

13:30-15:30   Meeting with Dr. Häusser

15:30-16:00   Introduction of the tutors

16:00-18:00   Tutorial: Python

 

Wednesday, June 15

09:30-12:30   Bernd Kuhn

                      1. Ion channel physiology and the Hodgkin-Huxley model of neuronal activity

                      2. Functional optical imaging

14:00-16:00   Tutorial: MATLAB (parallel sessions: basic/advanced)

16:00-18:00   Tutorial: Brian

 

Thursday, June 16

09:30-12:30   Erik De Schutter   Modeling biochemical reactions, diffusion and reaction-diffusion systems

14:00-16:00   Tutorial: NEST

16:00-18:00   Tutorial: NEURON

 

Friday, June 17

09:30-12:30   Tomoki Fukai    Neural and synaptic dynamics for memory

14:00-16:00   Meeting with Dr. Fukai  

16:00-18:00   Meeting with Dr. De Schutter

 

Saturday, June 18

09:30-12:30   Greg Stephens    TBA

14:00-16:00   Meeting with Dr. Stephens

16:00-18:00   Tutorial: STEPS/LFPy/Other topics

 

Sunday, June 19 (Day off)

 

 

Week 2 (Jun 20-26) : Neurons, Networks and Behavior I

Monday, June 20

09:30-12:30   John Rinzel   Nonlinear dynamics of neuronal systems 

14:00-16:00   Project work or Meeting with Dr. Yazaki-Sugiyama

16:00-18:00   Project work 

 

Tuesday, June 21

09:30-12:30   Kenji Doya    Introduction to reinforcement learning and Bayesian inference   

14:00-16:00   Project work or Meeting with Dr. Rinzel

16:00-18:00   Project work or Meeting with Dr. Kuhn (at OIST Main Campus)

 

Wednesday, June 22

09:30-12:30   Chris Eliasmith     Building a brain: From neurons to cognition

14:00-16:00   Project work

16:00-18:00   Project work

 

Thursday, June 23

09:30-12:30   Claudia Clopath     Modelling synaptic plasticity, learning and memory

13:40-16:00   Visit to OIST Main Campus / Poster session by OIST researchers

16:00-18:00   Meeting with Dr. Clopath or join OIST Tea Time

 

Friday, June 24

09:30-12:30  Optional Tutorial: Nengo by Dr. Eliasmith

14:00-16:00   Project work or Meeting with Dr. Eliasmith

16:00-18:00   Project work

 

Saturday, June 25

09:30-12:30   Shinji Nishimoto     Modeling brains under natural conditions

14:00-16:00   Project work 

 

Sunday, June 26 (Day off)

 

 

Week 3 (Jun 27-30) : Neurons, Networks and Behavior II

Monday, June 27

09:30-12:30   Stefan Mihalas  Modeling Networks of Populations of Neurons

14:00-16:00   Project work or Meeting with Dr. Nishimoto

16:00-18:00   Project work or Meeting with Dr. Doya

 

Tuesday, June 28

09:30-12:30   Etienne Koechlin   Prefrontal executive function and human adaptive behavior 

14:00-16:00   Project work or Meeting with Dr. Mihalas

16:00-18:00   Project work or Meeting with Dr. Koechlin

 

Wednesday, June 29

09:30-12:30   Partha Mitra  Graph theory and Neural Networks

14:00-16:00   Project work or Meeting with Dr. Mitra 

16:00-18:00   Project work or Meeting with Dr. Prinz

 

Thursday, June 30

09:30-12:30   Astrid Prinz  Ensemble modeling to investigate mechanisms of neuronal and network robustness

14:00-17:00   Student project presentations

19:00-21:00   Banquet & Dinner

 

Abstracts (in progress)

Monday Jun.13 

Lecturer Yoko Yazaki-Sugiyama
Title Neuronal basis for information processing
Abstract

We acquire visual information with the eyes, auditory information with the ears, olfactory information with the nose, and so on; this information is conveyed to the brain, where it is processed and transformed into what we recognize as a sense. The brain also generates and controls complicated behaviors, and is responsible for aspects of behavior such as feelings and abstract thought.

Neurons are the smallest components of the brain and, by wiring to each other, are the key players in the signal processing that accomplishes these difficult tasks.

In this lecture we will learn the basic physiological characteristics and mechanisms of neurons to see how such complicated tasks can be performed. We will also try to get an idea of how neurons can compute signals by connecting to each other in clever ways.

Suggested Readings The Neuron: Cell and Molecular Biology. I.B. Levitan and L.K. Kaczmarek, Oxford University Press

 

Lecturer Sungho Hong (on behalf of Kenji Doya)
Title Introduction to numerical methods for ordinary and partial differential equations
Abstract This tutorial introduces the basic concepts of differential equations and how to solve them, or simulate their behavior in time, using a computer. Key concepts such as eigenvalues and stability are explained while solving simple differential equations. Some examples of Hodgkin-Huxley-type neuron models and cable equations are also introduced.
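
For a concrete feel of the stability issue, here is a minimal Python/NumPy sketch (an illustrative example, not the tutorial's own code): forward Euler applied to a stable 2-D linear system converges for a small step, but grows without bound once the step size violates the eigenvalue stability condition |1 + dt*lambda| < 1.

import numpy as np

A = np.array([[-1.0,  2.0],
              [-2.0, -1.0]])        # eigenvalues -1 +/- 2i: a stable spiral
x0 = np.array([1.0, 0.0])

def euler(A, x0, dt, t_end):
    """Integrate dx/dt = A x with the forward Euler method."""
    x = x0.copy()
    for _ in range(int(t_end / dt)):
        x = x + dt * (A @ x)        # x_{n+1} = x_n + dt * f(x_n)
    return x

# Stable for dt = 0.01 (|1 + dt*lambda| < 1), divergent for dt = 0.5.
for dt in (0.01, 0.5):
    print(f"dt = {dt}: |x(10)| = {np.linalg.norm(euler(A, x0, dt, 10.0)):.3g}")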

 

Tuesday Jun.14

Lecturer Michael Häusser
Title (tentative) Cable theory and Neuron Modeling
Abstract TBA
Suggested Readings TBA

 

Wednesday Jun.15

Lecturer Bernd Kuhn
Title 1 Ion channel physiology and the Hodgkin-Huxley model of neuronal activity
Abstract 1  In my first lecture I will talk about electrical activity in neurons. I will start with the basics of ion channels, specifically focusing on voltage-gated channels and their dynamics in response to membrane voltage. Neurons use a combination of different voltage-gated channels to generate fast (about 1 ms), depolarizing action potentials. I will explain the first action potential model by Hodgkin and Huxley. Finally, I will discuss more recent additions to and fine-tuning of the time-honored Hodgkin-Huxley model.
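
As a minimal runnable illustration of the model covered in this lecture, here is a forward-Euler simulation of the classic Hodgkin-Huxley equations (a sketch with the standard squid-axon parameters and rate functions; not the lecture's own code):

import numpy as np

# Maximal conductances (mS/cm^2), reversal potentials (mV), capacitance (uF/cm^2)
g_Na, g_K, g_L = 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387
C_m = 1.0

# Voltage-dependent opening/closing rates of the m, h, n gating variables
def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, t_end, I_ext = 0.01, 50.0, 10.0    # ms, ms, uA/cm^2 of injected current
V, m, h, n = -65.0, 0.05, 0.6, 0.32    # approximate resting state
spikes = 0
for _ in range(int(t_end / dt)):
    I_ion = (g_Na * m**3 * h * (V - E_Na) + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    m += dt * (a_m(V) * (1.0 - m) - b_m(V) * m)
    h += dt * (a_h(V) * (1.0 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1.0 - n) - b_n(V) * n)
    V_new = V + dt * (I_ext - I_ion) / C_m
    if V <= 0.0 < V_new:               # upward crossing of 0 mV = one spike
        spikes += 1
    V = V_new

print(f"{spikes} spikes in {t_end:.0f} ms at I_ext = {I_ext} uA/cm^2")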

Title 2 Functional optical imaging
Abstract 2 Functional optical imaging has become one of the key techniques in neuroscience. In my second lecture I will introduce fluorescence and the most important imaging methods. I will explain what we can learn from them, but also discuss their limitations.
Suggested Readings

Johnston and Wu: Foundations of Cellular Neurophysiology, MIT Press

Helmchen, Konnerth: Imaging in Neuroscience, 2011

Yuste, Lanni, Konnerth: Imaging Neurons, 2000

 

Thursday Jun.16

Lecturer Erik De Schutter
Title Modeling biochemical reactions, diffusion and reaction-diffusion systems
Abstract In my first talk I will use calcium dynamics modeling as a way to introduce deterministic solution methods for reaction-diffusion systems. The talk covers exponentially decaying calcium pools, diffusion, calcium buffers and buffered diffusion, and calcium pumps and exchangers. I will describe properties of buffered diffusion systems and ways to characterize them experimentally. Finally I will compare the different modeling approaches.
In the second talk I will turn towards stochastic reaction-diffusion modeling. Two methods will be described: Gillespie's Stochastic Simulation algorithm extended to simulate diffusion, and particle-based methods. I will briefly describe the STEPS software. I will then describe two applications: stochastic reaction modeling of LTD induction in Purkinje cells and stochastic diffusion modeling of anomalous diffusion in spiny dendrites.
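For a concrete feel of Gillespie's algorithm, here is a minimal Python sketch for a single birth-death reaction (an illustrative toy; STEPS handles full reaction-diffusion systems in realistic geometries):

import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 10.0, 0.1        # birth rate (1/s) and per-molecule degradation rate (1/s)
A, t, t_end = 0, 0.0, 200.0

while t < t_end:
    a1, a2 = k1, k2 * A   # propensities of birth and degradation
    a0 = a1 + a2
    t += rng.exponential(1.0 / a0)   # exponentially distributed waiting time
    if rng.random() < a1 / a0:       # choose a reaction proportionally to propensity
        A += 1                       # birth: 0 -> A
    else:
        A -= 1                       # degradation: A -> 0

print(f"copy number A at t = {t_end:.0f} s: {A} (stationary mean k1/k2 = {k1/k2:.0f})")
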
Suggested Readings • U.S. Bhalla and S. Wils: Reaction-diffusion modeling. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)
• E. De Schutter: Modeling intracellular calcium dynamics. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)
• G. Antunes and E. De Schutter: A stochastic signaling network mediates the probabilistic induction of cerebellar long-term depression. Journal of Neuroscience 32: 9288–9300. (2012).
• F. Santamaria, S. Wils, E. De Schutter and G.J. Augustine: Anomalous diffusion in Purkinje cell dendrites caused by dendritic spines. Neuron 52: 635–648 (2006).
• Several chapters in Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston (2009).
• V. Steuber et al.: Cerebellar LTD and pattern recognition by Purkinje cells. Neuron 54: 121–136 (2007).

 

Friday Jun.17

Lecturer Tomoki Fukai
Title Neural and synaptic dynamics for memory
Abstract Information about episodes is processed by a cascade of neural networks in the hippocampus and neocortex in a highly dynamic manner. Though our knowledge of these neural systems is rapidly accumulating, many things remain unclear about their biological properties and functions. In my lecture, I will discuss the circuit mechanisms of memory processing, covering the following topics: non-random features of the hippocampal networks and their implications; pattern separation with neurogenesis; synaptic plasticity and cell assembly formation; preplay and replay in spatial navigation; and interactions between the hippocampus and neocortex. I will provide the necessary fundamentals of neural network models and machine learning in my lecture.
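
As a toy illustration of the cell-assembly idea, the rate-based Hebbian sketch below strengthens within-group weights when a group of units is repeatedly co-activated (an illustrative example; the models discussed in the lecture are spiking and far more detailed):

import numpy as np

rng = np.random.default_rng(1)
N, eta = 20, 0.05
W = np.zeros((N, N))                             # all-to-all weights, initially zero
groups = [np.arange(0, 10), np.arange(10, 20)]   # two candidate assemblies

for trial in range(200):
    r = 0.1 * rng.random(N)                  # weak background activity
    r[groups[trial % 2]] += 1.0              # strongly co-activate one group
    W += eta * np.outer(r, r)                # Hebbian update: dW ~ r_post * r_pre
    np.fill_diagonal(W, 0.0)                 # no self-connections
    np.clip(W, 0.0, 5.0, out=W)              # hard bound keeps weights finite

print(f"mean weight within group 1: {W[:10, :10].mean():.2f}, "
      f"between groups: {W[:10, 10:].mean():.2f}")
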
Suggested Readings

1. Yoshiyuki Omura, Milena M. Carvalho, Kaoru Inokuchi, and Tomoki Fukai (2015) A lognormal recurrent network model for burst generation during hippocampal sharp waves, J Neurosci. 35:14585–14601.

2. Hiratani N, Fukai T (2014) Interplay between short- and long-term plasticity in cell-assembly formation. PLoS One 9:e101535.

3. Teramae JN, Tsubo Y, Fukai T (2012) Optimal spike-based communication in excitable networks with strong-sparse and weak-dense links. Sci Rep 2:485.

4. Girardeau G, Zugaro M (2011) Hippocampal ripples and memory consolidation. Curr Opin Neurobiol. 21:452-459.

5. Yamamoto J, Suh J, Takeuchi D, Tonegawa S (2014) Successful execution of working memory linked to synchronized high-frequency gamma oscillations. Cell. 157:845-857.

 

Saturday Jun.18

Lecturer Greg Stephens
Title TBA
Abstract TBA
Suggested Readings TBA

 

Monday Jun.20

Lecturer John Rinzel
Title Nonlinear dynamics of neuronal systems
Abstract I will describe dynamical properties of neurons, synapses, and networks and use biophysically meaningful but idealized models to understand their nonlinear behaviors. I will introduce basic concepts and tools from dynamical systems theory to dissect the dynamical properties such as excitability, oscillations, and bistability. The mathematical approach will be geometrical rather than detailed analytic computation. I will illustrate the foundations with some case studies that include: HH-like models for action potentials and repetitive firing, Wilson-Cowan-like firing rate models for network rhythms and for perceptual/cognitive dynamics such as decision-making and perceptual grouping.
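
A classic minimal model for this kind of geometric dissection is the FitzHugh-Nagumo system, a two-variable caricature of Hodgkin-Huxley excitability. The sketch below (a common textbook parameter choice, not the lecture's code) integrates it with forward Euler and notes the nullclines used in phase-plane analysis:

import numpy as np

a, b, tau, I = 0.7, 0.8, 12.5, 0.5    # textbook parameters; I is the applied current
dt, t_end = 0.01, 200.0
v, w = -1.0, -0.5                     # voltage-like and recovery variables

v_trace = []
for _ in range(int(t_end / dt)):
    dv = v - v**3 / 3.0 - w + I       # fast variable (cubic nullcline)
    dw = (v + a - b * w) / tau        # slow recovery (linear nullcline)
    v, w = v + dt * dv, w + dt * dw
    v_trace.append(v)

# Phase-plane dissection: nullclines w = v - v^3/3 + I and w = (v + a)/b;
# their intersection and the flow around it determine excitability vs oscillation.
print(f"v range over the run: [{min(v_trace):.2f}, {max(v_trace):.2f}]")
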
Suggested Readings

Rinzel, J.; Ermentrout, G. Analysis of neural excitability and oscillations. Methods in neuronal modelling: from synapses to networks, 251–291. Second edition. Edited by C. Koch and I. Segev. MIT Press, Cambridge, Mass., 1998.  [PDF]

Borisyuk A, Rinzel J: Understanding neuronal dynamics by geometrical dissection of minimal models. In: Chow C, Gutkin B, Hansel D, Meunier C, Dalibard J, eds. Models and Methods in Neurophysics, Proc Les Houches Summer School 2003, (Session LXXX), Elsevier, 2005: 19-72. [PDF]

Huguet G, Rinzel J. Multistability in perceptual dynamics. In: Jaeger D., Jung R. (Ed.) Encyclopedia of Computational Neuroscience: SpringerReference (www.springerreference.com). Springer-Verlag Berlin Heidelberg, 2014. [PDF]

 

Tuesday Jun.21

Lecturer Kenji Doya
Title Introduction to reinforcement learning and Bayesian inference
Abstract The aim of this tutorial is to present the core theoretical frameworks for modeling animal and human action and perception. In the first half of the tutorial, we will focus on "reinforcement learning", a theoretical framework that allows an adaptive agent to learn behaviors from exploratory actions and the resulting reward or punishment. Reinforcement learning has played an essential role in understanding the neural circuits and neurochemical systems behind adaptive action learning, most notably the basal ganglia and the dopamine system. In the second half, we will familiarize ourselves with the framework of Bayesian inference, which is critical for understanding the process of perception from noisy, incomplete observations.
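
As a minimal taste of both frameworks, the sketch below (a toy example, not the tutorial's materials) runs tabular Q-learning on a five-state chain and finishes with a one-line Beta-Bernoulli posterior update:

import numpy as np

rng = np.random.default_rng(0)

# Toy chain MDP: states 0..4, actions 0 = left, 1 = right; reward 1 on reaching state 4.
n_states, n_actions = 5, 2
gamma, alpha, eps = 0.9, 0.1, 0.1
Q = np.zeros((n_states, n_actions))

def greedy(q):                        # argmax with random tie-breaking
    return int(rng.choice(np.flatnonzero(q == q.max())))

for episode in range(500):
    s = 0
    while s != n_states - 1:
        a = rng.integers(n_actions) if rng.random() < eps else greedy(Q[s])
        s_next = max(s - 1, 0) if a == 0 else s + 1
        r = 1.0 if s_next == n_states - 1 else 0.0
        delta = r + gamma * Q[s_next].max() - Q[s, a]   # TD error
        Q[s, a] += alpha * delta                        # value update
        s = s_next

print("greedy action in states 0-3:", np.argmax(Q[:-1], axis=1))  # expect [1 1 1 1]

# Bayesian inference in one line: with a uniform Beta(1, 1) prior on a coin's bias,
# the posterior after k heads in n flips is Beta(1 + k, 1 + n - k).
k, n = 7, 10
print("posterior mean of the coin's bias:", (1 + k) / (2 + n))
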
Suggested Readings

Doya K: Reinforcement learning: Computational theory and biological mechanisms. HFSP Journal, 1(1), 30-40 (2007)
Free on-line access: http://dx.doi.org/10.2976/1.2732246
 

Doya K, Ishii S: A probability primer. In Doya K, Ishii S, Pouget A, Rao RPN eds. Bayesian Brain: Probabilistic Approaches to Neural Coding, pp. 3-13. MIT Press (2007).
Free on-line access: http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=11106

 

Wednesday Jun.22

Lecturer Chris Eliasmith
Title Building a brain: From neurons to cognition
Abstract There has recently been an international surge of interest in building large brain models. The European Union's Human Brain Project (HBP) has received 1 billion euros worth of funding, and President Obama announced the BRAIN Initiative along with a similar level of funding. However, the large-scale models affiliated with both projects do not demonstrate how the complex neural activity they generate relates to observable behaviour -- arguably the central challenge for neuroscience. I will present our recent work on large-scale brain modeling that is focused on both biological realism and reproducing human behaviour. I will demonstrate how the model relates to both low-level neural data -- employing single neuron models as complex as those in the HBP -- and high-level behavioural data.
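
The Neural Engineering Framework underlying this work is implemented in the Nengo simulator (see the optional tutorial on Friday, June 24). Below is a minimal sketch, assuming the Nengo 2.x API: a spiking ensemble represents a sine wave, and a connection decodes its square from the spikes.

import numpy as np
import nengo

model = nengo.Network(label="square a sine wave")
with model:
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))   # 1 Hz sine input
    ens = nengo.Ensemble(n_neurons=100, dimensions=1)    # spiking LIF population
    nengo.Connection(stim, ens)
    out = nengo.Node(size_in=1)
    nengo.Connection(ens, out, function=lambda x: x ** 2)  # decode x^2 from spikes
    probe = nengo.Probe(out, synapse=0.01)               # filtered decoded output

with nengo.Simulator(model) as sim:                      # build and run for 1 s
    sim.run(1.0)
print("decoded x^2 near t = 1 s:", sim.data[probe][-3:].ravel())
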
Suggested Readings

Eliasmith, C., Stewart T. C., Choo X., Bekolay T., DeWolf T., Tang Y., Rasmussen, D. (2012). A large-scale model of the functioning brain. Science. Vol. 338 no. 6111 pp. 1202-1205. DOI: 10.1126/science.1225266 (Full text is accessible from: http://nengo.ca/publications/spaunsciencepaper)

Terrence C. Stewart and Chris Eliasmith. Large-scale synthesis of functional spiking neural circuits. Proceedings of the IEEE, 102(5):881-898, May 2014. doi:10.1109/JPROC.2014.2306061. (Full text is accessible from: http://compneuro.uwaterloo.ca/publications/Stewart2014.html)

Chris Eliasmith, Jan Gosmann, and Xuan-Feng Choo. Biospaun: a large-scale behaving brain model with complex neurons. ArXiv, 2016. (http://arxiv.org/abs/1602.05220)

 

Thursday Jun.23

Lecturer Claudia Clopath
Title Modelling synaptic plasticity, learning and memory 
Abstract During my lecture, we will cover different models of synaptic plasticity for the three types of learning schemes: supervised, unsupervised and reinforcement learning.
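
As one minimal example from the unsupervised family, the sketch below evaluates the classic pairwise STDP window, in which pre-before-post pairings potentiate and post-before-pre pairings depress (an illustrative toy with arbitrary amplitudes, not Dr. Clopath's voltage-based model):

import numpy as np

A_plus, A_minus = 0.010, 0.012       # LTP / LTD amplitudes (arbitrary values)
tau_plus, tau_minus = 20.0, 20.0     # decay time constants of the window (ms)

def dw(delta_t):
    """Weight change for one spike pair; delta_t = t_post - t_pre in ms."""
    if delta_t > 0:
        return A_plus * np.exp(-delta_t / tau_plus)     # pre before post: LTP
    return -A_minus * np.exp(delta_t / tau_minus)       # post before pre: LTD

for delta_t in (-40, -10, 10, 40):
    print(f"t_post - t_pre = {delta_t:+d} ms -> dw = {dw(delta_t):+.4f}")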

 

Saturday Jun.25

Lecturer Shinji Nishimoto
Title Modeling brains under natural conditions
Abstract Advances in measuring and analyzing large-scale bio-imaging data allow us to study brain activity evoked under complex, natural perceptual conditions. Here I present an encoding-model framework as a powerful tool to decipher the brain activity underlying our experiences. The framework aims to build predictive models of brain activity using high-dimensional representations of perceptual and cognitive features. We have used this framework to model data acquired in both neurophysiological and fMRI experiments. These studies have provided new insights into natural cortical representation, including spatiotemporal and semantic representation in the visual system and their attentional modulation. The framework can also be used to decode objective and subjective experiences from brain activity. It is quite general and has many potential applications for studying our perception and cognition.
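
The skeleton of such an encoding model can be sketched in a few lines: ridge regression from stimulus features to a response, scored by held-out prediction accuracy. The example below uses synthetic data standing in for the movie/fMRI features of the actual studies:

import numpy as np

rng = np.random.default_rng(0)
n_time, n_feat = 500, 50
X = rng.standard_normal((n_time, n_feat))            # stimulus feature time series
w_true = rng.standard_normal(n_feat)                 # hidden "tuning" of one voxel
y = X @ w_true + 2.0 * rng.standard_normal(n_time)   # noisy measured response

X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]
lam = 10.0                                           # ridge penalty
w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n_feat), X_tr.T @ y_tr)

r = np.corrcoef(X_te @ w, y_te)[0, 1]                # held-out prediction accuracy
print(f"prediction correlation on held-out data: r = {r:.2f}")
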
Suggested Readings

Gallant JL, Nishimoto S, Naselaris T, Wu MCK. System identification, encoding models and decoding models: a powerful new approach to fMRI research. In Kriegeskorte N and Kreiman G (eds.), Visual Population Codes (pp.163-188), Cambridge: MIT press. (2011)

Nishimoto S, Vu AT, Naselaris T, Benjamini Y, Yu B, Gallant JL. Reconstructing visual experiences from brain activity evoked by natural movies. Current Biology 21(19):1641-6. (2011)

Huth AG, Nishimoto, S, Vu AT, Gallant JL. A continuous semantic space describes the representation of thousands of object and action categories across the human brain. Neuron 76(6):1210-24. (2012)

Çukur T, Nishimoto S, Huth AG, Gallant JL. Attention during natural vision warps semantic representation across the human brain. Nature Neuroscience. 16(6):763-70. (2013)

Huth AG, de Heer WA, Griffiths TL, Theunissen FE, Gallant JL. Natural speech reveals the semantic maps that tile human cerebral cortex. Nature 532(7600):453-8. (2016)

 

Monday Jun.27

Lecturer Stefan Mihalas
Title Modeling Networks of Populations of Neurons
Abstract

One hypothesis about the neural code is that relevant information about a stimulus (e.g. the orientation of a bar) is represented by the activity of a set of relatively homogeneous populations of neurons (e.g. neurons with different orientation tunings). Even though such neurons might respond differently under different stimuli (e.g. they might have different color preferences), for a description of orientation tuning we can bundle them together.

Is it possible to simulate the response statistics of populations of neurons directly without simulating individual neurons? Yes, for fairly simple neuronal and synaptic models. To this end we developed DiPDE, a simulation platform for numerically solving the time evolution of coupled networks of neuronal populations. Instead of solving the sub-threshold dynamics of individual model leaky-integrate-and-fire (LIF) neurons, DiPDE models the voltage distribution of a population of neurons with a single population density equation. In this way, DiPDE can facilitate the fast exploration of mesoscale (population-level) network topologies, where large populations of neurons are treated as homogeneous with random fine-scale connectivity.

I will provide a series of examples of how this method can be used to study the activity of a small number of coupled populations, and how it can be used to explore the computational properties of a cortical column model. Finally, I will discuss the possibility of using these tools in combination with measured mesoscopic connectivity data.
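
The sketch below is not DiPDE (see the link under Suggested Readings) but a far cruder Wilson-Cowan-style mean-rate model of one excitatory and one inhibitory population; it illustrates the general idea of simulating populations rather than individual neurons (a classic textbook parameter set):

import numpy as np

def S(x, a, theta):                      # sigmoidal population response function
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0   # E->E, I->E, E->I, I->I couplings
a_e, th_e, a_i, th_i = 1.3, 4.0, 2.0, 3.7
tau, P = 8.0, 1.25                       # time constant (ms), external drive to E
dt, t_end = 0.05, 400.0
E, I = 0.1, 0.05

E_trace = []
for _ in range(int(t_end / dt)):
    dE = (-E + S(c1 * E - c2 * I + P, a_e, th_e)) / tau
    dI = (-I + S(c3 * E - c4 * I, a_i, th_i)) / tau
    E, I = E + dt * dE, I + dt * dI
    E_trace.append(E)

last = E_trace[-2000:]                   # last 100 ms of the E-population rate
print(f"E rate over the last 100 ms: min {min(last):.2f}, max {max(last):.2f}")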

Suggested Readings A good set of tutorials, examples and references is available at: http://alleninstitute.github.io/dipde/

 

Tuesday Jun.28

Lecturer Etienne Koechlin
Title Prefrontal executive function and human adaptive behavior
Abstract TBA
Suggested Readings TBA

 

Wednesday Jun.29

Lecturer Partha Mitra
Title Graph theory and Neural Networks
Abstract The circuit connectivity of neurons underlies the capabilities of brains, and understanding it is important for answering fundamental scientific questions, understanding brain disorders, and engineering intelligent machines. Multiple efforts are now under way to map neuronal connectivity at different scales. However, methods to explore, analyze and extract meaningful information from these data are still at an early stage. The lecture will provide a tutorial introduction to the mathematical subject of graph theory, keeping in mind the analysis of neural connectivity data as well as the artificial networks used in machine learning.
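
As a small taste of the graph-theory-meets-linear-algebra viewpoint, the toy sketch below uses the adjacency matrix of a four-node directed graph: degrees come from row/column sums, and entry (i, j) of the k-th matrix power counts the length-k walks from i to j (real connectomes are weighted and vastly larger):

import numpy as np

A = np.array([[0, 1, 1, 0],        # adjacency matrix: A[i, j] = 1 if i -> j
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]])

out_degree = A.sum(axis=1)         # projections sent by each node
in_degree = A.sum(axis=0)          # projections received by each node
walks3 = np.linalg.matrix_power(A, 3)

print("out-degrees:", out_degree)
print("in-degrees: ", in_degree)
print("number of 3-step walks 0 -> 0:", walks3[0, 0])   # the cycle 0 -> 2 -> 3 -> 0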

 

Thursday Jun.30

Lecturer Astrid Prinz
Title Ensemble modeling to investigate mechanisms of neuronal and network robustness
Abstract Recent experimental and theoretical evidence suggests that neurons and neuronal networks can generate stable and functional electrical output on the basis of variable cellular and synaptic properties. Based on these findings I will argue that computational neuroscientists should embrace rather than ignore neuronal parameter variability. I will introduce the concept of ensemble modeling, i.e. the approach of representing and studying neurons and networks not based on a unique computational model, but based on an ensemble of model variants that mimic the variability and diversity of natural neuron populations. I will discuss different parameter space exploration techniques that are being used to construct model ensembles and will describe analysis and visualization methods that can elucidate the high-dimensional structure of neuronal and network solution spaces.
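
As a toy version of such a parameter-space scan, the sketch below sweeps a two-parameter grid for a leaky integrate-and-fire neuron and records which combinations produce spiking at all (an illustrative stand-in for the conductance-based ensembles in the readings):

import numpy as np

def lif_rate(g_leak, I_ext, dt=0.1, t_end=1000.0,
             C=1.0, E_L=-65.0, V_th=-50.0, V_reset=-65.0):
    """Firing rate (Hz) of a leaky integrate-and-fire neuron under constant drive."""
    V, spikes = E_L, 0
    for _ in range(int(t_end / dt)):
        V += dt * (-g_leak * (V - E_L) + I_ext) / C
        if V >= V_th:
            V, spikes = V_reset, spikes + 1
    return spikes * 1000.0 / t_end      # convert count per t_end ms to Hz

g_vals = np.linspace(0.02, 0.12, 6)     # leak conductance grid
I_vals = np.linspace(0.2, 2.0, 6)       # input current grid
rates = np.array([[lif_rate(g, I) for I in I_vals] for g in g_vals])

# The "solution space" is a region of parameter space, not a single point:
# many different (g_leak, I_ext) combinations produce similar firing rates.
print((rates > 0).sum(), "of", rates.size, "parameter combinations spike")
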
Suggested Readings

1. Prinz AA, Billimoria CP, Marder E (2003). Alternative to hand-tuning conductance-based models: construction and analysis of databases of model neurons. J Neurophysiol 90: 3998-4015.

2. Prinz AA, Bucher D, Marder E (2004). Similar network activity from disparate circuit parameters. Nature Neurosci 7:1345-1352.

3. Taylor AL, Hickey TJ, Prinz AA, Marder E (2006). Structure and visualization of high-dimensional conductance spaces. J Neurophysiol 96(2): 891-905.

4. Hudson AE, Prinz AA (2010). Conductance ratios and cellular identity. PLoS Comp Biol 6(7): e1000838.

5. Prinz AA (2010). Computational approaches to neuronal network analysis. Philos T R Soc B 365(1551): 2397-2405.