OCNC2008


Okinawa Computational Neuroscience Course 2008

June 16 - July 4, 2008 in Okinawa, Japan

The 2008 course will run for three weeks, with ample time in the afternoons for student projects. New this year, the first week will be devoted entirely to methods, with introductory talks in the morning and software tutorials in the afternoon.
The sponsor will provide lodging and meals during the course and support travel for those without funding. We hope that this course will be a good opportunity for theoretical and experimental neuroscientists to meet each other and to explore the attractive nature and culture of Okinawa, the southernmost island prefecture of Japan.

Schedule

A list of lecture topics can be found in the Program section below.

  • June 16-21: Methods
  • June 23-28: Neurons, Networks and Behaviors I
  • June 30-July 3: Neurons, Networks and Behaviors II

Organizers

  • Erik De Schutter
  • Kenji Doya
  • Klaus Stiefel
  • Jeff Wickens

Lecturers

  • Arbuthnott, Gordon
  • Bell, Tony
  • Bhalla, Upi
  • Butera, Robert
  • Deneve, Sophie
  • De Schutter, Erik
  • Destexhe, Alain
  • Doya, Kenji
  • Fairhall, Adrienne
  • Gewaltig, Marc-Oliver
  • Häusser, Michael
  • Ishii, Shin
  • Koch, Christof
  • Li, Zhaoping
  • Longtin, André
  • Stiefel, Klaus
  • Tsodyks, Misha
  • Wang, Xiao-Jing
  • Wickens, Jeff

Tutors

  • Achard, Pablo
  • Cannon, Robert
  • Hong, Sungho
  • Kriener, Birgit
  • Tetzlaff, Tom
  • Toyoizumi, Taro
  • Yamazaki, Tadashi

Students

  • Biessmann, Felix
  • Dejean, Cyril
  • De Martino, Benedetto
  • FitzGerald, Thomas
  • Fontaine, Bertrand
  • Gjorgjieva, Julijana
  • Hopp, Elisabeth
  • Hsiao, Janet
  • Hirose, Satoshi
  • Hituri, Katri
  • Ikeda, Kaori
  • Jackson, Carl
  • Jarsky, Tim
  • John, Jessy
  • Kalantzis, Georgios
  • Kurashige, Hiroki
  • Kyle, Robert
  • Lazar, Leslee
  • Lindgren, Jussi
  • Mattioni, Michele
  • Ody, Chrystele
  • Ohmae, Shogo
  • Plotkin, Josh
  • Rahnev, Dobromir
  • Reid, Lee
  • Vossen, Christine
  • Wang, Xu-dong
  • Watanabe, Masayuki
  • Weihberger, Oliver
  • Yger, Pierre
  • Yun, Kyongsik



Program

Week 1 (Jun 16-21)

Monday, June 16

09:30-09:45: Introduction (De Schutter, Doya, Stiefel, Wickens)
09:45-12:45: Parallel track:

  • Biologists: Intro to ODE and PDE solving with MATLAB (Kenji Doya) / Seminar room
  • Theoreticians: Brain structure and functions (Gordon Arbuthnott) / Meeting Room 1

14:00-16:00: Student poster presentations I (Biessmann - Kalantzis)
16:00-18:00: Student poster presentations II (Kurashige - Yun)

Tuesday, June 17

09:30-12:30
    Parallel track:

  • Biologists: Dynamical systems methods as a tool for single neuron modeling and analysis (Robert Butera)
  • Theoreticians: Synaptic plasticity and behaviour (Jeff Wickens)

14:00-17:00
    Student project guidance

Wednesday, June 18

09:30-12:30
    Modeling channels and morphologically detailed neurons (Klaus Stiefel)
14:00-17:00
    NEURON tutorial (Klaus Stiefel)

Thursday, June 19

09:30-12:30
    Stochastic analysis of a population of neurons and synapses (Marc-Oliver Gewaltig)
14:00-17:00
    Simulating large neural networks with the Neural Simulation Tool NEST (Marc-Oliver Gewaltig)

Friday, June 20

09:30-12:30
    Modeling biochemical reactions, diffusion and reaction-diffusion systems (Erik De Schutter)
14:00-17:00
    PSICS (Robert Cannon) and STEPS (Stefan Wils) tutorials

Saturday, June 21

09:30-12:30
    Decision making in an uncertain world: tutorials on reinforcement learning and Bayesian inference (Shin Ishii)
14:00-18:00
    Student project time

Week 2 (Jun 23-28)

Monday, June 23

09:30-12:30
    Modeling signaling chemistry (Upi Bhalla)
14:00-17:30
    Student project time
17:30-18:30
    Modeling Extracellular Fields of Hippocampal and Cortical Neurons (Christof Koch)

Tuesday, June 24

09:30-12:30
    Visual attentional selection and the contribution by the primary visual cortex (Zhaoping Li)
14:00-15:00
    From Selective, Saliency-Driven Visual Attention to Object-Specific and Invariant Sparse Representations in the Medial Temporal Lobe (Christof Koch)
15:00-18:00
    Student project time

Wednesday, June 25

09:30-12:30
    Andre Longtin:

  • Stochastic firing: implications for sensory coding and plasticity
  • Case study: Neural coding in the electrosensory system

14:00-18:00
    Student project time

Thursday, June 26

09:30-12:30
    Neural Networks with Dynamic Synapses (Misha Tsodyks)
14:00-18:00
    Student project time

Friday, June 27

09:30-12:30
    Biophysics of adaptive neural coding (Adrienne Fairhall)
14:00-18:00
    Student project time

Saturday, June 28

09:30-12:30
    Integrative properties of neocortical neurons in high-conductance states (Alain Destexhe)
14:00-18:00
    Student project time

Week 3 (Jun 30-Jul 3)

Monday, June 30

09:30-12:30
    Cortical circuit mechanisms of decision making (Xiao-Jing Wang)
14:00-18:00
    Student project time

Tuesday, July 1

09:30-12:30
    Some ideas on unsupervised learning and multiscale order in biology (Tony Bell)
14:00-18:00
    Student project time

Wednesday, July 2

09:30-12:30
    Single neuron computation (Michael Häusser)
14:00-18:00
    Student project time

Thursday, July 3

09:30-12:30
    Neural basis of probabilistic computations and decision making (Sophie Deneve)
14:00-18:00
   Student presentations



Lecture Abstracts & Readings

Gordon Arbuthnott

Brain structure and function - an introduction for non-biologists

This session will have very modest aims. We should cover the entire neuroscience literature from the 1890s to today! We'll try to pick the good bits and to look at some of the assumptions that have become 'reasonable' as the subject has progressed.

It will have to be selective and so will not cover anything in detail but should at least introduce you to the main players in the field -- neurons, glial cells, synapses, channels. The basic machinery of brains. We will leave out the 'higher functions' to be dealt with in the afternoon but they need somehow to be a consequence of the properties that we will discuss in the morning.

We'll use a recent paperback, Brain Architecture (Swanson, 2003), as a way of sorting the various parts of the system into groups. Most of what I'll talk about consists of very fundamental ideas that are covered in standard neuroscience textbooks. I've listed the best of them below, but I haven't read any of them from cover to cover! You will only need one of them, and then only if you don't have access to a decent university library to go look at particular problems in them.

I hate textbooks -- heavy, expensive and out of date -- but they do let you catch up quickly on what is safe to assume as common knowledge. It may not be right, but it will be safe.

Readings

  • Swanson L.W.: Brain architecture: understanding the basic plan. Oxford University Press (2003)

Tony Bell

Some ideas on unsupervised learning and multiscale order in biology

The basis of machine learning and inference is the estimation of probability distributions. Can we connect this theory with the order observed in biology?

To do so we need to properly solve the problems of temporal density estimation, to understand the information flows across levels of a system and to make a density estimation theory that works in loops (i.e., in which representations partly 'create' the distributions they observe). If we do so, we may end up connecting 'energies' in physics, biology and learning theory. It's a lot to ask for, but I will make the case that we may be closer to understanding this than we think. The lecture will consist of a tutorial on ideas from probabilistic machine learning, and a review of information flows across levels in the nervous system, to lay the groundwork for the main arguments.

Readings

Upinder Bhalla

Modeling signaling chemistry

Chemical signaling systems perform a wide range of complex computations as part of neuronal functioning, synaptic plasticity, and cellular housekeeping.

While the underlying ODEs for chemical rate equations are relatively simple, there are a lot of ODEs. To set these up, we need to figure out the reactions, find the kinetic parameters, and manage all the data. I will give an overview of the process of signaling model building and parameterization. I will also discuss simulation tools and model analysis techniques, model representation, and hint at how one incorporates signaling models into multi-scale simulations of neurons.
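
To make concrete the point that the individual rate equations are simple, here is a minimal sketch (a toy illustration, not material from the lecture; the reaction and rate constants are invented) of a single reversible binding reaction A + B <-> AB written as mass-action ODEs and integrated with SciPy:

    # Minimal sketch of mass-action kinetics as ODEs (illustrative parameters only).
    # Reaction: A + B <-> AB with forward rate kf and backward rate kb.
    import numpy as np
    from scipy.integrate import solve_ivp

    kf, kb = 1.0, 0.1                     # rate constants (arbitrary units)

    def rates(t, y):
        A, B, AB = y
        v = kf * A * B - kb * AB          # net flux through the reaction
        return [-v, -v, v]                # dA/dt, dB/dt, dAB/dt

    sol = solve_ivp(rates, (0.0, 20.0), [1.0, 0.5, 0.0], max_step=0.01)
    print("final concentrations [A, B, AB]:", sol.y[:, -1])

A realistic signaling model is simply many such terms assembled together, which is why the bookkeeping of reactions and parameters, rather than the mathematics, dominates the work.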

Chemical memory switches

It is hard to store information reliably in any medium. The tiny squishy synaptic contacts between nerve cells seem a particularly unpromising medium to do so, yet these appear to store information for a lifetime. I will discuss how to store information using chemistry, and illustrate this with several reasonably established biological models, and some synthetic ones. Having described memory switches, I will point out three severe flaws with them: molecular turnover, diffusion, and stochasticity. I will discuss some possible mechanisms that may sustain memory despite these difficulties.

Readings

  • Tyson JJ, Chen KC, Novak B.: Sniffers, buzzers, toggles and blinkers: dynamics of regulatory and signaling pathways in the cell. Curr Opin Cell Biol. 2003 Apr;15(2):221-31
  • Bhalla US: Understanding complex signaling networks through models and metaphors. Prog Biophys Mol Biol. 2003 Jan;81(1):45-65.
  • Bower JM and Beeman D: The Book of GENESIS: Exploring Realistic Neural Models with the GEneral NEural SImulation System.

Robert Butera

Dynamical systems methods as a tool for single neuron modeling and analysis

In this talk, I will review basic concepts from dynamical systems theory as they apply to the development and analysis of neuron models.

Topic 1 (60 minutes): Introduction to State Space Dynamics.

We will describe what is meant by an autonomous dynamical system and how this description applies to a large class of single neuron models. Basic concepts for analyzing low-order dynamical systems will be described, including state space and phase plane analysis, flow fields, and nullclines. Tools for rapidly modeling such systems will also be demonstrated (especially XPP/AUTO). Finally we will introduce the ideas of stability of both equilibria and limit cycles and how such stability is determined. We will finish with the idea of topological equivalence - how various models are similar or dissimilar in a dynamical sense.
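
As a concrete illustration of nullclines and phase-plane analysis (an example of our choosing; the specific model and parameter values are not necessarily those used in the lecture), the two-variable FitzHugh-Nagumo model can be explored in a few lines of Python:

    # Phase-plane sketch for the FitzHugh-Nagumo model (illustrative parameters).
    # dv/dt = v - v^3/3 - w + I,  dw/dt = eps*(v + a - b*w)
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.integrate import solve_ivp

    a, b, eps, I = 0.7, 0.8, 0.08, 0.5

    def fhn(t, y):
        v, w = y
        return [v - v**3 / 3 - w + I, eps * (v + a - b * w)]

    v = np.linspace(-2.5, 2.5, 400)
    plt.plot(v, v - v**3 / 3 + I, label="v-nullcline")   # where dv/dt = 0
    plt.plot(v, (v + a) / b, label="w-nullcline")        # where dw/dt = 0

    sol = solve_ivp(fhn, (0, 200), [-1.0, 1.0], max_step=0.1)
    plt.plot(sol.y[0], sol.y[1], lw=0.8, label="trajectory")
    plt.xlabel("v"); plt.ylabel("w"); plt.legend(); plt.show()

The intersection of the two nullclines is the equilibrium; whether nearby trajectories spiral into it or escape onto a limit cycle is exactly the stability question introduced above.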

Topic 2 (60 minutes): Bifurcations and Canonical Models.

In this hour we will use some well-defined low-dimensional neuron models to illustrate that a relatively small number of bifurcations account for the transitions in the "landscape" (topology) of the state-space dynamics. This perspective on bifurcations complements electrophysiology-based classification schemes for neuron models. Various bifurcations from silence to spiking and from spiking to silence will be shown and studied in the phase space. This ultimately leads into the use of numerical bifurcation analysis as a tool for analyzing such models.

Topic 3 (60 minutes): Complex Models, Simple Models, Model Reduction, and Reality.

Real neurons are more complex than the models used so far, in terms of both their spatial extent and their complex repertoire of ion channels. What are such reduced models good for? And not so good for? Here we will demonstrate the types of electrophysiological phenomena that simple models are capable of reproducing, as well as their limits. We will also overview some general techniques for model reduction, motivated again by the concept of topological equivalence. If time permits, we will extend the previous topic to consider the dynamics of complex bursting models as well.

Readings

  • Rinzel, J. & Ermentrout, G. B.: Analysis of neural excitability and oscillations. In: Methods in Neuronal Modeling, eds. Koch, C. & Segev, I. MIT Press (1989), pp. 114-128.
  • Izhikevich, E.: Dynamical Systems in Neuroscience: The Geometry of Neural Excitability and Bursting. MIT Press, 2007.

Sophie Deneve

Neural basis of probabilistic computations and decision making

Understanding how human perception is translated into actions and how our experience forms our worldview has been one of the central questions of psychology, cognitive science and neuroscience. In particular, it is very hard to understand how our perceptions and strategies persist and/or change in the face of our continuous experience as active agents in an unpredictable and perpetually changing world. Theories of Bayesian inference and learning have recently been very successful in describing the behaviors of humans and animals, and particularly their perceptual and motor biases. Indeed, the tasks humans face in 'natural' situations (as opposed to simplistic laboratory settings) require the combination of multiple noisy and ambiguous sensory cues, as well as the use of prior knowledge, either innate or acquired from previous experiences. Such incomplete and imperfect sensory cues and prior knowledge can only provide probabilistic information (such as which object structures are more likely, or the probability of moving in a particular direction given the observed optic flow).

How the neural substrates perform these probabilistic inference tasks is still an open question. Arguably, one cannot deny that the brain performs efficient probabilistic computations. However, to do so it uses computational units that are slow (neurons have time constants of the order of several tens of milliseconds, as opposed to processors that are millions of times quicker) and unreliable (synaptic transmission in the cortex fails 20% of the time on average, and spike counts are very variable from trial to trial). Moreover, its code is not composed of continuous values (such as a probability) but of trains of discrete and rare events (spikes).

During this lecture we will first consider how neural structures provide precise estimates of sensory and motor variables in the presence of noise and ambiguity.

Population coding is a solution: many behaviorally relevant variables, such as movement direction, are represented by large groups of neurons with wide, overlapping tuning curves and very noisy responses. Interconnected populations of such neurons could 'clean up' the noise and converge to the most probable estimate of the variable, e.g. the most likely direction. When several population codes represent statistically related variables, for example the direction of motion of an object on the skin and on the retina, these networks could perform optimal cue integration and converge to the most precise multimodal estimate.
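
When each cue is corrupted by independent Gaussian noise, the 'most precise multimodal estimate' has a simple closed form: inverse-variance weighting. The numbers below are purely illustrative and not taken from the lecture:

    # Optimal (maximum-likelihood) combination of two independent Gaussian cues.
    import numpy as np

    mu1, sigma1 = 10.0, 2.0   # e.g. a visual estimate of a position (illustrative numbers)
    mu2, sigma2 = 14.0, 4.0   # e.g. a tactile estimate of the same position

    w1 = sigma2**2 / (sigma1**2 + sigma2**2)   # reliability-based weights
    w2 = sigma1**2 / (sigma1**2 + sigma2**2)
    mu_comb = w1 * mu1 + w2 * mu2
    sigma_comb = np.sqrt(1.0 / (1.0 / sigma1**2 + 1.0 / sigma2**2))

    print(f"combined estimate {mu_comb:.2f} +/- {sigma_comb:.2f}")
    # The combined standard deviation is always smaller than either single cue's.

A population code that weights its inputs by their reliability implements the same computation implicitly.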

Alternatively, rather than making decisions 'right away', single neurons and neural populations could be representing, implicitly or explicitly, probability distributions. We will review several alternative models for how this could be performed by neurons with various degrees of biophysical realism. First of all, population coding with noisy neurons can be interpreted as implicitly representing probabilities. More surprisingly, even single spiking neurons can be interpreted as computing probabilities over time. Networks of these 'Bayesian' neurons could combine these elementary 'building blocks', learning progressively more complex models of the world based on its statistical regularities.

Eventually, making behavioral decisions in the presence of uncertainty involves choosing a single option among all the 'probable' ones. It might involve convergence to an attractor or threshold crossing of competing integrators. Deciding 'when to decide' in a perpetually changing world, based on a continuous stream of noisy sensory inputs, is very problematic. We will review theories of, and some evidence for, alternative neural mechanisms for decision making.

Numerous challenges and questions remain: alternative hypotheses regarding the neural basis of probabilistic computation will need extensive experimental investigation and validation. The Bayesian framework has its own limitations, which are particularly visible when it is applied to real-world problems. Importantly, despite its current popularity, the Bayesian framework is not, strictly speaking, a theory of brain function. It is a mathematical tool, similar to information theory or dynamical systems theory, that can be useful to formalize and clarify existing theories of neural coding and computation and to generate new ones.

Readings

Basic readings:

  • Pouget A, Dayan P, Zemel RS: Inference and computation with population codes. Annu Rev Neurosci. 2003;26:381-410. Epub 2003 Apr 10. Review.

More advanced readings:

  • Rao RP: Bayesian computation in recurrent neural circuits. Neural Comput. 2004 Jan;16(1):1-38.
  • Ma WJ, Beck JM, Latham PE, Pouget A.: Bayesian inference with probabilistic population codes. Nature Neuroscience 9: 1432-1438 (2006).
  • Deneve S.: Bayesian spiking neurons I: inference. Neural Comput. 2008 Jan;20(1):91-117.

Erik De Schutter

Modeling biochemical reactions, diffusion and reaction-diffusion systems

In my first talk I will use different approaches to modeling calcium dynamics as a way to introduce deterministic methods for solving reaction-diffusion systems. In this talk I will cover exponentially decaying calcium pools, diffusion, calcium buffers and buffered diffusion, and calcium pumps and exchangers. I will describe properties of buffered diffusion systems and ways to characterize them. In my second talk I will turn towards stochastic reaction-diffusion modeling. I will first describe two methods to model chemical reactions, Gillespie's Stochastic Simulation Algorithm and detection of collisions using ray tracing, and how these can be extended to simulate diffusion. I will discuss some of the problems in generating correct descriptions of microscopic 3D geometries and briefly describe two software packages: MCell and STEPS. I will then describe an application of this modeling approach to simulating anomalous diffusion in spiny dendrites.
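
To give a flavour of the stochastic approach, here is a minimal Gillespie Stochastic Simulation Algorithm for a single reversible isomerization (a toy example with invented rate constants; real models add many species and diffusion, which is what packages such as MCell and STEPS handle):

    # Minimal Gillespie SSA for A <-> B (illustrative rate constants).
    import numpy as np

    rng = np.random.default_rng(0)
    k_ab, k_ba = 0.5, 0.2              # per-molecule rate constants
    A, B = 100, 0
    t, t_end = 0.0, 50.0

    while t < t_end:
        props = np.array([k_ab * A, k_ba * B])   # reaction propensities
        total = props.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)        # time to the next reaction
        if rng.random() < props[0] / total:      # pick which reaction fires
            A, B = A - 1, B + 1
        else:
            A, B = A + 1, B - 1

    print(f"t = {t:.1f}: A = {A}, B = {B}")      # fluctuates around the 2:5 equilibrium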

Readings

  • J.R. Stiles and T.M. Bartol: Monte Carlo methods for simulating realistic synaptic microphysiology using MCell. In: Computational Neuroscience: Realistic Modeling for Experimentalists (ed. E. De Schutter), CRC Press, pp. 87-127 (2000).
  • G. Bormann, F. Brosens and E. De Schutter: Diffusion. In: Computational Modeling of Genetic and Biochemical Networks (eds. J.M. Bower and H. Bolouri), MIT Press, pp. 189-224 (2001).
  • F. Santamaria, S. Wils, E. De Schutter and G.J. Augustine: Anomalous diffusion in Purkinje cell dendrites caused by dendritic spines. Neuron 52: 635-648 (2006).

Alain Destexhe

Integrative properties of neocortical neurons in high-conductance states

In awake animals, neurons of cerebral cortex are in a "high-conductance" (HC) state, characterized by sustained, irregular and very noisy spike discharges. Neurons have very special integrative properties during such states, in particular regarding the integration of excitatory and inhibitory inputs. Studying this complex integrative dynamics requires a tight association between in vivo, in vitro and computational techniques. HC states are measured intracellularly in vivo in anesthetized animals, and these measurements are then integrated into computational models to recreate such states numerically.

These models are then used in "dynamic-clamp" in vitro experiments, in which computational models interact directly with living neurons recorded intracellularly. This back-and-forth dynamics between techniques in vivo, in vitro and in computo allows us to recreate HC states in vitro and benefit from this preparation to reconstruct the transfer ("input-output") function of the neuron during HC states. This information is necessary to understand the dynamics of information processing during active states of cerebral cortex.
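
The fluctuating-conductance ("point-conductance") description used in this line of dynamic-clamp work can be sketched as two Ornstein-Uhlenbeck processes standing in for the total excitatory and inhibitory synaptic conductances, driving a single passive compartment. The sketch below is illustrative only; its parameter values are not those of any published fit:

    # Passive membrane driven by Ornstein-Uhlenbeck excitatory/inhibitory conductances
    # (a sketch of the point-conductance idea; parameters are illustrative).
    import numpy as np

    dt, T = 0.05, 1000.0                      # ms
    n = int(T / dt)
    C, gL, EL = 1.0, 0.05, -70.0              # arbitrary but consistent units
    Ee, Ei = 0.0, -75.0
    ge0, gi0, sige, sigi = 0.01, 0.03, 0.003, 0.008
    taue, taui = 2.7, 10.5                    # conductance correlation times (ms)

    rng = np.random.default_rng(1)
    v, ge, gi = EL, ge0, gi0
    vtrace = np.empty(n)
    for i in range(n):
        # Ornstein-Uhlenbeck updates (Euler-Maruyama)
        ge += dt * (ge0 - ge) / taue + sige * np.sqrt(2 * dt / taue) * rng.standard_normal()
        gi += dt * (gi0 - gi) / taui + sigi * np.sqrt(2 * dt / taui) * rng.standard_normal()
        ge, gi = max(ge, 0.0), max(gi, 0.0)   # conductances cannot be negative
        # passive membrane equation
        v += dt * (-gL * (v - EL) - ge * (v - Ee) - gi * (v - Ei)) / C
        vtrace[i] = v

    print("mean Vm = %.1f mV, std = %.2f mV" % (vtrace.mean(), vtrace.std()))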

Readings

  • Destexhe, A., Rudolph,M. and Paré, D.: The high-conductance state of neocortical neurons in vivo. Nature Reviews Neuroscience 4: 739-751, 2003.
  • Destexhe, A and Marder, E. : Plasticity in single neuron and circuit computations. Nature 431: 789-795, 2004.
  • Rudolph, M. and Destexhe, A.: An extended analytic expression for the membrane potential distribution of conductance-based synaptic noise. Neural Computation 17: 2301-2315, 2005.
  • Wolfart, J., Debay, D., LeMasson, G., Destexhe, A. and Bal, T.: Synaptic background activity controls spike transfer from thalamus to cortex. Nature Neurosci. 8: 1760-1767, 2005.
  • Bedard, C., Kroeger, H. and Destexhe, A. : Does the 1/f frequency-scaling of brain signals reflect self-organized critical states? Physical Review Letters 97: 118102, 2006.
  • Destexhe, A. and Contreras, D. : Neuronal computations with stochastic network states. Science 314: 85-90, 2006.
  • Rudolph, M., Pospischil, M., Timofeev, I. and Destexhe, A. : Inhibition determines membrane potential dynamics and controls action potential generation in awake and sleeping cat cortex. J. Neurosci. 27: 5280-5290, 2007.
  • Destexhe, A., Hughes, S.W., Rudolph, M. and Crunelli, V. : Are corticothalamic 'up' states fragments of wakefulness? Trends in Neurosciences 30: 334-342, 2007.
  • Destexhe, A.: High-conductance state. Scholarpedia 2(11): 1341 (2007)

Kenji Doya

Solving Differential Equations with MATLAB

In modeling the nervous system, differential equations are ubiquitous: Hodgkin-Huxley type equations for the cell membrane potential, chemical reaction equations for sub-cellular molecular cascades, and neural network equations for macroscopic function of the brain. The best way to intuitively understand what those differential equations mean is to solve them numerically using your own computer and see how the solution evolves while you change the input, the parameters, and the structure of the equations.

In this tutorial, I will first go through the very basics of linear and non-linear dynamical systems. Concepts like stability, limit cycles, and bifurcations will be explained while actually solving differential equations. There are many software tools available, but for this tutorial we will use MATLAB, which is one of the most popular languages for mathematical analysis and modeling. I will cover the basics of MATLAB, but I recommend that you install the program before the tutorial starts so that you can at least get

 >> 1+1
  ans =  2

We will provide MATLAB trial licenses to be downloaded before the course.

Download tutorial files here
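
For participants following along without a MATLAB license, a roughly equivalent warm-up can be done in Python with SciPy (a suggestion, not part of the official tutorial): integrate a leaky membrane equation and watch how the solution changes with the input current.

    # Leaky membrane equation tau*dV/dt = -(V - E) + R*I, solved numerically
    # (a Python/SciPy stand-in for the MATLAB exercises; parameters illustrative).
    import numpy as np
    from scipy.integrate import solve_ivp
    import matplotlib.pyplot as plt

    tau, E, R = 20.0, -70.0, 10.0       # ms, mV, MOhm

    def dVdt(t, V, I):
        return (-(V - E) + R * I) / tau

    for I in (0.5, 1.0, 2.0):           # nA: try different input currents
        sol = solve_ivp(dVdt, (0, 200), [E], args=(I,), max_step=0.5)
        plt.plot(sol.t, sol.y[0], label=f"I = {I} nA")

    plt.xlabel("time (ms)"); plt.ylabel("V (mV)"); plt.legend(); plt.show()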

Readings

Adrienne Fairhall

Biophysics of adaptive neural coding

I will discuss the computations and coding properties of single neurons from a number of perspectives. Automatic statistical methods can recover rich characterizations of the computation of a neuron or neural system in terms of a receptive field and threshold function. I will discuss the relationship between the features derived from such an analysis and the underlying biophysics. Further information about the slowly varying statistical properties of the stimulus is conveyed in the neuronal firing rate. I will show how different dynamical configurations lead to different classes of encoding of these statistics. Finally, representations in terms of either instantaneous or time-averaged firing rate adapt to changes in stimulus statistics. Certain aspects of these adaptive changes are due to intrinsic nonlinearities of neurons; we will derive, for certain simple cases, how the sampled neuronal receptive field properties depend on the stimulus statistics.
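
The "receptive field plus threshold function" characterization mentioned above is typically recovered by spike-triggered analysis. The following is a bare-bones spike-triggered average on synthetic data (entirely a toy, with an invented filter and nonlinearity):

    # Spike-triggered average (STA) on synthetic data: recover a linear filter
    # from a white-noise stimulus and the spikes of a toy LNP-like neuron.
    import numpy as np

    rng = np.random.default_rng(2)
    n, L = 200_000, 50                       # stimulus length, filter length (samples)
    stim = rng.standard_normal(n)
    true_filter = np.exp(-np.arange(L) / 10.0) * np.sin(np.arange(L) / 5.0)

    drive = np.convolve(stim, true_filter, mode="full")[:n]
    rate = np.maximum(drive - 1.0, 0.0) * 0.1        # rectifying nonlinearity
    spikes = rng.random(n) < rate                    # Bernoulli spiking

    spike_times = np.nonzero(spikes)[0]
    spike_times = spike_times[spike_times >= L]
    sta = np.mean([stim[t - L + 1 : t + 1] for t in spike_times], axis=0)[::-1]

    print("recovered filter correlates with the true one:",
          round(np.corrcoef(sta, true_filter)[0, 1], 3))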

Readings

  • Fairhall et al.: Efficiency and ambiguity in an adaptive neural code. Nature (2001)
  • Aguera y Arcas, Fairhall and Bialek: What does a single neuron compute? Neural Computation (2003)
  • Hong, Aguera y Arcas and Fairhall: From dynamical system to feature detector. Neural Computation (2007)

Marc-Oliver Gewaltig

Stochastic analysis of a population of neurons and synapses

Generally, the impact of a single neuron on a neuron it innervates is an order of magnitude smaller than what is needed to elicit a spike. Therefore, neurons need the power of a population to transmit signals. The Fokker-Planck equation, originally introduced to analyze Brownian motion, is a useful tool for analyzing how a population of neurons behaves. In my lecture, I will first introduce the Fokker-Planck equation describing a population of integrate-and-fire neurons and demonstrate that the equation can successfully describe the population phenomenon known as the synfire chain.

I will also explain that the Fokker-Planck equation is useful for analyzing a population of synapses instead of a population of neurons. Generally, a single synapse is thought to contribute little to information storage, and a robust memory is retained by a population of synapses. I will demonstrate that the Fokker-Planck equation successfully describes a population of synapses changing according to spike-timing-dependent plasticity.
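
The kind of ensemble this machinery describes can also be simulated directly: a population of leaky integrate-and-fire neurons driven by a common mean input plus independent Gaussian noise. The sketch below uses illustrative parameters and one common noise convention; the Fokker-Planck equation evolves the membrane-potential density of exactly this kind of population:

    # Population of leaky integrate-and-fire neurons with white-noise input
    # (Euler-Maruyama); illustrative parameters only.
    import numpy as np

    rng = np.random.default_rng(3)
    N, dt, T = 5000, 0.1, 500.0            # neurons, ms, ms
    tau, Vth, Vreset, mu, sigma = 20.0, 1.0, 0.0, 0.8, 0.5

    V = np.zeros(N)
    spike_count = 0
    for _ in range(int(T / dt)):
        V += dt * (-V + mu) / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal(N)
        fired = V >= Vth
        spike_count += fired.sum()
        V[fired] = Vreset                  # fire-and-reset rule

    rate = spike_count / (N * T / 1000.0)  # spikes per second per neuron
    print(f"population firing rate ~ {rate:.1f} Hz")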

Readings

Simulating large neural networks with the Neural Simulation Tool NEST.

NEST is a simulation environment for large heterogeneous networks of point-neuron models or neuron models with a small number of compartments. It supports spike-based as well as continuous (e.g. rate, current) interaction between the nodes of the network.

In this course, I will present NEST 2 with its new Python-based user interface, PyNEST. PyNEST makes NEST easy to learn and use. Python provides a large number of libraries for scientific computing (www.scipy.org), making it a powerful alternative to Matlab. Users can simulate, analyze, and visualize networks and simulation data in a single interactive Python session.

Other features of NEST 2 include support for synaptic plasticity, a wide range of model neurons, and parallel simulation on multi-processor (multi-core) computers as well as computer clusters. To customize NEST to their own purposes, users can add new neuron and synapse models, as well as new connection and analysis functions, by writing their own NEST modules in C++.

During the course, I will first give an introduction to NEST and its most important features, followed by a hands-on tutorial, where students can find out whether NEST is the right tool for their projects. For more information about NEST, visit our homepage at http://www.nest-initiative.org and see NEST's entry on Scholarpedia. Pre-releases of NEST 2 have already been used with great success and appreciation at the European Advanced Course in Computational Neuroscience 2007 and the FIAS Summer School 2007. NEST is released under an open source license for non-commercial use.
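
For a taste of what a PyNEST session looks like, here is a minimal example. It is written against a recent PyNEST release, so some names differ from the NEST 2 pre-releases described above (for instance, the spike recorder used to be called spike_detector), and the rate and weight values are arbitrary:

    # Minimal PyNEST session: Poisson-driven integrate-and-fire population.
    # Device names follow recent NEST releases and differ slightly in NEST 2.
    import nest

    nest.ResetKernel()
    neurons = nest.Create("iaf_psc_alpha", 100)
    noise = nest.Create("poisson_generator", params={"rate": 8000.0})
    recorder = nest.Create("spike_recorder")      # "spike_detector" in NEST 2.x

    nest.Connect(noise, neurons, syn_spec={"weight": 20.0})
    nest.Connect(neurons, recorder)
    nest.Simulate(1000.0)

    print("total spikes recorded:", recorder.get("n_events"))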

Readings

  1. http://www.nest-initiative.org
  2. http://www.scholarpedia.org/article/NEST
  3. Brette R et al.: Simulation of networks of spiking neurons: a review of tools and strategies. Journal of Computational Neuroscience 23: 349-398 (2007).

Michael Häusser

Single neuron computation

One of the central questions in neuroscience is how particular tasks, or computations, are implemented by neural networks to generate behaviour, and how patterns of activity are stored during learning. In the past, the prevailing view has been that information processing and storage in neural networks results mainly from properties of synapses and connectivity of neurons within the network. As a consequence, the contribution of single neurons to computation in the brain has long been underestimated.

I will describe recent work providing evidence that the dendritic processes of single neurons, which receive most of the synaptic input, display an extremely rich repertoire of behaviour, and actively integrate their synaptic inputs to define the input-output relation of the neuron. Moreover, the signalling mechanisms which have been discovered in dendrites have suggested new ways in which patterns of network activity could be stored and transmitted.

Readings

  • London M, Hausser M.: Dendritic computation. Annu Rev Neurosci. 28:503-32. (2005)
  • Hausser M, Mel B: Dendrites: bug or feature? Curr Opin Neurobiol. 13(3):372-83 (2003).
  • Stuart G, Spruston N & Hausser M (eds.): Dendrites. Oxford University Press (2008).

Shin Ishii

Decision making in an uncertain world: tutorials on reinforcement learning and Bayesian inference

Reinforcement learning (RL) is a machine learning framework that allows an agent to make appropriate decisions even in an unknown environment. Since Schultz and his colleagues found that the activities of dopaminergic neurons can be well explained by reinforcement learning theory, a number of studies have been done to examine the relationship between machine reinforcement learning and animal decision making. In particular, I am interested in human decision making in an uncertain environment. To perform better in an uncertain environment, resolution of uncertainty is definitely necessary. Bayesian inference provides a useful way of resolving uncertainty in an on-line manner. In this talk, I will first give a tutorial on reinforcement learning, and introduce some studies of its possible implementation in the brain. Next, I will provide another tutorial on Bayesian inference and Bayesian modeling of uncertainty resolution. In addition, I will present the results of our psychological experiment in which subjects were required to make sequential decisions in an uncertain maze environment. We applied Bayesian modeling to the behaviors and MRI-measured brain activity of the subjects, and the model-based regression analysis demonstrated that the prefrontal cortex is responsible for uncertainty resolution.
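
As a reference point for the reinforcement-learning tutorial, the core temporal-difference update of tabular Q-learning fits in a few lines. The chain task below is a generic textbook toy, not the maze task used in the experiment described above:

    # Tabular Q-learning on a toy 5-state chain: move left/right, reward at the right end.
    import numpy as np

    rng = np.random.default_rng(4)
    n_states, n_actions = 5, 2            # actions: 0 = left, 1 = right
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, epsilon = 0.1, 0.95, 0.1

    for episode in range(500):
        s = 0
        while s != n_states - 1:          # rightmost state is terminal (reward 1)
            a = rng.integers(n_actions) if rng.random() < epsilon else int(Q[s].argmax())
            s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
            r = 1.0 if s_next == n_states - 1 else 0.0
            # temporal-difference (Q-learning) update
            Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
            s = s_next

    print(np.round(Q, 2))                 # "go right" should dominate in every state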

Readings

  • Yoshida, W., Ishii, S.: Resolution of uncertainty in prefrontal cortex. Neuron, 50(5), 781-789 (2006).

Christof Koch

Modeling Extracellular Fields of Hippocampal and Cortical Neurons

Extracellular action potential ("spike") recordings are a primary means of gathering information about the representation of information in the brain, yet the exact physical basis of their generation remains contested. In joint work with Gyuri Buzsaki, we (Gold et al., 2006) use the Line Source Approximation method (Holt and Koch 1999) to model the extracellular action potential (EAP) voltage resulting from the spiking activity of individual neurons. We compare the simultaneous intracellular and extracellular recordings of CA1 pyramidal neurons recorded in vivo (Henze et al. 2000) with model predictions for the same cells reconstructed and simulated with compartmental models. The model accurately reproduces both the waveform and the amplitude of the EAPs. This suggests that accounting for the EAP waveform provides a considerable constraint on the overall model. The developed model explains how and why the waveform varies with electrode position relative to the recorded cell. Interestingly, each cell's dendritic morphology had very little impact on the EAP waveform. The model also demonstrates that the varied composition of ionic currents in different cells is reflected in the features of the EAP.

Together with Cyrille Girardin and Kevan Martin, we recorded spikes from anesthetized cat primary visual cortex under a standard protocol. We found that a minority of spikes had an inverted polarity, with a leading positive component and unusually large amplitudes of up to +1.5 mV. The highest amplitudes occurred in deep layers of cortex, particularly layer 5. Using standard cable theory and a purely resistive extracellular medium we can accurately describe standard extracellular spikes with a leading negative component. However, to model the high-amplitude positive spikes (HAPS), we need to assume that a cluster of nearby layer 5 pyramidal cells fire with sub-millisecond precision, with their action potentials being triggered in the dendrites. It is possible that HAPS occur in other neocortical regions and species as well and that they represent a general feature of cortical representation (Gold et al., 2008).
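
In its simplest (point-source) form, the forward model behind such simulations just sums each compartment's membrane current weighted by 1/(4*pi*sigma*r); the Line Source Approximation refines this by integrating along each cylindrical compartment. A numerical sketch of the point-source version, with invented currents and geometry:

    # Extracellular potential from point current sources in an infinite, homogeneous,
    # purely resistive medium: V(x) = sum_i I_i / (4*pi*sigma*r_i).
    # (Point-source simplification of the line-source approximation; toy numbers.)
    import numpy as np

    sigma = 0.3                                           # extracellular conductivity (S/m)
    # membrane currents (A) and positions (m) of a few compartments (illustrative)
    currents = np.array([-1e-9, 0.4e-9, 0.6e-9])          # sum to ~0 (charge conservation)
    positions = np.array([[0, 0, 0], [0, 0, 100e-6], [0, 0, 200e-6]], dtype=float)

    def extracellular_potential(electrode, currents, positions, sigma):
        r = np.linalg.norm(positions - electrode, axis=1)
        return np.sum(currents / (4 * np.pi * sigma * r))

    electrode = np.array([50e-6, 0.0, 0.0])               # 50 um from the "soma"
    print("V_e = %.1f uV" % (1e6 * extracellular_potential(electrode, currents, positions, sigma)))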

Readings

  • Henze DA, Borhegyi Z, Csicsvari J, Mamiya A, Harris K, and Buzsaki G.: Intracellular features predicted by extracellular recordings in the hippocampus in vivo. J Neurophysiology 83: 390-400, 2000.
  • Holt GR and Koch C.: Electrical interactions via the extracellular potential near cell bodies. J. Computational Neuroscience 6: 169-84, 1999.
  • Quian-Quiroga R, Reddy L, Kreiman G, Koch C and Fried I.: Invariant visual representation by single neurons in the human brain. Nature 435: 1102-7, 2005.

Zhaoping Li

Visual attentional selection and the contribution by the primary visual cortex

I will introduce two of the goals of early visual processing in human vision: efficient coding and information selection. I will then focus on the second goal, information selection, which in the research community is often referred to as visual attentional selection. There are two types of selection: one is pre-attentive, goal-independent, or bottom-up; the other is attentive, goal-directed, or top-down. Experimental data on human pre-attentive selection will be introduced, and I will then present the development, modeling, and experimental tests of the theory that the primary visual cortex computes from the visual input an explicit saliency map in order to direct attention to conspicuous locations.

Readings

  1. A bottom-up visual saliency in the primary visual cortex --- theory and its experimental test
  2. Understanding primary vision
  3. Theoretical understanding of early visual processes by data compression and data selection

Andre Longtin

Stochastic firing: implications for sensory coding and plasticity.

This talk will help you navigate the dynamics of neural systems at the boundary of determinism and randomness. The senses must process a vast amount of environmental information and package it in a form that is accessible to a variety of target neurons. The main challenges for deciphering the principles of this coding and decoding are the presence of multiple scales of time and space and the influence of plasticity. One advantage of working at the sensory periphery is that one has a better intuition as to the significance of the signals being processed at each stage, which can guide the analysis.

Case study: Neural coding in the electrosensory system.

This talk will present dynamical models for select combinations of space and time, inspired by experiments in electrosensory processing (a mix of the senses of touch and hearing). We will show two parallel schemes that enable the animal to simultaneously process high and low frequencies. One relies on "envelope" coding, and the other on synchronous afferent spikes -- both nonlinear phenomena. We will also discuss how spatial correlations of stimuli interact with feedback in the sensory pathway to modulate oscillation strength.

Readings

  • Benda, J., Longtin, A. and Maler, L. : A synchronization-desynchronization code for natural communication signals. Neuron 52, 347-58. (2006).
  • Middleton, J.W., Longtin, A., Benda, J. and Maler, L. : The cellular basis for parallel neural transmission of a high-frequency stimulus and its low-frequency envelope. Proc. Nat. Acad. Sci. (USA) 103, 14596-14601. (2006)
  • Doiron, B., Chacron, M.J., Maler, L., Longtin, A. and Bastian, J. : Inhibitory feedback required for network burst responses to communication but not to prey stimuli. Nature 421, 539-543. (2003).

Klaus Stiefel

Modeling channels and morphologically detailed neurons

Individual neurons by themselves can carry out amazingly powerful computations. In the first part of my lecture, I will present some of these computations. Examples will be the motion-direction-sensitive neurons in the lobula plate of the fly, coincidence detectors in the avian auditory brainstem, and cortical pyramidal cells and interneurons. I will outline how dendritic morphology and active conductances interact to produce the desired neural input-output transformations.

After motivating the study of individual neurons with these examples, I will present methods for constructing models of them in the second part of my lecture. First, I will cover cable theory, a method useful for the study of passive neurons. Then I will introduce the modeling of active conductances. Bringing these together, I will talk about multi-compartmental models of neurons with active dendrites. A number of examples will show the usefulness of such models for exploring dendritic function. Finally, I will discuss the modeling of stochastic processes in individual neurons.
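
For those who want a preview of the NEURON tutorial, a single-compartment Hodgkin-Huxley cell driven by a current pulse looks like this in NEURON's Python interface (a minimal sketch; the course tutorial itself may use NEURON's native environment instead):

    # Single-compartment Hodgkin-Huxley cell with a current pulse, using NEURON's
    # Python interface (minimal sketch; parameters are NEURON defaults plus a pulse).
    from neuron import h
    h.load_file("stdrun.hoc")            # brings in the standard run system

    soma = h.Section(name="soma")
    soma.L = soma.diam = 20              # um
    soma.insert("hh")                    # built-in Hodgkin-Huxley channels

    stim = h.IClamp(soma(0.5))
    stim.delay, stim.dur, stim.amp = 5, 1, 0.5   # ms, ms, nA

    v, t = h.Vector(), h.Vector()
    v.record(soma(0.5)._ref_v)
    t.record(h._ref_t)

    h.finitialize(-65)
    h.continuerun(40)                    # ms
    print("peak Vm = %.1f mV" % v.max())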

Readings

  • W. Rall: The theoretical foundation of dendritic function. Selected papers of Wilfrid Rall with commentaries, MIT Press. (1995)
  • D. Johnston, S.M. Wu: Foundations of Cellular Neurophysiology. MIT Press. (1994)
  • C. Koch, I. Segev (eds.): Methods in Neuronal Modeling - 2nd Edition: From Ions to Networks. MIT Press (1998).

Misha Tsodyks


Neural Networks with Dynamic Synapses

Synaptic transmission in the cortex is characterized by activity-dependent short-term plasticity (STP), which can be broadly classified as synaptic depression and synaptic facilitation. As recent experiments indicate, different cortical areas exhibit variable mixes of facilitation and depression, which are also specific to connections between different types of neurons. In the first half of my presentation, I will describe the basics of dynamic synaptic transmission, its biophysical underpinnings and the ways it can be captured in biophysically motivated phenomenological models. I will also discuss some immediate implications of STP for information transmission between ensembles of neocortical neurons.

In the second half of the presentation, I will focus on the effects of STP on the dynamics of recurrent networks and the resulting neural computation. I will introduce 'population spikes' (PSs), which are brief epochs of highly synchronized activity that emerge in recurrent networks with dominant synaptic depression between excitatory neurons. PSs can underlie some of the response properties of neurons in the auditory cortex. I will then describe the recently introduced idea that synaptic facilitation could be utilized to maintain information about incoming stimuli in the facilitation level of recurrent connections between the targeted neurons, thus providing an effective mechanism for short-term memory lasting several seconds after the termination of the stimulus.
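
The phenomenological description referred to above is usually some variant of the Tsodyks-Markram model, in which a resource variable x (depression) and a release-probability variable u (facilitation) are updated at each presynaptic spike. The sketch below uses one common variant and invented parameter values:

    # Tsodyks-Markram style short-term plasticity: resources x (depression) and
    # release probability u (facilitation), updated at each presynaptic spike.
    import numpy as np

    U, tau_rec, tau_facil = 0.2, 500.0, 1000.0   # ms; illustrative values
    A = 1.0                                      # absolute synaptic efficacy
    spike_times = np.arange(0.0, 500.0, 50.0)    # 20 Hz regular train

    x, u, last_t = 1.0, U, None
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)   # resources recover toward 1
            u = U + (u - U) * np.exp(-dt / tau_facil)     # facilitation decays toward U
        u = u + U * (1.0 - u)        # facilitation jump at the spike
        psc = A * u * x              # amplitude of the postsynaptic response
        x = x - u * x                # resources consumed by release
        print(f"t = {t:4.0f} ms  PSC = {psc:.3f}")
        last_t = t

Depending on the balance of tau_rec and tau_facil, the train of PSC amplitudes either runs down (depression-dominated) or grows before decaying (facilitation-dominated), which is the behaviour the lecture links to population spikes and working memory.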

Readings

  • A. Loebel, I. Nelken and M. Tsodyks.: Processing of sounds by population spikes in a model of primary auditory cortex. Frontiers in Neuroscience. 1:197-209 (2007).
  • G. Mongillo, O. Barak and M. Tsodyks.: Synaptic theory of working memory. Science. 2008 Mar 14; 319(5869):1543-6

Xiao-Jing Wang

Recurrent neural circuit theory of decision making and working memory

Decision making and working memory are cognitive functions that depend on 'higher-level' cortical areas, such as the prefrontal cortex and posterior parietal cortex. What circuit properties enable these cortical areas to subserve cognitive processes, in contrast to early processing in sensory areas? In my lecture, I will review relevant monkey neurophysiological experiments and human brain imaging studies that form the basis of an attractor network theory for working memory and decision making.

First, I will cover working memory, the brain's ability to actively maintain and manipulate information in the absence of external sensory stimulation. Many studies have shown that working memory is encoded and stored by self-sustained persistent neural activity patterns. I will discuss the idea that such mnemonic persistent activity is generated by recurrent neuronal dynamics in the cortex. Such strongly recurrent networks are prone to instability and sometimes require fine-tuning of parameters. I will examine various possible mechanisms for ensuring stable and robust working memory in the face of circuit perturbations or distracting sensory stimuli.

Second, I will show that such a recurrent network mechanism is also capable of decision computations. A decision is a deliberate process that involves accumulation of evidence for possible alternatives, ultimately leading to the commitment to a categorical choice. Recent physiological studies with behaving nonhuman primates have begun to uncover neural signals at the single-cell level that are correlated with specific aspects of the subject's decision computations. I will show that this model accounts for a range of observations from monkey experiments on perceptual decisions.

Third, I will discuss adaptive behavior of such a decision network endowed with reward-dependent synaptic plasticity. The model is applied to several 'neuroeconomic' types of choice behavior, again studied with behaving monkeys: foraging, competitive games, and probabilistic inference. Moreover, I will show that reward-dependent learning can also generate attractor states that, instead of storing sensory stimuli, encode abstract rules that are internally maintained to guide behavior. Interestingly, this model for the generation of rule attractor states, and for switching between abstract rules (for instance, when the rule currently in play no longer yields desirable outcomes), critically depends on the diversity of highly heterogeneous neuronal responses in the network.

In summary, this lecture will be focused on exploring 'cognitive-type' strongly recurrent cortical microcircuits (such as the prefrontal and parietal cortex), linking behavioral functions to the underlying neural dynamics and plasticity.
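
A drastically reduced caricature of the attractor picture, two rate units with self-excitation and mutual inhibition integrating noisy evidence, already shows the winner-take-all choice behaviour described above. This is a toy of our choosing, not the spiking network of Wang (2002); all parameters are invented:

    # Toy decision circuit: two rate units with self-excitation and mutual inhibition,
    # integrating noisy evidence for two alternatives (a caricature; illustrative values).
    import numpy as np

    rng = np.random.default_rng(5)
    dt, T, tau = 1.0, 2000.0, 100.0                  # ms
    w_self, w_inh = 1.2, 1.0
    coherence = 0.05                                 # weak bias in favour of alternative 1

    def f(x):                                        # saturating rate function in [0, 1]
        return 1.0 / (1.0 + np.exp(-(x - 0.3) / 0.1))

    r = np.array([0.1, 0.1])
    for _ in range(int(T / dt)):
        evidence = 0.3 * np.array([1 + coherence, 1 - coherence])
        noise = 0.05 * rng.standard_normal(2)
        drive = w_self * r - w_inh * r[::-1] + evidence + noise
        r += dt * (-r + f(drive)) / tau

    print("final rates:", np.round(r, 2), "-> chosen alternative:", int(np.argmax(r)) + 1)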

Readings

  • Wang X-J: Probabilistic decision making by slow reverberation in neocortical circuits. Neuron 36: 955-968. (2002)
  • Lo CC and Wang X-J: Cortico-basal ganglia circuit mechanism for a decision threshold in reaction time tasks. Nature Neurosci. 9: 956-963. (2006)
  • Soltani A and Wang X-J: A biophysically-based neural model of matching law behavior: melioration by stochastic synapses. J. Neurosci. 26: 3731-3744. (2006)
  • Wong K-F and Wang X-J: A recurrent network mechanism for time integration in perceptual decisions. J. Neurosci. 26: 1314-1328. (2006)
  • Fusi S, Asaad W, Miller EK and Wang X-J: A neural model of flexible sensori-motor mapping: learning and forgetting on multiple timescales. Neuron 54: 319-333. (2007)

Jeff Wickens

Synaptic plasticity and behaviour

In the lecture on synapses and synaptic plasticity, the classical idea of the synapse will be extended to include neuromodulatory actions of neurotransmitters. Quantitative neuroanatomy of synapses will be discussed, as it is one of the few clues we have about the connectivity of real neural networks. The experimental study of synaptic plasticity will also be reviewed. This has encouraged our speculations about mechanisms for learning and memory. So, what are the biological rules governing synaptic plasticity, and how important are the details, for example, of timing? Second messengers and dendritic spines will be reviewed, as they suggest mechanisms that may constrain the possible rules.

In the lecture on cognition and behaviour we will consider how these rules for synaptic plasticity may be engaged at synapses deep within the brain during ongoing behaviour. At the macroscopic level, the brain is composed of many entities -- large masses of grey matter and broad connecting tracts. We need to consider how these major entities interact to produce purposeful behaviour. I will give an overview of the anatomical organisation of the central nervous system, and a brief introduction to the structure of the cerebral cortex, basal ganglia and cerebellum. Then I will discuss regional specialization of function based on evidence from different types of experiments, from single unit recordings to lesion and behaviour studies.
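
One concrete formulation of the "how important is timing" question is the pair-based spike-timing-dependent plasticity rule, in which the sign and size of the weight change depend on the pre-post spike interval. The constants below are illustrative only; whether and where such a rule holds biologically is part of what the lecture discusses:

    # Pair-based STDP window: potentiation when the presynaptic spike precedes the
    # postsynaptic one, depression otherwise (generic rule, illustrative constants).
    import numpy as np

    A_plus, A_minus = 0.01, 0.012      # amplitudes of potentiation / depression
    tau_plus, tau_minus = 20.0, 20.0   # ms

    def stdp_dw(delta_t):
        """Weight change for delta_t = t_post - t_pre (ms)."""
        if delta_t >= 0:
            return A_plus * np.exp(-delta_t / tau_plus)
        return -A_minus * np.exp(delta_t / tau_minus)

    for dt in (-40, -10, -1, 1, 10, 40):
        print(f"t_post - t_pre = {dt:+4d} ms  ->  dw = {stdp_dw(dt):+.4f}")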

Readings

  • Kandel E.R., Schwartz J.H. and Jessell T.M.: Principles of Neural Science. Chapters on The Neurobiology of Behaviour, The Neural Basis of Cognition, and Learning and Memory.
  • Matsuzaki, M., Honkura, N., Ellis-Davies, G. C. & Kasai, H.: Structural basis of long-term potentiation in single dendritic spines. Nature 429, 761-6 (2004).
  • Reynolds, J. N. J., Hyland, B. I. & Wickens, J. R. : A cellular mechanism of reward-related learning. Nature 413, 67-70 (2001).
  • Squire, L. R. : Memory systems of the brain: a brief history and current perspective. Neurobiol Learn Mem 82, 171-7 (2004).


People 

Lecturers

  • Arbuthnott, Gordon
  • Bell, Tony
  • Bhalla, Upi
  • Butera, Robert
  • De Schutter, Erik
  • Deneve, Sophie
  • Destexhe, Alain
  • Doya, Kenji
  • Fairhall, Adrienne
  • Gewaltig, Marc-Oliver
  • Häusser, Michael
  • Ishii, Shin
  • Koch, Christof
  • Li, Zhaoping
  • Longtin, André
  • Stiefel, Klaus
  • Tsodyks, Misha
  • Wang, Xiao-Jing
  • Wickens, Jeff

Tutors

  • Achard, Pablo
  • Cannon, Robert
  • Hong, Sungho
  • Kriener, Birgit
  • Tetzlaff, Tom
  • Toyoizumi, Taro
  • Yamazaki, Tadashi

Students

  • Biessmann, Felix
  • Dejean, Cyril
  • De Martino, Benedetto
  • FitzGerald, Thomas
  • Fontaine, Bertrand
  • Gjorgjieva, Julijana
  • Hopp, Elisabeth
  • Hsiao, Janet
  • Hirose, Satoshi
  • Hituri, Katri
  • Ikeda, Kaori
  • Jackson, Carl
  • Jarsky, Tim
  • John, Jessy
  • Kalantzis, Georgios
  • Kurashige, Hiroki
  • Kyle, Robert
  • Lazar, Leslee
  • Lindgren, Jussi
  • Mattioni, Michele
  • Ody, Chrystele
  • Ohmae, Shogo
  • Plotkin, Josh
  • Rahnev, Dobromir
  • Reid, Lee
  • Vossen, Christine
  • Wang, Xu-dong
  • Watanabe, Masayuki
  • Weihberger, Oliver
  • Yger, Pierre
  • Yun, Kyongsik 

Pablo Achard

Affiliation: Postdoctoral Fellow, Marder lab, Brandeis University, USA
URL: http://www.pabloachard.eu
About:

Hi everybody.

By the age of 7, I decided to become a professional toy inventor. But that never happened. My rock-star career was aborted due to my lack of knowledge of any musical instrument. I was much too heavy to become a ballet dancer and much too light for sumo competitions. Therefore I studied (particle) physics and somehow managed to get a PhD working on data analysis of particle collisions in CERN detectors.

After deciphering the quantum world, I decided to decipher the brain. And I have to admit that I (partly) failed. But that allowed me to visit Marseille, where I studied rhythm generation and maturation; Antwerp, where I discovered homeostasis, parameter tuning and a couple of good beers; and Boston, where I continue to work on single cell and small network modeling, when not shoveling snow.

I'm delighted to come again to Okinawa and meet you all.

I am Pablo Achard and I approved this message (I hope that the US presidential campaign was broadcast enough to make this sentence understandable worldwide...)

Gordon Arbuthnott

Affiliation: OIST
About:

Gordon did a degree in physiology in the dim and distant past and since then he has worked on the nervous system. He is no mathematician (First Ordinary Maths at Aberdeen failed) but has always been interested -- failure does that for you. He has published with several mathematicians and enjoys the cut and thrust of debate. He doesn't mind being wrong so long as the evidence -- or the sums -- are convincing! Mind you, he can take some convincing...

Message to participants:

I'm really looking forward to talking to you all and beginning to form the kind of collaboration of ideas that is my real reason for teaching at all. You folks all know stuff that I can only vaguely grasp so if we can share even some of our experience we will all be richer.

Tony Bell

Affiliation: Redwood Center for Theoretical Neuroscience, UC Berkeley
URL: http://redwood.berkeley.edu/wiki/Tony_Bell
About:

Tony Bell is one of the founding members of the Redwood Center for Theoretical Neuroscience at the University of California at Berkeley. He is primarily known for his work on ICA. He comes from Northern Ireland, has a degree in Computer Science and Philosophy from the University of St Andrews in Scotland, a PhD in Artificial Intelligence from the Free University of Brussels, and has also worked many years at the Salk Institute with Terry Sejnowski.

Upinder Bhalla

Affiliation: National Centre for Biological Sciences, Bangalore
About:

I studied Physics at IIT Kanpur, India, and Cambridge University, UK, before taking the plunge into Biology for my PhD at Caltech. I am now at the National Centre for Biological Sciences, in Bangalore. I have done experiments on rats and on tissue cultures, and have worked in computational neuroscience and what is popularly called systems biology. I think these are all just labels for whatever approaches happen to work for studying the grand complexity of biology and the brain. I am currently interested in olfactory sensory processing and memory, from molecules to networks.

Message to participants:

You have the privilege and challenge to come into Neuroscience at a time when it is a matter of fact that machines can outcompute humans at many of the tasks that were once thought to be the hallmarks of intelligence. Over the coming weeks you will see how to use the same machines to ferret out the amazing but not impenetrable complexity of the brain. Somewhere in these details, and in the marriage of computational and experimental biology, are the seeds of an understanding of the brain. I believe this is the greatest adventure of our age. I welcome you to Okinawa to begin on this journey and meet some of the people who will share this challenge with you in the years to come.

Felix Biessmann

Affiliation: Research Assistant, Max-Planck Institute for Biological Cybernetics and TU Berlin, Dept. Machine Learning, Germany
URL: http://www.kyb.mpg.de/~fbiessma
About:

My name is Felix Biessmann, and I am working as a research assistant in a collaboration between the machine learning dept. at TU Berlin and the neurophysiology dept. of the Max-Planck Institute for Biological Cybernetics in Tuebingen. The aim of my project is to learn more about the mechanisms of neurovascular coupling using functional magnetic resonance imaging in combination with electrophysiological recordings and neurochemical measurements. In particular, we are investigating the effect of neuromodulators such as acetylcholine on both the hemodynamic response and electrophysiology. Other than that I am interested in neuroinformatics and machine learning.

Robert Butera

Affiliation: Georgia Institute of Technology, Atlanta, GA USA
About:

Robert Butera is an Associate Professor at the Georgia Institute of Technology (Atlanta, GA, USA). His current research interests include the neural basis of respiration, dynamics of bursting neurons, intrinsic and synaptic mechanisms for synchronization, and conduction properties of peripheral nerve and their block by high frequency stimulation. His approaches include single neuron models, complex neuron models, electrophysiology experiments, and the use of the dynamic clamp. Robert believes strongly in the value of complementary skills and perspectives, and as a result has multiple collaborative funded projects with both modelers and experimentalists.

Robert received his PhD in Electrical Engineering from Rice University (Houston, TX, USA) in 1996, where his PhD work focused on the modeling and analysis of R15 (a bursting neuron in Aplysia) and its neuromodulation. From 1996-1999 he was a postdoctoral fellow at the National Institutes of Health (Bethesda, MD) working jointly with John Rinzel and Jeff Smith modeling brainstem neurons involved in respiratory rhythm generation. Since 1999 he has been on the faculty at Georgia Tech.

Message to participants:

The intent of this talk is to make sure we are all conversant in the basic concepts used to describe and analyze dynamical systems. I would rather cover less material in more depth, so questions are encouraged and this outline is flexible. My lectures will be a mixture of simulation demonstrations and occasional figures from references where necessary. I have taught courses on this material, as well as electrophysiological modeling, to bioengineering and neuroscience students for 9 years at Georgia Tech, so I am used to a wide range of questions and perspectives on this material. So while my talk may seem mathematical, I am not a mathematician myself and am more of an "end user" of these methods when they are helpful for my computational neuroscience projects.

Robert Cannon

Affiliation: Textensor Limited, Edinburgh, UK
About:

My background is theoretical physics followed by software engineering, neuroscience and a little hardware design. I am particularly interested in applications where innovative use of models and software opens up new experimental possibilities, such as automated model validation, and in the space between bottom-up modeling and machine learning.

Specifically, efforts to design computational systems that use physiologically plausible components to perform behavioral tasks have several interesting outcomes. They help to understand real networks by attaching functional roles to physiological phenomena; they can generate hypotheses about what structures may be needed for particular behaviors; and they lead into novel software architectures for robust parallel systems.

Recently I have been working on new software for efficiently computing the behavior of neurons with stochastic ion channels http://www.psics.org. This will be available to any students wishing to study such models.

Cyril Dejean

Affiliation: Postdoctoral Fellow, Department of Anatomy and Structural Biology, University of Otago, Dunedin, New Zealand.
About:

My research focuses on basal ganglia physiology as well as pathology. More generally, I'm interested in systems neuroscience. So far I have been conducting mainly electrophysiology experiments and I'd like to add a computational flavor to my biologist background. I really hope the course will help me with that. I'm very excited by the approaching OCNC and I'm looking forward to meeting you all.

Benedetto De Martino

Affiliation: Postdoctoral fellow, California Institute of Technology
URL: http://www.hss.caltech.edu/ss/faculty/bmartino
About:

Hello, my name is Benedetto and I am doing my post-doc at Caltech. Although I am often a very indecisive person, I study how people make choices. Making decisions and generating preferences among choice options is one of the most important aspects of human behavior, often seen as reflecting free will. Economic 'choice theory' assumes people have a stable representation of the value of each option (decision utility), and use logical rules in choosing between them. Much empirical data challenge this view. My research question is to understand how the brain constructs preferences and how our emotional state and cognitive limitations dynamically shape this process. Looking forward to meeting all of you in Okinawa.


Sophie Deneve

Affiliation: Group for Neural Theory, Ecole Normale Supérieure (ENS)
URL: http://www.gnt.ens.fr/
About:

Dr Sophie Deneve studied neuroscience and mathematics at the Ecole Normale Superieure in Paris. She obtained her PhD in Brain and Cognitive Sciences at the University of Rochester in 2003, in the laboratory of Alex Pouget. After a postdoc at the Gatsby Computational Neuroscience Unit, London, she became an assistant professor at the Institute of Cognitive Science in Lyon in 2004. In 2006, she moved to Paris to lead the Group for Neural Theory, an EU-funded team of researchers in the field of computational neuroscience. Her main research interests include investigating the neural basis of probabilistic inference and learning.


Erik De Schutter

Affiliation: Computational Neuroscience Unit, OIST
URL:  
About:

Welcome to Okinawa! I moved to this beautiful island a bit over a year ago. My first visit to Okinawa was as faculty of OCNC2006 and I immediately liked the place and the beautiful views of the East China Sea from Seaside House. So it should come as no surprise that my lab space is now in the Seaside House, where the course takes place. You already know my secretary Tsuyuki and during these 3 course weeks the lab will hold open house and welcome all students and faculty to visit and use our facilities (but don't move books or equipment out of the lab please).

I have been teaching for more than 10 years at European CNS summer schools and was part of the last two OCNCs. It is always exciting to meet the diverse groups of highly motivated young scientists attending our courses. These courses have an important function in teaching computational methods and approaches, and in establishing social networks among the participants. Ideally every neuroscientist, including experimentalists and clinicians, should attend a CNS course because computational methods have become essential tools to understand the complex systems we are studying.

There is a broad range of modeling approaches available. I have specialized in bottom-up methods that are very accessible to experimentalists as they are mainly parameter driven. This includes large compartmental models of neurons with active dendrites, networks with realistic connectivity using conductance based neuron models and reaction-diffusion models of molecular interactions. I will focus on the latter during my methods presentation, but please feel free to ask me or my collaborators about our other work!

Alain Destexhe

Affiliation: Centre National de la Recherche Scientifique
About:

Alain Destexhe is Research Director at CNRS, and his research interests lie at the interface between physics (dynamical systems) and neuroscience (electrophysiology). Applying concepts from dynamical systems and complex systems to analyze neuronal activity, as well as designing models of single neurons and neuronal networks, is at the basis of his research. His research team at CNRS is mostly composed of physicists who work very closely with biologists and experimental data, as exemplified by the dynamic-clamp technique, which puts theoretical models in direct interaction with living neurons (in collaboration with Thierry Bal at UNIC). This combination of disciplines is necessary to understand neuronal operations in complex dynamical states such as in the awake brain. Alain Destexhe is also Editor-in-Chief of the Journal of Computational Neuroscience, and he is actively involved in the organization of conferences and summer schools in the field.

Message to participants:

I'd be happy to help with any question or with realizing projects; if needed you can email me at Destexhe@iaf.cnrs-gif.fr

Kenji Doya

Affiliation: OIST
URL:  
About:

Kenji Doya received his BS in 1984, MS in 1986, and PhD in 1991 from the University of Tokyo. He became a research associate at the University of Tokyo in 1986, at U.C. San Diego in 1991, and at the Salk Institute in 1993. He joined ATR in 1994 and is currently the head of the Computational Neurobiology Department, ATR Computational Neuroscience Laboratories. In 2004, he was appointed as a Principal Investigator of the Initial Research Project, Okinawa Institute of Science and Technology. He is interested in understanding the functions of the basal ganglia and neuromodulators based on the theory of reinforcement learning.

Message to participants:

Welcome to OCNC 2008! The last two courses were hit by typhoons toward the end, so this year we shifted the course one week earlier. Let us see how typhoons like computational neuroscience. I hope that what you experience here in Okinawa, not only the lectures and the projects but also the variety of people at the course and the nature and culture of this fine island, will be among the most exciting and valuable. I myself have been benefiting a lot from these courses and am looking forward to seeing what I can learn this time.

Adrienne Fairhall

Affiliation: Department of Physiology and Biophysics, University of Washington, Seattle, WA, USA
About:

I obtained my BSc (Hons) in Theoretical Physics from the Australian National University and my PhD in physics from the Weizmann Institute of Science in Israel. After that I did my postdoctoral work on adaptive coding with Bill Bialek at NEC Research Institute and with Michael J. Berry at Princeton University. During my PhD and postdoctoral years I was lucky to have the opportunity to spend a little time at the Ecole Normale Superieure in Paris and in the Cognitive Neuroscience Sector at SISSA, in Trieste. Finally I joined the Department of Physiology and Biophysics at the University of Washington, Seattle as a faculty member in 2004. I have visited Okinawa once before for this school and have great memories of the experience -- nothing better than talking science in fabulous locations. I'm looking forward to meeting you all there!

Thomas FitzGerald

Affiliation: PhD Student, Institute of Psychiatry, King's College London

Functional Imaging Laboratory, University College London

About:

My original background is in philosophy, followed by a flirtation with medicine, and I now work jointly in the Clinical Neurosciences department at the Institute of Psychiatry, KCL, and the Emotion and Cognition group at the Functional Imaging Laboratory, UCL.

I use fMRI together with single-unit and local field data recorded from electrodes implanted in the brains of epilepsy patients to explore the neural processes underlying emotion and decision-making. Currently my main focus is on using time series analysis, particularly measures of synchronisation and effective connectivity, to look at how task performance is mediated by interactions between spatially distant regions. I want to use computational models to explore the neuronal bases of the changing connectivity patterns I observe.

Bertrand Fontaine

Affiliation: Active Perception Lab, University of Antwerp, Belgium
URL: http://www.ua.ac.be/apl
About:

My name is Bertrand. I'm a PhD student in the Computer Science department of the University of Antwerp, Belgium. My background is in signal processing engineering. I'm interested in the bat auditory system, in particular the pathways that process cues for sound localization. I'm trying to build a biologically plausible model of the sub-cortical auditory system and implement it on our bat-head robot to perform echolocation.

Besides that, I'm involved with music. I play different instruments and have a home studio where I record and mix my own songs. I'm also interested in languages: I speak four fluently and am learning more, but Japanese seems too hard to pick up in such a short time.

Marc-Oliver Gewaltig

Affiliation: Honda Research Institute Europe
Message to participants:

This will be my first time at the OCNC and I am looking forward to meeting you in a beautiful and stimulating environment. In the last six years I have given many lectures and tutorials on NEST, first at the University of Freiburg and later at the Advanced Courses in Computational Neuroscience in Obidos and Arcachon. Each lecture and each tutorial is an exciting and stimulating experience for me, because each student has a unique approach, a unique topic, and a unique point of view. Every time I learn something new, and it is your feedback that will improve NEST.

Over the years I have shifted my focus from a very technical perspective on neural simulation towards the question of how neural systems and simulated experiments are best described and expressed. How can we formulate neural simulations in a way that both researchers and computers understand?

In my lecture I will introduce the main features of NEST. More important than the lecture will be the hands-on tutorial in which you have the opportunity to try and play with NEST. But I also encourage you to seek my assistance outside the lecture and tutorial.

Julijana Gjorgjieva

Affiliation: PhD student, University of Cambridge
About:

I am a first-year PhD student at the Department of Applied Mathematics and Theoretical Physics at Cambridge, developing computational models of retinotopic map formation. I come from Macedonia, but I did my undergraduate degree in Mathematics at Harvey Mudd College in California. My PhD work provides an interface between experiment and theory, because I work with real data on spontaneous retinal activity and try to explain its role in topographic map formation using computational modelling. At OCNC I would like to learn about theoretical techniques that can be applied to my model, to understand how computationally explored parameters can be derived analytically.

I am very excited about meeting everyone! I have heard amazing things about Okinawa and this course, and I am looking forward to experiencing them myself.

Michael Häusser

Affiliation: Wolfson Institute for Biomedical Research, University College London
URL: http://www.ucl.ac.uk/wibr/research/neuro/mh/index.htm
About:

Born in Canada to German parents. PhD at Oxford with Julian Jack. Postdoc with Bert Sakmann in Heidelberg and then with Philippe Ascher in Paris. Started my own independent lab at University College London in 1997 with the support of a fellowship from the Wellcome Trust. Currently Professor of Neuroscience and Wellcome Trust Senior Research Fellow at UCL.

Satoshi Hirose

Affiliation: Graduate Student, Graduate School of Human and Environmental Studies, Kyoto University, Japan
URL: http://www.mmatsumura.neuro.jinkan.kyoto-u.ac.jp/
About:

Hi, I'm Satoshi Hirose. I'm working on human motor control, using fMRI and psychophysical techniques. I hope to be a friend to you all!

Katri Hituri

Affiliation: Ph.D. student, Computational Neuroscience and Computational Systems Biology research groups, Tampere University of Technology, Finland
About:

Hello, my name is Katri and I'm a first-year Ph.D. student at Tampere University of Technology, Finland. My research concerns modeling of intracellular signal transduction events related to LTD in cerebellar Purkinje cells. For now, I have been concentrating on models of the IP3 receptor. In our research group we have a special interest in using and developing stochastic methods for modeling and simulation. I visited Okinawa this spring, so if you have some questions don't hesitate to ask. I look forward to meeting you all! PS. Don't forget to bring your snorkeling gear!

Sungho Hong

Affiliation: Postdoctoral Fellow, Computational Neuroscience Unit, OIST
URL: http://www.irp.oist.jp/cns/
About:

After finishing my degree in theoretical physics, I joined Adrienne Fairhall's lab at the University of Washington, where I worked on problems in single-neuron computation. Since last year, I have been working on the cerebellar Purkinje neuron in Erik De Schutter's group at OIST. My primary interest is to understand how the computational function of a single neuron is determined: how can we quantitatively establish a mapping between a neuron's computational properties and their physical basis, and how would such a map be modified by changing biophysical parameters? I am also interested in extending these questions beyond the single neuron. I was an OCNC student myself in 2006, which influenced my decision to move to Okinawa. The course is a wonderful opportunity to learn a lot, make new friends, and enjoy the beautiful nature of the South Pacific. I hope you have a great time at OCNC 2008!

Elisabeth Hopp

Affiliation: Diploma student, Humboldt University, Berlin, Germany
About:

Hi! My name is Elisabeth Hopp, I am a Diploma student at the Institute for Theoretical Biology at Humboldt University in Berlin, Germany. Currently I am particularly interested in perceptual decision making, how it can be described in mathematical terms, and how these computations can be carried out on a neuronal level. I am very much looking forward to exchanging ideas and enjoying the summer with you all in Okinawa!

Janet Hsiao

Affiliation: Postdoctoral Fellow, University of California San Diego, USA
URL: http://www.cse.ucsd.edu/~jhsiao/
About:

I am currently a postdoctoral researcher at UC San Diego, and affiliated with the Temporal Dynamics of Learning Center and the Perceptual Expertise Network. My research interests include hemispheric asymmetry in cognitive processes and development of perceptual expertise, using a variety of approaches including computational modeling (in particular connectionist models) and cognitive neuroscience (with behavioral, eye movement, and EEG/ERP measures). I am interested in incorporating neuroanatomical findings into computational models to account for behavioral data.

Kaori Ikeda

Affiliation: PhD student, Division of Neuroscience, John Curtin School of Medical Research, Australian National University (Canberra, Australia)
About:

Hi, I'm a PhD student in Dr. John Bekkers' lab at the John Curtin School of Medical Research in Canberra, Australia. My PhD research has involved using cultured autaptic neurons to study presynaptic properties of synaptic transmission. I have a biological background and am eager to learn how this can be combined with more computational approaches. Looking forward to meeting you all soon!

Shin Ishii

Affiliation: NAIST
URL: http://hawaii.naist.jp/~ishii/
About:

Shin Ishii received a BE in 1986, an ME in 1988, and a PhD in 1997 from the University of Tokyo. He joined Ricoh Co. Ltd. in 1988 and ATR in 1994. He became an associate professor at the Nara Institute of Science and Technology (NAIST) in 1997 and a professor in 2001. Since 2007, he has been a professor in the Graduate School of Informatics, Kyoto University, while serving concurrently at NAIST. He is interested in systems neurobiology, machine learning including reinforcement learning and Bayesian learning, and bioinformatics.

Message to participants:

This is my fourth time joining this wonderful summer school, OCNC, as a faculty member. The venue is right in front of the ocean, and I believe it will be a great experience for you and for me to learn and study computational neuroscience in a very interdisciplinary atmosphere.

Carl Jackson

Affiliation: Postdoctoral Research Fellow, PRISM Lab, Behavioural Brain Sciences, University of Birmingham, UK
URL: http://prism.bham.ac.uk/~jackscpz
About:

Hi all! My name is Carl Jackson, and I'm a postdoctoral research fellow in human sensory motor neuroscience in the PRISM lab at the University of Birmingham, working with Chris Miall. My background is in physics, but my PhD and postdoc are in behavioural neuroscience and neuroimaging related to motor control.

I'm interested in how predictive forward models and optimal feedback control can help us to explain the characteristics of human movement. Specifically, I've recently been looking at bimanual movement and on this course I want to try to model neural systems that can explain my behavioural results.

Very much looking forward to meeting people and having three weeks learning interesting science in Okinawa!

Tim Jarsky

Affiliation: Postdoctoral Fellow, Department of Ophthalmology, Northwestern University, Chicago, Il.
About:

I joined Josh Singer's laboratory at Northwestern University in April of last year to study the second synapse in the rod retinal circuitry, the rod bipolar cell -- AII amacrine cell synapse. My study of synaptic properties with Josh follows my predoctoral work with Nelson Spruston, which was entirely focused on the intrinsic properties of CA1 pyramidal neurons of the hippocampus. In Dr. Spruston's lab I was able to collaborate with computational modelers with great success. With this course I hope to be able to bring computational modeling to the Singer lab and to the study of the rod bipolar cell -- AII amacrine cell synapse. In my free time I enjoy working in the student machine shop as well as building audio amplifiers.

Jessy John

Affiliation: Research Scholar, Computational Neurophysiology Lab, IIT Bombay, India.
About:

I am a PhD student with Prof. R. Manchanda, IIT Bombay. Our lab performs computational investigations into the mechanisms of information processing in medium spiny projection neurons in relation to reward processing and learning. I look forward to meeting, discussing with, and learning from people in different areas of neuroscience who are coming to this course from across the world. Also, this is my first visit to a foreign country.

See you all at Okinawa!

Georgios Kalantzis

Affiliation: Ph.D. student, Department of Neurobiology and Anatomy, University of Texas-Health Science Center in Houston, USA
URL: http://nba.uth.tmc.edu/homepage/shouval/index.htm
About:

After finishing my studies in Greece, I joined the graduate program in Neuroscience at the University of Texas, in the Department of Neurobiology and Anatomy. My background is mostly in physics and I am interested in synaptic plasticity. In my research, my advisor Dr. Shouval and I follow a bottom-up approach: I started with modeling simplified molecular mechanisms that contribute to synaptic alterations, then studied and developed appropriate tools for exploring more complex reaction-diffusion systems, and am finally trying to derive abstract learning models. I am very much looking forward to meeting and discussing with you all at the OCNC summer school.

Christof Koch

Affiliation: Professor, Division of Biology, California Institute of Technology
URL: http://www.klab.caltech.edu/
About:

Born in 1956 in the American Midwest, Christof Koch grew up in Holland, Germany, Canada, and Morocco, where he graduated from the Lycée Descartes in 1974. He studied Physics and Philosophy at the University of Tübingen in Germany and was awarded his Ph.D. in Biophysics in 1982.

After four years at MIT, Dr. Koch joined Caltech in 1986, where he is the Lois and Victor Troendle Professor of Cognitive and Behavioral Biology. He lives in Pasadena, and loves to run and to climb.

The author of three hundred scientific papers and journal articles, and several books, Dr. Koch studies the biophysics of computation, and the neuronal basis of visual perception, attention, and consciousness. Together with Francis Crick, he is one of the pioneers of the neurobiological approach to consciousness.

Birgit Kriener

Affiliation: Postdoctoral Fellow, Network Dynamics Group, MPI for Dynamics and Self-Organization

Bernstein Center for Computational Neuroscience, Goettingen, Germany

URL: http://www.chaos.gwdg.de/research/nldgroupfolder.2007-11-12.3980822778/nldgroup.2007-12-19.1982354479
About:

Hi!

My name is Birgit and I have the pleasure of being one of the tutors of this year's course! My background is in theoretical physics and computational neuroscience. I did my PhD at the Bernstein Center for Computational Neuroscience (BCCN) in Freiburg, Germany, and I am now affiliated with the Network Dynamics Group at the Max Planck Institute for Dynamics and Self-Organization and the BCCN in Goettingen, Germany.

My main research interest is the dynamics of networks of integrate-and-fire neurons and how it depends on structural features such as the weight distribution and the network topology. In particular, I'm interested in correlations and synchronization phenomena in these systems. The analytical part of my work involves nonlinear dynamics, time series analysis, graph theory, and random matrix theory. When it comes to computation, I make use of Matlab, NEST, and Mathematica.
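As a minimal illustration of this kind of model -- a toy NumPy sketch with arbitrary parameters, independent of NEST and of any published work -- a randomly coupled network of leaky integrate-and-fire neurons can be simulated in a few lines:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network of leaky integrate-and-fire neurons with random sparse coupling.
# Every parameter below is an arbitrary illustration value.
N, p = 200, 0.1                        # number of neurons, connection probability
tau, v_th, v_reset = 20.0, 1.0, 0.0    # membrane time constant (ms), threshold, reset
dt, T = 0.1, 200.0                     # time step and total simulated time (ms)
J, I_ext = 0.1, 1.2                    # synaptic weight and constant external drive

W = J * (rng.random((N, N)) < p)       # random connectivity matrix
v = rng.random(N)                      # random initial membrane potentials
spike_counts = []

for _ in range(int(T / dt)):
    fired = v >= v_th                  # neurons crossing threshold this step
    spike_counts.append(fired.sum())
    v[fired] = v_reset                 # reset the neurons that spiked
    v += dt / tau * (I_ext - v) + W @ fired   # leak, external drive, recurrent input

print("mean rate (spikes/neuron/s):", 1000.0 * sum(spike_counts) / (N * T))
```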

Thanks to my close association with the interdisciplinary Bernstein Centers, I have always interacted with experimentalists from the "wet" side of neuroscience as well.

As a student I took part in several summer schools, among them the Advanced Course in Computational Neuroscience in 2005, and did an internship at RIKEN BSI in 2006. These were always very pleasant and helpful experiences, both professionally and socially!

I'm really looking forward to visiting Japan again and meeting you all in Okinawa!

Hiroki Kurashige

Affiliation: Researcher, Recognition and Judgment Unit, RIKEN BSI-TOYOTA Collaboration Center, RIKEN, Japan
URL: http://btcc.brain.riken.jp/RJU/index_e.html
About:

Hello! I am a postdoctoral fellow at the RIKEN BSI-TOYOTA Collaboration Center. In my PhD, I studied the mechanisms of synaptic plasticity, especially STDP, by model simulation. Currently, I'm working on the development of data analysis methods for in vivo and in vitro neurophysiological data. I am interested in the functional significance of dendritic regenerative electrogenesis, such as dendritic spikes, for information processing in noisy physiological microcircuits. Additionally, I want to explore the roles of various types of plasticity in information processing. I'm looking forward to meeting and discussing with you at OCNC!

Robert Kyle

Affiliation: PhD student, Institute for Adaptive and Neural Computation, University of Edinburgh
URL: http://homepages.inf.ed.ac.uk/s0453687/
About:

Hey folks! My name is Robert Kyle, and I'm a PhD student in the Institute for Adaptive and Neural Computation at the University of Edinburgh. My research is on how we learn goal-directed behaviours, with a particular focus on the role of dopamine in this process. I'm two years into the project now, working with my supervisors David Willshaw and Daniel Durstewitz.

My background is in Theoretical Physics and Cognitive Science, and I still have interests in these areas even though I have since moved to neuroscience. I look forward to meeting all of you and sharing ideas... See you in June!

Leslee Lazar

Affiliation: PhD student, National Brain Research Centre, Manesar, India.
About:

I am a PhD student at the National Brain Research Centre, India. My institute is near New Delhi, the capital of India, and a short trip away from the beautiful Himalayan mountains and the Rajasthan desert.

My research interest is in coding of stimulus location and roughness representation in the somatosensory cortex of monkeys. For this, I use multi-electrode recordings (spikes/LFPs) in anaesthetized and awake-behaving monkeys. I am also interested in cortical (and sub-cortical) plasticity following spinal cord injuries.

I am new to computational modeling and hope to learn lots of new ideas from this workshop. Eventually I want to model the spatio-temporal pattern of spiking activity in the somatosensory cortex following tactile stimulus.

I am looking forward to meeting the diverse group of theoreticians and experimentalists at OCNC2008 and having many stimulating discussions. Also, this is my first trip to Japan. I have been a big fan of many things Japanese (especially Haruki Murakami and sake), so I can't wait to get there!

Zhaoping Li

Affiliation: Department of Computer Science, University College London
URL: http://www.cs.ucl.ac.uk/staff/Zhaoping.Li/
About:

I obtained my B.S. in Physics in 1984 from Fudan University, Shanghai, and my Ph.D. in Physics in 1989 from the California Institute of Technology. I was a postdoctoral researcher at Fermi National Laboratory in Batavia, Illinois, at the Institute for Advanced Study in Princeton, New Jersey, and at Rockefeller University in New York. I have been a faculty member in Computer Science at the Hong Kong University of Science and Technology and was a visiting scientist at various academic institutions. In 1998, I helped to found the Gatsby Computational Neuroscience Unit at University College London. Currently, I am a Professor of Computational Neuroscience in the Department of Computer Science at University College London. My research experience over the years ranges from high energy physics to neurophysiology and marine biology, with most experience in understanding brain functions in vision and olfaction, and in nonlinear neural dynamics. In the late 1990s and early 2000s, I proposed a theory (which is being extensively tested) that the primary visual cortex in the primate brain creates a saliency map to automatically attract visual attention to salient visual locations.

Message to participants:

To the students of OCNC2008: please feel free to send me emails for more information about the lectures, or to give your input on what you would like to learn or wish to be taught in more depth. Reading the suggested readings ahead of the lectures is essential if you want more than a superficial absorption of the material. To colleagues at Okinawa: I look forward to visiting your labs for the first time and hearing about your latest research.

Jussi T. Lindgren

Affiliation: PhD student, NeuroInformatics Group, CS/HY, Finland
URL: http://www.cs.helsinki.fi/u/jtlindgr
About:

Hi all. I'm Jussi Lindgren, and I'm interested in the transformations that turn photon inputs into perceptual experiences of the surrounding world. At OCNC I hope to learn more about the nuts and bolts used in this process called 'vision'. Lately I've been working on statistical models of natural images and trying to understand the connections between the data statistics and the characteristics of the neural machinery. My personal characteristics include liking long walks and bicycling.

Andre Longtin

Affiliation: Professor of Physics

Director, Center for Neural Dynamics, Physics Department

University of Ottawa

About:

Andre Longtin is a native of Montreal, Canada. He received his honours B.Sc. in Physics in 1983 and his M.Sc. in Physics in 1985 from the Universite de Montreal. His M.Sc. thesis was on mathematical models of the human acoustic reflex. He received his Ph.D. in Physics from McGill University in 1989. His thesis was a theoretical and experimental study of nonlinear oscillations, noise, and chaos in neural delayed feedback systems, under the supervision of Michael Mackey and the co-supervision of John Milton (Montreal Neurological Institute). He then joined Los Alamos National Laboratory for two years, both as a Natural Sciences and Engineering Research Council of Canada Postdoctoral Fellow and as a Los Alamos Director's Funded Postdoctoral Fellow. He held a joint position in the Theoretical Division T13 (Complex Systems; Doyne Farmer, group leader) and the Center for Nonlinear Studies (David Campbell, Director).

He became an assistant professor of Physics at the University of Ottawa in 1992 and has been a Professor since 2002; he has also been cross-appointed to the Department of Cellular and Molecular Medicine in the Faculty of Medicine since 2004. He is a Fellow of the American Physical Society and serves on the editorial boards of Biological Cybernetics (Springer) and Cognitive Neurodynamics (Springer). He directs the Center for Neural Dynamics at the University of Ottawa.

Michele Mattioni

Affiliation: PhD Student, Computational Neurobiology Group, EMBL-EBI, UK
URL: http://www.ebi.ac.uk/~mattioni
About:

Hello everybody! My name is Michele Mattioni and I'm a Ph.D. student based at the EMBL-EBI in Cambridge, UK. I'm working in the Computational Neurobiology group, headed by Le Novere, and I'm developing an accurate model of a medium spiny neuron using a compartmental approach. I'm really looking forward to meeting you all in Okinawa.

Chrystele Ody

Affiliation: Postdoctoral fellow, Consciousness and computation lab, Columbia University, New York, USA
URL: http://hakwan.googlepages.com/lab
About:

I am a postdoc at Columbia University. My research uses brain imaging to explore the neural bases of unconscious perceptual processing in humans. I am particularly interested in the influence of unconscious processing on the control of behavior, and I would love to integrate a modeling approach with my experimental work. I look forward to seeing you in Okinawa!

Shogo Ohmae

Affiliation: Adjunct research associate, Department of Neurophysiology (Kitazawa Lab.), Juntendo Univ., Japan
URL: http://www.juntendo.ac.jp/graduate/laboratory/labo/shinkei_seiri/index.html
About:

When I was a student at Kyoto University, I specialized in neurochemistry (gene identification and characterization). After graduating, I moved to neurophysiology (single- and multi-unit recording) and computational neuroscience (the optimization of movements) during my Ph.D. at Juntendo University. Now I am especially interested in motor control from the following three points of view, and I want to elucidate the neural mechanisms of these processes using both simulation and electrophysiology.

[1]. Optimization: I succeeded in optimizing, i.e. smoothing, simulated arm movements with our model.

[2]. Decoding of saccades: I estimated saccade timing and direction from neural activity in the frontal eye field (FEF) and the supplementary eye field (SEF). Comparing the two areas, SEF activity is relatively more informative before actual saccades, whereas FEF activity dominates after saccades.

[3]. Movement timing: Neural activity in SEF represents the length of the waiting time before a saccade.

Josh Plotkin

Affiliation: Postdoctoral fellow, Physiology Department, Northwestern University
About:

I've been a postdoc in Jim Surmeier's lab at Northwestern for one year now. My research focuses on the dendritic excitability of medium spiny projection neurons of the striatum. My experiments use electrophysiology and 2-photon imaging to examine the mechanisms governing dendritic excitability, and how these mechanisms are altered in Parkinson's disease and contribute to changes in synaptic plasticity. Because striatal projection neuron dendrites are small and not easily accessible to direct electrophysiological measurements, I think learning to make testable models of what may be going on in distal dendrites would be very helpful. I'm looking forward to learning more about this technique, and I can't wait to meet everyone in Okinawa!

Dobromir Rahnev

Affiliation: Ph.D. student, Columbia University
About:

I am a first-year Ph.D. student at Columbia University. My lab is interested in visual awareness and we approach the issue using techniques such as fMRI, TMS, and psychophysical studies. Recently, we've also become interested in modeling the brain processes responsible for perceptual decisions in the context of Signal Detection Theory and criterion setting.

I am looking forward to a fun time in Okinawa and meeting all of you.

Lee Reid

Affiliation: Masters Student, Avian Neurobiology Group, University of Auckland, New Zealand
About:

Hi guys! I'm a Masters Student at Auckland University in New Zealand. My Master's project investigates somatosensory-auditory overlap in the avian song system using both anatomical techniques and multiunit recordings. I'm interested in how these two senses can integrate in non-linear ways to provide abilities such as identification of self-generated sounds.

I'm somewhat of a jack-of-all-fields rather than a particularly specialised guy, with an interest spectrum ranging from molecular mechanisms to cognitive psychology and philosophy of mind. Most of my interests lie at the systems level, which computational neuroscience complements very nicely! Looking forward to meeting you all in Okinawa!

Klaus Stiefel

Affiliation: OIST
URL: http://www.irp.oist.jp/tenu
About:

I got my undergraduate degree in microbiology from the University of Vienna and my PhD in zoology from the Max Planck Institute for Brain Research in Frankfurt (with Wolf Singer). Then I did a postdoc with Terry Sejnowski at the Salk Institute in La Jolla. In 2006 I joined OIST and established the Theoretical and Experimental Neurobiology Unit.

Message to participants:

Welcome to Okinawa! I hope that during your stay here you will learn a lot about neuroscience, make new friends, see some of the beautiful nature and interesting culture this island has to offer, eat lots of good Japanese food and drink some Awamori. You can sleep when you're dead, but certainly only very little during OCNC!

Tom Tetzlaff

Affiliation: Postdoctoral Fellow, Inst. of Mathematical Sciences and Technology, Norwegian University of Life Sciences, Ås, NO
URL: http://arken.umb.no/~tomt
About:

It's a pleasure for me to come to Okinawa and to attend this course as a tutor. I'm looking forward to meeting you all and to taking up the challenge of learning and discussing computational neuroscience in such an exciting place.

I have a background in physics and received my doctoral degree in theoretical neuroscience last year in Freiburg, Germany. Currently I live (and work) in Norway.

My scientific work is concerned with the relationship between the structure and dynamics of networks of spiking neurons. I think that the rich dynamical repertoire of these high-dimensional, complicated systems is key to understanding brain function. Nevertheless, there are several good reasons to boil things down and go for simplified descriptions. I am therefore currently involved in a project that seeks to derive macroscopic (population) models from microscopic (network-of-neurons) models.

Taro Toyoizumi

Affiliation: Postdoctoral Research Fellow, Center for Theoretical Neuroscience, Columbia University
URL: http://neurotheory.columbia.edu/~taro/
About:

My background is in physics and information theory. I have been studying synaptic plasticity, information coding, and the self-organization of the brain from theoretical perspectives, hoping that they can provide a unified understanding of our brain circuitry and its functions. I am particularly interested in the development of the visual cortex and in the idea of synaptic plasticity as an optimal learning rule. I hope to see you all at OCNC soon.

Misha Tsodyks

Affiliation: Department of Neurobiology, Weizmann Institute of Science
URL: http://www.weizmann.ac.il/home/bnmisha/
About:

I was trained as a theoretical physicist in Moscow, then moved between fields and countries, until finally settling at the Weizmann Institute in Rehovot, Israel, at the Department of Neurobiology.

This will be my first trip to the Okinawa school (and to Japan in general). I am looking forward to meeting all the students and faculty, and to having heated discussions on computational neuroscience -- what it should and should not study, at which level of realism, where it stands, and where it is heading. I come from a school of theoretical physics that developed a very specific approach to these issues, but I will be happy to hear opposing views and argue about them.

Christine Vossen

Affiliation: Ph.D. Student, T35 Theoretical Biophysics, Department of Physics, Technical University of Munich (TUM), Germany
URL: www.t35.physik.tu-muenchen.de
About:

"The important thing is not to stop questioning." -- Albert Einstein

My name is Christine Vossen and I am a Ph.D. student in Leo van Hemmen's group for Theoretical Biophysics. Before coming to Munich I studied Mathematics in Heidelberg. I am currently working on concepts of sensory information processing: I am studying a concrete example -- the auditory system of a lizard -- as well as general concepts of merging input from different sensory systems. As a scientist, I always try to stay curious. I am therefore looking forward to the course, to all the interesting people from around the world, to new theories and to new questions.

Xiao-Jing Wang

Affiliation: Department of Neurobiology and Kavli Institute for Neuroscience, Yale University School of Medicine
URL: http://wanglab.med.yale.edu/
About:

Xiao-Jing Wang is Professor of Neuroscience and director of the Swartz Initiative for Theoretical Neurobiology at Yale. He did his undergraduate studies and Ph.D. in Theoretical Physics at the University of Brussels, then changed his field to computational neuroscience in 1987. He uses theory and biophysically realistic neural circuit modeling to study cortical dynamics and functions. His interests cover a broad range of topics, including neuronal adaptation at multiple timescales, the diversity of inhibitory interneurons, and synchronous network oscillations. The main goal of research in his laboratory is to uncover circuit mechanisms of working memory and decision making in the prefrontal cortex and other cortical areas, in close collaboration with experimentalists. Currently, his lab is pursuing large-scale neural circuit models of spiking neurons to elucidate general principles and the cellular basis of key cognitive processes, as well as their impairments associated with schizophrenia and other mental disorders.

Message to participants:

Welcome to the Okinawa Computational Neuroscience Course; I hope you will find the experience rewarding as well as challenging. I expect you to be actively involved in the class, asking questions, participating in discussions, and carrying out projects on interesting topics. If you are from outside of neuroscience, it may be reassuring to know that you are not alone---many people (me included) are from physics, mathematics, engineering and other fields. I hope this summer school will strengthen your resolve to enter the field and stimulate you to think about some big open questions.

My own work is at the interface between neurobiology and cognitive neuroscience. Compared to sensory processing and motor behavior, we know relatively little about detailed biophysical and circuit mechanisms of cognitive functions. In my course I hope to show that this is an area of exciting current research and great future promise. I have been teaching computational neuroscience since 1992, at the University of Chicago, Brandeis University and Yale University. I have lectured at various summer schools: Marine Biological Laboratory in Woods Hole (USA), Paris (France), Torino and Trieste (Italy), Xylocastro (Greece), Shanghai (China). I am looking forward to participating in this summer school, and to meeting you all in Okinawa.

Xu-dong Wang

Affiliation: Ph.D. student, Institute of Neuroscience, Shanghai, China
URL: www.ion.ac.cn
About:

I am Xu-dong Wang, a Ph.D. student in the laboratory of neural plasticity at the Institute of Neuroscience. Learning, memory, and their underlying biological mechanism -- neural plasticity -- are within the scope of my interests. I love experiment-based theories and theory-guided experiments. It will be my pleasure to take this opportunity to meet you all during the summer school.

Masayuki Watanabe

Affiliation: Postdoctoral Fellow, Centre for Neuroscience Studies, Queen's University
URL: http://brain.phgy.queensu.ca/doug/www/
About:

Hi, I'm an experimentalist in neurophysiology, working with Doug Munoz at Queen's University. My interest is arbitrary stimulus-response mapping, our ability to generate any kind of action in response to a sensory event. I take advantage of the saccadic eye movement system to understand the neural basis of this ability. My ongoing research focuses on the basal ganglia.

I've been collecting a lot of data by running experiments every day, but lately I've grown a bit tired of it. So this is a great opportunity to step away from experiments and learn some computational neuroscience skills, which I've wanted to do since my undergraduate days. I hope to work hard and play hard with you all for three weeks in Okinawa. By the way, I'm Japanese, so I can teach you Japanese if you teach me math.

Oliver Weihberger

Affiliation: PhD student, Bernstein Center for Computational Neuroscience Freiburg, Germany
URL: http://www.bccn.uni-freiburg.de/People/members/weihberger
About:

I'm Oliver, 27 years old and from Freiburg, Germany. I studied physics, and my first contact with neuroscience was my master's thesis, a computational study of synchronization in a network of coupled Hodgkin-Huxley-type neurons. The emphasis of my current PhD work is more experimental: I am working with in-vitro cortical cell cultures on microelectrode arrays. I apply electrical stimulation under varying paradigms to investigate the network's input-output relationships. The goal is to derive fundamental network properties, in terms of e.g. processing and storage capabilities, that can be generalized to other, more realistic neuronal networks.

I am very pleased to participate in OCNC 2008; it will be my first visit to Japan. I will enjoy meeting and discussing with people from many different countries and backgrounds who all share a common interest in neuroscience. See you soon!

Jeff Wickens

Affiliation: OIST
About:

Jeff Wickens completed a medical degree (1982) and a PhD (1991) at the University of Otago. He became a faculty member at the University of Otago in 1991 and obtained a personal chair in 2004. In 2007 he was appointed as a principal investigator at the Okinawa Institute of Science and Technology, where he is in the process of establishing the neurobiology research unit. He works on the basal ganglia, using experimental and computational approaches to investigate learning mechanisms and local circuit operations of the striatum.

Message to participants:

Since my PhD work on a theory of the striatum I have pursued my ambition to develop a computational theory of information processing in the basal ganglia, one which is faithful to the neurobiology and also able to account for its contribution to purposeful behaviour. I believe that theory has a vital role to play in advancing our understanding of the brain and computational modeling is a powerful tool for bringing theories and experimental results into interaction. This approach is really in its infancy, and in my opinion giant steps still lie ahead of us. There is much work to be done, and courses like this are a great way to accelerate the development of the field. I hope you will find here some of the tools you need to advance this field and fulfill your own theoretical ambitions.

Tadashi Yamazaki

Affiliation: Staff Scientist, Lab. for Motor Learning Control, RIKEN Brain Science Institute, Japan
URL: http://capsule.brain.riken.jp/~tyam/
About:

My original background is in theoretical computer science (designing algorithms and computational complexity), and I was, and still am, interested in what the brain computes and how it computes it. So I decided to work in the field of computational neuroscience.

Now, I'm a staff scientist at Lab. for Motor Learning Control, RIKEN Brain Science Institute. I'm interested in cerebellar mechanisms for gain control (how far to move) and timing control (when to move). Currently, I've been developing a large-scale spiking network model of the cerebellum which can control both quantities by a single computational principle. I'm also interested in how motor memory is acquired and consolidated in the cerebellum, specifically, how the learned memory in the cerebellar cortex is transferred to the cerebellar nuclei.

I look forward to seeing you and having three weeks together in Okinawa.

Pierre Yger

Affiliation: PhD Student (2nd year) in the Unité de Neurosciences Intégratives et Computationnelles (UNIC)
URL: http://www.unic.cnrs-gif.fr
About:

My name is Pierre Yger and I'm a 2nd-year PhD student in a neuroscience laboratory close to Paris, many kilometers away from Okinawa. In short, my main background is computer science, but I work closely with experimentalists performing intracellular recordings in the cat visual cortex. Since I wanted to work in artificial intelligence, I thought, rather naively: "Why not first try to understand how the brain works before designing any silly algorithm?" How naive I was ...

Inspired by their data, I perform large-scale simulations of neuronal networks to address more specifically the question of how information may be reliably transmitted and recalled within such a chaotic dynamical system. I'm also very interested in plasticity, especially spike-timing-dependent plasticity, and in how such rules can affect the dynamical properties of the network and underlie the unsupervised learning achieved during epigenetic development.

Last year I already attended the RIKEN summer internship, spending two months in Tokyo, eating sushi and improving my karaoke skills, so I'm sure that Okinawa will confirm my impression that Japan is a wonderful place.

I'm looking forward to meeting all of you there.

Kyongsik Yun

Affiliation: Ph.D. Candidate, Brain Dynamics Laboratory, Department of Bio and Brain Engineering, KAIST, Korea
URL: http://yunks.kaist.ac.kr
About:

I am a Ph.D. candidate in the Brain Dynamics Lab, Department of Bio and Brain Engineering at KAIST. I earned a BA degree in bio and brain engineering from KAIST in 2006.

The aim of my work is to understand the fundamental mechanisms underpinning the information processing of decision-making and to elucidate the pathophysiology of various neuropsychiatric disorders. My research interests are reward, decision-making, addiction, and nonlinear dynamics, using computational modeling of neural networks and functional neuroimaging, including fMRI and EEG. I am very excited to attend the course and hope to gain positive academic stimulation and advanced knowledge in computational modeling. I also look forward to sharing my work with all of you.