Lecturers & Abstracts / OCNC2022

Week 1

Monday, June 13

Parallel Sessions

1. Kenji Doya

Title: Introduction to numerical methods for differential equations

This tutorial introduces the basic concepts of differential equations and how to solve them, or simulate their behavior in time, using a computer. Key concepts like eigenvalues and stability are explained while solving simple differential equations. Some examples of Hodgkin-Huxley-type neuron models are introduced.
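As a concrete taste, here is a minimal sketch (not from the tutorial; all values illustrative) of forward-Euler integration of the one-dimensional linear ODE dx/dt = lam*x, where the sign of the eigenvalue lam determines stability of the system and the step size determines stability of the scheme:

```python
# Minimal sketch (not from the tutorial): forward-Euler integration of the
# linear ODE dx/dt = lam * x. The eigenvalue lam sets stability of the system;
# the step size dt sets stability of the scheme (|1 + lam*dt| < 1 needed here).
import numpy as np

lam = -2.0                      # eigenvalue; Re(lam) < 0 -> stable fixed point at 0
dt, T = 0.1, 5.0                # step size and total simulated time
steps = int(T / dt)

x = 1.0                         # initial condition x(0) = 1
xs = [x]
for _ in range(steps):
    x = x + dt * lam * x        # forward-Euler update
    xs.append(x)

t = np.arange(steps + 1) * dt
exact = np.exp(lam * t)         # analytic solution for comparison
print("max absolute error vs exact solution:", np.max(np.abs(np.array(xs) - exact)))
```

Increasing dt beyond 1.0 here (so that |1 + lam*dt| > 1) makes the numerical solution diverge even though the true solution decays, which is the kind of stability issue the tutorial addresses.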

Suggested Reading:

  • Koch C: Biophysics of Computation: Information Processing in Single Neurons. Oxford University Press (1999).

2. Izumi Fukunaga

Title: A very brief tour of the nervous system

Tuesday, June 14

Erik De Schutter

Title: Modeling biochemical reactions, diffusion and reaction-diffusion systems

In this talk I will use calcium dynamics modeling as a way to introduce deterministic solution methods for reaction-diffusion systems. The talk covers exponentially decaying calcium pools, diffusion, calcium buffers and buffered diffusion, and calcium pumps and exchangers. I will describe properties of buffered diffusion systems and ways to characterize them experimentally. Finally I will compare the different modeling approaches.
In the second talk I will turn towards stochastic reaction-diffusion modeling. Two methods will be described: Gillespie's Stochastic Simulation algorithm extended to simulate diffusion, and particle-based methods. I will briefly describe the STEPS software and give some examples from our research.
I will finish by describing how the STEPS framework can be used to go beyond the compartmental model to simulate neurons in 3D.
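To make the second talk's first method concrete, here is a minimal sketch of Gillespie's Stochastic Simulation Algorithm, assuming a single well-mixed reversible buffering reaction (Ca + B <-> CaB) with illustrative rate constants and counts; diffusion and the STEPS machinery are omitted:

```python
# Minimal sketch of Gillespie's Stochastic Simulation Algorithm for a single
# reversible buffering reaction Ca + B <-> CaB (well-mixed, no diffusion).
# Rate constants and molecule counts are illustrative, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)
kf, kb = 0.005, 1.0            # forward/backward rate constants (arbitrary units)
ca, b, cab = 100, 50, 0        # molecule counts
t, t_end = 0.0, 10.0

while t < t_end:
    a1 = kf * ca * b           # propensity of binding
    a2 = kb * cab              # propensity of unbinding
    a0 = a1 + a2
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)       # exponentially distributed waiting time
    if rng.random() < a1 / a0:           # choose which reaction fires
        ca, b, cab = ca - 1, b - 1, cab + 1
    else:
        ca, b, cab = ca + 1, b + 1, cab - 1

print(f"t = {t:.2f}: Ca = {ca}, B = {b}, CaB = {cab}")
```

Spatial extensions of this algorithm treat diffusion between subvolumes as additional "reactions", which is the approach the lecture connects to simulators such as STEPS.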

Suggested Readings:

  • U.S. Bhalla and S. Wils: Reaction-diffusion modeling. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)
  • E. De Schutter: Modeling intracellular calcium dynamics. In Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston. 61–92 (2009)
  • F. Santamaria, S. Wils, E. De Schutter and G.J. Augustine: Anomalous diffusion in Purkinje cell dendrites caused by dendritic spines. Neuron 52: 635–648 (2006).   
  • A.R. Gallimore, et al.: Switching on depression and potentiation in the cerebellum. Cell Reports 22: 722-733 (2018).

Wednesday, June 15

Bernd Kuhn

Title: 1. Ion channel physiology and the Hodgkin-Huxley model of neuronal activity

In my lecture I will talk about electric activity in neurons. I will start with the basics of ion channels, and specifically focus on voltage-gated channels and their dynamics in response to membrane voltage. Neurons use a combination of different voltage-gated channels to generate fast (about 1 ms), depolarizing action potentials. I will explain the first action potential model by Hodgkin and Huxley . Finally, I will discuss more recent additions or fine-tuning of the time-honored Hodgin-Huxley model.
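For orientation, here is a compact sketch of the Hodgkin-Huxley model, assuming the standard 1952 parameters shifted to a -65 mV resting convention; the injected current and the crude spike count are illustrative choices, not from the lecture:

```python
# Minimal sketch of the classic Hodgkin-Huxley model (standard 1952 parameters,
# modern -65 mV resting convention), integrated with forward Euler.
import numpy as np

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3      # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4            # reversal potentials (mV)

def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T, I_inj = 0.01, 50.0, 10.0             # ms, ms, uA/cm^2 (illustrative drive)
V, m, h, n = -65.0, 0.05, 0.6, 0.32         # initial state near rest

spikes = 0
prev_V = V
for _ in range(int(T / dt)):
    INa = gNa * m**3 * h * (V - ENa)
    IK = gK * n**4 * (V - EK)
    IL = gL * (V - EL)
    dV = (I_inj - INa - IK - IL) / C
    m += dt * (a_m(V) * (1 - m) - b_m(V) * m)   # gating-variable kinetics
    h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    prev_V, V = V, V + dt * dV
    if prev_V < 0.0 <= V:                       # crude upward-crossing spike count
        spikes += 1

print("spikes in", T, "ms:", spikes)
```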

Title: 2. Functional optical imaging

Functional optical imaging has become one of the key techniques in neuroscience. In my second lecture I will introduce fluorescence and the most important imaging methods. I will explain what we can learn from them but also discuss their limitations.

Suggested Readings:

  • Johnston and Wu: Foundations of Cellular Neurophysiology, MIT Press
  • Helmchen, Konnerth: Imaging in Neuroscience, 2011
  • Yuste, Lanni, Konnerth: Imaging Neurons, 2000

Thursday, June 16

Tomoki Fukai

Title: Neural network modeling of cognitive functions

The brain's ability to learn and memorize is crucial for the cognitive behavior of animals. Though our understanding of the underlying mechanisms of learning is limited, researchers have gained many insights into these mechanisms over the last few decades. In my lecture, I will explain the basic properties of several (both classic and recent) models of neural information processing. These models range from feedforward network models with error-correction learning and backpropagation to reservoir computing in recurrent neural networks. Then, I will show how these models can account for cognitive behaviors of animals such as pattern recognition, spatial navigation, and decision making. I will emphasize the essential role of low-dimensional features of neural dynamics in learning.
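To make the reservoir computing idea concrete, here is a minimal echo-state-style sketch with a synthetic task and illustrative parameters (not from the lecture): a fixed random recurrent network driven by an input, with only a linear readout trained by ridge regression:

```python
# Minimal sketch of reservoir computing (echo-state style): a fixed random
# recurrent network plus a linear readout trained by ridge regression.
# Task and parameters are illustrative, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 1000
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
w_in = rng.normal(0, 1.0, N)

u = np.sin(np.arange(T) * 0.1)                    # input signal
target = np.roll(u, 1)                            # task: one-step-delayed input

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])              # reservoir update (fixed weights)
    states[t] = x

# Ridge-regression readout, discarding an initial transient.
burn, lam = 100, 1e-4
X, y = states[burn:], target[burn:]
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
print("training MSE:", np.mean((X @ w_out - y) ** 2))
```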

Related readings:

  • Anthony Joseph Decostanzo, Chi Chung Fung and Tomoki Fukai (2019) Hippocampal neurogenesis reduces the dimensionality of sparsely coded representations to enhance memory encoding. Front Comput Neurosci 12: 1-21.
  • Tomoki Kurikawa, Tatsuya Haga, Takashi Handa, Rie Harukuni and Tomoki Fukai (2018) Neuronal stability in medial frontal cortex sets individual variability in decision-making. Nat Neurosci, 21:1764-1773.
  • Toshitake Asabuki, Naoki Hiratani and Tomoki Fukai (2018) Interactive reservoir computing for chunking information streams. PLoS Comput Biol, 14(10):e1006400.
  • Tatsuya Haga and Tomoki Fukai (2018) Recurrent network model for learning goal-directed sequences through reverse replay. Elife 7: e34171.
  • Mastrogiuseppe F, Ostojic S (2018) Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks. Neuron 99: 609-623.
  • Song HF, Yang GR, Wang XJ (2017) Reward-based training of recurrent neural networks for cognitive and value-based tasks. eLife 6: e21492.
  • Sussillo D, Abbott LF (2009) Generating coherent patterns of activity from chaotic neural networks. Neuron. 63: 544-557.

Friday, June 17

Erik De Schutter

Title: Introduction to modeling neurons

I will discuss methods to model single neurons, going from very simple to morphologically detailed. I will briefly introduce cable theory, the mathematical description of current flow in dendrites. By discretizing the cable equation we arrive at compartmental modeling, the standard method to simulate morphologically detailed models of neurons. I will also give an overview of dendritic properties predicted by cable theory and experimental data confirming these predictions. I will discuss the challenges in fitting compartmental models to experimental data, with an emphasis on active properties.
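As a minimal sketch of what discretizing the cable equation means in practice, here is a passive dendrite approximated as a chain of coupled RC compartments; all parameters are illustrative round numbers, not from the lecture:

```python
# Minimal sketch of passive compartmental modeling: the cable equation
# discretized into N coupled RC compartments, integrated with forward Euler.
# Parameters are illustrative round numbers, not from the lecture.
import numpy as np

N = 50                      # number of compartments
g_axial = 0.5               # coupling conductance between neighbors (arb. units)
g_leak, E_leak = 0.1, -65.0 # leak conductance and reversal (mV)
C = 1.0                     # membrane capacitance per compartment
dt, T = 0.01, 50.0

V = np.full(N, E_leak)
I_inj = np.zeros(N)
I_inj[0] = 1.0              # constant current injected into the first compartment

for _ in range(int(T / dt)):
    # Axial current from each neighbor (sealed ends at both tips).
    axial = np.zeros(N)
    axial[1:] += g_axial * (V[:-1] - V[1:])
    axial[:-1] += g_axial * (V[1:] - V[:-1])
    dV = (I_inj - g_leak * (V - E_leak) + axial) / C
    V += dt * dV

print("steady-state attenuation, far end vs injection site:",
      (V[-1] - E_leak) / (V[0] - E_leak))
```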

Suggested Readings:

  • Several chapters in Computational Modeling Methods for Neuroscientists, E. De Schutter ed., MIT Press, Boston (2009).
  • Y. Zang, S. Dieudonné and E. De Schutter: Voltage- and Branch-specific Climbing Fiber Responses in Purkinje Cells. Cell Reports 24: 1536–1549 (2018).

Saturday, June 18

Kenji Doya

Title: Introduction to reinforcement learning and Bayesian inference

The aim of this tutorial is to present the theoretical cores for modeling animal/human action and perception. In the first half of the tutorial, we will focus on "reinforcement learning", a theoretical framework in which an adaptive agent learns behaviors from exploratory actions and the resulting reward or punishment. Reinforcement learning has played an essential role in understanding the neural circuits and neurochemical systems behind adaptive action learning, most notably the basal ganglia and the dopamine system. In the second half, we will familiarize ourselves with the framework of Bayesian inference, which is critical in understanding the process of perception from noisy, incomplete observations.
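Here is a minimal sketch touching both halves of the tutorial, with illustrative parameters of my own choosing: Q-learning with a softmax policy on a two-armed bandit, alongside a Bayesian Beta-Bernoulli belief about each arm's reward probability:

```python
# Minimal sketch, not from the tutorial: Q-learning with a softmax policy on a
# two-armed bandit (first half), and a Bayesian Beta-Bernoulli posterior over
# each arm's reward probability (second half). All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
p_reward = np.array([0.3, 0.7])     # true (hidden) reward probabilities
Q = np.zeros(2)                     # action values
alpha, beta_inv = 0.1, 2.0          # learning rate, softmax inverse temperature
a_post = np.ones(2)                 # Beta posterior: observed rewards + 1
b_post = np.ones(2)                 # Beta posterior: observed non-rewards + 1

for trial in range(1000):
    probs = np.exp(beta_inv * Q) / np.sum(np.exp(beta_inv * Q))
    action = rng.choice(2, p=probs)           # softmax (exploratory) action
    reward = float(rng.random() < p_reward[action])
    Q[action] += alpha * (reward - Q[action]) # prediction-error update
    a_post[action] += reward                  # Bayesian belief update
    b_post[action] += 1.0 - reward

print("learned Q values:", Q.round(2))
print("posterior mean reward probabilities:",
      (a_post / (a_post + b_post)).round(2))
```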

Suggested Readings:

  • Doya K: Reinforcement learning: Computational theory and biological mechanisms. HFSP Journal, 1(1), 30-40 (2007). Free on-line access: http://dx.doi.org/10.2976/1.2732246
  • Doya K, Ishii S: A probability primer. In Doya K, Ishii S, Pouget A, Rao RPN eds. Bayesian Brain: Probabilistic Approaches to Neural Coding, pp. 3-13. MIT Press (2007). Free on-line access: http://mitpress.mit.edu/catalo/item/default.asp?ttype=2&tid=11106

Week 2

Monday, June 20

Yukiko Goda

Title: Features of synaptic strength regulation

Synapses are the key mediators of information transmission in the brain. The efficacy of synaptic transmission, i.e. synaptic strength, determines the extent of information received by the target neuron, and changes in synaptic strengths are thought to constitute the neural substrate for learning. Whereas synaptic strength changes associated with learning are thought, in principle, to be specific to active inputs, single synapses most often do not operate in isolation, and nearby synapses influence one another. Such local interactions, in turn, shape dendritic integration of information by the target neuron. Given the ambiguities of the spatial spread of synaptic strength changes and the types of plasticity associated with the changes, the minimal operating unit of synaptic plasticity and the rules for its implementation remain enigmatic. The lecture will highlight and discuss experimental insights on features of synaptic strength regulation that are consequential for neural circuit properties.

Suggested readings:

  • Chater TE, Goda Y. (2021) My Neighbour Hetero - deconstructing the mechanisms underlying heterosynaptic plasticity. Curr Op Neurobiol 67, 106-114. doi: 10.1016/j.conb.2020.10.007
  • Larsen RS, Sjostrom PJ (2015) Synapse-type-specific plasticity in local circuits. Curr Op Neurobiol 35, 127-135. doi: 10.1016/j.conb.2015.08.001
  • Chipman PH, Fung CCA, Fernandez A, Sawant A, Tedoldi A, Kawai A, Gautam SG, Kurosawa M, Abe M, Sakimura K, Fukai T, Goda Y. (2021) Astrocyte GluN2C NMDA receptors control basal synaptic strengths of hippocampal CA1 pyramidal neurons in the stratum radiatum. eLife 10, e70818. doi: 10.7554/eLife.70818
  • Zador AM (2019) A critique of pure learning and what artificial neural networks can learn from animal brains. Nat Commun 10:3770. doi: 10.1038/s41467-019-11786-6

Tuesday, June 21

Boris Gutkin

Title: Brain oscillations from mathematical mechanistic models to function

Oscillations are ubiquitously observed across multiple frequency bands in all brain states: from the extremely low-frequency (ultra-slow) oscillations of the resting state, to global alpha-band activity under anesthesia, to the complex dynamics of multi-frequency coherence during active states. The spectral content of the oscillations and their inter-coupling have been linked to many neural information processing phenomena and to performance of different cognitive tasks. At the same time, this functional role of oscillations remains controversial: while some theories see them as central mechanisms of computation, others argue that oscillations are a byproduct of how our brain networks are constructed. The neuronal mechanisms by which coherent oscillations arise in brain networks are also a question that continues to be studied. In this lecture I will first give a glimpse of the mechanistic mathematical models of how oscillations arise in spiking networks and how we can characterize them to understand the onset of the various oscillatory states. In the second part, I will give examples from my research on how we can view oscillatory activity as a mechanism for function, such as working memory and speech processing.
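For a concrete starting point, here is a minimal sketch assuming the classic Wilson-Cowan rate model rather than the spiking networks of the lecture; the parameter set is the one commonly reproduced from Wilson & Cowan (1972) to demonstrate a limit cycle:

```python
# Minimal sketch of one classic oscillation mechanism: a Wilson-Cowan
# excitatory-inhibitory rate model, with a parameter set commonly used to
# demonstrate a limit cycle. A rate model, not a spiking network.
import numpy as np

def S(x, a, th):                         # sigmoid response, shifted so S(0) = 0
    return 1.0 / (1.0 + np.exp(-a * (x - th))) - 1.0 / (1.0 + np.exp(a * th))

c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0   # E->E, I->E, E->I, I->I couplings
aE, thE, aI, thI = 1.3, 4.0, 2.0, 3.7    # sigmoid slopes and thresholds
P, Q, r = 1.25, 0.0, 1.0                 # external drives, refractory fraction

dt, T = 0.01, 100.0
E, I = 0.1, 0.05
trace = []
for _ in range(int(T / dt)):
    dE = -E + (1 - r * E) * S(c1 * E - c2 * I + P, aE, thE)
    dI = -I + (1 - r * I) * S(c3 * E - c4 * I + Q, aI, thI)
    E, I = E + dt * dE, I + dt * dI
    trace.append(E)

tail = np.array(trace[len(trace) // 2:])
print("E range in second half (nonzero width suggests a limit cycle):",
      tail.min().round(3), "to", tail.max().round(3))
```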

Suggested Reading:

  • Rooy M, Lazarevich I, Koukouli F, Maskos U, Gutkin B.S. (2021) Cholinergic modulation of hierarchical inhibitory control over cortical resting state dynamics: Local circuit modeling of schizophrenia-related hypofrontality. Current Research in Neurobiology, 100018
  • Zeldenrust F, Gutkin B, Denève S (2021) Efficient and robust coding in heterogeneous recurrent networks. PLoS Comput Biol 17(4): e1008673. https://doi.org/10.1371/journal.pcbi.1008673
  • Dumont G, Gutkin B (2019) Macroscopic phase resetting-curves determine oscillatory coherence and signal transfer in inter-coupled neural circuits. PLOS Computational Biology 15(5): e1007019
  • Chalk M, Gutkin B, Denève S. (2016) Neural oscillations as a signature of efficient coding in the presence of synaptic delays. eLife 5: e13824
  • A Hyafil, L Fontolan, C Kabdebon, B Gutkin, AL Giraud (2015) Speech encoding by coupled cortical theta and gamma oscillations. eLife 4: e06213

Brief Bio / Message to Students:

I received my initial training in Physics and Applied Mathematics at North Carolina State University and my doctorate in Computational Neuroscience from the University of Pittsburgh. I am presently a Research Director at the Centre National de la Recherche Scientifique (CNRS), the French national research institute. Since 2006 I have been director and co-director of the Group for Neural Theory (GNT) at the Ecole Normale Superieure. Within the GNT, I have been working on a wide variety of research topics, ranging from biophysical models of neural excitability and dendritic computation to models of cognitive processes. In my research I take a biophysically realistic circuit approach to modelling drug addiction, combining receptor modelling with neurodynamics and homeostatic mechanisms of reinforcement learning. One of the central themes of my research has been the mechanisms and function of oscillatory neuronal activity. My contribution to the summer school will be on this particular topic.

Wednesday, June 22

Devika Narain

Title: Neural activity to behavior through dynamics

A hallmark of higher brain function is to generate flexible behaviors based on internal and external cues. Recent work has highlighted the usefulness of frameworks such as nonlinear dynamical systems for advancing our understanding of how the brain generates behaviors based on environmental and internal contingencies. In these lectures, I will provide an introduction to dynamical systems and to some recent tools, such as recurrent neural networks, that have proved useful in deciphering neural dynamics. We will also discuss the appropriateness of various techniques for decoding behavioral variables from neural dynamics. Finally, I will provide in-depth examples from the field where all these techniques have been employed.
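As a small illustration of the decoding theme, entirely on synthetic data (not an example from the lectures): population activity generated from low-dimensional latent dynamics, reduced with PCA, then a behavioral variable decoded by linear regression:

```python
# Minimal sketch: population activity built from low-dimensional latent
# dynamics, reduced with PCA, and a behavioral variable decoded by linear
# regression. Entirely synthetic data, not from the lectures.
import numpy as np

rng = np.random.default_rng(0)
T, N, D = 500, 100, 2
t = np.linspace(0, 10, T)
latent = np.stack([np.sin(t), np.cos(0.5 * t)], axis=1)   # 2-D latent dynamics
mixing = rng.normal(size=(D, N))                          # embedding into N neurons
activity = latent @ mixing + 0.1 * rng.normal(size=(T, N))
behavior = latent[:, 0]                                   # variable to decode

# PCA via SVD of the mean-centered activity.
Xc = activity - activity.mean(axis=0)
U, Sv, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:D].T                                    # top-D projections
print("variance explained by 2 PCs:", ((Sv[:D]**2).sum() / (Sv**2).sum()).round(3))

# Least-squares linear decoder from the low-D scores (with intercept).
X = np.column_stack([scores, np.ones(T)])
w, *_ = np.linalg.lstsq(X, behavior, rcond=None)
print("decoding MSE:", np.mean((X @ w - behavior) ** 2))
```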

Relevant literature:

  • Towards the neural population doctrine. Saxena & Cunningham 2019
  • A dynamical systems perspective on flexible motor timing. Remington et al 2018
  • Dimensionality reduction for large-scale neural recordings. Cunningham & Yu 2014
  • Recurrent neural networks as versatile tools of neuroscience research. Barak 2017
  • Low distortion local eigenmaps. Kohli, Cloninger, Mishne 2021

Thursday, June 23

Shiro Ikeda

Title: Collaboration of data science and astronomy: Imaging black hole shadow with the Event Horizon Telescope

In April 2019, the EHTC (Event Horizon Telescope Collaboration) released the first image of the M87 black hole shadow, and in May this year, the black hole shadow image of our Milky Way galaxy was released. The EHTC has more than 300 members from different backgrounds and countries. I have been involved in this project as a data scientist for more than 8 years and collaborated with EHTC members to develop a new imaging method. The EHT is a huge very long baseline interferometer (VLBI), which differs from optical telescopes in that a great deal of computation is required to obtain a single image. Black hole imaging is also very interesting from the data-science viewpoint. In this talk, I will explain how the new imaging technique was developed and how the final images were created through our discussion.

Friday, June 24

Leenoy Meshulam

Title: Theoretical approaches to modeling the activity of (very) large neuronal populations

Recent technological progress has dramatically increased our access to the neural activity underlying memory-related tasks. These complex high-dimensional data call for theories that allow us to identify signatures of collective activity in the networks that are crucial for the emergence of cognitive functions. I will focus on neural activity in dorsal hippocampus as a mouse runs along a virtual linear track. One of the dominant features of these data is the activity of place cells, which fire when the animal visits particular locations. During the first stage of our work we used a maximum entropy framework to characterize the probability distribution of the joint activity patterns observed across ensembles of up to 100 cells. These models make surprisingly accurate predictions for the activity of individual neurons given the state of the rest of the network, and this is true both for place cells and for non-place cells. Next, we shall discuss populations of ~2000 neurons. To address this much larger system, I will show how we use different coarse-graining methods, in the spirit of the renormalization group from statistical physics, to uncover macroscopic features of the network and its scaling behavior. Finally, we will conclude with an informal discussion about how to identify potential future avenues for theory in neuroscience, and whether we should hope for simplification in a system as complex as the brain.
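For readers unfamiliar with the model class, here is a minimal sketch of the pairwise maximum-entropy (Ising-like) family used in the first stage, sampled by Metropolis; the fields and couplings are random placeholders, and the hard part, fitting them to recorded data, is omitted:

```python
# Minimal sketch of the pairwise maximum-entropy (Ising-like) model family:
# P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) over binary patterns s,
# sampled by Metropolis. Fields h and couplings J are random placeholders;
# fitting them to data is the hard part and is omitted here.
import numpy as np

rng = np.random.default_rng(0)
N = 20
h = rng.normal(-1.0, 0.3, N)              # biases toward silence
J = rng.normal(0, 0.1, (N, N))
J = np.triu(J, 1) + np.triu(J, 1).T       # symmetric couplings, zero diagonal

def energy(s):
    return -(h @ s) - 0.5 * s @ J @ s

s = (rng.random(N) < 0.5).astype(float)   # initial binary activity pattern
samples = []
for step in range(20000):
    i = rng.integers(N)
    s_new = s.copy()
    s_new[i] = 1.0 - s_new[i]             # propose flipping one cell
    if rng.random() < np.exp(energy(s) - energy(s_new)):  # Metropolis accept
        s = s_new
    if step > 5000 and step % 10 == 0:    # collect after burn-in
        samples.append(s.copy())

samples = np.array(samples)
print("mean firing probability, first 5 cells:",
      samples.mean(axis=0).round(2)[:5], "...")
```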

Gaute Einevoll

Title: Modelling electric brain signals

Measurements of electric potentials from neural activity have played a key role in neuroscience for almost a century. Simulation of neural activity is an important tool for understanding such measurements and for making a quantitative link between what the neural network is doing and what is measured. Volume conductor (VC) theory is used to compute extracellular electric potentials surrounding neurons, such as extracellular spikes, MUA, LFP, ECoG and EEG, and also magnetic signals such as MEG. In the lecture, the foundations of VC theory are outlined. Furthermore, examples are provided of how the theory is applied to compute spike, LFP, EEG and MEG signals generated by neurons and neuronal populations.
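In its simplest setting, VC theory reduces to the point-source formula phi(r) = sum_k I_k / (4*pi*sigma*|r - r_k|) for an infinite homogeneous medium. A minimal sketch with illustrative values (a crude two-source, dipole-like configuration):

```python
# Minimal sketch of the core volume-conductor formula: the extracellular
# potential of point current sources in an infinite homogeneous medium,
# phi(r) = sum_k I_k / (4*pi*sigma*|r - r_k|). Values are illustrative.
import numpy as np

sigma = 0.3                                   # extracellular conductivity (S/m)
# A crude dipole: current sink and source 100 um apart along z (positions in m).
src_pos = np.array([[0.0, 0.0, 0.0],
                    [0.0, 0.0, 100e-6]])
src_I = np.array([-1e-9, 1e-9])               # transmembrane currents (A)

def phi(r):
    d = np.linalg.norm(src_pos - r, axis=1)   # distance to each source
    return np.sum(src_I / (4.0 * np.pi * sigma * d))

electrode = np.array([50e-6, 0.0, 50e-6])     # recording point ~50 um away
print("potential at electrode: %.3e V" % phi(electrode))
```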

Saturday, June 25

Michael Berry II

Title: Neural Coding

OUTLINE
1. Fundamentals of Neural Coding
  A. Noise & Discrimination
  B. Entropy & Information (sketch below)
  C. Design Principles
2. Population Neural Codes
  A. Correlation & Redundancy
  B. Coding with Clusters of Population Activity
3. Predictive Coding
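As a concrete illustration of item 1B in the outline above, here is a minimal sketch of plug-in estimates of response entropy and stimulus-response mutual information for a synthetic binary neuron; all numbers are illustrative:

```python
# Minimal sketch of entropy and mutual information estimation from a discrete
# joint histogram. The synthetic "neuron" responds more often to stimulus 1;
# all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100000
stim = rng.integers(0, 2, n_trials)                  # binary stimulus
p_spike = np.where(stim == 1, 0.8, 0.2)              # stimulus-dependent firing
resp = (rng.random(n_trials) < p_spike).astype(int)  # binary response

joint = np.zeros((2, 2))
np.add.at(joint, (stim, resp), 1)                    # 2x2 (stimulus, response) counts
joint /= n_trials
p_s, p_r = joint.sum(axis=1), joint.sum(axis=0)      # marginals

H_resp = -np.sum(p_r * np.log2(p_r))                 # response entropy (bits)
nz = joint > 0
MI = np.sum(joint[nz] * np.log2(joint[nz] / np.outer(p_s, p_r)[nz]))
print(f"H(R) = {H_resp:.3f} bits, I(S;R) = {MI:.3f} bits")
```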

Suggested Reading(s):

1. Newsome 1989: https://www.nature.com/articles/341052a0
2. Laughlin 2001: https://www.sciencedirect.com/science/article/pii/S0959438800002373?casa_token=Ksi7T7pdGJQAAAAA:OrWyRD3PGRMXaJ69gFyz4docBZ543pLGCeZiium9wY1_G0cqRFFI2x6CNq7nu6Hpc2YPx-iF9_sg
3. Berry & Tkacik 2020: https://www.frontiersin.org/articles/10.3389/fncom.2020.00020/full
4. Rao & Ballard 1999: https://www.nature.com/articles/nn0199_79
5. Keller & Mrsic-Flogel 2018: https://www.sciencedirect.com/science/article/pii/S0896627318308572

Brief Bio / Message to Students:
I started with a PhD in experimental physics, studying electronic transport in quantum dots. Then, I switched to neuroscience for a postdoc studying the neural code of the retina. In 1999, I became a professor at Princeton, where I continued studying the retina. In more recent years, I have begun studying the visual cortex. Methodologically, I have combined large-scale recording experiments with sophisticated data analysis and computational models.

Week 3

Monday, June 27

Yukiyasu Kamitani

Title: Brain–DNN representational homology and its applications to brain decoding

Deep neural networks (DNNs) are computational models inspired by the functions of neurons and their networks. They have been studied as general-purpose machine learning models apart from neuroscience research. However, recent studies have recast the understanding of the relationship between the brain and DNNs. By optimizing a DNN using large-scale image data, feature representations with different complexity levels automatically emerge at each layer. These features can be quantitatively correlated with characteristic representations found in monkey and human visual cortex. This lecture gives a general overview of how DNNs were "inspired" by neuroscience and explains methods for analyzing information representations in the brain and DNNs. We discuss how brain activity can be converted into DNN signals and then used to reconstruct perceptual and subjective experiences.
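As a schematic of the feature-decoding step, here is a sketch with synthetic data standing in for voxels and DNN features (the general style of analysis, not the authors' actual pipeline): ridge regression mapping brain activity patterns into a DNN feature space:

```python
# Minimal sketch of feature decoding: ridge regression mapping measured brain
# activity (synthetic "voxels" here) into a DNN feature space, in the general
# style of generic object decoding. Entirely synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, n_vox, n_feat = 200, 50, 500, 100

true_map = rng.normal(size=(n_vox, n_feat)) / np.sqrt(n_vox)
X_train = rng.normal(size=(n_train, n_vox))          # brain activity patterns
X_test = rng.normal(size=(n_test, n_vox))
Y_train = X_train @ true_map + 0.5 * rng.normal(size=(n_train, n_feat))
Y_test = X_test @ true_map                           # target DNN features

lam = 10.0                                           # ridge penalty
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_vox),
                    X_train.T @ Y_train)             # closed-form ridge solution
Y_pred = X_test @ W

# Per-image correlation between predicted and true feature vectors.
r = [np.corrcoef(Y_pred[i], Y_test[i])[0, 1] for i in range(n_test)]
print("mean decoding correlation:", np.round(np.mean(r), 3))
```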

Suggested readings:

  • Horikawa, T. & Kamitani, Y. Generic decoding of seen and imagined objects using hierarchical visual features. Nature Communications 8, 15037 (2017). https://doi.org/10.1038/ncomms15037
  • Shen, G., Horikawa, T., Majima, K. & Kamitani, Y. Deep image reconstruction from human brain activity. PLOS Computational Biology 15, e1006633 (2019). https://doi.org/10.1371/journal.pcbi.1006633
  • Nonaka, S., Majima, K., Aoki, S. C. & Kamitani, Y. Brain hierarchy score: Which deep neural networks are hierarchically brain-like? iScience 24, 103013 (2021). https://doi.org/10.1016/j.isci.2021.103013

Tuesday, June 28

Anne Churchland

Title: Decisions, movements and diverse cell types

Understanding how cortical circuits generate complex behavior requires investigating the cell types that comprise them. Functional differences across pyramidal neuron (PyN) types have been observed within cortical areas, but it is not known whether these local differences extend throughout the cortex, nor whether additional differences emerge when larger-scale dynamics are considered. We used genetic and retrograde labeling to target pyramidal tract (PT), intratelencephalic (IT) and corticostriatal projection neurons and measured their cortex-wide activity. Each PyN type drove unique neural dynamics, both at the local and cortex-wide scale. Cortical activity and optogenetic inactivation during an auditory decision task also revealed distinct functional roles: all PyNs in parietal cortex were recruited during perception of the auditory stimulus, but, surprisingly, PT neurons had the largest causal role. In frontal cortex, all PyNs were required for accurate choices but showed distinct choice-tuning. Our results reveal that rich, cell-type-specific cortical dynamics shape perceptual decisions.

Message to participants:

I am very pleased to be coming to Okinawa to talk about my work and hear about recent neuroscience developments in Japan. I am especially excited to speak to students, postdocs, and early-career researchers about their experiments and their vision for the future of the field. The field of neuroscience is at an exciting moment, and discoveries about cells, circuits and behavior are changing our understanding of the brain and our ability to treat disease. I look forward to discussing how we can best move the field forward to build on these discoveries and gain even more insights about healthy and diseased brains.

Wednesday, June 29

Saori C. Tanaka

Title: Understanding human behavior through the neural model of decision and learning

Computational modeling can be an effective way to understand human behavior. In particular, computational modeling of decision and learning behavior has been widely used because the links between such models and the underlying neural mechanisms have been well investigated. First, I will briefly review the history of computational modeling of behavior and give some examples. Second, I will offer detailed examples of such approaches using our recent studies on computational modeling of psychiatric disorders. In this part, I present three steps: constructing the computational model (step 1), simulating behaviors (step 2), and evaluating the models by experiments (step 3). Third, I will discuss some remaining issues in computational modeling of behavior. For example, in human neuroscience experiments, we face heterogeneity among participants and a barrier to accessing neural signals due to non-invasive measurement. A possible solution is a multi-level approach combining large-sample-size/light data and small-sample-size/dense data. I will introduce our recent studies adopting such a multi-level approach.
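Here is a minimal sketch of the three-step workflow on synthetic data, with illustrative parameters: (step 1) a Q-learning choice model, (step 2) simulated behavior, (step 3) evaluation by maximum-likelihood parameter recovery with a simple grid search:

```python
# Minimal sketch of the three-step workflow: (1) a Q-learning choice model,
# (2) simulated behavior, (3) parameter recovery by maximum likelihood
# (simple grid search). All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
p_reward = np.array([0.3, 0.7])                      # true reward probabilities

def simulate(alpha, beta, n=500):                    # step 1 + 2: model and behavior
    Q, choices, rewards = np.zeros(2), [], []
    for _ in range(n):
        p = np.exp(beta * Q) / np.sum(np.exp(beta * Q))
        c = rng.choice(2, p=p)                       # softmax choice
        r = float(rng.random() < p_reward[c])
        Q[c] += alpha * (r - Q[c])                   # value update
        choices.append(c)
        rewards.append(r)
    return np.array(choices), np.array(rewards)

def neg_log_lik(alpha, beta, choices, rewards):      # step 3: model evaluation
    Q, nll = np.zeros(2), 0.0
    for c, r in zip(choices, rewards):
        p = np.exp(beta * Q) / np.sum(np.exp(beta * Q))
        nll -= np.log(p[c])
        Q[c] += alpha * (r - Q[c])
    return nll

choices, rewards = simulate(alpha=0.2, beta=3.0)     # "observed" behavior
grid_a, grid_b = np.linspace(0.05, 0.5, 10), np.linspace(0.5, 6.0, 12)
nll = [[neg_log_lik(a, b, choices, rewards) for b in grid_b] for a in grid_a]
ia, ib = np.unravel_index(np.argmin(nll), (10, 12))
print("recovered alpha, beta:", grid_a[ia].round(2), grid_b[ib].round(2))
```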

Brief Bio / Message to Students:
I received my Ph.D. in information science from the Nara Institute of Science and Technology (NAIST) in 2006, supervised by Kenji Doya. After a postdoctoral fellowship in the lab of John O’Doherty at Caltech, I worked at Osaka University and ATR. Now I run two laboratories, at NAIST and ATR. My research aim is to understand human behavior. For this aim, I use an approach combining non-invasive brain imaging with computational models of decision and learning. I have recently added a data-driven approach using large-scale behavioral and neural data.

I also joined OCNC as a student in 2005, and the network I built at that time greatly influenced my research. I hope OCNC will be a great experience in your research life. I look forward to seeing you in Okinawa!