BLUE BRAIN PROJECT
Henry Markram, Project Director of the Blue Brain Project, Director of the Center for Neuroscience & Technology and co-Director of EPFL's Brain Mind Institute, obtained his B.Sc. (Hons) from the University of Cape Town, South Africa, under the supervision of Rodney Douglas, and his Ph.D. from the Weizmann Institute of Science, Israel, under the supervision of Menahem Segal. During his Ph.D. he discovered a link between acetylcholine and memory mechanisms by showing that acetylcholine modulates the primary receptor linked to synaptic plasticity.
He went to the USA as a Fulbright Scholar at the National Institutes of Health (NIH), where he studied ion channels on synaptic vesicles. He then went as a Minerva Fellow to the Laboratory of Bert Sakmann at the Max Planck Institute, Heidelberg, Germany, where he discovered calcium transients in dendrites evoked by sub-threshold activity, and by single action potentials propagating back into dendrites. He also began studying the connectivity between neurons, describing in great detail how layer 5 pyramidal neurons are interconnected.
He was the first to alter the precise millisecond-scale relative timing of single pre- and post-synaptic action potentials, revealing a highly precise learning mechanism operating between neurons -- now reproduced in many brain regions and known as spike timing-dependent synaptic plasticity (STDP). These experiments were carried out in 1993, four years before publication. Although correlation-sensitive plasticity had been reported earlier, this was the first study to manipulate single pre- and post-synaptic spike times and monitor the effect on synaptic strength. He was appointed assistant professor at the Weizmann Institute of Science, Israel, where he began systematically dissecting the neocortical column. He discovered that synaptic learning can also involve a change in synaptic dynamics (called redistribution of synaptic efficacy) rather than merely a change in connection strengths. He also revealed a spectrum of new principles governing neocortical microcircuit structure, function, and emergent dynamics. Based on the emergent dynamics of the neocortical microcircuit, he and Wolfgang Maass developed the theory of liquid computing, or high-entropy computing.
In 2002 he moved to EPFL as full professor and founder/director of the Brain Mind Institute and Director of the Center for Neuroscience and Technology. At the BMI, in the Laboratory for Neural Microcircuitry, Markram has continued to unravel the blueprint of the neocortical column, building state-of-the-art tools to carry out multi-neuron patch clamp recordings combined with laser and electrical stimulation, as well as multi-site electrical recording, chemical imaging and gene expression. Markram has received numerous awards and published over 75 papers.
About the Blue Brain Project
The cerebral cortex, the convoluted "grey matter" that makes up 80% of the human brain, is responsible for our ability to remember, think, reflect, empathize, communicate, adapt to new situations and plan for the future. The cortex first appeared in mammals, and it has a fundamentally simple repetitive structure that is the same across all mammalian species.
The brain is populated with billions of neurons, each connected to thousands of its neighbors by dendrites and axons, a kind of biological "wiring". The brain processes information by sending electrical signals from neuron to neuron along these wires. In the cortex, neurons are organized into basic functional units, cylindrical volumes 0.5 mm wide by 2 mm high, each containing about 10,000 neurons that are connected in an intricate but consistent way. These units operate much like microcircuits in a computer. This microcircuit, known as the neocortical column (NCC), is repeated millions of times across the cortex. The difference between the brain of a mouse and the brain of a human is basically just volume - humans have many more neocortical columns and thus neurons than mice.
This structure lends itself to a systematic modeling approach. And indeed, the first step of the Blue Brain project is to re-create this fundamental microcircuit, down to the level of biologically accurate individual neurons. The microcircuit can then be used in simulations.
What the Blue Brain Project is not
The Blue Brain Project is an attempt to reverse engineer the brain, to explore how it functions and to serve as a tool for neuroscientists and medical researchers. It is not an attempt to create a brain. It is not an artificial intelligence project. Although we may one day achieve insights into the basic nature of intelligence and consciousness using this tool, the Blue Brain Project is focused on creating a physiological simulation for biomedical applications.
Building the microcircuit
Neurons are not all alike - they come in a variety of complex shapes. The precise shape and structure of a neuron influences its electrical properties and connectivity with other neurons. A neuron's electrical properties are determined to a large extent by a variety of ion channels distributed in varying densities throughout the cell's membrane. Scientists have been collecting data on neuron morphology and electrical behavior of the juvenile rat in the laboratory for many years, and this data is used as the basis for a model that is run on the Blue Gene to recreate each of the 10,000 neurons in the NCC.
To model the neocortical column, it is essential to understand the composition, density and distribution of the numerous cortical cell types. Each class of cells is present in specific layers of the column. The precise density of each cell type and the volume of the space it occupies provides essential information for cell positioning and constructing the foundation of the cortical circuit. Each neuron is connected to thousands of its neighbors at points where their dendrites or axons touch, known as synapses. In a column with 10,000 neurons, this translates into trillions of possible connections. The Blue Gene is used in this extremely computationally intensive calculation to fix the synapse locations, "jiggling" individual neurons in 3D space to find the optimal connection scenario.
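As an illustrative sketch of the underlying touch-detection idea (not the project's actual algorithm), candidate synapse locations can be found by checking where points along one neuron's axon come within a small distance of points along another neuron's dendrites. The function name, brute-force approach, and toy coordinates below are assumptions for illustration only:

```python
import numpy as np

def touch_candidates(axon_pts, dend_pts, radius=1.0):
    """Return index pairs (i, j) where axon point i lies within `radius`
    micrometres of dendrite point j -- candidate synapse sites.
    Brute-force O(N*M); a real pipeline would use spatial hashing or octrees."""
    diffs = axon_pts[:, None, :] - dend_pts[None, :, :]  # all pairwise offsets
    dists = np.linalg.norm(diffs, axis=2)                # pairwise distances
    return np.argwhere(dists < radius)

# Toy example: two short neurite point clouds (coordinates in micrometres)
axon = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
dend = np.array([[5.2, 0.3, 0.0], [50.0, 50.0, 50.0]])
pairs = touch_candidates(axon, dend, radius=1.0)
print(pairs)  # axon point 1 is within 1 um of dendrite point 0
```

With full morphologies for 10,000 neurons, the same all-pairs proximity test explodes combinatorially, which is why this step requires a supercomputer.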
Modeling the column
The result of all these calculations is a re-creation, at the cellular level, of the neocortical column, the basic microcircuit of the brain. In this case, it's the cortical column of a juvenile rat. This is the only biologically accurate replica to date of the NCC - the neurons are biologically realistic and their connectivity is optimized. This would be impossible without the huge computational capacity of the Blue Gene. A model of the NCC was completed at the end of 2006.
In November 2007, the Blue Brain Project officially announced the conclusion of Phase I of the project, with three specific achievements:
1. A new modeling framework for automatic, on-demand construction of neural circuits built from biological data
2. A new simulation and calibration process that automatically and systematically analyzes the biological accuracy and consistency of each revision of the model
3. The first cellular-level neocortical column model built entirely from biological data that can now serve as a key tool for simulation-based research
Simulating the microcircuit
Once the microcircuit is built, the exciting work of making the circuit function can begin. All 8,192 processors of the Blue Gene are pressed into service in a massively parallel computation, solving the complex mathematical equations that govern the electrical activity in each neuron when a stimulus is applied. As electrical impulses travel from neuron to neuron, the results are exchanged between processors using the Message Passing Interface (MPI). Currently, simulating the circuit takes about two orders of magnitude longer than the biological time being simulated. The Blue Brain team is working to streamline the computation so that the circuit can run in real time, meaning that one second of biological activity is simulated in one second of computation.
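In the project itself, each neuron is a detailed multi-compartment model; as a much-simplified illustration of the kind of per-timestep numerical integration involved, here is a minimal sketch of a single-compartment leaky integrate-and-fire neuron (the function name and parameter values are hypothetical choices for illustration, not Blue Brain's):

```python
import numpy as np

def simulate_lif(i_ext, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Leaky integrate-and-fire: tau * dV/dt = -(V - v_rest) + R*I.
    Forward-Euler integration; returns the voltage trace (mV) and
    spike times (ms). Times in ms, currents in nA, resistance in Mohm."""
    v = v_rest
    trace, spikes = [], []
    for step, i in enumerate(i_ext):
        v += dt * (-(v - v_rest) + r_m * i) / tau
        if v >= v_thresh:            # threshold crossed: record a spike
            spikes.append(step * dt)
            v = v_reset              # and reset the membrane potential
        trace.append(v)
    return np.array(trace), spikes

# 100 ms of constant 2 nA input drives the cell above threshold repeatedly
current = np.full(1000, 2.0)
_, spike_times = simulate_lif(current)
print(len(spike_times))  # number of spikes fired in 100 ms
```

A detailed compartmental model replaces this single equation with thousands of coupled equations per neuron, plus ion-channel kinetics, which is what makes the full-column simulation so computationally demanding.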
Interpreting the results
Running the Blue Brain simulation generates huge amounts of data. Analyses of individual neurons must be repeated thousands of times, and analyses of network activity must handle data that easily reaches hundreds of gigabytes per second of simulation. Using massively parallel computers, the data can be analyzed where it is created (server-side analysis for experimental data, online analysis during simulation).
Given the geometric complexity of the column, a visual exploration of the circuit is an important part of the analysis. Mapping the simulation data onto the morphology is invaluable for immediate verification of single-cell activity as well as network phenomena. Architects at EPFL have worked with the Blue Brain developers to design a visualization interface that translates the Blue Gene data into a 3D visual representation of the column. A separate supercomputer is used for this computationally intensive task. Visualizing the neurons' shapes is challenging: a column of 10,000 neurons rendered as a high-quality mesh (see picture) amounts to roughly 1 billion triangles, requiring about 100 GB of management data. Simulation data at the resolution of individual electrical compartments for each neuron accounts for another 150 GB. As an electrical impulse travels through the column, neurons light up and change color as they become electrically active. See some clips of the column in action.
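As a toy illustration of how such a voltage-to-color mapping might work (the function and color scale below are hypothetical, not the project's actual rendering code), a membrane voltage can be clamped to a fixed range and linearly mapped to a color, so active neurons stand out:

```python
def voltage_to_rgb(v_mv, v_min=-70.0, v_max=40.0):
    """Map a membrane voltage (mV) to an RGB triple: blue at rest,
    red at the peak of an action potential. Values outside the
    [v_min, v_max] range are clamped."""
    t = max(0.0, min(1.0, (v_mv - v_min) / (v_max - v_min)))
    return (int(255 * t), 0, int(255 * (1.0 - t)))

print(voltage_to_rgb(-70.0))  # resting potential -> pure blue (0, 0, 255)
print(voltage_to_rgb(40.0))   # spike peak -> pure red (255, 0, 0)
```

In the real visualization this mapping is applied per electrical compartment, across every neuron in the column, for every frame of the animation.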
A visual interface makes it possible to quickly identify areas of interest that can then be studied more extensively using further simulations. A visual representation can also be used to compare the simulation results with experiments that show electrical activity in the brain. This calibration - comparing the functioning of the Blue Brain circuit with experiment, improving and fine-tuning it - is the second stage of the Blue Brain project, expected to be complete by the end of 2007.
What's next for the Blue Brain Project?
Phase I marks the completion of a proof-of-principle simulation-based research process that has resulted in a cellular-level model of the neocortical column. We have achieved biological fidelity such that the model itself now serves as a primary tool for evaluating the consistency and relevance of neurobiological data, while providing guidance for new experimental efforts. These new data will serve to further refine the neocortical column model. The assembled process allows neuroscientists to investigate scientific questions by integrating the available experimental data and evaluating hypotheses of network dynamics and neural function.
The completion of Phase I now provides the basis for increasing the resolution of the models down to the molecular level and expanding their size toward the whole brains of mammals.
In the future, information from the molecular and genetic level will be added to the algorithms that generate the individual neurons and their connections, and thus this level of detail will be reflected in the circuit's construction. The simulations can be used to explore what happens when this molecular or genetic information is altered -- situations such as a genetic variation in particular neurotransmitters, or what happens when the molecular environment is altered via drugs.
The project will continue to expand and will necessarily involve additional scientists and research groups from around the world.