Flagship brain project to develop software for the future of brain science

Understanding the human brain is one of the greatest challenges facing 21st-century science

This is key to finding effective treatments for brain injuries and disease, but also to technological applications such as more energy-efficient computing and intelligent robots that support our daily lives. Such future technological applications may in turn come to benefit human medicine, as the Human Brain Project (HBP) aptly demonstrates.

NMBU plays key role

HBP is a European Flagship Project that connects the work of over 500 researchers in 19 European countries. Started in 2013, it is one of two Flagship Projects of the European Commission’s Future & Emerging Technologies (FET) programme.

NMBU plays a key role in the project, which represents a huge investment in strategic and collective research and innovation. When the project's first phase was evaluated by a panel of high-level experts, the resulting FET flagship report assigned it a crucial role in the future of neuroscience.

"The Human Brain Project Flagship addresses the development of understandings of the human brain. It also deals with the development of the supporting infrastructures needed by brain researchers to adopt big-science approaches based on the use, for example, of large data sets. The work of the Human Brain Project Flagship is expected to provide profound insights into what makes us human, to enable the building of revolutionary computing technologies, and to provide knowledge that will lead to the development of new treatments for brain disorders..."

A major paradigm change

"The objective of the Human Brain Project Flagship is to develop a federated ICT infrastructure that would become a research e-infrastructure in the future, helping the neuroscience community collect, analyse, share, integrate and model data about the brain with the aim of better understanding the functioning of the human brain and its diseases," according to the report.

"The project will integrate much of our knowledge of the field of neuroscience into mathematical models. It is a major paradigm change, but neuroscience cannot get further without that integration. Both to understand a healthy brain and to cure brain dysfunction in the event of injury or disease, you need the detailed understanding that can only be provided by precise mathematical models," said Hans Ekkehard Plesser, Professor at NMBU's Faculty of Science and Technology.

From the left: Hans Ekkehard Plesser and Gaute Einevoll, both professors at NMBU's Faculty of Science and Technology, and Markus Diesmann, Director of the Institute of Neuroscience and Medicine (INM-6) at Jülich Research Centre.

Photo: Håkon Sparre

As president of the NEST Initiative, Plesser is heavily involved in the development of the NEST neural network simulator, which in 2013 set a world record by simulating a network of 1.86 billion nerve cells connected by over 10 trillion synapses on the K computer, one of the world's largest supercomputers. The simulator has an important role in HBP, and Plesser co-leads the subproject constructing the High Performance Analytics and Computing Platform of the HBP.

The NEST neural network simulator has an important role in the Human Brain Project.

Photo: Håkon Sparre
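Simulators like NEST integrate networks of simple point-neuron models at enormous scale. As an illustration only (this is plain Python, not NEST's actual API, and the parameter values are arbitrary textbook-style numbers), a single leaky integrate-and-fire neuron of the kind such simulators handle by the billions can be sketched in a few lines:

```python
# Illustrative sketch only: one leaky integrate-and-fire (LIF) neuron,
# the class of point-neuron model that simulators like NEST integrate
# for billions of cells in parallel. All parameter values are arbitrary
# textbook-style examples, not values from the Human Brain Project.

def simulate_lif(i_ext=380.0, t_sim=200.0, dt=0.1):
    """Euler-integrate one LIF neuron; return its spike times (ms)."""
    tau_m, c_m = 10.0, 250.0                     # time constant (ms), capacitance (pF)
    v_rest, v_th, v_reset = -70.0, -55.0, -70.0  # rest, threshold, reset (mV)
    v = v_rest
    spikes = []
    for step in range(int(t_sim / dt)):
        # Membrane equation: dV/dt = (v_rest - V)/tau_m + I/C_m
        v += dt * ((v_rest - v) / tau_m + i_ext / c_m)
        if v >= v_th:            # threshold crossing -> record spike, reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

spikes = simulate_lif()
print(f"{len(spikes)} spikes in 200 ms")
```

A real NEST simulation replaces this toy loop with highly parallel solvers and adds the synaptic connections between neurons, which is where the 10-trillion-link scale comes from.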

So why exactly do we need such simulators and mathematical models to understand the brain better?

"In broad terms one could argue that we now have a fairly good understanding of how individual neurons operate and process information, but that the behaviour of networks of such neurons is poorly understood. Our project can be likened to Isaac Newton's development of a new branch of mathematics in the late seventeenth century," said Gaute Einevoll, Professor at NMBU's Faculty of Science and Technology, and continued:

Key for making scientific progress

"Newton needed to develop a type of mathematics called calculus to check whether his proposed law of gravitation, describing how masses such as planets attract each other, was correct. With calculus he could compute the planetary orbits in his model and check that the predictions of his theory agreed with experimental observations. With the computing infrastructure developed within HBP, we can correspondingly check whether our candidate network models give predictions in agreement with brain measurements. This workflow is key to making scientific progress."

His task in the HBP is to construct biophysical equations and computer simulation tools that relate the activity of individual nerve cells to electrical brain signals such as those measured at the scalp in electroencephalography (EEG) recordings, an electrophysiological method for recording the electrical activity of the brain.

A project that requires big cultural change

"The Human Brain Project promises to create a European ICT infrastructure for brain research. We have made big steps forward, but we have also experienced that it is challenging to bring this about because it requires a big cultural change. Most neuroscientists are trained in biology or medicine. Biology is built on observation, classification and qualitative models, and mathematical modelling and physics are still somewhat foreign to the field," he explained. 

A number of advanced measuring instruments that allow brain processes to be studied from the molecular level up to whole human brains have been developed, but mathematical modelling has so far been used only to a limited extent to turn this experimental data into a unified understanding of the brain.

"The most interesting thing is that HBP is introducing a new culture where we have to learn to collaborate with many people, a thing alien to neuroscientists," said Markus Diesmann, Director of the Institute of Neuroscience and Medicine (INM-6) at Jülich Research Centre, and in charge of a work package on the construction of models covering multiple brain areas at cellular resolution in HBP.

New research infrastructure for brain research

According to the FET report mentioned above: "In March 2016, the Human Brain Project announced the launch of six initial versions of its ICT Platforms, which are the core of the emerging Human Brain Project research infrastructure for brain research.

"The Platforms embody the key objectives of the Human Brain Project, to gather and disseminate data describing the brain, to simulate and build models of the brain, to develop brain-inspired computing and robotics, and to create a global scientific community around the developing research infrastructure.

"The Platforms consist of prototype hardware, software tools, databases, programming interfaces, and initial data-sets, which will be refined and expanded on an on-going basis in close collaboration with end users. The development of the Platforms has been the result of an extensive multidisciplinary effort involving more than 750 scientific collaborators and engineers from 114 institutions in 24 countries.

Will increase reproducibility of research

"The Platforms will enable new kinds of collaborative research to be performed in neuroscience, medicine and computing. The prototype tools, hardware systems and initial data sets are designed to enable faster and more efficient research techniques in, for example, modelling, in silico experimentation, or data analysis."

"An important aspect of the 'efficient research techniques' mentioned here is the increase in the reproducibility of research through digitized workflows. The data analysis and modelling steps of neuroscience have become so complex that it is hard to repeat the work of others. But this reproducibility is essential to the scientific method," said Diesmann.
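One small, concrete ingredient of such digitized workflows is pinning every source of randomness so that a stochastic analysis step can be repeated exactly. A minimal sketch (the analysis step is hypothetical and the numbers illustrative):

```python
import random

# Sketch: a stochastic analysis step made repeatable by recording its seed.
# Pinning seeds (alongside code versions and parameters) is one small
# ingredient of the digitized, reproducible workflows described above.

def noisy_mean(seed, n=1000):
    """Mean of n Gaussian samples from a seeded, isolated RNG."""
    rng = random.Random(seed)  # local RNG, untouched by global state
    return sum(rng.gauss(0.0, 1.0) for _ in range(n)) / n

# The same recorded seed reproduces the same result bit for bit.
print(noisy_mean(42) == noisy_mean(42))
```

Full workflow systems extend the same idea to data provenance: every result carries enough metadata for someone else to regenerate it.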

However, most of the data sets HBP researchers work with today are so big that you could not possibly store them on your personal computer.

The research generates huge amounts of data that currently need to be stored on highly specialized supercomputers, and the researchers are eagerly preparing their simulation codes and analysis tools for the arrival of exascale computers, expected to reach the market around 2022.

Preparing for a new generation of computers

"Exascale computing refers to computing systems capable of at least one exaFLOPS, or a billion billion (quintillion) calculations per second. Such capacity represents a thousandfold increase over the first petascale computer that came into operation in 2008," according to Wikipedia.

Future exascale computers will exceed the performance of today's high-end supercomputers ten- to a hundredfold and will for the first time give researchers the computing power needed to simulate neuronal networks on the scale of the human brain.
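The figures quoted above are easy to check as back-of-the-envelope arithmetic:

```python
# Back-of-the-envelope check of the figures quoted above.
EXA = 10**18   # one exaFLOPS: a billion billion calculations per second
PETA = 10**15  # one petaFLOPS: the petascale milestone reached in 2008

print(EXA // PETA)  # thousandfold increase over petascale, as the quote states
```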

However, the ultimate dream is that such computer capacity will eventually become the norm and not the exception. Interestingly, to achieve this goal computers may need to become more like our brains with their billions of nerve cells all operating in parallel.

"My dream is to be able to go to my cabin in the mountains, and if I get an idea for how brain networks might operate while I'm out skiing, be able to pop back to the cabin and log in to the research infrastructure on my laptop and test the idea," said Einevoll. 

NMBU is involved in several of the 12 subprojects (SPs) of HBP, including

  • SP4 Theoretical Neuroscience,
  • SP7 High Performance Analytics and Computing Platform, and
  • SP11 Central Services (Education), as well as
  • Co-Design Projects. 
Published 5 April 2019, 9:40. Updated 5 April 2019, 10:04.