"Disease is too complex to just think your way through it," says Raimond Winslow, director of The Institute for Computational Medicine at Johns Hopkins. "We can no longer work with what I call purely mental models of how biological systems function in either health or disease."
Thankfully, we have technology to lend a hand.
[See also: Johns Hopkins physics lab brings systems engineering to bear on ICU safety]
The burgeoning and highly complex field of computational medicine is showing promise for the treatment of illnesses such as Alzheimer's, heart disease, cancer and more, as technology and troves of data are harnessed to investigate the underpinnings and map the progression of diseases.
Technological advancements have precipitated a significant leap forward for the discipline over the past decade or so, but it still has a long way to go before realizing its true potential.
"Computational medicine is a discipline where we try to develop experimentally based computer models of disease, so we can very quantitatively understand what disease is, what affects disease, and then try to model therapeutic interventions," Winslow explains.
He likens it to "the way a flight simulator has a computer model of how the airplane behaves. Intentionally, in this case, something goes wrong in the flight simulator, and the pilots, aka the physicians, make an intervention, and the simulator responds the way the plane would respond.
"The analogy is: we have a model of the plane, a/k/a the disease, and we have the people who are making interventions when something goes wrong – the physicians – and learning how to correct for whatever goes wrong," he says.
The Institute for Computational Medicine was founded in 2005 as a partnership between Johns Hopkins' Whiting School of Engineering and its School of Medicine. But the techniques central to computational medicine have been around for more than 50 years.
"Probably the oldest discipline in which this kind of approach has been used is cardiovascular science," says Winslow. It began in 1960, with work by Oxford University biologist Denis Noble. "He published first electrical model of how the cardiac myocyte generates its electrical activity, which leads to the contraction of the heart. He did it for a single cell. But that was the beginning of modeling cardiac muscle cells and trying to understand how they function in health and in disease."
Nowadays, says Winslow, "computational models of heart disease are being developed, from the molecular level to the cellular level to the whole-heart level."
Those models, of course, are central to the growing field of personalized medicine.
Cardiac care is just one of the fields that stands to gain from computational medicine.
"Models have began to appear in other disciplines, over the past 10 years or so," says Winslow. "Modeling has been done on lung disease, cancer, certain types of brain diseases. Increasingly, these models are being tailored to the individual and used to guide selection of therapy to treat the disease."
The research, he says, "has truly taken off over the last 10 years or so." Biological knowledge is expanding. Experimental tools are improving. "There's a lot of data to feed quantitative models."
And, of course, "The ever-expanding power of computing is making it possible to simulate very large scale systems."
Winslow has described the intricate interplay of genetic material, proteins, cells and bodily organs as something like a hugely complex jigsaw puzzle. Advances in computational models have equipped biologists with powerful ways to make sense of the microscopic mechanisms of disease – and offered new opportunities to test out treatments based on that knowledge.
Some examples of recent research include models that are helping scientists understand how networks of molecules are implicated in cancer – helping them predict which people might be most at risk – and a field called computational physiological medicine, which uses computational modeling to show how biological systems shift from a healthy to an unhealthy state.
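To make the risk-prediction idea concrete, here is a deliberately simplified sketch that fits an off-the-shelf logistic-regression classifier from scikit-learn to made-up "molecular" features. Real molecular-network models are far richer than this; every name and number below is an assumption for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "molecular" features for 200 people (e.g., expression levels of 5 markers).
# Entirely made-up data: a real study would use measured omics profiles and outcomes.
X = rng.normal(size=(200, 5))
risk_signal = X[:, 0] - 0.5 * X[:, 2]            # pretend two markers drive risk
y = (risk_signal + rng.normal(scale=0.5, size=200)) > 0.5

# Fit a simple classifier that maps the molecular profile to a risk estimate.
model = LogisticRegression().fit(X, y)

# Estimate risk for a new, hypothetical patient profile.
new_patient = rng.normal(size=(1, 5))
print(f"estimated probability of high risk: {model.predict_proba(new_patient)[0, 1]:.2f}")
```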
Computational anatomy uses imaging technology to look for changes in the shape of certain structures in the brain – changes that could indicate Alzheimer’s disease or schizophrenia.
That is "one area of computational medicine that I think is closest to truly meaningful, large-scale, clinical application," says Winslow.
Already, he says, it's at the point where, by looking at the structure of the hippocampus, researchers could say a patient has Alzheimer's, or is in the early stages, or has indications that portend a significant progression of the disease.
The research – which could similarly be a boon for treating Parkinson's disease, or Tourette's syndrome – is "very close to being delivered and used in the clinic, precisely because it's proving to have such refined diagnostic capabilities," says Winslow.
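As a toy illustration of the volumetric comparison underlying such structural measurements, the sketch below computes the volume of a synthetic "hippocampus" mask at two time points. Actual computational-anatomy pipelines rely on much more sophisticated shape analysis of segmented brain MRI; the ellipsoid mask, voxel size and simulated shrinkage here are all assumptions.

```python
import numpy as np

def structure_volume(mask: np.ndarray, voxel_dims_mm=(1.0, 1.0, 1.0)) -> float:
    """Return the volume (in cubic millimetres) of a binary segmentation mask."""
    voxel_volume = float(np.prod(voxel_dims_mm))
    return float(mask.sum()) * voxel_volume

def ellipsoid_mask(semi_axes, shape=(64, 64, 64)):
    """Binary mask of an axis-aligned ellipsoid centred in the volume.

    Used here as a crude stand-in for a segmented brain structure.
    """
    zz, yy, xx = np.mgrid[0:shape[0], 0:shape[1], 0:shape[2]]
    cz, cy, cx = (s / 2 for s in shape)
    az, ay, ax = semi_axes
    return ((zz - cz) / az) ** 2 + ((yy - cy) / ay) ** 2 + ((xx - cx) / ax) ** 2 <= 1.0

# Compare a "baseline" structure with a slightly smaller one at "follow-up"
# to mimic the kind of shrinkage such studies look for.
baseline = structure_volume(ellipsoid_mask((8.0, 10.0, 20.0)))
follow_up = structure_volume(ellipsoid_mask((7.5, 9.5, 19.0)))
print(f"baseline: {baseline:.0f} mm^3, follow-up: {follow_up:.0f} mm^3, "
      f"change: {100 * (follow_up - baseline) / baseline:+.1f}%")
```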
Despite these near-term advances, however, computational medicine is "definitely not being used to its full potential," he adds.
Still, Winslow is hopeful. "We should all understand that this is a slow process," he says. "We're at the very beginnings of computational medicine. Constructing these models is difficult. It relies on data that is difficult to get as well."
And the advances are coming faster every day. "As we develop better technologies for measuring what's going on in the body," says Winslow, "with the emerging power of genomics, it's likely that these new kinds of data that we can measure in every patient are going to be really valuable in helping to constrain models for that patient's illness."