Research rundown: using supercomputers to personalize cancer treatment

Mar 15, 2018

The claim

A mathematical model that predicts how cancers grow and how they will respond to radiotherapy has been developed using supercomputers in Texas.

The background

While the cancer treatment community has huge quantities of data, there is little understanding of the laws that govern the disease.

Researchers argue that mathematical models could hold the power to decode the formulas that dictate how cancer behaves. Such an understanding would allow healthcare teams to better personalize cancer therapy and reduce unnecessary exposure to expensive and potentially harmful treatments.

Different mathematical models of tumor growth have been proposed over the years, but determining which is most accurate at predicting cancer progression is a challenge. Uncertainties in observational data, the complexity of biological systems and the hundreds of different types and sub-types of cancer all contribute to the difficulties.
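Classic candidates among such models include the logistic and Gompertz equations. The article does not name the specific models the team compared, so the following is an illustrative sketch with arbitrary parameter values, not the study's code:

```python
import math

def logistic(v0, r, k, t):
    """Logistic growth: volume rises from v0 and saturates at carrying capacity k."""
    return k / (1 + (k / v0 - 1) * math.exp(-r * t))

def gompertz(v0, r, k, t):
    """Gompertz growth: also saturates at k, but with a different approach curve."""
    return k * math.exp(math.log(v0 / k) * math.exp(-r * t))

# Both models start at v0 = 1 and plateau near k = 100, yet they disagree
# at intermediate times -- which is why selecting among them requires
# testing predictions against observed data.
times = range(0, 30, 5)
logistic_curve = [logistic(1.0, 0.5, 100.0, t) for t in times]
gompertz_curve = [gompertz(1.0, 0.5, 100.0, t) for t in times]
```

Because the curves diverge mid-course while agreeing at the start and the plateau, early measurements alone cannot distinguish them, illustrating the selection challenge described above.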

Researchers are now turning to supercomputers, whose ability to undertake extremely complicated mathematical calculations holds great potential in this area.

The method

A team from the Institute for Computational Engineering and Sciences’ (ICES) Center for Computational Oncology has developed a method of creating personalized tumor growth predictions using supercomputers at the Texas Advanced Computing Center (TACC).

First, they used a study of cancer in rats to test 13 existing leading tumor growth models to determine which could best predict factors associated with survival, as well as the effects of various therapies.

By implementing the principle of Occam’s razor, i.e., that the simplest explanation is the best, they developed the Occam Plausibility Algorithm. This selects the most plausible model for each given dataset, and then determines whether the selected model is a valid prediction tool.
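The selection-then-validation logic can be sketched in miniature. This is a toy stand-in, not the published algorithm: the real Occam Plausibility Algorithm uses Bayesian plausibility measures, whereas here "plausible" simply means the fit error is under a tolerance, with the fewest-parameter model preferred among plausible candidates and a held-out dataset used for validation:

```python
import math

def sum_sq_error(model, params, data):
    """Sum of squared differences between model predictions and observations."""
    return sum((model(t, *params) - v) ** 2 for t, v in data)

def select_simplest(candidates, data, tol):
    """candidates: list of (name, model_fn, fitted_params) tuples.
    Keep models whose fit error is within tol, then apply Occam's razor:
    among plausible models, the one with the fewest parameters wins."""
    plausible = [c for c in candidates if sum_sq_error(c[1], c[2], data) <= tol]
    return min(plausible, key=lambda c: len(c[2]), default=None)

def is_valid(model, params, holdout, tol):
    """Validation step: the chosen model must also predict unseen data."""
    return sum_sq_error(model, params, holdout) <= tol

# Toy demo: a 1-parameter exponential vs. a 2-parameter quadratic,
# fitted to data generated by exponential growth (values are made up).
exp_model = lambda t, r: math.exp(r * t)
quad_model = lambda t, a, b: 1 + a * t + b * t ** 2
data = [(t, math.exp(0.3 * t)) for t in range(5)]
best = select_simplest([("exp", exp_model, (0.3,)),
                        ("quad", quad_model, (0.35, 0.04))], data, tol=0.5)
```

Here both models fit the calibration data acceptably, so the simpler one-parameter exponential is selected and can then be checked against held-out observations.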

The researchers have also developed a series of equations that capture every factor involved in tumor response, from the speed at which chemotherapeutic drugs reach the tissue to the degree to which cells signal each other to grow.

All this information is then combined with patient-specific data taken from magnetic resonance imaging (MRI), positron emission tomography (PET), X-ray computed tomography (CT) and biopsies. The result is an individual prediction of biological behavior at the tissue, cellular and cell-signalling level.

“This is one of the most complicated projects in computational science. But you can do anything with a supercomputer,” ICES Director J. Tinsley Oden said. “There’s a cascading list of models at different scales that talk to each other. Ultimately, we’re going to need to learn to calibrate each and compute their interactions with each other.”

The results

“We have examples where we can gather data from lab animals or human subjects and make startlingly accurate depictions about the growth of cancer and the reaction to various therapies, like radiation and chemotherapy,” Oden said.

One trial found the method was able to predict tumor growth in rats to within 5–10% of their final mass.

In a case study, the group was able to predict with 87% accuracy if a breast cancer patient would respond to treatment after just one cycle of chemotherapy.

The implications

The team is now trying to reproduce its results in a community setting; if it succeeds, the implications for therapy optimization are huge.

The ultimate goal of “mathematizing” the problem of cancer is to be able to create a model of each individual patient, by populating the computer models with their own specific data.

Doctors could then “try” different therapies on the patient virtually, and choose the one the model suggests would be most effective.

Thomas Yankeelov is the Director of the Center for Computational Oncology, which is based at The University of Texas at Austin, and Director of Cancer Imaging Research in the LIVESTRONG Cancer Institutes of the Dell Medical School.

He said: “If you have a model that can recapitulate how tumors grow and respond to therapy, it becomes a classic engineering optimization problem – ‘I have this much drug and this much time. What’s the best way to give it to minimize the number of tumor cells for the longest amount of time?’”
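Yankeelov's framing can be illustrated with a deliberately simplified sketch. All numbers and the dose-response rule below are assumptions for illustration only, not values from the study: a fixed drug budget is split across treatment cycles, the tumor multiplies by a fixed factor each cycle, each dose kills a saturating fraction of cells, and a brute-force search finds the schedule leaving the fewest tumor cells:

```python
from itertools import product

GROWTH = 1.3  # tumor multiplies by this factor per cycle (assumed value)

def cells_after(schedule, n0=1e9):
    """Tumor cell count after applying a dose schedule, one dose per cycle.
    Each cycle the tumor regrows, then a dose d leaves a 1/(1+d) surviving
    fraction -- a saturating dose response (an assumption, chosen so that
    how the budget is split across cycles actually matters)."""
    n = n0
    for dose in schedule:
        n = n * GROWTH / (1 + dose)
    return n

def best_schedule(budget, cycles):
    """Brute-force the engineering optimization: among all integer dose
    schedules spending exactly the budget, pick the one minimizing the
    final tumor cell count."""
    doses = range(budget + 1)
    feasible = (s for s in product(doses, repeat=cycles) if sum(s) == budget)
    return min(feasible, key=cells_after)
```

Under these assumptions, spreading the budget evenly beats front-loading it, e.g. `best_schedule(6, 3)` returns `(2, 2, 2)` rather than `(6, 0, 0)`. A real patient-specific model would replace the toy growth and kill rules, but the optimization framing is the same.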

The source

Texas Advanced Computing Center (2018). Tailoring cancer treatments to individual patients. Available at: (accessed February 2018).


