Archive for January, 2013

Monoclonal antibodies are increasingly popular therapies for diseases such as cancer, arthritis and multiple sclerosis. They are also very expensive, due in part to the requirement that they be given intravenously at high concentrations to achieve their therapeutic benefits. Attempts to redesign the therapies to allow for easier and cheaper subcutaneous delivery have been stymied by the tendency of the antibodies to clump together, producing an unusably viscous solution. While experimental studies have identified some of the reasons for this viscosity, fully understanding these protein-protein interactions requires zooming in to a scale that's currently beyond the reach of experiments.

Enter computational modeling, which can help scientists determine why some antibodies aggregate and others don't, pointing the way to designing better treatments. While a postdoctoral scholar with the Center for Multiscale Theory and Simulation, Anuj Chaudhri worked with CMTS director Gregory Voth and scientists Dan Zarraga, Steve Shire and Tom Patapoff from the Late & Early Stage Pharmaceutical Development teams at Genentech to construct a model of what exactly happens when you put a lot of these antibodies into close proximity. The work was published in The Journal of Physical Chemistry.

“For high concentration proteins, not many experimental methods are available to get a deeper understanding of the fundamental interactions involved,” Chaudhri said. “This is where computation comes in. Using theoretical and computational methods, we can model the problem step by step by putting each piece together.”
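To make that idea concrete, here is a toy sketch (emphatically not the published model) of the kind of coarse-grained representation such studies build on: each antibody is reduced to a handful of beads, and bead pairs interact through screened electrostatics plus a soft excluded-volume repulsion. Every parameter below (bead charges, sizes, screening length) is an illustrative assumption.

```python
# Toy coarse-grained antibody sketch: three beads per antibody (two Fab arms,
# one Fc stem), interacting via Debye-Hueckel screened electrostatics plus a
# soft repulsion. All values are placeholder assumptions for illustration.
import numpy as np

KT = 1.0            # energies measured in units of kT
DEBYE_LENGTH = 1.0  # electrostatic screening length, in bead diameters (assumed)
BEAD_DIAMETER = 1.0

def pair_energy(r, q1, q2):
    """Energy (in kT) of two beads with charges q1, q2 a distance r apart."""
    # Screened electrostatics: salt damps the interaction beyond the Debye
    # length; like charges repel, opposite charges attract.
    electro = q1 * q2 * np.exp(-r / DEBYE_LENGTH) / r
    # Soft excluded-volume repulsion keeps beads from overlapping.
    repulsion = KT * (BEAD_DIAMETER / r) ** 12
    return electro + repulsion

def antibody_energy(coords_a, charges_a, coords_b, charges_b):
    """Total bead-bead interaction energy between two coarse-grained antibodies."""
    return sum(
        pair_energy(np.linalg.norm(ra - rb), qa, qb)
        for ra, qa in zip(coords_a, charges_a)
        for rb, qb in zip(coords_b, charges_b)
    )

# A three-bead "antibody" with a placeholder charge pattern: positively
# charged Fab tips and a negatively charged Fc stem.
coords = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, -1.2, 0.0]])
charges = np.array([+1.0, +1.0, -2.0])

# Bring a second copy nearby and ask whether the pair attracts or repels.
offset = np.array([0.0, 2.5, 0.0])
print(antibody_energy(coords, charges, coords + offset, charges))
```

In a model like this, charge complementarity between domains can let antibodies stick arm-to-stem and link into transient networks, one proposed mechanism for the sharp rise in viscosity at high concentration. Production studies use far more detailed bead models and evolve thousands of antibody copies with dynamics simulations.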

[Image: simulated exploding supernova, Flash Center for Computational Science]

Last October, we helped celebrate Petascale Day with a panel on the scientific potential of new supercomputers capable of running more than a thousand trillion floating-point operations per second. But the ever-restless high-performance computing field is already focused on the next landmark in supercomputing speed, the exascale, more than fifty times faster than the current record holder (Titan, at Oak Ridge National Laboratory). As with personal computers, supercomputers have been gaining speed and power at a steady rate for decades. But a new article in Science this week suggests that the path to the exascale may not be as smooth as the field has come to expect.

The article, illustrated with an image of a simulated exploding supernova (seen above) by the CI-affiliated Flash Center for Computational Science, details the various barriers facing the transition from petascale to exascale in the United States and abroad. Government funding agencies have yet to throw their full weight behind an exascale development program. Private computer companies are turning their attention away from high performance computing in favor of commercial chips and Big Data. And many experts agree that supercomputers must be made far more energy-efficient before leveling up to the exascale — under current technology, an exascale computer would use enough electricity to power half a million homes.
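That last figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below scales Titan's reported efficiency, roughly 17.6 petaflops at about 8.2 megawatts, up to an exaflop, then divides by an assumed average household draw of one kilowatt; both round numbers are assumptions for illustration.

```python
# Back-of-the-envelope check of the "half a million homes" claim, assuming
# exascale hardware were no more power-efficient than today's. Titan's numbers
# are its publicly reported performance and power draw; the ~1 kW average
# household draw is an assumed round number.
TITAN_FLOPS = 17.6e15    # sustained performance, flops
TITAN_POWER_W = 8.2e6    # power draw, watts
EXAFLOP = 1.0e18         # one exaflop
HOME_DRAW_W = 1.0e3      # assumed average household draw, watts

watts_per_flop = TITAN_POWER_W / TITAN_FLOPS
exascale_power_w = EXAFLOP * watts_per_flop

print(f"Exascale power at Titan's efficiency: {exascale_power_w / 1e6:.0f} MW")
print(f"Equivalent households: {exascale_power_w / HOME_DRAW_W:,.0f}")
# -> roughly 470 MW, on the order of half a million homes
```

Hence the consensus that efficiency, not raw speed, is the real barrier: without dramatic gains in performance per watt, an exascale machine would need its own power plant.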

Using laboratory experiments, scientists have learned a great deal about our world at the molecular and cellular scale. But when modern techniques reach their limit, the baton can be passed to computer simulations and theory to fill in the gaps. The Center for Multiscale Theory and Simulation, directed by CI Senior Fellow Gregory Voth, uses these methods for research on biological systems to reveal the inner workings of cells, build better materials and design more effective drugs. To describe the potential of bringing theoretical chemistry and multiscale computer simulations to bear on biology, the CMTS produced this video featuring Voth, Marissa Saunders, and John Grime (whose work on HIV was previously featured on ScaleOut). Enjoy the video below:
