Archive for the ‘Research Computing Center’ Category

Finding a better way to fight cancer doesn’t always mean discovering a new drug or surgical technique. Sometimes just defining the disease in greater detail can make a big difference. A more specific diagnosis may allow a physician to better tailor a patient’s treatment, using available therapies proven to work better on a specific subtype of disease or avoiding unnecessary complications for less aggressive cases.

“Finding better ways to stratify kids when they present and decide who needs more therapy and who needs less therapy is one of the ways in which we’ve gotten much better at treating pediatric cancer,” said Samuel Volchenboum, Computation Institute Fellow, Assistant Professor of Pediatrics at Comer Children’s Hospital and Director of the UChicago Center for Research Informatics. “For example, kids can be put in one of several different groups for leukemia, and each group has its own treatment course.”

Classically, patients have been sorted into risk or treatment groups based on demographic factors such as age or gender, and relatively simple results from laboratory tests or biopsies. Because cancer is a genetic disease, physicians hope that genetic factors will point the way to even more precise classifications. Yet despite this promise, many of the “genetic signatures” found to correlate with different subtypes of cancer are too complex – involving dozens or hundreds of genes – for clinical use and difficult to validate across patient populations.

CHINA’S LATEST SUPERCOMPUTER VICTORY

China’s Milky Way 2 supercomputer was recently declared the fastest supercomputer in the world by industry scorekeeper Top500, the latest move in the increasingly international race for high performance computing supremacy. Late last month, CI Senior Fellow Rick Stevens appeared on Science Friday, alongside Top500 editor Horst Simon, to talk about why that competition matters, and what the global push for faster computation will do for medicine, engineering and other sciences.

“These top supercomputers are like time machines,” Stevens said. “They give us access to a capability that won’t be broadly available for five to ten years. So whoever has the time machine is able to do experiments, able to see into the future deeper and more clearly than those that don’t have such machines.”

The same time machine metaphor was also picked up by the University of Chicago’s profile of Mira, our local Top500 competitor, which was bumped down to #5 by the Milky Way 2’s top ranking. But there’s no shame in fifth-best, when fifth-best can run 10 quadrillion calculations per second — the equivalent computing power of 58 million iPads. CI Senior Fellow Gregory Voth is quoted on how access to such a world-class resource helps both today’s and tomorrow’s scientists.

“Having access to a computing resource like Mira provides excellent opportunities and experience for educating up-and-coming young scientists as it forces them to think about how to properly utilize such a grand resource very early in their careers,” Voth says. “This gives them a unique perspective on how to solve challenging scientific problems and puts them in an excellent position to utilize computing hardware being imagined now for tomorrow.”
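
The “58 million iPads” comparison is easy to sanity-check. The two figures below come straight from the post; the implied per-device rate is our own back-of-the-envelope arithmetic, not a number from the article:

```python
# Rough arithmetic behind the comparison: Mira's quoted rate divided
# by the iPad count gives the implied per-device rate.
mira_rate = 10e15    # 10 quadrillion calculations per second
ipad_count = 58e6    # 58 million iPads
per_ipad = mira_rate / ipad_count
print(f"~{per_ipad / 1e6:.0f} million calculations per second per iPad")
```

That works out to roughly 170 million calculations per second per tablet, a plausible figure for consumer hardware of the era.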

WHY DATA SCIENCE MUST BE OPEN SCIENCE AND MORE FROM DSSG

The Data Science for Social Good fellowship has reached the halfway point, and the website is starting to fill up with interesting content about the projects. Some fellows have already produced tools for the community to use, such as Paul Meinshausen’s interactive tree map of the City of Chicago’s Data Portal. Instead of a cold, no-frills list of the datasets available for download by the public, Meinshausen’s map uses color and shape to guide users quickly to the data they are seeking and to make rapid comparisons of dataset sizes. The visualization was popular enough that programmers in Boston and San Francisco quickly applied his code to their own cities’ data portals, while another developer built a common map for every city that uses Socrata software to share its data.
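
The idea behind such a tree map is simple: each dataset gets a rectangle whose area is proportional to its size, so big datasets are visible at a glance. A minimal one-level slice layout sketches the invariant; the dataset names and row counts here are invented for illustration, and Meinshausen’s actual tool is interactive and hierarchical:

```python
def slice_layout(items, x=0.0, y=0.0, width=1.0, height=1.0):
    """Lay items out side by side in a unit rectangle; each item's
    rectangle area is proportional to its size.
    Returns {name: (x, y, w, h)}."""
    total = sum(size for _, size in items)
    rects = {}
    for name, size in items:
        w = width * size / total
        rects[name] = (x, y, w, height)
        x += w
    return rects

# Hypothetical row counts for datasets on a city data portal.
datasets = [("crimes", 6_500_000), ("permits", 1_200_000),
            ("licenses", 300_000)]
for name, (rx, ry, w, h) in slice_layout(datasets).items():
    print(f"{name:10s} area={w * h:.3f}")
```

A full treemap alternates the split direction at each level of a hierarchy, but the proportional-area property that makes size comparisons instant is the same.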

Ian Foster speaking at the RDMI workshop, March 13, 2013. Photo by Rick Reinhard.

Big science projects can afford big cyberinfrastructure. For example, the Large Hadron Collider at CERN in Geneva generates 15 petabytes of data a year, but also boasts a sophisticated data management infrastructure for the movement, sharing and analysis of that gargantuan data flow. But big data is no longer an exclusive problem for these massive collaborations in particle physics, astronomy and climate modeling. Individual researchers, faced with new laboratory equipment and methods that can generate their own torrents of data, increasingly need their own data management tools, but lack the hefty budget large projects can dedicate to such tasks. What can the 99% of researchers doing big science in small labs do with their data?

That was how Computation Institute director Ian Foster framed the mission at hand for the Research Data Management Implementations Workshop, happening today and tomorrow in Arlington, VA. The workshop was designed to help researchers, collaborations and campuses deal with the growing need for high-performance data transfer, storage, curation and analysis — while avoiding wasteful redundancy.

“The lack of a broader solution or methodology has led basically to a culture of one-off implementation solutions, where each institution is trying to solve their problem their way, where we don’t even talk to each other, where we are basically reinventing the wheel every day,” said H. Birali Runesha, director of the University of Chicago Research Computing Center, in his opening remarks.

A CITY PROJECT BATTLE ROYALE

As the keynote speaker at the Urban Sciences Research Coordination Network kickoff last Friday, the City of Chicago’s Brett Goldstein presented a blizzard of exciting city projects at various stages of development. One slightly-under-wraps project Goldstein touched upon was the SmartData platform, an ambitious plan to craft a new tool for decision-making and city services out of the abundant raw material of city data. In collaboration with the Computation Institute and the Urban Center for Computation and Data, the city’s Innovation and Technology team hopes to create a tool that will analyze the city’s many large datasets in real time to help the city respond to challenges more quickly and efficiently, while providing frequently updated, useful information to its citizens.

On Wednesday, that exciting new effort was announced as a finalist in the Bloomberg Philanthropies Mayors Challenge, a competition among ideas proposed by cities across the United States. As part of the judging, the public is invited to vote for their favorite project among the 20 finalists at the Huffington Post. We’re biased, of course, but to help make the case for Chicago’s project, you can read more about the SmartData platform here, or watch a video about the concept featuring Mayor Rahm Emanuel below.

Photo by Jason Smith

Computation is now an essential tool for researchers, as data analytics and complex simulations fuel ambitious new studies in the sciences and humanities. But the path from a spreadsheet on a laptop to using the world’s most powerful supercomputers can be intimidating for researchers unfamiliar with computational methods.

To help researchers along this journey, the University created the Research Computing Center (RCC), providing faculty and students with access to hardware and expertise. At an opening reception on November 8th at the Crerar Library, scientists from Argonne National Laboratory and IBM joined RCC director H. Birali Runesha in welcoming UChicago researchers to this valuable new resource.

“The mission of the Research Computing Center is to advance research and scholarship at the University,” Runesha said. “What we are trying to do here is not just to provide access to hardware, but to work with you to understand your research and integrate high-performance computing into it to achieve our major goal, which is to help you literally transform your research by performing computational analysis that would otherwise not be possible.”
