Archive for March, 2013

THE ILLINOIS SUPERCOMPUTER NEIGHBORHOOD GROWS

On Thursday, the National Center for Supercomputing Applications at the University of Illinois celebrated the full launch of Blue Waters, its new supercomputer designed to sustain one petaflop on real-world applications. As part of the ceremony, Governor Pat Quinn declared “Blue Waters Supercomputer Day,” and Senator Dick Durbin saluted the machine and other supercomputers as “the gateway to next-generation research.” The start of 24/7 research operations was also a proud moment for Computation Institute scientists such as Michael Wilde and Daniel Katz, who helped get Blue Waters up and running. Wilde spoke about the supercomputer at the CI’s Petascale Day event last October.

Meanwhile, a couple hundred miles north of Blue Waters, Argonne’s new 10-petaflop supercomputer Mira is approaching the start of its own full production period later this year. This week, the laboratory released a new time-lapse video of the machine’s construction, which you can watch below. But science isn’t waiting for Mira to reach full strength, as demonstrated by this new project on the combustion and detonation of hydrogen-oxygen mixtures, a potential alternative fuel source.

THE GRAND MOTHER OF CLOUD

In recent years, cloud computing has crossed over from inside-baseball IT chatter to the general public. As CI fellow Rob Gardner recently charted, web searches for the term began climbing in 2009 and still vastly outpace searches for similar buzzwordy topics such as “big data” and “virtualization.” Now that consumers are comfortable with storing files and running programs in the cloud, it’s time for the pioneers of that technology to take their victory laps. One recent round-up of cloud computing mavericks at Forbes tagged CI fellow Kate Keahey as “the grand mother of cloud,” recognizing her early work on infrastructure-as-a-service (IaaS) platforms. Her current project, Nimbus, is dedicated to providing cloud-based infrastructure for scientific laboratories.

OTHER NEWS IN COMPUTATIONAL SCIENCE

A lot of what we know about science may be wrong, but finding those flaws could lead to better discovery in the future. That’s how this article on Txchnologist framed the new Metaknowledge Network led by CI fellow James Evans. “We’re building on decades of this deep work on science and trying to connect it to this computational moment…to get a quantitative understanding of why we have the knowledge we have,” Evans told reporter Rebecca Ruiz.

The open release of data by the city of Chicago hasn’t just improved our understanding of how the city works, but also how we see it. These beautiful visualizations created with the Edifice software (one of the projects at the Open City collaborative) make the neighborhoods of Chicago look like a genomic SNP chip…or an elaborate Lite Brite project.
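Anyone curious about building similar views can start by pulling records straight from the city’s data portal. The sketch below is a minimal, hypothetical Python example (it is not the Edifice code): it queries the portal’s Socrata SODA interface for a dataset, where the identifier `xxxx-xxxx` is a placeholder you would replace with a real dataset id from data.cityofchicago.org.

```python
# Minimal, hypothetical sketch of pulling records from the City of Chicago
# data portal via its Socrata SODA API. Replace the placeholder dataset id
# ("xxxx-xxxx") with a real identifier from data.cityofchicago.org.
import json
import urllib.request

DATASET_ID = "xxxx-xxxx"  # placeholder, not a real dataset id
URL = f"https://data.cityofchicago.org/resource/{DATASET_ID}.json?$limit=100"

def fetch_records(url):
    """Download up to 100 rows of the dataset as a list of dictionaries."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    rows = fetch_records(URL)
    print(f"fetched {len(rows)} records")
    if rows:
        # Each row is a plain dict keyed by the dataset's column names.
        print("columns:", sorted(rows[0].keys()))
```

From a list of dictionaries like this, it is a short step to the kind of neighborhood-by-neighborhood rendering the Edifice visualizations show.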

Many Chicago homes would benefit from energy-efficiency improvements that could save residents a significant portion of their monthly utility bills. But many residents are unaware of the option or unwilling to bear the up-front costs of retrofitting their homes to reduce energy usage. According to WBEZ, two University of Chicago students have founded a new startup called Effortless Energy that uses data-mining techniques to identify these opportunities for conservation and savings and help homeowners act on them.

The “traveling salesman problem” of finding the most efficient route between 20 different cities has long frustrated mathematicians. So English scientists created “programmable goo” to find the shortest route, in a fashion similar to earlier studies that used slime mold as a navigator. You can read the paper, “Computation of the Traveling Salesman Problem by a Shrinking Blob,” on arXiv.
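For a feel of why the problem frustrates mathematicians, here is a minimal, hypothetical Python sketch (unrelated to the blob experiment or the paper) comparing an exact brute-force search, whose cost grows factorially with the number of cities, against a fast but inexact nearest-neighbor heuristic on a handful of random points.

```python
# Hypothetical illustration of the traveling salesman problem (TSP):
# exact brute force (factorial time) vs. a nearest-neighbor heuristic.
import itertools
import math
import random

def tour_length(tour, pts):
    """Total length of a closed tour visiting pts in the given order."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def brute_force(pts):
    """Check every ordering; only feasible for roughly a dozen cities."""
    n = len(pts)
    best = min(itertools.permutations(range(1, n)),
               key=lambda rest: tour_length((0,) + rest, pts))
    return (0,) + best

def nearest_neighbor(pts):
    """Greedy heuristic: always visit the closest unvisited city next."""
    unvisited = set(range(1, len(pts)))
    tour = [0]
    while unvisited:
        nxt = min(unvisited, key=lambda j: math.dist(pts[tour[-1]], pts[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tuple(tour)

if __name__ == "__main__":
    random.seed(0)
    cities = [(random.random(), random.random()) for _ in range(9)]
    print("exact  :", round(tour_length(brute_force(cities), cities), 3))
    print("greedy :", round(tour_length(nearest_neighbor(cities), cities), 3))
```

With 9 cities the brute-force search already checks 40,320 orderings; at 20 cities that number exceeds 10^17, which is why physical computers like slime mold and goo make for intriguing alternatives.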

Read Full Post »

The robotic system in action (from Vision Systems Design)

Among scientific disciplines, botany might be considered one of the least tech-minded branches, concerned as it is with the natural world of plant life. But like the rest of biology, botany is quickly moving into the types of large-scale experiments that require more sophisticated techniques. In many botany labs, high-throughput sequencers generate plant genomic data at unprecedented rates for many different species. However, this genetic bounty creates a new bottleneck: the complementary studies examining how those genes control plant traits still proceed at a speed closer to that of old-fashioned fieldwork. The old cliché of “watching the grass grow” is not compatible with fast-paced science.

To help bring phenotype closer to the pace of genotype, Nicola Ferrier has equipped botanists with powerful new lab assistants: robots. Ferrier, now an engineer at Argonne National Laboratory, worked with University of Wisconsin botanists to design better equipment for monitoring the growth of the plant species Arabidopsis thaliana. Arabidopsis is a small flowering plant popular as a laboratory model species, in part because of its relatively small genome of roughly 27,000 genes. Eventually, scientists would like to determine the role each of those genes plays in aspects of the plant’s phenotype, such as root gravitropism, the way roots grow in response to gravity.

But the popular method – a computer-controlled camera monitoring the growth of one Arabidopsis seedling at a time – was far too slow for tens of thousands of mutants. So Ferrier helped the laboratory of Edgar Spalding replace their single-camera system with a “robotic machine vision platform” capable of monitoring up to 144 seedlings simultaneously as the direction of gravity they experience is artificially changed (by rotating the dish 90 degrees).
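To make the phenotyping step concrete, here is a small, hypothetical Python sketch, unrelated to the actual Wisconsin/Argonne platform: given root-tip coordinates extracted from successive images of a seedling after the dish is rotated, it estimates how far the root’s growth direction has re-oriented toward the new direction of gravity.

```python
# Hypothetical sketch of a gravitropism measurement: given root-tip (x, y)
# positions extracted from successive images after a 90-degree rotation of
# the dish, estimate how far the root's growth direction has bent back
# toward the new "down". An illustration only, not the Spalding lab code.
import math

def growth_angles(tip_positions, gravity=(0.0, 1.0)):
    """Angle (degrees) between each growth step and the gravity vector.

    tip_positions: list of (x, y) root-tip coordinates in image pixels,
    ordered in time; image y increases downward, so gravity is (0, 1).
    """
    angles = []
    for (x0, y0), (x1, y1) in zip(tip_positions, tip_positions[1:]):
        dx, dy = x1 - x0, y1 - y0
        step = math.hypot(dx, dy)
        if step == 0:
            continue  # no measurable growth between frames
        cos_a = (dx * gravity[0] + dy * gravity[1]) / step
        angles.append(math.degrees(math.acos(max(-1.0, min(1.0, cos_a)))))
    return angles

if __name__ == "__main__":
    # Synthetic example track: the root starts growing sideways (about 90
    # degrees from gravity) and gradually curves downward after the rotation.
    track = [(0, 0), (10, 1), (19, 4), (27, 9), (33, 16), (37, 25)]
    for i, a in enumerate(growth_angles(track), start=1):
        print(f"frame {i}: {a:.1f} degrees from vertical")
```

Repeating a measurement like this across 144 seedlings per dish, frame after frame, is exactly the sort of tedium the robotic platform takes off the botanists’ hands.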

(more…)

Read Full Post »

A model of Charles Babbage’s Analytical Engine (via Wikimedia Commons)

The general public tends to think of supercomputers as the big brothers of their home computers: larger, faster and more powerful versions of familiar everyday devices. But in his talk last week for Argonne National Laboratory’s OutLoud series, CI senior fellow Peter Beckman urged the crowd to think of supercomputers more imaginatively, as the real-life version of a common sci-fi device: the time machine.

A modern laptop is faster than the state-of-the-art supercomputer Beckman used at Los Alamos National Laboratory in 1995, he said. In that same year, a machine with the computing speed of today’s iPad would have ranked on the Top500 list of the fastest computers in the world. Beyond raw speed, the programming strategies and hardware architectures developed on the room-sized supercomputers of the last 60 years have trickled down to the consumer, as with the multi-core processors and parallel operations found in new laptops.

As such, “supercomputing is a time machine,” said Beckman, the director of exascale technology and computing at Argonne. “What we build here in the lab, 20 years later ends up in your phone.”

(more…)

Read Full Post »

DC districts map by Peter Fitzgerald. (Wikimedia Commons)

“Civic hacking” has become a popular way for people skilled in programming and data crunching to give back to their community. Through organized Hack-a-thons or groups such as Open City and Code for America, volunteers imaginatively transform enormous tables of numbers into user-friendly web and mobile tools that bring localized and interactive information about a city to its citizens. From simple questions such as “has my street been plowed yet?” to more complex visualizations of zoning laws or crime patterns, these apps are changing the interaction between cities and their residents and creating an exciting new spirit of civic participation.

This summer, the Computation Institute will further nurture these efforts with the Data Science for Social Good fellowship. From early June to late August, aspiring data scientists will come to Chicago to apply data and computation to urgent real-world problems, working alongside members of the Obama campaign analytics team and experts from the business world and academia, including the University of Chicago, Argonne National Laboratory and the Urban Center for Computation and Data. Projects addressing questions in education, health, energy, transportation and other spheres will be developed in interdisciplinary teams overseen by an advisory team including Google’s Eric Schmidt and Rayid Ghani, chief scientist for Obama 2012.

PhD, master’s, or advanced undergraduate students, software engineers and others with statistics, programming and data skills are encouraged to apply. The deadline for applications is April 1, and all the required information can be found at the fellowship’s website. The fellowship is also seeking experienced mentors and partners, who should e-mail the organizers at datasciencefellowship@uchicago.edu. Good luck!

Read Full Post »

SMARTDATA PLATFORM GETS BLOOMBERG NOD

A few weeks ago, we urged readers to vote in the Bloomberg Mayors Challenge for the City of Chicago’s entry, a collaboration with the Urban Center for Computation and Data called the SmartData Platform. This week, the project received good news as it was chosen for a $1 million grant from Bloomberg Philanthropies to launch the project, one of five proposals to receive funding from the original pool of 305 applications. The SmartData Platform will put city datasets — like those that can be found on the city’s data portal — to work in making the city run more effectively and efficiently, and the UrbanCCD will help provide the computational expertise and tools to extract the maximum potential from the data. The new open-source platform is considered the next iteration of the WindyGrid system currently used internally by the city, which was discussed by Chicago’s Chief Data Officer Brett Goldstein at the recent Urban Sciences Research Coordination Network workshop.

Chicago and the other Bloomberg winners were covered by the New York Times, the Chicago Sun-Times, Crain’s Chicago Business, NBC Chicago, ABC Chicago and The Atlantic Cities, among others.

THE PRESIDENT COMES TO ARGONNE

The security at Argonne National Laboratory will be even tighter than usual today as President Barack Obama visits to deliver a speech on the subjects of energy and climate change. The Presidential visit comes just months after the announcement of the Argonne “Battery Hub,” a $155 million project that’s part of the national Joint Center for Energy Storage Research. But President Obama’s speech will also come at a time when national laboratories such as Argonne face budget cuts due to the federal sequestration. If you want to hear what the President says about these pressing topics, tune in to White House Live at 1:30 p.m. Central time.

OTHER NEWS IN COMPUTATIONAL SCIENCE

Next month will feature a lot of exciting CI-affiliated events. On April 3rd, Senior Fellow Gregory Voth will deliver a lunchtime talk in downtown Chicago on “Molecular Modeling: A Window to the Biochemical World” (register here). The 2013 edition of the GlobusWorld meeting runs from April 16-18 at Argonne, and registration for the conference and hotel rooms is currently open. Finally, the Computation Institute will host the inaugural Day of the Beagle symposium on April 23rd, celebrating the groundbreaking biology and medicine research performed on the Beagle supercomputer in its first year of operation.

The first supercomputer in the country of Jordan was built with somewhat unusual components: the processors from PlayStation 3 video game consoles. As the article discusses, it follows a US Air Force supercomputer in using video game parts for high-performance computing.

Two visions of the future of computing received attention in recent weeks. A special issue of Science put the spotlight on quantum computing and recent experiments that move it closer to real-world application, and a feature on the new Nova Next website speculated on how synthetic biology could someday create computers made up of biological components.

“The Internet of Things” is catching on as a tech/computing buzzword, and in this video for the business news site Quartz, Robert Mawrey of ioBridge uses a fresh cup of coffee to explain why we might soon want our appliances to send and receive tweets.

Read Full Post »

Ian Foster speaking at the RDMI workshop, March 13, 2013. Photo by Rick Reinhard.

Big science projects can afford big cyberinfrastructure. For example, the Large Hadron Collider at CERN in Geneva generates 15 petabytes of data a year, but also boasts a sophisticated data management infrastructure for the movement, sharing and analysis of that gargantuan data flow. But big data is no longer an exclusive problem for these massive collaborations in particle physics, astronomy and climate modeling. Individual researchers, faced with new laboratory equipment and methods that can generate their own torrents of data, increasingly need their own data management tools, but lack the hefty budget large projects can dedicate to such tasks. What can the 99% of researchers doing big science in small labs do with their data?

That was how Computation Institute director Ian Foster framed the mission at hand for the Research Data Management Implementations Workshop, happening today and tomorrow in Arlington, VA. The workshop was designed to help researchers, collaborations and campuses deal with the growing need for high-performance data transfer, storage, curation and analysis — while avoiding wasteful redundancy.

“The lack of a broader solution or methodology has led basically to a culture of one-off implementation solutions, where each institution is trying to solve their problem their way, where we don’t even talk to each other, where we are basically reinventing the wheel every day,” said H. Birali Runesha, director of the University of Chicago Research Computing Center, in his opening remarks.

(more…)

Read Full Post »

“We know more about the movement of celestial bodies than about the soil underfoot.”

Leonardo da Vinci never gave a TED talk, but if he did, that quote from around the beginning of the 16th century might have been a good tweetable soundbite. Five centuries later, da Vinci’s statement still holds true, and it was there for CI Senior Fellow Rick Stevens to pluck as the epigraph for his talk in November 2012 at the TEDxNaperville conference. Stevens used his 18 minutes on the TED stage to talk about the Earth Microbiome Project, an international effort “to systematically study the smallest life forms on earth to build a comprehensive database to capture everything we can learn about these organisms.”

Stevens talks about how little we know about the estimated 1 billion species of microbes on Earth (“In one kilogram of soil there are more microbes than there are stars in our galaxy,” he says), and how citizen science, high-throughput genomics and supercomputing are coming together to finally reveal this vast ecosystem — a process he likens to reconstructing the front page of the newspaper using only the firehose of Twitter and Facebook posts. In five to ten years, Stevens says, microbiology will finally exceed astronomy, reversing the imbalance da Vinci described, with enormous implications for our understanding of the world around us.

You can watch video of Stevens’ talk below:

Read Full Post »

Older Posts »