Archive for the ‘Energy’ Category

CGI for Science

An image from a model of how endophilin sculpts membrane vesicles into a network of tubules. (Mijo Simunovic/CMTS)


Computer graphics have greatly expanded the possibilities of cinema. Special effects using CGI (computer-generated imagery) today enable directors to shoot scenes that were once considered impossible or impractical, from interstellar combat to apocalyptic action sequences to fantastical digital characters that realistically interact with human actors.

In science, computer graphics are also creating sights that have never been seen before. But where movie special effects artists are realizing the vision of a screenwriter and director, scientific computer models are inspiring new discoveries by revealing a restless molecular world we cannot yet see with the naked eye.

Using computers to peer into this hidden universe was the theme of CI faculty and senior fellow Gregory Voth's Chicago Council on Science and Technology talk last week, titled "Molecular Modeling: A Window to the Biochemical World." Scientists at Voth's Center for Multiscale Theory and Simulation use computers to recreate real-world physics and produce awe-inspiring, intricate images, pushing the frontiers of discovery one femtosecond and nanometer at a time.

[Some of those images, including the one above by Mijo Simunovic, were on display as a “Science as Art” gallery, which you can view in a slideshow here.]

“The computer simulation allows us to make a movie, if you will, but it’s a movie describing what the laws of physics tell us,” Voth said. “It’s not a movie where we tell the computer we want this figure to run and shoot this figure. We don’t know what’s going to happen. We know the equations, we feed them in [to a supercomputer], and we solve those equations…and we can reach scales we never dreamed of reaching before.”
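The workflow Voth describes — write down the equations of motion, feed them to a computer, and step them forward in time — is the heart of molecular dynamics. As a minimal illustrative sketch (not CMTS code), here is the velocity Verlet scheme, a standard time-stepping integrator in molecular dynamics, applied to a single particle on a Hooke's-law spring:

```python
def velocity_verlet(x, v, force, dt, steps, m=1.0):
    """Integrate Newton's equation F = m*a forward in time.

    At each step: update position, recompute the force, then update
    velocity using the average of the old and new forces.
    """
    f = force(x)
    traj = [x]
    for _ in range(steps):
        x = x + v * dt + 0.5 * (f / m) * dt * dt
        f_new = force(x)
        v = v + 0.5 * (f + f_new) / m * dt
        f = f_new
        traj.append(x)
    return traj, v

# Toy system: a 1-D harmonic oscillator (spring constant k = 1),
# started at x = 1 with zero velocity.
k = 1.0
spring = lambda x: -k * x
traj, v_end = velocity_verlet(x=1.0, v=0.0, force=spring, dt=0.01, steps=628)
# After roughly one oscillation period (2*pi), the particle is back near x = 1.
```

Real simulations apply the same idea to millions of interacting atoms, which is why supercomputers like those at the CMTS are needed to reach meaningful time and length scales.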


Read Full Post »


On Thursday, the National Center for Supercomputing Applications at the University of Illinois celebrated the full launch of Blue Waters, their new one-petaflop supercomputer. As part of the ceremony, Governor Pat Quinn declared “Blue Waters Supercomputer Day,” and Senator Dick Durbin saluted the machine and other supercomputers as “the gateway to next-generation research.” The start of 24/7 research was also a proud day for Computation Institute scientists such as Michael Wilde and Daniel Katz, who were involved in getting Blue Waters up and running. Wilde spoke about the supercomputer at the CI’s Petascale Day event last October.

Meanwhile, a couple hundred miles north of Blue Waters, Argonne’s new 10-petaflop supercomputer Mira nears the start of its own full production period later this year. This week, the laboratory released a new time-lapse video of the machine’s construction, which you can watch below. But science isn’t waiting for Mira to reach full strength, as demonstrated by this new project on the combustion and detonation of hydrogen-oxygen mixtures — a potential alternative source of fuel.


In recent years, cloud computing has crossed over from inside-baseball IT chatter to the general public. As CI fellow Rob Gardner recently charted, web searches for the term began climbing in 2009 and still vastly outpace searches for similar buzzwordy topics such as “big data” and “virtualization.” Now that consumers are comfortable with storing files and running programs in the cloud, it’s time for the pioneers of that technology to take their victory laps. One recent round-up of cloud computing mavericks at Forbes tagged CI fellow Kate Keahey as “the grandmother of cloud,” recognizing her early work on infrastructure-as-a-service (IaaS) platforms. Her current project, Nimbus, is dedicated to providing cloud-based infrastructure for scientific laboratories.


A lot of what we know about science may be wrong, but finding those flaws could lead to better discovery in the future. That’s how this article on Txchnologist framed the new Metaknowledge Network led by CI fellow James Evans. “We’re building on decades of this deep work on science and trying to connect it to this computational moment…to get a quantitative understanding of why we have the knowledge we have,” Evans told reporter Rebecca Ruiz.

The open release of data by the city of Chicago hasn’t just improved our understanding of how the city works, but also how we see it. These beautiful visualizations created with the Edifice software (one of the projects at the Open City collaborative) make the neighborhoods of Chicago look like a genomic SNP chip…or an elaborate Lite Brite project.

Many Chicago homes would benefit from upgrades that improve energy efficiency, saving residents a large portion of their monthly utility bills. But many residents are unaware of the option or unwilling to bear the up-front expense of retrofitting a home to reduce energy usage. According to WBEZ, two University of Chicago students have founded a new startup called Effortless Energy that uses data-mining techniques to both identify and act on these opportunities for conservation and savings.

The “traveling salesman problem” of finding the most efficient route among 20 different cities has long frustrated mathematicians. So English scientists created “programmable goo” to find the shortest route, much as earlier studies used slime molds as navigators. You can read the paper, “Computation of the Traveling Salesman Problem by a Shrinking Blob,” at arXiv.
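The reason the problem frustrates mathematicians is combinatorial explosion: the number of possible tours grows factorially with the number of cities, so checking every route quickly becomes infeasible. A small sketch of the brute-force approach (hypothetical helper names, illustration only) makes the cost concrete:

```python
from itertools import permutations
import math

def tour_length(order, coords):
    """Total length of a closed tour visiting the points in the given order."""
    return sum(
        math.dist(coords[order[i]], coords[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def brute_force_tsp(coords):
    """Try every tour starting from city 0 -- only feasible for small n,
    since the number of tours grows as (n-1)!."""
    cities = list(range(1, len(coords)))
    best = min(
        ((0,) + p for p in permutations(cities)),
        key=lambda order: tour_length(order, coords),
    )
    return best, tour_length(best, coords)

# Four cities at the corners of a unit square: the optimal tour is
# the perimeter, with total length 4.
coords = [(0, 0), (0, 1), (1, 1), (1, 0)]
order, length = brute_force_tsp(coords)
```

For 20 cities this enumeration would require examining 19! (about 10^17) tours, which is why heuristics — and unconventional computers like slime molds and shrinking blobs — are of interest.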

Read Full Post »


A few weeks ago, we urged readers to vote in the Bloomberg Mayors Challenge for the City of Chicago’s entry, a collaboration with the Urban Center for Computation and Data called the SmartData Platform. This week, the project received good news: it was chosen for a $1 million grant from Bloomberg Philanthropies, one of five proposals to receive funding from the original pool of 305 applications. The SmartData Platform will put city datasets — like those that can be found on the city’s data portal — to work in making the city run more effectively and efficiently, and the UrbanCCD will help provide the computational expertise and tools to extract the maximum potential from the data. The new open-source platform is considered the next iteration of the WindyGrid system currently used internally by the city, which was discussed by Chicago’s Chief Data Officer Brett Goldstein at the recent Urban Sciences Research Coordination Network workshop.

Chicago and the other Bloomberg winners were covered by the New York Times, the Chicago Sun-Times, Crain’s Chicago Business, NBC Chicago, ABC Chicago, and The Atlantic Cities, among others.


The security at Argonne National Laboratory will be even tighter than usual today as President Barack Obama visits to deliver a speech on energy and climate change. The Presidential visit comes just months after the announcement of the Argonne “Battery Hub,” a $155 million project that’s part of the national Joint Center for Energy Storage Research. But President Obama’s speech also comes at a time when national laboratories such as Argonne face budget cuts due to the federal sequestration. If you want to hear what the President says about these pressing topics, tune into White House Live at 1:30 p.m. Central time.


Next month will feature a lot of exciting CI-affiliated events. On April 3rd, Senior Fellow Gregory Voth will deliver a lunchtime talk in downtown Chicago on “Molecular Modeling: A Window to the Biochemical World” (register here). The 2013 edition of the GlobusWorld meeting runs from April 16-18 at Argonne, and registration for the conference and hotel rooms is currently open. Finally, the Computation Institute will host the inaugural Day of the Beagle symposium on April 23rd, celebrating the groundbreaking biology and medicine research performed on the Beagle supercomputer in its first year of operation.

The first supercomputer in Jordan was built with somewhat unusual components: the processors from PlayStation 3 video game consoles. As the article discusses, it follows a US Air Force supercomputer in using video game parts for high-performance computing.

Two visions of the future of computing received attention in recent weeks. A special issue of Science put the spotlight on quantum computing and recent experiments that move it closer to real-world application, and a feature on the new Nova Next website speculated on how synthetic biology could someday create computers made up of biological components.

“The Internet of Things” is catching on as a tech/computing buzzword, and in this video for the business news site Quartz, Robert Mawrey of ioBridge uses a fresh cup of coffee to explain why we might soon want our appliances to send and receive tweets.

Read Full Post »

Photo by Lloyd DeGrane.


Cities draw their strength from community and diversity, as people from different backgrounds work together in close proximity on big problems. So to unleash the potential of city data, it only makes sense to replicate that mixing-bowl effect in the context of research. To formally kick off the new Urban Sciences Research Coordination Network (USRCN), 80 experts representing a broad range of disciplinary knowledge met in downtown Chicago to forge new connections and grand ideas for projects that harness data for the benefit of the modern city.

Computer scientists, mathematicians, public health and education experts, architects, urban planners, social scientists, artists and more gathered inside the ballroom of the School of the Art Institute of Chicago on February 15th with an ambitious goal: form a new interdisciplinary research community for data-driven urban science. Co-hosted by the Urban Center for Computation and Data (UrbanCCD) and the University of Chicago Urban Network and funded by the National Science Foundation, the event was meant as both social mixer and brainstorming session.

“We were asked by the NSF to create this research coordination network as a network of people, not computers,” said Charlie Catlett, director of UrbanCCD. “If you can put teams together that are interdisciplinary and also cut across these experience types, then we can begin to study the city in a way that none of us could do just as individuals or small groups.”


Read Full Post »


A recurring theme of Breaking Bad is getting out of difficult situations with science. Yet you probably still wouldn’t expect to run into a character from the hit TV show on the campus of Argonne National Laboratory or the University of Chicago. But if you happen to spot a man who looks just like protagonist Walter White’s former boss at his car wash job, no need for a double take — you’re not losing your mind. Marius Stan, a Computation Institute Senior Fellow and Argonne scientist studying computational chemistry and physics, provides the memorable eyebrows and Romanian curses for the role of Bogdan, a character who has appeared in a handful of episodes of the AMC drug-trade drama.

This week in the Chicago Tribune, reporter Ted Gregory profiled Stan and told the story of how he got involved with the show when he lived in Albuquerque, before moving to Chicago. Stan might humbly list “Breaking Bad, Bogdan” below a computational microscope and a book about modeling and simulation in materials science on his CI web page. But his colleague, CI fellow Andrew Siegel, said most people at Argonne find his moonlighting career “extremely cool.”

“Everybody finds it hilarious and great. In science, you’re so uncool, at least in this country, and the world of acting is so opposite of that. It’s a funny convergence of things.”


Electrical power grids present a complex and challenging mathematical problem, raising questions of how to efficiently produce, store, and distribute energy. To study the mathematics behind tomorrow’s power lines, Argonne National Laboratory and Computation Institute scientists formed the Multifaceted Mathematics for Complex Energy Systems Project (M2ACS), which recently received a $17.5 million grant from the Department of Energy. The project will bring together experts from several different areas of mathematical study — including optimization, dynamical systems, uncertainty quantification, random processes, data analysis, discrete mathematics, and linear algebra — to find the new techniques needed to drive next-generation “smart” power grids and other technologies.
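To give a flavor of the optimization side of grid mathematics, here is a deliberately simplified "merit-order dispatch" toy (an assumption of ours for illustration, not M2ACS code): given a demand level and a set of generators with different costs and capacities, run the cheapest generators first until demand is met.

```python
def merit_order_dispatch(demand, generators):
    """Meet demand at minimum cost by dispatching cheapest generators first.

    generators: list of (cost_per_mwh, capacity_mw) tuples.
    Returns a list of (cost_per_mwh, mw_dispatched) and the total cost.
    """
    dispatch, remaining = [], demand
    for cost, capacity in sorted(generators):  # cheapest first
        used = min(capacity, remaining)
        dispatch.append((cost, used))
        remaining -= used
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("demand exceeds total generating capacity")
    total_cost = sum(cost * used for cost, used in dispatch)
    return dispatch, total_cost

# Hypothetical fleet: 100 MW at $50/MWh, 80 MW at $20/MWh, 60 MW at $35/MWh.
dispatch, total_cost = merit_order_dispatch(
    200, [(50, 100), (20, 80), (35, 60)]
)
```

Real grid problems layer on transmission constraints, storage, uncertainty in renewable output, and dynamics over time — which is exactly why M2ACS needs tools from so many mathematical subfields rather than a single greedy rule like this one.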


Read Full Post »