
Archive for May, 2013

FROM CITY HALL TO HYDE PARK

In its early days, the Urban Center for Computation and Data formed a valuable partnership with the data team installed by Mayor Rahm Emanuel within Chicago’s city government. Leading the city’s efforts to free up data internally and externally was Chief Data and Information Officer Brett Goldstein, an alumnus of the UChicago computer science program and the restaurant reservation startup OpenTable. Goldstein’s team and UrbanCCD worked together on the SmartData platform proposal that was chosen for a $1 million grant in the Bloomberg Mayors Challenge earlier this year, and Goldstein was the keynote speaker at the first workshop of the Urban Sciences Research Coordination Network in February.

So we are very excited about the news that Goldstein will soon be joining the University of Chicago as the inaugural Fellow in Urban Science at the Harris School of Public Policy. Goldstein will continue to work with UrbanCCD researchers on the SmartData platform and other projects, while also helping with the launch of a master’s degree in computation and public policy and the Urban Technology Innovators’ Conference, a new initiative organized by Chicago Harris and the City of Chicago that seeks to create a peer-learning network for municipal technology innovators.

“Chicago Harris demonstrates a commitment to rigorous interdisciplinary scholarship, with strong partnerships with the Department of Computer Science and the Computation Institute, and a desire to advance the field of data science, especially so it can be leveraged for public service,” Goldstein said. “I am excited about the opportunity to continue working to meld urban science and data analytics and work with this impressive community of faculty and staff.”

You can read more coverage of Goldstein’s move and career so far at Crain’s Chicago Business and Government Technology.


Read Full Post »

CERN is known as the current world epicenter of particle physics, the home of the Large Hadron Collider and thousands of scientists expanding our knowledge of the universe’s most basic ingredients. For one day earlier this month, the laboratory near Geneva, Switzerland was also a meeting place for scientists, philosophers, musicians, animators and even will.i.am to share their grand ideas at the first-ever TEDxCERN event. Among the speakers riffing on the theme of “Multiplying Dimensions” was CI Director Ian Foster, who presented his vision for The Discovery Cloud and for accelerating the pace of science by bringing advanced data and computation tools to the smaller laboratories and citizen scientists of the world.

What we need to do is to, in a sense, create a new set of cloud services which do for science what the myriad of business cloud services do for business. We might call it the Discovery Cloud. It would be a set of services that take on, automate, and allow people to handle or outsource many of the routine activities that currently dominate research… I believe if we do that right, we can really make a transformative difference in how people do science.

You can watch a full video of Foster’s presentation below:

International Science Grid This Week also covered Foster’s talk and another talk he gave a day earlier to the information technology team at CERN. In that speech, Foster delivered a similar message about the need to bring advanced cyberinfrastructure to the “99%” of laboratories that can’t afford to build international data grids akin to what CERN used in its discovery of the Higgs boson.

“We have managed to create exceptional infrastructure for the 1%, but what about the rest?” asks Foster. “We have big science, but small labs. How do we deliver cyber infrastructure to small groups? They need something that is frictionless, affordable and sustainable.”

Read Full Post »

The exascale (one million trillion calculations per second) is the next landmark in the perpetual race for computing power. Although such a machine would be more than 50 times faster than the world’s current leading supercomputers and many technical challenges remain, experts predict that the exascale will likely be reached by 2020. But while the United States is used to being the frontrunner in high-performance computing achievement, this leg of the race will feature intense competition from Japan, China and Europe. In order to pass the exascale barrier first and reap the application rewards in energy, medicine and engineering research, government funding is critical.
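For a rough sense of the gap, here is a back-of-the-envelope comparison in Python; the roughly 17.6-petaflop baseline is an assumption drawn from Titan’s published sustained Linpack figure at the time, not a number from the hearing itself.

    # Back-of-the-envelope scale comparison (assumed baseline: Titan's
    # ~17.6 petaflop/s sustained Linpack result; actual ratios depend on
    # the benchmark and system chosen).
    exaflop = 1e18           # floating-point operations per second at the exascale
    titan_flops = 17.6e15    # approximate sustained performance of Titan, in flop/s

    speedup = exaflop / titan_flops
    print(f"An exascale system would be roughly {speedup:.0f}x faster")  # ~57x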

On Capitol Hill yesterday, CI Senior Fellow Rick Stevens testified to this urgency as part of a Congressional Subcommittee on Energy hearing, “America’s Next Generation Supercomputer: The Exascale Challenge.” The hearing was related to the American High-End Computing Leadership Act [pdf], a bill proposed by Rep. Randy Hultgren of Illinois to improve the HPC research program of the Department of Energy and make a renewed push for exascale research in the United States. You can watch archived video of the hearing here, and Stevens’ prepared opening statement is reproduced in full below.

=====

Thank you Chairman Lummis, Ranking Member Swalwell, and Members of the Subcommittee. I appreciate this opportunity to talk to you about the future of high performance computing research and development, and about the importance of U.S. leadership in the development and deployment of Exascale computing.

I am Rick Stevens, the Associate Laboratory Director responsible for Computing, Environment, and Life Sciences research at Argonne National Laboratory. My laboratory operates one of the two Leadership Class computing systems for DOE’s Office of Science. My own research focuses on finding new ways to increase the impact of computation on science – from the development of new more powerful computer systems to the creation of large-scale applications for computational genomics targeting research in energy, the environment and infectious disease. I also am a Professor at the University of Chicago in the Department of Computer Science, where I hold senior fellow appointments in the University’s Computation Institute and the Institute for Genomics and Systems Biology.

I believe that advancing American leadership in high-performance computing is vital to our national interest. High-performance computing is a critical technology for the nation. It is the underlying foundation for advanced modeling and simulation and big data applications.


Read Full Post »

While you’re planning for a summer vacation on the beach, we’re planning to host three dozen aspiring data scientists for The Eric and Wendy Schmidt Data Science for Social Good Fellowship. In just a couple of weeks, 550 undergraduate and graduate students from around the world applied for the program. While the lucky 6.5% don’t arrive until early next month, the fellowship’s website launched today with portraits and Twitter/GitHub links for all the fellows, mentors and staff involved in this exciting effort. There’s also a debut post on the DSSG blog by organizers Rayid Ghani, Matt Gee and Juan-Pablo Velez that nicely lays out the grand motivation for organizing this first-of-its-kind program.
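For the curious, the 6.5% figure follows directly from the numbers in the paragraph above, as this small sketch shows.

    # Acceptance rate implied by the paragraph above:
    # 36 fellows ("three dozen") out of 550 applicants.
    fellows, applicants = 36, 550
    print(f"{fellows / applicants:.1%}")  # 6.5%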

By analyzing data from police reports to website clicks to sensor signals, governments are starting to spot problems in real-time and design programs for maximum impact. More nonprofits are measuring whether or not they’re helping people, and experimenting to find interventions that work.

None of this is inevitable, however.

We’re just realizing the potential of using data for social impact. We face hurdles to the widespread adoption of analytics in this space:

  • Most governments and nonprofits simply don’t know what’s possible yet.
  • There are too few data scientists out there – and too many spending their days optimizing ads instead of bettering lives.

To make an impact, we need to show social good organizations the power of data by doing high-impact analytics projects. And we need to expose data scientists to the problems that really matter.

That’s exactly why we’re doing the Eric and Wendy Schmidt Data Science for Social Good summer fellowship at the University of Chicago.

We want to bring three dozen aspiring data scientists to Chicago, and have them work on data science projects with social impact.

Be sure to browse the fellows’ profiles, and watch the website for frequent updates as the fellowship gets to work this summer. For more on the concept of training data scientists to apply their talents to making the world a better place, read Chicago Magazine’s in-depth interview with Rayid Ghani, posted yesterday.

Read Full Post »

Many of us carry a computer in our pocket that’s as powerful as the supercomputers of the late 1980s. Many of us also mostly use that revolutionary device to slingshot cartoon birds at evil pigs. Smartphones have undoubtedly improved and changed our lives in many different ways, yet the potential of these mobile computers to benefit science and humanity has often been overshadowed by their talent for eating up free time with a silly game. But as CI Fellow T. Andrew Binkowski said in his (flood-delayed) talk for Faculty Technology Day on May 8th, there are few reasons why the power of smartphone apps can’t also be harnessed for teaching and research in an academic context.

In general, the world of smartphone apps is a cruel and competitive ecosystem. Almost 1 million apps are available in Apple’s App Store, which has seen some 50 billion downloads since its launch in 2008. Due to this scale, Binkowski said he often warns people that no matter how good their app idea is, it’s very likely that somebody else has already created and released something similar. Often, it’s the design, marketing and support of the app that separates it from a crowd of lookalike releases — Angry Birds wasn’t even the first game where a player flings animals at buildings, and yet it is now the most successful franchise in iOS history.

For your typical developer, that makes selling your app “the hardest 99 cents you will ever earn,” Binkowski said. But for academic apps meant for the classroom or laboratory, that fierce competition is irrelevant.

“A lot of educators have goals to better reach out and connect with students, facilitate research, or something as simple as improving communication,” Binkowski said. “This removes a lot of the burdens and constraints of developing an app. If you want an app for your lab and you’re the only one in the world doing this research, you don’t have to worry about this fiercely competitive marketplace. You can build something that just helps you.”


Read Full Post »

The Alliant FX/8, an early parallel supercomputer.

For the last few decades, parallelism has been the secret weapon of computing. Based on the idea that large problems can be solved faster if they are broken into smaller problems worked on simultaneously, parallel computing has driven supercomputers to their current petascale power. Recently, the concept has spread to consumer computers as well, as the clock-speed limitations of single processors led manufacturers to switch to multi-core chips combining two, four or eight CPU cores. But in the early 1980s, when Argonne National Laboratory created its Advanced Computing Research Facility (ACRF), the path of parallelism was not so clear.
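As a minimal illustration of that divide-and-compute idea (a generic Python sketch, not code from the symposium or from Argonne), splitting one large computation across the cores of a multi-core chip might look like this:

    # Minimal sketch of parallelism: chop one big problem into chunks and
    # work on the chunks simultaneously across several processor cores.
    from multiprocessing import Pool

    def partial_sum(chunk):
        """Handle one piece of the larger problem."""
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        n = 10_000_000
        chunks = [range(i, i + 1_000_000) for i in range(0, n, 1_000_000)]

        with Pool(processes=4) as pool:        # e.g. a four-core consumer chip
            total = sum(pool.map(partial_sum, chunks))

        print(total)  # identical to the serial answer, computed in parallel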

The origin, subsequent impact and future role of this technology were the topics of discussion at the Thirty Years of Parallel Computing at Argonne symposium, held over two days earlier this week. Luminaries of the computer industry and research community — many of them Argonne alumni or collaborators — met on the Argonne campus to share stories of the laboratory’s instrumental role in nurturing parallel computers and the software they use, and how the approach helped to create the computational science of today and tomorrow.

From a modern perspective, it was hard to spot the world-changing potential in Jack Dongarra’s pictures and descriptions of the earliest Argonne parallel computers, which more resembled washer-dryers than today’s sleek, gargantuan supercomputers. The diversity of parallel machines purchased by the ACRF — 13 in its first 8 years, Dongarra said — reflected the excitement and uncertainty about parallel computing in those early days.

“We knew that parallel computing was the way HPC was going to be done in the future,” said Paul Messina, director of science at what is now known as the Argonne Leadership Computing Facility. “But there was no clear winner in terms of parallel architectures.”


Read Full Post »

Watch or listen to the news in any city and you’ll be fed a stream of numbers: traffic times, weather forecasts, sports scores and financial reports. All this data gives a quick, surface snapshot of the city on any given day — what happened last night, what’s happening right now, what will happen over the next 24 hours. But a city’s health is harder to put a figure on, whether because the data are complex, scarce or hidden behind locked doors. At the University of Chicago last week, a panel of researchers in medicine and the social sciences discussed how the health numbers of Chicago and other cities can be both collected and applied, enabling research on unprecedented scales and empowering citizens to improve their own wellbeing.

The panel, “Methods, Data, Infrastructure for the Study of Health in Cities,” was part of the broader Health in Cities event, one of four Urban Forums held by the University of Chicago Urban Network encompassing the impressive breadth of city research on campus. Among the participants were several scientists who are currently collaborating with CI researchers on how to use computation to better collect, analyze and share data. Kate Cagney, an associate professor of sociology and health studies, is working with the Urban Center for Computation and Data on its efforts to help plan and study the massive new Lakeside development taking shape on Chicago’s South Side. Her team will conduct interviews of residents in the neighborhoods surrounding Lakeside both before and after construction to assess how many aspects of their lives — including health — are affected by this enormous addition to the city’s landscape.

“We have an opportunity to study the impact of building a neighborhood from the ashes,” Cagney said. “New computational and data-intensive science techniques now exist to organize and analyze many disparate data sets, and these will allow for the study of Lakeside in unprecedented detail and produce insights due to real-time data.”


Read Full Post »
