[This post was co-published at the Data Science for Social Good blog]

Last week’s Techweek Chicago event had all the trappings of your typical tech conference. Sprawled out over a floor of the city’s massive Merchandise Mart was a maze of exhibitor booths luring attendees to learn about their new dating service or augmented reality app with tchotchkes and loud music. Three stages offered a full slate of talks and panels about the future of the internet, smartphones and video games, tips for making content go viral and pageant-style battles of aspiring startups. Against this noisy backdrop of buzzwords and brands, the public announcement of the Data Science for Social Good fellowship struck a different note.

Serendipitously following a passionate argument by Jeff Lawson of Twilio for the power of software to change the world, fellowship director Rayid Ghani offered an overview of how the Data Science for Social Good program was designed to fulfill that promise. Inspired by Ghani’s work as chief data scientist with the Obama for America campaign, the vision of the fellowship was to take some of the creative and technical firepower on display at these types of tech gatherings and apply those talents in a new, socially beneficial direction.

“We thought it was really important for people with skills in data and technology to do something useful for a change,” Ghani said. “A lot of work on putting ads in sidebars and optimizing click-throughs or moving money around…there’s nothing wrong with those things, except there’s an opportunity cost. If you’re doing those things, you’re not doing something useful. So we decided it’s time for people who really have the skills to do useful things to really have them do those things.”



As the Eric & Wendy Schmidt Data Science for Social Good fellowship enters its third week, the orientation ice-breakers of the first couple of days have given way to the grind of hard work. Following the technically oriented “boot camp” of the first week, where fellows got a crash course in the software and tools at their disposal this summer, the second week featured a different sort of educational experience. A steady stream of experts, on topics ranging from Chicago crime and public transit to energy infrastructure and early childhood interventions, visited the DSSG space to expose fellows to the gritty details of the real-world problems they will address.

The purpose of these visits is for the fellows to learn about “the dark matter of public policy data,” the important information that won’t necessarily show up in the numbers they’ll work with during their projects. Some of the speakers chose to give the fellows a little dose of humility, such as Paul O’Connor from the architecture firm Skidmore, Owings & Merrill, who challenged them with the questions “Who are you, and what are you looking for?” amid a history lesson on Chicago.



Forest of synthetic pyramidal dendrites grown using Cajal’s laws of neuronal branching (Wikimedia Commons)

Trauma surgeons know how to fix gunshot wounds, lacerations and broken bones. It’s what comes afterwards that really worries them. Even after the initial injury is treated, patients are at risk for secondary issues such as infection, sepsis and organ failure. While the biological pathways involved in these processes have been well studied and characterized, effective interventions to reliably stop the dangerous cascade have yet to be discovered.

“It was very frustrating for me to not have the drugs and tools necessary to fix what I thought was actually going wrong with those patients,” said trauma surgeon and CI senior fellow Gary An, in his University of Chicago Alumni Weekend UnCommon Core talk. “Often we know what will happen, but we have no way to stop it.”

The current fashionable approach to such intractable problems in medicine and other fields is Big Data, where answers hiding in massive datasets will be uncovered by advanced analytic methods. But quoting Admiral Ackbar, An warned the audience that this approach alone “is a trap,” generating a multitude of correlations and hypotheses that don’t always translate into real world applications.

“What it wants to appeal to is magic…if you can get enough data and a big powerful computer, an answer will magically appear,” said An, an associate professor of surgery at University of Chicago Medicine. “That’s fine if you want to diagnose or characterize. But if we want to engineer interventions to be able to manipulate systems, we need to have presumptions of mechanistic causality; we need to be able to test hypotheses.”



The Data Science for Social Good summer fellowship is officially underway, as (most of) the 36 fellows have made it to Chicago, met their mentors and explored their new digs on the Chicago River. Juan-Pablo Velez captured the excitement of the first day over at the fellowship’s blog, where there will be plenty of updates all summer.

We spent our first day getting to know the program, and each other.

Program director Rayid Ghani kicked things off by welcoming mentors, staff, and fellows.

“Three months ago, we didn’t know we were doing this. Lots of people did a ton of work to get you here,” Ghani said. “But this is now your program – it’s up to you to make this work.”

“No pressure.”

Details of the projects our crew of data scientists will be tackling this summer will be announced later this week.


In its early days, the Urban Center for Computation and Data formed a valuable partnership with the data team installed by Mayor Rahm Emanuel within Chicago’s city government. Leading the city’s efforts to free up data internally and externally was Chief Data and Information Officer Brett Goldstein, an alumnus of the UChicago computer science program and the restaurant-reservation startup OpenTable. Goldstein’s team and UrbanCCD worked together on the SmartData platform proposal that was chosen for a $1 million grant in the Bloomberg Mayors Challenge earlier this year, and Goldstein was the keynote speaker at the first Urban Sciences Research Coordination workshop in February.

So we are very excited about the news that Goldstein will soon be joining the University of Chicago as the inaugural Fellow in Urban Science at the Harris School of Public Policy. Goldstein will continue to work with UrbanCCD researchers on the SmartData platform and other projects, while also helping with the launch of a master’s degree in computation and public policy and the Urban Technology Innovators’ Conference, a new initiative organized by Chicago Harris and the City of Chicago that seeks to create a peer-learning network for municipal technology innovators.

“Chicago Harris demonstrates a commitment to rigorous interdisciplinary scholarship, with strong partnerships with the Department of Computer Science and the Computation Institute, and a desire to advance the field of data science, especially so it can be leveraged for public service,” Goldstein said. “I am excited about the opportunity to continue working to meld urban science and data analytics and work with this impressive community of faculty and staff.”

You can read more coverage of Goldstein’s move and career so far at Crain’s Chicago Business and Government Technology.


CERN is known as the current world epicenter of particle physics, the home of the Large Hadron Collider and thousands of scientists expanding our knowledge of the universe’s most basic ingredients. For one day earlier this month, the Geneva, Switzerland laboratory was also a meeting place for scientists, philosophers, musicians, animators and even will.i.am to share their grand ideas for the first-ever TEDxCERN event. Among the speakers riffing on the theme of “Multiplying Dimensions” was CI Director Ian Foster, who presented his vision for The Discovery Cloud and accelerating the pace of science by bringing advanced data and computation tools to the smaller laboratories and citizen scientists of the world.

What we need to do is to in a sense create a new set of cloud services which do for science what the myriad of business cloud services do for business. We might call it the discovery cloud. It would be a set of services that take on, automate, and allow people to handle or outsource many of the routine activities that currently dominate research…I believe if we do that right, we can really make a transformative difference in how people do science.

You can watch a full video of Foster’s presentation below:

International Science Grid This Week also covered Foster’s talk and another given a day earlier to the information technology team at CERN. In that speech, Foster delivered a similar message about the need to bring advanced cyberinfrastructure to the “99%” of laboratories that can’t afford to build international data grids akin to what CERN used in its discovery of the Higgs boson.

“We have managed to create exceptional infrastructure for the 1%, but what about the rest?” asks Foster. “We have big science, but small labs. How do we deliver cyber infrastructure to small groups? They need something that is frictionless, affordable and sustainable.”

The exascale (one million trillion calculations per second) is the next landmark in the perpetual race for computing power. Although this speed is 500 times faster than the world’s current leading supercomputers and many technical challenges remain, experts predict that the exascale will likely be reached by 2020. But while the United States is used to being the frontrunner in high-performance computing achievement, this leg of the race will feature intense competition from Japan, China and Europe. In order to pass the exascale barrier first and reap the application rewards in energy, medicine and engineering research, government funding is critical.

On Capitol Hill yesterday, CI Senior Fellow Rick Stevens testified to this urgency as part of a Congressional Subcommittee on Energy hearing, “America’s Next Generation Supercomputer: The Exascale Challenge.” The hearing was related to the American High-End Computing Leadership Act [pdf], a bill proposed by Rep. Randy Hultgren of Illinois to improve the HPC research program of the Department of Energy and make a renewed push for exascale research in the United States. You can watch archived video of the hearing here, and Stevens’ prepared opening statement is reproduced in full below.


Thank you Chairman Lummis, Ranking Member Swalwell, and Members of the Subcommittee. I appreciate this opportunity to talk to you about the future of high performance computing research and development, and about the importance of U.S. leadership in the development and deployment of Exascale computing.

I am Rick Stevens, the Associate Laboratory Director responsible for Computing, Environment, and Life Sciences research at Argonne National Laboratory. My laboratory operates one of the two Leadership Class computing systems for DOE’s Office of Science. My own research focuses on finding new ways to increase the impact of computation on science, from the development of new, more powerful computer systems to the creation of large-scale applications for computational genomics targeting research in energy, the environment and infectious disease. I also am a Professor at the University of Chicago in the Department of Computer Science, where I hold senior fellow appointments in the University’s Computation Institute and the Institute for Genomics and Systems Biology.

I believe that advancing American leadership in high-performance computing is vital to our national interest. High-performance computing is a critical technology for the nation. It is the underlying foundation for advanced modeling and simulation and big data applications.
