
Archive for the ‘Globus Online’ Category

CERN is known as the current world epicenter of particle physics, the home of the Large Hadron Collider and thousands of scientists expanding our knowledge of the universe’s most basic ingredients. For one day earlier this month, the laboratory near Geneva, Switzerland was also a meeting place for scientists, philosophers, musicians, animators and even will.i.am, who gathered to share their grand ideas at the first-ever TEDxCERN event. Among the speakers riffing on the theme of “Multiplying Dimensions” was CI Director Ian Foster, who presented his vision for The Discovery Cloud and for accelerating the pace of science by bringing advanced data and computation tools to the smaller laboratories and citizen scientists of the world.

What we need to do is, in a sense, create a new set of cloud services which do for science what the myriad of business cloud services do for business. We might call it the discovery cloud. It would be a set of services that take on, automate, and allow people to handle or outsource many of the routine activities that currently dominate research…I believe if we do that right, we can really make a transformative difference in how people do science.

You can watch a full video of Foster’s presentation below:

International Science Grid This Week also covered Foster’s talk and another given a day earlier to the information technology team at CERN. In that speech, Foster delivered a similar message about the need to bring advanced cyberinfrastructure to the “99%” of laboratories that can’t afford to build international data grids akin to the one CERN used in its discovery of the Higgs boson.

“We have managed to create exceptional infrastructure for the 1%, but what about the rest?” asks Foster. “We have big science, but small labs. How do we deliver cyber infrastructure to small groups? They need something that is frictionless, affordable and sustainable.”


©CERN Photo: Brice Maximilien / Laurent Egli — at CERN.


WE CAME, WE SAW, WE CERNED

We were thrilled to spend Friday morning with the folks at TEDxCERN via webcast, enjoying fascinating talks by CI director Ian Foster and several other amazing scientists and educators. Foster’s talk focused on “The Discovery Cloud,” the idea that many complex and time-consuming research tasks can be moved to cloud-based tools, freeing up scientists to accelerate the pace of discovery. We’ll post the video when it’s up, but for now, enjoy this great animation produced for the conference by TED-Ed explaining grid computing, cloud computing and big data.

OTHER NEWS IN COMPUTATIONAL SCIENCE

Speaking of CERN, ISGTW ran a lengthy profile of the computing grid that powers particle physics research on its Large Hadron Collider. In the three years since the LHC started running, it has produced 70 petabytes of data, which are distributed to more than 150 sites around the world for coordinated, parallel analysis. As Wired wrote back in 2004, the LHC grid was built on the Globus Toolkit, created by Ian Foster and Carl Kesselman, “the Lewis and Clark of grid computing.”

Some of the science-as-a-service ideas Foster discussed in his TEDxCERN talk were brought up a week earlier by Renee DiResta in O’Reilly Radar. Her piece features companies that provide 3D microscopic scanning, data platforms for computational biology or drug discovery, and even connections with freelance scientists.

Computation is eating science, and that’s a good thing…but funding agencies and researchers need to change or be digested, writes Will Schroeder at Kitware.


People who work in laboratories take a lot of things for granted. When they come into work in the morning, they expect the equipment to have power, the sink to produce hot and cold water, and the internet and e-mail to be functional. Because these routine services are taken care of “behind the scenes” by facilities and IT staff, scientists can get started right away on their research.

But increasingly, scientists are hitting a new speed bump in their day-to-day activities: the storage, movement and analysis of data. As datasets grow far beyond what can easily be handled on a single desktop computer and long-distance collaborations become increasingly common, frustrated researchers find themselves spending more and more time and money on data management. To get the march of science back up to top speed, new services must be provided that make handling data as simple as switching on the lights.

That mission was the common thread through the second day of the GlobusWorld conference, an annual meeting for the makers and users of the data management service, held this year at Argonne National Laboratory. As Globus software has evolved from the grid middleware that connects computing centers around the world into a cloud-based service for moving and sharing data, its focus has shifted from large Big Science collaborations to individual researchers. Easing the headache for those smaller laboratories with little to no IT budget can make a big impact on the pace of their science, said Ian Foster, Computation Institute Director and Globus co-founder, in his keynote address.

“We are sometimes described as plumbers,” Foster said. “We are trying to build software and services that automate activities that get in the way of discovery and innovation in research labs, that no one wants to be an expert in, that people find time-consuming and painful to do themselves, and that can be done more effectively when automated. By providing the right services, we believe we can accelerate discovery and reduce costs, which are often two sides of the same coin.”
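To make that “plumbing” a bit more concrete, here is a minimal sketch of what scripting a managed file transfer looks like with the present-day Globus SDK for Python (globus_sdk). This is an illustration rather than anything from the original post; the access token, endpoint UUIDs and paths are placeholder assumptions.

```python
# Minimal sketch: scripting a Globus file transfer with the globus_sdk package.
# Assumes you already have an OAuth2 access token for the Transfer service and
# know the UUIDs of the source and destination endpoints (placeholders below).
import globus_sdk

TRANSFER_TOKEN = "REPLACE_WITH_TRANSFER_ACCESS_TOKEN"
SRC_ENDPOINT = "aaaaaaaa-0000-0000-0000-000000000000"   # e.g. a lab server (placeholder)
DST_ENDPOINT = "bbbbbbbb-1111-1111-1111-111111111111"   # e.g. a campus cluster (placeholder)

# Authenticate the Transfer client with the token.
tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN)
)

# Describe the transfer: one directory, copied recursively, with checksum verification.
tdata = globus_sdk.TransferData(
    tc,
    SRC_ENDPOINT,
    DST_ENDPOINT,
    label="nightly instrument data sync",
    sync_level="checksum",
)
tdata.add_item("/data/instrument/run42/", "/scratch/run42/", recursive=True)

# Submit the request; the hosted service manages the actual data movement.
task = tc.submit_transfer(tdata)
print("Submitted transfer task:", task["task_id"])
```

The point of the service model is that the fiddly parts, such as retries and integrity checks, are handled by the hosted service rather than by a script the researcher has to babysit.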

(more…)


The original Human Genome Project needed 13 years, hundreds of scientists and billions of dollars to produce the first complete human DNA sequence. Only ten years after that achievement, genome sequencing is a routine activity in laboratories around the world, creating new demand for analytics tools that can grapple with the large datasets these methods produce. Large projects like the HGP could assemble their own expensive cyberinfrastructure to handle these tasks, but even as sequencing gets cheaper, data storage, transfer and analysis remain a time and financial burden for smaller labs.

Today at the Bio-IT World conference in Boston, the CI’s Globus Online team officially unveiled its solution for these scientists: Globus Genomics. Per the news release, the new service “integrates the data transfer capabilities of Globus Online, the workflow tools of Galaxy, and the elastic computational infrastructure of Amazon Web Services. The result is a powerful platform for simplifying and streamlining sequencing analysis, ensuring that IT expertise is not a requirement for advanced genomics research.”
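The Galaxy layer of that stack can be driven programmatically. As a purely illustrative sketch (not the Globus Genomics interface itself), here is roughly how a sequencing workflow might be launched through the community bioblend client for Galaxy; the server URL, API key, workflow name and file names are placeholder assumptions.

```python
# Minimal sketch: driving a Galaxy workflow programmatically with bioblend.
# Illustrates the kind of workflow automation Galaxy provides; it is not the
# Globus Genomics API itself. URL, key, names and IDs are placeholders.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://galaxy.example.org", key="REPLACE_WITH_API_KEY")

# Create a history to hold this run's datasets and results.
history = gi.histories.create_history(name="exome-run-42")

# Upload the raw reads (Globus Genomics instead stages data via Globus transfer).
upload = gi.tools.upload_file("sample42.fastq.gz", history["id"])
dataset_id = upload["outputs"][0]["id"]

# Look up a previously imported analysis workflow by name (placeholder name).
workflow = gi.workflows.get_workflows(name="exome-variant-calling")[0]

# Map the uploaded dataset to the workflow's first input and launch it.
invocation = gi.workflows.invoke_workflow(
    workflow["id"],
    inputs={"0": {"id": dataset_id, "src": "hda"}},
    history_id=history["id"],
)
print("Launched workflow invocation:", invocation["id"])
```

In Globus Genomics, the analogous steps sit behind a web interface, with data staged in and out via Globus transfer and the computation running on Amazon Web Services, per the release quoted above.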

In the release, positive feedback is provided by researchers including William Dobyns, who studies the genetics and neuroscience of developmental disorders at the University of Washington, and Kenan Onel, a pediatric oncologist who directs the Familial Cancer Clinic at The University of Chicago Medicine. Nancy Cox, section chief for genetic medicine at UChicago Medicine, said the service enabled her laboratory to meet the big data challenges of modern genomic research.

“We needed a solution that would give us flexibility to extend our analysis pipelines and apply them to very large data sets,” says Dr. Cox. “Globus Genomics has provided us with a key set of tools and scalable infrastructure to support our research needs.”

If you’re at the Bio-IT World conference, you can visit the Globus Online team at Booth 100 and get a tutorial on the new Globus Genomics service.


Ian Foster speaking at the RDMI workshop, March 13, 2013. Photo by Rick Reinhard.


Big science projects can afford big cyberinfrastructure. For example, the Large Hadron Collider at CERN in Geneva generates 15 petabytes of data a year, but also boasts a sophisticated data management infrastructure for the movement, sharing and analysis of that gargantuan data flow. But big data is no longer an exclusive problem for these massive collaborations in particle physics, astronomy and climate modeling. Individual researchers, faced with new laboratory equipment and methods that can generate their own torrents of data, increasingly need their own data management tools, but lack the hefty budget large projects can dedicate to such tasks. What can the 99% of researchers doing big science in small labs do with their data?

That was how Computation Institute director Ian Foster framed the mission at hand for the Research Data Management Implementations Workshop, happening today and tomorrow in Arlington, VA. The workshop was designed to help researchers, collaborations and campuses deal with the growing need for high-performance data transfer, storage, curation and analysis, while avoiding wasteful redundancy.

“The lack of a broader solution or methodology has led basically to a culture of one-off implementation solutions, where each institution is trying to solve their problem their way, where we don’t even talk to each other, where we are basically reinventing the wheel every day,” said H. Birali Runesha, director of the University of Chicago Research Computing Center, in his opening remarks.

(more…)


“Software as a Service,” or SaaS, is a concept that has revolutionized the way people use their computers. Every time you check your e-mail on Gmail, stream a movie over Netflix or customize a radio station on Pandora, you’re accessing software as a service through a browser, saving you the trouble of installing programs and storing data locally on your own computer. In an essay written for O’Reilly Radar, Renee DiResta argues for swapping out the first S in SaaS for “science,” creating online tools that let scientists outsource time-intensive and expensive processes such as specialized experiments and data sharing, storage and analysis.

Perhaps we can facilitate scientific progress by streamlining the process. Science as a service (SciAAS?) will enable researchers to save time and money without compromising quality. Making specialized resources and institutional expertise available for hire gives researchers more flexibility. Core facilities that own equipment can rent it out during down time, helping to reduce their own costs. The promise of science as a service is a future in which research is more efficient, creative, and collaborative.

In the comments on the article and on his blog, CI director Ian Foster responded, agreeing that science as a service can free up researchers’ time and budgets and pointing to one of the CI’s own SciAAS initiatives:

This article echoes several themes that I speak to often. In my conception, every researcher is an entrepreneur, and researchers, like entrepreneurs, should be able to run their (virtual) operations from coffee shops. Science as a service frees researchers to work when and where they want, while also saving them time and money.

(more…)
