Forest of synthetic pyramidal dendrites grown using Cajal’s laws of neuronal branching (Wikimedia Commons)
Trauma surgeons know how to fix gunshot wounds, lacerations and broken bones. It’s what comes afterwards that really worries them. Even after the initial injury is treated, patients are at risk for secondary issues such as infection, sepsis and organ failure. While the biological pathways involved in these processes have been well studied and characterized, effective interventions to reliably stop the dangerous cascade have yet to be discovered.
“It was very frustrating for me to not have the drugs and tools necessary to fix what I thought was actually going wrong with those patients,” said trauma surgeon and CI senior fellow Gary An, in his University of Chicago Alumni Weekend UnCommon Core talk. “Often we know what will happen, but we have no way to stop it.”
The currently fashionable approach to such intractable problems in medicine and other fields is Big Data: the idea that answers hiding in massive datasets will be uncovered by advanced analytic methods. But, quoting Admiral Ackbar, An warned the audience that this approach alone “is a trap,” generating a multitude of correlations and hypotheses that don’t always translate into real-world applications.
“What it wants to appeal to is magic…if you can get enough data and a big powerful computer, an answer will magically appear,” said An, an associate professor of surgery at University of Chicago Medicine. “That’s fine if you want to diagnose or characterize. But if we want to engineer interventions to be able to manipulate systems, we need to have presumptions of mechanistic causality; we need to be able to test hypotheses.”
Posted in Biology, Medicine, Modeling & Simulation
CERN is known as the current world epicenter of particle physics, the home of the Large Hadron Collider and thousands of scientists expanding our knowledge of the universe’s most basic ingredients. For one day earlier this month, the Geneva, Switzerland, laboratory was also a meeting place for scientists, philosophers, musicians, animators and even will.i.am to share their grand ideas at the first-ever TEDxCERN event. Among the speakers riffing on the theme of “Multiplying Dimensions” was CI Director Ian Foster, who presented his vision for The Discovery Cloud and for accelerating the pace of science by bringing advanced data and computation tools to the smaller laboratories and citizen scientists of the world.
What we need to do is to in a sense create a new set of cloud services which do for science what the myriad of business cloud services do for business. We might call it the discovery cloud. It would be a set of services that take on, automate, and allow people to handle or outsource many of the routine activities that currently dominate research…I believe if we do that right, we can really make a transformative difference in how people do science.
You can watch a full video of Foster’s presentation below:
International Science Grid This Week also covered Foster’s talk and another given a day earlier to the information technology team at CERN. In that speech, Foster delivered a similar message about the need to bring advanced cyberinfrastructure to the “99%” of laboratories that can’t afford to build international data grids akin to what CERN used in its discovery of the Higgs boson.
“We have managed to create exceptional infrastructure for the 1%, but what about the rest?” asks Foster. “We have big science, but small labs. How do we deliver cyberinfrastructure to small groups? They need something that is frictionless, affordable and sustainable.”
Posted in Cyberinfrastructure, Globus Online, Science as a Service, Video
The exascale (one million trillion, or 10^18, calculations per second) is the next landmark in the perpetual race for computing power. Although this speed is roughly 50 times faster than the world’s current leading supercomputers and many technical challenges remain, experts predict that the exascale will likely be reached by 2020. But while the United States is accustomed to being the frontrunner in high-performance computing, this leg of the race will feature intense competition from Japan, China and Europe. To break the exascale barrier first and reap the application rewards in energy, medicine and engineering research, government funding is critical.
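As a rough sanity check on the magnitude involved, here is a small calculation assuming Titan’s roughly 17.59-petaflop Linpack score (the fastest system on the November 2012 Top500 list) as the contemporary baseline:

```python
# Compare an exaflop machine against Titan, assumed here as the
# fastest contemporary system (~17.59 petaflops on Linpack).
exaflop = 1e18            # 1 exaflop = 10**18 floating-point operations/second
titan_linpack = 17.59e15  # 17.59 petaflops, expressed in flops
speedup = exaflop / titan_linpack
print(f"An exaflop machine would be ~{speedup:.0f}x faster than Titan")
```

Under that assumption, the jump to exascale is a factor of about 57, i.e. tens of times faster than the leading systems of the day rather than hundreds.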
On Capitol Hill yesterday, CI Senior Fellow Rick Stevens testified to this urgency as part of a Congressional Subcommittee on Energy hearing, “America’s Next Generation Supercomputer: The Exascale Challenge.” The hearing was related to the American High-End Computing Leadership Act [pdf], a bill proposed by Rep. Randy Hultgren of Illinois to improve the HPC research program of the Department of Energy and make a renewed push for exascale research in the United States. You can watch archived video of the hearing here, and Stevens’ prepared opening statement is reproduced in full below.
Thank you Chairman Lummis, Ranking Member Swalwell, and Members of the Subcommittee. I appreciate this opportunity to talk to you about the future of high performance computing research and development, and about the importance of U.S. leadership in the development and deployment of Exascale computing.
I am Rick Stevens, the Associate Laboratory Director responsible for Computing, Environment, and Life Sciences research at Argonne National Laboratory. My laboratory operates one of the two Leadership Class computing systems for DOE’s Office of Science. My own research focuses on finding new ways to increase the impact of computation on science – from the development of new more powerful computer systems to the creation of large-scale applications for computational genomics targeting research in energy, the environment and infectious disease. I also am a Professor at the University of Chicago in the Department of Computer Science, where I hold senior fellow appointments in the University’s Computation Institute and the Institute for Genomics and Systems Biology.
I believe that advancing American leadership in high-performance computing is vital to our national interest. High-performance computing is a critical technology for the nation. It is the underlying foundation for advanced modeling and simulation and big data applications.
Posted in Argonne, Exascale