Category Archives: AAAS Annual Meeting 2008
For those of you who went to this year’s AAAS meeting in Boston, now is a chance to sip coffee, recover from jet lag and go over all those indecipherable notes you took so hastily. For those of you who didn’t go, I hope that my blog has given you a taster of the symposia pertaining most to physics.
There’s been a remarkable range of topics covered. I heard the Rwandan president Paul Kagame share his vision for scientific education in Africa, the AAAS president David Baltimore tell of the importance of US research, and Princeton University’s Harold Shapiro discuss how money should be allocated within science budgets. I saw eye-popping pictures produced by present-day supercomputer simulations, graphs depicting the awkward public opinion surrounding nanotechnology, and pamphlets explaining how to spot a Weapon of Mass Destruction. I spoke to folk artists about nuclear physics, directors about the status of forthcoming facilities, and press officers about mismatching meetings.
On a valedictory note, I also found it refreshing to hear other journalists talk about the difficulties of reporting science, and scientists acknowledge the importance of the media in helping their cause. The general feeling at the meeting was that the media will have a great — and to a certain extent isolated — role to play in conveying important issues such as funding and climate change.
This time next year, physicsworld.com will be blogging from Chicago for the 2009 AAAS meeting. But if you can’t wait until then for more physics gossip, be sure to check in on 10 March when physicsworld.com editor Hamish Johnston and Physics World news editor Michael Banks will be blogging from this year’s American Physical Society (APS) meeting in New Orleans.
“Is the title an oxymoron?” Harold Shapiro asked rhetorically at the beginning of his talk, The Responsible Use of Public Resources in Elementary Particle Physics. He wanted to show how one goes about prioritizing funding within the US science budget for high-energy physics. Later in his talk he posed another rhetorical question: “Are we in the US silently executing an exit strategy?”
Shapiro is professor of economics and public affairs at Princeton University and also chairs the US elementary particle physics committee. The committee is composed of nine particle physicists alongside five non-particle physicists and six non-physicists, and recently submitted a report recommending research priorities to the US National Academy.
The report begins by summarizing, for the uninitiated, the main unresolved issues in physics: the nature of space and time; the origin of mass; and the beginning and fate of the universe. Then it notes that the most likely route to significant progress will be to reconcile Einstein’s theory of general relativity, which describes how gravity arises from the curvature of space–time, with the Standard Model of particle physics. A theory known as supersymmetry might be able to do this, but to be tested it really needs the help of particle accelerators that operate at tera-eV (10¹² eV) energy scales.
One such accelerator is the Large Hadron Collider (LHC) at CERN, due to start up this June. However, it ought to be complemented by the International Linear Collider (ILC), an electron–positron machine that is still in the R&D stage. The US would like to submit a credible bid to host the ILC, and that requires making significant R&D contributions; hence the committee firmly recommended the project as a priority to the National Academy.
Trouble is, particle accelerators are expensive pieces of kit. The LHC will clock in at around $9.2bn, while the ILC could easily be double or triple that. Playing devil’s advocate, I asked why funds allocated for particle-physics facilities would not be better spent on research into more useful physics — alternative energy, for instance. Shapiro said there is no way of quantifying which is more important, adding that, for him, understanding the ways of the universe “is an extraordinarily important issue.”
Lawrence Krauss of Case Western Reserve University, the symposium organizer, also chipped in. “There is no other way of answering these big questions,” he said. “And it’s worth remembering that the entire cost of the LHC is the same as nine days in Iraq.”
Don Geesman of Argonne National Laboratory is well-known not only for his work on quarks but also for serenading his audience with folk songs about underground neutrino detectors (“For it’s dark as a dungeon and damp as the dew/where neutrinos come slowly and the funding does too”). Although we were not fortunate enough to hear any music at his symposium this afternoon, I did manage to corner the maestro afterwards for a quick chat about the ties between nuclear physics and astrophysics.
Geesman said that most of the crossover between the two disciplines occurs when studying explosive events such as supernovae and novae. Because such events occur on rapid timescales, they can be greatly affected by the physics of nuclear isotopes that have a very short lifetime themselves — ones that can only be glimpsed in facilities such as Argonne’s ATLAS accelerator. A lot of the experiments at ATLAS seek to understand how the nature of these short-lived isotopes changes when they are at the sort of high temperatures found in supernovae.
Why is this important? Thermonuclear, “Type Ia” supernovae have long been considered the “standard candles” of the cosmos because their brightness is so consistent. It was by using these standard candles as distance markers that, in 1998, physicists concluded that the expansion of the universe is accelerating, and therefore that there must be some kind of all-pervasive “dark energy”. But recent observations have led astrophysicists to suspect that standard candles might not be so standard after all. Geesman explained that a better understanding of short-lived isotopes will give astrophysicists a solid theoretical basis for judging how reliable standard candles are.
There’s not a fantastic selection of freebies at this year’s AAAS meeting, although there are one or two gems. A brain that sticks to walls, a pen that unfolds at the push of a button and — if you can make it past the dark sunglasses and curly-wired earphones of the agents at the FBI stand — a copy of Weapons of Mass Destruction: A Pocket Guide.
However, the freebie producing the biggest buzz comes from ITER, the project that is desperately trying to get a fusion power plant up and running by 2018. Go to its stand and you can pick up a pair of magnetic bean-shaped “fusion particles”. The idea is that you take one in each hand, throw them up in the air, and gasp in amazement as they stick and flutter together, thus demonstrating the basic principle of fusion.
I spent a good two minutes earlier today trying to make those fusion particles work, but to no avail. Then, back in the press room, a journalist who wished to remain anonymous revealed the secret. “You have to believe in ITER,” he told me. “They won’t ever work unless you say you believe.”
The US science funding cuts revealed in last year’s omnibus bill dealt a severe blow to US physicists, with Fermilab in particular being forced to lay off 200 of its staff. If funding doesn’t recover, the US might find that key research and development institutions begin to settle elsewhere. “It takes something like the race to the Moon to open up the coffers,” lamented Robert Rosner, Director of Argonne National Laboratory, at a press breakfast this morning.
Rosner was appealing to the media to help the public see the benefit of physics research because, he said, they are not so good at it on their own. Many important discoveries in physics, from the transistor to the internet, only became widely adopted after a long period of development. “But culture is impatient,” he noted. The question is how to convey to a public accustomed to instant gratification the need to be patient in science.
Another problem in Rosner’s eyes is the narrow demographic involved in science education. Most university students are Caucasian males, even though they comprise a smaller and smaller cross-section of the population. This might be because of the “nerdy” image of science, which only seems to disappear in extraordinary circumstances such as the launch of Russia’s Sputnik I 50 years ago. Or it might be that scientists receive poor compensation for their efforts, compared with, say, doctors.
Unlike in the past, when the US enjoyed a brain drain of scientists from abroad, today fewer top scientists migrate to the country. Rosner thinks this is because of the difficulty of getting a work visa and a lack of clear economic incentives. Science, he says, is a “meritocracy”: global institutions take root wherever the talent is.
Rosner’s presentation echoed feelings from yesterday’s symposium on the media’s coverage of climate change, namely that the best way — if not the only way — to persuade the government and the public of the merit of scientific research is through the media. At the symposium, John Holdren of Harvard University pointed out that the erstwhile British prime minister Margaret Thatcher only became convinced of the importance of the environmental cause after she was lumbered with a pile of New Scientist magazines to take on holiday.
It’s not often that journalists are the ones being quoted. And going by the attendance of this afternoon’s symposium, Global warming heats up: how the media covers climate change, a lot of people were eager to find out what they had to say for themselves.
Andrew Revkin of the New York Times gave several reasons why the accurate reporting of climate change often clashes with an editor’s news values. He said that because many developments in climate change have no obvious “peg” to sell them as news stories, editors often leap on what they can sell, regardless of whether they are on solid scientific ground. One example is Hurricane Katrina, which many newspapers reported as a direct consequence of global warming despite the lack of any real evidence to back the claim. “You lose all those caveats that scientists crave,” he said.
He went on to say that the science, which by its nature is complex, is difficult to report accurately when time and page space are limited. So simple messages such as “CO2 equals warmer planet” are inevitably conveyed more frequently than the factors affecting the likelihood of an ice meltdown in Greenland. In a similar vein, editors do not understand the importance of small developments in research. “The word ‘incremental’ in the Times newsroom is the death knell to a story,” he joked.
Revkin also addressed the thorny topic of opinion in science journalism: “For every PhD there is an equal and opposite PhD.” In other words, it is easy to find a scientist with a counter-opinion (when dealing with climate change, a sceptic) to make a story appear balanced. David Dickson, director of the website SciDev.Net, said that this tendency is a result of a fundamental misunderstanding among many journalists about the nature of science. Unlike in politics, for example, in the scientific community there is such a thing as consensus, and one scientist’s opinion can carry far more weight than another’s.
Nanotechnology has proved to be a gold mine for applied physics, and recent predictions suggest that by 2015 it could have generated a $1 trillion global market. That’s not very surprising: new applications for nanotechnology seem to crop up every day. Only this week, physicsworld.com reported on an electricity-generating fabric that could be woven from nanowire fibres. But will this trend continue indefinitely?
One stumbling block might be the public, as Steve Currall, head of management science and innovation at University College London, UK, has found. In this afternoon’s talk, Consumer Attitudes Toward Nanotech, he summarized the fruits of his surveys of how the public weighs up both the risks and benefits of nanotechnology.
A graph that Currall’s group published at the end of 2006 (Nature Nanotech. 1 153) shows how a randomly selected cross-section of the public rated the risks and benefits of 44 technologies, including nanotechnology. Although at first glance nanotechnology sits, apparently safe, slap-bang in the middle, bear in mind that the neighbouring “SC” stands for stem-cell research, which (at least for the public) is a contentious area. “It shows that the jury is still out with regards to the public perception of nanotechnology,” said Currall.
He then presented the results of a more specific survey. His group came up with four hypothetical, innovative nanotechnology products — a medicinal drug, a skin-care product, a car tyre and a coolant — and then put them, with various risks and benefits attached, to another cross-section of the public. (For instance, one statement might have been: “A new coolant will be more effective than the nearest non-nanotech alternative by 65%, but will pose a 12% worse influence on the atmosphere.”)
They found that the public’s decision on whether to approve a product is based more on the product’s benefit than on its risk. “We didn’t expect that,” remarked Currall. I would have. Consumers should practically be defined by their tendency to see the honey and miss the bees. What was interesting, however, was that the public did not weigh the benefits and risks sequentially, as a proper risk assessment would. Instead, they weighed them up simultaneously — perhaps giving a more unpredictable outcome.
Robert Aymar, director general of CERN, gave his talk on the forthcoming Large Hadron Collider with veteran poise. The particle accelerator, he said, will reveal the origin of mass, the nature of dark matter and the essence of the primordial plasma. The results will dictate the future course of high-energy physics. It will be a paradigm for future international collaborations. Thirty-eight countries, 2310 scientists, 15 million gigabytes of data.
We had to wait until the end of the symposium for questions, at which point several hands shot up, mine included. Is the accelerator’s repeatedly delayed start-up still on course for late May/early June? “We plan to have the beam running at injection energy in June, with data collecting in the summer time,” replied Aymar, somewhat tiredly. Well, what did you expect him to say?
Visualizing data really has come a long way. It may have started with geniuses like Galileo mapping the movement of sunspots in a series of sketches, but four centuries later it is all about crunching terabytes of data on supercomputers. “A graphic message has to hit you between the eyes,” said Alyssa Goodman, who was moderator at the symposium Seeing Science.
Chris Johnson — the director of the Scientific Computing and Imaging Institute and a faculty member of the department of physics at the University of Utah — spoke of a “golden age in scientific computing” that we are now entering. He has the unenviable job of presenting the results of supercomputer simulations in an intuitive form. Take, for example, a video still that Johnson’s group produced from a simulation of fire.
In the past, video simulations would simply have rendered the main parameter — let’s say, temperature — as colour varying with 3D position. But it would be difficult to tell from this alone whether the simulation is correct. After all, the colour of a real fire doesn’t just result from temperature; it results from all sorts of other physical processes, such as falling shadows and the scattering of light. Johnson’s group adds these processes into the simulation output. “Adding information can make 3D models easier to comprehend,” he said.
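To make the idea concrete, here is a minimal toy sketch in Python with NumPy (entirely hypothetical, not code from Johnson’s group): it maps a 2D “temperature” field to a crude colour ramp, then adds a simple Lambertian lighting cue by treating the field as a height map. The extra shading information is exactly the kind of additional physical cue Johnson described, helping the eye read structure that flat colour alone would hide.

```python
import numpy as np

def shade_temperature_field(temp, light=(0.5, 0.5, 0.7)):
    """Render a 2D temperature array as an (H, W, 3) RGB image in [0, 1],
    combining a colour ramp with a Lambertian lighting cue. The function
    name and colour ramp are illustrative choices, not anyone's real code."""
    # Normalise temperatures to [0, 1]
    t = (temp - temp.min()) / (temp.max() - temp.min() + 1e-12)

    # Base colour ramp: cold = blue, hot = red
    rgb = np.stack([t, 0.2 * np.ones_like(t), 1.0 - t], axis=-1)

    # Treat the field as a height map and build surface normals from its gradient
    gy, gx = np.gradient(t)
    normals = np.stack([-gx, -gy, np.ones_like(t)], axis=-1)
    normals /= np.linalg.norm(normals, axis=-1, keepdims=True)

    # Lambertian term: how directly each point faces the light source
    l = np.asarray(light, dtype=float)
    l /= np.linalg.norm(l)
    lambert = np.clip(normals @ l, 0.0, 1.0)

    # Modulate the colours by the lighting cue
    return np.clip(rgb * lambert[..., None], 0.0, 1.0)

# A smooth synthetic "flame": a hot blob at the centre of a 64 x 64 grid
y, x = np.mgrid[0:64, 0:64]
temp = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 200.0)
image = shade_temperature_field(temp)
```

The shaded image darkens slopes that face away from the light, so peaks and valleys in the temperature field become visible at a glance; the same principle, scaled up with shadows and light scattering, is what makes supercomputer renderings of fire so much easier to interpret.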
After the talk, the man sitting next to me jumped up and proclaimed that Johnson’s presentation proves that art is integral to science. He certainly has a point: many of the images from the symposium looked fit to be put in an ebony frame and hung on the wall. But Felice Frankel of Harvard University, author of the recent book Envisioning Science, was cautious about making a connection: “People can confuse beautiful art with science, which can be dangerous.”
Paul Kagame is the last person you would expect to see at a science meeting. Eighteen years ago he returned to his native Rwanda after 30 years of exile. In 1994, as the genocide against his people reached its height, he led the Rwandan Patriotic Front to overthrow the incumbent government. Nine years later he became Rwanda’s first democratically elected president.
This evening I listened alongside hundreds of members of the public, scientists and fellow journalists as Kagame — the last keynote speaker at the AAAS president’s address — voiced his vision of using scientific education to help rebuild his torn country. “There can be no better inspiration than the United States of America,” he said. “What we seek to achieve in Rwanda and Africa is taken for granted here.”
Kagame proceeded to explain how he wants to shift the source of Rwanda’s economy from raw-material exports, such as coffee, towards knowledge. To do this, he is almost doubling the funding for scientific research from 1.6% of the gross domestic product to 3% (incidentally, more than the US). “We must keep the steady track of using the powerful tools of science and technology,” he continued. “We have made a good start in Rwanda, but challenges still remain.” Those challenges include building relationships between the government, businesses and universities.
I can only describe it as inspiring to see a man for whom the ravages of war must surely be a raw memory speak with such prudence about science. The sentiment was clearly shared by everyone: he finished his talk to a standing ovation.