Today’s colloquium speaker was Lawrence M. Krauss, who is somewhat well-known for doing a lot of public outreach and having written several books aimed at the general public. (One of these books was The Physics of Star Trek, which I received from three or four separate people as birthday gifts when it came out in 1995.) He’s also done political advocacy, perhaps most notably fighting “intelligent design” creationism in Ohio. Today’s colloquium was about neither Star Trek nor politics, however, but about the “dismal future” of the universe.
The talk was basically a series of extrapolations from the fact that cosmological observations show a universe that is not only expanding, but expanding at an exponentially increasing rate. The most direct consequence is that eventually everything that isn’t gravitationally bound to our galaxy cluster will be receding away at faster than the speed of light, not only inaccessible but invisible. This won’t happen for many billions of years, so it’s not of any particular concern to us personally, but will be an issue for the future of life itself. As a result of being isolated to a single cluster, the amount of energy available becomes limited: I think the estimated number was 3×10^67 joules, for what it’s worth. Consequently, the amount of information that can be processed is also limited, to on the order of 10^120 bits. One of the more interesting numbers quoted was that, if one assumes that Moore’s Law will continue to hold on the rate of information processing, civilization would run through this capacity in just 400 years. (Since at the moment we are limited to the amount of energy here on Earth, I expect Moore’s Law will fail rather sooner than this, which is why I’m skeptical about Singularity talk.)
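The “just 400 years” figure is easy to check on the back of an envelope. The 10^120-bit budget is from the talk; the current global processing rate and the doubling time below are purely my own illustrative guesses:

```python
import math

# Rough check of the "~400 years" claim. N is from the talk; r0 and T are
# illustrative assumptions of mine, not numbers Krauss quoted.
N = 1e120   # total bit operations the cluster's energy budget allows
r0 = 1e36   # assumed current global processing rate, bit-ops per year
T = 1.5     # assumed Moore's-law doubling time, in years

# Cumulative operations after t years of exponential growth:
#   integral of r0 * 2^(t/T) dt  =  r0 * (T / ln 2) * (2^(t/T) - 1)
# Setting this equal to N and solving for t:
t = T * math.log2(1 + N * math.log(2) / (r0 * T))

print(f"Capacity exhausted after roughly {t:.0f} years")
```

The nice thing about exponential growth is that the answer barely depends on the starting rate: being wrong about r0 by ten orders of magnitude only shifts the result by T·log2(10^10) ≈ 50 years.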
Another section of Krauss’ talk was devoted to what cosmology would look like to a far-future civilization in one of these “island universes” created by expansion. Since these future scientists would be unable to observe the universe outside the cluster, they would be unable to infer the expanding universe or the Big Bang, and would conclude that the universe was static. (They could, however, estimate the age of the universe from abundance of various elements.)
Finally, on long timescales everything disappears, as dark matter halos evaporate and galaxies dissipate.
Krauss, being a more public figure than most physicists, was a very good speaker who gave an entertaining talk. He was deliberately provocative, declaring at the beginning that he would alienate most of the audience, and particularly targeted advocates of the anthropic principle. I was hoping for more fireworks in the question session, but it was somewhat tame. A video of this talk will appear at some point here on the department website.
Today’s colloquium was Steve Chu, Nobelist and director of Lawrence Berkeley Laboratory, giving an account of his biophysics experiments. However, rather than report on this I’m going to share a thought I had in the middle of the talk. At one point he was describing a standard optical tweezers technique in which ribosomes are engineered to stick to a tiny glass bead, which can then be manipulated with a laser beam. I was thinking there was something familiar about this, and I realized you could make a game out of it in which you have a biological sample with lots of components designed to stick to the bead, and then roll the bead around with the laser beam to pick them up… yes! Optical Trap Katamari Damacy!
On the other hand, I don’t think the King of All Cosmos would be impressed by a 3 μm katamari.
Some of you know Steve Koonin from his days as Caltech’s provost. He’s now chief scientist at BP International, and gave the colloquium at Berkeley today under the title “A Physicist’s View of the World’s Energy Situation”. The talk was extremely interesting and seemed like a very realistic assessment. Some of the points I took away (in a bit of random order):
- Koonin estimates peak oil in about 30 years. Asked about the more alarmist estimates of 10-20 years, he basically says that BP has better data about the oil supply.
- On the other hand, there is 200 years worth of coal left in the ground.
- Coal is the worst fossil fuel for carbon emissions, but technologies exist to mitigate this.
- Oil in the US is mostly used for transportation, coal and natural gas for electric power.
- Energy use in transportation is very inefficient, but efficiency needs to be coupled to conservation: car engines improved efficiency by about 25% in the 90’s but most of this went into heavier and faster cars rather than better gas mileage.
- Koonin first downplayed the evidence for climate change, then stated that he is 90% confident that it is happening and went on to treat it as a serious issue.
- However, based on projected fossil fuel use he feels that large quantities of CO2 in the atmosphere by 2100 are unavoidable, and we should focus on adaptation rather than prevention.
- Renewable energy is very far from being a realistic replacement for fossil fuels.
- There are two large numbers relevant to global energy use: the per capita energy consumption in developed countries (the USA is an outlier, but other developed countries are within a factor of two) and the population of developing countries. Efforts by Europe, the US, and Japan to control emissions only offset the effects of growth in China, India, etc. by a few years.
- The word “fusion” did not appear in the talk. A number of questioners brought it up and Koonin stated that it was at least 50 years away from replacing fossil fuels. “First you have to get it to work.”
- In the extreme long run (200+ years, once fossil fuels are exhausted) Koonin predicts fusion and solar will be the dominant energy sources. Currently solar is much more expensive than almost all other sources of energy, but this is a materials problem and can potentially be solved.
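Koonin’s point about developed-world cuts being offset within a few years is simple to reproduce. Every number in this sketch is an illustrative assumption of mine, not a figure from the talk:

```python
import math

# Toy version of the "offset in a few years" argument. All numbers below
# are illustrative assumptions, not figures Koonin presented.
developed_emissions = 12.0   # Gt CO2/yr from the US, Europe, Japan (assumed)
cut_fraction = 0.20          # suppose the developed world cuts 20%
developing_emissions = 15.0  # Gt CO2/yr from China, India, etc. (assumed)
growth_rate = 0.05           # assumed annual emissions growth, developing world

saved = developed_emissions * cut_fraction  # Gt/yr removed by the cut

# Years until developing-world growth adds that much back:
#   developing_emissions * ((1 + g)^t - 1) = saved
t = math.log(1 + saved / developing_emissions) / math.log(1 + growth_rate)

print(f"A {cut_fraction:.0%} developed-world cut is offset in about {t:.1f} years")
```

With these (made-up but not crazy) inputs the offset takes about three years, which is the shape of the conclusion from the talk.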
The talk will eventually appear here as a webcast. I’ve been increasingly interested in energy issues lately and I found it to be a fascinating look at how the oil companies (or at least one of them) look at these things. Next week while I’m traveling I’ll read Out of Gas and see what Koonin’s fellow Caltech prof David Goodstein has to say about this. (Goodstein is clearly more pessimistic.)
Today’s colloquium was Stuart Freedman on the latest results from KamLAND, one of the neutrino detection experiments. The experiment is basically a gigantic vat of liquid scintillator—an oil convenient for producing photons from exotic particles passing through—surrounded by high-efficiency photon detectors. Neutrinos are produced in huge quantities by the sun and nuclear reactors, but they rarely interact with matter, so to observe them one needs to construct a very large detector and wait for a while.
I’ve always enjoyed following the neutrino experiments, since they came online about when I started to study physics, and since then they have made steady progress understanding this particle. It’s a nice example of the incremental progress of science. Around my senior year in high school the story was “We’ve been assuming neutrinos are massless, but it’s been suggested they do have mass and experiments are being constructed to look for it.” (That was the year I went to IPhO, which was held in Sudbury, Canada, a town whose only distinction was that the Sudbury Neutrino Observatory was being built there, so we heard a lot on this subject.) Over the next few years the line became “Neutrinos might have mass,” then “Neutrinos probably have mass (but we don’t know what it is)”. And in today’s colloquium, the word was:
- Neutrinos totally have mass. There are three different varieties of neutrinos, named according to the lepton they’re associated with in weak-force interactions: for example, the basic nuclear beta decay produces an electron and an electron neutrino. But KamLAND looked at neutrinos produced in this way by nuclear reactors, and found that neutrinos that start out as electron neutrinos oscillate between this type and the other two types (the mu and tau neutrinos). This can happen only if neutrinos are massive.
- But we don’t know what it is. Measurements on neutrino oscillations only tell you the relationship between the masses of the three types of neutrinos, and not the masses themselves. There are estimates of the actual masses based on this, but they are not very precise.
- The electron, mu, and tau neutrinos are not mass eigenstates. Rather than having a single mass itself, the electron neutrino is actually a kind of mixture (technically, a superposition) of three neutrinos, each of which does have its own mass. The mass eigenstates have been creatively named ν1, ν2, and ν3. It’s known approximately how much of each mass eigenstate is present in the electron, mu, and tau mixtures, but not how the masses are arranged—so ν3 could be the heaviest or the lightest.
- There’s still an undetermined parameter in neutrino mixing. It’s a complex phase, apparently related to CP symmetry breaking. This is one of those things I’d be more informed about if I’d ever taken a course on the Standard Model.
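The oscillation itself can be sketched with the standard two-flavor survival probability for reactor antineutrinos. The parameter values below are rough illustrative numbers I’ve filled in, not KamLAND’s actual published fit:

```python
import math

# Two-flavor approximation to the reactor antineutrino survival probability:
#   P(nu_e -> nu_e) = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km, and E in GeV. The default parameter values are
# rough illustrative numbers, not KamLAND's published results.
def survival_probability(L_km, E_GeV, dm2_eV2=7.5e-5, sin2_2theta=0.85):
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return 1 - sin2_2theta * math.sin(phase) ** 2

# KamLAND sits very roughly 180 km (flux-weighted) from Japan's reactors,
# which emit antineutrinos at a few MeV:
p = survival_probability(L_km=180, E_GeV=0.004)
print(f"Survival probability: {p:.2f}")
```

Note that the formula only involves the mass-squared difference dm2, which is exactly why oscillation experiments constrain the relationships between the masses but not the masses themselves.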
Freedman also spent some time on another angle of this experiment, in geophysics rather than fundamental physics. (I know I have some geophysicists reading, so you can correct me if I get this wrong.) There’s a discrepancy between various estimates of the heat produced by the Earth, and one hypothesis (which is apparently not widely credited) is that the core of the Earth contains a natural nuclear reactor. Since KamLAND is built to detect neutrinos from man-made reactors, it could in principle look for one at the center of the planet as well. Except that KamLAND is (deliberately) built really close to a number of reactors in Japan, and any geophysical signal would be absolutely swamped by the signal from power plants. So in practice it looks like another detector would have to be built somewhere else to do this experiment.
Yesterday’s colloquium was entitled “String Theory and Cosmology”, usually a sign that I can safely spend that hour in the lab trying to get my qubits to work. If I had known that the speaker would be giving the talk from handwritten transparencies I definitely would have stayed away, figuring that the talk was so overly technical that Powerpoint couldn’t handle it, and the speaker would be running through some incomprehensible morass of equations and text that had been lifted from the Necronomicon and then translated a couple times by Babelfish.
But fortunately I did go to the colloquium, which turned out to be pretty accessible. The speaker, Shamit Kachru, was very good and able to give sort of a hand-wavy outline of what string theorists are up to. String theory is a very difficult and jargon-heavy subject, and there was no way for him to get very technical without losing 95% of the audience (myself included), so I can’t say that I gained much understanding of what string theories are actually about. However, I did at least grasp where the boundaries of knowledge are in this field, which I think can best be classified using the epistemological scheme invented by philosopher/poet Donald Rumsfeld:
- Known knowns: The Standard Model of particle physics, which describes the behavior of particles in certain regimes (i.e. the experimentally accessible ones) to very high accuracy. And general relativity, which describes gravity in observable regimes.
- Known unknowns: What gravity does at energies where it’s comparable to the other three forces (it’s normally much weaker). Also, various mathematical quirks and inconsistencies in the Standard Model.
- Unknown knowns: Various string theories generate universes that look sort of like this one. But it’s unknown whether any of them do describe the actual universe, because they only make interesting predictions at energies much higher than could possibly be tested. (I believe the number cited in the talk was 10^17 GeV; the best accelerators run at 10^3 GeV.) The connection to cosmology in the talk was in trying to explain the origins of the universe using string theory; out of all of the potential varieties of theories, a few do make testable predictions on observable phenomena like the cosmic microwave background due to how they address the Big Bang. So if we’re lucky enough to live in one of these universes, we could confirm it with certain astrophysical experiments.
- Unknown unknowns: And then there’s the possibility that string theory isn’t the right answer, but rather something no one’s thought of yet. As Douglas Adams noted, “There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable.
There is another theory which states that this has already happened.” (String theory always reminds me of that quote.)
Now, for serious string theory blogging you should be reading Cosmic Variance, since I don’t really know much at all about the field beyond what can be communicated in an hour-long colloquium. However, I’m starting to understand why it’s interesting. (Also, it turns out that the guys shambling down the halls around here muttering about “braaaaanes” aren’t zombies but overworked string theorists. Oops.)
I just discovered that there are videos of the colloquia on the physics department website, here, so you can actually watch this talk if you’re interested. (It hasn’t been posted yet but probably will be within a week.) Another good one from this semester was “Cycles in Fossil Diversity” by Rich Muller, which was a study of what causes species to thrive or die out at apparently regular intervals in Earth’s history.