Friday, June 26, 2020

Eric Betzig, Biological Physicist

Important advances in fluorescence microscopy highlight the interaction of physics and biology. This effort is led by Eric Betzig of Berkeley, winner of the 2014 Chemistry Nobel Prize. Betzig obtained his bachelor’s and doctoral degrees in physics, and only later began collaborating with biologists. He is a case study for how physicists can contribute to the life sciences, a central theme of Intermediate Physics for Medicine and Biology.

If you want to learn about Betzig’s career and work, watch the video at the bottom of this post. In it, he explains how designing a new microscope requires trade-offs between spatial resolution, temporal resolution, imaging depth, and phototoxicity. Many super-resolution fluorescence microscopes (having extraordinarily high spatial resolution, well beyond the diffraction limit) require intense light sources, which cause bleaching or even destruction of the fluorophore. This phototoxicity arises because the excitation light illuminates the entire sample, even though much of that light doesn’t contribute to the image (as in a confocal microscope, which illuminates the whole sample but collects light from only the focal point). Moreover, microscopes with high spatial resolution must acquire a huge amount of data to form an image, which makes them too slow to follow the rapid dynamics of a living cell.
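To put a number on the diffraction limit these super-resolution techniques beat, the classic Abbe formula gives the smallest resolvable distance as d = λ/(2 NA). Here is a minimal sketch; the wavelength and numerical aperture are my own illustrative choices, not values from Betzig’s talk.

```python
def abbe_limit(wavelength_nm, numerical_aperture):
    """Abbe diffraction limit: smallest resolvable distance, d = lambda / (2 NA)."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Green fluorescent light (~520 nm) through a high-NA oil-immersion objective
d = abbe_limit(520.0, 1.4)
print(f"Diffraction limit: {d:.0f} nm")  # about 190 nm; super-resolution goes well below this
```

Structures much smaller than roughly 200 nm, such as individual protein complexes, are invisible to conventional optics, which is why beating this limit earned a Nobel Prize.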

Eric Betzig’s explanation of the trade-offs between spatial resolution, temporal resolution, imaging depth, and phototoxicity.

Betzig’s key idea is to trade lower spatial resolution for improved temporal resolution and less phototoxicity, creating an unprecedented tool for imaging structure and function in living cells. The figure below illustrates his light-sheet fluorescence microscope.

A light-sheet fluorescence microscope.
The sample (red) is illuminated by a thin sheet of short-wavelength excitation light (blue). This light excites fluorescent molecules in a thin layer of the sample; the position of the sheet can be varied in the z direction, as in MRI. For each slice, the long-wavelength fluorescent light (green) is imaged in the x and y directions by the microscope with its objective lens.

The advantage of this method is that only those parts of the sample to be imaged are exposed to excitation light, reducing the total exposure and therefore the phototoxicity. The thickness of the light sheet can be adjusted to set the depth resolution. The imaging by the microscope can be done quickly, increasing its temporal resolution.

A disadvantage of this microscope is that the fluorescent light is scattered as it passes through the tissue between the light sheet and the objective. However, the degradation of the image can be reduced with adaptive optics, a technique used by astronomers to compensate for scattering caused by turbulence in the atmosphere.

Listen to Betzig describe his career and research in the hour-and-a-half video below. If you don’t have that much time, or you are more interested in the microscope than in Betzig himself, watch the eight-minute video about recent developments in the Advanced Bioimaging Center at Berkeley. It was produced by Seeker, a media company that makes award-winning videos to explain scientific innovations.

Enjoy!

A 2015 talk by Eric Betzig about imaging life at high spatiotemporal resolution.

“Your Textbooks Are Wrong, This Is What Cells Actually Look Like.” Produced by Seeker.

Friday, June 19, 2020

The Berkeley Physics Course

In Intermediate Physics for Medicine and Biology, Russ Hobbie and I cite two volumes of the Berkeley Physics Course: Volume II about electricity and magnetism, and Volume V about statistical mechanics. This five-volume set provides a wonderful introduction to physics. Its preface states
This is a two-year elementary college physics course for students majoring in science and engineering. The intention of the writers has been to present elementary physics as far as possible in the way in which it is used by physicists working on the forefront of their field. We have sought to make a course which would vigorously emphasize the foundations of physics. Our specific objectives were to introduce coherently into an elementary curriculum the ideas of special relativity, of quantum physics, and of statistical physics.

The course is intended for any student who has had a physics course in high school. A mathematics course including the calculus should be taken at the same time as this course….

The five volumes of the course as planned will include:
I. Mechanics (Kittel, Knight, Ruderman)
II. Electricity and Magnetism (Purcell)
III. Waves and Oscillations (Crawford)
IV. Quantum Physics (Wichmann)
V. Statistical Physics (Reif)
 ...The initial course activity led Alan M. Portis to devise a new elementary physics laboratory.
Statistical Physics,
Volume 5 of the Berkeley Physics Course,
by Frederick Reif.
Chapter 3 of IPMB is modeled in part on Volume V by Frederick Reif.
Preface to Volume V

The last volume of the Berkeley Physics Course is devoted to the study of large-scale (i.e., macroscopic) systems consisting of many atoms or molecules: thus it provides an introduction to the subjects of statistical mechanics, kinetic theory, thermodynamics, and heat…My aim has been … to adopt a modern point of view and to show, in as systematic and simple a way as possible, how the basic notions of atomic theory lead to a coherent conceptual framework capable of describing and predicting the properties of macroscopic systems.
I love Reif’s book, in part because of nostalgia: it’s the textbook I used in my undergraduate thermodynamics class at the University of Kansas. His Chapter 4 is similar to IPMB’s Chapter 3, where the concepts of heat transfer, absolute temperature, and entropy are shown to result from how the number of states depends on energy. Boltzmann’s factor is derived, and the two-state magnetic system important in magnetic resonance imaging is analyzed. Reif even has short biographies of famous scientists who worked on thermodynamics—such as Boltzmann, Kelvin, and Joule—which I think of as little blog posts built into the textbook. If you want more detail, Reif wrote a larger book about statistical and thermal physics that we also cite in IPMB.
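The two-state magnetic system Reif analyzes is exactly what underlies MRI: proton spins in a magnetic field occupy two Zeeman levels, populated according to the Boltzmann factor exp(−ΔE/kT). A minimal sketch, using standard physical constants; the field strength and temperature are my illustrative choices.

```python
import math

HBAR = 1.0546e-34        # reduced Planck constant (J s)
K_B = 1.3807e-23         # Boltzmann constant (J/K)
GAMMA_PROTON = 2.675e8   # proton gyromagnetic ratio (rad/s/T)

def spin_population_ratio(B, T):
    """Ratio of spin-down to spin-up proton populations, n_down/n_up = exp(-dE/kT),
    where dE = hbar * gamma * B is the Zeeman energy splitting."""
    delta_E = HBAR * GAMMA_PROTON * B
    return math.exp(-delta_E / (K_B * T))

# A 1.5 T clinical magnet at body temperature (310 K)
ratio = spin_population_ratio(1.5, 310.0)
print(f"n_down/n_up = {ratio:.7f}")  # just under 1: only a few parts per million net polarization
```

The ratio is astonishingly close to one: the tiny excess of spins aligned with the field, a few parts per million, is the entire source of the MRI signal.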

Electricity and Magnetism,
Volume 2 of the Berkeley Physics Course,
by Edward Purcell.
Russ and I sort of cite Edward Purcell’s Volume II of the Berkeley Physics Course. Earlier editions of IPMB cited it, but in the 5th edition we cite the book Electricity and Magnetism by Purcell and Morin (2013). It is nearly equivalent to Volume II, updated by an additional author. If you want to gain insight into electricity and magnetism, you should read Purcell.
Preface to Volume II

The subject of this volume of the Berkeley Physics Course is electricity and magnetism. The sequence of topics, in rough outline, is not unusual: electrostatics; steady currents; magnetic field; electromagnetic induction; electric and magnetic polarization in matter. However, our approach is different from the traditional one. The difference is most conspicuous in Chaps. 5 and 6 where, building on the work of Vol. I, we treat the electric and magnetic fields of moving charges as manifestations of relativity and the invariance of electric charge.
I love Purcell’s book, but introducing magnetism as a manifestation of special relativity is not the best way to teach the subject to students of biology and medicine. In IPMB we never adopt this view except in a couple of teaser homework problems (8.5 and 8.26).

IPMB doesn’t cite Volumes I, III, or IV of the Berkeley Physics Course. If we did, where in the book would those citations be? Kittel, Knight, and Ruderman’s Volume I covers classical mechanics. They analyze the dynamics of particles in a cyclotron, so we could cite it in Chapter 8 of IPMB, and they describe the harmonic oscillator, so we could cite it in our Chapter 10. Crawford’s Volume III on waves could be cited in Chapter 13 of IPMB about sound and ultrasound. Wichmann’s Volume IV on quantum mechanics would fit well in the first part of our Chapter 14 on atoms and light.

Do universities adopt the Berkeley Physics Course textbooks anymore? I doubt it. The series is out of date, having been published in the 1960s. The use of cgs rather than SI units makes the books seem old-fashioned. The preface says it’s a two-year introduction to physics (five semesters, one semester for each book), while most schools offer a one-year (two-semester) sequence. The books don’t have the flashy color photos so common in modern introductory texts. Nevertheless, if you were introduced to physics through the Berkeley Physics Course, you would have a strong grasp of physics fundamentals, and would have more than enough preparation for a course based on Intermediate Physics for Medicine and Biology.

Friday, June 12, 2020

Atomic Accidents

Reading Atomic Accidents,
by Jim Mahaffey,
in my home office.
The Oakland University library has online access to the book Atomic Accidents: A History of Nuclear Meltdowns and Disasters, From the Ozark Mountains to Fukushima, by Jim Mahaffey. I’m glad they do; with the library still locked up because of the coronavirus pandemic, I wouldn’t have been able to check out a paper copy. The book is more about nuclear engineering than nuclear medicine, but the two fields intersect during nuclear accidents, so it’s relevant to readers of Intermediate Physics for Medicine and Biology.

In his introduction, Mahaffey compares the 20th century invention of nuclear power to the 19th century development of steam-powered trains. Then he writes
In this book we will delve into the history of engineering failures, the problems of pushing into the unknown, and bad luck in nuclear research, weapons, and the power industry. When you see it all in one place, neatly arranged, patterns seem to appear. The hidden, underlying problems may come into focus. Have we been concentrating all effort in the wrong place? Can nuclear power be saved from itself, or will there always be another problem to be solved? Will nuclear fission and its long-term waste destroy civilization, or will it make civilization possible?

Some of these disasters you have heard about over and over. Some you have never heard of. In all of them, there are lessons to be learned, and sometimes the lessons require multiple examples before the reality sinks in. In my quest to examine these incidents, I was dismayed to find that what I thought I knew, what I had learned in the classroom, read in textbooks, and heard from survivors could be inaccurate. A certain mythology had taken over in both the public and the professional perceptions of what really happened. To set the record straight, or at least straighter than it was, I had to find and study buried and forgotten original reports and first-hand accounts. With declassification at the federal level, ever-increasing digitization of old documents, and improvements in archiving and searching, it is now easier to see what really happened.

So here, Gentle Reader, is your book of train wrecks, disguised as something in keeping with our 21st century anxieties. In this age, in which we strive for better sources of electrical and motive energy, there exists a deep fear of nuclear power, which makes accounts of its worst moments of destruction that much more important. The purpose of this book is not to convince you that nuclear power is unsafe beyond reason, or that it will lead to the destruction of civilization. On the contrary, I hope to demonstrate that nuclear power is even safer than transportation by steam and may be one of the key things that will allow life on Earth to keep progressing; but please form your own conclusions. The purpose is to make you aware of the myriad ways that mankind can screw up a fine idea while trying to implement it. Don’t be alarmed. This is the raw, sometimes disturbing side of engineering, about which much of humanity has been kept unaware. You cannot be harmed by just reading about it.

That story of the latest nuclear catastrophe, the destruction of the Fukushima Daiichi plant in Japan, will be held until near the end. We are going to start slowly, with the first known incident of radiation poisoning. It happened before the discovery of radiation, before the term was coined, back when we were blissfully ignorant of the invisible forces of the atomic nucleus.
I’ll share just one accident that highlights some of the issues with reactor safety discussed by Mahaffey. It took place at the Chalk River reactor in Ontario, Canada, about 300 miles northeast of Oakland University, as the crow flies.

I found several parallels between the Chalk River and Chernobyl accidents (readers might want to review my earlier post about Chernobyl before reading on). Both hinged on the design of the reactor, and in particular on the type of moderator used to slow neutrons. Both highlight how human error can overcome the most careful of safety designs. Their main difference is that Chalk River was a minor incident while Chernobyl was a catastrophe.

The Chalk River reactor began as a Canadian-British effort during World War II that operated in parallel to America’s Manhattan Project. Its development has more in common with the plutonium-producing Hanford Site in Washington state than with the bomb-building laboratory in Los Alamos. After Enrico Fermi and his team built the first nuclear reactor in Chicago using graphite as the moderator, the Canadian-British team decided to explore moderation by heavy water. In 1944 they began to build a low-power experimental reactor along the Chalk River. For safety, the reactor had a scram system consisting of heavy cadmium plates that absorb neutrons; the plates would lower into the reactor if a detector recorded too high a neutron flux, shutting down nuclear fission. The energy production was controlled by raising and lowering the level of heavy water, which could be pumped into the reactor by pushing a switch. As a safety precaution, the pump would turn off after 10 seconds unless the switch was pushed again. To power up the reactor, the operator had to push the switch over and over.

In the summer of 1950 an accident occurred. Two physicists were going to test a new fuel rod design, so the reactor was shut down. The operator knew he would have to push the heavy water button many times to restart the reactor, so he began early, before the physicists were done installing the rod. Growing tired of repeatedly pushing the button, he shoved a wood chip into the switch so it was stuck on. Then the phone rang, and he was distracted by the call. The reactor went supercritical and the two physicists were doused with gamma radiation until the cadmium plates descended. Fortunately, the plates shut down the reactor before too much damage was done, and the physicists survived. Yet the accident provides many lessons, including how human error can cause the best-laid plans to go awry.

Later, a much larger reactor was built at Chalk River, and Mahaffey tells more horror stories about subsequent accidents, including one that required months of cleanup that was led in part by future President Jimmy Carter.

This story is a sample of what you’ll find in Atomic Accidents. Mahaffey describes all sorts of mishaps, from a sodium-cooled plutonium breeder reactor that in the 1960s almost lost Detroit (yikes, that’s just down the road from where I sit writing this post), to a variety of incidents in which an atomic bomb (usually not armed) was damaged or lost, to the frightening Kyshtym disaster in 1957 at the Mayak plutonium production site in Russia. He ends the book by describing the better-known accidents at Three-Mile Island, Chernobyl, and Fukushima.

I didn’t realize how all-or-nothing an atomic reactor is. The nuclear fuel is below a critical mass and inert until it reaches the threshold of criticality, at which point it promptly releases a burst of energy and neutrons. Usually it doesn’t blow up like a bomb, because it typically melts before the chain reaction can reach truly explosive proportions. Mahaffey has all sorts of terrifying tales. One begins with a fissile material such as uranium-235 or plutonium-239 dissolved in water; a technician pours the water from one container into another with a more spherical shape, resulting in a flash of neutrons and gamma rays that delivers a lethal dose of radiation.
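That all-or-nothing behavior can be pictured with the effective multiplication factor k: the average number of next-generation neutrons produced per neutron. Below k = 1 a chain reaction dies away; above it, the neutron population grows geometrically. Here is a toy model of my own, not anything from Mahaffey’s book.

```python
def neutron_population(k, n0=1000, generations=20):
    """Geometric chain-reaction model: each neutron generation multiplies the
    population by k (k < 1 subcritical, k = 1 critical, k > 1 supercritical)."""
    n = float(n0)
    history = [n]
    for _ in range(generations):
        n *= k
        history.append(n)
    return history

sub = neutron_population(0.9)   # chain reaction fizzles out
sup = neutron_population(1.1)   # grows without limit until the core melts or scrams
print(f"after 20 generations: subcritical {sub[-1]:.0f}, supercritical {sup[-1]:.0f}")
```

Because real neutron generations last only microseconds to milliseconds, even a modest k above 1 releases its burst of energy almost instantly, which is why criticality accidents like the one above happen faster than anyone can react.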

After scaring us all to death, Mahaffey ends on an upbeat note.
The dangers of continuing to expand nuclear power will always be there, and there could be another unexpected reactor meltdown tomorrow, but the spectacular events that make a compelling narrative may be behind us now. We have learned from each incident. As long as nuclear engineering can strive for new innovations and learn from its history of accidents and mistakes, the benefits that nuclear power can yield for our economy, society, and yes, environment will come.
Atomic Accidents reminded me of Henry Petroski’s wonderful To Engineer Is Human: The Role of Failure in Successful Design. The thesis of both books is that you can learn more by examining how things fail than how things succeed. If you want to understand nuclear engineering, the best way is to study atomic accidents.

Friday, June 5, 2020

Pneumoencephalography

How did neuroradiologists image the brain before the invention of computed tomography and magnetic resonance imaging? They used a form of torture called pneumoencephalography. Perhaps the greatest contribution of CT and MRI—both discussed in Intermediate Physics for Medicine and Biology—was to make pneumoencephalography obsolete.

In their article “Evolution of Diagnostic Neuroradiology from 1904 to 1999,” (Radiology, Volume 217, Pages 309-318, 2000), Norman Leeds and Stephen Kieffer describe this odious procedure.
Pneumoencephalography was performed by successively injecting small volumes of air via lumbar puncture and then removing small volumes of cerebrospinal fluid with the patient sitting upright and the head flexed... Pneumoencephalography was used primarily to determine the presence and extent of posterior fossa or cerebellopontine angle tumors, pituitary tumors, and intraventricular masses... It was also used to rule out the presence of lesions affecting the cerebrospinal fluid spaces in patients with possible communicating hydrocephalus or dementia... After the injection of a sufficient quantity of air, the patient was rotated, somersaulted, or placed in a decubitus position to depict the entire ventricular system and subarachnoid spaces. These patients were often uncomfortable, developed severe headaches, and became nauseated or vomited.
In Chapter 7 of the book Radiology 101: The Basics and Fundamentals, Wilbur Smith shares this lurid tale.
The early brain imaging techniques… involved such gruesome activities as injecting air into the spinal canal (pneumoencephalography) and rolling the patient about in a specially devised torture chair. Few patients willingly returned for another one of those examinations!
In her book The Immortal Life of Henrietta Lacks, Rebecca Skloot writes
I later learned that while Elsie was at Crownsville, scientists often conducted research on patients there without consent, including one study titled “Pneumoencephalographic and skull X-ray studies in 100 epileptics.” Pneumoencephalography was a technique developed in 1919 for taking images of the brain, which floats in a sea of liquid. That fluid protects the brain from damage, but makes it very difficult to X-ray, since images taken through fluid are cloudy. Pneumoencephalography involved drilling holes into the skulls of research subjects, draining the fluid surrounding their brains, and pumping air or helium into the skull in place of the fluid to allow crisp X-rays of the brain through the skull. The side effects—crippling headaches, dizziness, seizures, vomiting—lasted until the body naturally refilled the skull with spinal fluid, which usually took two to three months. Because pneumoencephalography could cause permanent brain damage and paralysis, it was abandoned in the 1970s.
Russ Hobbie claims that the development of CT deserved the Nobel Peace Prize in addition to the Nobel Prize in Physiology or Medicine!

The application of physics to medicine and biology isn’t just to diagnose diseases that couldn’t be diagnosed before. It also can help replace barbaric procedures by ones that are more humane.

A scene from The Exorcist, in which Regan undergoes pneumoencephalography.