Friday, February 26, 2010

All the News That’s Fit to Print

Newspaper articles may not provide the most authoritative information about science and medicine, but they are probably the primary source of news about medical physics for the layman. Today, I will discuss some recent articles from one of the leading newspapers in the United States: the venerable New York Times.


Last week Russ Hobbie sent me a copy of an article in the February 16 issue of the NYT, titled “New Source of an Isotope in Medicine is Found.” It describes the continuing shortage of technetium-99m, a topic I have discussed before in this blog.
Just as the worldwide shortage of a radioactive isotope used in millions of medical procedures is about to get worse, officials say a new source for the substance has emerged: a nuclear reactor in Poland.

The isotope, technetium 99, is used to measure blood flow in the heart and to help diagnose bone and breast cancers. Almost two-thirds of the world’s supply comes from two reactors; one, in Ontario, has been shut for repairs for nine months and is not expected to reopen before April, and the other, in the Netherlands, will close for six months starting Friday.

Radiologists say that as a result of the shortage, their treatment of some patients has had to revert to inferior materials and techniques they stopped using 20 years ago.

But on Wednesday, Covidien, a company in St. Louis that purifies the material created in the reactor and packages it in a form usable by radiologists, will announce that it has signed a contract with the operators of the Maria reactor, near Warsaw, one of the world’s most powerful research reactors.
I doubt that relying on a Polish reactor is a satisfactory long-term solution to our 99mTc shortage, but it may provide crucial help with the short-term crisis. A more promising permanent solution is described in a January 26 article on medicalphysicsweb.
GE Hitachi Nuclear Energy (GEH) announced today it has been selected by the U.S. Department of Energy’s National Nuclear Security Administration (NNSA) to help develop a U.S. supply of a radioisotope used in more than 20 million diagnostic medical procedures in the United States each year.
More information can be found at the American Association of Physicists in Medicine website. Let’s hope that this new initiative will prove successful.


The second topic I want to discuss today was called to my attention by my former student Phil Prior (PhD in Biomedical Sciences: Medical Physics, Oakland University, 2008). On January 26, the NYT published Walt Bogdanich’s article “As Technology Surges, Radiation Safeguards Lag.”
In New Jersey, 36 cancer patients at a veterans hospital in East Orange were overradiated—and 20 more received substandard treatment—by a medical team that lacked experience in using a machine that generated high-powered beams of radiation… In Louisiana, Landreaux A. Donaldson received 38 straight overdoses of radiation, each nearly twice the prescribed amount, while undergoing treatment for prostate cancer… In Texas, George Garst now wears two external bags—one for urine and one for fecal matter—because of severe radiation injuries he suffered after a medical physicist who said he was overworked failed to detect a mistake.

These mistakes and the failure of hospitals to quickly identify them offer a rare look into the vulnerability of patient safeguards at a time when increasingly complex, computer-controlled devices are fundamentally changing medical radiation, delivering higher doses in less time with greater precision than ever before.

Serious radiation injuries are still infrequent, and the new equipment is undeniably successful in diagnosing and fighting disease. But the technology introduces its own risks: it has created new avenues for error in software and operation, and those mistakes can be more difficult to detect. As a result, a single error that becomes embedded in a treatment plan can be repeated in multiple radiation sessions.
A related article by the same author, “Radiation Offers New Cures, and Ways to Do Harm,” was also published in the Gray Lady a few days earlier. These articles discuss recent medical mistakes in which patients have received much more radiation than originally intended. While somewhat sensational, the articles reinforce the importance of quality control in medical physics.

The NYT articles triggered a response from the American Association of Physicists in Medicine on January 28.
The American Association of Physicists in Medicine (AAPM) has issued a statement today in the wake of several recent articles in the New York Times yesterday and earlier in the week that discuss a number of rare but tragic events in the last decade involving people undergoing radiation therapy.

While it does not specifically comment on the details of these events, the statement acknowledges their gravity. It reads in part: “The AAPM and its members deeply regret that these events have occurred, and we continue to work hard to reduce the likelihood of similar events in the future.” The full statement appears here.

Today's statement also seeks to reassure the public on the safety of radiation therapy, which is safely and effectively used to treat hundreds of thousands of people with cancer and other diseases every year in the United States. Medical physicists in hospitals and clinics across the United States are board-certified professionals who play a key role in assuring quality during these treatments because they are directly responsible for overseeing the complex technical equipment used.

Taken together, the articles I’ve discussed today highlight some of the challenges that face the field of medical physics. For those who want additional background about the underlying physics and its applications to medicine, I recommend—you guessed it—the 4th edition of Intermediate Physics for Medicine and Biology.

Friday, February 19, 2010

The Electron Microscope

Intermediate Physics for Medicine and Biology does not discuss one of the most important instruments in modern biology: the electron microscope. If I were to add a very brief introduction to the electron microscope to the book, I would put it right after Sec. 14.1, The Nature of Light: Waves Versus Photons. It would look something like this.
14.1 ½ De Broglie Wavelength and the Electron Microscope

Like light, matter can have both wave and particle properties. The French physicist Louis de Broglie proposed a quantum mechanical relationship between a particle’s momentum p and its wavelength λ

λ = h/p     (14.6 ½)

[Eisberg and Resnick (1985)]. For example, a 100 eV electron has a speed of 5.9 × 10^6 m s^−1 (about 2% of the speed of light), a momentum of 5.4 × 10^−24 kg m s^−1, and a wavelength of 0.12 nm.

The electron microscope takes advantage of the short wavelength of electrons to produce exquisite pictures of very small objects. Diffraction limits the spatial resolution of an image to about a wavelength. For a visible light microscope, this resolution is on the order of 500 nm (Table 14.2). For the electron microscope, however, the wavelength of the electron limits the resolution. A typical electron energy used for imaging is about 100 keV, implying a wavelength much smaller than an atom (in practice, imperfections in the magnetic lenses usually limit the resolution to about 1 nm rather than the wavelength itself). Table 1.2 shows that viruses, which appear only as blurry smears in a light microscope, are large enough to be imaged in considerable detail with an electron microscope. In 1986, Ernst Ruska shared the Nobel Prize in Physics “for his fundamental work in electron optics, and for the design of the first electron microscope.”
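As a quick check of this arithmetic, here is a minimal Python sketch (my own, not from the book) that evaluates Eq. 14.6½ for the 100 eV electron above and for a 100 keV electron; the latter calls for the relativistic momentum, since 100 keV is no longer small compared with the electron rest energy of 511 keV.

```python
import math

# Physical constants (SI units)
h = 6.626e-34     # Planck's constant (J s)
m_e = 9.109e-31   # electron rest mass (kg)
c = 2.998e8       # speed of light (m/s)
eV = 1.602e-19    # joules per electron volt

def de_broglie_wavelength(kinetic_energy_eV):
    """De Broglie wavelength (m) of an electron, using the relativistic
    momentum p = sqrt(2 m K + (K/c)^2), which reduces to sqrt(2 m K)
    when K << m c^2."""
    K = kinetic_energy_eV * eV                  # kinetic energy (J)
    p = math.sqrt(2 * m_e * K + (K / c)**2)     # momentum (kg m/s)
    return h / p

for K in (100, 100e3):   # 100 eV and 100 keV
    print(f"K = {K:8.0f} eV  ->  lambda = {de_broglie_wavelength(K)*1e9:.4f} nm")

# Expected output (approximately):
#   K =      100 eV  ->  lambda = 0.1226 nm
#   K =   100000 eV  ->  lambda = 0.0037 nm
```

The 0.0037 nm wavelength at 100 keV shows why the roughly 1 nm practical resolution quoted above is set by the lenses rather than by diffraction.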

Electron microscopes come in two types. In a transmission electron microscope (TEM), electrons pass through a thin sample. In a scanning electron microscope (SEM), a fine beam of electrons is raster scanned across the sample and secondary electrons emitted by the surface are imaged. In both cases, the image is formed in vacuum and the electron beam is focused using a magnetic lens.
To learn more, you can watch a YouTube video about the electron microscope. Nice collections of electron microscope images can be found at http://www.denniskunkel.com, http://www5.pbrc.hawaii.edu/microangela and http://www.mos.org/sln/SEM.

[Embedded video: Structure and function of the electron microscope.]

Friday, February 12, 2010

Biomagnetism and Medicalphysicsweb

Medicalphysicsweb is an excellent website for news and articles related to medical physics. Several articles that have appeared recently are related to the field of biomagnetism, a topic Russ Hobbie and I cover in Chapter 8 of the 4th edition of Intermediate Physics for Medicine and Biology. I have followed the biomagnetism field ever since graduate school, when I made some of the first measurements of the magnetic field of an isolated nerve axon. Below I describe four recent articles from medicalphysicsweb.

A February 2 article titled “Magnetometer Eases Cardiac Diagnostics” discusses a novel type of magnetometer for measuring the magnetic field of the heart. In Section 8.9 of our book, Russ and I discuss Superconducting Quantum Interference Device (SQUID) magnetometers, which have long been used to measure the small (roughly 100 pT) magnetic fields produced by cardiac activity. Another way to measure weak magnetic fields is to determine the Zeeman splitting of energy levels in a rubidium gas. The energy difference between levels depends on the external magnetic field, and it is measured by detecting the frequency of optical light that is in resonance with this energy difference. Ben Varcoe, of the University of Leeds, has applied this technology to the heart by separating the magnetometer from the pickup coil:
The magnetic field detector—a rubidium vapour gas cell—is housed within several layers of magnetic shielding that reduce the Earth’s field about a billion-fold. The sensor head, meanwhile, is external to this shielding and contained within a handheld probe.
I haven’t been able to find many details about this device (such as whether the pickup coils are superconducting, and why the pickup coil doesn’t carry noise from the unshielded measurement area into the shielded detector), but Varcoe believes the device is a breakthrough in the way researchers can measure biomagnetic fields.
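To get a feel for the scale of the effect being detected, here is a rough back-of-the-envelope sketch of my own (the numbers are illustrative assumptions, not details from the article). For the ground state of rubidium-87, the Zeeman splitting corresponds to a resonance frequency of roughly 7 Hz per nanotesla (assuming a Landé factor g_F = 1/2), so a 100 pT cardiac field shifts the resonance by well under 1 Hz, which is why such heavy magnetic shielding is needed around the detector.

```python
# Rough scale of the Zeeman splitting exploited by an optical magnetometer.
# Assumptions (illustrative, not from the article): rubidium-87 ground state,
# F = 2 hyperfine level with Lande factor g_F = 1/2.

mu_B = 9.274e-24      # Bohr magneton (J/T)
h = 6.626e-34         # Planck's constant (J s)
g_F = 0.5             # assumed Lande g-factor for the 87Rb F = 2 level

gamma = g_F * mu_B / h            # Zeeman resonance frequency per tesla (Hz/T)
print(f"gamma = {gamma/1e9:.1f} GHz/T  (= {gamma*1e-9:.1f} Hz/nT)")

B_heart = 100e-12                 # typical cardiac field, ~100 pT
print(f"Frequency shift from the heart: {gamma*B_heart:.2f} Hz")

# Expected output (approximately):
#   gamma = 7.0 GHz/T  (= 7.0 Hz/nT)
#   Frequency shift from the heart: 0.70 Hz
```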

Another recent (February 10) article on medicalphysicsweb is about the effect of magnetic resonance imaging scans on pregnant women. As described in Chapter 18 of Intermediate Physics for Medicine and Biology, MRI uses a radio-frequency magnetic field to flip the proton spins; the signal the spins emit as they precess and relax back to equilibrium is the magnetic resonance signal. This radio-frequency field also induces eddy currents in the body that heat the tissue. Heating is a particular concern if the MRI is performed on a pregnant woman, as it could affect fetal development.
Medical physicists at Hammersmith Hospital, Imperial College London, UK, have now developed a more sophisticated model of thermal transport between mother and baby to assess how MRI can affect the foetal temperature (Phys. Med. Biol. 55 913)… This latest analysis takes account of heat transport through the blood vessels in the umbilical cord, an important mechanism that was ignored in previous models. It also includes the fact that the foetus is typically half a degree warmer than the mother – another key piece of information overlooked in earlier work.
Russ and I discuss these issues in Sec. 14.10: Heating Tissue with Light, where we derive the bioheat equation. The authors of the study, Jeff Hand and his colleagues, found that under normal conditions fetal heating isn’t a concern, but that exposure to 7.5 minutes of continuous RF field (an unlikely situation during MRI) could produce a significant temperature rise.
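For a sense of the magnitudes, here is a simple worst-case sketch of my own (not from the Hand et al. paper): if the perfusion and conduction terms of the bioheat equation are ignored, all of the absorbed RF power goes into warming the tissue, and the temperature rise is roughly ΔT ≈ SAR × t / c. The specific absorption rate and specific heat below are assumed, illustrative values.

```python
# Worst-case temperature rise if absorbed RF power is not carried away
# (ignoring the perfusion and conduction terms of the bioheat equation).
# The SAR and specific heat are assumed illustrative numbers, not values from the study.

SAR = 2.0        # specific absorption rate (W/kg), assumed (typical whole-body limit)
c_tissue = 3600  # specific heat of soft tissue (J kg^-1 K^-1), approximate
t = 7.5 * 60     # exposure time (s): 7.5 minutes of continuous RF

delta_T = SAR * t / c_tissue
print(f"Upper-bound temperature rise: {delta_T:.2f} C")
# Expected output: about 0.25 C; blood flow and conduction in a real fetus
# would reduce this, which is what the more detailed model accounts for.
```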

A January 27 article describes how researchers from the University of Minnesota (Russ’s institution) used magnetoencephalography to diagnose post-traumatic stress disorder.
Post-traumatic stress disorder (PTSD) is a difficult condition to diagnose definitively from clinical evidence alone. In the absence of a reliable biomarker, patients’ descriptions of flashbacks, worry and emotional numbness are all doctors have to work with. Researchers from the University of Minnesota Medical School (Minneapolis, MN) have now shown how magnetoencephalography (MEG) could identify genuine PTSD sufferers with high confidence and without the need for patients to relive painful past memories (J. Neural Eng. 7 016011).
The magnetoencephalogram is discussed in Sec. 8.5 of Intermediate Physics for Medicine and Biology. The data for the Minnesota study were obtained using a 248-channel SQUID magnetometer. The researchers analyzed recordings from 74 patients with post-traumatic stress disorder known to the VA hospital in Minneapolis and from 250 healthy controls. The authors claim that the accuracy of the test is over 90%.

Finally, a February 8 article describes a magnetic navigation system installed in Taiwan by the company Stereotaxis.
The Stereotaxis System is designed to enable physicians to complete more complex interventional procedures by providing image guided delivery of catheters and guidewires through the blood vessels and chambers of the heart to treatment sites. This is achieved using computer-controlled, externally applied magnetic fields that govern the motion of the working tip of the catheter or guidewire, resulting in improved navigation, shorter procedure time and reduced x-ray exposure.
The system works by placing ferromagnetic material in the catheter tip; an externally applied magnetic field can then be adjusted to “steer” the catheter through the blood vessels. We discuss magnetic forces in Sec. 8.1 of Intermediate Physics for Medicine and Biology, and ferromagnetic materials in Sec. 8.8.
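As a rough illustration of the physics (my own numbers, not Stereotaxis specifications), the applied field exerts a torque τ = mB sin θ on the magnetized tip, where the magnetic moment m is the tip’s magnetization times its volume. The tip dimensions, magnetization, and field strength in the sketch below are all assumed values chosen only to set the scale.

```python
import math

# Rough estimate of the torque available to steer a magnetically tipped catheter.
# All numbers are assumptions for illustration, not Stereotaxis specifications.

M = 1.7e6                      # magnetization of an iron-like tip material (A/m), assumed
radius = 0.5e-3                # tip radius (m): a 1 mm diameter tip, assumed
length = 3.0e-3                # tip length (m), assumed
B = 0.08                       # applied steering field (T), assumed
theta = math.radians(90)       # angle between tip moment and field

volume = math.pi * radius**2 * length     # cylindrical tip volume (m^3)
m = M * volume                            # magnetic dipole moment (A m^2)
torque = m * B * math.sin(theta)          # torque on the tip (N m)

print(f"moment m = {m:.2e} A m^2, torque = {torque:.2e} N m")
# Expected output: m ~ 4e-3 A m^2, torque ~ 3e-4 N m -- small in absolute terms,
# but large compared with the stiffness of a thin, flexible catheter tip.
```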

Although I believe medicalphysicsweb is extremely useful for keeping up to date on developments in medical physics, I find that its articles often either involve specialized physics concepts the layman may not understand or, more often, don’t address the underlying physics at all. Yet one can’t understand modern medicine without mastery of the basic physics concepts. My browsing through medicalphysicsweb convinced me once again of the importance of learning how physics can be applied to medicine and biology. Perhaps I am biased, but I think that studying from the 4th edition of Intermediate Physics for Medicine and Biology is a great way to master these important topics.

Friday, February 5, 2010

Beta Decay and the Neutrino

In Section 17.4 of the 4th edition of Intermediate Physics for Medicine and Biology, Russ Hobbie and I discuss beta decay and the neutrino.
The emission of a beta-particle is accompanied by the emission of a neutrino… [which] has no charge and no rest mass… [and] hardly interacts with matter at all… A particle that seemed originally to be an invention to conserve energy and angular momentum now has a strong experimental basis.
Understanding Physics: The Electron, Proton, and Neutron, by Isaac Asimov.
Our wording implies there is a story behind this particle “that seemed originally to be an invention to conserve energy.” Indeed, that is the case. I will let Isaac Asimov tell this tale. (Asimov's books, which I read in high school, influenced me to become a scientist.) The excerpt below is from Chapter 14 of his book Understanding Physics: The Electron, Proton, and Neutron.
In Chapter 11, disappearance in mass during the course of nuclear reactions was described as balanced by an appearance of energy in accordance with Einstein’s equation, e = mc². This balance also held in the case of the total annihilation of a particle by its anti-particle, or the production of a particle/anti-particle pair from energy.
Nevertheless, although in almost all such cases the mass-energy equivalence was met exactly, there was one notable exception in connection with radioactive radiations.

Alpha radiation behaves in satisfactory fashion. When a parent nucleus breaks down spontaneously to yield a daughter nucleus and an alpha particle, the sum of the mass of the two products does not quite equal the mass of the original nucleus. The difference appears in the form of energy—specifically, as the kinetic energy of the speeding alpha particle. Since the same particles appear as products at every breakdown of a particular parent nucleus, the mass-difference should always be the same, and the kinetic energy of the alpha particles should also always be the same. In other words, the beam of alpha particles should be monoenergetic. This was, in essence, found to be the case…

It was to be expected that the same considerations would hold for a parent nucleus breaking down to a daughter nucleus and a beta particle. It would seem reasonable to suppose that the beta particles would form a monoenergetic beam too…

Instead, as early as 1900, Becquerel indicated that beta particles emerged with a wide spread of kinetic energies. By 1914, the work of James Chadwick demonstrated the “continuous beta particle spectrum” to be undeniable.

The kinetic energy calculated for a beta particle on the basis of mass loss turned out to be a maximum kinetic energy that very few obtained. (None surpassed it, however; physicists were not faced with the awesome possibility of energy appearing out of nowhere.)

Most beta particles fell short of the expected kinetic energy by almost any amount up to the maximum. Some possessed virtually no kinetic energy at all. All told, a considerable portion of the energy that should have been present, wasn’t present, and through the 1920’s this missing energy could not be detected in any form.

Disappearing energy is as insupportable, really, as appearing energy, and though a number of physicists, including, notably, Niels Bohr, were ready to abandon the law of conservation of energy at the subatomic level, other physicists sought desperately for an alternative.

In 1931, an alternative was suggested by Wolfgang Pauli. He proposed that whenever a beta particle was produced, a second particle was also produced, and that the energy that was lacking in the beta particle was present in the second particle.

The situation demanded certain properties of the hypothetical particle. In the emission of beta particles, electric charge was conserved; that is, the net charge of the particles produced after emission was the same as that of the original particle. Pauli’s postulated particle therefore had to be uncharged. This made additional sense since, had the particle possessed a charge, it would have produced ions as it sped along and would therefore have been detectable in a cloud chamber, for instance. As a matter of fact, it was not detectable.

In addition, the total energy of Pauli’s projected particle was very small—only equal to the missing kinetic energy of the electron. The total energy of the particle had to include its mass, and the possession of so little energy must signify an exceedingly small mass. It quickly became apparent that the new particle had to have a mass of less than 1 percent of the electron and, in all likelihood, was altogether massless.

Enrico Fermi, who interested himself in Pauli’s theory at once, thought of calling the new particle a “neutron,” but Chadwick, at just about that time, discovered the massive, uncharged particle that came to be known by that name. Fermi therefore employed an Italian diminutive suffix and named the projected particle the neutrino (“little neutral one”), and it is by that name that it is known.