Friday, March 11, 2011

Retinal Injuries from a Handheld Laser Pointer

Are laser pointers safe? Apparently, it depends on the laser pointer. A recent article by Christine Negroni in the New York Times (Feb. 28, 2011) states that
Eye doctors around the world are warning that recent cases of teenagers who suffered eye damage while playing with high-power green laser pointers are likely to be just the first of many.
Negroni cites a letter that appeared last September in the New England Journal of Medicine (Wyrsch, Baenninger, and Schmid, “Retinal Injuries from a Handheld Laser Pointer,” N. Engl. J. Med., Volume 363, Pages 1089–1091, 2010), which says
In the past, laser pointers sold to the public had a maximal output of 5 mW, which is regarded as harmless because the human eye protects itself with blink reflexes. The measured output of the laser in [the case of a person who was injured] was 150 mW. The use of lasers that are threatening to the eye is normally restricted to occupational and military environments; laser accidents outside these fields are very rare. However, powerful laser devices, with a power of up to 700 mW, are now easily obtainable through the Internet, despite government restrictions. These high-power lasers are advertised as “laser pointers” and look identical to low-power pointers. The much higher power of such devices may produce immediate, severe retinal injury. Despite their potential to cause blinding, such lasers are advertised as fun toys and seem to be popular with teenagers. In addition, Web sites now offer laser swords and other gadgets that use high-power lasers.
I attended a talk just last week where the speaker waved his green laser pointer around like a light saber. I don’t know the power of his pointer, but I wonder if I was in danger.

One concern arises from the bozos who point lasers at airplanes. The U.S. Congress plans to toughen the laws on this sort of horseplay, making shining a laser at a plane a federal crime punishable by up to five years in prison. I’m all for high school students learning science through hands-on activities, but do it right. Buy a 5 mW red helium-neon laser pointer and use it safely to do some optics experiments (I suggest observing Young’s double slit interference pattern; a quick estimate of the fringe spacing appears below). Don’t buy a 700 mW green laser pointer and start shining it up into the sky! Do you think I’m being a schoolmarm out to ruin your fun? Consider this: the website laserpointersafety.com reports that
A $5000 reward is being offered for information leading to the arrest of the person(s) who aimed a laser into the cockpit of a Southwest Airlines flight approaching Baltimore-Washington International Airport. The flight, which originated in Milwaukee, was 2000 feet over the town of Millersville, near Old Mill Road and Kenora Drive, when it was illuminated around 6:45 pm on Sunday, Feb. 20, 2011. Millersville is about 8 miles from BWI Airport.
You better be careful; someone may be watching.
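
Back to the double-slit suggestion: here is a little Python sketch of the expected fringe spacing, Δy = λL/d in the small-angle approximation. The wavelength, slit separation, and screen distance are illustrative values I picked, not numbers from any particular experiment.

```python
# Fringe spacing for a double-slit pattern, dy = wavelength * L / d
# (small-angle approximation). All numbers are illustrative assumptions.

wavelength = 650e-9       # m, typical red laser pointer (assumed)
slit_separation = 0.1e-3  # m (assumed)
screen_distance = 2.0     # m (assumed)

fringe_spacing = wavelength * screen_distance / slit_separation
print(f"Fringe spacing: {fringe_spacing * 1e3:.1f} mm")  # about 13 mm
```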

How do you tell the difference between a safe, educational experience and a potentially disastrous prank? You begin by learning about light and its biological impact. Russ Hobbie and I discuss light in Chapter 14 of the 4th edition of Intermediate Physics for Medicine and Biology. We address topics related to light and safety, although we don’t analyze the particular concern of laser damage to the eye. For instance, we discuss how ultraviolet light damages the eye (Section 14.9.6) and how light can be used to heat tissue (Section 14.10), and we give a detailed discussion of radiometry (the measurement of radiant energy, Section 14.11) and of the anatomy and optics of the eye (Section 14.12).

In another New York Times article, Negroni relates how high-powered laser pointers can pose a risk to pilots. And on her blog, she explains why helicopters may be at greater risk than airplanes.
A helicopter cockpit has glass extending below the level of the pilots' eyes toward the ground exactly where the lasers are. Rotor craft fly at low altitudes over residential areas and busy highways. They are not flying autopilot and they may be piloted by a single person. They hover and may make inviting targets. That was the case on Tuesday when a Los Angeles television station sent its chopper to follow and report on the police activity and it was hit by a laser.
The interaction of laser light and vision is one more example of why a firm understanding of physics applied to medicine and biology is so important.

Friday, March 4, 2011

The Role of Magnetic Forces in Biology and Medicine

The current issue of the journal Experimental Biology and Medicine contains a minireview about “The Role of Magnetic Forces in Biology and Medicine” by yours truly (Volume 236, Pages 132–137). It fits right in with Section 8.1 (The Magnetic Force on a Moving Charge) in the 4th Edition of Intermediate Physics for Medicine and Biology. The abstract states:
The Lorentz force (the force acting on currents in a magnetic field) plays an increasingly larger role in techniques to image current and conductivity. This review will summarize several applications involving the Lorentz force, including (1) magneto-acoustic imaging of current; (2) “Hall effect” imaging; (3) ultrasonically-induced Lorentz force imaging of conductivity; (4) magneto-acoustic tomography with magnetic induction; and (5) Lorentz force imaging of action currents using magnetic resonance imaging.
The review was easy to write, because it consisted primarily of the background and significance section of a National Institutes of Health grant proposal I wrote several years ago (and which is now funded). The review describes ground-breaking work by many authors, but here I want to highlight studies by three talented undergraduate students who worked with me at Oakland University during several summers.

Kaytlin Brinker

Kayt studied a method to measure conductivity called Magneto-Acoustic Tomography with Magnetic Induction, or MAT-MI (Brinker and Roth, “The Effect of Electrical Anisotropy During Magnetoacoustic Tomography with Magnetic Induction,” IEEE Transactions on Biomedical Engineering, Volume 55, Pages 1637–1639, 2008). This technique was developed by Bin He and his group at the University of Minnesota. You apply two magnetic fields, one static and one changing with time. The rapidly changing magnetic field induces eddy currents in the tissue, which then experience a Lorentz force from the static field, causing the material to move and initiating a sound wave. Measurement of the acoustic signal allows you to gain information about the conductivity distribution. Kayt’s task was to determine how anisotropy (conductivity that depends on direction) would influence MAT-MI. She “found that when imaging nerve or muscle, electrical anisotropy can have a significant effect on the acoustic signal and must be accounted for in order to obtain accurate images.”
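
The body force that launches the sound wave in MAT-MI is the Lorentz force per unit volume, F = J × B. Here is a minimal Python sketch of that cross product; the current density and static field strength are made-up illustrative values, not numbers from Kayt’s paper.

```python
import numpy as np

# Lorentz force per unit volume on an induced eddy current, F = J x B.
# The current density and static field are illustrative values only.

J = np.array([100.0, 0.0, 0.0])  # A/m^2, eddy current density (assumed)
B = np.array([0.0, 0.0, 1.0])    # T, static magnetic field (assumed)

F = np.cross(J, B)               # N/m^3, force per unit volume
print(F)                         # [0. -100. 0.]: force along -y
```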

Nancy Tseng

Nancy, who had just graduated from high school when she worked with me, analyzed a technique originally pioneered by Han Wen and then developed further by Amalric Montalibet. A sound wave is propagated through the tissue in the presence of a magnetic field. The Lorentz force causes charge separation, inducing an electrical potential and current. Measurement of the electrical signal provides information about the conductivity. Tseng looked at this effect in anisotropic tissue (Tseng and Roth, “The Potential Induced in Anisotropic Tissue by the Ultrasonically-Induced Lorentz Force,” Medical and Biological Engineering and Computing, Volume 46, Pages 195–197, 2008). She found “a novel feature of the ultrasonically-induced Lorentz force in anisotropic tissue: an oscillating electrical potential propagates along with the ultrasonic wave.” The effect has not yet been measured experimentally, but represents a fundamentally new mechanism for the induction of bioelectric signals.
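
The driving term in this effect can be thought of as a motional electric field, v × B, experienced by ions carried along at the ultrasonic particle velocity. Here is a rough Python sketch of its magnitude; the velocity and field strength are assumptions I made for illustration, not values from Nancy’s paper.

```python
import numpy as np

# Motional field E = v x B felt by ions moving with the tissue at the
# ultrasonic particle velocity. Both numbers are rough assumptions.

v = np.array([0.1, 0.0, 0.0])  # m/s, ultrasonic particle velocity (assumed)
B = np.array([0.0, 0.0, 4.0])  # T, strong static field (assumed)

E = np.cross(v, B)             # V/m
print(E)                       # [0. -0.4 0.]: a few tenths of a volt per meter
```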

Kevin Schalte

Kevin derived a tomographic method to determine tissue conductivity using the ultrasonically-induced Lorentz force (Roth and Schalte, “Ultrasonically-Induced Lorentz Force Tomography,” Medical and Biological Engineering and Computing, Volume 47, Pages 573–577, 2009). “The strength and timing of the electric dipole caused by the ultrasonically-induced Lorentz force determines the amplitude and phase of the Fourier transform of the conductivity image. Electrical measurements at a variety of [ultrasonic] wavelengths and directions are therefore equivalent to mapping the Fourier transform of the conductivity distribution in spatial frequency space. An image of the conductivity itself is then found by taking the inverse Fourier transform.” I would never have undertaken this project had I not been a coauthor on the 4th edition of Intermediate Physics for Medicine and Biology. Only by working on the textbook did I come to fully understand and appreciate the power of tomography (see Chapter 12 on Images and Section 16.9 about Computed Tomography).
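
To see the basic idea of Fourier reconstruction (not the specific algorithm in our paper), here is a toy Python sketch: pretend the measurements hand you samples of the conductivity’s two-dimensional Fourier transform on a grid, then recover the image with an inverse FFT.

```python
import numpy as np

# Toy illustration of Fourier-based reconstruction: given samples of the
# 2D Fourier transform of the conductivity on a grid, recover the image
# with an inverse FFT. A sketch of the general idea only.

# A made-up "true" conductivity map: uniform background plus a disk.
x = np.linspace(-1, 1, 64)
X, Y = np.meshgrid(x, x)
sigma_true = 1.0 + 0.5 * (X**2 + Y**2 < 0.25)

sigma_k = np.fft.fft2(sigma_true)             # pretend these were measured
sigma_recon = np.real(np.fft.ifft2(sigma_k))  # reconstruct the image

print(np.max(np.abs(sigma_recon - sigma_true)))  # ~1e-15: exact up to roundoff
```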

I often read about how the United States is falling behind other nations in math and science, but working with outstanding undergraduates such as these three gives me confidence that we remain competitive.

Finally, let me reproduce the all-important acknowledgments section of the minireview:
I thank Steffan Puwal and Katherine Roth [my daughter] for their comments on this manuscript. I also thank Bruce Towe, Han Wen, Amalric Montalibet and Xu Li for permission to reproduce their figures in this review. This work was supported by the National Institutes of Health grant R01EB008421.

Friday, February 25, 2011

Round-Number Handbook of Physics for Medicine and Biology

The 4th edition of Intermediate Physics for Medicine and Biology contains a list of fundamental constants in Appendix O. Russ Hobbie and I got the values of these constants from a 2002 study, but the National Institute of Standards and Technology (NIST) website we cite no longer exists. A new NIST website, http://physics.nist.gov/cuu/Constants/index.html, gives the most up-to-date values for these constants, often to many significant figures.

For some applications, knowing the electron mass to, say, nine significant figures is important. But in biology and medicine, most quantities are not known with such precision. If a number is known to one percent, that is impressive. When I teach biological and medical physics, I would much rather see my students have an approximate feel for the size of important constants, without having them bother to memorize more than one or two significant figures. To know that the speed of light is 299,792,458 m/s is nice, but what I really want them to remember (forever!) is that the speed of light is about 3 × 10^8 m/s. If they need more precision, they can look it up.

Edward Purcell, one of the great ones, published his “Round-Number Handbook of Physics” in the January 1983 issue of the American Journal of Physics. He presented a list of important physical constants, but only to one or two significant figures. It was meant not as a reference to look up precise values, but as a list of approximate values that every physicist should know without needing to consult a reference. Unfortunately, Purcell used cgs units, which are becoming more and more obsolete.

Below I present my version of a “Round-Number Handbook of Physics for Medicine and Biology”. I take the constants from Appendix O and approximate them as round numbers in (mostly) mks units. These are the numbers you should remember.

c          Speed of light                       3 × 10^8 m/s
e          Elementary charge                    1.6 × 10^−19 C
F          Faraday constant                     10^5 C/mole
g          Acceleration of gravity              10 m/s^2
h          Planck’s constant                    2π × 10^−34 J s
ħ          Planck’s constant (reduced)          10^−34 J s
k_B        Boltzmann’s constant                 1.4 × 10^−23 J/K
m_e        Electron mass                        9 × 10^−31 kg
m_e c^2    Electron rest energy                 0.5 MeV
m_p        Proton mass                          1.7 × 10^−27 kg
m_p c^2    Proton rest energy                   1000 MeV
N_A        Avogadro’s number                    6 × 10^23 1/mole
r_e        Classical radius of the electron     3 × 10^−15 m
R          Gas constant                         8 J/(mole K) = 2 cal/(mole °C)
ε_0        Electrical permittivity              9 × 10^−12 F/m
1/4πε_0    Coulomb’s law constant               9 × 10^9 N m^2/C^2
σ_SB       Stefan-Boltzmann constant            6 × 10^−8 W/(m^2 K^4)
λ_C        Compton wavelength of the electron   2.4 pm
μ_B        Bohr magneton                        10^−23 J/T
μ_0        Magnetic permeability                4π × 10^−7 H/m
μ_N        Nuclear magneton                     5 × 10^−27 J/T
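
One payoff of knowing these round numbers is the ability to make quick estimates. For example, here is a short Python sketch of the thermal energy at body temperature (310 K, my choice for the example) using only the values in the table above.

```python
# Thermal energy at body temperature, using only round numbers from the
# table above. T = 310 K is an assumed value for the example.

k_B = 1.4e-23   # J/K, Boltzmann's constant
R = 8.0         # J/(mole K), gas constant
e = 1.6e-19     # C, so 1 eV = 1.6e-19 J
T = 310.0       # K, body temperature

kT_J = k_B * T
kT_eV = kT_J / e
RT_kcal = R * T / 4184.0   # 1 kcal = 4184 J

print(f"kT = {kT_J:.1e} J = {kT_eV:.3f} eV")   # about 4.3e-21 J, 0.027 eV
print(f"RT = {RT_kcal:.2f} kcal/mole")          # about 0.6 kcal/mole
```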

Friday, February 18, 2011

Tc-99m Production: Losing the Reactor

Periodically in this blog I have discussed the growing technetium-99m shortage that faces medical physics (see, for instance, here, here, here, and here). Russ Hobbie and I discuss technetium in the 4th edition of Intermediate Physics for Medicine and Biology.
The most widely used isotope is 99mTc. As its name suggests, it does not occur naturally on earth, since it has no stable isotopes … The isotope is produced in the hospital from the decay of its parent, 99Mo, which is a fission product of 235U and can be separated from about 75 other fission products. The 99Mo decays to 99mTc.
Interestingly, the 99mTc shortage here in the United States may be solved in part by our friends up north (or, for those of us living in the Detroit area, our friends down south; look at a map), the Canadians. You can learn more in an article on medicalphysicsweb.org (and I hope you are a regular reader of that very useful website).
Technetium-99m (Tc-99m) is the most widely used medical imaging isotope, employed in more than 30 million procedures worldwide each year. The isotope is created via decay of molybdenum-99 (Mo-99), which itself is produced in nuclear reactors. And herein lies the problem.

The nuclear reactor is needed to generate neutrons that bombard uranium-235 targets, with the resulting fission reaction producing Mo-99 around 6% of the time. This Mo-99 then decays into Tc-99m. Unfortunately, over 90% of the world’s Mo-99 is produced by just five ageing reactors, resulting in an extremely fragile supply chain - the vulnerability of which was highlighted recently when unexpected shutdowns and routine maintenance closures combined to create serious shortages.

But there are other ways to create Tc-99m, and ways that don’t require nuclear reactors or a uranium target—itself a cause for concern as most facilities currently process highly-enriched (weapons-grade) uranium. Instead, researchers are investigating production methods based on cyclotrons and linear accelerators. Such processes exploit nuclear reactions within targets of Mo-100, bypassing the need for uranium completely.

In a bid to advance such technologies, the government of Canada has invested $35 million in four development programmes. The projects are headed up by: TRIUMF (Vancouver, BC); Canadian Light Source (Saskatoon, SK); Advanced Cyclotron Systems (Richmond, BC); and Prairie Isotope Production Enterprise (Winnipeg, MB) ….

In terms of practical implementation, the cyclotron-based method produces Tc-99m, which has a half-life of just six hours and must therefore be manufactured at or very near to clinical sites. This approach can, however, take advantage of a wide network of existing medical cyclotrons.

The electron accelerator approach creates Mo-99, which has a half-life of 66 hours and, as such, can be shipped. “One or two linacs could probably supply most of Canada,” Barnard said. This method also benefits from being more similar to, and thus able to exploit, the existing Tc-99m supply chain based on shipping of Mo-99.
The article was written by medicalphysicsweb’s editor, Tami Freeman, who has worked as a journalist for the Institute of Physics for the last dozen years, and who has a PhD in physics.
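
The 66-hour and 6-hour half-lives quoted above are a textbook example of a parent-daughter decay chain. Here is a small Python sketch of the Tc-99m activity that builds up in a freshly eluted generator, using the Bateman equation for a two-member chain and ignoring the branching fraction of the Mo-99 decay; it is only a sketch, not a generator calibration.

```python
import math

# Tc-99m activity building up from Mo-99 decay (two-member Bateman
# equation), using the 66 h and 6 h half-lives quoted above and ignoring
# the Mo-99 branching fraction. A sketch, not a generator calibration.

lam_Mo = math.log(2) / 66.0  # per hour
lam_Tc = math.log(2) / 6.0   # per hour
A_Mo0 = 1.0                  # initial Mo-99 activity (arbitrary units)

def tc99m_activity(t_hours):
    """Tc-99m activity at time t after elution, starting with none."""
    return (A_Mo0 * lam_Tc / (lam_Tc - lam_Mo)
            * (math.exp(-lam_Mo * t_hours) - math.exp(-lam_Tc * t_hours)))

for t in (6, 12, 24, 48):
    print(f"t = {t:2d} h: relative Tc-99m activity = {tc99m_activity(t):.2f}")
```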

P.S. There is a nice article in the February issue of Physics Today about U.S. attempts to address the Tc-99m shortage (see the comments to this blog entry).

Friday, February 11, 2011

The Framingham Heart Study

The Framingham Heart Study is one of the oldest and most widely cited research studies in the history of medicine. Russ Hobbie and I mention the study briefly in Section 2.4 of the 4th edition of Intermediate Physics for Medicine and Biology, when discussing exponential decay.
Figure 2.8 shows the survival of patients with congestive heart failure for a period of nine years. The data are taken from the Framingham study [McKee et al. (1971)]; the death rate is constant during this period.
The data in Fig. 2.8 are from a paper with over 1400 citations in the scientific and medical literature: P. A. McKee, W. P. Castelli, P. M. McNamara, and W. B. Kannel (1971) “The Natural History of Congestive Heart Failure: The Framingham Study,” New England Journal of Medicine, Volume 285, Pages 1441–1446. The abstract to the paper states
The natural history of congestive heart failure was studied over a 16-year period in 5192 persons initially free of the disease. Over this period, overt evidence of congestive heart failure developed in 142 persons. In almost every five-year age group, from 30 to 62 years, the incidence rate was greater for men than for women. Although the usual etiologic precursors were found, the dominant one was clearly hypertension, which preceded failure in 75 per cent of the cases. Coronary heart disease was noted at an earlier examination in 39 per cent, but in 29 per cent of the cases it was accompanied by hypertension. Precursive rheumatic heart disease, noted in 21 per cent of cases of congestive heart failure, was accompanied by hypertension in 11 per cent. Despite modern management, congestive heart failure proved to be extremely lethal. The probability of dying within five years from onset of congestive heart failure was 62 per cent for men and 42 per cent for women.
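
As a back-of-the-envelope exercise, if the death rate after onset is constant (as Fig. 2.8 suggests), the five-year mortality quoted in the abstract translates directly into a decay rate and a median survival. Here is a short Python sketch of that conversion.

```python
import math

# If survival after the onset of congestive heart failure is exponential
# (constant death rate, as in Fig. 2.8), the five-year mortality quoted
# in the abstract fixes the decay rate and the median survival.

five_year_mortality = {"men": 0.62, "women": 0.42}  # from the abstract

for group, mortality in five_year_mortality.items():
    survival = 1.0 - mortality
    rate = -math.log(survival) / 5.0      # per year
    median = math.log(2) / rate           # years
    print(f"{group}: rate = {rate:.2f}/yr, median survival = {median:.1f} yr")
```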
A Change of Heart: How the Framingham Study Helped Unravel the Mysteries of Cardiovascular Disease, by Levy and Brink.
In 2005, Daniel Levy and Susan Brink published A Change of Heart: How the Framingham Heart Study Helped Unravel the Mysteries of Cardiovascular Disease. The book is a fascinating history of this landmark study. Levy (the study’s current director) and Brink (formerly a writer for U.S. News and World Report) write
A turning point in our evolving understanding of heart disease was the establishment of the Framingham Heart Study in 1948. It was a large and ambitious community-based research project unlike anything that had been conducted before. It came at a time of growing awareness that cardiovascular disease was sweeping the country, even slowing down what should have been a steady rise in life expectancy. It was also a time, three years after the end of World War II, when resources from the national treasury, no longer needed for military purposes, could be used for research into the nation’s leading killer….

In light of this ignorance [of how to treat coronary disease], the U.S. government in 1948 made a twenty-year commitment to uncovering the root causes of heart disease. That scientific resolve was sponsored by the U.S. Public Health Service with half a million dollars of start-up funding from Congress. A cadre of physicians, scientists, government officials, and academics—many of whom knew each other from having served together at military hospitals during the war—selected a New England town in which to carry out this national scientific experiment. The Framingham Heart Study turned out to be instrumental in changing the attitudes, if not the behavior, of virtually every American, and it put the otherwise ordinary town of Framingham, Massachusetts, on the map….

They [the Heart Study researchers] needed the 5209 men and women from Framingham at first, followed by 5124 of their sons and daughters, and now 3500 of their grandchildren who have donated their medical histories to science. It is ironic, perhaps, that this most respected—even beloved—piece of epidemiology centers on the heart, the organ that symbolically aches, breaks, longs, and loves like no other. It took a commitment from thousands of volunteers to make the study a success.
I found Chapter 5 (The People Who Changed America’s Heart: Voices from Framingham) to be particularly inspiring. For instance, they quote Evelyn Langley—housewife, mother, and PTA president—who played an early role in promoting the study among potential participants, and was a participant herself.
Langley’s heart still lies with the Study. “When they call me up and tell me it’s time to come in for an exam, I know I have that ritual to do,” she says. She has made the trip to the clinic twenty-seven times so far. “I am trying to give back to the Heart Clinic [Study] what they have given me. I always feel as if I am part of something bigger than myself. It’s not just for the people who live in this town. Many lives have been saved because of the Heart Study.”
You can learn more about the Framingham Heart Study at the study’s website: http://www.framinghamheartstudy.org. The study is currently funded by the National Heart, Lung, and Blood Institute (part of the National Institutes of Health) and Boston University. Let me finish with a fitting quote from the acknowledgments of A Change of Heart.
This book would not have been possible without the more than fifty years of dedication and commitment from three generations of Framingham Heart Study volunteers. We would like to thank them all for providing a gift to the world that has changed untold millions of lives.

Friday, February 4, 2011

Britton Chance (1913-2010)

Britton Chance died late last year. The website www.brittonchance.org states that
Britton Chance, M.D., Ph.D., D.Sc., for more than 50 years one of the giants of biochemistry and biophysics and a world leader in transforming theoretical science into useful biomedical and clinical applications, died on November 16, 2010, at age 97 in Philadelphia, PA. Dr. Chance had the rare distinction of being the recipient of a National Medal of Science (1974), a Gold Medal in the Olympics (1952, Sailing, Men’s 5.5 Meter Class), and a Certificate of Merit for his sensitive work during World War II.
His obituary in the New York Times describes his early work.
Over a lifetime of research, Dr. Chance focused on the observation and measurement of chemical reactions within cells, tissue and the body. But unlike most researchers, he also had expertise in mechanics, electronics and optics, and a great facility in instrument-building. His innovations helped transform theoretical science into biochemical and biophysical principles, the stuff of textbooks, and useful biomedical and clinical applications.

Early in his career he invented a tool, known as a stopped-flow apparatus, for measuring chemical reactions involving enzymes; it led to the establishment of a fundamental principle of enzyme kinetics, known as the enzyme-substrate complex.
Another obituary, in the December 17 issue of Science magazine, observed that
In his mid-70s, Chance (then emeritus) launched a new field of optical diagnostics that rests on the physics of light diffusion through scattering materials such as living tissue. He showed that scattered near-infrared light pulses could not only measure the dynamics of oxy- and deoxyhemoglobin levels in performing muscles, but also reveal and locate tumors and cancerous tissue in muscles and breast as well as injury in the brain. Because changing patterns of oxy- and deoxyhemoglobin in the brain reflect cognitive activity, the applications of this diagnostic approach widened to include assessing neuronal connectivity in premature babies.
Chance appears in the 4th edition of Intermediate Physics for Medicine and Biology because of his research on light diffusion. In Section 14.4 (Scattering and Absorption of Radiation), Russ Hobbie and I analyze the absorption and scattering coefficients of infrared light, and then give typical values that “are eyeballed from data from various tissues reported in the article by Yodh and Chance (1995),” with the reference being to Yodh, A. and B. Chance (1995) “Spectroscopy and Imaging with Diffusing Light,” Physics Today, March, Pages 34–40.

Then in Sec. 14.5 (The Diffusion Approximation to Photon Transport), we analyze pulsed measurements of infrared light.
A technique made possible by ultrashort light pulses from a laser is time-dependent diffusion. It allows determination of both [the scattering coefficient] and [the absorption coefficient]. A very short (150-ps) pulse of light strikes a small region on the surface of the tissue. A detector placed on the surface about 4 cm away records the multiply-scattered photons. A typical plot of the detected photon fluence rate is shown in Fig. 14.13.
Figure 14.13 is a figure from Patterson, M. S., B. Chance, and B. C. Wilson (1989) “Time Resolved Reflectance and Transmittance for the Noninvasive Measurement of Tissue Optical Properties,” Applied Optics, Volume 28, Pages 2331–2336, which has been cited over 1000 times in the scientific literature.

Finally, in Sec. 14.6 (Biological Applications of Infrared Scattering), we reproduce a figure from the Physics Today article by Yodh and Chance, which shows the absorption coefficient for water, oxyhemoglobin and deoxyhemoglobin.
The greater absorption of blue light in oxygenated hemoglobin makes oxygenated blood red… The wavelength 800 nm at which both forms of hemoglobin have the same absorption is called the isosbestic point. Measurements of oxygenation are made by comparing the absorption at two wavelengths on either side of this point.
This property of infrared absorption of light is the basis for pulse oximeters that measure oxygenation. Not all measurements of blood oxygen use pulsed light. Russ and I cite one of Chance’s papers that uses a continuous source: Liu, H., D. A. Boas, Y. Zhang, A. G. Yodh, and B. Chance (1995) “Determination of Optical Properties and Blood Oxygenation in Tissue Using Continuous NIR Light,” Physics in Medicine and Biology, Volume 40, Pages 1983–1993. A fourth of Chance’s papers that we cite is Sevick, E. M., B. Chance, J. Leigh, S. Nioka, and M. Maris (1991) “Quantitation of Time- and Frequency-Resolved Optical Spectra for the Determination of Tissue Oxygenation,” Analytical Biochemistry, Volume 195, Pages 330–351.
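
The two-wavelength idea behind these oxygenation measurements can be sketched with Beer’s law: the absorbance at each wavelength is a linear combination of the oxy- and deoxyhemoglobin concentrations, so measurements at two wavelengths give a 2 × 2 linear system. The extinction coefficients and absorbances in the Python sketch below are placeholder numbers I made up for illustration (chosen only so that deoxyhemoglobin absorbs more strongly at the red wavelength and oxyhemoglobin at the infrared one); they are not values from our book or from Chance’s papers.

```python
import numpy as np

# Two-wavelength oxygenation estimate: absorbance at each wavelength is a
# Beer-Lambert mix of oxy- and deoxyhemoglobin, so two wavelengths give a
# 2 x 2 linear system. All numbers are placeholders for illustration.

# Rows: two wavelengths (red, infrared); columns: [HbO2, Hb]
epsilon = np.array([[0.08, 0.81],    # assumed extinction coefficients
                    [0.29, 0.18]])   # (arbitrary but consistent units)

A = np.array([0.30, 0.25])           # measured absorbances (assumed)

c_HbO2, c_Hb = np.linalg.solve(epsilon, A)
saturation = c_HbO2 / (c_HbO2 + c_Hb)
print(f"Estimated oxygen saturation: {saturation:.0%}")
```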

In 1987, Chance won the Biological Physics Prize (now known as the Max Delbruck Prize in Biological Physics) from the American Physical Society
for pioneering application of physical tools to the understanding of Biological phenomena. The early applications ranged from novel spectrometry that elucidated electron transfer processes in living systems to analog computation of nonlinear processes. Later contributions have been equally at the forefront.

Friday, January 28, 2011

The Quantum Ten

The Quantum Ten: A Story of Passion, Tragedy, Ambition, and Science, by Sheilla Jones.
Over the holiday break, I read The Quantum Ten: A Story of Passion, Tragedy, Ambition and Science, by Sheilla Jones. The book is about the development of quantum mechanics in the 1920s.
The seeds of the shift currently taking place in science were sown eighty years ago, from 1925 to 1927. That’s when a dramatic two-year revolution in physics reached a climax; the denouement set the course for what was to follow. It’s the story of a rush to formalize quantum physics, the work of just a handful of men fired by ambition, philosophical conflicts, and personal agendas…

Remarkably, this dramatic shift in science was primarily the work of ten men, and they were ten fallible men, some famous and some not so famous, although they also had a large supporting cast. The triumphs and tragedies, loves and betrayals, dreams realized and ambitions thwarted, shaped the competition over who would get to define truth and reality. There never was a consensus. By the time of the pivotal Fifth Solvay Conference in Brussels in 1927, there was so much ill will and disappointment among the creators of quantum physics over their various competing theories and over who deserved credit that most were barely on speaking terms.

The Brussels conference was the first time so many of them had come together: Albert Einstein, the lone wolf; Niels Bohr, the obsessive but gentlemanly father figure; Max Born, the anxious hypochondriac; Werner Heisenberg, the intensely ambitious one; Wolfgang Pauli, the sharp-tongued critic with a dark side; Paul Dirac, the quiet one; Erwin Schrodinger, the enthusiastic womanizer; Prince Louis de Broglie, the French aristocrat; and Paul Ehrenfest, who was witness to it all. Their coming together, however, lasted only for the duration of the conference.
I enjoyed the book, but couldn’t help wishing that it would focus less on the personal problems of the scientists and more on their science. I prefer my scientific biographies to be a bit more rigorous with an emphasis on the science, like Pais’s Subtle is the Lord. Nevertheless, the story was fascinating in a gossipy sort of way. The book is full of tidbits like this:
From time to time [Schrodinger] did consult on the mathematics with his Zurich colleague Hermann Weyl, who was at that point embroiled in a passionate love affair with Schrodinger’s wife, Anny. Since the Weyls were part of the same sexually permissive crowd as the Schrodingers, the affair was no cause for tension between the two colleagues.
I found myself oddly attracted to Paul Ehrenfest, “an intense physicist with a debilitating streak of self-doubt who could rarely see the valuable gift he offered to physics and a passionate friend to both Einstein and Bohr.” Then, near the end of the book, I discovered—to my horror—that not only did Ehrenfest take his own life (I had heard that before), but that just before he committed suicide he shot and killed his son. My admiration vanished.

There was no biological physics in The Quantum Ten, but I couldn’t help wonder how these great scientists fared in the 4th edition of Intermediate Physics for Medicine and Biology. A quick survey gave the following results:
  • Albert Einstein. I discussed Einstein’s presence in our textbook about a year ago in this blog, and concluded that “we rarely mention Einstein by name in our book, but his influence is present throughout, and most fundamentally when we discuss the idea of a photon.”
  • Niels Bohr. His model for the hydrogen atom is referred to, but not derived. His contributions to calculating the stopping power of a charged particle in tissue are discussed in Chapter 15 (Interaction of Photons and Charged Particles with Matter).
  • Paul Ehrenfest. His name never appears in our book.
  • Max Born. The Born charging energy is discussed in Chapter 6 (Impulses in Nerve and Muscle Cells).
  • Erwin Schrodinger. The Schrodinger equation is mentioned in Chapter 3 (Systems of Many Particles), but never written down.
  • Wolfgang Pauli. The Pauli exclusion principle is stated in Chapters 14 (Atoms and Light) and 15 (Interaction of Photons and Charged Particles with Matter).
  • Louis de Broglie. His name is not in the book, although I have mentioned him in this blog before.
  • Werner Heisenberg. He and his uncertainty principle are not in the book.
  • Paul Dirac. I discussed Dirac in the blog before. His delta function shows up in Chapter 11 (The Method of Least Squares and Signal Analysis).
  • Pascual Jordan. His name never appears in our book.
I am not overly concerned that the quantum ten don’t figure prominently in Intermediate Physics for Medicine and Biology. Russ Hobbie and I do not focus on microscopic phenomena, where quantum mechanics is essential. Probably the greatest contribution to biological physics from any of the quantum ten is Schrodinger’s book What is Life?, which had a major impact on the early development of molecular biology (see The Eighth Day of Creation).

P.S. We had a significant revision of the errata this week. It is available at the book’s website: https://sites.google.com/view/hobbieroth. A big thank you to Gabriela Castellano for finding many mistakes and pointing them out to us. If you, dear reader, find additional mistakes, please let us know.

Friday, January 21, 2011

Gaussian integration

Chapter 8 in the 4th edition of Intermediate Physics for Medicine and Biology covers Biomagnetism: the measurement of the magnetic field produced by electrical currents in nerve and muscle. One issue that arises during biomagnetic recordings is that the magnetic field is not measured at a point, but is averaged over a pickup coil. Therefore, when comparing theoretical calculations to experimental data, you need to integrate the calculated magnetic field over the coil.

One way to do this is Gaussian quadrature, which approximates the integral by a weighted sum. Homework problem 40 in Chapter 8 shows a three-point Gaussian quadrature formula for integrating over a circular coil. At the end of the problem Russ Hobbie and I write
Higher-order formulas for averaging the magnetic field can be found in Roth and Sato (1992).
The reference is to Roth, B. J. and S. Sato (1992) “Accurate and Efficient Formulas for Averaging the Magnetic Field over a Circular Coil,” In M. Hoke, S. N. Erne, T. C. Okada, and G. L. Romani, eds. Biomagnetism: Clinical Aspects. Amsterdam, Elsevier. This book is the proceedings of the 8th International Conference on Biomagnetism, held in Munster, Germany on August 19–24, 1991. I didn’t attend that meeting, but my colleague Susumu Sato did. Sato is a senior scientist in the Epilepsy Research Branch of the National Institute of Neurological Disorders and Stroke, part of the National Institutes of Health in Bethesda, Maryland. When I worked with him he had an active research program in magnetoencephalography (MEG), including a large and expensive shielded room and a multi-channel SQUID magnetometer.

The introduction of our paper states
The MEG is measured by detecting the magnetic flux through a pickup coil, usually circular, that is coupled to a SQUID magnetometer. Often the source of the MEG is modeled as a current dipole, whose position, orientation and strength are determined iteratively by fitting the MEG data to a dipolar magnetic field pattern. To obtain an accurate result, this dipole field must be integrated over the pickup coil area to obtain the magnetic flux. Since this integration is repeated for each dipole considered in the iteration, the numerical algorithm used to estimate this integral should be efficient. In this note, several integration formulas are presented that allow the magnetic field to be integrated over the coil area quickly with little error. These formulas are examples of a general technique of approximating multiple integrals described by Stroud [1].
Reference [1] is to: Stroud AH (1971) Approximate Calculation of Multiple Integrals, Prentice-Hall, Englewood Cliffs, New Jersey, Pages 277–289.

I remember deriving several of these formulas independently before discovering Stroud’s textbook (it is always deflating to find you’ve been scooped). The derivation requires solving a system of nonlinear equations (which I rather enjoyed). Each formula requires evaluating the magnetic field at N points, and the integral is accurate to mth order. We presented a 1-point formula accurate to first order, a 3-point formula accurate to second order (this was the formula examined in the homework problem), a 4-point formula accurate to third order, a 6-point formula accurate to fourth order, a 7-point formula accurate to fifth order, and a 12-point formula accurate to seventh order.
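
To give the flavor of these formulas, here is a Python sketch of a three-point rule of the kind described above for averaging a smooth field over a circular coil of radius a: equal weights, with the points placed at radius a/√2 and spaced 120° apart, which reproduces the exact average for any field that varies at most quadratically across the coil. This is my own reconstruction for illustration; check it against Problem 40 and against Roth and Sato (1992) before relying on it. The code compares the rule to a brute-force average over the coil.

```python
import numpy as np

# Three-point average of a field over a circular coil of radius a:
# equal weights 1/3, points at radius a/sqrt(2), 120 degrees apart.
# A reconstruction of a low-order rule of the kind described above;
# exact for fields that vary at most quadratically over the coil.

def three_point_average(f, a):
    r = a / np.sqrt(2.0)
    angles = np.array([0.0, 2*np.pi/3, 4*np.pi/3])
    return np.mean(f(r*np.cos(angles), r*np.sin(angles)))

def brute_force_average(f, a, n=400):
    # dense area-weighted average over the coil, for comparison
    r = np.linspace(0.0, a, n)
    th = np.linspace(0.0, 2*np.pi, n, endpoint=False)
    R, TH = np.meshgrid(r, th)
    vals = f(R*np.cos(TH), R*np.sin(TH))
    return np.sum(vals * R) / np.sum(R)

a = 1.0
f_quad = lambda x, y: 1.0 + 2.0*x - y + 3.0*x*y + x**2   # reproduced exactly
f_other = lambda x, y: np.exp(0.5*x) * np.cos(0.5*y)     # approximated

print(three_point_average(f_quad, a), brute_force_average(f_quad, a))
print(three_point_average(f_other, a), brute_force_average(f_other, a))
```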

The general formulation of Gaussian quadrature was developed by Carl Friedrich Gauss (1777–1855), one of the greatest mathematicians of all time. Gauss’s name appears often in the 4th edition of Intermediate Physics for Medicine and Biology, including the Gaussian function (Chapter 4), Gauss’s law (Chapter 6), the gauss, the cgs unit of magnetic field (Chapter 8), the fast Fourier transform (FFT, Chapter 11) about which we write “the grouping used in the FFT dates back to Gauss in the early nineteenth century,” and the Gaussian Probability Distribution (Appendix I).

Friday, January 14, 2011

DNA animation by Drew Berry

I know that the 4th edition of Intermediate Physics for Medicine and Biology doesn’t discuss much about the physics of life at the molecular level. In the preface, Russ Hobbie and I wrote that “molecular biophysics has been almost completely ignored.” Nevertheless, I recently ran across an animation of DNA that is so good I have to tell you about it.

My story starts with the January-February issue of American Scientist, the science and technology magazine published by Sigma Xi, The Scientific Research Society. The cover of this issue shows DNA, packed “tightly in some chromosomal territories and loosely in others, forming sheer walls and intergenic fissures, as seen in the cover image from a 3D animation by renowned molecular animator Drew Berry.” When I read this, I asked myself: Who is Drew Berry, and where can I find his animations?

It turns out you can find Berry’s wonderful animation “Molecular Visualizations of DNA” on YouTube. Trust me, you really want to watch this video. It explains DNA packing into chromosomes, transcription, and translation in a visual way that is unforgettable. Other Berry animations can be found at http://www.wehi.edu.au/education/wehitv.

 DNA packing into chromosomes, by Drew Berry.
https://www.youtube.com/watch?v=7wpTJVWra7I

In 2010, Berry was awarded a MacArthur Fellowship from the John D. and Catherine T. MacArthur Foundation, the so-called “genius award”. The MacArthur website says
Drew Berry is a biomedical animator whose scientifically accurate and aesthetically rich visualizations are elucidating cellular and molecular processes for a wide range of audiences. Trained as a cell biologist as well as in light and electron microscopy, Berry brings a rigorous scientific approach to each project, immersing himself in the relevant research in structural biology, biochemistry, and genetics to ensure that the most current data are represented. In three- and four-dimensional renderings of such key biological concepts as cell death, tumor growth, and the packaging of DNA, Berry captures the details of molecular shape, scale, behavior, and spatio-temporal dynamics in striking form. His groundbreaking series of animations of the intricate biochemistry of DNA replication, translation, and transcription demonstrates these multifaceted processes in ways that enlighten both scientists and the scientifically curious. The sequence and pace of each molecular interaction are precisely coordinated, at the same time as the ceaseless motion of the molecules reveals the complex and seemingly random choreography of the molecular world. Committed to educating the public about critical topics in medical research, Berry created a two-part animation of the malaria life cycle that illustrates the pathogen’s development in the mosquito host and its invasion of and diffusion throughout human cells. In these and many other projects in progress, Berry synthesizes data across a variety of fields and presents them in engaging and lucid animations that both inspire a sense of wonder and enhance understanding of biological systems.

Drew Berry received B.Sc. (1993) and M.Sc. (1995) degrees from the University of Melbourne. Since 1995, he has been a biomedical animator at the Walter and Eliza Hall Institute of Medical Research. His animations have appeared in exhibitions and multimedia programs at such venues as the Museum of Modern Art, the Guggenheim Museum, the Royal Institute of Great Britain, and the University of Geneva.
 Note added in 2019: Watch Berry’s TED talk below.

 Drew Berry: Animations of Unseeable Biology.
https://www.youtube.com/embed/WFCvkkDSfIU

Friday, January 7, 2011

Convergence

This week researchers at the Massachusetts Institute of Technology released a white paper titled “The Third Revolution: The Convergence of the Life Sciences, Physical Sciences, and Engineering.” It begins
There are few challenges more daunting than the future of health care in this country. This paper introduces the dynamic and emerging field of convergence—which brings together engineering and the physical and life sciences—and explains how convergence provides a blueprint for addressing the health care challenges of the 21st century by producing a new knowledge base, as well as a new generation of diagnostics and therapeutics. We discuss how convergence enables the innovation necessary to meet the growing demand for accessible, personalized, affordable health care. We also address the role of government agencies in addressing this challenge and providing funding for innovative research. Finally, we recommend strategies for embedding convergence within agencies like the National Institutes of Health (NIH), which aims to optimize basic research, improve health technology, and foster important medical advances.
If “convergence” is the melding of physics and engineering with the life sciences, then I suggest that a good place to start your search for convergence is the 4th edition of Intermediate Physics for Medicine and Biology. The MIT white paper is singing our song about the integration of physics with biology. But I am a Johnny-come-lately to convergence compared to my coauthor, Russ Hobbie, who pioneered this approach decades ago.
Between 1971 and 1973 I audited all the courses medical students take in their first two years at the University of Minnesota. I was amazed at the amount of physics I found in these courses.
You can find more about the white paper in an article in the Science Insider. The authors talk about three revolutions in biomedicine: the first was molecular and cellular biology, the second was genomics, and the third will be convergence. I must admit that I find the white paper a little self-serving; most of their examples feature MIT researchers (says the guy who writes a weekly blog about physics in medicine and biology with the goal of peddling textbooks!). But I agree with its premise. Indeed, the first sentence of their concluding paragraph sounds as if it could be a promotion for our book.
The merger of the life, engineering, and physical sciences promises to fundamentally alter and speed our scientific trajectory. NIH and other affected agencies, if adequately funded and made ready, can be thought leaders in this next scientific revolution. The time is right for NIH and other agencies to take up convergence as the wave of the future, creating dramatic new opportunities in medicine for new therapies and diagnostics, economic opportunity, as well as promise in many other scientific fields, from energy to climate to agriculture.