Friday, October 25, 2019

One Hundred Books About Physics for Medicine and Biology

When I was in high school, I became intrigued by St. John’s College and their Great Books program. I had their brochure, which included a list of the books to read each year: the most famous works of Western civilization.

In the spirit of St. John’s, below I list one hundred Great Books about physics applied to medicine and biology. Read all these and you will have obtained a liberal education in biological and medical physics. One book you won’t find on this list is Intermediate Physics for Medicine and Biology. I’m going to assume you’ve already read IPMB and my goal is to suggest books to supplement it.

Where to begin? I’ll assume you have taken a year of physics and a year of calculus. Once you have these prerequisites, start reading.
  1. Powers of Ten. First an overview that’s easy and fun. It provides an intuitive feel for the relative sizes of things. 
  2. The Machinery of Life. Although I’m assuming you’ve studied some physics and math, I’m not assuming you have much background in biology. This book provides a gentle introduction to biochemistry. Plus, it has those wonderful drawings by David Goodsell.
  3. The Art of Insight in Science and Engineering. Remember: We seek insight, not just facts.
  4. Physical Models of Living Systems. IPMB is about modeling in medicine and biology. Philip Nelson’s little book gets us started building models. 
  5. The Feynman Lectures on Physics. I know, I know...you’ve already studied introductory physics, but The Feynman Lectures are special. You don’t want to miss them, and they contain some biology too.
  6. Air and Water. Now we get to our main topic: physics applied to biology. Mark Denny’s book covers many topics found in the first half of IPMB.
  7. Physics with Illustrative Examples from Medicine and Biology. This classic three-volume set covers much of the same ground as IPMB.
  8. The Double Helix. To further strengthen your background in biology, read James Watson’s first-person account of how he and Francis Crick discovered the structure of DNA. It’s a required text for any student of science, and is an easy read.
  9. The Eighth Day of Creation: The Makers of the Revolution in Biology. After warming up with The Double Helix, it’s time to dig deeper into the history and ideas of modern biology. Physicists play a large role in this book, and it’s wonderfully written.
  10. Biomechanics of Human Motion. Chapter 1 in IPMB covers statics applied to the bones and muscles of the body. It’s our first book that focuses in detail on a specific topic.
  11. Structures, or Why Things Don’t Fall Down. A delightful book about mechanics, including some biomechanical examples. It’s one of the most enjoyable books on this list. Don’t miss the sequel, The New Science of Strong Materials.
  12. Biomechanics: Mechanical Properties of Living Tissue. We need a book about biomechanics that treats tissue as a continuous medium. YC Fung’s textbook fills that niche.
  13. A Treatise on the Mathematical Theory of Elasticity. This book is long and technical, and may contain more material than you really need to know. Nevertheless, it’s a great place to learn elasticity. I’m sure there are more modern books that you may prefer. Skip if you’re in a hurry.
  14. The Physics of Scuba Diving. An easy read about how hydrostatics impacts divers.
  15. Life in Moving Fluids. Fluid dynamics is one of those topics that’s critical to life, but is often skipped in introductory physics classes. This book by Steven Vogel provides an excellent introduction to the field of biological fluid dynamics.
  16. Vital Circuits. Another book by Vogel, which focuses on the fluid dynamics of the circulatory system. 
  17. Boundary Layer Theory. This large tome may be too advanced for the list, but I learned a lot from it. Skip if you need to move along quickly.
  18. Textbook of Medical Physiology. We need to get serious about learning physiology. This classic text is by Arthur Guyton, but any good physiology textbook will do. Not much physics here. The book contains more biology than we need, but physiology is too important to skip.
  19. e, The Story of a Number. A gentle introduction into calculus and differential equations, and a great history of the exponential function, the topic of IPMB’s second chapter.
  20. Quick Calculus. Yes, you already studied calculus. But we are about to get more mathematical, and this book will help you brush up on math you may have forgotten. If you don’t need it, move on. 
  21. Used Math. Finish your math review with this outline of mathematics essential for college physics.
  22. The Essential Exponential. It’s time to focus specifically on the exponential function and its properties, so important in biology and medicine.
  23. A Change of Heart. Chapter 2 of IPMB mentions the Framingham heart study. Read the story behind the project.
  24. On Being the Right Size. This is really an essay, but indulge me while I include it here among the books. J. B. S. Haldane is too fascinating a writer to miss.
  25. Scaling: Why is Animal Size so Important? Knut Schmidt-Nielsen’s study of scaling, a key topic in Chapter 2 of IPMB.
  26. Lady Luck. Chapter 3 of IPMB requires us to know some probability, and Warren Weaver’s book is an engaging introduction.
  27. Statistical Physics. The first few sections of Chapter 3 in IPMB develop the ideas of statistical physics in a way reminiscent of Frederick Reif’s volume in the Berkeley Physics Course.
  28. An Introduction to Thermal Physics. For those who want a more traditional approach to thermodynamics, I recommend Daniel Schroeder’s textbook.
  29. Lehninger Principles of Biochemistry. Biological thermodynamics overlaps with biochemistry. Any good biochemistry book will do. They all contain more detail than you need, but a biological physicist must know some biochemistry.
  30. The Second Law. This delightful book by Peter Atkins will fill a hole in IPMB: a penetrating discussion about the second law of thermodynamics.
  31. Div Grad Curl and All That. Chapter 4 of IPMB uses vector calculus, and there is no better introduction to the topic.
  32. Random Walks in Biology. Howard Berg’s wonderful little book about diffusion.
  33. The Mathematics of Diffusion. John Crank’s intimidating giant tome about diffusion. Mathephobes shouldn’t bother with it; Mathephiles shouldn’t miss it.
  34. Conduction of Heat in Solids. Like the book by Crank, this ponderous textbook by Horatio Carslaw and John Jaeger presents all you ever want to know about solving the heat equation (also known as the diffusion equation).
  35. How Animals Work. Another delightful book by Schmidt-Nielsen that considers comparative physiology, and topics in Chapter 5 of IPMB such as countercurrent heat exchange.
  36. The Nuts and Bolts of Life. A colorful book about the first dialysis machine.
  37. The Biomedical Engineering Handbook. Don’t read this encyclopedia-like multi-volume handbook in one sitting. Yet it provides dozens of examples of how physics is applied to medicine. Ask your library to buy this set and the next one.
  38. Encyclopedia of Medical Devices and Instrumentation. The title should be Case Studies: How Physics is Applied to Medicine.
  39. Plant Physics. IPMB doesn’t say much about plants, but physics impacts botany as well as zoology.
  40. Nerve, Muscle, and Synapse. Bernard Katz’s excellent, if somewhat dated, introduction to all the electrophysiology you need for Chapter 6 of IPMB.
  41. The Conduction of the Nervous Impulse. Read about the Hodgkin-Huxley model from the pen of Alan Hodgkin himself.
  42. From Neuron to Brain. A modern introduction to neuroscience.
  43. Electricity and Magnetism. This book by Ed Purcell is part of the Berkeley Physics Course. The first of three physics books about electricity and magnetism.
  44. Introduction to Electrodynamics. David Griffiths’s text competes with Purcell’s for my favorite electricity and magnetism book.
  45. Classical Electrodynamics. John David Jackson’s famous graduate-level physics text may be more electricity and magnetism than you want, but how could I leave it off the list?
  46. Galvani’s Spark. A history of neurophysiology.
  47. Shattered Nerves. A fascinating look at using electrical stimulation to compensate for neural injury. A history of neural prostheses.
  48. Bioelectricity: A Quantitative Approach. The first, and probably easiest, of three bioelectricity textbooks.
  49. Bioelectromagnetism. Jaakko Malmivuo and Robert Plonsey’s big book about bioelectricity.
  50. Bioelectricity and Biomagnetism. Another big tome. Ramesh Gulrajani’s alternative to Malmivuo and Plonsey.
  51. The Art of Electronics. In order to understand voltage clamping and other electrophysiological methods, you need to know some electronics. This book is my favorite introduction to the topic. 
  52. Mathematical Handbook of Formulas and Tables. Chapter 6 contains lots of mathematics, and the next three books are references you may want. This Schaum’s Outline contains most of the math you’ll ever need. It’s cheap, light, and easy to use. Keep it handy.
  53. Handbook of Mathematical Functions with Formulas, Graphs and Mathematical Tables. No one would sit down and read this handbook straight through, but “Abramowitz and Stegun” is invaluable as a reference.
  54. Table of Integrals, Series, and Products. “Gradshteyn and Ryzhik” is the best integral table ever. Let the library buy it, but have them keep it in the reference section so you can find it quickly. 
  55. Numerical Recipes. If you want to solve the equations of the Hodgkin-Huxley model, you need to program a computer. This book is great for finding the needed numerical methods.
  56. Numerical Methods that Work. Forman Acton’s book is more chatty than Numerical Recipes, but full of insight.
  57. Machines in our Hearts. Chapter 7 of IPMB examines the heart. Read this history of pacemakers and defibrillators to put it all in perspective.
  58. Cardiac Electrophysiology: From Cell to Bedside. This multi-author, multi-edition work contains everything you always wanted to know about the electrical properties of the heart, but were afraid to ask.
  59. Cardiac Bioelectric Therapy. Another multi-author collection, with several excellent chapters about the bidomain model.
  60. When Time Breaks Down. Art Winfree’s unique analysis of the electrical properties of the heart.
  61. Electric Fields of the Brain. Paul Nunez’s book about the electroencephalogram from the perspective of a physicist.
  62. Iron, Nature’s Universal Element. Why people need iron and animals make magnets.
  63. The Spark of Life. An accessible introduction to electrophysiology and ion channel diseases.
  64. Ion Channels of Excitable Membranes. The definitive textbook about ion channels, by Bertil Hille.
  65. Voodoo Science. Some of the topics in Section 9.10 of IPMB about possible effects of weak electric and magnetic fields make me yearn for this hard-hitting book by Bob Park.
  66. Dynamics: The Geometry of Behavior. Chapter 10 of IPMB covers nonlinear dynamics. This beautiful book introduces dynamics using pictures.
  67. From Clocks to Chaos. Leon Glass and Michael Mackey introduce the idea of a dynamical disease.
  68. Nonlinear Dynamics and Chaos. Steven Strogatz’s classic; my favorite book about nonlinear dynamics.
  69. Mathematical Physiology. An award-winning textbook about applying math to biology.
  70. Mathematical Biology. Another big fine textbook for the mathematically inclined.
  71. The Geometry of Biological Time. A quirky book by Art Winfree, more wide-ranging than When Time Breaks Down.
  72. Data Reduction and Error Analysis for the Physical Sciences. Many of the ideas about least squares fitting discussed in Chapter 11 of IPMB are related to analyzing noisy data.
  73. The Fourier Transform and its Applications. The Fourier transform is the most important concept in Chapter 11. Ronald Bracewell’s book is a great place to learn about it.
  74. Introduction to Membrane Noise. Louis DeFelice’s book explains how to deal with noise.
  75. Naked to the Bone. A historical survey of medical imaging.
  76. Medical Imaging Physics. A book by William Hendee and E Russell Ritenour, at a level similar to IPMB but dedicated entirely to imaging. Also see its partner, Hendee's Radiation Therapy Physics.
  77. Foundations of Medical Imaging. A big, technical book about imaging.
  78. Theoretical Acoustics. Not much biology here, but a definitive survey of acoustics to back up Chapter 13 of IPMB.
  79. Physics of the Body. This book discusses many topics, including hearing.
  80. Musicophilia. An extraordinary book by Oliver Sacks about the neuroscience of hearing.
  81. Quantum Physics of Atoms, Molecules, Solids, Nuclei, and Particles. My choice for a modern physics textbook, with much information about the interaction of light with matter.
  82. The First Steps in Seeing. Robert Rodieck’s incredible book about the physics of vision.
  83. The Optics of Life. This masterpiece by Sonke Johnsen walks you through optics, examining all the biological applications. A great supplement to Chapter 14 of IPMB.
  84. From Photon to Neuron. A study of light, imaging, and vision.
  85. Introduction to Physics in Modern Medicine. Suzanne Amador Kane’s nice introduction to physics applied to medicine, covering many topics in the last half of IPMB.
  86. Introduction to Radiological Physics and Radiation Dosimetry. Frank Herbert Attix wrote the definitive textbook about how x-rays interact with tissue, a topic covered in Chapter 15 of IPMB.
  87. Radiobiology for the Radiologist. The go-to reference for how cells and tissues respond to radiation.
  88. Molecular Biology of the Cell. The classic textbook of cell biology.
  89. Radiation Oncology: A Physicist’s-Eye View. Explains how to treat cancer using radiation.
  90. The Physics of Radiation Therapy. Faiz Khan’s in-depth study of radiation therapy.
  91. The Atomic Nucleus. A classic about nuclear physics, providing background for Chapter 17 of IPMB. You could replace it with one of many modern nuclear physics textbooks.
  92. The Immortal Life of Henrietta Lacks. A fascinating study of how a woman treated for cancer using radioactivity ended up providing science with an immortal cell line.
  93. Strange Glow. How radiation impacts society.
  94. The Radium Girls. This book is about women poisoned by radium-containing paint (lip, dip, paint). It reminds us why we study medical physics.
  95. Magnetic Resonance Imaging: Physical Properties and Sequence Design. All you need to know about MRI.
  96. Principles of Nuclear Magnetic Resonance Microscopy. Paul Callaghan’s view of magnetic resonance imaging.
  97. Echo Planar Imaging. Advanced MRI techniques.
  98. Biological Physics. IPMB is not strong in covering physics applied to cellular and molecular biology. Here are three great books to fill that gap.
  99. Cell Biology by the Numbers. I love the quantitative approach to biology.
  100. Physical Biology of the Cell. How physicists view biology. 
Don’t see your favorite listed? Here’s my call to action: Add your recommendations to the comments below.

I didn’t end up going to St. John’s College and studying the Great Books. Instead, I attended a more traditional school, the University of Kansas. I loved KU, and I have no regrets. But sometimes I wonder...

Friday, October 18, 2019

Entry Region and Deviations from Poiseuille Flow

Chapter 1 of Intermediate Physics for Medicine and Biology analyzes viscous flow in a tube, also known as Poiseuille flow. Russ Hobbie and I derive the well-known parabolic distribution of speed: motionless at the wall—the no-slip boundary condition—and maximum at the center. In Section 1.20 we consider departures from the parabolic flow profile.
The entry region causes deviations from Poiseuille flow in larger vessels. Suppose that blood flowing with a nearly flat velocity profile enters a vessel, as might happen when blood flowing in a large vessel enters the vessel of interest, which has a smaller radius. At the wall of the smaller vessel, the flow is zero. Since the blood is incompressible, the average velocity is the same at all values of x, the distance along the vessel. (We assume the vessel has a constant cross-sectional area.) However, the velocity profile v(r) changes with distance x along the vessel. At the entrance to the vessel (x = 0), there is a very abrupt velocity change near the walls. As x increases, a parabolic velocity profile is attained. The transition or entry region is shown in Fig. 1.35. In the entry region, the pressure gradient is different from the value for Poiseuille flow. The velocity profile cannot be calculated analytically in the entry region. Various numerical calculations have been made, and the results can be expressed in terms of scaled variables (see, for example, Cebeci and Bradshaw 1977). The Reynolds number used in these calculations was based on the diameter of the pipe, D = 2Rp, and the average velocity. The length of the entry region is

L = 0.05 D NR,D = 0.1 Rp NR,D = 0.2 Rp NR,Rp.       (1.63)
IPMB’s Figure 1.35 is shown below.

Figure 1.35 from Intermediate Physics for Medicine and Biology.

This figure first appeared in the 3rd edition of IPMB, for which Russ was sole author. I got to wondering how he created it.

Momentum Transfer in Boundary Layers, by Tuncer Cebeci and Peter Bradshaw, superimposed on Intermediate Physics for Medicine and Biology.
I checked out the book Momentum Transfer in Boundary Layers, by Cebeci and Bradshaw (1977), through interlibrary loan and found the part about entrance-region flow in their Section 5.7.1. They write
Figures 5.9 and 5.10 show the velocity profiles, u/u0, and the dimensionless centerline (maximum) velocity, uc/u0, as functions of 2x*/Rd in the entrance region of a pipe. Here x* = x/r0 and Rd = u0d/ν. Figure 5.10 also shows the measured values of the centerline velocity obtained by Pfenninger (1951). According to the results of Fig. 5.10, the centerline velocity has almost reached its asymptotic value of 2 at 2x*/Rd = 0.20. Thus the entrance length for a laminar flow in a circular pipe is

le/d = Rd/20       (5.7.12)
Their Fig. 5.9 is

A photograph of Figure 5.9 from Momentum Transfer in Boundary Layers, by Tuncer Cebeci and Peter Bradshaw (1977).
I believe this is the figure Russ used to create his drawing. Clever guy that he is, he seems to have taken the traces, rotated them by 90°, and reflected them across the centerline so they extend from the upper to lower wall. The variable r in their Fig. 5.9 is the distance from the wall, and r0 is the radius of the vessel, so r/r0 = 1 at the center of the vessel and r/r0 = 0 at the wall (don’t ask me why they defined r in this odd way); x* is x/r0, or the distance along the length of the vessel in terms of the vessel radius; ν is the kinematic viscosity, equal to the coefficient of viscosity η divided by the density ρ; and finally Rd is the Reynolds number, defined using the vessel diameter, which Russ and I call NR,D.

At the vessel entrance (x* = 0), the flow is uniform across its cross section with speed u0. The curve corresponding to 2x*/Rd = 0.531 looks almost exactly like Poiseuille flow, with the shape of a parabola and a maximum speed equal to 2u0. For the curve 2x*/Rd = 0.101 the flow is close to, but not exactly, Poiseuille flow. Cebeci and Bradshaw somewhat arbitrarily define the entrance length as corresponding to 2x*/Rd = 0.2.

The example that Russ analyzed in the caption to Fig. 1.35 corresponds to a large vessel such as the brachial artery in your upper arm. A diameter of 4 mm and an entrance length of 240 mm implies a Reynolds number of Rd = 1200. In this case, the entrance length is much greater than the diameter, and is similar to the length of the vessel. If we consider a small vessel like a capillary, however, we get a different picture. A typical Reynolds number for a capillary would be du0ρ/η = (8 × 10-6 m)(0.001 m/s)(1000 kg/m3)/(3 × 10-3 kg/m/s) = 0.0027, which implies an entrance length of about one nanometer. In other words, the parabolic flow distribution is established almost immediately, over a distance much smaller than the vessel’s length and even its radius. The entrance length is negligible in small vessels like capillaries.
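Both estimates are easy to check numerically. Here’s a short Python sketch of Eq. 1.63; the blood speed of 0.9 m/s in the brachial artery is my assumption, chosen only to reproduce the Reynolds number of 1200 quoted above.

```python
# Entrance length for laminar flow in a tube, L = 0.05 * D * N_R,
# with the diameter-based Reynolds number N_R = rho * v * D / eta.

def entrance_length(diameter, speed, rho=1000.0, eta=3e-3):
    """Return (entrance length in m, Reynolds number) for a blood-like fluid."""
    reynolds = rho * speed * diameter / eta
    return 0.05 * diameter * reynolds, reynolds

# Brachial artery: D = 4 mm; a speed of 0.9 m/s (my assumption) gives Re = 1200
L_artery, Re_artery = entrance_length(4e-3, 0.9)

# Capillary: D = 8 microns, v = 1 mm/s
L_cap, Re_cap = entrance_length(8e-6, 1e-3)

print(Re_artery, L_artery)  # 1200.0, 0.24 m (the 240 mm from Fig. 1.35)
print(Re_cap, L_cap)        # ~0.0027, ~1e-9 m (about a nanometer)
```

The same function handles both regimes; only the diameter and speed change, which is why the entrance length swings from vessel-sized to utterly negligible.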

Tuncer Cebeci is a Turkish-American mechanical engineer who worked for the Douglas Aircraft Company and was chair of the Department of Aerospace Engineering at California State University Long Beach. He has authored many textbooks in aeronautics, and developed the Cebeci-Smith model used in computational fluid dynamics. Peter Bradshaw was a professor in the Department of Aeronautics at the Imperial College London and then at Stanford, and is a fellow of the Royal Society. He developed the “Bradshaw Blower,” a type of wind tunnel used to study turbulent boundary layers.

Cebeci and Bradshaw describe why they wrote Momentum Transfer in Boundary Layers in their preface.
This book is intended as an introduction to fluid flows controlled by viscous or turbulent stress gradients and to modern methods of calculating these gradients. It is nominally self-contained, but the explanations of basic concepts are intended as review for senior students who have already taken an introductory course in fluid dynamics, rather than for beginning students. Nearly all stress-dominated flows are shear layers, the most common being the boundary layer on a solid surface. Jets, wakes, and duct flows are also shear layers and are discussed in this volume. Nearly all modern methods of calculating shear layers require the use of digital computers. Computer-based methods, involving calculations beyond the reach of electromechanical desk calculators, began to appear around 10 years ago... With the exception of one or two specialist books... this revolution has not been noticed in academic textbooks, although the new methods are widely used by engineers.
This post illustrates how IPMB merely scratches the surface when explaining how physics impacts medicine and biology. Behind each sentence, each figure, and each reference is a story. I wish Russ and I could tell them all.

Friday, October 11, 2019

A Blog as Ancillary Material for a Physics Textbook

Today I’m attending the Fall 2019 Meeting of the Ohio-Region Section of the American Physical Society and the Michigan Section of the American Association of Physics Teachers, held at Kettering University in Flint, Michigan. Flint is just 45 minutes north of Oakland University, so this is a local meeting for me.

At the meeting I’ll present a poster titled “A Blog as Ancillary Material for a Physics Textbook.” As you can probably guess, the blog I’m referring to is the one you’re reading now. My poster is shown below.

My poster for the Fall 2019 Meeting of the Ohio-Region Section of the American Physical Society and the Michigan Section of the American Association of Physics Teachers.
The poster begins with my meeting abstract.
Nowadays, textbooks come with many ancillary materials: solution manuals, student guides, etc. A unique ancillary feature is a blog. A blog allows you to keep your book up-to-date, to expand on ideas covered only briefly in your book, to point to other useful learning materials such as websites, articles and other books, and to interact directly with students using your book.
Then I address the question “Why write a blog associated with a textbook?” My reasons are to
  • Keep your book up-to-date. 
  • Present background material. 
  • Offer links to related websites, videos, and other books. 
  • Try out new material for future editions. 
  • Provide a direct line of communication between you and your readers. 
  • Reach out to students from other states and countries who are interested in your topic but don’t have your book (yet). 
  • Have fun. 
  • Increase book sales!
Next I discuss the blog for IPMB.
I am coauthor with Russell Hobbie of the textbook Intermediate Physics for Medicine and Biology (5th edition, Springer, 2015). My blog can be found at hobbieroth.blogspot.com. The blog began in 2007. I post once a week, every Friday morning, with over 600 posts so far. I also share the weekly posts on the book’s Facebook page. I use the Blogger software, which is free and easy to learn: https://www.blogger.com.
After that, I describe my different types of posts.
  • Useful for Instructors: Posts that will be especially helpful to faculty teaching from your book, such as sample syllabi, information about prerequisites, and links. 
  • Book Reviews: Reviews of books that are related to mine. 
  • Obituaries: Stories of famous scientists who have died recently. 
  • New Homework Problems: I often post new homework problems that instructors can use in class or on exams. 
  • My Own Research: Stories from my own research, to serve as examples of how to apply the material in the textbook. 
  • Lots of Math: Some of my posts are very mathematical, and I warn the reader. 
  • Personal Favorites: About 10% of my posts I list as personal favorites. These are particularly interesting, especially well written, or sometimes autobiographical.
Finally, I provide a sample post. I chose one of my favorites about craps, published on August 10, 2018.

A big thank you to my graduate student Dilmini Wijesinghe, who helped me design the poster. She’ll be at the meeting too, presenting another poster about biomechanics and mechanotransduction. But that’s another story.

Friday, October 4, 2019

Spiral MRI

In Chapter 18 of Intermediate Physics for Medicine and Biology, Russ Hobbie and I discuss a type of magnetic resonance imaging called echo-planar imaging.
In EPI the echoes are not created using π pulses. Instead, they are created by dephasing the spins at different positions along the x axis using a Gx gradient, and then reversing that gradient to rephase the spins, as shown in Fig. 18.32. Whenever the integral of Gx(t) is zero, the spins are all in phase and the signal appears. A large negative Gy pulse sets the initial value of ky to be negative; small positive Gy pulses (“blips”) then increase the value of ky for each successive kx readout. Echo-planar imaging requires strong gradients—at least five times those for normal studies—so that the data can be acquired quickly. Moreover, the rise and fall-times of these pulses are short, which induces large voltages in the coils. Eddy currents are also induced in the patient, and it is necessary to keep these below the threshold for neural activation. These problems can be reduced by using sinusoidally-varying gradient currents. The engineering problems are discussed in Schmitt et al. (1998); in Vlaardingerbroek and den Boer (2004); and in Bernstein et al. (2004).
Echo-Planar Imaging: Theory, Technique and Application, edited by Schmitt, Stehling, and Turner, superimposed on Intermediate Physics for Medicine and Biology.
To learn more about “sinusoidally-varying gradient currents,” I consulted the first of the three references, Echo-Planar Imaging: Theory, Technique and Application, edited by Franz Schmitt, Michael Stehling, and Robert Turner (Springer, 1998). In his chapter on the “Theory of Echo-Planar Imaging,” Mark Cohen discusses a spiral echo-planar pulse sequence in which the gradient fields have the unusual form Gx = Go t sin(ωt) and Gy = Go t cos(ωt).

Below I show the pulse sequence, which you can compare with the echo-planar imaging sequence in Fig. 18.32 of IPMB if you have the book by your side (don’t you always keep IPMB by your side?). The top two curves are the conventional slice selection sequence: a gradient Gz (red) in the z direction is applied during a radiofrequency π/2 pulse Bx (black), which rotates the spins into the x-y plane. The unconventional readout gradient Gx (blue) varies as an increasing sine wave. It produces a gradient echo at times corresponding approximately to the extrema of the Gx curve (excluding the first small positive peak). The phase encoding gradient Gy (green), an increasing cosine wave, is approximately zero at the echo times, but will shift the phase and therefore impact the amplitude of the echo.
A pulse sequence for spiral echo-planar imaging, based on Fig. 14 of “Theory of Echo-Planar Imaging,” by Mark Cohen in Echo-Planar Imaging: Theory, Technique and Application, edited by Schmitt, Stehling, and Turner.

If you look at the output in terms of spatial frequencies (kx, ky), you find that the echoes correspond to points along an Archimedean spiral.
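You can verify this by integrating the gradients numerically, since k(t) is proportional to the time integral of G(t). Here’s a sketch in arbitrary units; the values of G0, ω, and the gyromagnetic ratio are illustrative numbers I chose, not values from the chapter.

```python
import numpy as np

# Illustrative units: G0 (ramp rate), omega, gamma are convenient choices only
G0, omega, gamma = 1.0, 2 * np.pi, 1.0

t = np.linspace(0.0, 10.0, 100001)
dt = t[1] - t[0]

# Cohen's spiral gradient waveforms
Gx = G0 * t * np.sin(omega * t)
Gy = G0 * t * np.cos(omega * t)

# k-space trajectory: k(t) = gamma * integral of G(t') dt'
kx = gamma * np.cumsum(Gx) * dt
ky = gamma * np.cumsum(Gy) * dt

# For an Archimedean spiral the radius grows linearly with time,
# so doubling t should double |k|
r = np.sqrt(kx**2 + ky**2)
ratio = r[-1] / r[len(t) // 2]  # r at t = 10 versus t = 5
print(ratio)  # ≈ 2
```

Integrating by parts shows why: for ωt ≫ 1 the trajectory approaches kx ≈ −(γG0/ω) t cos(ωt), ky ≈ (γG0/ω) t sin(ωt), a spiral whose radius grows linearly with time.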

The spiral echo-planar imaging technique as viewed in frequency space, based on Fig. 13 of “Theory of Echo-Planar Imaging,” by Mark Cohen, in Echo-Planar Imaging: Theory, Technique and Application, edited by Schmitt, Stehling, and Turner.

Spiral echo-planar imaging has some drawbacks. Data in k-space is not collected over a uniform array, so you need to interpolate onto a square grid before performing a numerical two-dimensional inverse Fourier transform to produce the image. Moreover, you get blurring from chemical shift and susceptibility artifacts. The good news is that you eliminate the rapid turning on and off of gradient pulses, which reduces eddy currents that can cause their own image distortions and possibly neural stimulation. So, spiral imaging has advantages, but the pulse sequence sure looks weird.
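To make the interpolate-then-transform pipeline concrete, here is a toy nearest-neighbor gridding sketch. All parameters are invented for illustration, and real scanners use more careful convolution-based gridding with density compensation; this is only the skeleton of the idea.

```python
import numpy as np

# Toy gridding: scatter nonuniform k-space samples onto a 64x64 Cartesian
# grid by nearest neighbor, then apply a 2D inverse FFT.
M = 64
grid = np.zeros((M, M), dtype=complex)
count = np.zeros((M, M))

# Archimedean-spiral sample locations, k in [-0.5, 0.5)
t = np.linspace(0.0, 1.0, 20000)
kx = 0.5 * t * np.cos(40 * np.pi * t)
ky = 0.5 * t * np.sin(40 * np.pi * t)

# Toy "measured" signal: a Gaussian in k-space (its transform is a
# Gaussian blob, so the reconstruction is easy to sanity-check)
signal = np.exp(-(kx**2 + ky**2) / (2 * 0.01))

# Nearest-neighbor interpolation onto the grid, averaging duplicate hits
ix = np.clip(np.round((kx + 0.5) * M).astype(int), 0, M - 1)
iy = np.clip(np.round((ky + 0.5) * M).astype(int), 0, M - 1)
np.add.at(grid, (iy, ix), signal)
np.add.at(count, (iy, ix), 1)
grid[count > 0] /= count[count > 0]

# Reconstruct the image with a numerical 2D inverse Fourier transform
image = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(grid)))
print(image.shape)  # (64, 64)
```

The unsampled gaps between spiral arms are exactly where the blurring artifacts mentioned above creep in.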

Echo-planar imaging in general, and spiral imaging in particular, are very fast. In his chapter on “Spiral Echo-Planar Imaging,” Craig Meyer discusses his philosophy about using EPI.
Spiral scanning is a promising alternative to traditional EPI. The properties of spiral scanning stem from the circularly symmetric nature of the technique. Among the attractive properties of spiral scanning are its efficiency and its good behavior in the presence of flowing material; the most unattractive property is that uncorrected inhomogeneity leads to image blurring. Spiral image reconstruction can be performed rapidly using gridding, and there are a number of techniques for compensating for inhomogeneity. There are good techniques for generating efficient spiral gradient waveforms. Among the growing number of applications of spiral scanning are cardiac imaging, angiography, abdominal tumor imaging, functional imaging, and fluoroscopy.

Spiral scanning is a promising technique, but at the present it is still not in routine clinical use. There are many theoretical reasons why spiral scanning may be advantageous for a number of clinical problems, and initial volunteer and clinical studies have yielded very promising results for a number of applications. Still, until spiral scanning is established in routine clinical use, some caution is warranted about proclaiming it to be the answer for any particular question.

Friday, September 27, 2019

The Cauchy Distribution

In an appendix of Intermediate Physics for Medicine and Biology, Russ Hobbie and I analyze the Gaussian probability distribution
p(x) = (1/(σ√(2π))) exp[−(x − x̄)²/(2σ²)]
It has the classic bell shape, centered at the mean x̄ with a width determined by the standard deviation σ.

Other distributions have a similar shape. One example is the Cauchy distribution
$$p(x) = \frac{1}{\pi}\, \frac{\gamma}{(x-\bar{x})^2 + \gamma^2} ,$$
where the distribution is centered at x̄ and has a half-width at half-maximum γ. I initially thought the Cauchy distribution would be as well behaved as any other probability distribution, but it’s not. It has no mean and no standard deviation!

Rather than thinking abstractly about this issue, I prefer to calculate and watch how things fall apart. So, I wrote a simple computer program to generate N random samples using either the Gaussian or the Cauchy distribution. Below is a histogram for each case (N = 1000; Gaussian, x̄ = 0, σ = 1; Cauchy, x̄ = 0, γ = 1).
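A sketch of such a program in Python (the samplers, seed, and bin range are my choices here):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000

gauss = rng.normal(loc=0.0, scale=1.0, size=N)   # Gaussian: mean 0, σ = 1
cauchy = rng.standard_cauchy(N)                  # Cauchy: center 0, γ = 1

# Histogram both over -20 to 20, as in the figure.
bins = np.linspace(-20, 20, 81)
gauss_counts, _ = np.histogram(gauss, bins)
cauchy_counts, _ = np.histogram(cauchy, bins)

# The Gaussian samples all land in range; some Cauchy samples fall far outside it.
print(np.abs(gauss).max(), np.abs(cauchy).max())
```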

Histograms for 1000 random samples obtained using the Cauchy (left) and Gaussian (right) probability distributions.

Those samples out on the wings of the Cauchy distribution are what screw things up. The probability falls off so slowly that there is a significant chance of having a random sample that is huge. The histograms shown above are plotted from −20 to 20, but one of the thousand Cauchy samples was about −2400. I’d need to plot the histogram over a range more than one hundred times wider to capture that bin in the histogram. Seven of the samples had a magnitude over one hundred. By contrast, the largest sample from the Gaussian was about 4.6.

What do these few giant samples do to the mean? The average of the thousand samples shown above obtained from the Cauchy distribution is −1.28, which is bigger than the half-width at half-max. The average of the thousand samples obtained from the Gaussian distribution is −0.021, which is much smaller than the standard deviation.

Even more interesting is how the mean varies with N. I tried a bunch of cases, summarized in the figure below.
A plot of the mean versus sample size, for data drawn from the Gaussian and Cauchy probability distributions.

There’s a lot of scatter, but the means for the Gaussian data appear to get smaller (closer to the expected value of zero) as N gets larger. The red line is not a fit, but merely drawn by eye; I included it to show how the means fall off with N. It has a slope of −½, implying that the means decay roughly as 1/√N. In contrast, the means for the Cauchy data are large (on the order of one) and don’t fall off with N. No matter how many samples you collect, your mean doesn’t approach the expected value of zero. Some oddball sample comes along and skews the average.

If you calculate the standard deviations for these cases, the problem is even worse. For data generated using the Cauchy distribution, the standard deviation grows with N. For N over a million, the standard deviation is usually over a thousand (remember, the half-width at half-max is one), and for my N = 5,000,000 case the standard deviation was over 600,000. Oddballs dominate the standard deviation.
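The whole experiment takes only a few lines to repeat (again a sketch; the seed and sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

for N in (100, 10_000, 1_000_000):
    g = rng.normal(size=N)          # Gaussian: mean 0, σ = 1
    c = rng.standard_cauchy(N)      # Cauchy: center 0, γ = 1
    print(f"N = {N:>9,}:  Gaussian mean {g.mean():+.4f}, std {g.std():.3f};  "
          f"Cauchy mean {c.mean():+.3f}, std {c.std():.1f}")
```

You should see the Gaussian mean shrink with N while its standard deviation stays near one, and the Cauchy columns misbehave: a mean that wanders instead of converging, and a standard deviation that grows as oddball samples pile up.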

I’m sorry if my seat-of-the-pants experimental approach to analyzing the Cauchy distribution seems simplistic, but for me it provides insight. The Cauchy distribution is weird, and I’m glad Russ and I didn’t include an appendix about it in Intermediate Physics for Medicine and Biology.

Friday, September 20, 2019

Happy Birthday Professor Fung

Yuan-Cheng Fung celebrated his 100th birthday last Sunday.

Biomechanics: Mechanical Properties of Living Tissues, by Y. C. Fung.
When Russ Hobbie and I needed to cite a general biomechanics textbook in Intermediate Physics for Medicine and Biology, we chose Biomechanics: Mechanical Properties of Living Tissues by “Bert” Fung.
Whenever a force acts on an object, it undergoes a change of shape or deformation. Often these deformations can be ignored… In other cases, such as the contraction of a muscle, the expansion of the lungs, or the propagation of a sound wave, the deformation is central to the problem and must be considered. This book will not develop the properties of deformable bodies extensively; nevertheless, deformable body mechanics is important in many areas of biology (Fung 1993).
According to Google Scholar, Biomechanics has over 10,000 citations, implying it’s a very influential book. In his introduction, Fung writes
Biomechanics seeks to understand the mechanics of living systems. It is an ancient subject and covers a very wide territory. In this book we concentrate on physiology and medical applications, which constitute the majority of recent work in the field. The motivation for research in this area comes from the realization that physiology can no more be understood without biomechanics than an airplane can without aerodynamics. For an airplane, mechanics enables us to design its structure and predict its performance. For an organ, biomechanics helps us to understand its normal function, predict changes due to alterations, and propose methods of artificial intervention. Thus diagnosis, surgery, and prosthesis are closely associated with biomechanics.
A First Course in Continuum Mechanics, by Y. C. Fung.
Another of Fung’s books that I like is A First Course in Continuum Mechanics. He states his goal in its first sentence. It’s a similar goal to that of IPMB.
Our objective is to learn how to formulate problems in mechanics, and how to reduce vague questions and ideas to precise mathematical statements, as well as to cultivate a habit of questioning, analyzing, designing, and inventing in engineering and science.
A special issue of the Journal of Biomechanical Engineering is dedicated to Fung’s birthday celebration. The editors write
Dr. Fung has been a singular pioneer in the field of Biomechanics, establishing multiple biomechanical theories and paradigms in various organ systems, including the heart, blood vessels, blood cells, and lung... He has mentored and trained many researchers in the biomechanics and bioengineering fields. His books … have become the classic biomechanics textbooks for students and researchers around the world. Dr. Fung is a member of all three U.S. National Academies—National Academy of Sciences, National Academy of Engineering, and National Academy of Medicine. He is also a member of the Chinese Academy of Sciences and a member of Academia Sinica. He has received many awards including the Timoshenko medal, the Russ Prize, and the National Medal of Science.
Fung earned his bachelor’s and master's degrees in aeronautics from the Central University of China in 1941 and 1943. College must have been difficult in China during the Second World War. I bet he has stories to tell. After the war he won a scholarship to come to the United States and study at Caltech, where he earned his PhD in 1948.

Fung joined the faculty at Caltech and remained there for nearly twenty years. In the 1950s, he became interested in biomechanics when his mother was suffering from glaucoma. In 1966, Fung moved to the University of California at San Diego, where he established their bioengineering program. He is known as the “Father of Modern Biomechanics.”

Happy birthday Professor Fung.

 
Yuan-Cheng Fung: 2000 National Medal of Science

2007 Russ Prize video

Friday, September 13, 2019

Intermediate Physics for Medicine and Biology has a New Website

A New Website

This summer I received an email from University Technology Services saying that faculty websites, like the one I maintain about Intermediate Physics for Medicine and Biology, would no longer be supported at Oakland University. In other words, IPMB needed a new online home. So today I announce our new website: https://sites.google.com/view/hobbieroth. If you try to access the old website listed in IPMB, www.oakland.edu/~roth/hobbie, it’ll link you to the new site, but I don’t know how long that will last.

What can you find at our new website? Lots of stuff, including
If you’re looking for my website, it’s changed too, to https://sites.google.com/view/bradroth.

Class Videos

This semester I’m teaching PHY 3250, Biological Physics. I am recording each class, and I’ll upload the videos to YouTube. Anyone can watch the lectures for free, as if it were an online class. I still use the blackboard, and sometimes it’s difficult to read in the video. I hope you can follow most of the lectures.
PHY 3250 class on September 6, 2019, covering biomechanics.

 

Useful for Instructors

If you scroll down to the box on the right of hobbieroth.blogspot.com you will find a list of labels. Click the one called “Useful for Instructors” and you can find several posts that are….er….useful for instructors. If you’re teaching from IPMB, you might find these posts particularly helpful.

Google Scholar

Below is a screenshot of IPMB’s Google Scholar citation statistics. We’ve averaged 26 citations a year over the last ten years, or one every two weeks. We thank all of you who’ve referenced IPMB. We’re delighted you found it important enough to cite.

A screenshot of the Google Scholar citation data for Intermediate Physics for Medicine and Biology, taken September 1, 2019.

Friday, September 6, 2019

The Linear No-Threshold Model of Radiation Risk

Certain topics discussed in Intermediate Physics for Medicine and Biology always fascinate me. One is the linear no-threshold model. In Section 16.12, Russ Hobbie and I write
In dealing with radiation to the population at large, or to populations of radiation workers, the policy of the various regulatory agencies has been to adopt the linear no-threshold (LNT) model to extrapolate from what is known about the excess risk of cancer at moderately high doses and high dose rates, to low doses, including those below natural background.
Possible responses to radiation are summarized in Figure 16.51 of IPMB. Scientists continue to debate the LNT model because reliable data (shown by the two data points with their error bars in the upper right) do not extend down to low doses.

Figure 16.51 from IPMB, showing possible responses to various doses. The two lowest-dose measurements are shown with their error bars.
The linear no-threshold assumption is debated in a point/counterpoint article in the August issue of Medical Physics (“The Eventual Rejection of the Linear No-Threshold Theory Will Lead to a Drastic Reduction in the Demand for Diagnostic Medical Physics Services,” Volume 46, Pages 3325-3328). I have discussed before how useful point/counterpoint articles are for teaching medical physics. They provide a glimpse into the controversies that medical physicists grapple with every day. The title of each point/counterpoint article is phrased as a proposition. In this case, Aaron Jones argues for the proposition and Michael O’Connor argues against it. The moderator Habib Zaidi frames the issue in his overview
Controversies about the linear no‐threshold (LNT) hypothesis have been around since the early development of basic concepts in radiation protection and publication of guidelines by professional societies. Historically, this model was conceived over 70 yr ago and is still widely adopted by most of the scientific community and national and international advisory bodies (e.g., International Commission on Radiological Protection, National Council on Radiation Protection and Measurements) for assessing risk from exposure to low‐dose ionizing radiation. The LNT model is currently employed to provide cancer risk estimates subsequent to low level exposures to ionizing radiation despite being criticized as causing unwarranted public fear of all low-dose radiation exposures and costly implementation of unwarranted safety measures. Indeed, linearly extrapolated risk estimates remain hypothetical and have never been rigorously quantified by evidence-based studies. As such, is the LNT model legitimate and its use by regulatory and advisory bodies justified? What would be the impact on our profession if this hypothesis were to be rejected by the scientific community? Would this result in drastic reduction in the demand for diagnostic medical physics services? These questions are addressed in this month’s Point/Counterpoint debate.
Both protagonists give little support to the linear no-threshold hypothesis; they write as if its rejection is inevitable. What is the threshold dose below which risk is negligible? This question is not resolved definitively, but 100 mSv is the number both authors mention.

The linear no-threshold model has little impact for individuals, but is critical for estimating public health risks—such as using backscatter x-ray detectors in airports—when millions of people are exposed to minuscule doses. I’m no expert on this topic so I can’t comment with much authority, but I’ve always been skeptical of the linear no-threshold model.

Much of this point/counterpoint article deals with the impact of the linear no-threshold model on the medical physics job market. I agree with O’Connor that “[The title of the point/counterpoint article] is an interesting proposition as it implies that medical physicists care only about their field and not about whether or not a scientific concept (the LNT) is valid or not,” except “interesting” is not the word I would have chosen. I am skeptical that resolution of the LNT controversy will have significant consequences for medical physics employment. After we discuss a point/counterpoint article in my PHY 3260 (Medical Physics) class, I insist that students vote either “for” or “against” the proposition. In this case, I agree with O’Connor and vote against it.

I will leave you with O’Connor’s concluding speculation about how rejecting the linear no-threshold model will affect both the population at large and the future medical physics job market.
In our new enlightened world 30 yr from now, LNT theory has long been discarded, the public are now educated as to the benefits of low doses of ionizing radiation and there is no longer a race to push radiation doses lower and lower in x‐ray imaging. On the contrary, with acceptance of radiation hormesis, a new industry has arisen that offers the public an annual booster dose of radiation every year, particularly if they live in low levels of natural background radiation. How will this booster dose be administered? For those with the means, it might mean an annual trip to the Rocky Mountains. For others it could mean a trip to the nearest clinic for a treatment session with ionizing radiation. Who will oversee the equipment designed to deliver this radiation, to insure that the correct dose is delivered? The medical physicist!

Friday, August 30, 2019

The Book of Why

The Book of Why: The New Science of Cause and Effect, by Judea Pearl and Dana MacKenzie.
At Russ Hobbie’s suggestion, I read The Book of Why, by Judea Pearl. This book presents a new way of analyzing data, using causal inferences in addition to more traditional, hypothesis-free statistical methods. In his introduction, Pearl writes
If I could sum up the message of this book in one pithy phrase, it would be that you are smarter than your data. Data do not understand causes and effects; humans do. I hope that the new science of causal inference will enable us to better understand how we do it, because there is no better way to understand ourselves than by emulating ourselves. In the age of computers, this new understanding also brings with it the prospect of amplifying our innate abilities so that we can make better sense of data, be it big or small.
I had a hard time with this book, mainly because I’m not a fan of statistics. Rather than asking “why” questions, I usually ask “what if” questions. In other words, I build mathematical models and then analyze them and make predictions. Intermediate Physics for Medicine and Biology has a similar approach. For instance, what if drift and diffusion both act in a pore; which will dominate under what circumstances (Section 4.12 in IPMB)? What if an ultrasonic wave impinges on an interface between tissues having different acoustic impedances; what fraction of the energy in the wave is reflected (Section 13.3)? What if you divide up a round of radiation therapy into several small fractions; will this preferentially spare healthy tissue (Section 16.9)? Pearl asks a different type of question: the data shows that smokers are more likely to get lung cancer; why? Does smoking cause lung cancer, or is there some confounding effect responsible for the correlation (for instance, some people have a gene that makes them both more susceptible to lung cancer and more likely to smoke)?

Although I can’t say I’ve mastered Pearl’s statistical methods for causal inference, I do like the way he adopts a causal model to test data. Apparently for a long time statisticians analyzed data using no hypotheses, just statistical tests. If they found a correlation, they could not infer causation; does smoking cause lung cancer or does lung cancer cause smoking? Pearl draws many causal diagrams to make his causation assumptions explicit. He then uses these illustrations to derive his statistical model. These drawings remind me of Feynman diagrams that we physicists use to calculate the behavior of elementary particles.

Simpson’s Paradox

Just when my interest in The Book of Why was waning, Pearl shocked me back to attention with Simpson’s paradox.
Imagine a doctor—Dr. Simpson, we’ll call him—reading in his office about a promising new drug (Drug D) that seems to reduce the risk of a heart attack. Excitedly, he looks up the researcher’s data online. His excitement cools a little when he looks at the data on male patients and notices that their risk of a heart attack is actually higher if they take Drug D. “Oh well,” he says, “Drug D must be very effective for women.”

But then he turns to the next table, and his disappointment turns to bafflement. “What is this?” Dr. Simpson exclaims. “It says here that women who took Drug D were also at higher risk of a heart attack. I must be losing my marbles! This drug seems to be bad for women, bad for men, but good for people.”
To illustrate this effect, consider the example analyzed by Pearl. In a clinical trial some patients received a drug (treatment) and some didn’t (control). Patients who subsequently had heart attacks are indicated by red boxes, and patients who did not by blue boxes. In the figure below, the data is analyzed by gender: males and females.

An example of Simpson's paradox, showing men and women being divided into treatment and control groups. Based on The Book of Why, by Judea Pearl and Dana MacKenzie.

One out of twenty (5%) of the females in the control group had heart attacks, while three out of forty (7.5%) in the treatment group did. For women, the drug caused heart attacks! For males, twelve out of forty men in the control group (30%) suffered heart attacks, and eight out of twenty (40%) in the treatment group did. The drug caused heart attacks for the men too!

Now combine the data for men and women.

An example of Simpson's paradox, showing men and women pooled together into treatment and control groups. Based on The Book of Why, by Judea Pearl and Dana MacKenzie.

In the control group, 13 out of 60 patients had a heart attack (22%). In the treatment group, 11 of 60 patients had one (18%). The drug prevented heart attacks! This seems impossible, but if you don’t believe me, count the boxes; it’s not a trick. What do we make of this? As Pearl says “A drug can’t simultaneously cause me and you to have a heart attack and at the same time prevent us both from having heart attacks.”
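The arithmetic is easy to check; this short Python snippet (counts taken from the figure) confirms that the drug looks harmful within each sex yet helpful when the data are pooled:

```python
# Heart-attack counts from Pearl's example: (attacks, patients)
women = {"control": (1, 20), "treatment": (3, 40)}
men = {"control": (12, 40), "treatment": (8, 20)}

for group in ("control", "treatment"):
    wa, wn = women[group]
    ma, mn = men[group]
    print(f"{group:>9}:  women {wa/wn:.1%},  men {ma/mn:.1%},  "
          f"pooled {(wa + ma)/(wn + mn):.1%}")
# Within each sex the treatment rate is higher (7.5% > 5.0%, 40.0% > 30.0%),
# yet the pooled treatment rate is lower (18.3% < 21.7%).
```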

To resolve the paradox, Pearl notes that this was not a randomized clinical trial. Patients could decide to take the drug or not, and women chose the drug more often than men. The preference for taking the drug is what Pearl calls a “confounder.” The chance of having a heart attack is much greater for men than for women, but more women elected to join the treatment group than men. Therefore, the treatment group was overweighted with low-risk women, and the control group was overweighted with high-risk men, so when the data was pooled the treatment group looked like they had fewer heart attacks than the control group. In other words, the difference between treatment and control got mixed up with the difference between men and women. Thus, the apparent effectiveness of the drug in the pooled data is a statistical fluke. A randomized trial would have shown similar data for men and women, but a different result when the data was pooled. The drug causes heart attacks.

Mathematics

The Book of Why contains only a little mathematics; Pearl tries to make the discussion accessible to a wide audience. He does, however, use lots of math in his research. His opinion of math is similar to mine and to IPMB’s.
Many people find formulas daunting, seeing them as a way of concealing rather than revealing information. But to a mathematician, or to a person who is adequately trained in the mathematical way of thinking, exactly the reverse is true. A formula reveals everything: it leaves nothing to doubt or ambiguity. When reading a scientific article, I often catch myself jumping from formula to formula, skipping the words altogether. To me, a formula is a baked idea. Words are ideas in the oven.
One goal of IPMB is to help students gain the skills in mathematical modeling so that formulas reveal rather than conceal information. I often tell my students that formulas aren’t things you stick numbers into to get other numbers. Formulas tell a story. This idea is vitally important. I suspect Pearl would agree.

Modeling

The causal diagrams in The Book of Why aid Pearl in deriving the correct statistical equations needed to analyze data. Toy models in IPMB aid students in deriving the correct differential equations needed to predict behavior. I see modeling as central to both activities: you start with an underlying hypothesis about what causes what, you translate that into mathematics, and then you learn something about your system. As Pearl notes, statistics does not always have this approach.
In certain circles there is an almost religious faith that we can find the answers to these questions in the data itself, if only we are sufficiently clever at data mining. However, readers of this book will know that this hype is likely to be misguided. The questions I have just asked are all causal, and causal questions can never be answered from data alone. They require us to formulate a model of the process that generates the data, or at least some aspects of that process. Anytime you see a paper or a study that analyzes the data in a model-free way, you can be certain that the output of the study will merely summarize, and perhaps transform, but not interpret the data.
I enjoyed The Book of Why, even if I didn’t entirely understand it. It was skillfully written, thanks in part to coauthor Dana MacKenzie. It’s the sort of book that, once finished, I should go back and read again because it has something important to teach me. If I liked statistics more I might do that. But I won’t.

Friday, August 23, 2019

Happy Birthday, Godfrey Hounsfield!

Godfrey Hounsfield (1919-2004).
Wednesday, August 28, is the hundredth anniversary of the birth of Godfrey Hounsfield, the inventor of the computed tomography scanner.

In Intermediate Physics for Medicine and Biology, Russ Hobbie and I write
The history of the development of computed tomography is quite interesting (Kalender 2011). The Nobel Prize in Physiology or Medicine was shared in 1979 by a physicist, Allan Cormack, and an engineer, Godfrey Hounsfield…The Nobel Prize acceptance speeches (Cormack 1980; Hounsfield 1980) are interesting to read.
To celebrate the centenary of Hounsfield’s birth, I’ve collected excerpts from his interesting Nobel Prize acceptance speech.
When we consider the capabilities of conventional X-ray methods, three main limitations become obvious. Firstly, it is impossible to display within the framework of a two-dimensional X-ray picture all the information contained in the three-dimensional scene under view. Objects situated in depth, i. e. in the third dimension, superimpose, causing confusion to the viewer.

Secondly, conventional X-rays cannot distinguish between soft tissues. In general, a radiogram differentiates only between bone and air, as in the lungs. Variations in soft tissues such as the liver and pancreas are not discernible at all and certain other organs may be rendered visible only through the use of radio-opaque dyes.

Thirdly, when conventional X-ray methods are used, it is not possible to measure in a quantitative way the separate densities of the individual substances through which the X-ray has passed. The radiogram records the mean absorption by all the various tissues which the X-ray has penetrated. This is of little use for quantitative measurement.

Computed tomography, on the other hand, measures the attenuation of X-ray beams passing through sections of the body from hundreds of different angles, and then, from the evidence of these measurements, a computer is able to reconstruct pictures of the body’s interior...
The technique’s most important feature is its [enormous] sensitivity. It allows soft tissue such as the liver and kidneys to be clearly differentiated, which radiographs cannot do…
It can also very accurately measure the values of X-ray absorption of tissues, thus enabling the nature of tissue to be studied.
These capabilities are of great benefit in the diagnosis of disease, but CT additionally plays a role in the field of therapy by accurately locating, for example, a tumour so indicating the areas of the body to be irradiated and by monitoring the progress of the treatment afterwards...
Famous scientists and engineers often have fascinating childhoods. Learn about Hounsfield’s youth by reading these excerpts from his Nobel biographical statement.
I was born and brought up near a village in Nottinghamshire and in my childhood enjoyed the freedom of the rather isolated country life. After the first world war, my father had bought a small farm, which became a marvellous playground for his five children… At a very early age I became intrigued by all the mechanical and electrical gadgets which even then could be found on a farm; the threshing machines, the binders, the generators. But the period between my eleventh and eighteenth years remains the most vivid in my memory because this was the time of my first attempts at experimentation, which might never have been made had I lived in a city… I constructed electrical recording machines; I made hazardous investigations of the principles of flight, launching myself from the tops of haystacks with a home-made glider; I almost blew myself up during exciting experiments using water-filled tar barrels and acetylene to see how high they could be waterjet propelled…

Aeroplanes interested me and at the outbreak of the second world war I joined the RAF as a volunteer reservist. I took the opportunity of studying the books which the RAF made available for Radio Mechanics and looked forward to an interesting course in Radio. After sitting a trade test I was immediately taken on as a Radar Mechanic Instructor and moved to the then RAF-occupied Royal College of Science in South Kensington and later to Cranwell Radar School. At Cranwell, in my spare time, I sat and passed the City and Guilds examination in Radio Communications. While there I also occupied myself in building large-screen oscilloscope and demonstration equipment as aids to instruction...

It was very fortunate for me that, during this time, my work was appreciated by Air Vice-Marshal Cassidy. He was responsible for my obtaining a grant after the war which enabled me to attend Faraday House Electrical Engineering College in London, where I received a diploma.
I joined the staff of EMI in Middlesex in 1951, where I worked for a while on radar and guided weapons and later ran a small design laboratory. During this time I became particularly interested in computers, which were then in their infancy… Starting in about 1958 I led a design team building the first all-transistor computer to be constructed in Britain, the EMIDEC 1100

I was given the opportunity to go away quietly and think of other areas of research which I thought might be fruitful. One of the suggestions I put forward was connected with automatic pattern recognition and it was while exploring various aspects of pattern recognition and their potential, in 1967, that the idea occurred to me which was eventually to become the EMI-Scanner and the technique of computed tomography...
Happy birthday, Godfrey Hounsfield. Your life and work made a difference.

Watch “The Scanner Story,” a documentary made by EMI about their early computed tomography brain scanners. The video, filmed in 1978, shows its age but is engaging.

Part Two of “The Scanner Story.”