Friday, September 18, 2009

More on "Is Computed Tomography Safe?"

The December 7, 2007 entry of this blog was titled “Is Computed Tomography Safe?” As is often the case with such a difficult question, the answer is yes and no. No—there are risks associated with any exposure to ionizing radiation, so no procedure is entirely safe. Yes—in most cases the risks are small enough that the benefits outweigh them. To answer this question more precisely, a large-scale study of nearly one million patients was conducted over three years. The conclusions were reported in the August 27, 2009 issue of the New England Journal of Medicine. The abstract of Fazel et al.’s paper “Exposure to Low-Dose Ionizing Radiation from Medical Imaging Procedures” (NEJM, Volume 361, Pages 849–857, 2009) is reproduced below.
Background: The growing use of imaging procedures in the United States has raised concerns about exposure to low-dose ionizing radiation in the general population.

Methods: We identified 952,420 nonelderly adults (between 18 and 64 years of age) in five health care markets across the United States between January 1, 2005, and December 31, 2007. Utilization data were used to estimate cumulative effective doses of radiation from imaging procedures and to calculate population-based rates of exposure, with annual effective doses defined as low (3 mSv or less), moderate (greater than 3 to 20 mSv), high (greater than 20 to 50 mSv), or very high (greater than 50 mSv).

Results: During the study period, 655,613 enrollees (68.8%) underwent at least one imaging procedure associated with radiation exposure. The mean (±SD) cumulative effective dose from imaging procedures was 2.4±6.0 mSv per enrollee per year; however, a wide distribution was noted, with a median effective dose of 0.1 mSv per enrollee per year (interquartile range, 0.0 to 1.7). Overall, moderate effective doses of radiation were incurred in 193.8 enrollees per 1000 per year, whereas high and very high doses were incurred in 18.6 and 1.9 enrollees per 1000 per year, respectively. In general, cumulative effective doses of radiation from imaging procedures increased with advancing age and were higher in women than in men. Computed tomographic and nuclear imaging accounted for 75.4% of the cumulative effective dose, with 81.8% of the total administered in outpatient settings.

Conclusions: Imaging procedures are an important source of exposure to ionizing radiation in the United States and can result in high cumulative effective doses of radiation.
To help put this study in context, the NEJM published an accompanying editorial by Michael Lauer (“Elements of Danger—The Case of Medical Imaging,” Volume 361, Pages 841–843). Lauer writes that
Because the use of ionizing radiation carries “an element of danger in every . . . procedure,” we need to adopt a new paradigm for our approach to imaging. Instead of investing so many resources in performing so many procedures, we should take a step back and design and execute desperately needed large-scale, randomized trials to figure out which procedures yield net benefits. This approach would require leadership and courage on the part of our profession, our opinion leaders, and the research enterprise, but were we to insist that all, nearly all, procedures be studied in well-designed trials, we could answer many critical clinical questions within a short time. Because we will continue to be uncertain of the magnitude of harm, an accurate understanding of the magnitude of benefit is a moral imperative.
In Chapter 16 of the 4th edition of Intermediate Physics for Medicine and Biology, Russ Hobbie and I discuss the risk of radiation. While we do not provide a final answer regarding the safety of CT, we do outline many of the important issues one must examine in order to make an informed decision. The safety of computed tomography and other diagnostic imaging procedures will continue to be a crucial question of interest to readers of Intermediate Physics for Medicine and Biology. I will try to keep you posted as new information becomes available.

P.S. Thanks to Russ Hobbie for calling my attention to this paper. He reads the New England Journal of Medicine more than I do.

Friday, September 11, 2009

A New Homework Problem

While in Minneapolis last week, attending the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society, I had the pleasure of co-chairing a session with Professor Michael Joy of the University of Toronto. Joy has done some fascinating work on measuring current and conductivity in biological tissue using magnetic resonance imaging. His research inspired me to write a new homework problem for Chapter 8 of the 4th edition of Intermediate Physics for Medicine and Biology.
Problem 21.5 The differential form of Ampere’s law, derived in Problem 21, provides a relationship between the current density J and the magnetic field B that allows you to measure biological current with magnetic resonance imaging [see, for example, Scott, G. C., M. L. G. Joy, R. L. Armstrong, and R. M. Henkelman (1991) “Measurement of Nonuniform Current Density by Magnetic Resonance,” IEEE Transactions on Medical Imaging, Volume 10, Pages 362–374]. Suppose you use MRI and find the distribution of magnetic field to be

Bx = C (y z² − y x²)
By = C (x z² − x y²)
Bz = 4 C x y z

where C is a constant with the units of T/m³. Determine the current density. Assume the current varies slowly enough that the displacement current can be neglected.
To solve this problem, you need the result of Problem 21 in Chapter 8, which asks the reader to derive the differential form of Ampere’s law from the integral form given in the book by Eq. 8.11. If I were teaching a class from the book, I would assign both Problems 21 and 21.5, and expect the student to solve them both. But for readers of this blog, I will tell you the answer to Problem 21 (ignoring displacement current), so you will have the relationship needed to solve the new Problem 21.5: curl B = μ0 J. The curl is introduced in Section 8.6. If you don’t have the 4th edition of Intermediate Physics for Medicine and Biology handy, take a look in a math handbook for information about how to calculate the curl (or see Schey’s book Div, Grad, Curl, and All That).
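If you want to check your own answer to the new Problem 21.5, the curl is easy to compute symbolically. Below is a short sketch using SymPy (my own addition, not from the book); I won’t spoil the answer by printing it here, but running the script gives the current density.

```python
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
C, mu0 = sp.symbols('C mu_0', positive=True)

# The magnetic field given in the new Problem 21.5
Bx = C * (y*z**2 - y*x**2)
By = C * (x*z**2 - x*y**2)
Bz = 4 * C * x * y * z

# A physical magnetic field must be divergence-free; check that first.
div_B = sp.diff(Bx, x) + sp.diff(By, y) + sp.diff(Bz, z)
assert sp.simplify(div_B) == 0

# Ampere's law, displacement current neglected: J = (curl B) / mu_0
Jx = (sp.diff(Bz, y) - sp.diff(By, z)) / mu0
Jy = (sp.diff(Bx, z) - sp.diff(Bz, x)) / mu0
Jz = (sp.diff(By, x) - sp.diff(Bx, y)) / mu0

print(sp.simplify(Jx), sp.simplify(Jy), sp.simplify(Jz))
```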

The story of how you measure B using MRI is interesting, but a bit too complicated to describe in detail here. In brief, a magnetic resonance imaging device has a strong static magnetic field about which nuclear spins (such as those of hydrogen) precess. The magnetic field produced by the current density modifies the static magnetic field, causing a phase shift in this precession. This phase shift is detected, and the magnetic field can be deduced from it. Technically, this method allows one to determine the component of the magnetic field that is parallel to the static field. Obtaining the other components requires rotating the object and repeating the procedure. See Chapter 18 for more about MRI.
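To get a feel for the numbers, the accumulated phase is roughly the gyromagnetic ratio times the extra magnetic field times the accumulation time. The field strength and timing below are illustrative assumptions of my own, not values from the Scott et al. paper.

```python
import math

# Back-of-the-envelope estimate of the phase shift accumulated by
# precessing protons in a small extra field Delta-B.
gamma = 2.675e8    # proton gyromagnetic ratio, rad s^-1 T^-1
delta_B = 1e-9     # a 1 nT perturbation from tissue currents (assumed)
t = 50e-3          # 50 ms of phase accumulation (assumed)

phase_rad = gamma * delta_B * t
print(f"phase shift: {phase_rad:.4f} rad = {math.degrees(phase_rad):.2f} degrees")
```

Even a nanotesla-scale field produces a phase shift of a fraction of a degree over tens of milliseconds, which hints at why these measurements are so delicate.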

Send me an email (roth@oakland.edu) if you would like the answer to the new Problem 21.5.

Enjoy.

Friday, September 4, 2009

31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society

I’m posting this blog from Minneapolis, Minnesota, where I am attending the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society. The theme of the conference is “Engineering the Future of Biomedicine,” and there are many fascinating talks and posters that would interest readers of the 4th edition of Intermediate Physics for Medicine and Biology. Conference chair Bin He and his colleagues have put together a great meeting.

My friend Ranjith Wijesinghe and I have a poster later today about the “Effect of Peripheral Nerve Action Currents on Magnetic Resonance Imaging.” We analyzed whether the magnetic field of action currents can generate an artifact in an MRI, allowing direct imaging of biocurrents in the brain. There has been a lot of interest, and many publications, on this topic recently, but we conclude that the magnetic fields are just too small to have a measurable effect.

Last night, I got to hear Earl Bakken give a talk on “The History of Short-Term and Long-Term Pacing.” Bakken is a giant in the history of artificial pacemakers, and is the founder of Medtronic Corporation, based in Minneapolis. He talked about the early years when Medtronic was a small electronics laboratory in a garage. He recommended a 10-minute video on YouTube, which he said told his story well. He also quoted one of my favorite books, Machines in Our Hearts, a wonderful history of pacemakers and defibrillators. Tonight a social is being held at the Bakken Museum, “the only museum of its kind in the country, [where you can] learn about the history of electricity and magnetism and how it relates to medicine.” For a guy like me, this is great stuff.

Earl Bakken: Ready, Fire, Aim!

So far, the presentations are fascinating and inspirational. I must admit, the students who attend these conferences always stay the same age as I grow older. I don’t think these meetings used to be this exhausting for me. As that old Garth Brooks song says, “the competition’s getting younger.” The attendees are also getting more diverse. The speaker who welcomed us to the Bakken talk said that in just a few years, Americans will be a minority within the IEEE Engineering in Medicine and Biology Society. This is not difficult to believe, after seeing researchers from so many countries attending this year.

As I survey all the research presented at this meeting, I am proud that so much of the underlying science is described in Intermediate Physics for Medicine and Biology. I am more convinced than ever that Russ Hobbie and I have written a book that will be of great value to future biomedical engineers.

Friday, August 28, 2009

Resource Letter MPRT-1: Medical Physics in Radiation Therapy

When Russ Hobbie and I were preparing the 4th edition of Intermediate Physics for Medicine and Biology, we tried to update our book with the most recent references. But, inevitably, as time passes the book becomes increasingly out of date. How does one keep up with the literature? This blog is meant to help our readers stay current, but sometimes more drastic measures are required. Fortunately, the American Journal of Physics publishes “Resource Letters,” in which the author reviews important sources (mainly textbooks and research articles) on a particular topic. In the September 2009 issue of AJP, Steven Ratliff of Saint Cloud State University published “Resource Letter MPRT-1: Medical Physics in Radiation Therapy” (Volume 77, Pages 774–782, 2009). The abstract is reproduced below.
This resource letter provides a guide to the literature on medical physics in the field of radiation therapy. Journal articles, books, and websites are cited for the following topics: radiological physics, particle accelerators, radiation dose measurements, protocols for radiation dose measurements, radiation shielding and radiation protection, neutron, proton, and heavy-ion therapies, imaging for radiation therapy, brachytherapy, quality assurance, treatment planning, dose calculations, and intensity-modulated and image-guided therapy.
I highly recommend this Resource Letter for anyone interested in radiation therapy. Particularly useful is Ratliff’s concluding section “Recommended Path Through the Literature.”
The best single reference for a newcomer to the field is Goitein (Ref. 14). It is clear, up to date, readable, complete, and gives a good explanation of what medical physicists do. For a person who does not want to enter the field but is just curious or needs to get some information and does not want to spend any money, a good place to start is the free on-line book by Podgorsak (Ref. 153). Van Dyk (Ref. 17) is a good place to start for those who want a clinical emphasis. The book by Turner (Ref. 91) has good problems (some with answers) and covers many aspects of the subject.

For those wanting to make a career of Medical Physics, a small but good starting library would consist of Goitein (Ref. 14), Hendee et al. (Ref. 30), Johns and Cunningham (Ref. 15), Khan (Ref. 16), Podgorsak (Ref. 153), Turner (Ref. 91), and van Dyk (Ref. 17). Khan is more useful once you have learned the material. If you have more money, you could add Attix (Ref. 19) and Podgorsak’s book on radiation physics (Ref. 26). Cember and Johnson (Ref. 92) is a good addition if you are interested in the health-physics aspects of radiotherapy.

If you were restricted to one book and wanted to learn as much as possible, then the handbook of Mayles et al. (Ref. 18) is worthy of serious consideration.
The references Ratliff cites in his conclusion (less than 10% of the 183 publications included in the entire Resource Letter) are listed below.
14. Radiation Oncology—A Physicist's Eye View, Michael Goitein (Springer Science+Business Media, LLC, New York, 2008).

15. The Physics of Radiology, Harold Elford Johns and John Robert Cunningham, 4th ed. (Charles C. Thomas, Springfield, Illinois, 1983).

16. The Physics of Radiation Therapy, Faiz M. Khan, 3rd ed. (Lippincott Williams and Wilkins, Philadelphia, PA, 2003).

17. The Modern Technology of Radiation Oncology—A Compendium for Medical Physicists and Radiation Oncologists, Vols. 1 and 2, edited by Jacob Van Dyk (Medical Physics, Madison, WI, 1999 and 2005).

18. Handbook of Radiotherapy Physics—Theory and Practice, edited by P. Mayles, A. Nahum, and J. C. Rosenwald (Taylor & Francis, New York, 2007).

19. Introduction to Radiological Physics and Radiation Dosimetry, Frank Herbert Attix (Wiley-VCH, Weinheim, Germany, 1986).

26. Radiation Physics for Medical Physicists, E. B. Podgorsak (Springer-Verlag, New York, 2006).

30. Radiation Therapy Physics, William R. Hendee, Geoffrey S. Ibbott, and Eric G. Hendee, 3rd ed. (Wiley, Hoboken, NJ, 2005).

91. Atoms, Radiation, and Radiation Protection, James E. Turner, 2nd ed. (Wiley, New York, 1995).

92. Introduction to Health Physics, Herman Cember and Thomas E. Johnson, 4th ed. (McGraw-Hill Medical, New York, 2009).

153. Radiation Oncology Physics: A Handbook for Teachers and Students, edited by E. B. Podgorsak (International Atomic Energy Agency, Vienna, 2005). (www-naweb.iaea.org/nahu/dmrp/pdf_files/ToC.pdf)
By the way, if you look in the acknowledgments of Ratliff’s publication you will find the ubiquitous Russ Hobbie among those thanked for their helpful suggestions.

Friday, August 21, 2009

The ECG Dance

In Chapter 7 of the 4th edition of Intermediate Physics for Medicine and Biology, Russ Hobbie and I describe the electrocardiogram. I always thought that the best way to teach the ECG was with an online cardiac rhythm simulator. But now, thanks to a tip from my former student Debbie Janks, I have found an even better way to teach the ECG. Check out this video on YouTube. I will have to try this myself next time I teach Biological Physics.

Living Arrhythmias with John Grammer.
https://www.youtube.com/embed/deigcmtDV74

Friday, August 14, 2009

The Bell Curve

I was browsing through the 4th edition of Intermediate Physics for Medicine and Biology the other day (I do this sometimes; don’t ask why), and I noticed the footnote at the bottom of page 566 in Appendix H: The Binomial Probability Distribution, which states
See also A. Gawande, The bell curve. The New Yorker, December 6, 2004, pp. 82–91.
I thought to myself, “That must be one of the changes Russ Hobbie made when we were preparing the 4th edition, because I don’t remember ever reading the article.” Well, if Russ recommends it, then I want to read it, so I found the article on the web. It turns out to be a lovely, well-written piece about cystic fibrosis (CF), modern medicine, self-evaluation, and striving for excellence. The excerpt below is the one Russ probably had in mind when he added the citation of the article to our book. It describes a discussion between a teenage CF patient, Janelle, her physician Dr. Warwick, and the article’s author Atul Gawande, who is a surgeon and was observing Janelle’s interview as part of an effort to improve cystic fibrosis care. In the quote below, Janelle’s doctor is speaking.
“Let’s look at the numbers,” he said to me, ignoring Janelle. He went to a little blackboard he had on the wall. It appeared to be well used. “A person’s daily risk of getting a bad lung illness with CF is 0.5 per cent.” He wrote the number down. Janelle rolled her eyes. She began tapping her foot. “The daily risk of getting a bad lung illness with CF plus treatment is 0.05 per cent,” he went on, and he wrote that number down. “So when you experiment you’re looking at the difference between a 99.95-per-cent chance of staying well and a 99.5-per-cent chance of staying well. Seems hardly any difference, right? On any given day, you have basically a one-hundred-per-cent chance of being well. But”—he paused and took a step toward me—“it is a big difference.” He chalked out the calculations. “Sum it up over a year, and it is the difference between an eighty-three-per-cent chance of making it through 2004 without getting sick and only a sixteen-per-cent chance.”
He turned to Janelle. “How do you stay well all your life? How do you become a geriatric patient?” he asked her. Her foot finally stopped tapping. “I can’t promise you anything. I can only tell you the odds.”
In this short speech was the core of Warwick’s world view. He believed that excellence came from seeing, on a daily basis, the difference between being 99.5-per-cent successful and being 99.95-per-cent successful. Many activities are like that, of course: catching fly balls, manufacturing microchips, delivering overnight packages. Medicine’s only distinction is that lives are lost in those slim margins.
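Warwick’s blackboard arithmetic is easy to verify yourself: compound the daily probabilities of staying well over 365 days. Here is a quick check (my own illustration, not from the article).

```python
# Warwick's back-of-the-envelope calculation from the quoted passage:
# a 0.05% vs. 0.5% daily risk of a bad lung illness, compounded over a year.
daily_risk_with_treatment = 0.0005   # 0.05% per day
daily_risk_without = 0.005           # 0.5% per day
days = 365

p_well_with = (1 - daily_risk_with_treatment) ** days
p_well_without = (1 - daily_risk_without) ** days

print(f"Chance of staying well for a year, with treatment:    {p_well_with:.0%}")    # ~83%
print(f"Chance of staying well for a year, without treatment: {p_well_without:.0%}") # ~16%
```

The tiny daily difference compounds into exactly the 83 versus 16 per cent figures Warwick quotes.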
The article describes how one CF center began measuring its own success against the top programs in the country, and their efforts to improve. Gawande concludes
The hardest question for anyone who takes responsibility for what he or she does is, What if I turn out to be average? If we took all the surgeons at my level of experience, compared our results, and found that I am one of the worst, the answer would be easy: I’d turn in my scalpel. But what if I were a C? Working as I do in a city that’s mobbed with surgeons, how could I justify putting patients under the knife? I could tell myself, Someone’s got to be average. If the bell curve is a fact, then so is the reality that most doctors are going to be average. There is no shame in being one of them, right?
Except, of course, there is. Somehow, what troubles people isn’t so much being average as settling for it. Everyone knows that averageness is, for most of us, our fate. And in certain matters—looks, money, tennis—we would do well to accept this. But in your surgeon, your child’s pediatrician, your police department, your local high school? When the stakes are our lives and the lives of our children, we expect averageness to be resisted. And so I push to make myself the best. If I’m not the best already, I believe wholeheartedly that I will be. And you expect that of me, too. Whatever the next round of numbers may say.

Friday, August 7, 2009

Technetium Shortage....Again

Readers of this blog (are there any?) may recall two earlier entries, on December 14, 2007, and May 23, 2008, in which I discussed a shortage of technetium for medical imaging. It seems that this problem just won’t go away. According to a recent article in the New York Times, we are once again experiencing a global shortage of technetium, caused by the shutdown of nuclear reactors in Canada and the Netherlands. I fear that although the current shortage may be temporary, disruptions of the supply of technetium will recur with increasing frequency as nuclear power plants age. A reactor dedicated to technetium production in the United States would go a long way toward solving the problem, but would be expensive.

Russ Hobbie and I discussed technetium in the 4th edition of Intermediate Physics for Medicine and Biology. Technetium-99m—the key isotope of technetium for medical imaging—is a decay product of molybdenum-99, which in turn is a fragment produced during the fission of uranium. It is widely used in part because its 140-keV gamma emission and 6-hour half-life are particularly suited to nuclear medicine diagnostic procedures. 99mTc is often combined with other molecules to form radiopharmaceuticals, such as 99mTc-sestamibi and 99mTc-tetrofosmin, that serve as highly specific tracers. For more about the discovery of technetium, see the March 13, 2009 entry of this blog.
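The 6-hour half-life is part of why the supply chain is so fragile: 99mTc cannot be stockpiled. A quick calculation (my own illustration) shows how fast an initial activity decays away.

```python
import math

# Exponential decay of Tc-99m, which has a half-life of about 6 hours.
half_life_h = 6.0
decay_const = math.log(2) / half_life_h  # decay constant lambda, per hour

def fraction_remaining(t_hours):
    """Fraction of an initial Tc-99m activity remaining after t hours."""
    return math.exp(-decay_const * t_hours)

print(fraction_remaining(6.0))   # 0.5, by definition of the half-life
print(fraction_remaining(24.0))  # about 6%: a day-old dose is nearly gone
```

After four half-lives (one day), only 1/16 of the activity remains, which is why hospitals need a steady supply of fresh molybdenum-99 generators.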

Friday, July 31, 2009

Roberts Prize

One journal that readers of the 4th edition of Intermediate Physics for Medicine and Biology may enjoy is Physics in Medicine and Biology. Below is part of an editorial that recently appeared in PMB.
The publishers of Physics in Medicine and Biology (PMB), IOP Publishing, in association with the journal owners, the Institute of Physics and Engineering in Medicine (IPEM), jointly award an annual prize for the “best” paper published in PMB during the previous year.

The procedure for deciding the winner has been made as thorough as possible, to try to ensure that an outstanding paper wins the prize. We started off with a shortlist of the 10 research papers published in 2008 which were rated the best based on the referees’ quality assessments. Following the submission of a short “case for winning” document by each of the shortlisted authors, an IPEM college of jurors of the status of FIPEM assessed and rated these 10 papers in order to choose a winner, which was then endorsed by the Editorial Board.

It was a close-run thing between the top two papers this year. The Board feel that we have a very worthy winner... We have much pleasure in advising the readers of PMB that the 2008 Roberts Prize is awarded to J P Schlomka et al for their paper on multi-energy CT.
The abstract of the paper (J P Schlomka, E Roessl, R Dorscheid, S Dill, G Martens, T Istel, C Bäumer, C Herrmann, R Steadman, G Zeitler, A Livne and R Proksa, “Experimental Feasibility of Multi-Energy Photon-Counting K-Edge Imaging in Pre-Clinical Computed Tomography,” Physics in Medicine and Biology, Volume 53, Pages 4031–4047, 2008) is reproduced below
Theoretical considerations predicted the feasibility of K-edge x-ray computed tomography (CT) imaging using energy discriminating detectors with more than two energy bins. This technique enables material-specific imaging in CT, which in combination with high-Z element based contrast agents, opens up possibilities for new medical applications. In this paper, we present a CT system with energy detection capabilities, which was used to demonstrate the feasibility of quantitative K-edge CT imaging experimentally. A phantom was imaged containing PMMA, calcium-hydroxyapatite, water and two contrast agents based on iodine and gadolinium, respectively. Separate images of the attenuation by photoelectric absorption and Compton scattering were reconstructed from energy-resolved projection data using maximum-likelihood basis-component decomposition. The data analysis further enabled the display of images of the individual contrast agents and their concentrations, separated from the anatomical background. Measured concentrations of iodine and gadolinium were in good agreement with the actual concentrations. Prior to the tomographic measurements, the detector response functions for monochromatic illumination using synchrotron radiation were determined in the energy range 25 keV–60 keV. These data were used to calibrate the detector and derive a phenomenological model for the detector response and the energy bin sensitivities.
You can learn more about the Roberts Prize and the winning paper at the IOP’s excellent website http://medicalphysicsweb.org. I signed up for their weekly email, which is where I learned about this year’s winner. It is a great way for readers of Intermediate Physics for Medicine and Biology to keep up-to-date on recent breakthroughs in medical physics.

Friday, July 24, 2009

Two-Dimensional Image Reconstruction

In Section 12.4 of the 4th edition of Intermediate Physics for Medicine and Biology, Russ Hobbie and I discuss Two-Dimensional Image Reconstruction from Projections by Fourier Transform. The method is summarized in our Fig. 12.20: i) perform a 1-D Fourier transform of the projection at each angle θ, ii) convert from polar coordinates (k, θ) to Cartesian coordinates (kx, ky), and iii) perform an inverse 2-D Fourier transform to recover the desired image.
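These steps can be checked numerically for a simple object. The sketch below (my own illustration, not from the book) uses NumPy to verify the central-slice theorem for a centered Gaussian: the 1-D Fourier transform of its projection matches the ky = 0 slice of its 2-D Fourier transform.

```python
import numpy as np

# Central-slice check for a centered Gaussian (sigma = 1) sampled on a grid.
n, L = 256, 20.0
xs = np.linspace(-L/2, L/2, n, endpoint=False)
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs)
f = np.exp(-(X**2 + Y**2) / 2.0)

# Step i: project the object along y, then take a 1-D Fourier transform.
projection = f.sum(axis=0) * dx
ft_projection = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(projection)))

# For comparison: the ky = 0 slice of the full 2-D Fourier transform.
ft2d = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(f)))
central_slice = ft2d[n // 2, :] * dx  # scale so the two transforms agree

error = np.max(np.abs(ft_projection - central_slice))
print(f"max difference between the two transforms: {error:.2e}")
```

The two transforms agree to machine precision, which is the content of step ii in Fig. 12.20: each projection’s 1-D transform fills in one radial line of the 2-D frequency plane.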

I wanted to include in our book some examples where this procedure could be done analytically, thinking that they would give the reader a better appreciation of what is involved in each step of the process. The result was two new homework problems in Chapter 12: Problems 23 and 24. In both problems, we provide an analytical expression for the projection, and the reader is supposed to perform the necessary steps to find the image. Both problems involve the Gaussian function, because the Gaussian is one of the few functions for which the Fourier transform can be calculated easily. (Well, perhaps “easily” is in the eye of the beholder, but by completing the square in the exponent the process is fairly straightforward.)
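For readers who want to see the completing-the-square trick spelled out, here is the key Fourier integral of a Gaussian (a standard result, included here as my own aside):

```latex
\int_{-\infty}^{\infty} e^{-x^2/2\sigma^2}\, e^{-ikx}\, dx
  = \int_{-\infty}^{\infty}
    \exp\!\left[-\frac{\left(x + i\sigma^2 k\right)^2}{2\sigma^2}
                - \frac{\sigma^2 k^2}{2}\right] dx
  = \sqrt{2\pi}\,\sigma\, e^{-\sigma^2 k^2/2} .
```

The Fourier transform of a Gaussian is again a Gaussian, with the width in frequency space inversely proportional to the width in real space.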

I recall spending considerable time coming up with examples that are simple enough to assign as a homework problem, yet complicated enough to be interesting. One could easily do the case of a Gaussian centered at the origin, but then the projection has no angular dependence, which is dull. I tried hard to find examples that were based on functions other than the Gaussian, but never had any success. If you, dear reader, can think of any such examples, please let me know. I would love to have a third problem that I could use on an exam next time I teach medical physics.

For anyone who wants to get a mathematical understanding of image reconstruction from projections by Fourier transform, I recommend solving Problems 23 and 24. But you won’t learn everything. For instance, in medical imaging the data are discrete, as compared to the continuous functions in these homework problems. This particularly complicates the middle step: transforming from polar to Cartesian coordinates in frequency space. Such a transformation is almost trivial in the continuous case, but more difficult using discrete data (see Problem 20 in Chapter 12 for more on that process). Nevertheless, I have found that performing the reconstruction in a couple of specific cases is useful for understanding the algorithm better.

Problems 23 and 24 are a bit more difficult than the average homework problem in our book. The student needs to be comfortable with Fourier analysis. But there is something fun about these problems, especially if you are fond of treasure hunts. I find it exciting to know that there is a fairly simple function f(x,y) representing an object, and that it can be determined from projections F(θ,x') by a simple three-step procedure. Perhaps it mimics, in a very simplistic way, the thrill that developers of computed tomography must have felt when they were first able to obtain images by measuring projections.

If you get stuck on these two problems, contact Russ or me about obtaining the solution manual. Enjoy!


P.S. The Oakland University website is currently undergoing some changes. For the moment, if you have trouble accessing the book website, try http://personalwebs.oakland.edu/~roth/hobbie.htm. I hope to have a more permanent home for the website soon.

Friday, July 17, 2009

Random Walks in Biology

Random Walks in Biology,
by Howard Berg.
In Chapter 4 of the 4th edition of Intermediate Physics for Medicine and Biology, Russ Hobbie and I discuss the role of diffusion in biology. One source we cite in this chapter is Random Walks in Biology, by Howard Berg. Below is the introduction to this fascinating book, which I recommend highly. In particular, I love Berg’s first sentence.
Biology is wet and dynamic. Molecules, subcellular organelles, and cells, immersed in an aqueous environment, are in continuous riotous motion. Alive or not, everything is subject to thermal fluctuations. What is this microscopic world like? How does one describe the motile behavior of such particles? How much do they move on the average? Questions of this kind can be answered only with an intuition about statistics that very few biologists have. This book is intended to sharpen that intuition. It is meant to illuminate both the dynamics of living systems and the methods used for their study. It is not a rigorous treatment intended for the expert but rather an introduction for students who have little experience with statistical concepts.

The emphasis is on physics, not mathematics, using the kinds of calculations that one can do on the back of an envelope. Whenever practical, results are derived from first principles. No reference is made to the equations of thermodynamics. The focus is on individual particles, not moles of particles. The units are centimeters (cm), grams (g), and seconds (sec).

Topics range from the one-dimensional random walk to the motile behavior of bacteria. There are discussions of Boltzmann’s law, the importance of kT, diffusion to multiple receptors, sedimentation, electrophoresis, and chromatography. One appendix provides an introduction to the theory of probability. Another is a primer on differential equations. A third lists some constants and formulas worth committing to memory. Appendix A should be consulted while reading Chapter 1 and Appendix B while reading Chapter 2. A detailed understanding of differential equations or the methods used for their solution is not required for an appreciation of the main theme of this book.