by Isaac Asimov 1984
Book Review by Ray Herrmann
Asimov begins this history of science with the theories of Aristotle (384-322 BC) and gives us a feel for how our yearning to understand has led us to ever more sophisticated theories, and how these developments combined to give us the power to see and understand even more.
Take motion, for example: Aristotle noticed that every object ("earthy material") seemed to get as close to the center of the earth as possible. Hence, he deduced that the earth must be round. But Aristotle also felt that the motion of a thrown object was driven by an "impulse" imparted to the object by the air (so in a vacuum there would be no motion). In Aristotle's time, knowledge was driven by philosophy, not by experimentation. Yet his theory of motion held for about 2000 years, until experimentation provided guidance.
It was in the time of Galileo (1564-1642) that experimentation was born and Aristotle's theory of motion was revisited. This culminated in Isaac Newton's publication of Mathematical Principles of Natural Philosophy in 1687. Newton's three laws of motion led to an understanding that is valid even today (except under relativistic conditions). Law 1): A body at rest remains at rest, and a body in motion remains in uniform motion, with constant speed in a straight line, unless it is acted on by an unbalanced external force. This led to the concepts of inertia, mass (vs. weight), gravity, forces and vectors. Law 2): Of which the first law is really a special case, states that the rate of change of the motion of a mass is directly proportional to the external force acting on it (F = ma). Law 3): To every action there is always opposed an equal and opposite reaction.
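The second law can be illustrated with a minimal sketch (my own, not from the book); the numbers here are arbitrary examples:

```python
# Newton's second law, F = m*a: the same force produces
# less acceleration on a larger mass.
def acceleration(force_newtons, mass_kg):
    """Return acceleration in m/s^2 from F = m*a."""
    return force_newtons / mass_kg

# A 10 N force accelerates a 2 kg mass at 5 m/s^2...
a_small = acceleration(10.0, 2.0)
# ...but a 5 kg mass at only 2 m/s^2.
a_large = acceleration(10.0, 5.0)
```

This inverse relationship between mass and acceleration is exactly the "inertia" the first law describes.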
Together these laws led to the development of "Conservation of Momentum", rotational motion, work, energy, the pendulum, and even to an understanding of vibrations and sound. These concepts are all neatly explained as Asimov takes us on a progression of knowledge without resorting to calculus or complicated math.
Also developed are the concepts of atoms (John Dalton, 1766-1844), solids, liquids, gases, pressure, surface tension and viscosity. Throughout the book, Asimov clearly defines the units of each attribute in the British and metric (mks & cgs) systems.
Sound is developed in particular detail, explaining such attributes as loudness, pitch, beats, the Doppler effect, reflections, sonar, resonances and timbre. The scale of the piano is used to personalize chords and the scale: do, re, mi, fa, sol, la, ti, do. (Try playing just the white keys starting anywhere except "C". How does it sound?)
The expansion of gases with temperature was first studied in 1699 by the French physicist Guillaume Amontons. At that time air was the only gas available, but over the next 100 years many more gases became available. The fractional change in volume per degree (relative to the volume at 0ºC) seemed common to all gases, such that extrapolating to zero volume implied a limiting temperature of -273ºC. This led to the establishment of an "absolute temperature" scale starting at 0 K (zero kelvins) = -273ºC. So water freezes at 273 K (0ºC) and boils at 373 K (100ºC). Using the Kelvin scale simplified the gas law formulas.
This was followed by the study of heat and the development of the "Kinetic Theory of Gases", which implied that gases were composed of very tiny particles called atoms that were in motion. This new understanding fit well with the emerging understandings of work, of the momentum of tiny atoms, of vibrations as energy, and of liquids as sliding atoms. It also fit with the interpretation of solids as atoms that were (still) vibrating, but with too little amplitude to break free from their attractive forces.
The flow of heat also became understood as an exchange of momentum at the atomic level. The development of the steam engine in 1769 by James Watt spurred the development of the "Thermodynamics" branch of physics. There, it was proved that the efficiency of any possible heat engine, no matter how perfect, was governed by the difference between the "hot" temperature (expressed in kelvins) and the colder discharge temperature (in kelvins), where an efficiency of 1 (=100%) could be reached only when the discharge temperature was at absolute zero.
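This efficiency limit (Carnot's theorem) can be sketched directly; the example temperatures below are my own, not from the book:

```python
# Carnot's limit: no heat engine can exceed efficiency = 1 - Tc/Th,
# with both temperatures in kelvins.
def carnot_efficiency(t_hot_k, t_cold_k):
    return 1.0 - t_cold_k / t_hot_k

# Steam at 373 K discharging to room air at 293 K: at best about 21%.
eta = carnot_efficiency(373.0, 293.0)

# Efficiency reaches 1 (100%) only when the discharge is at absolute zero.
perfect = carnot_efficiency(373.0, 0.0)
```

Note that the formula involves only the two temperatures, never the working fluid, which is why the result holds "no matter how perfect" the engine.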
Note that man’s evolving understanding led both to capabilities that allowed us to explore Nature (confined gas containers, thermometers, accurate timepieces) and to machining abilities used to take advantage of this knowledge (like building steam engines).
That light travels in straight lines is inherent in our understanding, since objects exist in the direction we are looking. Earlier studies of water waves and of sound waves suggested that light might also be a wave; however, studies of projectiles suggested light might be a particle. The intensity of light was found to diminish with the square of the distance, as does gravity. This diminishing with distance varies exactly as the surface area of a sphere increases, suggesting that these phenomena are simply spread out over the enclosing area.
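The spreading-over-a-sphere picture translates directly into the inverse-square law; a minimal sketch (my own illustration, with an arbitrary 100 W source):

```python
import math

# A source's output spreads over a sphere of area 4*pi*r^2,
# so intensity falls off with the square of the distance.
def intensity(power_watts, distance_m):
    return power_watts / (4.0 * math.pi * distance_m ** 2)

near = intensity(100.0, 1.0)
far = intensity(100.0, 2.0)   # doubling the distance quarters the intensity
```

The same geometric argument applies to gravity, which is why both laws share the 1/r^2 form.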
The law of reflection at equal angles had been known since ancient times (Euclid, 300 BC) and was refined in the 1600s. Curved mirrors were examined using the same straight-line projections as for plane mirrors. From that, lenses were studied and the (approximate) "lens law" came about [1/Do + 1/Di = 1/f], where f is the focal length. By the Middle Ages, spectacles had been developed and the workings of the eye were examined.
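The lens law is easy to put to work; a small sketch solving it for the image distance (the 30 cm / 10 cm example is mine, not from the book):

```python
# Thin-lens law: 1/Do + 1/Di = 1/f, solved for the image distance Di.
def image_distance(d_object, focal_length):
    return 1.0 / (1.0 / focal_length - 1.0 / d_object)

# An object 30 cm from a lens of 10 cm focal length
# forms an image 15 cm behind the lens.
di = image_distance(30.0, 10.0)
```

Moving the object closer to the focal point pushes Di out toward infinity, which matches the ray-tracing picture the law was derived from.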
The pinhole camera was described by Giambattista della Porta in the 1500s, and Leonardo da Vinci had earlier made use of the principle. In 1637 Descartes published the law of refraction (an effect noticed since ancient times, where a stick poked into the water seems bent at the surface). Around 1814, the German optician Joseph von Fraunhofer noticed a spectrum of fine lines coming from his prisms. This (and comparisons with water wave behavior) led to studies of the characteristic light emissions of the elements.
Experiments done in 1899 on the light frequencies emitted at various temperatures (in kelvins) produced the unexpected result that even though the frequency distribution of light shifted upward with temperature, the amount of energy emitted actually decreased at the higher frequencies. Until 1900 it was assumed that light could be emitted in a continuous stream of energies, increasing with temperature (which didn't fit observations). Then Max Planck tried the assumption that light was emitted only in discrete quanta, with energy proportional to the frequency of the radiation. Adjusting the proportionality constant (h) to fit experimental data gave a value of h = 6.6256 × 10^-27 erg-seconds (a very tiny number). Later, in 1905, Einstein showed that energy was also absorbed only in these same discrete quanta (the photoelectric effect). This new understanding separated the old "classical physics" from the new "modern physics", where light was now considered to be emitted or absorbed in discrete "quanta". Thus light seemed to behave as a "particle" or a "wave" depending on the circumstances.
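Planck's relation E = h*f shows just how tiny a single quantum is; a sketch using the cgs value h = 6.6256 × 10^-27 erg-seconds (the green-light frequency below is my own example):

```python
# Planck's quantum hypothesis: light energy comes in packets E = h*f.
H_ERG_SECONDS = 6.6256e-27   # Planck's constant in cgs units

def photon_energy_erg(frequency_hz):
    return H_ERG_SECONDS * frequency_hz

# Green light (~5.5e14 Hz) carries only ~3.6e-12 erg per quantum --
# so small that everyday light appears perfectly continuous.
e_green = photon_energy_erg(5.5e14)
```

The tininess of h is why the graininess of light escaped notice until experiments probed emission at high frequencies.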
I will elaborate in more detail on this subject, since understanding it so drastically affected our technological development. It should be noted, however, that Isaac Asimov devoted roughly equal detail to explaining the history of prior accomplishments. Still, in this report, I have left so much out!
Static electricity was known in ancient times, but magnetism received more study over about 2000 years because it seemed stronger and was found naturally (as lodestone). A magnetic material floating on a piece of cork was seen to always point one end toward the North Pole; hence the terms "north pole" and "south pole". However, comparison with the stars indicated that magnetic north was slightly different from the star-indicated pole. Additionally, as one approached the magnetic North Pole, the magnet was seen to dip downward, implying that the Earth's magnetism came from inside the earth.
It wasn't until Galileo's time (~1600) that scientific measurements took precedence over philosophical reasoning. Like Newton's gravitational force, light intensity, magnetic strength and the electric (static) force were all found to diminish with the square of the distance from the source.
Measurements on various materials showed that some materials (metals) conveyed the electric force while others did not. The "capacitor" was invented in 1745 with the Leyden jar, which could store five times as much charge as could two plates separated by air. This greatly facilitated the study of "charge" and of the concept of flow as charges were transferred among Leyden jars. The great experimenter Michael Faraday (1791-1867) measured electric charge as being proportional to the potential difference present (a unit of charge, the faraday, was later defined as 96,500 coulombs).
Electric batteries were accidentally discovered by the Italian physician Luigi Galvani in 1791 as he was studying the response of frog legs to electric charges from a Leyden jar. He noticed that the legs also twitched when a metal scalpel touched the muscle. This was further elaborated by Alessandro Volta, who in 1800 discovered that charge generation occurred when dissimilar metals were separated by a salty solution. He could also stack these "batteries" to produce proportionally more electric potential.
This discovery (though not understood for another century) provided the first continuous flow of "galvanic electricity", which then led to new experiments (and ultimately to our understanding of elements as atoms and to various atomic models). Electrolysis rendered metallic silver from a silver salt, and the weight deposited was seen to be proportional to the charge spent. This led to defining the "coulomb" as the amount of charge that renders 1.18 milligrams of silver, followed by the defining of the ampere (1 coulomb per second). The concept of "resistance" to flow was then developed, and series/parallel combinations were worked out.
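The bookkeeping behind these definitions is simple; a sketch (the 2 A / 60 s example is mine):

```python
# Electrolysis arithmetic: each coulomb of charge deposits 1.18 mg of
# silver, and 1 ampere delivers 1 coulomb per second.
SILVER_MG_PER_COULOMB = 1.18

def silver_deposited_mg(current_amperes, seconds):
    charge_coulombs = current_amperes * seconds
    return SILVER_MG_PER_COULOMB * charge_coulombs

# A 2 A current run for 60 s transfers 120 coulombs,
# depositing 141.6 mg of silver.
mg = silver_deposited_mg(2.0, 60.0)
```

Weighing the deposited metal was thus a practical way to measure total charge long before electrical meters existed.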
Magnetism was found to be associated with the flow of an electric current, accidentally, by Hans Christian Oersted in 1819, when he noticed that a magnetic needle pointed perpendicular to a wire when electric current flowed. Reversing the current flow caused the opposite end of the needle to swing toward the wire. More dramatic experiments with iron filings on a sheet of paper delineated the "lines of force" circling the wire (lines such as had been shown running from the poles of a magnet).
In 1825, experimenter William Sturgeon wrapped 18 turns of bare copper wire around a U-shaped iron bar and produced an "electromagnet". Then, in 1829, physicist Joseph Henry wrapped hundreds of turns of insulated copper wire around an iron bar, producing a very strong magnet. Now magnetic fields of unprecedented strength could be produced and studied. There followed the concept of "inductance" and the development of solenoids (the bell ringer).
Since the flow of electric current induced a proportional magnetic strength, it was thought that the introduction of a magnetic field might conversely induce a flow of electric current. Early experiments found that it was the changing magnetic flux (not the magnetic flux itself) that induced a proportional current flow. This led to the development of electric generators, electric motors, current/voltage transformers and the telephone.
While the interrelationship between electric and magnetic effects was becoming obvious, it was the development of a mathematical relationship between them in 1864 by James Clerk Maxwell that put it all together (called Maxwell's Equations). His equations also predicted electromagnetic radiation that would travel at the speed of light, thus implying that light itself was a form of electromagnetic radiation (as we now realize). Further experiments with electric flows between (+) and (-) terminals separated by a vacuum led to the discovery of X-rays.
The invention of the battery made possible a flurry of experimentation with electrolysis (where simple compounds are broken down by electric charge into their constituent parts). Measuring the proportions and weights produced per coulomb suggested that these compounds were composed of discrete "elements" and that the weights produced were proportional to the amount of charge (coulombs). Additionally, these weights were found to occur in multiples of the weight of the lightest element found (hydrogen), and also to be unique for each element. These discoveries fostered the development of both atomic theory and chemistry.
Results of these studies were summarized exquisitely by the Russian chemist Dmitri Mendeleev in 1869 with the compilation of the "Periodic Table", where he listed over 60 then-known elements in order by their measured "atomic weights" and also according to their known properties. This table pointed to three specific vacancies, which were then searched for and found, strengthening the acceptance of the Periodic Table.
Based on the study of gases, Amedeo Avogadro hypothesized that equal volumes of gases contain equal numbers of molecules. In 1865, James Clerk Maxwell and Ludwig Boltzmann estimated the number of atoms in an amount of an element equal to its atomic weight in grams (now called "Avogadro's Number") at 6 × 10^23.
In 1911, Charles Barkla noticed that X-rays generated by bombarding various elements with a stream of electrons were able to penetrate different thicknesses of matter. He grouped these, from the hardest, as the K-series, L-series, and M-series. Then, in 1912, physicist Max von Laue suspected that crystals, if they consisted of neat rows of molecules, might serve as more finely spaced diffraction gratings than could be manufactured. It worked! Using a crystal of zinc sulfide, he was able to produce a diffracted pattern of dots instead of a single centered dot, thus proving that X-rays were waves (and of extremely short wavelength). That same year, William Henry Bragg analyzed these X-rays in more detail and was able to calculate their wavelengths. It was noticed that the wavelength got smaller (implying more energy) as the atomic number of the target went up.
Since atoms were neutral, yet electrolysis liberated them in proportion to the (-) charge, it seemed reasonable to assume that positive charges existed in just the quantities required to give each atom a neutral charge. In 1913, Henry Moseley assumed that the lightest element (hydrogen) contained one (+) charge and that each element of sequentially increasing weight contained an additional (+) charge. This assignment, called "atomic number", seemed to fit better with the periodic table order, as some elements of higher atomic weight were switched with their neighbors so that all could be listed in order of their properties. Each additional (+) charge supposedly had a corresponding (-) electron, such that the net charge was zero.
Further comparisons of the "inert gases" column of the periodic table, along with the increasing energy of X-rays with atomic number, led to the theory that electrons might be in orbits around the core's (+) charges, and each set of orbits was referred to as the K-shell, L-shell, M-shell, etc. As each shell was filled, additional electrons occupied the next higher shell. There was a tendency for atoms to share electrons in order to complete each shell; but once a shell was filled, those elements would not be inclined to share, and thus were the inert gases. In this way, molecular bonds were explained.
However, closer examination of the energy levels of the radiation emitted or absorbed by elements led from the concept of "orbital shells" to a complex arrangement of "energy levels". This new view seemed to explain electrical insulators and conductors (metals have overlapping energy levels which allow electrons to move freely among them). Working with "energy levels", where a sufficiently energetic photon could be absorbed by boosting an electron to a higher level, and radiated when the electron "fell back", led to the development of semiconductors, transistors, masers and lasers, giving strength to this point of view.
During these developments, it seemed that photons sometimes acted like waves but at other times behaved like particles. In 1923, Louis de Broglie proposed that this dual wave/particle view could also be applied to electrons, with wavelength = h/mv, where (h) is Planck's constant. This revised view led to the development of the electron microscope in 1931.
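A quick sketch of de Broglie's relation in cgs units shows why electron waves are useful for microscopy (the electron speed below is my own example):

```python
# De Broglie's matter waves: wavelength = h / (m*v).
H_ERG_SECONDS = 6.6256e-27    # Planck's constant, erg-seconds (cgs)
ELECTRON_MASS_G = 9.11e-28    # electron mass in grams

def de_broglie_wavelength_cm(mass_g, velocity_cm_s):
    return H_ERG_SECONDS / (mass_g * velocity_cm_s)

# An electron moving at 1e9 cm/s (a few percent of light speed) has a
# wavelength around 7e-9 cm -- thousands of times shorter than visible
# light, so it can resolve correspondingly finer detail.
lam = de_broglie_wavelength_cm(ELECTRON_MASS_G, 1.0e9)
```

Since wavelength shrinks as momentum grows, faster electrons give an electron microscope even finer resolution.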
In 1896, physicist Antoine Becquerel was exploring fluorescent materials to see if they emitted X-rays (as discovered by Röntgen a year earlier). He put a compound containing uranium on a photographic plate covered by black paper. He was surprised to find the photographic plate darkened even when the compound was not exposed to sunlight. The compound, it seemed, was spontaneously emitting X-rays.
Further investigations found that uranium and thorium spontaneously emitted radiation. The 1911 invention of the "cloud chamber" by Charles Wilson produced vapor trails of the emitted "particles", showing that many had their trajectories curved by a magnetic field: some curved one way while others curved oppositely, and the amount of curvature was proportional to the charge/mass ratio. This, along with measurements of the electron's charge/mass ratio, led to the identification of "alpha particles" (two protons with two neutrons) and "beta particles" (ejected electrons). Very energetic "gamma rays" were soon found as well.
A rash of studies identified the few elements that were spontaneous emitters (radioactive). Studying these elements and meticulously measuring the masses of the remains led to the discovery of "isotopes" (elements that were heavier or lighter than expected, but always by a discrete amount). Decomposition pathways were identified for the "uranium series" and the "thorium series". Rate measurements of the cascade of particles led to the representation of their "half-lives".
Measuring the current proportions of the starting (radioactive) and ending (stable) elements in a series (the end product is usually lead) made it possible to determine the dates when those solids were formed. The oldest rocks found became evidence of the age of the earth (over 4 billion years).
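The dating arithmetic follows directly from the half-life idea; a minimal sketch (the uranium-238 half-life of ~4.5 billion years is the standard figure, and the 50% example is mine):

```python
import math

# Radiometric dating: from the fraction of the radioactive parent that
# remains, the age follows from the half-life. Each half-life halves
# the remaining amount, so age = half_life * log2(1 / fraction).
def age_years(fraction_remaining, half_life_years):
    return -half_life_years * math.log2(fraction_remaining)

# If half the uranium-238 (half-life ~4.5 billion years) in a rock has
# decayed away to lead, the rock is about 4.5 billion years old.
age = age_years(0.5, 4.5e9)
```

A rock with only a quarter of its parent element left would be two half-lives old, and so on, which is how the oldest rocks put a lower bound on the age of the earth.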
An extensive list of stable and unstable isotopes along with their breakdown emissions led to the theories of nuclear structure, which fostered the development of Nuclear Medicine (tracers), Nuclear Electric Power generation and the Fission and Fusion (hydrogen) bombs.
Further explorations, driven by the development of particle accelerators starting around 1931, have led to new theories and to the discoveries of "anti-particles", the strong and weak forces, quarks, and the current "Standard Model" of particle physics. The book ends in 1983 with a mention of "Grand Unified Theories", which seem to explain three of the four forces (electromagnetic, strong, and weak, but not the gravitational force).
The theme is that "scientists" are the curious and motivated ones who devise tests and gather data, and then formulate plausible theories around them. Their continual compiling of ever deeper observations and theories is what drives our technological advance, which is often expressed in new products and new powers.
The explanations in Asimov's book are so well done, even without complicated mathematical detail, that it should be recommended reading for all students entering the realm of science. The book is getting old and is out of print, so I suggest you try to get a copy before they are gone. My library had a copy in 2021, but it was gone by 2022! This book should be reprinted and offered as part of an introductory physics course (high school level).