As usual, Einstein cut to the heart of the matter. "God does not play dice," he famously said. This time, however, he had finally met his match: himself. In the end, his statement about dice holds only if the speed of light (c) is not the absolute speed limit Einstein envisioned. Quantum theory proved correct regardless, but Einstein never knew: he died in 1955, and it took Bell and others to prove him wrong years later.
When we build an AI Mind with a modicum of innate knowledge, an example of logical or linguistic negation is necessary, and it ought not to be mundane or ordinary. So in the German AI we use perhaps the most famous of all scientific negations, "Gott spielt nicht Würfel," and in the English AI we use its English translation: "God does not play dice."
Many scientists present themselves to the world as atheists because belief in God is taboo among their colleagues and in the scientific community, but that does not mean they truly disbelieve. I do not picture God as a particular human being; I think of God as an ultimate source of energy that controls the universe.
The intellectual legacy of Isaac Newton was a vision of the clockwork universe, set in motion at the instant of creation but thereafter running in prescribed grooves, like a well-oiled machine. It was an image of a totally deterministic world, one leaving no room for the operation of chance, one whose future was completely determined by its present. As the great mathematical astronomer Pierre-Simon de Laplace eloquently put it in 1812 in his Analytic Theory of Probabilities:

An intellect which at any given moment knew all the forces that animate Nature and the mutual positions of the beings that comprise it, if this intellect were vast enough to submit its data to analysis, could condense into a single formula the movement of the greatest bodies of the universe and that of the lightest atom: for such an intellect nothing could be uncertain, and the future just like the past would be present before its eyes.

This same vision of a world whose future is totally predictable lies behind one of the most memorable incidents in Douglas Adams's 1979 science-fiction novel The Hitchhiker's Guide to the Galaxy, in which the philosophers Majikthise and Vroomfondel instruct the supercomputer "Deep Thought" to calculate the answer to the Great Question of Life, the Universe, and Everything. Aficionados will recall that after five million years the computer answered, "Forty-two," at which point the philosophers realized that while the answer was clear and precise, the question had not been. Similarly, the fault in Laplace's vision lies
not in his answer, that the universe is in principle predictable (which is an accurate statement of a particular mathematical feature of Newton's laws of motion), but in his interpretation of that fact, which is a serious misunderstanding based on asking the wrong question. By asking a more appropriate question, mathematicians and physicists have now come to understand that determinism and predictability are not synonymous.

In our daily lives, we encounter innumerable cases where Laplacian determinism seems to be a highly inappropriate model. We walk safely down steps a thousand times, until one day we turn our ankle and break it. We go to a tennis match, and it is rained off by an unexpected thunderstorm. We place a bet on the favorite in a horse race, and it falls at the last fence when it is six lengths ahead of the field. It's not so much a universe in which, as Albert Einstein memorably refused to believe, God plays dice: it seems more a universe in which dice play God.

Is our world deterministic, as Laplace claimed, or is it governed by chance, as it so often seems to be? And if Laplace is really right, why does so much of our experience indicate that he is wrong? One of the most exciting new areas of mathematics, nonlinear dynamics, popularly known as chaos theory, claims to have many of the answers. Whether or not it does, it is certainly creating a revolution in the way we think about order and disorder, law and chance, predictability and randomness.

According to modern physics, nature is ruled by chance on its smallest scales of space and time. For instance, whether a radioactive atom, of uranium, say, does or does not decay at any given instant is purely a matter of chance. There is no physical difference whatsoever between a uranium atom that is about to decay and one that is not about to decay. None. Absolutely none.

There are at least two contexts in which to discuss these issues: quantum mechanics and classical mechanics.
Most of this chapter is about classical mechanics, but for a moment let us consider the quantum-mechanical context. It was this view of quantum indeterminacy that prompted Einstein's famous statement (in a letter to his colleague Max Born) that "you believe in a God who plays dice, and I in complete law and order." To my mind, there is something distinctly fishy about the orthodox physical view of quantum indeterminacy, and I appear not to be alone, because, increasingly, many physicists are beginning to wonder whether Einstein was right all along and something is missing from conventional quantum mechanics: perhaps "hidden variables," whose values tell an atom when to decay. (I hasten to add that this is not the conventional view.) One of the best known of them, the Princeton physicist David Bohm, devised a modification of quantum mechanics that is fully deterministic but entirely consistent with all the puzzling phenomena that have been used to support the conventional view of quantum indeterminacy. Bohm's ideas have problems of their own, in particular a kind of "action at a distance" that is no less disturbing than quantum indeterminacy.

However, even if quantum mechanics is correct about indeterminacy on the smallest scales, on macroscopic scales of space and time the universe obeys deterministic laws. This results from an effect called decoherence, which causes sufficiently large quantum systems to lose nearly all of their indeterminacy and behave much more like Newtonian systems. In effect, this reinstates classical mechanics for most human-scale purposes. Horses, the weather, and Einstein's celebrated dice are not unpredictable because of quantum mechanics. On the contrary, they are unpredictable within a Newtonian model, too. This is perhaps not so surprising when it comes to horses: living creatures have their own hidden variables, such as what kind of hay they had for breakfast.
But it was definitely a surprise to those meteorologists who had been developing massive computer simulations of weather in the hope of predicting it for months ahead. And it is really rather startling when it comes to dice, even though humanity perversely uses dice as one of its favorite symbols for chance. Dice are just cubes, and a tumbling cube should be no less predictable than an orbiting planet: after all, both objects obey the same laws of mechanical motion. They're different shapes, but equally regular and mathematical ones.

To see how unpredictability can be reconciled with determinism, think about a much less ambitious system than the entire universe, namely, drops of water dripping from a tap. This is a deterministic system: in principle, the flow of water into the apparatus is steady and uniform, and what happens to it when it emerges is totally prescribed by the laws of fluid motion. Yet a simple but effective experiment demonstrates that this evidently deterministic system can be made to behave unpredictably; and this leads us to some mathematical "lateral thinking," which explains why such a paradox is possible.

If you turn on a tap very gently and wait a few seconds for the flow to settle down, you can usually produce a regular series of drops of water, falling at equally spaced times in a regular rhythm. It would be hard to find anything more predictable than this. But if you slowly turn the tap to increase the flow, you can set it so that the sequence of drops falls in a very irregular manner, one that sounds random. It may take a little experimentation to succeed, and it helps if the tap turns smoothly. Don't turn it so far that the water falls in an unbroken stream; what you want is a medium-fast trickle. If you get it set just right, you can listen for many minutes without any obvious pattern becoming apparent.

In 1978, a bunch of iconoclastic young graduate students at the University of California at Santa Cruz formed the Dynamical Systems Collective.
When they began thinking about this water-drop system, they realized that it's not as random as it appears to be. They recorded the dripping noises with a microphone and analyzed the sequence of intervals between each drop and the next. What they found was short-term predictability. If I tell you the timing of three successive drops, then you can predict when the next drop will fall. For example, if the last three intervals between drops have been 0.63 seconds, 1.17 seconds, and 0.44 seconds, then you can be sure that the next drop will fall after a further 0.82 seconds. (These numbers are for illustrative purposes only.) In fact, if you know the timing of the first three drops exactly, then you can predict the entire future of the system.

So why is Laplace wrong? The point is that we can never measure the initial state of a system exactly. The most precise measurements yet made in any physical system are correct to about ten or twelve decimal places. But Laplace's statement is correct only if we can make measurements to infinite precision, infinitely many decimal places, and of course there's no way to do that. People knew about this problem of measurement error in Laplace's day, but they generally assumed that provided you made the initial measurements to, say, ten decimal places, then all subsequent prediction would also be accurate to ten decimal places. The error would not disappear, but neither would it grow.

Unfortunately, it does grow, and this prevents us from stringing together a series of short-term predictions to get one that is valid in the long term. For example, suppose I know the timing of the first three water drops to an accuracy of ten decimal places. Then I can predict the timing of the next drop to nine decimal places, the drop after that to eight decimal places, and so on.
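The way such an error snowballs is easy to demonstrate numerically. The sketch below uses the logistic map, a standard chaotic toy model, as a stand-in for the drop dynamics; the map, the starting value, and the size of the perturbation are all illustrative assumptions, not the Collective's actual model:

```python
# Two trajectories of the chaotic logistic map x -> 4x(1 - x),
# started a tiny distance apart, separate at an exponential rate.
# (Illustrative stand-in, not the real water-drop dynamics.)

def logistic(x):
    return 4.0 * x * (1.0 - x)

x, y = 0.4, 0.4 + 1e-10   # initial states agree to ten decimal places
for step in range(1, 51):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: separation = {abs(x - y):.3e}")
```

For this map each iteration roughly doubles the separation (the growth factor depends on the system; the tap loses a decimal place per drop, a factor of ten), so after a few dozen steps the two trajectories bear no relation to each other at all.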
At each step, the error grows by a factor of about ten, so I lose confidence in one further decimal place. Therefore, ten steps into the future, I really have no idea at all what the timing of the next drop will be. (Again, the precise figures will probably be different: it may take half a dozen drops to lose one decimal place in accuracy, but even then it takes only sixty drops until the same problem arises.)

This amplification of error is the logical crack through which Laplace's perfect determinism disappears. Nothing short of total perfection of measurement will do. If we could measure the timing to a hundred decimal places, our predictions would fail a mere hundred drops into the future (or six hundred, using the more optimistic estimate). This phenomenon is called "sensitivity to initial conditions," or more informally "the butterfly effect." (When a butterfly in Tokyo flaps its wings, the result may be a hurricane in Florida a month later.) It is intimately associated with a high degree of irregularity of behavior. Anything truly regular is by definition fairly predictable. But sensitivity to initial conditions renders behavior unpredictable, hence irregular. For this reason, a system that displays sensitivity to initial conditions is said to be chaotic. Chaotic behavior obeys deterministic laws, but it is so irregular that to the untrained eye it looks pretty much random. Chaos is not just complicated, patternless behavior; it is far more subtle. Chaos is apparently complicated, apparently patternless behavior that actually has a simple, deterministic explanation.

The discovery of chaos was made by many people, too numerous to list here. It came about because of the conjunction of three separate developments. One was a change of scientific focus, away from simple patterns such as repetitive cycles, toward more complex kinds of behavior.
The second was the computer, which made it possible to find approximate solutions to dynamical equations easily and rapidly. The third was a new mathematical viewpoint on dynamics, a geometric rather than a numerical viewpoint. The first provided motivation, the second provided technique, and the third provided understanding.

The geometrization of dynamics began about a hundred years ago, when the French mathematician Henri Poincaré, a maverick if ever there was one, but one so brilliant that his views became orthodoxies almost overnight, invented the concept of a phase space. This is an imaginary mathematical space that represents all possible motions of a given dynamical system. To pick a nonmechanical example, consider the population dynamics of a predator-prey ecological system. The predators are pigs and the prey are those exotically pungent fungi, truffles. The variables upon which we focus attention are the sizes of the two populations: the number of pigs (relative to some reference value such as one million) and the number of truffles (ditto). This choice effectively makes the variables continuous, that is, they take real-number values with decimal places, not just whole-number values. For example, if the reference number of pigs is one million, then a population of 17,439 pigs corresponds to the value 0.017439.

Now, the natural growth of truffles depends on how many truffles there are and the rate at which pigs eat them; the growth of the pig population depends on how many pigs there are and how many truffles they eat. So the rate of change of each variable depends on both variables, an observation that can be turned into a system of differential equations for the population dynamics. I won't write them down, because it's not the equations that matter here: it's what you do with them.

These equations determine, in principle, how any initial population values will change over time.
For example, if we start with 17,439 pigs and 788,444 truffles, then you plug in the initial values 0.017439 for the pig variable and 0.788444 for the truffle variable, and the equations implicitly tell you how those numbers will change. The difficulty is to make the implicit become explicit: to solve the equations. But in what sense? The natural reflex of a classical mathematician would be to look for a formula telling us exactly what the pig population and the truffle population will be at any instant. Unfortunately, such "explicit solutions" are so rare that it is scarcely worth the effort of looking for them unless the equations have a very special and limited form. An alternative is to find approximate solutions on a computer; but that tells us only what will happen for those particular initial values, and most often we want to know what will happen for a lot of different initial values.

Poincaré's idea is to draw a picture that shows what happens for all initial values. The state of the system, the sizes of the two populations at some instant of time, can be represented as a point in the plane, using the old trick of coordinates. For example, we might represent the pig population by the horizontal coordinate and the truffle population by the vertical one. The initial state described above corresponds to the point with horizontal coordinate 0.017439 and vertical coordinate 0.788444. Now let time flow. The two coordinates change from one instant to the next, according to the rule expressed by the differential equation, so the corresponding point moves. A moving point traces out a curve; and that curve is a visual representation of the future behavior of the entire system.
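To make "let time flow" concrete, here is a sketch that steps a hypothetical pair of pig/truffle equations forward numerically. The text deliberately does not write the equations down, so the Lotka-Volterra form and every coefficient below are invented purely for illustration:

```python
# Hypothetical predator-prey (Lotka-Volterra) equations standing in for
# the pig/truffle system; all coefficients are invented for illustration.
#   d(truffles)/dt = a*truffles - b*truffles*pigs   (prey, eaten by pigs)
#   d(pigs)/dt     = -c*pigs + d*truffles*pigs      (predators, fed by truffles)
a, b, c, d = 1.0, 2.0, 1.0, 2.0
dt = 0.001                            # small time step (Euler's method)

pigs, truffles = 0.017439, 0.788444   # the initial state from the text
curve = [(pigs, truffles)]            # points traced out in phase space
for _ in range(10_000):
    dpigs = (-c * pigs + d * truffles * pigs) * dt
    dtruffles = (a * truffles - b * truffles * pigs) * dt
    pigs, truffles = pigs + dpigs, truffles + dtruffles
    curve.append((pigs, truffles))
# Plotting `curve`, pigs on the horizontal axis and truffles on the
# vertical, draws (approximately) one of the phase-space curves.
```

The list of points is exactly the moving point of the text: each step nudges the coordinates according to the equations, and the succession of points traces the curve in pig/truffle space.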
In fact, by looking at the curve, you can "see" important features of the dynamics without worrying about the actual numerical values of the coordinates.

For example, if the curve closes up into a loop, then the two populations are following a periodic cycle, repeating the same values over and over again, just as a car on a racetrack keeps going past the same spectator every lap. If the curve homes in toward some particular point and stops, then the populations settle down to a steady state, in which neither changes, like a car that runs out of fuel. By a fortunate coincidence, cycles and steady states are of considerable ecological significance; in particular, they set both upper and lower limits to population sizes. So the features that the eye detects most easily are precisely the ones that really matter. Moreover, a lot of irrelevant detail can be ignored: for example, we can see that there is a closed loop without having to work out its precise shape (which represents the combined "waveforms" of the two population cycles).

What happens if we try a different pair of initial values? We get a second curve. Each pair of initial values defines a new curve; and we can capture all possible behaviors of the system, for all initial values, by drawing a complete set of such curves. This set of curves resembles the flow lines of an imaginary mathematical fluid, swirling around in the plane. We call the plane the phase space of the system, and the set of swirling curves is the system's phase portrait. Instead of the symbol-based idea of a differential equation with various initial conditions, we have a geometric, visual scheme of points flowing through pig/truffle space. This differs from an ordinary plane only in that many of its points are potential rather than actual: their coordinates correspond to numbers of pigs and truffles that could occur under appropriate initial conditions, but may not occur in a particular case.
So as well as the mental shift from symbols to geometry, there is a philosophical shift from the actual to the potential.

The same kind of geometric picture can be imagined for any dynamical system. There is a phase space, whose coordinates are the values of all the variables; and there is a phase portrait, a system of swirling curves that represents all possible behaviors starting from all possible initial conditions, and that are prescribed by the differential equations. This idea constitutes a major advance, because instead of worrying about the precise numerical details of solutions to the equations, we can focus upon the broad sweep of the phase portrait, and bring humanity's greatest asset, its amazing image-processing abilities, to bear. The image of a phase space as a way of organizing the total range of potential behaviors, from among which nature selects the behavior actually observed, has become very widespread in science.

The upshot of Poincaré's great innovation is that dynamics can be visualized in terms of geometric shapes called attractors. If you start a dynamical system from some initial point and watch what it does in the long run, you often find that it ends up wandering around on some well-defined shape in phase space. For example, the curve may spiral in toward a closed loop and then go around and around the loop forever. Moreover, different choices of initial conditions may lead to the same final shape. If so, that shape is known as an attractor. The long-term dynamics of a system is governed by its attractors, and the shape of the attractor determines what type of dynamics occurs.

For example, a system that settles down to a steady state has an attractor that is just a point. A system that settles down to repeating the same behavior periodically has an attractor that is a closed loop. That is, closed-loop attractors correspond to oscillators.
Recall the description of a vibrating violin string from chapter 5; the string undergoes a sequence of motions that eventually puts it back where it started, ready to repeat the sequence over and over forever. I'm not suggesting that the violin string moves in a physical loop. But my description of it is a closed loop in a metaphorical sense: the motion takes a round trip through the dynamic landscape of phase space.

Chaos has its own rather weird geometry: it is associated with curious fractal shapes called strange attractors. The butterfly effect implies that the detailed motion on a strange attractor can't be determined in advance. But this doesn't alter the fact that it is an attractor. Think of releasing a Ping-Pong ball into a stormy sea. Whether you drop it from the air or release it from underwater, it moves toward the surface. Once on the surface, it follows a very complicated path in the surging waves, but however complex that path is, the ball stays on, or at least very near, the surface. In this image, the surface of the sea is an attractor. So, chaos notwithstanding, no matter what the starting point may be, the system will end up very close to its attractor.

Chaos is well established as a mathematical phenomenon, but how can we detect it in the real world? We must perform experiments, and there is a problem. The traditional role of experiments in science is to test theoretical predictions, but if the butterfly effect is in operation, as it is for any chaotic system, how can we hope to test a prediction? Isn't chaos inherently untestable, and therefore unscientific?

The answer is a resounding no, because the word "prediction" has two meanings. One is "foretelling the future," and the butterfly effect prevents this when chaos is present. But the other is "describing in advance what the outcome of an experiment will be." Think about tossing a coin a hundred times. In order to predict, in the fortune-teller's sense, what happens, you must list in advance the result of each of the tosses.
But you can make scientific predictions, such as "roughly half the coins will show heads," without foretelling the future in detail, even when, as here, the system is random. Nobody suggests that statistics is unscientific because it deals with unpredictable events, and therefore chaos should be treated in the same manner. You can make all sorts of predictions about a chaotic system; in fact, you can make enough predictions to distinguish deterministic chaos from true randomness. One thing that you can often predict is the shape of the attractor, which is not altered by the butterfly effect. All the butterfly effect does is to make the system follow different paths on the same attractor. In consequence, the general shape of the attractor can often be inferred from experimental observations.

The discovery of chaos has revealed a fundamental misunderstanding in our views of the relation between rules and the behavior they produce, between cause and effect. We used to think that deterministic causes must produce regular effects, but now we see that they can produce highly irregular effects that can easily be mistaken for randomness. We used to think that simple causes must produce simple effects (implying that complex effects must have complex causes), but now we know that simple causes can produce complex effects. We realize that knowing the rules is not the same as being able to predict future behavior.

How does this discrepancy between cause and effect arise? Why do the same rules sometimes produce obvious patterns and sometimes produce chaos? The answer is to be found in every kitchen, in the employment of that simple mechanical device, an eggbeater. The motion of the two beaters is simple and predictable, just as Laplace would have expected: each beater rotates steadily. The motion of the sugar and the egg white in the bowl, however, is far more complex. The two ingredients get mixed up; that's what eggbeaters are for.
But the two rotary beaters don't get mixed up; you don't have to disentangle them from each other when you've finished. Why is the motion of the incipient meringue so different from that of the beaters? Mixing is a far more complicated, dynamic process than we tend to think. Imagine trying to predict where a particular grain of sugar will end up! As the mixture passes between the pair of beaters, it is pulled apart, to left and right, and two sugar grains that start very close together soon get a long way apart and follow independent paths. This is, in fact, the butterfly effect in action: tiny changes in initial conditions have big effects. So mixing is a chaotic process.

Conversely, every chaotic process involves a kind of mathematical mixing in Poincaré's imaginary phase space. This is why tides are predictable but weather is not. Both involve the same kind of mathematics, but the dynamics of tides does not get phase space mixed up, whereas that of the weather does. It's not what you do, it's the way that you do it.

Chaos is overturning our comfortable assumptions about how the world works. It tells us that the universe is far stranger than we think. It casts doubt on many traditional methods of science: merely knowing the laws of nature is no longer enough. On the other hand, it tells us that some things that we thought were just random may actually be consequences of simple laws. Nature's chaos is bound by rules. In the past, science tended to ignore events or phenomena that seemed random, on the grounds that since they had no obvious patterns they could not be governed by simple laws. Not so. There are simple laws right under our noses, laws governing disease epidemics, or heart attacks, or plagues of locusts. If we learn those laws, we may be able to prevent the disasters that follow in their wake.

Already chaos has shown us new laws, even new types of laws.
Chaos contains its own brand of new universal patterns. One of the first to be discovered occurs in the dripping tap. Remember that a tap can drip rhythmically or chaotically, depending on the speed of the flow. Actually, both the regularly dripping tap and the "random" one are following slightly different variants of the same mathematical prescription. But as the rate at which water passes through the tap increases, the type of dynamics changes. The attractor in phase space that represents the dynamics keeps changing, and it changes in a predictable but highly complex manner.

Start with a regularly dripping tap: a repetitive drip-drip-drip-drip rhythm, each drop just like the previous one. Then turn the tap slightly, so that the drips come slightly faster. Now the rhythm goes drip-DRIP-drip-DRIP, and repeats every two drops. Not only the size of the drop, which governs how loud the drip sounds, but also the timing changes slightly from one drop to the next.

If you allow the water to flow slightly faster still, you get a four-drop rhythm: drip-DRIP-drip-DRIP. A little faster still, and you produce an eight-drop rhythm: drip-DRIP-drip-DRIP-drip-DRIP-drip-DRIP. The length of the repetitive sequence of drops keeps on doubling. In a mathematical model, this process continues indefinitely, with rhythmic groups of 16, 32, 64 drops, and so on. But it takes tinier and tinier changes to the flow rate to produce each successive doubling of the period; and there is a flow rate by which the size of the group has doubled infinitely often. At this point, no sequence of drops repeats exactly the same pattern. This is chaos.

We can express what is happening in Poincaré's geometric language. The attractor for the tap begins as a closed loop, representing a periodic cycle. Think of the loop as an elastic band wrapped around your finger. As the flow rate increases, this loop splits into two nearby loops, like an elastic band wound twice around your finger.
This band is twice as long as the original, which is why the period is twice as long. Then in exactly the same way, this already-doubled loop doubles again, all the way along its length, to create the period-four cycle, and so on. After infinitely many doublings, your finger is decorated with elastic spaghetti, a chaotic attractor.

This scenario for the creation of chaos is called a period-doubling cascade. In 1975, the physicist Mitchell Feigenbaum discovered that a particular number, which can be measured in experiments, is associated with every period-doubling cascade. The number is roughly 4.669, and it ranks alongside π (pi) as one of those curious numbers that seem to have extraordinary significance in both mathematics and its relation to the natural world. Feigenbaum's number has a symbol, too: the Greek letter δ (delta). The number π tells us how the circumference of a circle relates to its diameter. Analogously, Feigenbaum's number δ tells us how the period of the drips relates to the rate of flow of the water. To be precise, the extra amount by which you need to turn on the tap decreases by a factor of 4.669 at each doubling of the period.

The number π is a quantitative signature for anything involving circles. In the same way, the Feigenbaum number δ is a quantitative signature for any period-doubling cascade, no matter how it is produced or how it is realized experimentally. That very same number shows up in experiments on liquid helium, water, electronic circuits, pendulums, magnets, and vibrating train wheels. It is a new universal pattern in nature, one that we can see only through the eyes of chaos; a quantitative pattern, a number, emerges from a qualitative phenomenon. One of nature's numbers, indeed. The Feigenbaum number has opened the door to a new mathematical world, one we have only just begun to explore.

The precise pattern found by Feigenbaum, and other patterns like it, is a matter of fine detail.
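You can watch δ emerge from a handful of numbers. The sketch below uses the period-doubling thresholds of the logistic map x → rx(1 − x), a standard model system rather than the dripping tap itself; the r values are well-known published figures, quoted here from memory to the precision usually given:

```python
# Parameter values r_n of the logistic map x -> r*x*(1 - x) at which the
# stable cycle doubles to period 2, 4, 8, 16, 32 (standard published
# figures, quoted for illustration).
r = [3.000000, 3.449490, 3.544090, 3.564407, 3.568759]

# Ratios of successive gaps between doublings converge to delta ~ 4.669.
for n in range(len(r) - 2):
    estimate = (r[n + 1] - r[n]) / (r[n + 2] - r[n + 1])
    print(f"delta estimate {n + 1}: {estimate:.3f}")
```

The successive ratios close in on 4.669, and, as the text says, the same limit would appear for the tap, for liquid helium, or for any other period-doubling cascade.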
The basic point is that even when the consequences of natural laws seem to be patternless, the laws are still there and so are the patterns. Chaos is not random: it is apparently random behavior resulting from precise rules. Chaos is a cryptic form of order.

Science has traditionally valued order, but we are beginning to appreciate the fact that chaos can offer science distinct advantages. Chaos makes it much easier to respond quickly to an outside stimulus. Think of tennis players waiting to receive a serve. Do they stand still? Do they move regularly from side to side? Of course not. They dance erratically from one foot to the other. In part, they are trying to confuse their opponents, but they are also getting ready to respond to any serve sent their way. In order to be able to move quickly in any particular direction, they make rapid movements in many different directions. A chaotic system can react to outside events much more quickly, and with much less effort, than a nonchaotic one. This is important for engineering control problems. For example, we now know that some kinds of turbulence result from chaos; that's what makes turbulence look random. It may prove possible to make the airflow past an aircraft's skin much less turbulent, and hence less resistant to motion, by setting up control mechanisms that respond extremely rapidly to cancel out any small regions of incipient turbulence. Living creatures, too, must behave chaotically in order to respond rapidly to a changing environment.

This idea has been turned into an extremely useful practical technique by a group of mathematicians and physicists, among them William Ditto, Alan Garfinkel, and Jim Yorke: they call it chaotic control. Basically, the idea is to make the butterfly effect work for you. The fact that small changes in initial conditions create large changes in subsequent behavior can be an advantage; all you have to do is ensure that you get the large changes you want.
Our understanding of how chaotic dynamics works makes it possible to devise control strategies that do precisely this. The method has had several successes. Space satellites use a fuel called hydrazine to make course corrections. One of the earliest successes of chaotic control was to divert a dead satellite from its orbit and send it out for an encounter with an asteroid, using only the tiny amount of hydrazine left on board. NASA arranged for the satellite to swing around the Moon five times, nudging it slightly each time with a tiny shot of hydrazine. Several such encounters were achieved, in an operation that successfully exploited the occurrence of chaos in the three-body problem (here, Earth/Moon/satellite) and the associated butterfly effect.

The same mathematical idea has been used to control a magnetic ribbon in a turbulent fluid, a prototype for controlling turbulent flow past a submarine or an aircraft. Chaotic control has been used to make erratically beating hearts return to a regular rhythm, presaging invention of the intelligent pacemaker. Very recently, it has been used both to set up and to prevent rhythmic waves of electrical activity in brain tissue, opening up the possibility of preventing epileptic attacks.

Chaos is a growth industry. Every week sees new discoveries about the underlying mathematics of chaos, new applications of chaos to our understanding of the natural world, or new technological uses of chaos, including the chaotic dishwasher, a Japanese invention that uses two rotating arms, spinning chaotically, to get dishes cleaner using less energy; and a British machine that uses chaos-theoretic data analysis to improve quality control in spring manufacture.

Much, however, remains to be done. Perhaps the ultimate unsolved problem of chaos is the strange world of the quantum, where Lady Luck rules. Radioactive atoms decay "at random"; their only regularities are statistical.
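That statistical regularity is easy to see in simulation: give every atom an identical, purely random chance of decaying at each tick of the clock, and a sharp half-life appears at the level of the population. The per-step decay probability and the population size below are arbitrary illustrative choices:

```python
# Individually random decays produce a collectively sharp half-life.
# The 5% per-step decay chance and the population size are arbitrary.
import random

random.seed(1)                 # fixed seed so the run is repeatable
p_decay = 0.05                 # each atom's chance of decaying per step
atoms = 100_000
steps = 0
while atoms > 50_000:          # count steps until half have decayed
    atoms = sum(1 for _ in range(atoms) if random.random() > p_decay)
    steps += 1
print(f"half-life: about {steps} steps")
# Theory: (0.95)^n = 0.5 gives n ~ 13.5, so the loop takes 14 whole
# steps, even though no individual atom "knows" when it will decay.
```

No atom in the simulation carries any information about when it will go, yet the half-life of the population is as reproducible as any deterministic quantity.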
A large quantity of radioactive atoms has a well-defined half-life, a period of time during which half the atoms will decay. But we can't predict which half. Albert Einstein's protest, mentioned earlier, was aimed at just this question. Is there really no difference at all between a radioactive atom that is not going to decay and one that's just about to? Then how does the atom know what to do?

Might the apparent randomness of quantum mechanics be fraudulent? Is it really deterministic chaos? Think of an atom as some kind of vibrating droplet of cosmic fluid. Radioactive atoms vibrate very energetically, and every so often a smaller drop can split off: decay. The vibrations are so rapid that we can't measure them in detail: we can only measure averaged quantities, such as energy levels. Now, classical mechanics tells us that a drop of real fluid can vibrate chaotically. When it does so, its motion is deterministic but unpredictable. Occasionally, "at random," the vibrations conspire to split off a tiny droplet. The butterfly effect makes it impossible to say in advance just when the drop will split; but that event has precise statistical features, including a well-defined half-life.

Could the apparently random decay of radioactive atoms be something similar, but on a microcosmic scale? After all, why are there any statistical regularities at all? Are they traces of an underlying determinism? Where else can statistical regularities come from? Unfortunately, nobody has yet made this seductive idea work, though it's similar in spirit to the fashionable theory of superstrings, in which a subatomic particle is a kind of hyped-up vibrating multidimensional loop. The main similar feature here is that both the vibrating loop and the vibrating drop introduce new "internal variables" into the physical picture. A significant difference is the way these two approaches handle quantum indeterminacy. Superstring theory, like conventional quantum mechanics, sees this indeterminacy as being genuinely random.
In a system like the drop, however, the apparent indeterminacy is actually generated by a deterministic, but chaotic, dynamic. The trick, if only we knew how to do it, would be to invent some kind of structure that retains the successful features of superstring theory, while making some of the internal variables behave chaotically. It would be an appealing way to render the Deity's dice deterministic, and keep the shade of Einstein happy.