The Age of Computing: A Personal Memoir
In the history of modern technology, computer science must figure as an extraordinary chapter, and not only because of the remarkable speed of its development. It is unfortunate, however, that the word "science" has been widely used to designate enterprises that more properly belong to the domain of engineering.
"Computer science" is a glaring misnomer, as are "information science," "communication science," and other questionable "sciences." The awe and respect which science enjoys and which engineering is denied is inexplicable, at least to one who sees the situation from the other side. ...
Since World War II, the discoveries that have changed the world were made not so much in the lofty halls of theoretical physics as in the less-noticed labs of engineering and experimental physics.
The roles of pure and applied science have been reversed; they are no longer what they were in the golden age of physics, in the age of Einstein, Schrödinger, Fermi and Dirac. Readers of Scientific American, nourished on the Wellsian image of science, will recoil from even entertaining the idea that the age of physical "principles" may be over.
The laws of Newtonian mechanics, quantum mechanics and quantum electrodynamics were the last in a long and noble line that appears to have somewhat dried up in the last 50 years. As experimental devices (especially measuring devices) are becoming infinitely more precise and reliable, the wealth and sheer mass of new and baffling raw data collected by experiment greatly exceeds the power of human reason to explain them.
Physical theory has failed in recent decades to provide a theoretical underpinning for a world which increasingly appears as the work of some seemingly mischievous demiurge. The failure of reason to explain fact is also apparent in the life sciences, where "theories" (of the kind that physics has led us to expect) do not exist; many are doubtful that this kind of scientific explanation will ever be successful in explaining the secrets of life.
...Historians of science have always had a soft spot for the history of theoretical physics. The great theoretical advances of this century, relativity and quantum mechanics, have been documented in fascinating historical accounts that have captivated the mind of the cultivated public.
There are no comparable studies of the relations between science and engineering. Breaking with the tradition of the Fachidiot, theoretical physicists have bestowed their romantic autobiographies on the world, portraying themselves as the high priests of the reigning cult.
By their less than wholly objective accounts of the development of physics, historians have conspired to propagate the myth of science as being essentially theoretical physics. Though the myth had ceased to describe scientific reality even 50 years ago, historians pretended that all was well, that nothing had changed since the old heroic days of Einstein and his generation.
There were a few dissenters, however, such as the late Stanislaw Ulam, who used to make himself obnoxious by proclaiming that Enrico Fermi was "the last physicist." He and others who proclaimed such a possibility were prudently ignored.
Physicists did what they could to keep the myth alive. With impeccable chutzpah, they went on promulgating new "laws of nature" and carefully imitated their masters of another age. With dismaying inevitability, many of these latter-day "laws" have been exposed as quasi-mathematical embellishments, devoid of great physical or scientific significance.
Historians of science have seen fit to ignore the history of the great discoveries in applied physics, engineering and computer science, where real scientific progress is nowadays to be found. Computer science in particular has changed and continues to change the face of the world more thoroughly and more drastically than did any of the great discoveries in theoretical physics.
... Although the basic rules for the computation of reliability were long known, it took several years during and immediately after World War II for the importance of the concept of reliability to be explicitly recognized and dealt with. Only then did reliability computation become an essential feature in computer design.
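The "basic rules" the essay alludes to are presumably the elementary series/parallel laws: a chain of components works only if every link works, while redundant copies fail only if all of them fail. A minimal sketch, with illustrative reliability numbers rather than figures from the essay:

```python
# Hedged sketch of classical series/parallel reliability computation.
# The component reliabilities below are illustrative, not historical data.

def series_reliability(rs):
    """A series system works only if every component works."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel_reliability(rs):
    """A parallel (redundant) system fails only if every component fails."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

# Ten components, each 99% reliable: in series the system is only ~90% reliable.
print(round(series_reliability([0.99] * 10), 4))   # 0.9044
# Duplicating the whole chain in parallel recovers most of the loss.
chain = series_reliability([0.99] * 10)
print(round(parallel_reliability([chain, chain]), 4))  # 0.9909
```

The sobering point, well known to the early machine builders, is how quickly reliability decays in series: thousands of 99.9%-reliable parts in series give a machine that almost never works.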
The late Richard Feynman was one of the first to realize the centrality of reliability considerations in all applied scientific work. In the early days of the Manhattan Project in Los Alamos (in 1943 and early 1944), he tested the reliability of his first program in a dramatic fashion, setting up a day-long contest between human operators working with hand-operated calculators and the first electromechanical IBM machines.
At first, human operators showed an advantage over the electromechanical computers; as time wore on, however, the women who worked with the calculators became visibly tired and began to make small errors. Feynman's program on the electromechanical machine kept working. The electromechanical computers won out by virtue of their reliability.
Feynman soon came to realize that reliable machines in perfect working order were far more useful than much of what passed for theoretical work in physics, and he loudly stated that conviction. His supervisor, Hans Bethe, then head of T-Division (T for theory) and a physicist steeped in theory, at first paid no attention to him.
At the beginning of the Manhattan Project, only about a dozen hand-operated machines were available in Los Alamos; they regularly broke down, slowing scientific work. In order to convince Bethe of the importance of reliable computation, Feynman recruited me to help him improve the performance of the hand-operated desk calculators, avoiding the week-long delays of shipping them to San Diego for repairs. We spent hours fixing the small wheels until they were in perfect order. Bethe, visibly concerned when he learned that we had taken time off from our physics research to do these repairs, finally saw that having the desk calculators in good working order was as essential to the Manhattan Project as the fundamental physics.
Throughout his career, Feynman kept returning to the problem of the synthesis of reliable computers. Toward the end of his life, he gave a remarkable address at the 40th anniversary of the Los Alamos Laboratory where he sketched a reliability theory based on thermodynamical analogies.
In contrast to Bethe, John von Neumann very quickly realized the importance of reliability in the design of computers. It is no exaggeration to say that von Neumann had some familiarity (in the 1950s) with all the major ideas that have since proved crucial in the development of supercomputers.
...The first large-scale electronic computer to be built, the one that may be said to inaugurate the computer age, was the ENIAC. It was built at the Moore School of the University of Pennsylvania by an engineer and a physicist, Presper Eckert and John Mauchly. Their idea, trivial by the standards of our day, was revolutionary when the machine was completed in 1945.
At the time, all electromechanical calculators were built exclusively to perform ordinary arithmetic operations. Any computational scheme involving several operations in series or in parallel had to be planned separately by the user. Mauchly realized that if a computer could count, then it could do finite-difference schemes for the approximate solution of differential equations. It occurred to him that such schemes might be implemented directly on an electronic computer, an unheard-of idea at the time.
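Mauchly's insight, that a machine which can add and multiply can march a finite-difference scheme forward step by step, is easy to illustrate today. Here is a toy sketch (forward Euler applied to y' = -y, chosen by me for simplicity; nothing to do with the actual ENIAC programs):

```python
# A machine that can count, add and multiply can step a finite-difference
# scheme through time. Forward Euler for y' = f(t, y) is the simplest case.

import math

def euler(f, y0, t0, t1, n):
    """Advance y' = f(t, y) from t0 to t1 in n finite-difference steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)   # y_{k+1} = y_k + h * f(t_k, y_k)
        t += h
    return y

# y' = -y, y(0) = 1 has exact solution e^{-t}.
approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(abs(approx - math.exp(-1)))  # small discretization error, O(h)
```

Each step is nothing but arithmetic, which is exactly why an arithmetic engine could take over the whole computation once the scheme was wired in.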
...Of all the oddly named computers, the MANIAC's name turned out to be the most unfortunate: George Gamow was instrumental in rendering this and other computer names ridiculous when he dubbed the MANIAC "Metropolis And von Neumann Install Awful Computer."
Fermi and Teller were the first hackers. Teller would spend his weekends at the laboratory playing with the machine. Fermi insisted on doing all the menial work himself, down to the least details, to the awed amazement of the professional programmers. He instinctively knew the right physical problems that the MANIAC could successfully handle.
His greatest success was the discovery of the strange behavior of nonlinear systems arising from coupled nonlinear oscillators. The MANIAC was a large enough machine to allow the programming of potentials with cubic and even quartic terms. Together with John Pasta and Stanislaw Ulam, he programmed the evolution of a mechanical system consisting of a large number of such coupled oscillators. His idea was to investigate the time required for the system to reach a steady state of equidistribution of energy. By accident one day, they let the program run long after the steady state had been reached. When they realized their oversight and came back to the computer room, they noticed that the system, after remaining in the steady state for a while, had then departed from it, and reverted to the initial distribution of energy (to within 2 percent).
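For readers curious what such a program looks like today, here is a compact sketch of an FPU-style chain: 32 oscillators with a cubic nonlinearity, a commonly quoted setup for the experiment. This is my illustration, not the original MANIAC code, and the run here is kept short; observing the famous recurrence requires much longer integration times.

```python
# Sketch of a Fermi-Pasta-Ulam (alpha-model) chain: nearest-neighbor springs
# with potential V(d) = d^2/2 + (alpha/3) d^3, fixed ends, velocity-Verlet
# integration. Parameters are illustrative (N=32, alpha=0.25).

import math

N, alpha, dt = 32, 0.25, 0.05

def accel(x):
    """Force on each interior oscillator from its two neighboring springs."""
    a = [0.0] * (N + 2)
    for i in range(1, N + 1):
        dl, dr = x[i] - x[i - 1], x[i + 1] - x[i]
        a[i] = (dr - dl) + alpha * (dr * dr - dl * dl)
    return a

def energy(x, v):
    """Total kinetic plus spring potential energy of the chain."""
    e = sum(0.5 * v[i] ** 2 for i in range(1, N + 1))
    for i in range(N + 1):
        d = x[i + 1] - x[i]
        e += 0.5 * d * d + (alpha / 3.0) * d ** 3
    return e

# Start with all energy in the lowest normal mode, ends held fixed.
x = [0.0] + [math.sin(math.pi * i / (N + 1)) for i in range(1, N + 1)] + [0.0]
v = [0.0] * (N + 2)
e0 = energy(x, v)

a = accel(x)
for _ in range(2000):                     # velocity-Verlet time stepping
    for i in range(1, N + 1):
        x[i] += dt * v[i] + 0.5 * dt * dt * a[i]
    a_new = accel(x)
    for i in range(1, N + 1):
        v[i] += 0.5 * dt * (a[i] + a_new[i])
    a = a_new

print(abs(energy(x, v) - e0) / e0)        # symplectic scheme: tiny energy drift
```

Tracking the energy in each normal mode over very long runs is what reveals the surprise Fermi, Pasta and Ulam found: instead of thermalizing, the energy flows back, nearly perfectly, into the initial mode.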
The results were published in what was to be the last paper Fermi published before he died. Fermi believed this computersimulated discovery to be his greatest contribution to science. It is certainly the first major scientific discovery made by computer, and it is not fully understood to this day (though it has spawned some beautiful ideas). ...
[Metropolis was not up to date on ergodic theory, integrability and chaos?]
About Me
 Steve Hsu
 Senior Vice-President for Research and Innovation, Professor of Theoretical Physics, Michigan State University
Sunday, January 18, 2009
The Age of Computing
A remarkable essay by Nick Metropolis, which appeared in Daedalus (winter 1992). Originally trained as an experimental physicist, Metropolis played an important role in the Manhattan Project and was a pioneer in electronic computation. The Metropolis Algorithm, used in Monte Carlo simulation, bears his name.
9 comments:
theoretical physics: engineering ::
fashion show: power loom
I think he does know about these things. Everyone believes it happens because FPU is a discrete version of the KdV equation, and by KAM theory some invariant tori will persist when perturbed. However, a full proof that this is true didn't exist as late as the early 1990s, which was the last time I paid direct attention to the topic. These small-divisor perturbative calculations in high dimensions are notoriously difficult to do.
I just read the full essay and I think Metropolis doesn't make a distinction between what should perhaps be called Scientific Computation, which is what he contributed to, and Computer Science, which is a field of mathematics. The latter has very deep and nontrivial ideas that the general public has no clue about, especially in 1992. I would also say that the Monte Carlo method has been put on solid ground if you take a Bayesian perspective: it is an approximate way to map out the posterior probability.
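For concreteness, here is a minimal sketch of the Metropolis algorithm used the way this comment suggests, to map out a target density. A standard normal stands in for the "posterior"; the target, step size and sample count are all illustrative choices of mine:

```python
# Metropolis algorithm sketch: random-walk proposals, accepted with
# probability min(1, p(proposal)/p(current)). Only the unnormalized
# density is needed, which is why it suits Bayesian posteriors.

import math, random

random.seed(0)

def log_target(x):
    return -0.5 * x * x          # unnormalized log density of N(0, 1)

def metropolis(n_samples, step=1.0, x0=0.0):
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)
        # Accept/reject in log space to avoid overflow.
        if math.log(random.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)       # rejected proposals repeat the current state
    return samples

s = metropolis(50000)
mean = sum(s) / len(s)
var = sum((v - mean) ** 2 for v in s) / len(s)
print(round(mean, 2), round(var, 2))  # close to (0, 1)
```

The histogram of the samples approximates the target, which is exactly the "mapping out the posterior" interpretation: expectations under the posterior become simple averages over the chain.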
The first occurrence of Metropolis's name in your post could use a capital 'M'.
Thanks! How did I miss that?
Carson, thanks for the clarification.
It's ironic that Metropolis' essay repudiates the dismissive first comment (12:15 AM) by the curmudgeonly Anonymous, whom we heard from a few posts ago, even as it seems to echo it. Few people have integrated the theoretical and the practical—call it good engineering sense—as effectively as Richard Feynman and Enrico Fermi.
"Few people have integrated the theoretical and the practical—call it good engineering sense—as effectively as Richard Feynman and Enrico Fermi."
I tell pretty much anyone who will listen that if you cannot successfully navigate from the concrete to the abstract and back again, then you really don't understand what you are talking about. That's gibberish to most people, though.
A meeting of the next generation (the '70s versus the '40s and '50s)...
I remember having arguments about computer science versus computer engineering in the 1970s. In 1975 the MIT EE department chose to become Electrical Engineering and Computer Science, sort of a compromise. There is actually a fair bit of computer science, but it is computer engineering that has been so incredibly productive these past decades. This pattern is not new.
Look at the golden age of heat engines which led to the supporting science of thermodynamics. Steam engines rebuilt the world, transportation, and the means of production, even as scientists struggled to understand how these machines worked. You can find lots of fields in which engineering led the way and science followed.
Right now the big challenge in the life sciences is in managing all the information. My impression is that we are going to see some science pulled out of this, though it may not look like theoretical physics. Newton's breakthrough with gravity revolved around stuff that could be computed in his day. We can compute a lot more nowadays, so our equivalent formulas and equations are going to have to take that into account.