This is an awkwardly named book, partly because it has nothing to do with humanism per se. de Waal treats “humanism” as roughly coterminous with “secular morality,” but there are plenty of humanists who don’t subscribe to secular morality and many secularists who aren’t humanists. The book would have been better had that word not been in the title at all.
“The Bonobo and the Atheist” presents a pretty uncontroversial thesis, even for someone like me whose grasp of the scholarly literature in primate ethology is virtually nonexistent: humans aren’t the only animals that display moral behavior. In fact, de Waal’s book is a great sourcebook of just these kinds of behaviors across a wide range of species, mostly culled from anecdotes relating to his personal research at various institutions over the last several decades. (He’s currently the Charles Howard Candler Professor of Primate Behavior in the Department of Psychology at Emory University.) There are plenty of examples of chimps and bonobos showing prosocial behavior all over the book, and even behavior that we could call moral, like expressing frustration or anger at a perceived lack of fairness.
If you are a reader whose curiosity was piqued by the word “atheist” in the title, and were looking for another Hitchens- or Dawkins-inspired dime-store screed on how religion will be the end of humanity as we know it, you will also be disappointed. In fact, the thesis of this book presents no reason whatsoever to even discuss the relative merits and drawbacks of religion, but I guess throwing a buzzworthy word into the title can’t harm sales. As is pretty clear from the summary of the book, the main idea he’s trying to present is that moral behavior is present in all of the higher apes (other animals too, but being a primatologist, his focus is mostly the bonobo and the chimpanzee). Consequently, it should come as no surprise that the moral tendencies we share with these animals predate any idea of God, and certainly predate any attempt at formal institutions built to worship said God. The big takeaway from the book is that morality, no matter which animal it occurs in, is a product of biology, not of divine revelation or supernatural intercession.
Being originally from the Netherlands, de Waal was at first very perplexed by the popularity, not to mention the fundamentalism, of religion in American life. He moved to the States in 1981, when he was 33 years old, and has spent more than half of his life here. It’s understandable why European atheists look at American atheists with bemusement, because they seem to be so passionate about something so obvious. However, despite all his time in the U.S., de Waal still seems to think that the atheists popular with a broad American audience (Dawkins, Hitchens, and Harris – he thinks Dennett is by far the most rational and thoughtful of the bunch) are “sleeping furiously” – that is to say, since atheism is essentially the rejection of a proposition, they’re making a big deal out of the existence of essentially … nothing. He drives this point home repeatedly in the book, and it’s somewhat befuddling. Here’s a scientist, an open atheist, and yet he doesn’t seem to see the value in doing the yeoman’s work of getting bullshit like “creation science” out of the classroom, where it most certainly doesn’t belong.
Are Dawkins, Hitchens, and Harris sometimes grating and strident? Tin-eared interlocutors for the ideas they’re trying to promote? Most certainly. But when he refers to them as sleeping furiously, he seems to be criticizing them for more than their stentorian shrillness. In a country where one out of three Americans still believes that the creation account(s) in Genesis hold the authority of science, we need to learn how to walk before we can run. We need more public education about science, and at their best, most effective, and most appealing, that’s what these writers are doing. Only then can we get the buffoons who think the world was created in six days kicked off of school boards and out of Congress. But to say they are sleeping furiously is akin to calling their work in vain. Perhaps it’s the optimist in me (it occasionally peeks out), but I hope that’s not the case. I’m convinced that a world that makes its decisions based on evidence, weighted consideration of the facts, and rational criticism of ideas is the best possible world that we have to offer future generations. But that kind of world doesn’t just pop into existence ex nihilo. It has to be worked for, and that’s what I’ve always imagined as the most important role for the “public atheist.”
My encounter with this book was really a misalignment between my expectations of its author and that author’s attempt at writing a popular, engaging Bildungsroman about his initiation into the world of fieldwork with African baboons. Based solely on what I had heard about “Behave: The Biology of Humans at Our Best and Worst,” I bought this mistakenly thinking there would be more discussion of his research and the conclusions he’s drawn from it. If you’re familiar with Sapolsky from his numerous lectures available on YouTube, you know his professorial, bone-dry sense of humor. Because it’s so easy to read and relatable, “A Primate’s Memoir” almost reads like it has a completely different author altogether.
Robert Sapolsky became entranced with apes (especially gorillas) during his first visits to the Museum of Natural History, where he immersed himself in the dioramas. His interests would shift slightly over time to baboons, particularly the questions they answered about human health when we look at how factors like social support networks and dominance are correlated with long-term health outcomes. Sapolsky was becoming increasingly interested in neuroendocrinology, and especially in his thesis that the concentration of stress hormones in baboons was positively correlated with their health problems.
Instead of discussions of the science or its relevance for human health, Sapolsky uses his narrative to roll out a series of stories about the people, events, and ideas that he encountered during his travels all over Africa. Some of the stories he has to tell will truly curl your toes, like the numerous times he barely escaped with his life. For example, he was in Uganda at the time of Idi Amin’s ouster. He was also in the Sudan to witness the fighting there long before South Sudan gained its independence in 2011. Others are hilarious, especially the stories involving baboon behavior. Still others are quite the education in cross-cultural understanding, like the African man who had never been on an elevator before and thought that pressing “2” and then “3” would take him to the fifth floor of the building.
Had I known what this was, I almost certainly wouldn’t have picked it up. It’s far from a disappointment; it’s just that the content (and the tone) weren’t what I was expecting. Sapolsky is a much better storyteller than you would ever imagine from just watching his online lectures. I wish there were a way to write about academic “high science” in the conversational register that Sapolsky manages to achieve in this book. That would be the kind of accidental discovery I wouldn’t mind making one bit. As it is, though, the book remains just a series of stories, sometimes riveting and sometimes droll, that don’t really shed much light on the work that he was doing at the time.
W. H. Auden once said that we live in societies “to which the study of that which can be weighed and measured is a consuming love” – but that hasn’t always been the case. The science of Aristotle, arguably the biggest influence on post-Hellenic science west of the Levant, was thoroughly qualitative. Only later, after the rediscovery of the Plato whose fascination with numbers and ratios bordered on worship, did science begin to take on a properly quantitative character. As the subtitle of the book hints, this began to happen sometime in the mid-thirteenth century, and this is precisely the set of stories that Crosby seeks to elucidate for the general reader. He wants to retrace the steps that took us from a world of “emotional attachment to perception and experience, to a visualizing and quantifiable approach to reality,” to “comprehending reality as composed of quanta.”
Because of what Crosby is trying to do, much of the book reads like a survey of medieval and Renaissance math and science. In a few hundred years, the West went from the Dark Ages (I’ve always despised that term since it’s so wrong and inappropriate, but if it fits anywhere, it’s true of the quantitative sciences) to the burgeoning of an array of common things and ideas that would have been impossible without better means of quantification; just a few of these include military maneuvering, increasing calendrical accuracy (i.e., the transition from the Julian to the Gregorian calendar), cartography, time-keeping devices, grammar and alphabetization, geometric perspective in painting, astronomy, currency, and bookkeeping. The invention of polyphonic music, perhaps the greatest innovation of the medieval West, would have been impossible without the modern musical notation that replaced neumatic notation (commonly, though questionably, attributed to Guido of Arezzo during the early eleventh century).
His chapter on the development of music from 600 to around 1500 traces its course from the earliest Gregorian chant to the acme of Flemish polyphony, stating that the importance of music can be traced to its unique place in the quadrivium as “the only one of the four members in which measurement had immediate practical application.” Similarly, as medieval visual art gently bleeds into the masterpieces of the Renaissance, we see a growing fascination with naturalism in painting that would have been impossible without new insights into optics, illusion, perspective, and depth – all quantifiable and “mathematizable.” Those familiar with the Renaissance greats will readily recognize that Leonardo, Masaccio, and Raphael are just as much about mystical Platonic ratios as they are about older, medieval considerations. Crosby ends his historical journey in a place that conveniently ties up several loose ends that would interest other kinds of historians, including those interested in the development of capitalism and the mercantile economy – namely, the advent of double-entry bookkeeping. While the mechanical clock “enabled them to measure time, double entry bookkeeping enabled them to stop it - on paper, at least.”
While Crosby does little to make new discoveries in the fields he considers, he goes far in recasting and repurposing the information he has readily available. His central argument seems incontrovertibly true. How well does his evidence explain or support it? That seems shakier to me. As I noted above, taken as a whole, the book can come across as a history of medieval math, medieval science, medieval astronomy, etc. But his voice is quick-witted and engaging, sometimes even chatty – probably not what you were expecting given the title of the book. And rather than fully “accounting” for the rise of the particular phenomenon he is trying to explain, this book at the very least rediscovers some of the important philosophical fundamentals that undergird his concerns. However, he fails at answering the all-important “why?” Perhaps this question is better suited to cliometricians and psychohistorians than historians of science.
The Scopes Trial (occasionally referred to, with both contempt and fondness, as “The Monkey Trial”) has a life of its own, and much of that life has little or nothing to do with what actually occurred in Dayton, Tennessee during the summer of 1925, when William Jennings Bryan and Clarence Darrow met to argue the merits of the case. Lawrence and Lee’s 1955 play “Inherit the Wind” and the film based on it five years later form much of the basis for popular (but ultimately false) ideas about the trial. And of course it doesn’t help matters that science and religion have been held to be, at least in the popular imagination, mortal enemies.
In “Summer for the Gods,” Edward J. Larson retells the story of the trial stripped of all the mythology, without compromising readability or interest for the layperson. Larson is both a law and history professor, so he’s in a unique position to clarify both the historical content and the legal matters. He does a stupendous job of both.
Not that media sensationalism is anything new, but one of the things I liked most about this book is that it shows exactly how the trial was, in many ways, a Potemkin village. As soon as the Butler Act (the statute that prevented the teaching of evolutionary theory in science classrooms in the state of Tennessee) was passed, the newly founded ACLU offered to defend anyone prosecuted by the state for breaking the law. Their plan – for the case to work its way up through the courts and eventually find itself on the Supreme Court docket – didn’t quite come off.
The trial brought in names that spelled the worst kind of boosterism for the beleaguered small-town residents of Dayton, who had probably never seen the likes of the media circus they witnessed for those several days – two of the country’s best-known attorneys, Clarence Darrow for the defense and William Jennings Bryan heading up the prosecution. Darrow was fresh from defending accused murderers Leopold and Loeb, whose trial only a year before had also been breathlessly called “the trial of the century” in the media; Bryan was a decade out of his two-year stint as Woodrow Wilson’s Secretary of State, from which he resigned over the buildup to the First World War. He was a staunch progressive – back when “progressive” meant, among other things, supporting Prohibition and belief in Biblical literalism. How times change.
The issues on the table? Well, they weren’t anything resembling what recent similar cases – say, Kitzmiller v. Dover – argued. Bryan’s legal arguments really had very little to do with the merits of science or evolutionary theory. Instead, he argued on majoritarian grounds: if a state law was passed, it was obviously the will of the people and, having gained the appropriate number of votes in the legislature and been signed by the governor, it was constitutionally legitimate. It was much more of a states’ rights, or even a people’s rights, approach than the imagined epic battle between science and religion. The linchpin of the defense was to get Bryan to testify and ultimately push him into a corner about the proclaimed literal truth of Genesis. A little spoiler alert: despite Darrow’s attempt to utterly embarrass and confound Bryan by getting him on the witness stand and grilling him on the timeline of the events in the Old Testament (probably the most historically accurate part of the trial that people remember), the trial ends in a way that most people who don’t know much about it wouldn’t anticipate. The presiding judge dismisses Bryan’s testimony as irrelevant, and Scopes loses. And since Bryan’s purpose wasn’t to shame Scopes or even make him a personal target, he magnanimously offered to pay the $100 fine for Scopes’s conviction – which never had to be paid anyway, since it was overturned by a higher court.
Being one of the many whose sole knowledge of the Scopes Trial was based mostly on the play and what was casually bandied about in high school science books, I appreciated Larson’s approach, as full as it is of equanimity and balance. Larson says a few things that make it rather obvious where he falls in the “debate,” insofar as there is one (and among professional biologists, there really isn’t): he can’t help looking down a bit condescendingly on Bryan on the witness stand trying to defend his ultra-literal view of Genesis, but those of us who credit science where it is due have a hard time not having a little fun at Bryan’s expense. Go read, then watch, “Inherit the Wind.” Then, as a good counterbalance, and for some reliable history, read this. It’s one of the best books on science and religion I’ve had the pleasure of reading in a while.
These days "magic" seems quite separate from the pursuit of science; Paracelsian iatrochemistry sounds about as scientific as the use of a ouija board. But to divorce these two different kinds of practices - the art of magic, the power to conjure, to discern the occult "mathematical secrets of the universe" on the one hand, and what we would consider rigorous, empirical observation on the other - is quite ahistorical and misunderstands the spirit of science in the Renaissance. Allen Debus, professor for many years at the University of Chicago and historian of early modern science, drives this point home repeatedly in each of the general areas discussed in this book.
The topics covered are ones you would expect to find in a book summarizing the history of major scientific developments from approximately 1400 to 1650 - the study of nature (especially flora and fauna), the increased understanding of human physiology, cosmology, and a brief précis explaining the development of the scientific method generally speaking.
Many of the Renaissance humanists, most notably Paracelsus, wholly rejected the scholasticism and Aristotelianism of previous generations and wished to infuse science and the study of nature with a renewed appreciation for mysticism and alchemy. While a religious understanding of the universe was utterly central to Paracelsian science, Paracelsus simultaneously emphasized observation, which he believed had been critically neglected by Aristotle and his studious promulgators. (His interest in chemistry, especially iatrochemistry, speaks to this emphasis on observation.) In his view, Aristotle's appreciation of science had been vitiated of all divine wisdom and knowledge by his paganism; Paracelsus wished to correct for this by suffusing science with the wisdom of neo-Platonic, Hermetic, and alchemical texts. He thought that the mathematical formalism of science resembled scholasticism, and he avoided it like the plague.
Empiricism and observation critically improved a number of scientific areas, not just alchemical medicine. In the fifteenth century, crude medieval woodcuts of plants based on Pliny's centuries-old descriptions dominated the scholarship of botany. The drawings of Swiss naturalist Conrad Gesner and Italian botanist Aldrovandi were much more accurate than previous ones, and therefore could greatly benefit both botanists and physicians alike.
In the area of medicine, matters had similarly stagnated. Galen's practices predominated for a millennium after his death because of their wide use and various translations. Debus discusses the contributions of figures from Vesalius to William Harvey, the first person to accurately characterize blood flow in the human body.
This book is a wonderful introduction for two reasons: it covers the wide range of what were considered the sciences in the few centuries Debus is most concerned with without overwhelming the non-specialist reader, and it continually stresses the continuity between what we would today consider “magic” and the empirical, rational, deductive reasoning we would be more likely to associate with science. Debus does this effectively in every chapter, and as someone with a longstanding interest in the history of medieval and Renaissance science, I find it refreshing to see an author who isn’t trying to retrospectively make modern science out of something supposedly written by Hermes Trismegistus. He lets the two stand side by side in whatever tension they might have, and deals with them as they are, not as he wants them to be.
Glancing over many of the other lower ratings of this book, I’ve found that most people have already hit upon the major reasons I found it such an unsatisfying reading experience, and there were quite a few of them. To begin with, the title and the informational content of the book don’t really seem to jibe. There’s too much biographical information here, and of too many people, for the entire book to cohere in any meaningful way. The connection that one chapter has to the next is tenuous at best. For example, Gleick starts out talking about the ways in which African drummers drum in order to retain the information in a message over long distances (a fascinating way to begin talking about information as a broad subject), but then almost inexplicably jumps directly into a short history of early English dictionary-making in the next chapter, and follows that with a history of the work Charles Babbage and Ada Byron Lovelace did together, including the Difference Engine and the Analytical Engine. Connecting them is only the thinnest of threads – the work of Claude Shannon and the birth of information theory - which isn’t even substantively developed until halfway through the book. Because of this, the whole endeavor ends up being a mile wide and an inch deep.
Is it just me, or does most non-technical science- or technology-oriented writing read the way “The Information” does? The narrative net seems like it needs to be cast so far and wide that even readers who might be put to sleep by something like information theory (why are these people reading this book in the first place?) will be able to maintain their interest. This can mostly be avoided when the subject is narrowed to the life and/or ideas of one person, as in Gleick’s previous book on Isaac Newton, though I found that book a little unsatisfying for a different reason: I thought it was much too short.
To give the sense that this book wasn’t fun to read would be unfair. If you’re broadly interested in the history of science, it provides a good introduction to a number of topics: in addition to the ones already mentioned, Gleick discusses telegraphy, the birth of statistical mechanics in physics and the concept of entropy, and the rise and difficulties of quantum computing. It’s just that the star of the show – the history of how “information” has been treated as such – suffers tremendously.
I picked it up because 1) it was on the discount shelf at Barnes & Noble for a reasonable price (and if you can get it for six dollars, I would still say it’s worth investing in), and 2) I felt that my knowledge of information theory would be insufficient for a book that demanded a readership with more expertise. For those interested in something like the history of computing, this would be a wonderful place to start. Anyone expecting something more tightly focused on the likes of Claude Shannon, Norbert Wiener, their colleagues, and the development of fields like information theory and cybernetics will walk away wishing for something much more focused.
Living in a time of Dolly the sheep and bioluminescent rabbits, it’s easy to lose sight of the ever-blurrier distinction between nature and “art” (understood in the sense of anything “artificial”). Most of us are familiar with an example that Newman himself mentions – Hawthorne’s story “The Birth-Mark,” which tells of a man who tries to eliminate a small mark on his otherwise remarkably beautiful wife’s face. The moral is so universal as to be predictable: his effort to perfect the already perfect is hubris that cannot go unpunished; it is to fail to accept, in a Niebuhrian sense, our own limitedness and the sin of human nature. But one of the goals of Newman’s book is to show that this conversation is much older than the nineteenth century. It goes back at least to the myths of Icarus and Arachne.
I’ve had a longstanding interest in the history of science, but this was admittedly a bit of a blind purchase from the University of Chicago Press. The subtitle, “Alchemy and the Quest to Perfect Nature,” hinted at perhaps a bit of room for the reader of general interest, but I had no such luck. Still, the topic is fascinating in itself, and I walked away from the book feeling that I’d learned a lot about the history of the art-nature debate in its various historical instantiations.
The preface offers a nice, general introduction to the history of alchemy. (It should be noted that Newman’s use of the word “alchemy” is extremely general. Scientists locked away in a laboratory trying to transmute lead or aluminum into gold should be banished from your mind. Instead, think about any kind of transformation produced by a human being, including the visual arts and, in much more recent history, the production of human life through means other than coitus, i.e., in vitro fertilization or other methods.) In fact, Newman’s thesis is that “alchemy provided a uniquely powerful focus for discussing the boundary between art and nature” and that this whole discussion “can only be understood if the reader is willing to engage with the presuppositions of premodern philosophers, theologians, alchemists, and artists about the structure and nature of the world around them” (p. 8).
He’s not kidding, either. The heart of this book is a truly exhaustive attempt to chart the history of this idea, many of whose contributors will not be recognizable even to readers of the history of science. Some are more familiar: Ibn Khaldun, Avicenna, Averroes, and Thomas Aquinas play gigantic roles. But most are unfamiliar: Jean de Meun, Petrus Bonus, Thomas Erastus, Bernard Palissy, and Zosimos of Panopolis are just a few of the dozens. There are so many of these minor figures, whose ideas loom large in the book, that they are often picked up, left behind, and picked up again in another chapter, leaving readers to flip back and forth to orient themselves.
I’ll spare you the various transmutations and permutations of these complex arguments (they do get very complex), but Newman seems to have two important takeaways. One is that our view of alchemy as synonymous with witchcraft or other black arts is naïve and undeveloped, and that it needs to be expanded to include all of the arts in the sense described above. The second and more important one is that this intellectual conversation has a long, subtle, and storied past in the disciplines of philosophy, theology, and the natural sciences. This should without a doubt be read by someone with a narrow, serious interest in this subspecialty (if that’s you, you’re probably already familiar with Newman’s name, since he’s one of the better-known scholars in the field). But the book lacks the slightly more popular appeal that would have made it much more enjoyable, at least for this non-specialist.
Kenneth Miller, a professor of biology at Brown University, has made a name for himself in communities that are deeply concerned with the intersection of religion and science, both on the atheist/skeptical side and the religious side. He successfully manages to irritate both camps because he says that supporting evolution and holding religious belief are not necessarily contradictory. (Miller is a Catholic.) This shouldn’t be too controversial a statement for someone who has thought about the issue for more than a few minutes, but it still seems to disconcert people.
“Finding Darwin’s God: A Scientist’s Search for Common Ground Between God and Evolution” works in some ways, but it is not what it is advertised to be. Judging from the title alone, you might guess that it involves a lot of digging through Darwin’s papers for his (non)religious inclinations, and to be fair, we do get a very small amount of this. It will probably come as no surprise that Darwin was at times more and at times less conflicted about the existence of a Christian God, or even the God of deism. Earlier in his career, he was very convinced by the watchmaker analogy that the renowned English theologian William Paley set forth in his “Natural Theology,” but he seemed to grow more skeptical as the publication of “Origin of Species” approached, and certainly toward the end of his life.
First, the part of the book that I wasn’t expecting: approximately the first two-thirds is dedicated to demolishing creationist “science” (not really science at all), and particularly young earth creationism. I realize the continuing need for popularizing science education, but I was more interested in the “Finding Darwin’s God” angle than in a rehashing of the basic high school biology and chemistry we were all *supposed* to have learned. Even though this part of the book was a slog, he is extraordinarily thorough. He shows how a literal interpretation of Genesis no longer makes any sense considering what we know about morphology, radiometric dating, and the fossil record. He also equips readers who might be less familiar with pro-evolution arguments with examples, including the biochemical details of the blood-clotting cascade and the development of the eukaryotic cilium. There is also a wonderful part of the book that explains how Gould’s punctuated equilibrium only looks like a different phenomenon when you use shortened geological time scales, and that when you re-elongate those scales, you get the evolutionary tree of common descent that would have been more recognizable to Darwin himself. These couple of hundred pages are largely designed to arm the non-biologist with technical arguments to combat creationist nonsense, and they do a fine job.
The last two chapters are where Miller finally starts to explore the possible arguments for God. None of his arguments are convincing. He even says a couple of things that are embarrassing for a scientist of his caliber, like when he wanders into the field of cosmology: “…when one makes a run backwards in time to the moment before the big bang, one must imagine inconceivable amounts of mass and energy concentrated at a single point in space” (p. 225). Except that even talking about “before the big bang” makes no sense, since that very event is what created space and time as we know them. There was no time before the big bang that we know of. It’s like talking about cakes before the invention of baking. It seems that he might be trying to raise the question of what allowed the big bang to occur. A great question, and we have the greatest minds in science working on it. The current answer? We don’t know.
A bit later, Miller delves into the miraculous: “What can science say about a miracle? Nothing. By definition, the miraculous is beyond explanation, beyond our understanding, beyond science. This does not mean that miracles do not occur. A key doctrine in my own faith is that Jesus was born of a virgin, even though it makes no scientific sense – there is the matter of Jesus’s Y-chromosome to account for. But that is the point. Miracles, by definition, do not have to make scientific sense” (p. 239). This truly is a disappointing argument from someone who just spent two hundred pages arguing against creationism because it *doesn’t make scientific sense*. One of the points of science is to build heuristic models of the universe around us, or some aspect of it, that account for the most observable data. We must either reject or be agnostic about phenomena that cannot be assimilated into these models.
Miller sometimes waxes philosophical, with about as much success. On God’s eternality: “This means that God, who always has been and always will be, transcends time and therefore is the master of it” (p. 242). I realize this is a stock-in-trade argument from classical Christian theology, but it is fundamentally flawed: something cannot exist outside of time because time is a predicate of existence. To exist means to have *come into existence*. The popular formulation of this argument is when a theist asks an atheist “What caused the big bang?” and the atheist responds “What caused God?” If you’re operating under the assumption that everything needs a cause, as classical Christian theology does, saying that God is an exception to your own rule isn’t going to work. It’s a logical fallacy called special pleading.
So, why does Kenneth Miller believe in God? One reason is his acceptance of God of the Gaps arguments; he seems to be perplexed by the fact that we don’t yet have all of cosmology explained. The second is his peculiar interpretation of quantum mechanics. He thinks that the random events of quantum mechanics and the simultaneous orderliness of the universe have something to do with God, though he never comes out and explicitly states it, and never clarifies how the indeterminacy of quantum mechanics would provide evidence for God.
What kind of God does Miller believe in? In the closing lines of the book, he quotes Darwin: “There is grandeur in this view of life; with its several powers having been originally breathed by the Creator into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most wonderful and most beautiful have been, and are being evolved” (p. 292). This is awe-filled Darwin at his most wondrous. However, even when Darwin indulged these sentiments, the God he invoked is clearly the God of deism: a world set into motion by a distant, non-personal God who created natural laws and then let happen what may. It doesn’t at all comport with the fundamental tenets of the Catholic Church (the virgin birth, the Assumption, et cetera) in which Miller claims to believe.
Darwin’s God wasn’t the God of miracles, and therefore isn’t Miller’s God. He was the God of reverence for the mysteries of the universe, which have been slowly decreasing in number since the rise of modern science. This number will never reach zero; there will always be something new to learn, and science will never disappear. But looking for God in the unexplained nooks and crannies of science leaves a smaller and smaller place for Him/Her/It with each passing year, and this seems to be a theological approach in danger of having God slip through its very fingers.
When we think about the modern biological sciences, one name invariably pops into mind: Charles Darwin. Keith Thomson’s book, “Before Darwin: Reconciling God and Nature,” looks at the approximately two centuries of science that predate Darwin, partially in an attempt to see what influenced him, but mostly because it’s a fascinating history in and of itself. Thomson is almost wholly concerned with an age in which natural science (then still often called “natural philosophy”) was almost always natural theology – that is, an understanding that the study of science and nature would draw one closer to understanding the mind of God. William Paley, the eighteenth-century English naturalist whose book “Natural Theology” had a tremendous influence on Darwin’s early career, thought that the ways of God are shown to man through a rigorous and critical study of the natural world.
We get a quick, breathless account of big scientific developments from Copernicus to Newton, and see that the more we learn about the natural world, the less ground natural theologians have to stand on. Thomson rhetorically asks, “Once Pandora’s Box was opened and a new, lesser, role ascribed to God, who could predict where matters would end?” (p. 44).
The rest of the book is taken up with discussing the contributions of several scientists, many of them not nearly as recognized as they should be, including Thomas Burnet, John Ray, Robert Plot, and Martin Lister. Paley and Ray especially built an argument from design, but there was one glaring problem: it’s clear there are many things in nature that are not perfect, and that don’t look like they were designed. The human eye – commonly adduced by modern-day creationists as an example of “irreducible complexity” – has a blind spot that lacks photoreceptors and therefore would make us more susceptible to attacks from predators if we still lived out in the open. The sacroiliac region at the base of the spine is mechanically imperfect to bear our weight, which often results in back pain as we age. Someone convinced that the human body is a perfectly designed machine can’t explain the appendix, a vestigial organ for which there is no observable purpose.
What Thomson seems to be saying is that natural theology had a historical tendency to reverse engineer science to fit its own theological ends. Therefore, what we see here is not so much science as we would understand the term today, but the use of science as a kind of anthropocentric cherry-picking to shore up preformulated beliefs, namely the creation accounts (there are two of them) in Genesis. Ironically, these culminate in the work of Steno, a Danish geologist and anatomist who was blithely unconcerned with how much his own work – the work of a Catholic bishop, mind you – confirmed or denied the accounts in Genesis.
There’s tons of other fascinating stuff in here that I won’t get into, about interpretations of the fossil record (apparently people used to think that fossils just grew in place in the ground and that their resemblance to animals was purely coincidental), geology, paleontology, and what everyone thought about the Great Flood. It could also serve as a reference work if you’re interested enough in the history of natural science in the seventeenth and eighteenth centuries. It’s pretty much rekindled my long-dormant interest in the history of science.
Most people are probably passingly familiar with Franz Anton Mesmer, the eighteenth-century German-born physician and originator of what we now know as “mesmerism,” but the background that Robert Darnton (formerly of Princeton University, now head of the Harvard University Library) brings to this book puts mesmerism into not just medical and physical, but also political perspective.
Pre-Revolutionary France was peopled with scientists trying to create new cosmologies to explain the mysterious universe around them. “Science had captivated Mesmer’s contemporaries by revealing to them that they were surrounded by wonderful, invisible forces: Newton’s gravity, made intelligible by Voltaire; Franklin’s electricity, popularized by a fad for lightning rods and by demonstrations in the fashionable lyceums and museums of Paris; and the miraculous gases of the Charlières and Montgolfières that astonished Europe by lifting man into the air for the first time in 1783” (p. 10). It was a time of both experimentation and empiricism – and lots of quackery. Mesmer himself proposed that a superfine fluid pervaded the entire universe, but especially the body. “Individuals could control and reinforce the fluid’s action by ‘mesmerizing’ or massaging the body’s ‘poles’ and thereby overcoming the obstacle, inducing a ‘crisis,’ often in the form of convulsions, and restoring health or the ‘harmony’ of man with nature” (p. 4).
There were, however, institutionalized consensus positions on scientific issues, and the literary and medical journals and professional societies who held them would openly call out Mesmer on his unsubstantiated claims. Mesmer was unconcerned, though. As he said, “It is to the public that I appeal.” The accreditation and approval of official societies meant nothing to him, and he didn’t bother seeking it; rather, he wanted to bring his science to the people, let it speak for itself, and have them accept it of their own accord.
However, mesmerists didn’t think that mesmerism’s power stopped and started with the body. Instead, they suggested that the health of the body was related to many other things, including mental health, morality, and even the possibility for political change. Darnton details some of the more important people of Mesmer’s inner group, and the splitting into factions that eventually occurred. One of the factions, led by a man named Bergasse, “developed the social and political aspects of his theory – his own ideas about ‘universal morality, about the principles of legislation, about education, habits, the arts, etc.,’” (p. 78). “Carra [another one of the breakaways from Mesmer’s official doctrine] and his friends, especially Bergasse, dealt with the cosmological side of mesmerism by extracting a political theory from the obscure, strictly apolitical pontifications of Mesmer. ‘Political theory’ may be too dignified a term for their distortions of his ideas, but they themselves considered their theories consistent and reasonable, and the police viewed them as a threat to the state” (p. 107).
What was it in mesmerism that appealed to the radical mentality before the Revolution? The mesmerists began to think that the professional, academic journals and societies had formed a kind of anti-democratic coterie whose job it was to marginalize legitimate scientists with valid ideas. In other words, some mesmerists began to see establishment science as what could be described, for lack of a better term, as an “elitist” enterprise. Science had no One Right Answer, and ridiculing little-known scientists for their ideas was no better than what Louis XVI was doing; science and political theory – namely, democracy – had collided.
Obscure as it sounded, the ideas of Carra and Bergasse took Mesmer’s theories to their logical conclusions: unjust legislation, just like a bad moral disposition, “disrupted one’s atmosphere and hence one’s health, just as physical causes could produce moral effects, even on a broad scale” (p. 108). By construing Mesmer so liberally (and so inaccurately), Carra, Bergasse and others were able to cast a single net around the worlds of science, ethics, and revolutionary politics. “By injecting a Rousseauist bias into a mesmerist analysis of the physical and psychological relations among men, Bergasse saw a way to revolutionize France. He would reverse the historical trend of physico-moral causality, reforming institutions by physically regenerating Frenchmen. Improved bodies would improve morals, and better morals would eventually produce political effects” (p. 124).
I just happened to read this soon after finishing George L. Mosse’s “Confronting the Nation: Western and Jewish Nationalism,” which has a few chapters that discuss fascism and its relation to nationalism. In one of those chapters, he pinpoints the French Revolution as the historical event that allows movements like fascism to eventually develop, especially with the mass mobilization of politics. Although Darnton never explicitly suggests this, his book seems to be solid evidence of Mosse’s thesis. Mesmer choosing to ignore scientific consensus and saying “I wish only to convince the public,” his conspiratorial view of well-known scientists trying to crush and demolish him, and the collusion of science and politics (especially more race-related “science” as we get into the nineteenth century) all have strong lines of continuity with what we will later call fascism. For anyone interested in how science, ideology and politics can become so easily and terribly entangled, I found this to be a wonderful case study.
But it’s just as good for those interested in the more pedestrian history or sociology of science. Darnton’s background in eighteenth-century European (especially French) history was essential for building the picture that he does, and for drawing the conclusions that he convincingly reaches. For those interested in something along the same lines but a bit more popular, Darnton’s “The Great Cat Massacre,” which I have also reviewed on this site, is a wonderful and equally insightful collection of essays on early modern French cultural and literary themes.
Ernest Gellner (1925-1995) was a French-born Czech-Englishman whose interests are as varied as his string of ethnonyms would suggest. In addition to holding well-known chairs in sociology, anthropology, and philosophy, he was also interested in the methodological foundations of science, the political culture of Islamic societies, and dismantling what he considered to be three of the biggest con-games that have taken in intellectuals of the twentieth century: postmodernism, Freudianism, and Marxism. Perhaps his best-known book is “Plough, Sword, and Book.”
This book appears in a University of Wales series called “Political Philosophy Now,” and aims to summarize Gellner’s oeuvre. Lessnoff does a competent job at this, even if his approach isn’t nearly as witty and sharp as Gellner’s notoriously was. His delivery is flat and academic, but he’s clearly very familiar with Gellner’s work, and especially the conversations in which Gellner was intellectually engaged. Since I haven’t read any of Gellner’s original work, I can only assume that his interpretation of Gellner is accurate. He’s certainly not an apologist for Gellner, and openly criticizes him when he feels it is necessary.
I won’t discuss all of the topics here, but I thought that some of Gellner’s work deserved particular attention. The best part of the book is the last chapter, called “Relativism and Cognitive Ethics.” Cognitive ethics is, as I understand it, essentially Gellner’s way of defining intellectual honesty, and is loosely synonymous with the scientific standards of testability and falsifiability in the Popperian sense. He accuses Freudians and Marxists of lacking this cognitive ethic, because embedded in these systems are ways of deflecting all criticisms. If you’re not a Freudian, you’re simply in a state of false consciousness (note the similarity to Marxist rhetoric); you’re in denial of Freud’s truth. If you deny Marxism, you’re a useful idiot for the bourgeoisie, blind to the alienating effects of capitalism. Basically, all these systems (he goes on to critique postmodernism along the same lines) have internally coopted all criticisms, and therefore completely protect themselves from attack. They’re unfalsifiable, and therefore necessarily unscientific – which is a problem when many of their practitioners wear the cloak of scientific respectability.
There are also chapters on nationalism, Gellner’s theory of history (as presented in “Plough, Sword, and Book”), politics in modern society, and a blistering attack on the linguistic philosophy popular at Oxford during the middle of the century, especially that of Wittgenstein (found in “Words and Things”). The only chapter that I didn’t find convincing was the one on Islamic society in which he states, quite oddly, that theocracies are particularly adept at conforming to modernist ideals and suggests a distinction between high and low Islam. This was counterintuitive at best.
Lessnoff’s book is a great survey of Gellner’s life’s work. I would certainly suggest it for anyone interested in reading one of Gellner’s books, many of which seem difficult but very rewarding.
Roy Porter is mostly known for his books on the history of medicine and the development of medical practice in Europe. “Madmen” is Porter’s attempt at outlining the changes in the care of “lunatics” (as the subtitle puts it), mostly during Georgian England. The book traces the different approaches to various mental illnesses from the time of humoralism up until the birth of what can be recognized as modern-day psychiatry in the early nineteenth century.
Porter begins by challenging Foucault’s concept of the Great Confinement, in which unreasonable members of society were institutionalized in large numbers. According to Foucault, before the Great Confinement folly had “a liberty and truth of its own, engaging in a dialogue with reason” but afterwards became disqualified, abominated, and reduced to pure negation (unreason). Foucault also maintains that it was mostly the poor who were institutionalized by the rising middle classes. Porter challenges this as historically inaccurate, at least in England; instead, the process was slow and gradual. Also, “it is a key contention for Foucault that the Great Confinement was driven by the powerful to police the poor … but it would be a mistake to underestimate the numbers of bourgeois, gentry, and nobility who were also being confined” (p. 21).
Porter gives an historical account of the four-fold humoralism (blood, phlegm, yellow bile, and black bile) as a way of explaining how rational, mortal men could attain balance with the cosmos; aetiologies of sickness were also explained as an imbalance between the humors well into the eighteenth century. Even though there was a medical tendency to somatize mental illness, elements in the culture (including Robert Burton, author of “The Anatomy of Melancholy,” and himself a renowned melancholic) portrayed it as a kind of psychomachia – literally, a battle between the rational and irrational parts of the soul. In later Georgian England, insanity became something to be pitied, aided by Locke’s conception of madness as a false association of ideas, instead of “the overthrow of noble reason by base passions.”
While confinement of the insane did exist before around 1800, it was in private institutions, and sometimes in churches. It wasn’t until after this date that the state began to demand confinement for lunatics in subsidized asylums. Before this, going back even to the Restoration, many madhouses were actually private residences – indeed, that’s even how the word “madhouse” came about – in an age before licensing and regulation. These enterprises could be highly lucrative for the people running them, since the owners could mandate that the lunatics stay there indefinitely while they collected the money from the lunatics’ family members. The Act for Regulating Private Madhouses of 1774 went some way toward protecting the mentally fit from being wrongly confined, but patients would have to wait until the 1840s for legislation that attempted to supervise their living conditions and quality of care.
During the first part of the nineteenth century, there grew to be a body of treatments which we can increasingly recognize as psychiatric. “In particular, currents in metaphysics and medicine were proposing fresh paradigms of mind and body, behavior and self, and thereby opening a new field eventually to be denominated the psychiatric. For this, the catalyst proved to be the associated emergence of bricks-and-mortar institutions for lunatics; for the presence, for the first time, of concentrations of patients isolated in madhouses encouraged close ‘scientific’ surveillance of delusions and delinquencies, stirring the clinical ‘psychology’ of the disturbed. This hitherto unparalleled scrutiny of lunatics under controlled conditions, particularly while interacting with keepers, formed the matrix for practical (experimental) discipline of managing madness” (p. 178).
Much of the book, especially the second half, was bogged down in doctors and case studies of individual patients, which really detracted from the bigger picture that Porter is trying to illustrate here. This usually isn’t a problem for me, but it was the equivalent of zooming in too close on a photograph and losing focus. Had it not been for these minutiae, the story could have been much more effective. However, because of my interest in the topic, I still want to read Porter’s “The Greatest Benefit to Mankind: The Medical History of Humanity.”
For at least a century, the compatibility of science and religion seems to keep popping up as a perennial question demanding our renewed attention. Prevalent among pretty much all non-scientists and even the vast majority of practicing scientists is the rather naïve idea that science is an objective set of facts that have come about through a purely positivistic, empirical search for knowledge about the universe. With this view often come its corollaries, like the idea that science is an activity which is totally divorced from other stories and mythologies that we weave about ourselves, wholly objective and cut off from religious or mythological assumptions we have about human nature. In this book, Mary Midgley – always the ideological shit-stirrer when it comes to the sacred cows of science – wants to argue that science is actually much more complicated than this.
Rather than being an objective pursuit apart from other human interests, many forms of science actually show themselves to be closely tied up with grander stories that we tell ourselves which transcend the boundaries of normal science. To quote Midgley, “I had been struck for some time by certain remarkable prophetic and metaphysical passages that appeared suddenly in scientific books about evolution, often in their last chapters. Though these passages were detached from the official reasoning of the books, they seemed still to be presented as science. But they made startling suggestions about vast themes such as immortality, human destiny, and the meaning of life.” Relating to evolution, Midgley is particularly critical of two popular trends: we can call them the Escalator Fallacy (the optimistic one) and the Meaningless Speck (the pessimistic one). The Escalator Fallacy, offered up in various forms by names as diverse as Herbert Spencer, Lamarck, and Karl Marx, says that so far evolution’s highest and most profound achievement is the human being, and that over time, we will only grow in physical strength, intellect, creativity, awareness, etc. On the contrary, the idea of the Meaningless Speck, espoused by the likes of the famous physicist Steven Weinberg, holds that the more we know about the universe, the more pointless it seems to become, but that science provides a soupçon of solace and consolation, the kind that “lifts human life a little above the level of farce, and gives it some of the grace of tragedy.” Needless to say, despite both of these ideas being expressed by many well-known scientists, neither of these conclusions is exactly what we would call “scientific.” Rather, they are very much mythical ideas about our place in the universe that, if we’re not careful, become imbricated in the practice of science itself, and therefore actually seem to become equivalent in truth-value to the claims of science.
Midgley is also critical of the conclusions that scientists often draw about life from a misguided understanding of evolutionary mechanics. For example, she rakes Richard Dawkins over the coals for coining the term “selfish gene,” because she thinks it’s silly to impute descriptors of animal behavior to long chains of sugars, phosphates, and bases. (Of course, it is not the selfishness of the gene that helps it survive at all, she argues, but rather that the gene creates an animal better-suited to its environment and therefore much more likely to pass that gene on to subsequent generations.) However, when Dawkins imports the language of human intentionality, Midgley thinks he’s promoting the “worship of competition. It is projecting a Thatcherite take on economics on to evolution. It’s not an impartial scientific view; it’s a political drama.” (And by using the word “misguided” in the first sentence of this paragraph, I’m not so much suggesting that Dawkins has a misguided understanding of evolution – needless to say, I respect and value his opus of scientific work, not to mention his tireless work to popularize scientific ideas. But by using the adjective “selfish,” he is consciously choosing language which makes it seem as if genes are thinking, breathing, cognizant things.)
She ends the book in much the same way that the books she criticizes do, however – namely, by drawing a far-overreaching generalization from the relatively small body of examples that she has considered. Because of the stories that scientists overlay on evolution (sociobiology is also considered in the book), she says that science is not really a realm that values logic, reasoning, and deduction more than any other epistemic pursuit, and that science is just “one more way of knowing,” along with poetry or religion.
I’m sorry, but this simply will not do. Whereas orthodox religion has constantly been shoved further into the corner in light of scientific and technological advances, science continues to be the one self-correcting process that can render accurate, reliable information about the world. This is not to say that it or even its most advanced practitioners are without fault, nor are they ever able, by definition, to escape the subjectivity of their own minds. But to go from noticing that some scientists occasionally graft and interweave conclusions that can be considered non-scientific into popular explanations of their work to assuming that therefore there can be nothing we can even begin to consider with a large degree of objectivity is the very definition of poisoning the well.
Because of the flawed nature of logical induction and human error, science has made and will continue to make many mistakes. But here’s the kicker: science is self-correcting. Scientific ideas are never considered to be Truths (capital “T”) as the truths of religion or the Truth and Beauty of Keats are. Scientific models that work are always provisional, and therefore always up for revision and sometimes complete and total overhaul if they fail, or become incapable of explaining a particular phenomenon. Nothing remotely similar can be said to be the case for other methods of exploring the universe around us – especially religion.
Mary Midgley passed away last year on October 10, 2018 at the age of 99. Much of what she said angered and provoked the scientific establishment, and much of what she said I think is partially wrong or at least overstated. However, I’ve never failed to find at least a glimmer of something thought provoking in her work. She was the kind of person the world needs more of: a provocateur never afraid to ask hard questions and even throw the occasional grenade. Do yourself the honor of picking up something by her. Whatever it is will almost certainly challenge the way that you see and think about the world. In the past, I’ve reviewed Wickedness, her book on the nature of human morality. You can read it here: https://www.goodreads.com/review/show...