Afterlives


The belief in an afterlife is perhaps the most contested, yet most powerful, of human belief systems. For centuries, religions have held the strongest claims for what happens to the human soul when it departs from the body at death.[1] Judaism presents several ideas about the afterlife, and, while there is no hell as understood in Christianity, heaven (known by many names in the rabbinic tradition) will be the eternal home of any righteous person, for it is one’s actions in life that secure one’s place in the afterworld. In Islam, the hereafter is also complex; the soul may enter heaven or hell, but some also believe it may enter an intermediate sleep until a great resurrection comes. In Christianity, believers come into the presence of God at death and, based on God’s judgment, will be met with the reward of eternal heaven or the punishment of eternal hell.[2] Buddhists take a more cyclical approach to the hereafter, wherein death leads to rebirth, and through karma and enlightenment the soul can escape this cycle and achieve nirvana, the absence of suffering. Thus, belief in an afterlife directly influences our ethical codes for behavior and our sense of life’s purpose. It simultaneously forms and is formed by our living, breathing experiences. In her 2005 book Spook: Science Tackles the Afterlife, Mary Roach travels the world to discover the facts behind what happens when we die. From an ethnographic account of India’s belief in reincarnation to her narrative of meeting with a medium in Arizona, USA, Roach explores everything from weighing the soul to communicating with ghosts. In the end, Roach illuminates the power of belief itself when she writes, “[t]he closer you are to the teller of a ghost story, the more likely you are to believe that the ghost in the story was a ghost, and not a raccoon or a temporal lobe seizure. Your beliefs are not formed by researchers or debunkers or television psychics, unless perhaps one of them is your mother or your good pal. Your beliefs are formed by your own experiences and those of your inner circle. And then validated by the researchers or the debunkers or the television psychics.” After all that research, we might wonder what this science writer herself believes about the afterlife. Let’s just say those in her inner circle have told her a lot of ghost stories.

 

[1] Of course, this belief depends upon another belief—the belief in Cartesian Dualism, wherein we accept the premise that the body and the soul are indeed separate entities. This is opposed to the Materialist belief that the human soul, or consciousness, resides in the body as a whole or in the brain (itself still the material of the body), and that body and soul are ultimately inseparable. The latter view (Materialism) no doubt stirs strong disbelief, rooted in the fear that there is no life after death.

[2] It is important to note that the rhetoric of the afterlife promises immortality and thus negates, or at least softens, the threat of death’s unknown territory.



The Ancient Egyptians were among the first to believe that one’s actions in life would determine one’s place in the afterlife. The deceased’s heart was weighed against a feather: Ma’at, goddess of truth and justice, presided as Anubis made sure the scales were true, and Thoth recorded the results as Osiris judged.

Hybrid Beliefs: Sacred and Secular


For at least 100,000 years, humans have been attracted to certain places associated with extraordinary events. These centers[1] were perhaps first associated with supernatural appearances, sacred powers, or simply areas where a sacred presence was manifested. During the Neolithic period, the first man-made centers were constructed, a key example being Stonehenge in Wiltshire, England. When the first civilizations were built, centers became architectural syntheses of religious and political power, given that the kings whose dominion controlled these city-states and empires were believed to be divinely engendered. The Romans were perhaps the first civilization to invest sacred value in what were considered secular institutions, like the military. It is from Rome that we have the beginning of the military monument for the remembrance of heroes who sacrificed themselves for the Patria, the fatherland. Sacred and sacrifice originate in the Latin sacro, “to set apart as hallowed.”

Many of these monuments were located on what was known as sacred ground, battlefields where the dead were often buried: hallowed ground. These military memorials still function as sites where secular and at least tacitly religious beliefs are merged. Silence and general reverence are communicated, and a church-like hallowedness is practiced. Remembrance and honor are combined with sacred consciousness and shrine-like ornamentation.

Today, on Memorial Day, many of these centers are being visited (at a distance, due to COVID-19) by those who remember, paying homage to those who sacrificed for a greater cause or belief. Many there will unconsciously disavow the liminal indifference between the sacred and the secular.

[1] For centers, see David Summers, Real Spaces: World Art History and the Rise of Western Modernism (London, 2003). 

The Structure of Scientific Belief(s)


When Thomas S. Kuhn, who would later teach in MIT's Department of Linguistics and Philosophy, published his landmark study The Structure of Scientific Revolutions in 1962, postmodernism had yet to be invented. Nonetheless, Kuhn's thesis that scientific knowledge was based on research models established through the consensus of the scientific community, and that "paradigm shifts" between those models reset the criteria for subsequent scientific knowledge, in many ways began the critique of science that would inform many postmodern philosophical positions. The most shocking conclusion of Kuhn's work, to the orthodox scientific community, was that scientific knowledge is based on subjective beliefs and that objectivity is a consensus constructed by the scientific community. Since logical positivism had become part dogma and part cult in England and America from the 1920s on, nobody had dared to suggest that science might be just another belief system; after The Structure of Scientific Revolutions, however, science studies became a critical component of any serious philosophical discussion. With the digital revolution and the cult of quantification, which has infected all parts of human life, Kuhn's work, and the critiques of science it engendered, grow more relevant with each passing day.

Prophecy: A Very Short History

John Collier, Priestess of Delphi (1891)


Perhaps the turning point between earlier hominids and Homo sapiens, about 100,000 years ago, was the awakening of fear of the future. As far as we can surmise, human beings are the only creatures who have developed a consistent sense of fear of the unknown. Along with the consciousness of death, early humans acquired the concomitant consciousness of time, and along with these temporal obsessions, humans generated concepts of the sacred and of guilt. The coincidence of these five developments is undoubtedly related to the common thread of fear.[1] The sense of fear necessitated belief in some predictor of the unknown: an assurance that some special individual could foretell and explicate future events.

As soon as humans developed a profound concern about future events, there emerged the prophet, literally someone who “speaks before.” The first prophets were probably medicine men, given that prophecy has always been related to divine inspiration. Divine inspiration in all its forms (madness, sacred communication, and all variants thereof) has been connected to intoxication. The various ritualistic formulas for intoxication would have required knowledge of fungi, plants, oils, and, after the invention of agriculture, the complexities of fermentation. Knowledge of intoxicants, healing power, the pharmakon,[2] and the power of prediction have been primordially linked since Paleolithic times.

One of the earliest and most enduring instances of the prophet is the Oracle at Delphi, the great prophecy center of the high priestess of Apollo, the Pythia. At Delphi, prophecy was aided by the intoxicating fumes that surfaced from within the earth on the spot where the oracle sat upon the tripod chair in the shrine. Oracles, “frenzied women from whose lips the god speaks,”[3] originated in the ancient Near East, and they are connected to the prophets in the Torah who, among other things, predicted the coming of the messiah. In the monotheistic tradition, “The Last Prophet” is Mohammed.

Today, through digital dissemination, we have a plethora of itinerant prophets, from the ever-present eschatological Evangelical apocalyptic prophet to zodiac prognosticators, online psychics, Nostradamus aficionados, political forecasters, and relationship oracles. Today the intoxicant is, of course, the algorithm rather than an ingested potion. The fear of the future is alive and well, and along with it, the need to believe in someone, or something, that can alleviate that fear. Let us not forget the motivation of all human gamblers, from sports to the stock market: they are habitually wagering on prophecy.

[1] The seminal studies remain Rudolf Otto, The Idea of the Holy, trans. John W. Harvey (Oxford, 1958); Mircea Eliade, A History of Religious Ideas, Vol. 1: From the Stone Age to the Eleusinian Mysteries, trans. Willard R. Trask (Chicago, 1978); and René Girard, Violence and the Sacred, trans. Patrick Gregory (Baltimore, 1977). The connection between primordial fear and a more radical notion of the human “fear of freedom” has been tangentially discussed by Sade, Nietzsche, Freud, Frazer, Georges Bataille, and Maurice Blanchot, among others.

[2] For the pharmakon, as both a healing and a poisoning force, and the related figure of the pharmakos, see Jacques Derrida, “Plato’s Pharmacy,” in Dissemination, trans. Barbara Johnson (London, 1981), 61-172; René Girard, The Scapegoat, trans. Yvonne Freccero (Baltimore, 1986); and Girard, Violence and the Sacred, op. cit.

[3] Walter Burkert, Greek Religion (Harvard, 1985), pp. 116-18. In Rome, prophets were augurs, those who predicted the future through their interpretation of the flight of birds, the art of augury.

 

The Accidental Birth of Fake News


The reaction to Orson Welles' Mercury Theater on the Air radio adaptation of H. G. Wells' The War of the Worlds is well known and has been studied as perhaps the first instance of fictive media.[1] On another plane than realism in fiction and cinema, fictive media uses the available technological forms to recreate not life, as in the operation of mimesis, but the reportage of life through the non-artistic and seemingly trusted medium of news, documentary, or reportage. The roots of this manipulative genre go back to early modern journalism, where false stories were reported by pamphleteers disguised as journalists. The most large-scale examples of this pre-20th-century practice were the numerous libelles published in France during the Ancien Régime.[2] Today we identify the libelles as political propaganda, although their enormous influence on the French people was based on their supposedly journalistic credibility. What distinguishes modern fictive media, which was perhaps inaugurated by Welles' broadcast, is that its veracity is grounded in the technological advancements of modern media.

Welles was fascinated by the medium of radio, the technology of which allowed not just realism but also imaginative involvement by the listener; he believed that, because of this dimension of imaginative participation, radio could perhaps have a greater dramatic force than theater. This imaginative element is crucial for understanding the powerful reaction that thousands of Americans had to the broadcast. Listeners were able not only to hear what seemed like real radio reportage, but the medium of radio also allowed them—forced them—to visualize what was being reported.


As Hadley Cantril writes in the introduction to The Invasion from Mars, which remains the best sociological study of the event, 27.5 out of 32 million families in the United States had radios in 1938, “a greater proportion than ha[d] telephones, automobiles, plumbing, electricity, newspapers or magazines." Additionally, "The radio was their principal, in some cases the only source of information about the wider world. They were accustomed to trusting it; why should they doubt the familiar voices, describing events in their familiar manner?"[3] It was Welles and the Mercury players, having already masterfully exploited the technology of radio during numerous earlier broadcasts of literary adaptations, who fortuitously made a Martian invasion horribly believable. The irony—the lesson of which is hardly irrelevant now—is that it was modern technology that, instead of stifling the fancies of the imagination, actually opened the door to flights of terrifying imaginings. As is the case at this moment in history, we are dulled into an objectivity in which technology is taken for granted; as Goya's etching illustrates, "The Sleep of Reason Produces Monsters."

[1] The broadcast was on Sunday, October 30, 1938, a day before Halloween. The original adaptation, dramatized in the form of news bulletins, was written by Howard Koch. Welles, as well as the Mercury Theater players, thought the radio play was at best “silly.” Many thought it was a ploy by Welles to take attention away from their torturous rehearsals and production problems for the upcoming production of Danton’s Death in the Mercury Theater. The background to War of the Worlds is brilliantly described in Simon Callow, Orson Welles: The Road to Xanadu (New York, 1995). 

[2] The seminal studies on the libelles are by Robert Darnton: The Literary Underground of the Old Regime (Cambridge, 1982) and The Revolution in Print: The Press in France, 1775-1800 (Berkeley, 1989).

[3] Hadley Cantril, The Invasion from Mars: A Study in the Psychology of Panic (Princeton, 1940), quoted in Callow, Road to Xanadu, pp. 402-3. 

Sugar Pills & The Mind

It’s easy to confine belief to the religious and spiritual realms, but the placebo’s powerful effect reminds us that there is science behind belief as well. In order to determine whether the pharmacological effects of a drug treatment are working to reduce a patient’s pain or other symptoms, clinical studies will include the administration of placebos, ranging from saline injections to sugar pills, creating a control group against which the real drug can be measured. In many cases, patients in the placebo control group will report a reduction in symptoms. Doctors and psychologists agree that this result is due to the patient’s belief that the treatment will have a positive effect. Initially, it was thought that the placebo worked because its inefficacy was hidden from patients; however, recent studies show that even patients who know they are being given a placebo will still report positive effects as long as they have been told, and therefore believe, they will experience them. Yet it wasn’t until the 1950s that placebos became a standard part of clinical trials to test new drugs. Whether it was due to the real drugs not being available or to patients imagining that they were sick, fake drugs and even “sham surgeries” have been part of medical practice throughout history. From the Latin, placebo means “I shall please,” so is it any wonder that doctors decided to use their patients’ powers of belief to cure them?
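
The control-group logic lends itself to a quick illustration. Below is a minimal sketch in Python, with invented effect sizes rather than data from any real trial, showing why a drug’s true effect is estimated as the difference between the two arms: belief improves the reported outcome in both groups alike.

import random
import statistics

random.seed(42)  # reproducible toy example

def simulated_pain_reduction(belief_effect, drug_effect, n):
    # Each patient's reported pain reduction (0-10 scale) mixes the
    # expectation-driven placebo response with any real drug effect.
    return [min(10.0, max(0.0, random.gauss(belief_effect + drug_effect, 1.5)))
            for _ in range(n)]

# Hypothetical numbers: both arms share the same belief-driven effect.
placebo_arm = simulated_pain_reduction(belief_effect=2.0, drug_effect=0.0, n=100)
drug_arm = simulated_pain_reduction(belief_effect=2.0, drug_effect=1.5, n=100)

print(f"placebo arm mean reduction: {statistics.mean(placebo_arm):.2f}")
print(f"drug arm mean reduction: {statistics.mean(drug_arm):.2f}")
# The drug's real contribution is the difference between arms, not the raw
# improvement in the drug arm, since belief lifts both groups.
print(f"estimated drug effect: {statistics.mean(drug_arm) - statistics.mean(placebo_arm):.2f}")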

The Tooth Fairy's Extractable Roots


The loss of one’s first baby tooth has been universally recognized as a rite of passage for millennia—well before the Tooth Fairy began leaving coins beneath the pillows of children young enough to believe in Santa Claus. And compared to the long tradition of Saint Nick, the Tooth Fairy’s roots reach back only to the late nineteenth century.

Teeth are symbolic of strength, beauty, vitality, potency, and wisdom, and are often considered magical, as Theodore Ziolkowski notes.[1] Citing ancient and modern myths and practices, including Cadmus’s sowing a plain with dragon teeth to cultivate a race of warriors and James Frazer’s account of an African tribe that kills its ruler if he loses a tooth, Ziolkowski writes, “Because of the virtue inherent in teeth, they must not be allowed to fall into the hands of one’s enemies; extracted teeth should be buried or hidden, a superstition still evident in the practice of mothers who carefully save their children’s baby teeth.”[2]

Ceremonial disposal of teeth has taken many forms, as B.R. Townend outlines.[3] These rituals include throwing the tooth (into the sun, into a fire, between the legs, or onto a roof), hiding the tooth (in a tree, in a wall, or in the ground), offering the tooth to an animal (such as a rodent or crow), or swallowing the tooth (by the mother or child). The form of the disposal depends on a culture’s beliefs, as science writer and researcher Rosemary Wells explains, but the ritual is usually accompanied by an incantation, “an audible plea for help to get a new and better tooth to replace the lost one…”[4] For example, by offering one’s tooth to a rat, a popular tradition in much of Europe and Latin America, one hopes the new tooth will be as strong as the ever-growing rat’s tooth, which Michael Hingston explains is “a wish for transference anthropologists call ‘sympathetic magic.’”[5] But for those looking for a direct link to the American Tooth Fairy, things are more complicated.

While the tradition of exchanging teeth for money dates back to the medieval Norse “tooth fee,” these other traditions of exchange are not quite like that of the American Tooth Fairy. In many medieval traditions, teeth were placed beside sleeping children to protect them from malevolent pixies and water spirits, such as “Jenny Greenteeth.”[6] In Italy, it is not a fairy but the toothless witch Befana (and the Venetian Marantega) who gives children presents during Christmas and coins for their lost teeth, perhaps in hopes of filling her own smile.[7] The British custom of giving “Fairy Coins” to sleeping servant girls doesn’t involve teeth, but it does include a fairy that exchanges hard work for money.

Most researchers locate the Tooth Fairy’s origin in France and cite La Bonne Petite Souris as the possible inspiration. In this folktale, a queen escapes from a bad king with the help of a fairy disguised as a mouse. The small mouse knocks out the king’s teeth and hides them under his pillow before having him assassinated—a tale that perhaps resonates more with the tribe described by Frazer than with America’s beloved Tooth Fairy. But coupled with the long Irish and British traditions of benevolent fairies, the American Tooth Fairy evolved into her own figure.

Though referenced in literature much earlier, the American Tooth Fairy didn’t gain traction until the late 1940s, when post-war prosperity gave families the ability to focus on their children.[8] This was also a time when children were raised on Disney movies with characters like The Blue Fairy and Tinker Bell. Finally, in 1979, the Tooth Fairy fluttered into the pages of the World Book Encyclopedia.[9]

And while economic shifts helped solidify the link between the magic of the Tooth Fairy and the magic of capitalism, there is perhaps a greater lesson to be learned than that everything—including one’s own body—can be converted to cash. Some Tooth Fairies are said to pay more for cleaner teeth, giving the child an incentive for good dental hygiene. And harkening back to earlier traditions more rooted in magic and religion, the Tooth Fairy’s arrival coincides with the child’s passage into a new life phase.

While some scholars link the loss of a child’s first tooth to the resolution of the Oedipal conflict, Wells offers an interesting insight into the way this rite of passage takes a child through the significant phases outlined by Arnold van Gennep: separation, transition, and incorporation. For American children who celebrate the loss of each baby tooth, not just the first one, this process is repeated over years, and by the time the child loses his or her last baby tooth, usually around the age of ten, he or she no longer believes in the Tooth Fairy. Perhaps this only intensifies folklorist Tad Tuleja’s point: “In an economy where the ultimate magic is the power of money, the responsible parent ironically prepares the child for reality by encouraging a fantasy that Wells appropriately calls ‘a reassuring image of good capitalist values.’”[10]

[1] Theodore Ziolkowski, “The Telltale Teeth: Psychodontia to Sociodontia,” PMLA, vol. 91, no. 1, Jan. 1976.

[2] Ibid., pp. 11-12. 

[3] B.R. Townend, “The Non-therapeutic Extraction of Teeth and Its Relation to the Ritual Disposal of Shed Deciduous Teeth,” British Dental Journal, 1963. See also Michelle Konstantinovsky, “Cash for Teeth: The Legend of the Tooth Fairy,” HowStuffWorks, Feb. 6, 2020. https://history.howstuffworks.com/history-vs-myth/tooth-fairy.htm

[4] Rosemary Wells, “The Making of an Icon: The Tooth Fairy in North American Folklore and Popular Culture,” The Good People: New Fairylore Essays, ed. Peter Narváez, The University Press of Kentucky: Lexington, 1991, p. 428.

[5] Michael Hingston, “Don’t tell the kids: The Real History of the Tooth Fairy,” Salon, Feb. 9, 2014. https://www.salon.com/2014/02/09/dont_tell_the_kids_the_real_history_of_the_tooth_fairy/

[6] Tad Tuleja, “The Tooth Fairy: Perspectives on Money and Magic,” The Good People: New Fairylore Essays, ed. Peter Narváez, The University Press of Kentucky: Lexington, 1991.

[7] Ibid.

[8] Ibid.

[9] Kristina Killgrove, “Where Did the Tooth Fairy Come From?” Forbes, Sept. 14, 2016. https://www.forbes.com/sites/kristinakillgrove/2016/09/14/where-did-the-tooth-fairy-come-from/#2bcf949159d4

[10] Tad Tuleja, “The Tooth Fairy: Perspectives on Money and Magic,” The Good People: New Fairylore Essays, ed. Peter Narváez, The University Press of Kentucky: Lexington, 1991, p. 418. 

Bad Penny


While the proverb “a bad penny always turns up” refers to an undesirable person or thing that will always return, the metaphor of the “bad penny” stems from the very real problem of counterfeit or debased coins finding their way into our currencies. King Alyattes of Lydia (in present-day Turkey) is credited with minting the first coins around 600 B.C. Since then, coin clipping, the practice of shaving the circumference of precious-metal coins while continuing to circulate them at face value, has been practiced by individuals and governments alike, effectively creating inflation. The practice continued into the mid-20th century, until we stopped making coins from soft, valuable metals such as silver and gold and began making them from cheap, hard metals like copper and copper-nickel alloy. Of course, governments now simply print more paper money, which decreases its purchasing value. No matter the economic system, money is one of those human inventions we have come to believe in without really questioning what it’s worth…until bitcoin, perhaps. No centralized government or bank administers this peer-to-peer cryptocurrency, and while some people believe bitcoin is nothing more than a Monopoly-money fad, in 2017 the University of Cambridge published research estimating that most of the 2.9 to 5.8 million users with cryptocurrency wallets were using bitcoin. With the advent of cryptocurrency, it’s tempting to believe the literal problem of the bad penny may disappear. Still, cryptocurrency works like cash and is subject to robbery, just like those precious ancient and medieval coins. Like the stock market, money is a mysterious and enchanting thing. Like religion, it is an institution that can live or die depending on the effectiveness of the scaffolding of belief.

The Religiosity of American Sports

The Lutheran Church in the Foothills is often called The Church of the Holy Touchdown.

The relationship between sports and religion in America goes beyond praying for your team and common superstitions. American sports have a rich history of specific teams and iconic moments that resonated, and continue to resonate, with religious connotations. The term "fan" originated as a shortening of "fanatic," a religious zealot (from the Latin fanum, "temple"). The extreme connection between fans and their teams often manifests itself as a kind of religious devotion. Alongside deep personal and collective identification with a team is the belief that one's team is supernaturally destined.

Baseball perhaps has the most extended history of religiosity of all American sports. Its spatio-temporal parameters are unique: alone among major sports, baseball's spatial dimensions are not standardized but differ from park to park, and its temporality is gauged not by any mechanized clock but by immanent elements (outs), so that in theory a game can go on forever. These qualities have inspired numerous metaphysical speculations concerning the game's mythological aspects. (Baseball writer Tom Boswell called baseball America's mythology.)

The first historical instance of a divinely influenced miracle in American sports was the 1914 "Miracle Braves," who went from last place in the National League at midseason to winning the pennant and then sweeping the highly favored, defending-champion Philadelphia Athletics in the World Series. Fifty-five years later, the New York Mets received the moniker "Miracle Mets" after going from ninth place the previous year to first in 1969, culminating in their upset of the heavily favored Baltimore Orioles in the World Series. It was in 1969 that the Mets also inaugurated the phrase "You Gotta Believe," famously displayed by "The Sign Guy" during the Mets' next miraculous pennant run four years later, in 1973. (Fans with signs were rare at that time.)

Believing in the impossible with an almost religious fervor is most poignant when related to miracle comebacks, singularly incredible plays, or unfathomable upsets. Famously, as the clock ran out in the "Miracle on Ice" upset of the USSR by the US at the 1980 Olympics, ABC announcer Al Michaels uttered, "Do you believe in miracles? Yes!"

In 2004, after suffering through 86 years of the so-called "Curse of the Bambino," Boston Red Sox fans saw their team come back from a 3-0 game deficit against their hated rivals, the New York Yankees, and go on to win the World Series. Many of the Red Sox faithful, a good number of them Roman Catholic, visited family graves all across New England after that World Series to share the miracle with long-gone relatives.

In pro football, the religious element is more related to miracle plays. The most desperate, last-second miraculous play is itself called a "Hail Mary." In the final seconds of the 1972 AFC playoff game between the Pittsburgh Steelers and the Oakland Raiders, the Steelers' Franco Harris made an impossible catch off a ricocheted pass, grabbing the ball millimeters from the ground and running for the game-winning touchdown. The play is forever known as "The Immaculate Reception." And then there is the "Music City Miracle" of the 2000 AFC playoffs, when, with 16 seconds left in the game, the Tennessee Titans ran back a kickoff using an improbable lateral pass on the return.

We will surely see more miracles in sports, which points to an interesting fact: in our increasingly secular and technophilic age, it is in sports that faith in miracles still abides.

Octopus Hoax Has Too Many Legs to Stand On


Lyle Zapato yanked more than a few legs in 1998, when he launched his website and campaign to “Save The Endangered Pacific Northwest Tree Octopus.” The website, complete with photographs and information about the endangered cephalopod and its habitat, was convincing enough that researchers used it for several media-literacy studies. But what if this hoax actually has plenty of legs to stand on?

Writers and researchers have been documenting tree-climbing octopi for centuries. The Greek poet Oppian tells of an octopus that climbed an olive tree to eat its fruit and kiss its branches. In Naturalis Historia, Pliny the Elder explains why the hungry cephalopod climbed the tree: because he wanted to get to the other tide (or over the fence and into the salted fishponds). In the more recent Words of the Lagoon, marine biologist R.E. Johannes recounts Palauan fishermen’s stories of octopi giving birth in mangroves.[1] Although Johannes points out how risky it is for an aquatic species to give birth in a tree—not to mention that no known octopus gives live birth—he doesn’t discredit the stories. Doubtful that so many knowledgeable fishermen could misidentify an octopus, Johannes briefly wonders if he was pranked before explaining: “Palauan informants were quick to differentiate between actual observations and legends, but all of them were quite serious about the reality of the octopus story.”


We may never find an octopus climbing a Pacific Northwest pine, but there is enough anecdotal evidence to suggest some octopi probably have climbed a tree. They stretch their arms across damp rocks to get from one tide pool to another, and they are notorious escape artists. If Inky could bust through an enclosure, slink across eight feet of floor and through 164 feet of pipe before escaping into Hawke’s Bay [2], he could probably climb a tree with four hands tied behind his back—anything to leaf the aquarium!




[1] R.E. Johannes, Words of the Lagoon: Fishing and Marine Lore in the Palau District of Micronesia, University of California Press: Berkeley, 1981. https://archive.org/stream/bub_gb_TloVDfV7QLoC/bub_gb_TloVDfV7QLoC_djvu.txt

[2] Dan Bilefsky, “Inky the Octopus Escapes From a New Zealand Aquarium,” The New York Times, April 13, 2016. https://www.nytimes.com/2016/04/14/world/asia/inky-octopus-new-zealand-aquarium.html

I Believe, Therefore I Am


According to Descartes, cogito ergo sum, “I think, therefore I am,” is the one and final indisputable truth for any human subject. As he put it in Discourse on Method, published in 1637, “we cannot doubt of our existence while we doubt…” In other words, it is possible to believe in the doubtfulness of any type of human knowledge except that which is foundational for any conscious thought: if I am thinking, I exist. The etymology of the word belief curiously contains the formula of this famous Cartesian reduction. Belief is cognate with the German Glauben, which means “to believe, to have faith in, to think.” Thinking and believing are always already linked. Descartes could not trust the adage “seeing is believing,” because, since Plato, there was always the doubt that what we see is mere phenomenon or phantasm. The Cartesian certainty would hold sway until Rousseau’s attack on the very integrity of the human intellect. Rousseau would shift the location of truth and belief to the human heart. It was feeling that could not be doubted. That one should believe in one’s heart instead of one’s thoughts ushered in both the end of the Cartesian certainty and Romanticism’s cult of feeling. That Rousseau focused on the heart is also etymologically acute: the French croyance, “belief,” comes from the Latin credere (“to believe,” as in creed), traditionally analyzed as “to place one’s heart,” from the Proto-Indo-European *ḱḗr, “heart,” from which also come the Greek καρδία (“heart, mind, stomach”) and the Latin cor, “heart,” the source of the French coeur, the Spanish corazón, and such words as concord, accord, and courage. In the end, belief and truth come from the heart, or, we should say, the truth is, credo ergo sum.

The Siren ‘Bloop’

Although Thaler makes a great point—using real institutions in fake documentaries fuels the distrust of those institutions, which provides anti-intellectual movements a stronger, louder voice—what about the artistic merit and the deeper truth Mermaids reveals?


Very Superstitious


‘Tis the season for superstitions: black cats cross our paths around Halloween; we play tug-of-war with the turkey’s wishbone at Thanksgiving; and now December presents us with Friday the 13th, a day we’ve considered highly unlucky ever since October 13, 1307, when King Philip IV and Pope Clement V arrested the Knights Templar, believed to be protecting the Holy Grail, and massacred them after charging them with Satanism. The superstition about the unluckiness of the number 13 itself persists enough that we often skip the 13th floor when numbering buildings. One theory suggests that because 12 is a perfect number, 13 must be unlucky because it is not. Another theory suggests that we associate 13 with evil because thirteen people attended The Last Supper, one of them the betrayer Judas. While not all superstitions stem from biblical roots, most of them share a particular similarity. In his 1972 hit “Superstition,” Stevie Wonder sings, “When you believe in things that you don’t understand, then you suffer…” Yet the irony is that superstitions evolved from gestures meant to ward off suffering or to protect us from evil. This makes sense when we consider the word’s root, the Latin superstare, which literally meant to “stand over” and “survive,” conjuring images of a victor standing over his dying conquest at the end of a battle. The term, however, as Wonder elucidates, has come to mean something more irrational—a belief in supernatural causation, that a gesture can invite (breaking a mirror) or prevent (knocking on wood) harm. This is why, in some parts of Utah, lancing a lemon with a rusty nail is believed to ward off the evil eye. So this Friday the 13th, as Hollywood capitalizes on our fears of the day, throw a pinch of salt over your left shoulder, into the face of the Devil who lurks there,[1] and go out to enjoy the full moon.

[1] The left side is considered the sinister side, from the Latin sinistra, “left hand.” In Christian symbolism, the left is the evil side. At the Last Judgement, the elect will be on Jesus’ right, the side of the righteous, and the damned on His left.