Hey you! If you missed last week's edition – how to read like a writer, Feynman on the role of scientific culture in modern society, vintage Scandinavian fairy tale illustrations – you can catch up right here. And if you're enjoying this, please consider supporting with a modest donation.
"Something is always born of excess: great art was born of great terrors, great loneliness, great inhibitions, instabilities, and it always balances them."
The third volume of Anaïs Nin's diaries has been on heavy rotation in recent weeks, yielding Nin's thoughtful and timeless meditations on life, mass movements, Paris vs. New York, what makes a great city, and the joy of handcraft.
The subsequent installment, The Diary of Anais Nin, Vol. 4: 1944-1947 (public library) is an equally rich treasure trove of wisdom on everything from life to love to the art of writing. In fact, Nin's gift shines most powerfully when she addresses all of these subjects and more in just a few ripe sentences. Such is the case with the following exquisite letter of advice she sent to a seventeen-year-old aspiring author by the name of Leonard W., whom she had taken under her wing as creative mentor.
I like to live always at the beginnings of life, not at their end. We all lose some of our faith under the oppression of mad leaders, insane history, pathologic cruelties of daily life. I am by nature always beginning and believing and so I find your company more fruitful than that of, say, Edmund Wilson, who asserts his opinions, beliefs, and knowledge as the ultimate verity. Older people fall into rigid patterns. Curiosity, risk, exploration are forgotten by them. You have not yet discovered that you have a lot to give, and that the more you give the more riches you will find in yourself. It amazed me that you felt that each time you write a story you gave away one of your dreams and you felt the poorer for it. But then you have not thought that this dream is planted in others, others begin to live it too, it is shared, it is the beginning of friendship and love.
You must not fear, hold back, count or be a miser with your thoughts and feelings. It is also true that creation comes from an overflow, so you have to learn to intake, to imbibe, to nourish yourself and not be afraid of fullness. The fullness is like a tidal wave which then carries you, sweeps you into experience and into writing. Permit yourself to flow and overflow, allow for the rise in temperature, all the expansions and intensifications. Something is always born of excess: great art was born of great terrors, great loneliness, great inhibitions, instabilities, and it always balances them. If it seems to you that I move in a world of certitudes, you, par contre, must benefit from the great privilege of youth, which is that you move in a world of mysteries. But both must be ruled by faith.
The Diary of Anais Nin, Vol. 4: 1944-1947 is brimming with such poetic yet practical sagacity on the creative life and is a beautiful addition to other famous advice on writing like Kurt Vonnegut's 8 rules for a great story, David Ogilvy's 10 no-bullshit tips, Henry Miller's 11 commandments, Jack Kerouac's 30 beliefs and techniques, John Steinbeck's 6 pointers, and Susan Sontag's synthesized learnings.
"The aggregate of our joy and suffering…every hero and coward, every creator and destroyer of civilization…every young couple in love…lived there – on a mote of dust suspended in a sunbeam."
Thirty-five years ago today, the Voyager 1 launched into space in a quest to explore the outer Solar System and carried with it the Golden Record, an ultimate mixtape of humanity's sounds that was also a record of how Carl Sagan and Ann Druyan fell in eternal love. There's hardly a better way to celebrate the Voyager's legacy than with Sagan's iconic, timeless, infinitely humbling yet awe-inspiring Pale Blue Dot (public library), based on the photograph of the same title taken by the Voyager 1 in 1990.
This charming animated adaptation was young illustrator Adam Winnik's graduation thesis project – enjoy.
Look again at that dot. That's here. That's home. That's us. On it everyone you love, everyone you know, everyone you ever heard of, every human being who ever was, lived out their lives. The aggregate of our joy and suffering, thousands of confident religions, ideologies, and economic doctrines, every hunter and forager, every hero and coward, every creator and destroyer of civilization, every king and peasant, every young couple in love, every mother and father, hopeful child, inventor and explorer, every teacher of morals, every corrupt politician, every 'superstar,' every 'supreme leader,' every saint and sinner in the history of our species lived there – on a mote of dust suspended in a sunbeam.
The Earth is a very small stage in a vast cosmic arena. Think of the endless cruelties visited by the inhabitants of one corner of this pixel on the scarcely distinguishable inhabitants of some other corner, how frequent their misunderstandings, how eager they are to kill one another, how fervent their hatreds. Think of the rivers of blood spilled by all those generals and emperors so that, in glory and triumph, they could become the momentary masters of a fraction of a dot.
Our posturings, our imagined self-importance, the delusion that we have some privileged position in the Universe, are challenged by this point of pale light. Our planet is a lonely speck in the great enveloping cosmic dark. In our obscurity, in all this vastness, there is no hint that help will come from elsewhere to save us from ourselves.
The Earth is the only world known so far to harbor life. There is nowhere else, at least in the near future, to which our species could migrate. Visit, yes. Settle, not yet. Like it or not, for the moment the Earth is where we make our stand.
It has been said that astronomy is a humbling and character-building experience. There is perhaps no better demonstration of the folly of human conceits than this distant image of our tiny world. To me, it underscores our responsibility to deal more kindly with one another, and to preserve and cherish the pale blue dot, the only home we've ever known.
Complement with another animated adaptation from pickings past, then do yourself a favor and reread Pale Blue Dot in its glorious entirety.
"Generating interesting connections between disparate subjects is what makes art so fascinating to create and to view . . . we are forced to contemplate a new, higher pattern that binds lower ones together."
It seems to be the season for fascinating meditations on consciousness, exploring such questions as what happens while we sleep, how complex cognition evolved, and why the world exists. Joining them and prior explorations of what it means to be human is The Ravenous Brain: How the New Science of Consciousness Explains Our Insatiable Search for Meaning (public library) by Cambridge neuroscientist Daniel Bor in which, among other things, he sheds light on how our species' penchant for pattern-recognition is essential to consciousness and our entire experience of life.
The process of combining more primitive pieces of information to create something more meaningful is a crucial aspect both of learning and of consciousness and is one of the defining features of human experience. Once we have reached adulthood, we have decades of intensive learning behind us, where the discovery of thousands of useful combinations of features, as well as combinations of combinations and so on, has collectively generated an amazingly rich, hierarchical model of the world. Inside us is also written a multitude of mini strategies about how to direct our attention in order to maximize further learning. We can allow our attention to roam anywhere around us and glean interesting new clues about any facet of our local environment, to compare and potentially add to our extensive internal model.
Much of this capacity relies on our working memory – the temporary storage that holds these primitive pieces of information in order to make them available for further processing – and yet what's most striking about our ability to build such an "amazingly rich" model of the world is that the limit of our working memory is hardly different from that of a monkey, even though the monkey's brain is roughly one-fifteenth the size of ours: Experiment after experiment has shown that, on average, the human brain can hold 4 different items in its working memory, compared to 3 or 4 for the monkey.
What makes the difference, Bor argues, is a concept called chunking, which allows us to hack the limits of our working memory – a kind of cognitive compression mechanism wherein we parse information into chunks that are more memorable and easier to process than the seemingly random bits of which they're composed. Bor explains:
In terms of grand purpose, chunking can be seen as a similar mechanism to attention: Both processes are concerned with compressing an unwieldy dataset into those small nuggets of meaning that are particularly salient. But while chunking is a marvelous complement to attention, chunking diverges from its counterpart in focusing on the compression of conscious data according to its inherent structure or the way it relates to our preexisting memories.
To illustrate the power of chunking, Bor gives an astounding example of how one man used this mental mechanism to greatly expand the capacity of his working memory. The man, an undergraduate volunteer with an average IQ and memory capacity, took part in a simple experiment: the researchers read him a sequence of random digits and asked him to say the digits back in the order he'd heard them. If he was correct, the next trial's sequence would be one digit longer; if incorrect, one digit shorter. This standard test of verbal working memory had one twist – it took place over two years, with the young man doing the task for an hour a day, four days a week.
Initially, he was able to remember roughly 7 numbers in the sequence – an average improvement over the 4-item limit that most people reach with a few simple and intuitive rehearsal strategies. But the young man was so bored with the experiment that he decided to make it interesting for himself by doing his best to greatly improve his limit – which he did. By the end, some 20 months later, he was able to say back a sequence that was 80 digits long – or, as Bor puts it, "if 7 friends in turn rapidly told him their phone numbers, he could calmly wait until the last digit was spoken and then, from memory, key all 7 friends' numbers into his phone's contact list without error," an achievement that would make Joshua Foer proud.
But how, exactly, was an average person capable of such a superhuman feat? Bor sheds light:
This volunteer happened to be a keen track runner, and so his first thought was to see certain number groups as running times, for instance, 3492 would be transformed into 3 minutes and 49.2 seconds, around the world-record time for running the mile. In other words, he was using his memory for well-known number sequences in athletics to prop up his working memory. This strategy worked very well, and he rapidly more than doubled his working memory capacity to nearly 20 digits. The next breakthrough some months later occurred when he realized he could combine each running time into a superstructure of 3 or 4 running times – and then group these superstructures together again. Interestingly, the number of holders he used never went above his initial capacity of just a handful of items. He just learned to cram more and more into each item in a pyramidal way, with digits linked together in 3s or 4s, and then those triplets or quadruplets of digits linked together as well in groups of 3, and so on. One item-space, one object in working memory, started holding a single digit, but after 20 months of practice, could contain as much as 24 digits.
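The pyramidal strategy Bor describes is easy to see in miniature. Here is a toy sketch – my own illustration, not from the book – of how recursively grouping digits in fours collapses an 80-digit sequence into a handful of top-level items, the same few "slots" of working memory, each holding ever more:

```python
def chunk(items, size=4):
    """Group a flat sequence into chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def pyramid(digits, size=4):
    """Repeatedly chunk until the top level fits within working memory."""
    levels = [list(digits)]
    while len(levels[-1]) > size:
        levels.append(chunk(levels[-1], size))
    return levels

# An 80-digit sequence collapses, level by level, into just a few items:
levels = pyramid("1234567890" * 8)
print([len(level) for level in levels])  # → [80, 20, 5, 2]
```

Each level of the pyramid re-chunks the one below it, so the runner's trick – digits into running times, running times into superstructures – never needs more than a handful of simultaneously held items.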
This young man had, essentially, mastered exponential chunking. But, Bor points out, chunking isn't useful only in helping us excel at seemingly meaningless tasks – it is integral to what makes us human:
Although [chunking] can vastly increase the practical limits of working memory, it is not merely a faithful servant of working memory – instead it is the secret master of this online store, and the main purpose of consciousness.
There are three straightforward sides to the chunking process – the search for chunks, the noticing and memorizing of those chunks, and the use of the chunks we've already built up. The main purpose of consciousness is to search for and discover these structured chunks of information within working memory, so that they can then be used efficiently and automatically, with minimal further input from consciousness.
Perhaps what most distinguishes us humans from the rest of the animal kingdom is our ravenous desire to find structure in the information we pick up in the world. We cannot help actively searching for patterns – any hook in the data that will aid our performance and understanding. We constantly look for regularities in every facet of our lives, and there are few limits to what we can learn and improve on as we make these discoveries. We also develop strategies to further help us – strategies that themselves are forms of patterns that assist us in spotting other patterns, with one example being that amateur track runner developing tactics to link digits with running times in various races.
But, echoing Richard Feynman's eloquent lament on the subject, Bor points to a dark side of this hunger for patterns:
One problematic corollary of this passion for patterns is that we are the most advanced species in how elaborately and extensively we can get things wrong. We often jump to conclusions – for instance, with astrology or religion. We are so keen to search for patterns, and so satisfied when we've found them, that we do not typically perform sufficient checks on our apparent insights.
Still, our capacity for pattern-recognition, Bor argues, is the very source of human creativity. In fact, chunking and pattern-recognition offer evidence for the combinatorial nature of creativity, affirming Steve Jobs's famous words that "creativity is just connecting things," Mark Twain's contention that "all ideas are second-hand," and Nina Paley's clever demonstration of how everything builds on what came before.
The arts, too, generate their richness and some of their aesthetic appeal from patterns. Music is the most obvious sphere where structures are appealing – little phrases that are repeated, raised a key, or reversed can sound utterly beguiling. This musical beauty directly relates to the mathematical relation between notes and the overall logical regularities formed. Some composers, such as Bach, made this connection relatively explicit, at least in certain pieces, which are just as much mathematical and logical puzzles as beautiful musical works.
But certainly patterns are just as important in the visual arts as in music. Generating interesting connections between disparate subjects is what makes art so fascinating to create and to view, precisely because we are forced to contemplate a new, higher pattern that binds lower ones together.
What is true of creative skill, Bor argues, is also true of our highest intellectual contribution:
Some of our greatest insights can be gleaned from moving up another level and noticing that certain patterns relate to others, which on first blush may appear entirely unconnected – spotting patterns of patterns, say (which is what analogies essentially are).
Best of all, this system expands exponentially as it feeds on itself, like a muscle that grows stronger with each use:
Consciousness and chunking allow us to turn the dull sludge of independent episodes in our lives into a shimmering, dense web, interlinked by all the myriad patterns we spot. It becomes a positive feedback loop, making the detection of new connections even easier, and creates a domain ripe for understanding how things actually work, of reaching that supremely powerful realm of discerning the mechanism of things. At the same time, our memory system becomes far more efficient, effective – and intelligent – than it could ever be without such refined methods to extract useful structure from raw data.
Though some parts of The Ravenous Brain verge on reductionism, Bor offers a stimulating lens on that always fascinating, often uncomfortable, inevitably alluring intersection of science and philosophy where our understanding of who we are resides.
"To the dumb question 'Why me?' the cosmos barely bothers to return the reply: Why not?"
"One should try to write as if posthumously," Christopher Hitchens famously opined in a New York Public Library talk three days before his fatal cancer diagnosis. "Distrust compassion; prefer dignity for yourself and others," he advised young contrarians years earlier. How striking, then, is the clash between his uncompromising ethos and the equally uncompromising realities of death, recorded in Mortality (public library), his last published work, out this week – a gripping and lucid meditation on death as it was unfolding during Hitch's last months of life. But what makes the book truly extraordinary is his profound oscillation between his characteristic, proud, almost stubborn self-awareness – that ability to look on with the eye of the critic rather than the experiencing self – and a vulnerability that is so clearly foreign to him, yet so breathlessly inevitable in dying. The ideological rigor with which he approaches his own finality, teasing apart religion and politics and other collective and thus impersonal facets of culture, cracks here and there, subtly at first, letting the discomfort of his brush with the unknown peek through, then gapes wide open to reveal the sheer human terror of ceasing to exist.
We begin by seeing Hitchens, a true contrarian himself, defy death's common psychology:
The notorious stage theory of Elisabeth Kübler-Ross, whereby one progresses from denial to rage through bargaining to depression and the eventual bliss of 'acceptance,' hasn't so far had much application to my case. In one way, I suppose, I have been 'in denial' for some time, knowingly burning the candle at both ends and finding that it often gives a lovely light. But for precisely that reason, I can't see myself smiting my brow with shock or hear myself whining about how it's all so unfair: I have been taunting the Reaper into taking a free scythe in my direction and have now succumbed to something so predictable and banal that it bores even me. Rage would be beside the point for the same reason. Instead, I am badly oppressed by the gnawing sense of waste. I had real plans for my next decade and felt I'd worked hard enough to earn it. Will I really not live to see my children married? To watch the World Trade Center rise again? To read – if not indeed to write – the obituaries of elderly villains like Henry Kissinger and Joseph Ratzinger? But I understand this sort of non-thinking for what it is: sentimentality and self-pity.
One coping mechanism is stoic wryness:
To the dumb question 'Why me?' the cosmos barely bothers to return the reply: Why not?
As a bastion of semantic clarity, Hitch doesn't miss the opportunity to dismember a number of the metaphors we use about and around death, echoing Susan Sontag's classic and revolutionary Illness as Metaphor in discussing the "war-on-cancer" cliché:
Myself, I love the imagery of struggle. I sometimes wish I were suffering in a good cause, or risking my life for the good of others, instead of just being a gravely endangered patient. Allow me to inform you, though, that when you sit in a room with a set of other finalists, and kindly people bring a huge transparent bag of poison and plug it into your arm, and you either read or don't read a book while the venom sack gradually empties itself into your system, the image of the ardent soldier or revolutionary is the very last one that will occur to you. You feel swamped with passivity and impotence: dissolving in powerlessness like a sugar lump in water.
Still, Hitchens uses his death as a vehicle for advancing his lifelong crusade against religion, which earned him a spot as one of "the Four Horsemen of New Atheism" – along with Richard Dawkins, Dan Dennett, and Sam Harris – and takes a number of clever stabs at religion's paradoxes:
Many readers are familiar with the spirit and the letter of the definition of 'prayer,' as given by Ambrose Bierce in his Devil's Dictionary. It runs like this, and is extremely easy to comprehend:
Prayer: A petition that the laws of nature be suspended in favor of the petitioner; himself confessedly unworthy.
Everybody can see the joke that is lodged within this entry: The man who prays is the one who thinks that god has arranged matters all wrong, but who also thinks that he can instruct god how to put them right. Half-buried in the contradiction is the distressing idea that nobody is in charge, or nobody with any moral authority. The call to prayer is self-cancelling.
But, every once in a while, between the busting of clichés, the complacent edge of his self-awareness softens and gives way to the real and raw human terror of his experience:
It's normally agreed that the question 'How are you?' doesn't put you on your oath to give a full or honest answer. So when asked these days, I tend to say something cryptic like, 'A bit early to say.' (If it's the wonderful staff at my oncology clinic who inquire, I sometimes go so far as to respond, 'I seem to have cancer today.') Nobody wants to be told about the countless minor horrors and humiliations that become facts of 'life' when your body turns from being a friend to being a foe: the boring switch from chronic constipation to its sudden dramatic opposite; the equally nasty double cross of feeling acute hunger while fearing even the scent of food; the absolute misery of gut-wringing nausea on an utterly empty stomach; or the pathetic discovery that hair loss extends to the disappearance of the follicles in your nostrils, and thus to the childish and irritating phenomenon of a permanently runny nose. Sorry, but you did ask… It's no fun to appreciate to the full the truth of the materialist proposition that I don't have a body, I am a body.
Indeed, this daily attrition of bodily dignity, which bleeds into an attrition of character, is hard even for Hitch to intellectualize, try as he might:
Most despond-inducing and alarming of all, so far, was the moment when my voice suddenly rose to a childish (or perhaps piglet-like) piping squeak. It then began to register all over the place, from a gruff and husky whisper to a papery, plaintive bleat. And at times it threatened, and now threatens daily, to disappear altogether. I had just returned from giving a couple of speeches in California, where with the help of morphine and adrenaline I could still successfully 'project' my utterances, when I made an attempt to hail a taxi outside my home – and nothing happened. I stood, frozen, like a silly cat that had abruptly lost its meow. I used to be able to stop a New York cab at thirty paces. I could also, without the help of a microphone, reach the back row and gallery of a crowded debating hall. And it may be nothing to boast about, but people tell me that if their radio or television was on, even in the next room, they could always pick out my tones and know that I was 'on' too.
Like health itself, the loss of such a thing can't be imagined until it occurs. In common with everybody else, I have played versions of the youthful 'Which would you rather?' game, in which most usually it's debated whether blindness or deafness would be the most oppressive. But I don't ever recall speculating much about being struck dumb. (In the American vernacular, to say 'I'd really hate to be dumb' might in any case draw another snicker.) Deprivation of the ability to speak is more like an attack of impotence, or the amputation of part of the personality. To a great degree, in public and private, I 'was' my voice. All the rituals and etiquette of conversation, from clearing the throat in preparation for the telling of an extremely long and taxing joke to (in younger days) trying to make my proposals more persuasive as I sank the tone by a strategic octave of shame, were innate and essential to me. I have never been able to sing, but I could once recite poetry and quote prose and was sometimes even asked to do so. And timing is everything: the exquisite moment when one can break in and cap a story, or turn a line for a laugh, or ridicule an opponent. I lived for moments like that. Now if I want to enter a conversation, I have to attract attention in some other way, and live with the awful fact that people are then listening 'sympathetically.'
The final pages of Mortality feature Hitch's fragmentary scribbles from the days immediately preceding his death, concluding, poignantly, with this:
From Alan Lightman's intricate 1993 novel Einstein's Dreams; set in Berne in 1905:
With infinite life comes an infinite list of relatives. Grandparents never die, nor do great-grandparents, great-aunts… and so on, back through the generations, all alive and offering advice. Sons never escape from the shadows of their fathers. Nor do daughters of their mothers. No one ever comes into his own… Such is the cost of immortality. No person is whole. No person is free.
Why it's easier to prevent a crying spell than to stop one already underway.
The human body is an extraordinary machine, and our behavior an incessant source of fascination. In Curious Behavior: Yawning, Laughing, Hiccupping, and Beyond (public library), psychology and neuroscience professor Robert R. Provine undertakes an "analysis and celebration of undervalued, informative, and sometimes disreputable human behavior" by applying the lens of anthropologically-inspired, observational "Small Science" – "small because it does not require fancy equipment and a big budget, not because it's trivial" – to a wealth of clinical research into the biology, physiology, and neuropsychology of our bodily behaviors.
Take, for instance, the science of what we call "crying," a uniquely human capacity – a grab-bag term that consists of "vocal crying," or sobbing, and "emotional tearing," our quiet waterworks. Provine explains:
As an adult, you cry much less than when young, and your crying is more often subdued, teary weeping than the demonstrative, vocal sobbing of childhood. . . [T]he trauma that causes your crying is now more often emotional than physical. However, whether intentional or not, as adult or child, you cry to solicit assistance, whether physical aid or emotional solace. Paradoxically, your adult cry for help is more private than the noisy, promiscuous pronouncement of childhood, often occurring at home, where it finds a select audience. The developmental shift from vocal crying to visual tearing favors the face-to-face encounters of an intimate setting. The maturation of inhibitory control gives adults the ability to select where and when crying occurs, or to inhibit it altogether, options less available to children.
To better illustrate the physiology of crying, Provine contrasts it with that of laughing, pointing out that the two are complementary behaviors and that understanding one helps illuminate the other.
Specialists may argue whether there is a typical cry or laugh, but enough is known about these vocalizations to provide vivid contrasts. A cry is a sustained, voiced utterance, usually of around one second or more (reports vary), the duration of an outward breath. Think of a baby's 'waaa.' . . . Cries repeat at intervals of about one second, roughly the duration of one respiratory cycle . . . A laugh, in contrast, is a chopped (not sustained), usually voiced exhalation, as in 'ha-ha-ha,' in which each syllable ('ha') lasts about 1/15 second and repeats every 1/5 second.
One curious feature crying and laughing have in common, which any human being with a beating heart can attest to:
Crying and laughing both show strong perseveration, the tendency to maintain a behavior once it has started. These acts don't have an on-off switch, a trait responsible for some quirks of human behavior. Whether baby or adult, it's easier to prevent a bout of crying than to stop it once under way. Crying causes more crying. Likewise, laughter causes more laughter, a reason why headliners at comedy clubs want other performers to warm up the audience, and why you may be immobilized by a laughing fit that can't be quelled by heroic attempts at self-control. In fact, voluntary control has little to do with starting or stopping most crying or laughing.
So, if vocal crying evolved to attract help, what's the evolutionary purpose of quiet tears? For one, they contain lysozyme, the body's own antiseptic, which sanitizes and lubricates the eye. But, Provine argues, there might be something much more interesting and neurobiologically profound at work:
Several lines of evidence suggest that the NGF [nerve growth factor] in tears has medicinal functions. The NGF concentration in tears, cornea, and lacrimal glands increases after corneal wounding, suggesting that NGF plays a part in healing. More directly, the topical application of NGF promotes the healing of corneal ulcers and may increase tear production in dry eye . . . Although more of a scientific long shot, I suggest that tears bearing NGF have an anti-depressive effect that may modulate as well as signal mood.
Non-emotional, healing tears may have originally signaled trauma to the eyes, eliciting caregiving by tribe members or inhibiting physical aggression by adversaries. This primal signal may have later evolved through ritualization to become a sign of emotional as well as physical distress. In this evolutionary scenario, the visual and possibly chemical signals of emotional tears may be secondary consequences of lacrimal secretions that originally evolved in the service of ocular maintenance and healing.
If anything, Provine points to this as a direction of curiosity for future research:
Emotional tearing is a uniquely human and relatively modern evolutionary innovation that may have left fresh biological tracks of its genesis. The contrast of the human lacrimal system with that of our tearless primate relatives may reveal a path to emotional tearfulness that involves NGF. NGF may be both a healing agent found in tears and a neurotrophin that plays a central role in shaping the neurologic circuitry essential for emotional tearing during development and evolution. A lesson of NGF research is that pursuit of the scientific trail can lead to serendipitous discoveries both broad and deep. Emotional tears may provide an exciting new chapter in the NGF saga, and vice versa.
The rest of Curious Behavior goes on to explore such seemingly mundane but, in fact, utterly fascinating phenomena as yawning, sneezing, coughing, tickling, nausea, and, yes, farting and belching.