Author, Speaker, Educator, Poet, Business Advisor to Social Entrepreneurs, Global Goodwill Ambassador, and Humanitarian. Dhananjay Parkhe chooses mentees to help them learn strategies and the execution of the art, craft, and science of doing better, and better still, to be able to compete in business. Mentoring isn't a sweetener; it is a brutally honest, bitter truth pill and a kick. Many crack. Few win!
… that today, besides being New Year’s Eve in many countries (Happy New Year!), is the anniversary of Auld Lang Syne? The grand old song of New Year’s Eve, “Auld Lang Syne,” was first played as a new year’s song by Guy Lombardo and his Royal Canadians at the Hotel Roosevelt Grill in 1929.
Today’s Inspirational Quote:
“In the New Year, never forget to thank your past years because they enabled you to reach today! Without the stairs of the past, you cannot arrive at the future!”
Rumination is relentless thinking focused on one’s negative feelings and problems. Whereas reflection can be productive, and motivate us to improve, ruminating is typically self-defeating. It can even be unhealthy.
… that today is the Festival of Enormous Changes at the Last Minute? On the next to last day of the year, before you must begin work on your new resolutions, take time out today to complete work on your previous year’s resolutions. Make those changes! 😉
Today’s Inspirational Quote:
“We change the world not by what we say or do, but as a consequence of what we have become.”
“As the gap between rich and poor increases, the cost of screwing up increases,” said Philip Cohen, a sociologist at the University of Maryland who studies families and inequality. “The fear is they’ll end up on the other side of the divide.”
This is the brainpickings.org weekly digest by Maria Popova. If you missed last week’s edition — a 100-year-old Holocaust survivor on how books save lives, Nietzsche’s brilliant thought experiment illustrating the key to existential contentment — you can catch up right here. (ALSO: Don’t miss the annual review of the best of Brain Pickings 2018.) And if you are enjoying this newsletter, please consider supporting my labor of love with a donation – this year, I spent innumerable hours and tremendous resources on it, and every little bit of support helps enormously. If you already donate: THANK YOU.
“I… a universe of atoms… an atom in the universe,” the Nobel-winning physicist Richard Feynman wrote in his lovely prose poem about evolution. “The fact that we are connected through space and time,” evolutionary biologist Lynn Margulis observed of the interconnectedness of the universe, “shows that life is a unitary phenomenon, no matter how we express that fact.”
A century before Feynman and Margulis, the great Scottish-American naturalist and pioneering environmental philosopher John Muir (April 21, 1838–December 24, 1914) channeled this elemental fact of existence with uncommon poetic might in John Muir: Nature Writings (public library) — a timeless treasure I revisited in composing The Universe in Verse.
Recounting the epiphany he had while hiking Yosemite’s Cathedral Peak for the first time in the summer of his thirtieth year — an epiphany strikingly similar to the one Virginia Woolf had at the moment she understood what it means to be an artist — Muir writes:
When we try to pick out anything by itself, we find it hitched to everything else in the universe. One fancies a heart like our own must be beating in every crystal and cell, and we feel like stopping to speak to the plants and animals as friendly fellow mountaineers. Nature as a poet, an enthusiastic workingman, becomes more and more visible the farther and higher we go; for the mountains are fountains — beginning places, however related to sources beyond mortal ken.
One is constantly reminded of the infinite lavishness and fertility of Nature — inexhaustible abundance amid what seems enormous waste. And yet when we look into any of her operations that lie within reach of our minds, we learn that no particle of her material is wasted or worn out. It is eternally flowing from use to use, beauty to yet higher beauty; and we soon cease to lament waste and death, and rather rejoice and exult in the imperishable, unspendable wealth of the universe, and faithfully watch and wait the reappearance of everything that melts and fades and dies about us, feeling sure that its next appearance will be better and more beautiful than the last.
More and more, in a place like this, we feel ourselves part of wild Nature, kin to everything.
A year earlier, during his famous thousand-mile walk to the Gulf of Mexico, Muir recorded his observations and meditations in a notebook inscribed John Muir, Earth-Planet, Universe. In one of the entries from this notebook, the twenty-nine-year-old Muir counters the human hubris of anthropocentricity in a sentiment far ahead of his time and, in many ways, ahead of our own as we grapple with our responsibility to the natural world. More than a century before Carl Sagan reminded us that we, like all creatures, are “made of starstuff,” Muir humbles us into our proper place in the cosmic order:
The universe would be incomplete without man; but it would also be incomplete without the smallest transmicroscopic creature that dwells beyond our conceitful eyes and knowledge… The fearfully good, the orthodox, of this laborious patchwork of modern civilization cry “Heresy” on every one whose sympathies reach a single hair’s breadth beyond the boundary epidermis of our own species. Not content with taking all of earth, they also claim the celestial country as the only ones who possess the kind of souls for which that imponderable empire was planned.
This star, our own good earth, made many a successful journey around the heavens ere man was made, and whole kingdoms of creatures enjoyed existence and returned to dust ere man appeared to claim them. After human beings have also played their part in Creation’s plan, they too may disappear without any general burning or extraordinary commotion whatever.
However disquieting and corrosive to the human ego such awareness may be, Muir argues that we can never be conscientious citizens of the universe unless we accept this fundamental cosmic reality. In our chronic civilizational denial of it, we are denying nature itself — we are denying, in consequence, our own humanity. A century before the inception of the modern environmental movement, he writes:
No dogma taught by the present civilization seems to form so insuperable an obstacle in the way of a right understanding of the relations which culture sustains to wildness as that which regards the world as made especially for the uses of man. Every animal, plant, and crystal controverts it in the plainest terms. Yet it is taught from century to century as something ever new and precious, and in the resulting darkness the enormous conceit is allowed to go unchallenged.
I have never yet happened upon a trace of evidence that seemed to show that any one animal was ever made for another as much as it was made for itself. Not that Nature manifests any such thing as selfish isolation. In the making of every animal the presence of every other animal has been recognized. Indeed, every atom in creation may be said to be acquainted with and married to every other, but with universal union there is a division sufficient in degree for the purposes of the most intense individuality; no matter, therefore, what may be the note which any creature forms in the song of existence, it is made first for itself, then more and more remotely for all the world and worlds.
This revelatory sense of interconnectedness comes over Muir again a decade later, as he journeys to British Columbia on a steamer in the spring of 1879, experiencing for the first time the otherworldly wonder and might of the open ocean. A century after William Blake saw the universe in a grain of sand, Muir writes:
The scenery of the ocean, however sublime in vast expanse, seems far less beautiful to us dry-shod animals than that of the land seen only in comparatively small patches; but when we contemplate the whole globe as one great dewdrop, striped and dotted with continents and islands, flying through space with other stars all singing and shining together as one, the whole universe appears as an infinite storm of beauty.
In 2018, the 12th year of Brain Pickings, I poured tremendous time, thought, love, and resources into this labor of love, which remains free and is made possible by patronage. If you found any joy and consolation here this year, please consider supporting it with a donation. And if you already donate, from the bottom of my heart: THANK YOU.
You can become a Sustaining Patron with a recurring monthly donation of your choosing, between a cup of tea and a Brooklyn lunch.
Or you can become a Spontaneous Supporter with a one-time donation in any amount.
“To get fame and money, for the sake of which I wrote, it was necessary to hide the good and to display the evil,” Leo Tolstoy confessed with uncompromising self-awareness in reflecting on his youthful vice of writing for the wrong reasons — as a young man, he had treated the making of literature as a means to a material end, a bargaining chip traded for admiration and profit with other literary profiteers who were just as “self-confident and self-satisfied as only those can be who are quite holy or who do not know what holiness is.” Around the same time, across the Atlantic, the young William James made the difficult decision of choosing purpose over profit — a decision that would eventually establish him as the founding father of American psychology — and observed the crux of the tradeoff: “After all, the great problem of life seems to be how to keep body and soul together.” Of course, artists must eat — but at what cost does their livelihood come, weighed on whose scale?
Nearly a century and a half after James and Tolstoy’s moral struggle with the competing forces of culture and commerce — a struggle that has intensified infinitely with the rise of the modern market system — another titan of literature and seer of truth addressed these elemental questions of creative culture with uncommon lucidity and luminosity of sentiment.
Hard times are coming, when we’ll be wanting the voices of writers who can see alternatives to how we live now, can see through our fear-stricken society and its obsessive technologies to other ways of being, and even imagine real grounds for hope. We’ll need writers who can remember freedom — poets, visionaries — realists of a larger reality.
Le Guin was a seer in the largest sense — her gaze bent past our culture’s horizons of peril and possibility visible to most, and she saw the early warning signs of a darkening reality. A decade after she first began admonishing against the commodification of art, she points to the creation of cultural artifacts motivated not by artistic merit but by marketability as one of the most perilous traps of our times:
Right now, we need writers who know the difference between production of a market commodity and the practice of an art. Developing written material to suit sales strategies in order to maximize corporate profit and advertising revenue is not the same thing as responsible book publishing or authorship.
Yet I see sales departments given control over editorial. I see my own publishers, in a silly panic of ignorance and greed, charging public libraries for an e-book six or seven times more than they charge customers. We just saw a profiteer try to punish a publisher for disobedience, and writers threatened by corporate fatwa. And I see a lot of us, the producers, who write the books and make the books, accepting this — letting commodity profiteers sell us like deodorant, and tell us what to publish, what to write.
Books aren’t just commodities; the profit motive is often in conflict with the aims of art. We live in capitalism, its power seems inescapable — but then, so did the divine right of kings. Any human power can be resisted and changed by human beings. Resistance and change often begin in art. Very often in our art, the art of words.
I’ve had a long career as a writer, and a good one, in good company. Here at the end of it, I don’t want to watch American literature get sold down the river. We who live by writing and publishing want and should demand our fair share of the proceeds; but the name of our beautiful reward isn’t profit. Its name is freedom.
Le Guin’s unassailable belief in literature as a force of freedom and her fierce advocacy for public libraries were a large part of our inspiration for donating all proceeds from A Velocity of Being: Letters to a Young Reader — which contains her last published piece — to the public library system. Seeing her deliver the speech live, with quietly impassioned conviction and incandescent dignity, only amplifies the urgency and bittersweet hopefulness of her message, which stands as a pillar of her legacy:
“Everybody should be quiet near a little stream and listen,” the great children’s book author Ruth Krauss — a philosopher, really — wrote in her last and loveliest collaboration with the young Maurice Sendak in 1960. At the time of her first collaboration with Sendak twelve years earlier, just after the word “workaholic” was coined, the German philosopher Josef Pieper was composing Leisure, the Basis of Culture — his timeless and increasingly timely manifesto for reclaiming our human dignity in a culture of busyness. “Leisure,” Pieper wrote, “is not the same as the absence of activity… or even as an inner quiet. It is rather like the stillness in the conversation of lovers, which is fed by their oneness.”
A generation earlier, with a seer’s capacity to peer past the horizon of the present condition and anticipate a sweeping cultural current before it has flooded in, and with a sage’s ability to provide the psychic buoy for surviving the current’s perilous rapids, Bertrand Russell (May 18, 1872–February 2, 1970) addressed the looming cult of workaholism in a prescient 1932 essay titled In Praise of Idleness (public library).
I want to say, in all seriousness, that a great deal of harm is being done in the modern world by belief in the virtuousness of work, and that the road to happiness and prosperity lies in an organized diminution of work.
With his characteristic wisdom punctuated by wry wit, he examines what work actually means:
Work is of two kinds: first, altering the position of matter at or near the earth’s surface relatively to other such matter; second, telling other people to do so. The first kind is unpleasant and ill paid; the second is pleasant and highly paid. The second kind is capable of indefinite extension: there are not only those who give orders, but those who give advice as to what orders should be given. Usually two opposite kinds of advice are given simultaneously by two organized bodies of men; this is called politics. The skill required for this kind of work is not knowledge of the subjects as to which advice is given, but knowledge of the art of persuasive speaking and writing, i.e., of advertising.
Russell points to landowners as a historical example of a class whose idleness was only made possible by the toil of others. For the vast majority of our species’ history, up until the Industrial Revolution, the average person spent nearly every waking hour working hard to earn the basic necessities of survival. Any marginal surplus, he notes, was swiftly appropriated by those in power — the warriors, the monarchs, the priests. Since the Industrial Revolution, other power systems — from big business to dictatorships — have simply supplanted the warriors, monarchs, and priests. Russell considers how the exploitive legacy of pre-industrial society has corrupted the modern social fabric and warped our value system:
A system which lasted so long and ended so recently has naturally left a profound impress upon men’s thoughts and opinions. Much that we take for granted about the desirability of work is derived from this system, and, being pre-industrial, is not adapted to the modern world. Modern technique has made it possible for leisure, within limits, to be not the prerogative of small privileged classes, but a right evenly distributed throughout the community. The morality of work is the morality of slaves, and the modern world has no need of slavery.
Writing nearly a century after Kierkegaard extolled the existential boon of idleness, Russell considers how this manipulated mentality has hypnotized us into worshiping work as virtue and scorning leisure as laziness, as weakness, as folly, rather than recognizing it as the raw material of social justice and the locus of our power:
The conception of duty, speaking historically, has been a means used by the holders of power to induce others to live for the interests of their masters rather than for their own. Of course the holders of power conceal this fact from themselves by managing to believe that their interests are identical with the larger interests of humanity. Sometimes this is true; Athenian slave owners, for instance, employed part of their leisure in making a permanent contribution to civilization which would have been impossible under a just economic system. Leisure is essential to civilization, and in former times leisure for the few was only rendered possible by the labors of the many. But their labors were valuable, not because work is good, but because leisure is good. And with modern technique it would be possible to distribute leisure justly without injury to civilization.
Russell notes that WWI — which was dubbed “the war to end all wars” by a world willfully blind to the fact that violence begets more violence, unwitting that this world war would pave the way for the next — furthered our civilizational conflation of duty with work and work with virtue, lulling us into the modern trance of busyness. More than half a century before Annie Dillard observed that “how we spend our days is, of course, how we spend our lives,” Russell traces the ledger of our existential spending back to war’s false promise of freedom:
The war showed conclusively that, by the scientific organization of production, it is possible to keep modern populations in fair comfort on a small part of the working capacity of the modern world. If, at the end of the war, the scientific organization, which had been created in order to liberate men for fighting and munition work, had been preserved, and the hours of work had been cut down to four, all would have been well. Instead of that the old chaos was restored, those whose work was demanded were made to work long hours, and the rest were left to starve as unemployed. Why? Because work is a duty, and a man should not receive wages in proportion to what he has produced, but in proportion to his virtue as exemplified by his industry.
Pointing out that this equivalence originates in the same morality — or, rather, immorality — that produced the slave state, he exposes the core cultural falsehood it has effected, which stands as a monumental obstruction to equality and social justice in contemporary society:
The idea that the poor should have leisure has always been shocking to the rich.
Born in an era when urban workingmen had just acquired the right to vote in Great Britain, Russell draws on his own childhood for a stark illustration of this belief and its far-reaching tentacles of socioeconomic oppression:
I remember hearing an old Duchess say: “What do the poor want with holidays? They ought to work.” People nowadays are less frank, but the sentiment persists, and is the source of much of our economic confusion.
That sentiment, Russell reminds us again and again, is ahistorical. Advances in science, technology, and the very mechanics of society have made it no longer necessary for the average person to endure fifteen-hour workdays in order to obtain basic sustenance, as adults — and often children — had to in the early nineteenth century. But while the allocation of our time in relation to need has changed immensely, our attitudes about how that time is spent hardly have. He writes:
Every human being, of necessity, consumes, in the course of his life, a certain amount of the produce of human labor.
The wise use of leisure, it must be conceded, is a product of civilization and education. A man who has worked long hours all his life will be bored if he becomes suddenly idle. But without a considerable amount of leisure a man is cut off from many of the best things. There is no longer any reason why the bulk of the population should suffer this deprivation; only a foolish asceticism, usually vicarious, makes us continue to insist on work in excessive quantities now that the need no longer exists.
But while reinstating the dignity of leisure — or what Russell calls idleness — is a necessary condition for recalibrating our life-satisfaction to more adequately reflect the contemporary realities of work and need, it is not a sufficient one. Exacerbating our already warped relationship with work is the muddling of needs and wants at the heart of capitalist materialism — something Russell would address nearly two decades later in his Nobel Prize acceptance speech, listing acquisitiveness as the first of the four desires driving human behavior. He considers the radical shift that would take place if we were to stop regarding the virtue of work as an end in itself and begin seeing it as a means to a state of being in which work is no longer needed, reinstating leisure and comfort — that is, a contented sense of enoughness — as the proper existential end:
What will happen when the point has been reached where everybody could be comfortable without working long hours?
In the West, we have various ways of dealing with this problem. We have no attempt at economic justice, so that a large proportion of the total produce goes to a small minority of the population, many of whom do no work at all. Owing to the absence of any central control over production, we produce hosts of things that are not wanted. We keep a large percentage of the working population idle, because we can dispense with their labor by making the others overwork. When all these methods prove inadequate, we have a war; we cause a number of people to manufacture high explosives, and a number of others to explode them, as if we were children who had just discovered fireworks. By a combination of all these devices we manage, though with difficulty, to keep alive the notion that a great deal of severe manual work must be the lot of the average man.
Our society, Russell argues, is driven by “continually fresh schemes, by which present leisure is to be sacrificed to future productivity.” He challenges the inanity of this proposition:
The fact is that moving matter about, while a certain amount of it is necessary to our existence, is emphatically not one of the ends of human life. If it were, we should have to consider every navvy superior to Shakespeare. We have been misled in this matter by two causes. One is the necessity of keeping the poor contented, which has led the rich, for thousands of years, to preach the dignity of labor, while taking care themselves to remain undignified in this respect. The other is the new pleasure in mechanism, which makes us delight in the astonishingly clever changes that we can produce on the earth’s surface. Neither of these motives makes any great appeal to the actual worker. If you ask him what he thinks the best part of his life, he is not likely to say: “I enjoy manual work because it makes me feel that I am fulfilling man’s noblest task, and because I like to think how much man can transform his planet. It is true that my body demands periods of rest, which I have to fill in as best I may, but I am never so happy as when the morning comes and I can return to the toil from which my contentment springs.” I have never heard workingmen say this sort of thing. They consider work, as it should be considered, a necessary means to a livelihood, and it is from their leisure hours that they derive whatever happiness they may enjoy.
Decades before Diane Ackerman made her exquisite case for the evolutionary and existential value of play, Russell considers how the cult of productivity has demolished one of life’s pillars of satisfaction. Noting that modern people — true of the moderns of 1932, even truer of today’s — enjoy a little leisure but wouldn’t know what to do with themselves if they had to work only four hours a day, he observes:
In so far as this is true in the modern world, it is a condemnation of our civilization; it would not have been true at any earlier period. There was formerly a capacity for lightheartedness and play which has been to some extent inhibited by the cult of efficiency. The modern man thinks that everything ought to be done for the sake of something else, and never for its own sake.
The seedbed of this soul-shriveling belief is the notion — a driving force of consumerism — that the only worthwhile activities are those that bring material profit. A formidable logician, Russell exposes the self-unraveling nature of this argument:
Broadly speaking, it is held that getting money is good and spending money is bad. Seeing that they are two sides of one transaction, this is absurd; one might as well maintain that keys are good, but keyholes are bad. Whatever merit there may be in the production of goods must be entirely derivative from the advantage to be obtained by consuming them. The individual, in our society, works for profit; but the social purpose of his work lies in the consumption of what he produces. It is this divorce between the individual and the social purpose of production that makes it so difficult for men to think clearly in a world in which profit-making is the incentive to industry. We think too much of production, and too little of consumption. One result is that we attach too little importance to enjoyment and simple happiness, and that we do not judge production by the pleasure that it gives to the consumer.
Another result, Russell argues, is a kind of split between positive idleness, which ought to be the nourishing end of work, and negative idleness, which ends up being the effect of work under the spell of consumerism and its consequent socioeconomic inequality. He writes:
The pleasures of urban populations have become mainly passive: seeing cinemas, watching football matches, listening to the radio, and so on. This results from the fact that their active energies are fully taken up with work; if they had more leisure, they would again enjoy pleasures in which they took an active part.
With an eye to our civilization’s triumphs and failures of self-actualization, Russell points out that, historically, there has been a small leisure class enjoying a great many privileges without a basis in social justice, profiting on the backs of a large working class toiling for survival. While this rendered the oppressive leisure class morally condemnable, it resulted in the vast majority of art and science — “the whole of what we call civilization.” He writes:
Without the leisure class, mankind would never have emerged from barbarism.
The method of a hereditary leisure class without duties was, however, extraordinarily wasteful. None of the members of the class had been taught to be industrious, and the class as a whole was not exceptionally intelligent. The class might produce one Darwin, but against him had to be set tens of thousands of country gentlemen who never thought of anything more intelligent than fox-hunting and punishing poachers.
Russell’s most compelling point is the most counterintuitive — the idea that reclaiming leisure is not a reinforcement of elitism but the antidote to elitism itself and a form of resistance to oppression, for it would require dismantling the power structures of modern society and undoing the spell they have cast on us to keep the poor poor and the rich rich. To correctly calibrate modern life around a sense of enough — that is, around meeting the need for comfort rather than satisfying the endless want for consumerist acquisitiveness — would be to lay the groundwork for social justice. In such a society, Russell argues, no one would have to work more than four hours out of twenty-four — a proposition even more countercultural today than it was in his era. He paints the landscape of possibility:
In a world where no one is compelled to work more than four hours a day, every person possessed of scientific curiosity will be able to indulge it, and every painter will be able to paint without starving, however excellent his pictures may be. Young writers will not be obliged to draw attention to themselves by sensational potboilers, with a view to acquiring the economic independence needed for monumental works, for which, when the time at last comes, they will have lost the taste and the capacity.
Above all, there will be happiness and joy of life, instead of frayed nerves, weariness, and dyspepsia. The work exacted will be enough to make leisure delightful, but not enough to produce exhaustion. Since men will not be tired in their spare time, they will not demand only such amusements as are passive and vapid. At least 1 per cent will probably devote the time not spent in professional work to pursuits of some public importance, and, since they will not depend upon these pursuits for their livelihood, their originality will be unhampered, and there will be no need to conform to the standards set by elderly pundits. But it is not only in these exceptional cases that the advantages of leisure will appear. Ordinary men and women, having the opportunity of a happy life, will become more kindly and less persecuting and less inclined to view others with suspicion. The taste for war will die out, partly for this reason, and partly because it will involve long and severe work for all. Good nature is, of all moral qualities, the one that the world needs most, and good nature is the result of ease and security, not of a life of arduous struggle. Modern methods of production have given us the possibility of ease and security for all; we have chosen, instead, to have overwork for some and starvation for the others. Hitherto we have continued to be as energetic as we were before there were machines; in this we have been foolish, but there is no reason to go on being foolish for ever.
Why Feedback Matters
Feedback isn’t just a ritual of the modern workplace. It’s the means by which organisms, across a variety of life-forms and time periods, have adapted to survive. To University of Sheffield cognitive scientist Tom Stafford, feedback is the essence of intelligence. “Thanks to feedback we can become more than simple programs with simple reflexes, and develop more complex responses to the environment,” he writes. “Feedback allows animals like us to follow a purpose.”
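The loop Stafford describes, sense, compare, correct, is the same one control theory formalizes. As a loose illustration (my own sketch, not from the article, with hypothetical names), here is a minimal proportional feedback loop: an agent repeatedly senses the gap between its state and a goal and corrects toward it, which is one simple sense in which feedback lets a system "follow a purpose":

```python
def pursue(goal, state, gain=0.5, steps=20):
    """Move `state` toward `goal` by repeatedly correcting a fraction of the error."""
    for _ in range(steps):
        error = goal - state       # feedback: sense the gap between goal and state
        state += gain * error      # response: correct a fraction of that gap
    return state

# With gain 0.5, the error halves each step, so the loop converges on its goal.
final = pursue(goal=10.0, state=0.0)
```

A reflex, by contrast, would be an open loop: the same fixed response regardless of how far the current state is from the goal.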
Sugar. It’s everywhere, we consume too much, and the health risks are impossible to ignore. Ready to take action? Sure, reducing your sugar intake can feel like an insurmountable challenge. But thousands have made the jump… and so can you.
Lasting dietary changes are of course the long game, but there are lots of ways to get started. Let’s begin by limiting sugar’s visual footprint in your home.
An easy trick is relocating temptations like cookies and candy to a high shelf, or a difficult-to-reach cabinet. In no time, that extra degree of effort will have you instinctively choosing healthier options… quite literally, “the path of least resistance.”
The very next day, you’ll want to — hmmm.
Sometime in the night you managed to drag a chair from the office, reach the highest shelf and eat four boxes of coconut macaroons.
That’s perfectly okay! It’s important not to beat yourself up: sugar’s grip on the metabolism is a formidable opponent, and we’re far from done.
Let’s up your game. Stash forbidden snacks in a remote area of the house: the attic, the boiler room, the crawl space behind the nursery. As a deterrent, arrange a series of delicate or noisy items in front of the entrance: faced with moving them, you’ll be afforded a critical moment to ask, is another sugar rush really worth it?
Now — huh.
Fifteen minutes later and the attic is empty. The pie, the blueberry cheese braid, yes, but also the winter clothes, the musical instruments, even the trunk of encyclopedias you planned to give to the church; all gone. Where did…? Never mind.
The framed family portraits meant as obstacles lie smashed in the corner. It’s like you didn’t even look at them.
Time for the heavy artillery.
Bury all remaining snacks deep in the earth. Destroy the shovel. Surround the area with barbed wire; if possible, adopt an angry dog to — holy Christ, it’s thirty seconds later and the backyard is a wasteland, one giant crater: just black earth and flashes of the bone-white limestone upon which this community was built. The barbed wire has been tossed aside like so much bothersome tinsel and the dog, given to a loving home. How? It goes without saying the maple logs are gone.
What you need now is a friend. A good friend who doesn’t scare easily. Have them gather all sweets from the surrounding area and bring them to a location you have never seen, could never know. When the items are secure, your friend must restrain you with industrial-grade equipment, such that you cannot harm yourself or others. Finally —
You regain consciousness. Caroline is dead and there’s blood everywhere. Blood and… frosting? Oh, god. You have to run. Caroline was such a great part of your life, maybe the best part, honest even when it was painful, forever curious — remarkable at such a late age — but there’s no time for that now. You hear sirens in the distance and you throw a change of clothes and five sleeves of Fig Newtons in a backpack and leave.
That was six months ago.
You live in the cracks between the veneer of polite society now. When you do surface, it’s never long enough for someone to learn your face. You spend your days foraging for roots, nuts, edible flowers, the occasional sick or injured animal you bring down with your dirty, claw-like hands. That sun-drenched kitchen, its cabinets full of poison, is a receding memory.
Congratulations. You’ve said goodbye to sugar.
In this healthier, grateful body, take a moment to appreciate the benefits of your new life. Feel how much energy you have at the start of each day, wiping leaves and rain from your face, scanning the horizon for threats and sustenance. Appreciate how soundly you sleep each night, rags or hay piled beneath you, the cold stars wheeling above.
“I’ve learned that people will forget what you’ve said, people will forget what you did, but they will never forget how you made them feel.”
― Maya Angelou
To understand the best way forward, we often have to look back.
A typical beginning for any career is figuring out how to get the best job, with the right title and the highest pay. A typical ending to any career is figuring out how to give back. The wise ones figure this out sooner rather than later. The even wiser figure out that the quickest path to the getting is in the giving.
An Itch To Give Back
In a recent article in the Atlantic, “What I Learned About Life at My 30th College Reunion,” writer Deborah Copaken observed a number of trends from attending her 30-year college reunion at Harvard.
“No one’s life turned out exactly as anticipated, not even for the most ardent planner.”
“Every classmate who became a teacher or doctor seemed happy with the choice of career.”
“Nearly every single banker or fund manager wanted to find a way to use accrued wealth to give back….”
“[T]hose who went into [art] as a career were mostly happy and often successful, but they had all, in some way, struggled financially.”
“In our early 50s, people seem to feel a pressing need to speak truths and give thanks and kindness to one another before it’s too late to do so.”
The doctors, teachers and artists―careers fundamentally focused on giving―were notably fulfilled. Those who had followed the money were now in search of a way to contribute and give back. Giving seems to be so core to who we are that as we grow older, an internal need arises to give back “before it’s too late.”
Giving is often misconstrued as a quid pro quo. Our gut reaction is to think, “If I give, what will I get in return?” But giving plays an unrecognized role in getting. Former Girl Scouts CEO Frances Hesselbein is just one example.
At the beginning of every meeting, a question hangs in the air: Who will be heard? The answer has huge implications not only for decision making, but for the levels of diversity and inclusion throughout the organization. Being heard is a matter of whose ideas get included — and who, therefore, reaps the accompanying career benefits — and whose ideas get left behind.
Yet instead of relying on subject matter experts, people often pay closest attention to the person who talks most frequently, or has the most impressive title, or comes from the CEO’s hometown. And that’s because of how our brains are built.
The group decision-making process, rather than aligning with actual competence, habitually falls for “messy proxies of expertise,” a phrase coined by University of Utah management professor Bryan Bonner. Essentially, when our brains are left to their own devices, attention is drawn to shortcuts, such as turning focus to the loudest or tallest person in the room. Over time, letting false expertise run the show can have negative side effects.
“The expert isn’t heard, and then the expert leaves,” Bonner said in an interview with the NeuroLeadership Institute, where I head the diversity and inclusion practice. “They want to realize their potential. [If] people can’t shine when they should be shining, there’s a huge human cost.”
If the people who offer the most valuable contributions to your organization aren’t appropriately recognized for it, they won’t stay long. Or, possibly worse, they will stay and stop trying. As my mother was fond of reminding me when I got my first management role: “When people can’t contribute, they either quit and leave or they quit and stay.”
One of the most important assets a group can have is the expertise of its members. But research indicates that even when everyone within a group recognizes who the subject matter expert is, they defer to that member just 62 percent of the time; when they don’t, they listen to the most extroverted person. Another experiment found that “airtime” — the amount of time people spend talking — is a stronger indicator of perceived influence than actual expertise. Our brains also form subtle preferences for people we have met over ones we haven’t, and assume people who are good at one thing are also good at other, unrelated things. These biases inevitably end up excluding people and their ideas.
In recruiting, management scholars have found that without systemic evaluation, hiring managers will favor and advocate for candidates who remind them of themselves. This plays out in meetings, too, where diversity goals can be undermined when the proxies we rely on hinder particular groups: Height gives men and people from certain nations (whose populations tend to be taller) an advantage, and loudness disadvantages introverts and people with cultural backgrounds that tend to foster soft-spokenness. This phenomenon applies to both psychological and demographic diversity.
People are not naturally skilled at figuring out who they should be listening to. But by combining organizational and social psychology with neuroscience, we can get a clearer picture of why we’re so habitually and mistakenly deferential, and then understand how we can work to prevent that from happening.
How Proxies Play Out in the Brain
The brain uses shortcuts to manage the vast amounts of information that it processes every minute in any given social situation. These shortcuts allow our nonconscious brain to deal with sorting the large volume of data while freeing up capacity in our conscious brain for dealing with whatever cognitive decision making is at hand. This process serves us well in many circumstances, such as having the reflex to, say, duck when someone throws a bottle at our head. But it can be harmful in other circumstances, such as when shortcuts lead us to fall for false expertise.
At a cognitive level, the biases that lead us to believe false expertise are similarity (“People like me are better than people who aren’t like me”); experience (“My perceptions of the world must be accurate”); and expedience (“If it feels right, it must be true”). These shortcuts cause us to evaluate people on the basis of proxies — things such as height, extroversion, gender, and other characteristics that don’t matter, rather than more meaningful ones.
Although we humans may have biased brains, we also have the capacity to nudge ourselves toward more rational thinking.
The behavioral account of this pattern was first captured by breakthrough research from Daniel Kahneman and the late Amos Tversky, which eventually earned Kahneman the Nobel Memorial Prize in Economic Sciences and informed his bestseller Thinking, Fast and Slow. Their distinction between so-called System 1 thinking, a “hot” form of cognition involving instinct, quick reactions, and automatic responses, and System 2 “cool” thinking, or careful reflection and analysis, is very important here. System 1 thinking can be seen as a sort of autopilot. It’s helpful in certain situations involving obvious, straightforward decisions — such as the ducking-the-bottle example. But in more complicated decision-making contexts, it can cause more harm than good — for instance, by allowing the person with the highest rank in the meeting to decide the best way forward, rather than the person with the best idea.
Taking Steps to Combat Your Own Decision-Making Bias
Given the extent to which Western business culture puts a premium on individualism and fast decision making, it’s understandable that so many people have been trained to go their own way as quickly and confidently as possible. The good news is that with the right systems in place, people can be trained to approach problem solving in a different, less bias-ridden way.
Although we cannot block a biased assumption of which we are unaware, we can consciously make an effort to direct our attention to the specific information we need to evaluate, and to weigh it consciously. Just about any sort of decision can get hijacked by mental shortcuts, so it’s useful to have a few tools to nudge yourself and others toward more reflective, rigorous, and objective thinking.
Set up “if-then” plans. To guide attention back from these proxies of expertise, you can formulate “if-then” plans, which help the anterior cingulate cortex — a brain region that allows us to detect errors and flag conflicting information — find differences between our actual behavior and our preferred behavior. By incorporating this type of bias-mitigation plan before we enter into a situation where we know a decision will be made, we increase our chances of making optimal decisions.
For example, you can say to yourself: “If I catch myself agreeing with everything a dominant, charismatic person is saying in a meeting, then I will privately ask a third person (not the presenter or the loudest person) to repeat the information, shortly after the meeting, to see if I still agree.”
Get explicit, and get it in writing. One fairly easy intervention is to instruct employees to get in the habit of laying out, in writing, the precise steps that led to a given decision being made. You also can write out the process for your own decision making.
For example, narratives in the form of “We decided X, which led us to conclude Y, which is why we’re going with strategy Z” bring a certain transparency and clarity to the decision-making process and serve as a record that can be referenced later to evaluate which aspects of the process worked and which didn’t.
Incentivize awareness. Along those same lines, managers should reward employees who detect flaws in their thinking and correct course. At the NeuroLeadership Institute, we have a “mistake of the month” section in our monthly work-in-progress meetings to help model and celebrate this kind of admission.
To use a sports example, New England Patriots quarterback Tom Brady reportedly pays his defense if they can intercept his passes in practice. (It must help. He’s one of two players in NFL history to win five Super Bowls.) The takeaway: By making error detection a team sport, you destigmatize the situation, highlight the learning opportunities, and increase the likelihood of making better decisions in the future.
Set up buffers. Taking your decision making from “hot” to “cool” often requires a conscious commitment to create a buffer between when you receive information and when you make a decision on how to move forward.
For example, before a big decision is officially made, everyone involved should be encouraged to spend 10 minutes relaxing or going for a walk before reconvening one last time to discuss any potential issues that haven’t yet come up. This is a way of “cooling off” and making sure things have been thought through calmly. Another way to accomplish this is to engage in a “pre-mortem” — imagining a given decision went poorly and then working backward to try to understand why. Doing so can help identify biases that might otherwise go undetected.
Cut the cues. The most common and research-backed approach involves giving hirers access to fewer of the sorts of cues that can trigger expedience biases. Blind selection is a classic example. In the 1970s and 1980s, top orchestras instituted a blind selection process in which the identity of applicants was concealed from the hiring committee, often by literally hiding the player behind a screen while he or she performed. As a result, the number of female musicians in the top five U.S. symphony orchestras rose from 5 percent in 1970 to more than 25 percent in 1996.
Bonner, the Utah management professor, says to “take the humanity out” when you can. “Set up situations where people exchange information with as little noise as possible,” he says. If you’re brainstorming, have everyone write down their ideas on index cards or on shared documents, then review the ideas anonymously — that way the strength of the idea, rather than the status of the source, will be the most powerful thing.
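The anonymous-review mechanic Bonner describes can be sketched in a few lines of code. This is a minimal illustration, not any real tool: the function name, the submission format, and the sample entries are all hypothetical.

```python
import random

def anonymize_ideas(submissions):
    """Strip author names and shuffle, so ideas are judged on merit alone.

    `submissions` is a list of (author, idea) pairs. Shuffling breaks any
    ordering (e.g., seniority) that might hint at who wrote what.
    """
    ideas = [idea for _, idea in submissions]
    random.shuffle(ideas)  # remove ordering cues about the source
    return ideas

# Hypothetical example: the reviewer sees only the ideas, not the titles.
submissions = [
    ("VP of Sales", "Bundle support with the enterprise tier"),
    ("New analyst", "Offer usage-based pricing for small teams"),
    ("CEO's friend", "Rebrand the whole product line"),
]

for idea in anonymize_ideas(submissions):
    print(idea)
```

The same idea scales down to a shared spreadsheet with the author column hidden; the point is simply that the reviewer never sees status cues alongside the content.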
Technology can also be leveraged. For example, the “merit-based matching” app Blendoor strips the name, gender, and photos of an applicant from a recruiter’s view, and Talent Sonar uses predictive analytics to shape job listings that attract both male and female candidates, and performs a blind resume review, which leads to a 30 percent larger hiring pool, the company says.
Biases are human — a function of our brains — and falling for them doesn’t make us malicious. We have the capacity to nudge ourselves toward more rational thinking, to identify and correct the errors we make as a result of bias, and to build institutions that promote good, clear thinking and decision making. With the right systems, tools, and awareness in place, we can better cultivate the best ideas from the most well-suited minds. It just takes a bit of effort, and in the long run pays off in big ways. The best ideas get a chance to be heard — and implemented — and your best thinkers are recognized and keep on thinking.
Khalil Smith heads the diversity and inclusion practice at the NeuroLeadership Institute. He has 20-plus years of experience in leadership, strategy, and HR, including more than 14 years at Apple Inc.
As with everything in life, if you want to improve you need to see where you are and where you want to go. To succeed (or stay successful) in leadership, you need to see what you did this past year and where you want to go in the new year.
At the same time, the best leaders know that progress isn’t something you check in on once a year. Progress and success can be achieved only through continuous improvement.
Effective leadership can be mastered, and a leadership checklist is an effective tool for making it happen. A solid year-end checkup will make you a better leader—today, tomorrow, and every day of the year.
Here’s my checklist. Try it out and see how you score:
Did you lead with character? Is what you said the same as what you did? When people describe you as a leader, will they cite your character?
Did you create a compelling vision? Were you able to translate your personal goals into a compelling vision that people can rally around?
Did you identify next steps? Did you articulate goals, roles and responsibilities so everyone could be successful in their own right?
Did you think strategically? Did you set forth a pragmatic strategy for achieving both short- and long-term goals?
Did you act decisively? True leadership is about making good and timely decisions and ensuring they are executed. Have you done that? If yes, great! If not, why not? What will you do differently?
Did you build others up? Did you build confidence in others?
Did you communicate effectively? Did you communicate persuasively, concisely and memorably?
Did you listen before you spoke? Did you listen to people until they felt heard? Did you listen with the intent to learn?
Did you encourage feedback from others? Did you listen to feedback and adapt in response?
Did you cultivate leadership in others? Have you spent time developing leadership throughout the organization? If not, why not? And how will you begin?
Did you lead with positivity? If you aren’t leading with positivity, you’re likely leading with negativity, and that has to change for you to become an effective leader.
Did you take ownership? Did you take responsibility for what you do as a leader? When you don’t, people come to feel they cannot count on you.
Did you manage relationships? Personal relationships are at the core of great leadership.
Did you lead with inspiration? Did you create an environment in which others feel inspired and motivated, secure in their capabilities and competence, ready for new challenges and successes?
Did you cultivate a culture of respect? Did you treat people respectfully?
Did you navigate or fix? Did you get out of the way and allow people to show you what they’re capable of? Or did you do the work for them?
Did you value the unique contributions of others? Did you value the gifts that each individual brings by recognizing and appreciating their individual efforts and work?
Did you lead by example? As a leader, did you set an example others would want to emulate?
This year-end checklist is really a year-round checklist—it works as a daily, weekly, monthly and annual self-evaluation. Make adaptations to suit your particular situation, then use it to hold yourself accountable for who you are and what you do to become the kind of leader you want to be.
Lead from within: Use a checklist to become a better leader, for yourself and for those you lead. It will help you and your team become more effective and successful.
Using Neuroscience to Make Feedback Work and Feel Better
Research shows that using feedback is how organisms — and organizations — stay alive. Here’s how leaders can make the most of the anxiety-producing process.
by David Rock, Beth Jones, and Chris Weller
Illustration by Lars Leetaru
A version of this article appeared in the Winter 2018 issue of strategy+business.
Not too long ago, 62 employees at a major consultancy found themselves getting called into a room in pairs, neither person having any prior relationship to the other, for what they were told was a role-playing exercise. Researchers asked them to sit across from each other. Participants then learned they weren’t assigned to be collaborators, but adversaries — opposing sides engaging in a mock negotiation to buy or sell a biotechnology plant. They had six minutes to haggle over the price, and heart-rate monitors would track the ups and downs of the argument.
When the negotiations were finished, each side gave feedback about his or her opponent’s performance. Some participants were told to give the feedback unprompted. Others were instructed to ask for feedback. Quietly, the heart-rate monitors listened.
Here’s what the researchers found: If you want to put people on edge, tell them they will receive some feedback. Or, just as bad, tell them they’ll be giving feedback. Subjects in the study felt equally anxious offering feedback and receiving it, which might explain why so much workplace feedback — particularly in the United States — amounts to a series of polite statements, with few suggestions for improvement.
The simplest questions sometimes hold the greatest potential for insight. As a leader, consider this one: Would you follow you? Not you on your best day, or your worst day, but on a typical day — when you’re communicating, solving problems, and making decisions with the natural tendencies that shape the character of your unique leadership style.
This question matters because when you’re just being you in the company of your team, bosses, and clients, you are continuously showing whether you are someone to trust.
If you’re authentic and fully invested in making an impact while enhancing the experience of those around you, then you strengthen your case. Conversely, if you’re inauthentic or let yourself be silenced in the moments that matter, your case weakens every time you withdraw.
Giving up booze turned out to be incredibly easy once I thought of it not as denying myself something but as deciding not to regularly ingest a depressant. Above all, it was a relief. When you home in on the absence of that grouchy feeling, not drinking makes you feel like a superhero. You wake up every morning and feel good. And if you remember how you used to feel waking up hungover, that feeling of waking up with a spring in your step becomes incredibly addictive.