Social psychologists, sociologists and anthropologists would not be baffled by this apparent contradiction. Many have long believed that morality is essentially a system of social regulation. As such it is in no more need of a divine foundation or a philosophical justification than folk dancing or tribal loyalty. Indeed, if ethics is just the management of the social sphere, it should not be surprising that as we live in a more globalized world, ethics becomes enlarged to encompass not only how we treat kith and kin but our distant neighbours too.
“The midlife crisis was invented in London in 1957. That’s when a 40-year-old Canadian named Elliott Jaques stood before a meeting of the British Psycho-Analytical Society and read aloud from a paper he’d written. Addressing about a hundred attendees, Jaques claimed that people in their mid-30s typically experience a depressive period lasting several years. … In ordinary people symptoms could include religious awakenings, promiscuity, a sudden inability to enjoy life, ‘hypochondriacal concern over health and appearance,’ and ‘compulsive attempts’ to remain young.”
In an age that supremely prizes capitalist efficiency, the proliferation of pointless jobs is a puzzle. Why are employers in the public and private sector alike behaving like the bureaucracies of the old Soviet Union, shelling out wages to workers they don’t seem to need? Since bullshit jobs make no economic sense, David Graeber argues, their function must be political. A population kept busy with make-work is less likely to revolt.
Ehrenreich contemplates with some satisfaction not just the approach of her own death but also the passing of her generation. As the boomers have aged, denial of death, she argues, has moved to the center of American culture, and a vast industrial ecosystem has bloomed to capitalize on it. Across twelve chapters, Ehrenreich surveys the health care system, the culture of old age, the world of “mindfulness,” and the interior workings of the body itself, and finds a fixation on controlling the body, encouraged by cynical and self-interested professionals in the name of “wellness.”
The most consistent finding from this vast literature, the one fundamental result, is that personal space expands with anxiety. If you score high on stress, or if the experimenter stresses you ahead of time—maybe you take a test and are told that you failed it—your personal space grows relative to other people.
Two numbers colleges used to rely on – SAT scores and GPAs – just aren’t that reliable anymore, admissions people say. So how to judge student applications? The answer is not good: “Admissions officers at about half of the institutions surveyed said an applicant’s ‘ability to pay’ was of at least ‘some importance’ in application decisions.”
From personal assistant robots acting as companions, to robots that offer reminders of daily tasks when our memories fail, to precision surgical robots that reduce human error, the future of aging looks a lot different from today. The current strategy is to build basic AI capabilities into home companion robots while the technology matures. But how far is too far?
“As a philosophy major in college before medical school, I believe I learned what it means to be a good doctor equally from my humanities classes as from my science classes. Studying the humanities helps students develop critical-thinking skills, understand the viewpoints of others and different cultures, foster a just conscience, build a capacity for empathy, and become wise about emotions such as grief and loss. These are all characteristics that define a good doctor.”
“These seven moral rules – love your family, help your group, return favors, be brave, defer to authority, be fair, and respect others’ property – appear to be universal across cultures. My colleagues and I analyzed ethnographic accounts of ethics from 60 societies (comprising over 600,000 words from over 600 sources). We found that these seven cooperative behaviors were always considered morally good.”
Scientists have been at this question for several years, studying people’s activity online and revealing interesting trends as to what makes content eye-catching and more likely to go viral. Emotional arousal is one key determinant. After analyzing 7,000 articles from the New York Times, Jonah Berger and Katherine Milkman from UPenn found that one of the main factors driving readers to share a story via email was how much it stirred them up.
“We read about computers that can master ancient games and drive cars. [Turing Award-winning researcher Judea] Pearl is underwhelmed. As he sees it, the state of the art in artificial intelligence today is merely a souped-up version of what machines could already do a generation ago: find hidden regularities in a large set of data. … The key, he argues, is to replace reasoning by association with causal reasoning” – that is, teach machines to process cause and effect.
Hidden civilizations offer one possible answer to the Fermi Paradox, which raises the question of why we haven’t found evidence of intelligent alien life if many such races exist out there. Rather than support Enrico Fermi’s theory that intelligent life is unique to Earth, Dark Forest Theory raises the possibility that alien life is too intelligent to be detected, either because it’s hiding or because it’s plotting another race’s destruction.
Thomas Nagel argued that when we sense that something – or everything – in life is absurd, we’re experiencing the clash of two perspectives from which to view the world. One is that of the engaged agent, seeing her life from the inside, with her heart vibrating in her chest. The other is that of the detached spectator, watching human activity coolly, as if from the distance of another planet.
What has changed is not so much the level of noise, which previous centuries also complained about, but the level of distraction, which occupies the space that silence might invade. There looms another paradox, because when it does invade—in the depths of a pine forest, in the naked desert, in a suddenly vacated room—it often proves unnerving rather than welcome. Dread creeps in; the ear instinctively fastens on anything, whether fire-hiss or bird call or susurrus of leaves, that will save it from this unknown emptiness. People want silence, but not that much.
“These companies aren’t out to nail trends, as the fast fashion manufacturers of past decades did, but rather to sell an all-encompassing clothing system through which consumers are meant to live. In tech terms, the brands are platforms and the products must be scalable, aimed at as wide and profitable an audience as possible, whether those products are fabric sneakers or ethically manufactured underwear. It’s clothing as software, embracing an ethos of one-for-all uniformity.”
Why does that make a difference? “The vocal variety now offered by these companies minimizes the subservient female assistant vibe. And as these assistants are increasingly being adopted in households with children, bossing around not just a female-voiced assistant seems like a healthy step in teaching gender equality and eliminating traditional gender role expectations. For younger children anthropomorphizing the bot, a changing voice may also make it clear that this is a computer entity, not a person.”
“It all began in a simpler time — February 2015 — on an ordinary Thursday evening when a photo of a dress posted on Tumblr got picked up by BuzzFeed. Was the dress white and gold, or blue and black? The question pitted brother against brother, friend against friend, caused celebrities to weigh in, and basically ground the internet to a halt. The dress was actually blue, but that hardly mattered. … What mattered was that the chasm between perception and reality had opened up, and we found ourselves teetering on the edge.”
The fandom, especially the fandom among women, for this podcast is reaching pop-star levels. “Calling themselves Murderinos, they came to hear expletive-laden tales of serial killings and brutal homicides told by Georgia Hardstark and Karen Kilgariff, the irreverent hosts of the wildly popular true-crime comedy podcast ‘My Favorite Murder.'” (One of the hosts’ offhand comments, “Toxic masculinity ruins the party again,” has become something of a rallying cry.)
There is a world that exists—an uncountable number of differently-flavored quarks bouncing up against each other. There is a world that we perceive—a hallucination generated by about a pound and a half of electrified meat encased by our skulls. Connecting the two, or conveying accurately our own personal hallucination to someone else, is the central problem of being human. Everyone’s brain makes a little world out of sensory input, and everyone’s world is just a little bit different.
Like the New York intellectuals who had clustered around Commentary and the Partisan Review in the Sixties, and partly in conscious imitation of them, the writers and editors of the new magazines blended art, criticism, philosophy and self-examination in the confidence that these activities would all be, when carried out with a sufficient level of clarity and insight, mutually reinforcing.
A Dartmouth-led study finds that the brain may tune toward social learning even when it is at rest. The findings, published in an advance article of Cerebral Cortex, demonstrate empirically for the first time how two regions of the brain show increased connectivity during rest after encoding new social information.
Learning to see is not an innate gift; it is an iterative process, always in flux and constituted by the culture in which we find ourselves and the tools we have to hand. Harriot’s 6-power telescope certainly didn’t provide him with the level of detail of Galileo’s 20-power. Yet the historian Samuel Y. Edgerton has argued that Harriot’s initial (and literal) lack of vision had more to do with his ignorance of chiaroscuro – a technique from the visual arts first brought to full development by Italian artists in the late 15th century – than with the power of his instrument.
For many, the idea of the ‘creative person’ comes from popular media, which inundates us with news stories and movie portrayals of the suffering artist and the mad genius. And there are anecdotal accounts closer to our real lives: many of us have heard stories about someone who suffers from a deep depression – but also creates beautiful poetry. Repeatedly hearing these accounts fuels a stereotype. When we frequently see two unique things (eg, extraordinary creativity and mood disorders) occur together, they become paired in our minds, creating what is termed an illusory correlation.
If one looks at history, the answer seems obvious: What fences have very often indicated is not simply what is mine and what is yours, but, more subtly, who I am versus who you are. This tendency is based on the human inclination to define one’s identity in contrast to someone cast as a different, untrustworthy Other best kept at a distance.
That’s what a pair of researchers found when studying firefighters in Colorado and undergraduates in London. “When you experience stressful events, whether personal (waiting for a medical diagnosis) or public (political turmoil), a physiological change is triggered that can cause you to take in any sort of warning and become fixated on what might go wrong.”
“Inundated via social media with the opinions of multitudes, users are diverted from introspection; in truth many technophiles use the internet to avoid the solitude they dread. All of these pressures weaken the fortitude required to develop and sustain convictions that can be implemented only by traveling a lonely road, which is the essence of creativity.”
“My work consists entirely of creating the conditions for genius to flow. I am not in possession of it — it resides in that flow of output, which everyone participates in. ‘Genius’ is the oxygen that those in a shared space breathe in and are transformed by; it allows them to reach their full potential. In this way, ‘genius’ returns to its original Latin meaning of an ‘attendant spirit.’”
The act of gesturing seems to be universal (every known human group does it), and we know that there are certain gestures that are culture-specific. (There are places where you definitely shouldn’t make the thumb-and-forefinger “okay” sign.) “What people produce much more often are gestures for ‘yes’ and ‘no’; points to people, places and things; gestures that sketch objects, actions and represent abstract ideas through visual metaphors. These are the real workhorses of gestural expression. And, as it turns out, a case can be made that these workhorses are broadly similar the world over.”