Self-help—the enemy of the uncalm—is, unsurprisingly, an American phenomenon. It evinces a sensibility well suited to a country where the self has always been the most relevant unit.
Samuel Scheffler, a moral philosopher at New York University, has suggested that the real problem with the fantasy of immortality is that it doesn’t make sense as a coherent desire. Scheffler points out that human life is intimately structured by the fact that it has a fixed (even if usually unknown) time limit. We all start with birth, pass through many stages of life, and end, inevitably, in death.
For all its sundry failings and inexcusable prejudices, conventional art history provided a fundamental framework for assessing quality. Grouping works according to such commonalities as place of origin, period and circumstances of execution, artistic intent, function and medium facilitated comparative judgments. In recent decades, academia has largely rejected this sort of connoisseurship because it was too often tied to “great man” narratives. Over the same period, professional art criticism was effectively obliterated by a journalistic obsession (both in the surviving print media and online) with glamour, scandal and money. While the art world was never entirely free from market forces, those forces are now essentially left alone to determine value.
Kwame Anthony Appiah: “Like all the words in our language, the identity labels we use are a common possession. Were everybody to follow Humpty Dumpty’s example [‘When I use a word, it means just what I choose it to mean’], we simply couldn’t understand one another. If Toni Morrison isn’t a black woman, the term isn’t doing any work. … ‘Lesbian’ isn’t much use if you’re looking for a partner on Bumble unless it signifies a woman who might be open to sex with another woman.”
In short, the algorithm is able to note differences between what it encounters and what it has seen in the past. Like most people, but unlike most other algorithms, the new system Higgins built for Google can recognize that it isn’t looking at a brand-new object just because it’s seeing a familiar one from a new angle.
While freelance websites may have raised wages and broadened the pool of potential employers for some people, they’ve also thrust every new worker who signs up into a global marketplace with endless competition, low wages, and little stability. Decades ago, the only companies that outsourced work overseas were multinational corporations with the resources to set up manufacturing operations elsewhere. Now, independent businesses and individuals are using the power of the internet to find the cheapest services in the world, too, and it’s not just manufacturing workers who are seeing the downsides of globalization. All over the country, graphic designers, voice-over artists, writers, and marketers have to keep lowering their rates to compete.
“People get hung up on how eccentric some of his ideas were, but the core of his claims remains relevant and important. That is to say: our aesthetic experience, our experience of beauty in ordinary life, must be central to thinking about any good life and society. It’s not just decoration or luxury for the few. If you are taught how to see the world properly through an understanding of aesthetics, then you’ll see society properly.”
Everything about the recent past, and the generalization of the op-ed form across the internet, suggests there is an inexhaustible fund of such figures, a reserve army of op-ed labor waiting in the wings. Twitter has helped turn the internet into an engine for producing op-eds, for turning writers into op-ed writers, and for turning readers into people on the hunt for an op-ed. The system will not be satisfied until it has made op-ed writers of us all.
When scientists tried to reproduce the results of 100 psychology studies a few years ago, they came to an alarming conclusion: Fewer than half of the studies could be replicated, suggesting the field might be rife with flawed knowledge about human behavior. Now, a few of those same scientists—along with some new colleagues—have taken stock of the field again, by trying to reproduce 21 studies recently published in two of science’s top journals, Science and Nature.
Humans are born incomplete. The brain absorbs huge amounts of essential information throughout childhood and adolescence, which it uses to carry on building who we are. It’s as if the brain asks a single, vital question: Who do I have to be, in this place, to thrive? If the answer was a boastful hustler in ancient Greece and a humble team player in ancient China, then who is it in the West today? The answer is a neoliberal.
“It began to unfold back in the ’60s and ’70s, when identity came to the forefront. People felt unfulfilled. They felt they had these true selves that weren’t being recognized. In the absence of a common cultural framework previously set by religion, people were at a loss. Psychology and psychiatry stepped into that breach. In the medical profession, treating mental health has a therapeutic mission, and it became legitimate to say the objective of society ought to be improving people’s sense of self-esteem. This became part of the mission of universities, which made it difficult to set educational criteria as opposed to therapeutic criteria aimed at making students feel good about themselves. This is what led to many of the conflicts over multiculturalism.”
At first, you think: Rich people making a difference — so generous! Until you consider that America might not be in the fix it’s in had we not fallen for the kind of change these winners have been selling: fake change.
Philosophy professor Neal Tognazzini: “I think there’s something distinctively valuable about allowing many aspects of your life — even the very fact of your life — to recede into the background, into an unconscious mental box we might label ‘presuppositions.’ I would go so far as to say that these presuppositions are what enable you to live a life at all.”
Jonardon Ganeri, himself a biracial philosopher, argues that it’s an unintended consequence of one of the qualities the field most values in itself – one that distinguishes it from most of the humanities – combined with the conscientious scholar’s natural reluctance to muck about with subject matter she doesn’t know.
The idea that historians could use their knowledge of the past to advise useful courses of action for the future goes all the way back to Thucydides. “In recent decades, however, things have changed. The longstanding view of the historian as being, in modern jargon, ‘policy-relevant’, has fallen out of favour and often arouses suspicion” – within the discipline as well as outside it. Robert Crowcroft makes the case for a widespread revival of the approach now called “applied history.”
In the coming years, it may be that conversational, artificially intelligent assistants will become part of the answer, deciding whether or not to alert us to messages, helping us retrieve information and recommending items of interest. But figuring out book reviews, indexes and the rest took several centuries, so we shouldn’t expect an immediate solution. In the meantime we must endure information overload: the feeling that arises in the space of time between a sudden increase in the flow of information and the development of the tools to enable us to cope with it.
“Yes, at the moment the concept is seen as little more than another bit of self-referential young person slang, used only in the deepest recesses of the web. … But irony poisoning should be entered, we think, into the pantheon of social science concepts that are used to rigorously measure, study and perhaps one day understand how social media platforms can rewire your brain and alter society.” Max Fisher and Amanda Taub explore the concept and how it works.
Perhaps it’s time we realized that consuming more news about the world around us is not the way to improve it (or ourselves), personally or politically. Nearly two thousand years ago, Marcus Aurelius wrote in his Meditations, “Are you distracted by breaking news? Then take some leisure time to learn something good, and stop bouncing around.”
There is a great conundrum, or — if you prefer — a dark secret, about modern philosophy: while diversity is the lifeblood of philosophy, philosophy as we now find it in the United States (and equally elsewhere) has come to fear and shun diversity, specifically the diversity of philosophical opinion and argumentation from extra-European cultures. How did this happen? And why?
A new nationally representative survey about “screen time and device distractions” from the Pew Research Center indicates that it’s not just parents who think teenagers are worryingly inseparable from their phones—many teens themselves do, too. Fifty-four percent of the roughly 750 13-to-17-year-olds surveyed said they spend too much time absorbed in their phones, and 65 percent of parents said the same of their kids’ device usage more generally.
Modern narratives of capitalist development often assume that vast wealth accumulated by a few accompanies improved circumstances for many. The history of slavery’s capitalism warns against such expectations.
Martha Nussbaum: “We humans are very self-focused. We tend to think that being human is somehow very special and important, so we ask about that, instead of asking what it means to be an elephant, or a pig, or a bird. … The question, ‘What is it to be human?’ is not just narcissistic, it involves a culpable obtuseness. It is rather like asking, ‘What is it to be white?’ It connotes unearned privileges that have been used to dominate and exploit. But we usually don’t recognize this because our narcissism is so complete.”
Perhaps whole revised paradigms of thought, such as those that emerged a century or so ago with relativity and quantum mechanics, will take comprehension in currently unimaginable directions. Maybe we shall find that the cosmos is just mathematics rendered substantial. Maybe our comprehension of consciousness will have to be left to the artificial device that we thought was merely a machine for simulating it. Maybe, indeed (circularity again), only the artificial consciousness we shall have built will have the capacity to understand the emergence of something from nothing. I consider that there is nothing that the scientific method cannot elucidate.
Do people judge themselves to be more or less authentic over time? My inclination is that people believe they are becoming more of their true self as time passes. After all, most of us would like to think that we are growing and changing in positive ways. And the constant bombardment of messages urging us to be ‘true to ourselves’ can suggest that some force prevents us from fully expressing who we really are.
What did people once believe they lost when they lost their privacy? Surprisingly, it turns out that a large number of people began to speak of privacy in a self-conscious way only toward the end of the nineteenth century. As is often the case, the first defenders of privacy became aware of its value at the moment they were on the verge of losing it.
Close reading is hard, which is how this class ended up telling its professor that “The Love Song of J. Alfred Prufrock” was about a prostitute. “The predominant interpretation holds sway because students have been trained that their emotional response to a text is just as valid as, say, what it means to read a text within its historical or cultural context.”
Comedian Adam Cayton-Holland: “I don’t really have a joke about it. It’s more like a public service announcement that I make into the microphone, urging people to seek help, urging people to not feel ashamed of feeling powerless when it comes to their brains. You can see the audience tense up. I pepper in a few jokes here and there, to try to cut the tension. Which kind of works. It’s not perfect, but it’s real.”
The transformation of American colleges and universities into corporate concerns is particularly evident in the maze of offices, departments and agencies that manage the moral lives of students. When they appeal to administrators with demands that speakers not be invited, that particular policies be implemented, or that certain individuals be institutionally sanctioned, students are doing what our institutions have formed them to do. They are following procedure, appealing to the institution to manage moral problems, and relying on the administrators who oversee the system.
There has been a lot of discussion of “what’s left for humans?” as AI improves at exponential rates; the customary answer is that humans need to focus on the things they are uniquely good at, such as creativity, intuition, and personal empathy. But I think we now have to ask, “what’s left for firms?”
Sometimes, the only way to help someone seems to be a cruel or nasty approach – a strategy that may leave the ‘helper’ feeling guilty and wrong. Now research from my team at Liverpool Hope University in the UK sheds light on how the process works.