“Heroes loom large as exemplars of morality. They often embody virtues that we wish to express in our lives,” writes a research team led by psychologists Daryl Van Tongeren of Hope College and Jeffrey Green of Virginia Commonwealth University. Their findings suggest that subtle reminders of the superhero ethos can inspire us to emulate that selfless behavior.
“One reason is that our senses are unreliable. Often, we have to make decisions on the basis of what we’ve just heard or seen. But these sensations can be noisy. How can we be sure of what it was we just heard or saw? Think of radar operators who have to hunt for weak signals and decide if this is an incoming missile or a flock of birds. The wrong decision could start World War Three.”
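The radar example is textbook signal detection theory: given a noisy reading, choose between two hypotheses, trading missed signals against false alarms. Here is a minimal sketch of that decision rule in Python; the Gaussian noise model, threshold, and all the numbers are illustrative assumptions, not anything from the quoted piece:

```python
import random

# Hypothetical signal-detection setup: a sensor reading is the true level
# plus Gaussian noise, and the operator must decide "missile" or "birds".
SIGNAL_MEAN, NOISE_MEAN, SPREAD = 1.0, 0.0, 1.0  # assumed values

def reading(signal_present: bool) -> float:
    """One noisy sensor reading."""
    mean = SIGNAL_MEAN if signal_present else NOISE_MEAN
    return random.gauss(mean, SPREAD)

def decide(x: float, threshold: float = 0.5) -> bool:
    """Report 'missile' when the reading clears a threshold. Lowering the
    threshold misses fewer missiles but raises the false-alarm rate."""
    return x > threshold

# Estimate hit and false-alarm rates over many simulated trials.
trials = 10_000
hit_rate = sum(decide(reading(True)) for _ in range(trials)) / trials
false_alarm_rate = sum(decide(reading(False)) for _ in range(trials)) / trials
print(f"hits: {hit_rate:.2f}, false alarms: {false_alarm_rate:.2f}")
```

Under these assumptions, no threshold eliminates both kinds of error at once; that unavoidable trade-off is exactly the predicament of the radar operator in the quote.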
Roughly speaking, the most common defense of diversity has two parts. The first focuses on the educational and social benefits of diversity. The second attempts to show the inherent value of a diverse environment, one that is in some sense representative of the diversity of the American, or perhaps global, population.
Most philosophers agree that shame is about failing to live up to our moral ideals, but stories such as Lucy Grealy’s and others’ seem not to fit this definition. For example, it’s common for people who suffer from mental illness to feel shame. People who experience poverty feel shame because of it. It’s also common for women to feel shame more often than men, and for black people to feel shame more often than white people. To argue that all these people feel shame because, deep down, they feel like moral failures is to assume that entire populations are suffering from a delusion. Maybe the problem isn’t that these cases are irrational. Maybe the problem is that shame isn’t about ideals in the first place.
The borderless fluidity of open offices seems perfectly suited to the ambitions of the internet age—while also replicating its failed aspirations toward “connectivity.” Just as hyper-modulated online interactions, contrary to the promise of their conceptual foundations, cordon people into niche micro-experiences, so the open office counterintuitively isolates office workers. A recent study from Harvard Business School confirms this deterioration of face-to-face interaction.
When The New York Times used the word, the Facebook comments were … interesting, and some were thoughtful. One linguist: “In its attempt to be gender-inclusive… one can argue that it’s gender-erasing of women who have fought for a long time to not just have Latino, but to have Latino/Latina, to make sure women are represented.”
Here’s the theory: “The power of the Amazon review is not what you might think. They’re not really there to help you purchase a clock or a book or even to develop a conspiracy theory about the increasing flimsiness of Ziploc sandwich bags compared to other brands. … I mean, they are there to help you purchase things, but that is secondary. The real reason to read Amazon reviews, and, in particular, to follow the Hansel-and-Gretel breadcrumb trail of those reviews as left by one person from product to product, is to glimpse into a life, strange and whole and utterly unlike your own. This is where the real magic lies.”
Between 6,000 and 10,000 churches die every year in the United States. “As donations and attendance decrease, the cost of maintaining large physical structures that are only in use a few hours a week by a handful of worshippers becomes prohibitive. None of these trends show signs of slowing, so the United States’s struggling congregations face a choice: start packing or find a creative way to stay afloat.”
China’s rise as a tech powerhouse has dovetailed with Silicon Valley’s growing, and often vividly expressed, distrust toward democracy itself. Always steeped in libertarian pique—not long ago, technologists expressed hope for floating ad-hoc nation-states or, as Larry Page put it, referencing Burning Man, “some safe places where we can try out some new things”—Silicon Valley now toys with Californian secessionism and Singapore-style authoritarian technocracy. That new horizon, that place of raucous experimentation with a frontier-like possibility of striking it rich, they believe, is in China.
Here’s the problem: the theory of mind we all carry around with us and use every day has no basis in what neuroscience—Nobel Prize-winning neuroscience—tells us about how the brain works. Neuroscience has revealed that the theory is just as much of a dead end as Ptolemaic astronomy. It’s been around for such a long time only because it was the predictive device natural selection came up with, in spite of being fundamentally mistaken about how things were really arranged.
It is not the first smart city—municipalities around the world have adopted smart infrastructure like artificial-intelligence-enabled traffic lights—but it might be the most ambitious. The project’s 200-page wish list of features is astounding. The “vision document” imagines not only the revitalization of a 12-acre plot that has sat largely vacant since its heyday as an industrial port, but its transformation into a micro-city outfitted with smart technologies that will use data to disrupt everything from traffic congestion to health care, housing, zoning regulations, and greenhouse-gas emissions. Long before flying cars, smart sensors won’t just be in our mattresses or our bidets; they’ll be embedded in the walls of our homes and the concrete beneath our feet.
Most of us don’t seek out a new form of language, and if we happen to come across arbitrary sentences or silly paragraphs, we’re less than thrilled about it. The old idioms work just fine. We know what they mean. Even if I store food in cartons in the fridge, I don’t “keep all my eggs in one basket.” Even if you never cook for yourself, you sometimes “put it on the back burner.” Does this mean that old idioms are inevitably clichéd?
In 1930, John Maynard Keynes predicted that, by the end of the century, technology would have become so far advanced that developed economies would have a 15-hour workweek. So how did we get to our current state, almost two decades into the 21st century? It turns out that Keynes was only half right—technology has advanced spectacularly, but we are far from a 15-hour workweek.
There are only two problems with the work ethic today: Work doesn’t reliably deliver the social, moral, and spiritual goods it promises, and artificial intelligence is about to render the work ethic moot.
In just a few minutes of mental wandering, you have made several distinct round trips from past to future: forward a week to the important meeting, forward a year or more to the house in the new neighborhood, backward five hours to today’s meeting, forward six months, backward five years, forward a few weeks. You’ve built chains of cause and effect connecting those different moments; you’ve moved seamlessly from actual events to imagined ones. And as you’ve navigated through time, your brain and body’s emotional system has generated distinct responses to each situation, real and imagined. The whole sequence is a master class in temporal gymnastics.
We know that surveillance has a chilling effect on freedom. People change their behavior when they live their lives under surveillance. They are less likely to speak freely and act individually. They self-censor. They become conformist. This is obviously true for government surveillance, but it is true for corporate surveillance as well. We simply aren’t as willing to be our individual selves when others are watching.
The word “robot,” like the words “shalom” and “free-range chicken,” does not have a universally agreed-upon definition, but the usual criteria include autonomy, an ability to change its surroundings, intelligence, and the possession of a body. Then it gets trickier: How intelligent? Must a robot be mobile? Is a dishwasher a robot? According to the podcast “Robot or Not?” a self-driving car is not (you designate its destination), but a Roomba is (because it’s more in control of its path than you are).
If we are to heal the divides of the contemporary historical moment, we should give up the fiction that reason alone has ever held the day. The present warrants criticism, but it will do no good if it’s based on a myth about some glorious, dispassionate past that never was.
The challenge of Duo, with its addictive rewards and points and treasure chests, is that it’s trying to figure out both how to get people to learn—and how to get them to stay. The CEO: “We prefer to be more on the addictive side than the fast-learning side. … If someone drops out, their rate of learning is zero.”
Although most of us would agree that both bullshit and the outright lie are modes of misrepresentation, there exists a key difference between the two. Neither the bullshitter nor the liar can be relied upon to tell the truth. But in order to lie, the liar must first believe that she knows the truth; only then can she persuade her audience of what she knows to be untrue. The bullshitter, on the other hand, maintains no relationship at all with the truth: it is irrelevant to the bullshitter whether what she says is true or false, and what she is guilty of misrepresenting is precisely her concern for the distinction between the two.
After the students finished lying to her, researcher Danielle Polage asked them to again rate their certainty that each of these events had or had not happened. Fascinatingly (and a little creepily), subjects showed a statistically significant change in their beliefs: after saying that untrue events had happened to them, they became less sure that those events hadn’t. Conversely, when subjects were later asked to deny events that had happened to them, they became less sure that those events did take place.
“We allow our great cultural institutions to fall into disrepair and disrepute because, as we strip them of their reverential traditions and their arduous canon, we also strip them of our reasons to cherish them. We call them before the tribunal of public opinion to justify their very existence, as if we can no longer see through the smog to the heights of Parnassus, lonelier than ever because we have forgotten that it is even there. We attempt to chain the Muses to the machinery of our modern malaise, as if we do not remember that they exist to show us the way to transcend that malaise, to find our way home again, by way of that steep and difficult climb, to the bosom of art and learning.”
Even Herodotus never considered how to integrate the historical timelines of the Greeks, Egyptians, and Persians. The problem was the lack of any fixed common calendar, any agreed-upon way of determining which year was which and what happened when, since each civilization had its own notional Year One. Then, because he got tired of having to consult many different books, the ruler of a kingdom on the Caspian Sea asked a Persian scholar to develop a timeline that could cover all peoples and their histories. (It was only happenstance that this occurred in a year that carried a big round number in the European calendar.)
Over the past century we’ve vastly increased the time and money invested in science, but in scientists’ own judgement we’re producing the most important breakthroughs at a near-constant rate. On a per-dollar or per-person basis, this suggests that science is becoming far less efficient.
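The underlying arithmetic is simple: if the rate of major breakthroughs holds roughly constant while inputs grow, then breakthroughs per dollar (or per researcher) must fall in proportion. A toy illustration with invented figures—none of these numbers come from the article:

```python
# Invented figures, purely to show the shape of the argument.
breakthroughs_per_year = 10               # assumed roughly constant over time
funding_then, funding_now = 1.0, 50.0     # research spending, arbitrary units

efficiency_then = breakthroughs_per_year / funding_then
efficiency_now = breakthroughs_per_year / funding_now
print(f"breakthroughs per unit of funding fell {efficiency_then / efficiency_now:.0f}x")
```

With these assumed inputs, a fifty-fold growth in spending against a flat breakthrough rate means a fifty-fold drop in per-dollar output—which is the sense in which science could be “far less efficient” even while producing as many landmark results as ever.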
The ubiquitous use of ‘intellectual property’ began in the digital era of production, reproduction and distribution of cultural and technical artifacts. As a new political economy appeared, so did a new commercial and legal rhetoric. ‘Intellectual property’, a central term in that new discourse, is a culturally damaging and easily weaponised notion. Its use should be resisted.
As we attempt to grapple with this bleak post-human future, we must also confront the question of what humans can hope to understand. Parts of the physical world are understood: they can be observed and described by theories. But much of it cannot be. Human observation bumps up against stark limits. Human reasoning is not limitless either, but it does allow us to think through what might in principle be “over the horizon.”
If the most polarized population uses the Internet and social media the least, then pointing a finger at technology says more about our anxieties over the rate of technological change than about what has actually happened to us. The fact is that this twenty-two-year-old dynamic of polarization can’t easily be blamed on the Internet.
That transformation, from facts to numbers to data, traces something else: the shifting prestige placed on different ways of knowing. Facts come from the realm of the humanities, numbers represent the social sciences, and data the natural sciences. When people talk about the decline of the humanities, they are actually talking, in part, about the rise and fall of the fact. And when people try to re-establish the prestige of the humanities through the digital humanities and large data sets, that is no longer the humanities: what humanists do operates at a different epistemological scale, with a different unit of knowledge.
Respect for children means respect for the adults that they will one day become; it means helping them to the knowledge, skills, and social graces that they will need if they are to be respected in that wider world where they will be on their own and no longer protected. For the teacher, respect for children means giving them whatever one has by way of knowledge, teaching them to distinguish real knowledge from mere opinion, and introducing them to the subjects that make the mind adaptable to the unforeseen. To dismiss Latin and Greek, for example, because they are not “relevant” is to imagine that one learns another language in order, as Matthew Arnold put it, “to fight the battles of life with the waiters in foreign hotels.”
With its heavy focus on artificially intelligent curation, Google Photos suggests the dawning of a new age of the personalized robot historian. The trillions of images we are all snapping will become the raw material for algorithms that will curate memories and construct narratives about our most intimate human experiences. In the future, the robots will know everything about us—and they will tell our stories.