“Just as the division of labor among humans leads to much better work outcomes, so will the rise of automation benefit the worker. Only the outcome will be many multiples greater than that which springs from human divisions of labor. Imagine the future if robots achieve their potential to erase all manner of work forms,” he writes. “How very exciting.”
Currently the record is held by Sebastiaan Bowier, who set it in 2012 at 133.78 km/h, or just over 83 mph. It’s hard to imagine how his bike, which looked more like a tiny landbound rocket than any kind of bicycle, could be significantly improved on. But every little bit counts when records are measured down to a hundredth of a unit, and anyway, who knows but that some strange new shape might totally change the game?
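For the curious, the km/h-to-mph conversion quoted above checks out; here is a minimal sketch in Python, assuming only the standard definition of 1.609344 km per mile:

```python
# Sanity-check the speed-record conversion quoted above.
KM_PER_MILE = 1.609344  # international mile, by definition

record_kmh = 133.78                    # Bowier's 2012 record
record_mph = record_kmh / KM_PER_MILE  # convert km/h to mph
print(f"{record_kmh} km/h = {record_mph:.2f} mph")  # 133.78 km/h = 83.13 mph
```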
Some people might be more comfortable disclosing their innermost feelings to an AI. A study conducted by the Institute for Creative Technologies in Los Angeles in 2014 suggests that people display their sadness more intensely, and are less scared about self-disclosure, when they believe they’re interacting with a virtual person rather than a real one. As when we write in a diary, screens can serve as a kind of shield from outside judgment.
I think it’s time we take a lesson from the history of science. Beauty does not have a good track record as a guide for theory-development. Many beautiful hypotheses were just wrong, like Johannes Kepler’s idea that planetary orbits are stacked in regular polyhedrons known as ‘Platonic solids’, or that atoms are knots in an invisible aether, or that the Universe is in a ‘steady state’ rather than undergoing expansion. And other theories that were once considered ugly have stood the test of time.
“The common misconception is that this trick involves the performer somehow ‘throwing’ their voice through a clever trick of the voice box.” But that’s not it at all. “‘Imagine you hear a loud sound, and at exactly the same time, there is an abrupt appearance of something. Then, automatically — because of the coincidence in time — you would tend to associate these two events as originating from the same cause,’ says [researcher] Salvador Soto-Faraco … ‘That is the inference that happens in ventriloquist illusions.’”
Today digital technology is all the rage because after decades of development it has become incredibly useful. Still, if you look closely, you can already see the contours of its inevitable descent into the mundane. We need to start preparing for a new era of innovation in which different technologies, such as genomics, materials science, and robotics, rise to the fore.
For most of the two and a half centuries since the Reverend Thomas Bayes first made his pioneering contributions to probability theory, his ideas were side-lined. The high priests of statistical thinking condemned them as dangerously subjective and Bayesian theorists were regarded as little better than cranks. It is only over the past couple of decades that the tide has turned.
Today, nearly all scientists say that coincidences are just that: coincidences – void of greater meaning. Yet they’re something we all experience, and with a frequency that is uniform across age, sex, country, job, even education level. Those who believe that they’ve had a ‘meaningful coincidence’ in their lives experience a collision of events so remarkable and unlikely that they choose to ascribe a form of grander meaning to the occurrence, via fate or divinity or existential importance.
Even the mathematically averse among us today recognize the basic geometry that Radolph and Ragimbold failed to grasp, for we live in a numerate society, surrounded by countless manifestations of mathematics. Broadly defined as the ability to reason with numbers and other mathematical concepts, numeracy underlies our current information explosion. Its clichés dot popular speech: “do the math,” “crunch the numbers,” “figure the odds.” From birth to death, numbers track our lives institutionally and demographically. Some scorn such customs (think of Mark Twain’s “figures” of “lies, damned lies, and statistics”), but we all acknowledge numeracy as a cultural given, and agree that mathematics fuels the science, technology, and industry of our world.
The pessimist in me, however, thinks San Francisco can only continue further down this path, with the old-money propertied class dying or cashing out, the non-techies getting squeezed, and everyone getting pushed into the four-level hierarchy. In case there’s any doubt, I find the growth of this rigid caste system horrifying, and antithetical to both liberal democracy and the American project. It also seems that, at least in San Francisco, we’re close to a point of no return.
It took centuries for the public sphere to develop—and the technology companies have eviscerated it in a flash. By radically remaking the advertising business and commandeering news distribution, Google and Facebook have damaged the economics of journalism. Amazon has thrashed the bookselling business in the U.S. They have shredded old ideas about intellectual property—which had provided the economic and philosophical basis for authorship. The old, enfeebled institutions of the public sphere have grown dependent on the big technology companies for financial survival. And with this dependence, the values of big tech have become the values of the public sphere.
When personal security and community confidence combine they can deliver social emancipation, as Ireland’s recent referendums and law reforms on divorce, same-sex marriage and abortion (on top of a strikingly diverse political leadership of late) amply illustrate. It is a veritable loosing of the conservative shackles of church, class and culture that has catapulted Ireland to the forefront of Millennial-style liberalism.
“Almost every major religious or philosophical tradition heavily emphasises the value of self-restraint as a pathway to a virtuous and satisfying life. It makes sense that impulse-control has been held in such high regard historically: the ability to curb destructive urges and prudently delay gratification makes life easier in the long run. So what does modern brain science tell us about the best ways to approach willpower?”
Prior to the sixteenth century, no one was a genius. Rather, one had genius. The original sense of the word genius was of a “tutelary spirit attendant on a person.” Muses and spirits, almost always in the form of women, influenced the lucky men who channeled them. Great works were a joint effort, a communication with the divine at the service of the community. But as the Enlightenment descended and humanism began to eclipse Christianity, the mind of man slowly became the center of the world.
The authors call the social withdrawal they captured in the data a “natural human response” triggered by a change in environment, but they acknowledge their findings contradict an established theory about collective intelligence. When forced to share space, humans behave much like swarms of insects. This has appeared to be true in a range of contexts, the authors note, citing studies involving the US Congress, college dormitories, co-working spaces, and corporate buildings.
The problem doesn’t necessarily lie in the experiments (the gorilla suit in the middle of the basketball game, etc.), but in the conceptual ideas behind the experiments. “The assumption of human blindness or bias makes scientists themselves blind to the other, more positive aspects of human cognition and nature.”
About two dozen labs in the U.S. are studying “experimental aesthetics” – why we like what we like, and why we make art at all. “The mysteries of the aesthetic response, and the creative impulse, have become a burgeoning area of inquiry for scientific researchers across many disciplines. They hope quantifiable data and statistical analysis can help explain matters that some consider ineffable — like why we paint or sing, or why we naturally favor Van Gogh’s sunflowers over the landscapes we encounter in budget hotel rooms.”
We have been engineering our environments to more productively serve human needs for tens of millennia. We cleared forests for grasslands and agriculture. We selected and bred plants and animals that were more nutritious, fertile and abundant. It took six times as much farmland to feed a single person 9,000 years ago, at the dawn of the Neolithic revolution, as it does today, even as almost all of us eat much richer diets. What the palaeoarchaeological record strongly suggests is that carrying capacity is not fixed. It is many orders of magnitude greater than it was when we began our journey on this planet.
While Plato and Aristotle were concerned with character-centred virtue ethics, the Aztec approach is perhaps better described as socially-centred virtue ethics. If the Aztecs were right, then ‘Western’ philosophers have been too focused on individuals, too reliant on assessments of character, and too optimistic about the individual’s ability to correct her own vices. Instead, according to the Aztecs, we should look around to our family and friends, as well as our ordinary rituals or routines, if we hope to lead a better, more worthwhile existence.
In many ways, the really improbable event of recent decades was the manner in which so much of the world experienced stability and predictability. What was the probability that we could, collectively, have created such an unprecedented quantity of wealth, health, and prosperity?
“From the tables of European royalty to a bag of 10 Hoodsies for $2.98 at Market Basket, the story of ice cream echoes that of the American experiment — democratization, fueled by technology, ingenuity, and mass marketing.” Ice cream figured in the assimilation of immigrants, and it was even tied up with Prohibition.
“I had been here for just a couple of months, and I was getting used to [Chef Bottura’s] style,” Canadian-born chef de partie Jessica Rosval told me when I visited the restaurant. “He burst into the kitchen one day and said, ‘Okay, everybody, new project for today: Lou Reed, Take a Walk on the Wild Side. Everybody make a dish.’ And I was just like, ‘Oh my gosh, where do I even start?’” But Rosval’s initial panic soon turned to excitement. “We created a wide variety of dishes,” she said. “Some people focused on the bass line of the song. Some people focused on the lyrics. Some people focused on the era in which the song was written. We had this diverse array of different plates that were created from this one moment of inspiration when Massimo had been listening to the song in his car.”
Last year, Hanson Robotics released its first consumer robot, Professor Einstein, a $199, 16-inch animatronic companion for kids that can answer questions, play brain games and discuss science and math. This year the company, which has about 50 employees, plans to release updates for Professor Einstein and to produce about 100 copies of Sophia and other human-sized robots. The androids function as programmable machines that can be used to train doctors, deliver therapies for depression, care for the elderly and interact with customers. Most importantly, Hanson is excited about all the functions people have yet to dream up. Imagine your iPhone without the apps.
From the beginning, in fact, Berners-Lee understood how the epic power of the Web would radically transform governments, businesses, societies. He also envisioned that his invention could, in the wrong hands, become a destroyer of worlds, as Robert Oppenheimer once famously observed of his own creation.
Consider the holes in doughnuts. No, not the “doughnut holes” made out of the dough, because they’re clearly not holes. “If we do not take the removed dough to be the hole, then what do we take the hole to be? Are holes material things, where material things are physical (like tables and chairs), or are holes immaterial things, where immaterial things are not physical (like abstract entities)? Or are holes not even things at all?”
As the philosopher Noam Chomsky has said, “we will always learn more about human life and personality from novels than from scientific psychology” – something the critic and author David Lodge has explored. In his 2002 book Consciousness and the Novel, Lodge argues that “literature is a record of human consciousness, the richest and most comprehensive we have… The novel is arguably man’s most successful effort to describe the experience of individual human beings moving through space and time.”
The methods used to search for the subatomic components of the universe have nothing at all in common with the field geology methods in which I was trained in graduate school. Nor is something as apparently obvious as a commitment to empiricism a part of every scientific field. Many areas of theory development, in disciplines as disparate as physics and economics, have little contact with actual facts, while other fields now considered outside of science, such as history and textual analysis, are inherently empirical. Philosophers have pretty much given up on resolving what they call the “demarcation problem,” the search for definitive criteria to separate science from nonscience; maybe the best that can be hoped for is what John Dupré, invoking Wittgenstein, has called a “family resemblance” among fields we consider scientific. But scientists themselves haven’t given up on assuming that there is a single thing called “science” that the rest of the world should recognize as such.