The fandom for this podcast, especially among women, is reaching pop-star levels. “Calling themselves Murderinos, they came to hear expletive-laden tales of serial killings and brutal homicides told by Georgia Hardstark and Karen Kilgariff, the irreverent hosts of the wildly popular true-crime comedy podcast ‘My Favorite Murder.'” (One of the hosts’ offhand comments, “Toxic masculinity ruins the party again,” has become something of a rallying cry.)
There is a world that exists—an uncountable number of differently flavored quarks bouncing up against each other. There is a world that we perceive—a hallucination generated by about a pound and a half of electrified meat encased by our skulls. Connecting the two, or conveying accurately our own personal hallucination to someone else, is the central problem of being human. Everyone’s brain makes a little world out of sensory input, and everyone’s world is just a little bit different.
Like the New York intellectuals who had clustered around Commentary and the Partisan Review in the Sixties, and partly in conscious imitation of them, the writers and editors of the new magazines blended art, criticism, philosophy and self-examination in the confidence that these activities would all be, when carried out with a sufficient level of clarity and insight, mutually reinforcing.
A Dartmouth-led study finds that the brain may tune towards social learning even when it is at rest. The findings, published in an advance article of Cerebral Cortex, demonstrate empirically for the first time how two regions of the brain experience increased connectivity during rest after encoding new social information.
Learning to see is not an innate gift; it is an iterative process, always in flux and constituted by the culture in which we find ourselves and the tools we have to hand. Harriot’s 6-power telescope certainly didn’t provide him with the level of detail of Galileo’s 20-power. Yet the historian Samuel Y Edgerton has argued that Harriot’s initial (and literal) lack of vision had more to do with his ignorance of chiaroscuro – a technique from the visual arts first brought to full development by Italian artists in the late 15th century.
For many, the idea of the ‘creative person’ comes from popular media, which inundates us with news stories and movie portrayals of the suffering artist and the mad genius. And there are anecdotal accounts closer to our real lives: many of us have heard stories about someone who suffers from a deep depression – but also creates beautiful poetry. Repeatedly hearing these accounts fuels a stereotype. When we frequently see two unique things (eg, extraordinary creativity and mood disorders) occur together, they become paired in our minds, creating what is termed an illusory correlation.
If one looks at history, the answer seems obvious: What fences have very often indicated is not simply what is mine and what is yours, but, more subtly, who I am versus who you are. This tendency is based on the human inclination to define one’s identity in contrast to someone cast as a different, untrustworthy Other best kept at a distance.
That’s what a pair of researchers found when studying firefighters in Colorado and undergraduates in London. “When you experience stressful events, whether personal (waiting for a medical diagnosis) or public (political turmoil), a physiological change is triggered that can cause you to take in any sort of warning and become fixated on what might go wrong.”
“Inundated via social media with the opinions of multitudes, users are diverted from introspection; in truth many technophiles use the internet to avoid the solitude they dread. All of these pressures weaken the fortitude required to develop and sustain convictions that can be implemented only by traveling a lonely road, which is the essence of creativity.”
“My work consists entirely of creating the conditions for genius to flow. I am not in possession of it — it resides in that flow of output, which everyone participates in. “Genius” is the oxygen that those in a shared space breathe in and are transformed by; it allows them to reach their full potential. In this way, “genius” returns to its original Latin meaning of an ‘attendant spirit.’”
The act of gesturing seems to be universal (every known human group does it), and we know that there are certain gestures that are culture-specific. (There are places where you definitely shouldn’t make the thumb-and-forefinger “okay” sign.) “What people produce much more often are gestures for ‘yes’ and ‘no’; points to people, places and things; gestures that sketch objects, actions and represent abstract ideas through visual metaphors. These are the real workhorses of gestural expression. And, as it turns out, a case can be made that these workhorses are broadly similar the world over.”
China’s booming start-up scene has become as much a feature of its top-tier cities as traffic and smog. It used to be that college graduates applied for jobs at banks or state-owned enterprises, the proverbial “iron rice bowl” that their parents sought for them after the chaos of the Cultural Revolution. But many of those jobs were unsatisfying: In a 2012 Gallup survey, 94 percent of Chinese respondents said they were unengaged with their jobs. Now, with public and private funding flowing into Chinese start-ups, entrepreneurship has become an appealing alternative for a generation disillusioned with the conveyor-belt career paths of their forebears.
‘Who are you to tell me what to believe?’ replies the zealot. It is a misguided challenge: it implies that certifying one’s beliefs is a matter of someone’s authority. It ignores the role of reality. Believing has what philosophers call a ‘mind-to-world direction of fit’. Our beliefs are intended to reflect the real world – and it is on this point that beliefs can go haywire. There are irresponsible beliefs; more precisely, there are beliefs that are acquired and retained in an irresponsible way.
We should let go of the idea that our technologies are us, that we are somehow the sum total of the platforms we use… Just maybe, if more people can be convinced that this wealth of culture offers them a mirror to themselves, they might be willing to put down the phone for a few minutes and gaze inside.
Technology is no longer a novelty—it’s a given. And artists, who might have in the past approached technological advancement with a hint of idealistic curiosity, now question the impact it’s had on the way humans interact with one another. This tension is ripe territory for artists, who are often more interested in creating provocations around technology than they are in building practical applications.
It’s a step more useful for midterm politics than for reality, unfortunately: “All 49 members of the Democratic caucus are in favor of the resolution, along with Sen. Susan Collins (R-Maine). If it passes, the resolution still faces a tough vote in the House, as well as the signature of President Donald Trump.”
Turns out USians are most afraid, right now, that government officials are corrupt, and afraid for the planet, and afraid of losing health care. Whew, whatever happened to public speaking? Well, we’re complex: “We’re simultaneously too primitive and too evolved for our own good. Our lizard brains are ruthlessly efficient.”
Sweden’s war on cash has changed a lot about the country, including how robbers operate. “As Sweden’s supply of banknotes continues to dwindle, criminals have shown new enthusiasm for the endangered-species black market, previously cornered by reptile wranglers and orchid thieves. Crimes involving protected species recently reached their highest level in a decade. A single great gray owl — known as the ‘phantom of the north’ — now goes for 1 million kronor (about $120,000) on the dark web.”
This widespread rejection of scientific findings presents a perplexing puzzle to those of us who value an evidence-based approach to knowledge and policy. Yet many science deniers do cite empirical evidence. The problem is that they do so in invalid, misleading ways. Psychological research illuminates these ways.
While in the new millennium the quality of French intellectual life has plummeted, its reputation remains. Shlomo Sand bracingly compares media-friendly intellectuals such as Houellebecq, Éric Zemmour and Alain Finkielkraut to Nazi-collaborating writers such as Robert Brasillach and Pierre Drieu La Rochelle. Like such past figures, Sand argues, they cling to a France that is “totally imaginary” and yearn for it to be purified of the Other. In 1940 that meant Jews, in 2018 Islam.
A clearer sense of the greater science ecosystem is required to figure out what role science should play and how society can best make that happen. Who gets to do research in the 21st century, and why? How has it changed over time? Is science in good shape, and how can we know? When I started asking these questions I realized there’s a lot that even scientists still don’t know about themselves.
In the United States, theory has become a utopian experiment and experience: it exists alongside increasingly historicist literary studies as a site of mixture and reprieve; it promises, for example, to help literary scholars moonlight as media theorists and art historians, while reminding them to consider the horrors of colonialism and the errors of the Enlightenment. Meanwhile, it makes the rounds online, on social media, in popular music, in art world press releases, and in the New York Times, decontextualized and meme-like, sometimes the stuff of conspiracy and outrage and at others the balm of empathy.
Sometimes, the word ‘beauty’ aspires to the solidity of a proper noun, grand and true. Other times, it seems a more nebulous term for an elusive kind of experience. We can be careless about the beautiful, shrugging it off as a matter of mere appearance. It is not grave like the stuff of our political lives, or profound like our moral considerations. Certainly, we know to admire the beautiful in its different forms – a painting, a song, a building, sometimes even an act or a gesture – and we might go so far as to believe that our engagement with beautiful things constitutes a deep and meaningful experience, as though it were a momentary pause in the hectic thoroughfare of our lives. But we rarely permit matters of beauty the same seriousness that we customarily grant big ideas such as ‘democracy’ and ‘justice’.
A central tension in the field, one that muddies the timeline, is how “the Singularity”—the point when technology becomes so masterly it takes over for good—will arrive. Will it come on little cat feet, a “slow takeoff” predicated on incremental advances in A.N.I., taking the form of a data miner merged with a virtual-reality system and a natural-language translator, all uploaded into a Roomba? Or will it be the Godzilla stomp of a “hard takeoff,” in which some as yet unimagined algorithm is suddenly incarnated in a robot overlord?
Can we have knowledge of the past? Does science progress toward a more truthful apperception of the physical world? Or is it all a matter of opinion, a sociological phenomenon that reflects consensus, not truth? Unfettered emission of greenhouse gases promotes global warming. Species evolve through natural selection. Can we meaningfully assess the truth of these assertions?
“Lost in the public’s romance with the brain is the most fundamental lesson neuroscience has to teach us: that the organ of our minds is a purely physical entity, conceptually and causally embedded in the natural world. Although the brain is required for almost everything we do, it never works alone. Instead, its function is inextricably linked to the body and to the environment around it. The interdependence of these factors is masked, however, by a cultural phenomenon I call the ‘cerebral mystique’ – a pervasive idealisation of the brain and its singular importance, which protects traditional conceptions about differences between mind and body, the freedom of will and the nature of thought itself.”