In July of 2019, Microsoft invested a billion dollars, which allowed OpenAI to create a supercomputer with two hundred and eighty-five thousand C.P.U. cores, ten thousand G.P.U.s, and four hundred gigabits per second of network connectivity per server. Microsoft claims that it ranks in the top five supercomputers in the world, processing more than twenty-three thousand teraflops. The power of the supercomputer has been transformative. GPT-2, which John Seabrook took for a test drive in 2019, asking it to write an article for The New Yorker, had 1.5 billion parameters. GPT-3 has a hundred and seventy-five billion. – The New Yorker
