Warped hallucinations are indeed afoot in the world of AI – but it's not the bots that are having them; it's the tech CEOs who unleashed them, along with a phalanx of their fans, who are in the grips of wild hallucinations, both individually and collectively. Here I am defining hallucination not in the mystical or psychedelic sense – mind-altered states that can indeed assist in accessing profound, previously unperceived truths. These folks are just tripping: seeing, or at least claiming to see, evidence that is not there at all, even conjuring entire worlds that will put their products to use for our universal elevation and education.

Generative AI will end poverty, they tell us. It will make our governments rational and responsive. It will unleash lives of leisure and contemplation, helping us reclaim the humanity we have lost to late capitalist mechanization. It will make our jobs more meaningful and exciting.

But why call the errors "hallucinations" at all? Why not algorithmic junk? Or glitches? Well, hallucination refers to the mysterious capacity of the human brain to perceive phenomena that are not present, at least not in conventional, materialist terms. By appropriating a word commonly used in psychology, psychedelics and various forms of mysticism, AI's boosters, while acknowledging the fallibility of their machines, are simultaneously feeding the sector's most cherished mythology: that by building these large language models, and training them on everything that we humans have written, said and represented visually, they are in the process of birthing an animate intelligence on the cusp of sparking an evolutionary leap for our species. How else could bots like Bing and Bard be tripping out there in the ether?