Some AI experts have begun to confront the 'Singularity.' What they see scares them.
In 1993, computer scientist and sci-fi author Vernor Vinge predicted that within three decades, we would have the technology to create a form of intelligence that surpasses our own. “Shortly after, the human era will be ended,” Vinge said.
As it happens, 30 years later, the idea of an artificially created entity that can surpass—or at least match—human capabilities is no longer the domain of speculators and authors. Ranks of AI researchers and tech investors are seeking what they call artificial general intelligence (AGI): an entity capable of human-level performance at all kinds of intellectual tasks. If humans produce a successful AGI, some researchers now believe, “the end of the human era” will no longer be a vague, distant possibility.
Futurists often credit Vinge with popularizing what many commentators have called “the Singularity.” He believed that technological progress could eventually spawn an entity with capabilities surpassing the human brain. Its introduction to society would warp the world beyond recognition—a “change comparable to the rise of human life on Earth,” in Vinge’s own words.
Perhaps it’s easiest to imagine the Singularity as a powerful AI, but Vinge envisioned it in other ways. Biotech or electronic enhancements might tweak the human brain to be faster and smarter, combining, say, the human mind’s intuition and creativity with a computer’s processing power and information access to perform superhuman feats. Or, as a more mundane example, consider how the average smartphone user has powers that would awe a time traveler from 1993.
Even AI scientists fear the idea of superhuman intelligence
We don’t fully understand why many AI systems behave in the ways they do—a problem that may never disappear.