For more than half a century, scientists have pondered the possibility that computers or machines will become more intelligent than humans. Should this occur, will machines take control of human beings and of the planet as a whole? As with many perplexing scientific questions, this one is fraught with controversy [1, 2].
Progress in the development of machines and computers has been phenomenal during the 20th and 21st centuries. Workers have lost their jobs because machines perform certain tasks more quickly and efficiently than humans do. In 1965, I.J. Good wrote that there would eventually be an ultra-intelligent machine. In 1983, the science-fiction writer Vernor Vinge wrote about the singularity at which these remarkable events could occur [1].
Since humans create these machines, one of them may eventually surpass human intelligence. Such a device could then be the last invention humans need ever make, because it could design its successors better than any man or woman could.
If computers become more intelligent than humans, an explosion of ever-higher levels of intelligence will naturally follow [1]. This explosion is the singularity that researchers describe: highly intelligent machines design even more intelligent devices. Computers will run faster than ever before, and information will circulate across the globe at an enormous rate [1].
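The feedback loop described above can be sketched as a toy numerical model. This is purely illustrative: the function name, the starting capability, and the gain factor are assumptions chosen for demonstration, not quantities taken from the literature. The key idea is that each generation's improvement step grows with the designer's own capability, so growth accelerates rather than staying linear.

```python
# Toy model of an "intelligence explosion": each machine generation
# designs a successor, and the size of the improvement is proportional
# to the designer's own capability. All numbers are illustrative
# assumptions, not claims from the sources cited in the text.

def explosion(initial=1.0, gain=0.1, generations=10):
    """Return a list of capability levels, one per generation.

    Each step adds gain * capability^2, a super-linear feedback:
    smarter designers make proportionally bigger improvements.
    """
    levels = [initial]
    for _ in range(generations):
        current = levels[-1]
        levels.append(current + gain * current * current)
    return levels

levels = explosion()
# Later gaps between generations dwarf the earlier ones,
# which is the qualitative point of the singularity argument.
print(levels)
```

Under this sketch the differences between successive generations keep widening, which is the sense in which an "explosion" differs from the steady improvement computers have shown so far.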
The increase in computer speed and information processing has already taken place. However, it has not reached the point where these machines take control of humans.