New platform allows AI to learn ‘how to do almost anything’ on a computer – what does this mean for humanity?


Artificial intelligence is one of the biggest threats to our future imaginable. If it is not handled properly, humanity is very likely to be destroyed by artificial intelligence as we continue to let it grow in power. Once it reaches a certain level, it will be impossible to stop, and that is a frightening prospect.

Sadly, that hasn’t prevented people from trying to play God and create new forms of powerful artificial intelligence. In fact, a new platform called Universe, created by the nonprofit OpenAI, allows artificial intelligence programs to learn how to use a computer.

Will Knight of MIT Technology Review reports, “Universe will provide a way for AI researchers to develop and test algorithms capable of learning to perform a broad range of tasks—a step towards more general types of artificial intelligence. The hope is that it will lead to artificial agents that can learn a wide range of different tasks, and then take what they’ve learned in one setting and apply it to a different one.”
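In practical terms, Universe exposes these tasks through the same interface as OpenAI’s Gym toolkit: the program “sees” raw screen pixels and “acts” by sending keyboard and mouse events, much as a human user would. The Python sketch below is a rough illustration of that loop, modeled on the kind of starter example OpenAI published with the Universe release; the environment name and the fixed key press stand in for a real learning agent, and running it requires the universe package and a Docker-backed remote environment.

    import gym
    import universe  # importing universe registers its environments with Gym

    # A browser-based Flash driving game, served over VNC by a local container.
    env = gym.make('flashgames.DuskDrive-v0')
    env.configure(remotes=1)  # start one remote environment

    observation_n = env.reset()

    while True:
        # This "agent" simply holds the up-arrow key; a real system would
        # choose actions based on the pixel observations and rewards it receives.
        action_n = [[('KeyEvent', 'ArrowUp', True)] for _ in observation_n]
        observation_n, reward_n, done_n, info = env.step(action_n)
        env.render()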

Of course, this is bound to have a number of awful side effects. By giving artificial intelligence too much power, we all but ensure that we will suffer the consequences. Since AI cannot understand emotion or empathize with humanity, there is nothing to prevent it from wanting to destroy us all; we are merely a roadblock between it and its goals. If that isn’t alarming to you, then you must not understand the implications.

While OpenAI certainly has the right to do what it wants with its company, it should consider the dangerous possibilities of its technological advancements. The federal government should not step in, at least not until the technology is proven to be a legitimate threat, but that doesn’t mean that everyone experimenting with artificial intelligence should ignore the reality of the situation.

 

Sources:

TechnologyReview.com

Wired.com