CONTAINMENT: Because of the containment problem I just mentioned, I don't believe that humanity will ever develop artificial intelligence that rivals our own. I don't think we'll ever understand the true nature of our minds, because our comprehension is itself contained within that system. That's why research into AI relies heavily on trying to generate emergent effects -- complex results from simple inputs. Getting complex results from complex inputs would be easy in principle, but so far no one has managed to write a system of rules that fully describes human behavior. Constructing simple inputs is easy, but it's almost always impossible to get more complexity out of a system than you put into it in the first place.
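
To see what "complex results from simple inputs" looks like in practice, here's a minimal sketch (my own illustration, not anything from the research I'm describing) of an elementary cellular automaton, Rule 110. The entire rule set is an eight-entry lookup table over three neighboring cells, the input is a single live cell, and the output is intricate enough that the rule has been proven Turing-complete:

    # Elementary cellular automaton, Rule 110 -- a classic example of
    # emergence. Illustrative sketch only, not from the post itself.

    RULE = 110  # the update rule, encoded as an 8-bit lookup table

    def step(cells):
        """Apply the rule once to a row of 0/1 cells (wrapping at the edges)."""
        n = len(cells)
        return [
            # Each cell's next state is looked up from the bit of RULE
            # indexed by its (left, center, right) neighborhood.
            (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    # Start from the simplest possible input: a single live cell.
    row = [0] * 79
    row[-1] = 1

    for _ in range(30):
        print("".join(".#"[c] for c in row))
        row = step(row)

Run it and that single seed cell unfolds into interlocking triangles and gliders -- far more structure than the one-line rule seems to contain. That's the kind of emergence AI researchers are chasing, just on a vastly larger scale.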


