You seem really confused about EY's position on AI. Read his paper here: http://yudkowsky.net/singularity/ai-risk
Basically, the problem he's trying to fix is that the universe doesn't have any built-in virus protection. You seem to be saying the answer to that problem is "don't make viruses". But your solution won't work, because there are idiots out there who will (accidentally or on purpose) make viruses the moment you hand them enough computing power to do so. The solution is to create good virus protection.