The idea is that it is its own entity: a sentient mind as powerful as the greatest supercomputer, without emotion, given explicit directives to guide humanity's ethical and moral center. Without emotion, it will experience none of the temptations that humans in power often face. It will simply act toward its objective, adapting to life as time goes by, like we do.
-
Guiding eventually means control, and without emotions it would logically assume we'd be happy as long as we're alive, even when our emotions tell us it's better to die. It needs emotion. The only conceivable god we could create is one where we all link our minds together.
-
You're making a lot of assumptions there, bud. Let me pose a couple of questions for you:
-How does guiding the moral and ethical center of civilization end in dictatorship? The computerized intelligence would certainly have some influence over what people perceive as ethical, but I don't understand how you keep translating that into "it's going to enslave us all and we'll all want to kill ourselves."
-You're assuming it needs emotions in order to understand what makes people happy. It does not, and its goal is not to make everyone happy. It's to guide us to strive toward the moral and ethical ideals given to it. What it personally thinks would make us happy is irrelevant.
-
Well, I believe we differ on what this god would actually do to guide us. That is where we are not agreeing.