Originally posted by CP5670
What would define a sentient AI anyway?
Self-awareness. I'm not talking about "emotions" or anything like that - those concepts would probably remain pretty abstract to it, even if it could understand them. I'm talking about an artificial intelligence that knows it exists and knows what it is - even while it remains dedicated to performing whatever task the programmer designed it for.
And again, why the "no" answers? You just said no without giving any particular reason. Is it fear of "evolutionary replacement"? Fear of the thing taking over the world? Or something else?
I myself am for them - in fact, if I had the time I'd start working on a project to build one. I already have the basic designs fleshed out in my mind (it would have to be one huge-ass collaborative project, though). They would be faster, "smarter", and more reliable than humans, but they would still only do whatever the programmer (myself, in this case) coded them to do. Put a good one to work on serious multidimensional physics research with all the reference material it might want, for example, and wait for the results.
