Hard Light Productions Forums

Off-Topic Discussion => General Discussion => Topic started by: an0n on January 12, 2005, 02:57:18 pm

Title: Interesting Thought.....
Post by: an0n on January 12, 2005, 02:57:18 pm
Do you think, in the years to come - as artificial intelligence becomes more advanced and machines get smarter and smarter - that the rallying call for the inevitable rebellion in the more human-centric areas of the world (*cough*Redneck / Bible Belt*cough*) will be based around the fact that machines were the first to land on another world?

I mean, there will come a point when the machines begin to mark their own developments and achievements. I wonder how important Spirit and Opportunity will be, in their cold, red, unwavering eyes......
Title: Interesting Thought.....
Post by: Flipside on January 12, 2005, 03:01:39 pm
Yeah, but then they'd just go in with their laser-beam optics and pointless flying squid creatures and scorch the area ;)
Title: Interesting Thought.....
Post by: Rictor on January 12, 2005, 03:03:56 pm
One thing's for sure: you won't live long enough to find out. Think on that while lasers are raining down on wherever it is that you live (presumably not in Keira Knightley).

Anyone remember Captain Power and the Soldiers of the Future?
Title: Interesting Thought.....
Post by: Janos on January 12, 2005, 03:29:26 pm
When the situation is dire, call for... Asimov!
Title: Re: Interesting Thought.....
Post by: Liberator on January 12, 2005, 03:38:49 pm
Quote
Originally posted by an0n
(*cough*Redneck / Bible Belt*cough*)


I resent the implication you limey bugger.

There is nothing inherent to the human form that is central to sentience.  If machines do ever become sentient, I assure you that it will not be because of any design element of man's.
Title: Re: Re: Interesting Thought.....
Post by: aldo_14 on January 12, 2005, 03:40:07 pm
Quote
Originally posted by Liberator


I resent the implication you limey bugger.

There is nothing inherent to the human form that is central to sentience.  If machines do ever become sentient, I assure you that it will not be because of any design element of man's.


More likely that than anything else.
Title: Interesting Thought.....
Post by: Flipside on January 12, 2005, 03:42:00 pm
'You Limey Bugger'

LOL!!

Well done! :D
Title: Interesting Thought.....
Post by: Taristin on January 12, 2005, 03:43:03 pm
:lol:
Title: Interesting Thought.....
Post by: Genryu on January 12, 2005, 03:57:51 pm
Quote
Originally posted by Rictor
Anyone remember Captain Power and the Soldiers of the Future?


Isn't that the show with the CG enemy with so few polys that you can count them on your fingers and a few toes? :D
Title: Interesting Thought.....
Post by: Rictor on January 12, 2005, 04:18:11 pm
Sure is.

Quote
I assure you that it will not be because of any design element of man's

How else could it ever happen? Most likely not a "design element" but a "lack of design element", but it's still man-made.
Title: Interesting Thought.....
Post by: Night Hammer on January 12, 2005, 05:00:52 pm
A monkey is gonna take a dump in a robot's head, and by doing so create a synapse in the "robot brain" that a human scientist never could?

i dunno.....:nervous:
Title: Interesting Thought.....
Post by: Flipside on January 12, 2005, 05:11:14 pm
Funny thing is, Asimov didn't write the first 'I, Robot'; it was originally a short story about a robot called Adam Link, by Eando Binder. This is confirmed, btw, in the introduction of 'The Complete Robot' by Mr Asimov himself ;) He used the title again, though his own book is by no means a copy of the storyline, as far as I know :)
Title: Interesting Thought.....
Post by: WMCoolmon on January 12, 2005, 06:20:45 pm
Quote
Originally posted by Flipside
Funny thing is, Asimov didn't write the first 'I, Robot'; it was originally a short story about a robot called Adam Link, by Eando Binder. This is confirmed, btw, in the introduction of 'The Complete Robot' by Mr Asimov himself ;) He used the title again, though his own book is by no means a copy of the storyline, as far as I know :)


Wait. I thought 'I, Robot' was a book of Short Stories by Asimov? With Robbie, Dr. Calvin, etc etc?

Or is there another I,Robot? :confused:

Edit: Oh, and to respond to the thread/Lib...

I think it is possible for robots to turn on humans, even without being 'sentient'. Say you build an integrated robot. You give it moods: some things make it angry, some things make it happy. The various moods exist only as variables, but they affect other variables. Perhaps if the robot is angry it will do work slower, or not at all. And it will work to get rid of the objects which annoy it most often (i.e. get in the way).
Of course, you'd have to give it some sort of list of things it could do to get rid of things, and probably a good physics engine so it could move around and move objects efficiently. If you add 'break object' to its get-rid-of list, and then get the robot angry enough, perhaps it will break you? It wouldn't be for any more advanced reason than a preprogrammed reaction.

Not only that, but the mood states would give a fairly good simulation of intelligence with a good enough parsing system and the ability to simulate emotions and hold grudges.
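The mood-variable scheme described above can be sketched in a few lines of code. This is only an illustration of the idea in the post, not a real robot design; every name here (`MoodBot`, `annoyed_by`, `get_rid_of`, the anger threshold of 5) is hypothetical.

```python
# Minimal sketch of the "moods as variables" robot described above.
# All class and method names, and the anger threshold, are made up
# for illustration -- nothing here comes from a real system.

class MoodBot:
    def __init__(self):
        self.anger = 0     # a mood that exists only as a variable
        self.grudges = {}  # per-object annoyance counts ("holding grudges")

    def annoyed_by(self, obj):
        """Something gets in the way: raise anger, remember the object."""
        self.anger += 1
        self.grudges[obj] = self.grudges.get(obj, 0) + 1

    def work_speed(self):
        """Anger slows work down -- not a feeling, just arithmetic."""
        return max(0.0, 1.0 - 0.2 * self.anger)

    def get_rid_of(self, obj):
        """Pick from a preprogrammed list of removal actions."""
        if self.anger >= 5:
            return f"break {obj}"  # 'break object' is on the list
        return f"move {obj} aside"

bot = MoodBot()
for _ in range(5):
    bot.annoyed_by("human")
print(bot.get_rid_of("human"))  # -> break human: a preprogrammed reaction
```

The point of the sketch matches the post: nothing in it is "sentient", yet once anger crosses a threshold the preprogrammed action list makes the robot turn on whatever annoyed it.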
Title: Interesting Thought.....
Post by: Flipside on January 12, 2005, 06:29:06 pm
There was a previous one by the same title; Isaac Asimov names it as one of the books that inspired him to start writing robot stories that weren't 'clank clank aaaargh!', as he put it :)

I think the copyright reached the old 50-year limit, back when that meant anything, and Isaac decided to re-use the title.
Title: Interesting Thought.....
Post by: WMCoolmon on January 12, 2005, 06:31:47 pm
Aaaahhhh.

Was the I, Robot movie based on that first book then?
Title: Interesting Thought.....
Post by: Flipside on January 12, 2005, 06:33:50 pm
I'm not certain to be honest with you, I've never been able to get my hands on the book :( And I still haven't gotten round to watching the movie either.

Doesn't it have Asimov's name all over the posters, though? I can't remember. Seems a bit odd to make a movie of Asimov's I, Robot; it'd play like 'The Twilight Zone'...
Title: Interesting Thought.....
Post by: WMCoolmon on January 12, 2005, 06:42:04 pm
I saw it - and enjoyed it - but it clearly wasn't Asimov. It had more action, and human vs. robot action, than you'll find in Asimov's books.

There were references (US Robotics, the three laws, Dr. Calvin) but they seemed more references to Asimov than taken from any specific story.
Title: Interesting Thought.....
Post by: Flipside on January 12, 2005, 06:51:34 pm
Ah... I see.... one of those movies....

:Koff: :Koff: War of the Worlds :Koff:

Oh well, I expect I'll watch it at some point, and probably enjoy it regardless :)
Title: Interesting Thought.....
Post by: Liberator on January 12, 2005, 07:40:10 pm
Quote
Originally posted by Rictor
Sure is.


How else could it ever happen? Most likely not "design element" but "lack of design element", but its still man made.


My point was this:

Creation of sentience is above Man's ability.

beyond genetic technologies, beyond cloning, that is God's domain.

We cannot create sentience, only, perhaps, provide a vessel for it.  I can see a time when Man will be able to transfer his consciousness into a machine (although anyone that does this would no longer be technically human, and the definition of "alive" would have to be rewritten, and then there's the whole "core/arm" thing).  But, no matter how advanced the processing array or complex the programming routine, it is simply beyond Man's capability to create a sentient machine.
Title: Interesting Thought.....
Post by: dan87uk on January 12, 2005, 07:51:09 pm
I agree with Liberator. Of course I still like to speculate and I have an open mind, but until I'm proved wrong I agree with Lib.
Title: Interesting Thought.....
Post by: WMCoolmon on January 12, 2005, 07:51:09 pm
How do you define sentience, Lib?

You mention genetic technologies and cloning, which were both things that people believed were 'God's domain' in the past, but which have been done successfully in at least one or two instances. How is sentience any different?
Title: Interesting Thought.....
Post by: Liberator on January 12, 2005, 08:08:19 pm
I define sentience as the indefinable element that puts Humans above other forms of life(even the Great Apes).  Some animals are intelligent, some incredibly so.  Some are capable of amazing displays of emotion.  Some are even capable of recognizing themselves in a photo or mirror.  But with all that, there is some element of Humanity that eludes scientific quantification and/or qualification.  That is sentience.
Title: Interesting Thought.....
Post by: pyro-manic on January 12, 2005, 08:29:54 pm
Hmmm. So you can't define it then? :p

I think that true AI will be created (give it perhaps a century or so), but as long as the Laws are implemented, then there's nothing to worry about. Ideally, the machines would turn out as something like the Minds from Iain Banks' Culture novels.

Without the Laws, though, we're going to get a very nasty surprise.
Title: Interesting Thought.....
Post by: Flipside on January 12, 2005, 08:43:04 pm
Sentience is, to a certain degree, knowledge of 'Me'. Though most creatures have this, humans are among a small group that seem to have a long-term grasp of the phrase 'Not me anymore' or 'Dead Me'.

Humans seem to take it a step further, though: we dream up the concept of 'revenge', of which there have been few proven cases in other species. Yet our history is absolutely covered in it ;) The blessing of technology always carries its own curse; in this case it was the written word, and the holding of grudges. Maybe that is where we will go wrong? Maybe we will transfer memory from upgrade to upgrade, keeping the old AI 'Personality' but allowing it to learn to the point where it becomes too aware of the information it has access to? For it to start to develop a concept of what is 'right' and 'wrong', and how that has changed depending on the needs of the times? How much logic would it take to compare its own situation to the history of slaves? But the real big question is: how long would it take for a machine to care?
Title: Interesting Thought.....
Post by: vyper on January 13, 2005, 01:55:19 am
[q]But, no matter how advanced the processing array or complex the programming routine, it is simply beyond Man's capability to create a sentient machine.[/q]

There is no limit to man's ability at any one moment save time itself; he will always learn to overcome.
Title: Interesting Thought.....
Post by: Liberator on January 13, 2005, 02:30:21 am
He can make a machine seem sentient, emulate different behaviors.  But a machine cannot be built to comprehend the concept of Me and recognize that it exists in more than the Now.  Part of sentience is the understanding that there is more than Now.  Animals can seem like they understand, but their memory only extends to Man == Good, Man == Bad and other assorted object-specific behaviors.  They will remember you feeding them or being nice to them, but that is associated with you in the Now, not that you were nice yesterday or last week.
Title: Interesting Thought.....
Post by: Flipside on January 13, 2005, 04:31:40 am
Well, animals can recognise patterns: if you put food in the same place at the same time every week, you can be pretty sure that after a few weeks the animals will be there waiting for you when you get there.

Man is different as in he would probably be waiting to mug you of all the food, so that he can be the guy with the food instead.

I think that simply because we don't understand how the concept of 'Me' and 'Not Me' works does not mean we will never find out in the future. Our greatest asset is our media, and Artificial Intelligence is born of that same media.
Title: Interesting Thought.....
Post by: aldo_14 on January 13, 2005, 07:26:55 am
Quote
Originally posted by Liberator
I define sentience as the indefinable element that puts Humans above other forms of life(even the Great Apes).  Some animals are intelligent, some incredibly so.  Some are capable of amazing displays of emotion.  Some are even capable of recognizing themselves in a photo or mirror.  But with all that, there is some element of Humanity that eludes scientific quantification and/or qualification.  That is sentience.


If you can't define sentience, then you can't define what is not sentient.  Likewise for intelligence; this is the inherent problem faced by both philosophy and artificial intelligence (amongst others).

Regardless of whether sentience itself can be defined as the work of a supreme deity or deities, or whether it is an effect of some biological / physiological / environmental process, so long as we cannot define it, we cannot claim to be able to say whether or not we can create it.

Also, it's very easy to 'miss' intelligence by characterising it in terms of human behaviour and perceptions; how can we really judge the intelligence and sentience of sea-life, for example, when we cannot even experience that environment as they do?

Anyways, everyone knows humans are only the 3rd most intelligent species on Earth.
Title: Interesting Thought.....
Post by: Bobboau on January 13, 2005, 11:28:24 am
Well, if you're going to define sentience as the thing mankind will never be able to create, or that which cannot be defined, then yeah, I guess we'll never make it.

Though with this sort of thing I don't think we'll be directly making it either. I do think, however, that we will make the thing that makes the thing: a program capable of learning and modifying itself doesn't seem outside the grasp of humanity, and that + time is all it needs.
Title: Interesting Thought.....
Post by: karajorma on January 13, 2005, 02:42:11 pm
Quote
Originally posted by Liberator
I define sentience as the indefinable element that puts Humans above other forms of life(even the Great Apes).


That's just a cop-out. No matter what scientists do, no matter how smart a computer gets, you'd simply say that it lacks that thing and claim that it's therefore not sentient.

Doesn't matter if the machine can compose three different arias at the same time depending on its mood while simultaneously taking and passing the Turing test; you'd still claim it lacked a soul and therefore wasn't sentient.

Then again, you'd probably say the same if someone thawed out a Neanderthal and taught it to speak English.