28181. uzmakk - 4/18/2006 7:31:50 PM
And, of course, his name is Atheist. Hup hup... excuse me... I think we're off and running.

28182. Adam Selene - 4/18/2006 7:37:23 PM
Thanks, Arky.
Thanks, uzmakk... uhh...
Well, if you're serious about your doubts, I'd realistically say I'm about 2% agnostic, 98% atheist. Is that better? I mean, I'm almost sure I'm an atheist. I certainly haven't met a religion yet that I didn't find absurd, but I do have a lingering sense that there are other kinds of consciousness - like, say, what a brain cell would feel as part of our minds. No identity, just a part of something bigger. And who knows - if time is really infinite (and there is no big crunch coming without a periodic bang... and there is no endless entropic expansion...) then we should all meet again in endless variations.

28183. uzmakk - 4/18/2006 8:46:40 PM
Oh meh God.

28184. Adam Selene - 4/19/2006 1:54:00 AM
You have a God? Who knew? ;)

28185. Adam Selene - 4/19/2006 1:25:11 PM
I have a question for this crowd.
[b]Is it possible to create an "artificial" intelligence that is not really artificial, but just as real as you or me?[/b]
In other words, would an artificial life ever be accepted as equal to another person? I guess that in a world where even infidels and minus-one-day-old kids are "not human," perhaps this is asking too much. But is it even conceivable?

28186. Adam Selene - 4/19/2006 1:25:36 PM
(Oh, how I wish for an edit button....)

28187. iiibbb - 4/19/2006 4:08:59 PM
I think that for artificial intelligence to be "real" it would have to cross some boundary - undefinable (by me) - where the algorithms used to create it no longer define it.
28188. Adam Selene - 4/19/2006 4:36:53 PM
iiibbb, so you think it's feasible? Even if "god" didn't "breathe life" into it, and it didn't have a "soul," you'd still grant it all the rights and respect that we give human life?
Here's the kicker... would you convict someone to the death penalty for killing it?

28189. iiibbb - 4/19/2006 6:19:50 PM
Given the way we have figured out technology, I certainly think it is feasible... I'm not sure how we will implement it... perhaps the boundary between organic and silicon will become very clouded. Perhaps you could argue that we might not be able to do it without God's hand... and then is it our creation or God's? (Are test-tube babies our creation or God's? Are clones our creation or God's?)
Before we can really say whether we will successfully cross the boundary, we must define the boundary or boundaries that define thought and consciousness.
Some would argue life must be self-aware... it must replicate... it must adapt/evolve without guidance from its creator. It must be somewhat unpredictable.
I think the vision that most closely matches mine of what such a technology would be like is that of Blade Runner... the androids appear to be as much organic as inorganic... obviously using DNA as their platform.

Interesting philosophical question about penalizing someone for killing it. Quite difficult for me to get my head around that one. It probably depends quite a bit on the circumstances, and on how much of a realization of true consciousness there is, I guess.
Blade Runner does a good job with this question too. I think human nature would respond to it across a spectrum: those who look at it as a product, those who look at it as real, and those who fear it.

28190. sakonige - 4/19/2006 6:57:07 PM
Is it possible to create an "artificial" intelligence that is not really artificial, but just as real as you or me?
Sure. You can see a range of intelligence and self-awareness in animals. At some point on that continuum, you begin to feel empathy for the organism. That's when it becomes as real to you as you or me. The same would hold true for artifacts. At some level of machine self-awareness, you would begin to empathize. You would know how the thing felt.

28191. sakonige - 4/19/2006 7:03:03 PM
I've got a cat resting his cheek on my wrist who is an incredibly complex, loving, and intelligent being. He's as "real" as you or me.
The little spider making its way down the window screen is real, too.
The moss at the edge of the flowerbed, it's alive, but just barely aware.
The computer on the desk is only animated.
28192. PelleNilsson - 4/19/2006 7:04:41 PM
We need to define artificial intelligence. The accepted definition we have is the Turing test: if you correspond with somebody/something and you cannot determine whether it is a human or a machine, then, if it is in fact a machine, it has intelligence.
If by "intelligence" you mean knowledge and the ability to apply it in a logical way, I think it can be done. But if you mean, in Adam's words, "just as real as you or me" I think not. 28193. iiibbb - 4/19/2006 7:21:07 PM I don't know if the Turing test is the accepted definition. There are many exceptions by which a machine could fool a judge. 28194. iiibbb - 4/19/2006 7:24:02 PM Koko 28195. iiibbb - 4/19/2006 7:27:41 PM All Ball 28196. PelleNilsson - 4/19/2006 8:41:20 PM From your Wiki link:
Turing predicted that machines would eventually be able to pass the test. In fact, he estimated that by the year 2000, machines with 10^9 bits (about 119 MiB) of memory would be able to fool 30% of human judges during a 5-minute test.
Do you really think that is good enough? Shouldn't they be capable of fooling 100% of the judges indefinitely?
But more importantly, do you think that a machine could be as real as "you or me"?
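[An editorial aside on the quoted figure: Turing's estimate was 10^9 bits; where "109 bits" appears in transcriptions, the superscript has been dropped. A quick check, assuming the binary convention MiB = 2^20 bytes, that 10^9 bits works out to roughly the 119 MiB quoted:]

```python
bits = 10**9            # Turing's year-2000 estimate: one billion bits of storage
mib = bits / 8 / 2**20  # bits -> bytes -> mebibytes (1 MiB = 2**20 bytes)
print(round(mib, 1))    # → 119.2
```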
28197. Adam Selene - 4/19/2006 9:11:16 PM
The Turing test isn't as interesting to me as it was before I played with Eliza and that ilk of computer program. You could make the case that Eliza already passed the Turing test - at least it fooled a lot of people for many minutes (and some for a lot longer).
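[Editorial aside: the Eliza programs mentioned here work by shallow pattern matching - no understanding, just reflecting the user's words back. A minimal sketch of the idea; the rules below are hypothetical illustrations, not Weizenbaum's actual 1966 script:]

```python
import re

# Eliza-style responder: each rule matches a pattern in the user's line
# and reflects part of it back as a question. These rules are invented
# for illustration, not taken from the original ELIZA script.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]

def respond(line: str) -> str:
    line = line.strip().rstrip(".!?")   # drop trailing punctuation
    for pattern, template in RULES:
        m = pattern.search(line)
        if m:
            return template.format(*m.groups())
    return "Please go on."              # stock reply when nothing matches

print(respond("I am worried about my job."))
# → How long have you been worried about my job?
```

A handful of rules like these fooled people for minutes at a time, which is why the Turing test's 5-minute format sets such a low bar.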
But I was more thinking of the philosophical issues... is there anything inherent about being man-made that precludes any "real" intelligence? I guess a religious person would phrase it: "could it have a soul?"

28198. Adam Selene - 4/19/2006 9:16:39 PM
I like sakonige's criterion - empathy. If you "know" what it's feeling, if you can predict its behavior due to its similarity with your own reactions, you would feel that it's really human (or at least, an animal).
There may be other "kinds" of intelligences than what we consider human, but without that empathy we'd probably never grant it any significant human rights. Rightly or wrongly - anything too different from a human mockup will probably never be accepted. Hell, we as a species have had a hard enough time accepting those with different skin colors and/or religious beliefs as human.

28199. iiibbb - 4/19/2006 10:11:12 PM
The Turing test breaks down in light of Koko... as well as a lot of mentally deficient humans.
I'm more likely to buy the empathy argument... or at least some sort of introspective capacity... the capacity for abstract thought is another good "test"... the capacity to link seemingly unrelated topics.
I think it might be very hard for us to judge something we made.

28200. alistairConnor - 4/19/2006 10:12:01 PM
would you convict someone to the death penalty for killing it?
That's easy...
I would never convict anyone to the death penalty, for killing anything.