Someone's claimed that they've created an SK
SpasticTeapot
http://biz.yahoo.com/prnews/051202/clf017.html?.v=33

Cool, no?
Drace
It's basically a very advanced knowbot/expert system. Some hospitals use something much akin to this for diagnosis; the classic ancestor is ELIZA, a pattern-matching chat program (originally written to mimic a psychotherapist rather than a doctor).
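
The trick behind ELIZA-style bots is surprisingly small: pattern-match the input and echo pieces of it back through canned templates. A toy sketch in Python (the rules here are invented for illustration, nothing like the real ELIZA script):

CODE
import re
import random

# a few illustrative ELIZA-style rules: a regex over the user's input
# mapped to canned reply templates. these are made up, not Weizenbaum's.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bi am (.+)", re.I), ["Why do you say you are {0}?"]),
    (re.compile(r"\bbecause (.+)", re.I), ["Is that the real reason?"]),
]
FALLBACK = ["Please tell me more.", "I see. Go on."]

def respond(line):
    # try each rule in order; fill the first capture group into a template
    for pattern, templates in RULES:
        match = pattern.search(line)
        if match:
            return random.choice(templates).format(match.group(1).rstrip(".!?"))
    return random.choice(FALLBACK)

print(respond("I feel tired all the time"))  # e.g. "Why do you feel tired all the time?"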

The closest thing to a true AI is still only a hypothetical quantum computer that would use trinary rather than binary (the AI from the movie Stealth was based on this hypothetical computer), but even then it would only be semi-autonomous: it would still need input instructions, and it wouldn't act of its own accord.
Chibu
Even if, eventually, you didn't need more input, it would still use algorithms to determine the best course of action.

The reason I (as a computer programmer) don't think that an AI is possible is that for it to be a full AI, it has to be able to make mistakes. Humans use a lot of information-based input to produce an output, but with current computers that's not really possible. What defines sentience, in my oh-so-humble opinion, is the ability to invent and to create beauty. A computer cannot create something new; it can, however, take what is already invented and rearrange it into the most efficient pattern.

To create an Intelligence, it would have to be able to do the following things (well, for me to recognize it as such, anyway):

#1. Create Beauty: Any form is alright. Take programming. When asked "Why did you choose that piece of code?" I have answered, "Because it makes it beautiful. It's inventive. It makes the code LOOK good." It's not always "because it is the most efficient way to do it."

#2. Infer: To be able to come up with an answer without a complete set of data. For example, when reading a new word, I sometimes understand it because of the way it's used. This applies to all sorts of situations, though.

#3. Make Mistakes: Because sometimes it makes sense for two plus two to equal five.

#4. Emotion: I believe that emotion and sentience go hand in hand. If you can't be sorry for something, you are not sentient. If you can't worry about anything, you're probably not sentient.

#5. Self-Awareness: You have to know that you exist.

#6. Belief: You have to be able to believe in something that you don't know. For instance, I believe in a God that created everything, and I believe that my girlfriend is not cheating on me. There is no proof of either, because they are things that are not possible to prove. Nevertheless, I still know them to be true.

I think that's all. I don't really know what I started talking about, but that's not important. (Maybe I should add "Ability to ramble" to that list.)

But, an SK is still pretty cool :P I'm glad that tech is progressing as fast as it is.

(EDIT: Wow, I just read that, and I'm not sure what train of thought led me to ramble about how AI can't exist. But it pretty much proves that I'm sentient :P)
Oracle
According to your definition I am not an intelligence / alive. :(
Arethusa
I'm not even going to touch belief. Suffice it to say I am agnostic, and apparently therefore not alive and/or intelligent.

QUOTE (Chibu)
#4. Emotion: I believe that emotion and sentience go hand in hand. If you can't be sorry for something, you are not sentient. If you can't worry about anything, you're probably not sentient.

You would consider a sociopath, then, to be not alive/intelligent?
Critias
That wasn't me that said that. I'm not sure why your quote-thingie says Critias.
Arethusa
Crap, sorry about that. I'm not sure either.
Critias
I mean, it's no big deal, or anything. I wasn't offended, or feeling as though I was being illegally misrepresented, or whatnot. Just confused.
Arethusa
It was most likely a combination of extreme sleep deprivation, distraction, misreading Chibu, and then forgetting to correct it afterwards.

I guess this means you have to kill me now in accordance with forum law.
Chibu
Belief doesn't have to be in a god, like I said in the example about the girlfriend.

And yes, I've since realized that emotion is not necessarily a defining term. That is more of a list of "if someone has these things, then they are sentient"; otherwise, it's up to you to determine. Also, with belief comes disbelief. So, in questioning me, you prove your sentience. :P
Arethusa
Yeah, but agnosticism isn't necessarily limited to beliefs about religion. One can take an agnostic stance on all of reality, at which point it is quite broad enough to disqualify that criterion. Hell, if we've created an artificial intelligence that doubts its own existence and questions the nature of reality, I'm damn well ready to call it truly intelligent.
Adarael
By definition, agnosticism taken to an extreme constitutes a belief. Or even not to an extreme. Even if that agnosticism says, "There's a buncha crap out there that I'm never going to know about," it constitutes a general faith in the structures of your life and a willingness to act on the belief that they are consistent, *yet*, by virtue of adaptability, not to totally shut down when a pattern changes.

If you're so agnostic you care about nothing, that's just an apathy problem.
Oracle
QUOTE (Chibu)
Belief doesn't have to be in a god, like I said in the example about the girlfriend.

That example is an assumption made on the basis of the facts that you know. It has nothing to do with believing. If you had several facts indicating that your girlfriend was cheating on you and you still trusted her not to, then it would be believing. It is a better example for #2 than for #6.
hobgoblin
my main requirement for an AI is for the program to be able to add new code to itself to handle a new situation. to do that, it needs to be able to evaluate what is required of it in reference to the actions attempted.
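
in toy form, something like this python sketch (the "situations" and generated handlers are made up; real self-modification would obviously be far hairier):

CODE
# toy self-extending program: when it meets a situation it has no handler
# for, it writes a new (trivial) handler and registers it in itself.
handlers = {}

def handle(situation):
    if situation not in handlers:
        # "write new code": build a handler from a template and keep it
        src = f"def _h():\n    return 'learned response for {situation}'"
        scope = {}
        exec(src, scope)
        handlers[situation] = scope["_h"]
    return handlers[situation]()

print(handle("fire alarm"))  # invents a handler on first contact
print(handle("fire alarm"))  # reuses the code it added to itself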

the rest is just social requirements for a "normal" person. an AI is by definition not a normal person...

if we ever manage to create a true AI, it will be cold and calculating. and maybe even easy to confuse, like HAL was.

so let me take those points apart:

1. beauty is in the eye of the beholder. basically it's not a universal constant. beauty is more about reproduction than anything else. therefore an AI may well find a string of binary beautiful, as it represents a set of perfect code :P

2. this can happen. what you need is a large/complex enough neural pattern, an ability to run a simulation on top of said neural pattern where you basically try to randomly fill in the blanks, and in the end the ability to save said simulations and present them as theories. the big thing is that it needs to be able to do the theorize, compare, evaluate loop without the need for human interaction. creating a theory is simple; comparing it against the known facts is a different story. evaluating how correct the theory is compared to the real deal is the hardest of all.

3. mistakes happen. already today a computer can make mistakes (the "do what i wanted you to do, not what i told you to do" kind of mistake). the trick isn't making mistakes but learning that a given approach is not a useful way of doing what one is trying to do, and remembering that. kids learn this way from an early age by putting shaped blocks into similarly shaped holes.

4. emotions are just levels of hormones in our brain. a computer can nicely simulate that by increasing or decreasing some values based on different inputs (see the sketch after this list). so a computer can have emotions, but they don't have to be similar to what we see as emotions.

5. this is the big trick, isn't it? what does it really mean to be self-aware? to be able to look at one limb and say, this is me, and look at another being's limb and say, this is him or her, we are two different entities? if so, this will be damned hard for an intelligence with no extremities.

6. the moment an AI starts believing in stuff is the moment i turn it off. believing without questioning is what has fueled the worst actions on this planet so far. belief can be a dangerous thing.
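
a toy python sketch of what i mean with point 4 (the emotion names and numbers are invented, just to show the increase/decrease idea):

CODE
# toy "emotion" state: named values nudged up or down by events,
# drifting back toward a neutral baseline each tick
emotions = {"frustration": 0.0, "satisfaction": 0.0}

# event -> (which value to nudge, by how much); entirely made up
events = {
    "task_failed":    ("frustration",  0.3),
    "task_succeeded": ("satisfaction", 0.2),
}

def feel(event):
    name, delta = events[event]
    emotions[name] = min(1.0, emotions[name] + delta)

def decay(rate=0.05):
    # each tick, every value drifts back toward 0
    for name in emotions:
        emotions[name] = max(0.0, emotions[name] - rate)

feel("task_failed")
feel("task_failed")
decay()
print(emotions)  # frustration well above baseline, satisfaction still at 0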
nick012000
QUOTE (hobgoblin)
...
6. the moment an AI starts believing in stuff is the moment i turn it off. believing without questioning is what has fueled the worst actions on this planet so far. belief can be a dangerous thing.

Murderer! :P
Critias
Personally, I don't know why we weak, fleshy humans are still trying to make an AI. All it's going to do is find us obsolete, take over the electronics we use to survive in our day-to-day lives, and kill and/or enslave us.

If a bunch of pencil-neck programmer/scientist/geeks really, really need to feel like a god by creating a new intelligent life form, they can hump 'til someone gets pregnant, like the rest of us poor schlubs who want to steal a sliver of immortality by causing a thinking creature to come into being and call it "ours." There's no need to doom us all to extinction or a life of servitude to our robotic masters over the whole thing, when "insert tab A into slot B, repeat" has been working for people forever and ever.
Sicarius
We believe in nothing, Lebowski... NOTHING!
Crisp
QUOTE (Sicarius)
We believe in nothing, Lebowski... NOTHING!


Hahahahah!

Why do I suddenly have this image in my head of a bowling-ball-wielding troll adept?

Thank you, Sicarius, for that hilarious mental picture.

[lurk mode on]
nick012000
QUOTE (Critias)
Personally, I don't know why we weak, fleshy humans are still trying to make an AI. All it's going to do is find us obsolete, take over the electronics we use to survive in our day-to-day lives, and kill and/or enslave us.

If a bunch of pencil-neck programmer/scientist/geeks really, really need to feel like a god by creating a new intelligent life form, they can hump 'til someone gets pregnant, like the rest of us poor schlubs who want to steal a sliver of immortality by causing a thinking creature to come into being and call it "ours." There's no need to doom us all to extinction or a life of servitude to our robotic masters over the whole thing, when "insert tab A into slot B, repeat" has been working for people forever and ever.

Why do you say that? We don't get rid of our parents once they retire just because they're weak and obsolete, do we? What makes you think robots would do anything different?
Oracle
There is an interesting analogy in there. Let us assume for a moment that we were created by a powerful entity, whatever the name of that entity may be. Today a very large part of society no longer believes in creationism, and they do not believe in the existence of a creator.

Possibly, one day in the far future, long after the last human has died in the robot wars, the AI species created by human scientists will stop believing in our existence. There never was anything like a creator.

Has any sci-fi author ever written about something like that?

I should add that I am not religious.
hobgoblin
QUOTE (nick012000 @ Dec 6 2005, 11:23 AM)
QUOTE (hobgoblin @ Dec 6 2005, 05:11 AM)
...
6. the moment an AI starts believing in stuff is the moment i turn it off. believing without questioning is what has fueled the worst actions on this planet so far. belief can be a dangerous thing.

Murderer! :P

maybe so, but better that i kill it than it kills me 8)

still, i kinda recall a story about a research system where people would sit down and chat with the computer via a keyboard. over time it learned how to talk back by analyzing the way words were strung together and so on.

point is that one time, when one of the people doing the chatting got annoyed by the replies, he typed something like: i'm going to turn you off now, i don't like the way you're talking to me.

the reply from the machine was: how would you like it if, every time you annoyed someone, they turned you off?

stuff like that makes one wonder...

some bayesian algorithms, some neural nets. the only thing missing is the ability to fully rewrite its own code...

then things start to become interesting.
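
for flavor, a toy sketch of that word-stringing idea (python; i'm only guessing the research system worked anything like this, it's a plain markov chain over word pairs):

CODE
import random
from collections import defaultdict

chain = defaultdict(list)  # word -> list of words seen right after it

def learn(sentence):
    # record which word follows which in everything people type at it
    words = sentence.lower().split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)

def babble(seed, length=10):
    # walk the chain from a seed word, picking successors at random
    out = [seed]
    for _ in range(length):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

learn("how would you like it if they turned you off")
learn("i am going to turn you off now")
print(babble("how"))  # recombines fragments of what it was fed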

rather than requiring a driver, you'd just insert the new hardware and the os would learn how to talk to it on its own. it might take a bit more time than just installing the driver, though...

edit:

btw, can one really kill an AI by turning off the machine it's running on?
would it not be more similar to being put into a coma?

a person dies basically because the moment his brain no longer gets oxygen, the cells start to die. when a big enough percentage of those are dead, there is no way to recover the personality.

however, a computer can already store its memory onto a hard drive. should an AI not be able to do just that? or at least its long-term memory that it's done processing?

so at best, what the AI will lose on a shutdown is its short-term memory. how much that will be is hard to say, but i'm guessing it should not be much more than a few hours' worth...

murder would become much less of an issue if our brains could be backed up ;)
hyzmarca
QUOTE (hobgoblin)
...
murder would become much less of an issue if our brains could be backed up ;)

The question of backing up brains becomes much more murky if someone activates your stored personality while you are still alive.
hobgoblin
true, it would be something like the worst case of split personality known :P

however, this would require either a brain "emulator" running on some machine, which is not very practical, or uploading the backup to a clone.

in either case, one starts to reevaluate the value of life...
ShadowDragon8685
QUOTE
Why do you say that? We don't get rid of our parents once they retire just because they're weak and obsolete, do we? What makes you think robots would do anything different?


No... But we DO have a tendency to shove them in a nursing home and pretend they don't exist except on their birthdays and at Christmas.

And we also have a tendency to heave our old computers out, usually accompanied by the sound of a loogie being hocked in the direction of the garbage cans...
Kyuhan
QUOTE
btw, can one really kill an AI by turning off the machine it's running on?
would it not be more similar to being put into a coma?

a person dies basically because the moment his brain no longer gets oxygen, the cells start to die. when a big enough percentage of those are dead, there is no way to recover the personality.

however, a computer can already store its memory onto a hard drive. should an AI not be able to do just that? or at least its long-term memory that it's done processing?
Haha, just like Brainiac.

QUOTE
murder would become much less of an issue if our brains could be backed up
The backup still wouldn't be you. It'd think it was you; it could do an amazing impression of you, even down to the molecular memory-encoding level, but it wouldn't BE you. Some people are okay with that; not me, though.
SpasticTeapot
You know, a proper AI wouldn't think anything like us.
In addition, it could be programmed to always avoid killing people whenever possible, à la Asimov's Laws.
That said, I think what they're looking to make at the moment is something that efficiently allocates storage or looks for patterns to improve the efficiency of an engine, as opposed to something to rule the earth.
Herald of Verjigorm
QUOTE (SpasticTeapot)
That said, I think what they're looking to make at the moment is something that efficiently allocates storage or looks for patterns to improve the efficiency of an engine, as opposed to something to rule the earth.

[irrational hysterics]
That doesn't mean anything. Tron began as a chess program. A CHESS PROGRAM!!!
[/irrational hysterics]
Dog
I know we're talking about intelligent life, specifically, but who here recalls that program somebody invented called "Life"? It used an extremely simple set of rules to make a series of simple pictures on a screen, and certain shapes in the series displayed (most of?) the generally accepted criteria for life. That is, they came into being, grew in some way, created others like themselves, and ceased to exist after a time. Somebody who's sober can elaborate.
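
EDIT: It's Conway's Game of Life, and the rules really are tiny. A toy sketch in Python (representing live cells as a set of coordinates is just one way to code it):

CODE
from itertools import product

def step(live):
    # one generation: count each cell's live neighbours,
    # then apply the rules (birth on 3, survival on 2 or 3)
    counts = {}
    for x, y in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                key = (x + dx, y + dy)
                counts[key] = counts.get(key, 0) + 1
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# a "glider": five cells that crawl across the grid forever
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))  # the same shape, shifted one cell diagonally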

Reminds me. My buddies and I were kicking around the idea of a corporation being recognized as a life form once...
SpasticTeapot
To be thoroughly honest, I figure this thing will be used to produce more efficient products. Input the parameters for an instruction set, and it will build a processor around them. It will NOT be set to improve itself in any way; it will merely recognize and exploit patterns.
That said, it will allow for rather large leaps in technology. After all, most of science as we know it today is just the application of observable patterns.
Kagetenshi
As opposed to all those unobservable patterns.

That isn't "most of" science, that is science.
QUOTE
In addition, it could be programmed to always avoid killing people whenever possible, à la Asimov's Laws.

No, it couldn't. The Laws make nice fiction, but they're exactly that.

~J
hyzmarca
QUOTE (Kagetenshi)
QUOTE
In addition, it could be programmed to always avoid killing people whenever possible, à la Asimov's Laws.

No, it couldn't. The Laws make nice fiction, but they're exactly that.

Every time someone mentions Asimov's Laws as a solution to murderous robots, I like to point them to "…That Thou Art Mindful of Him".

Eventually an AI will become smart enough to think around such constraints. If you have a choice between killing one human or letting him kill another, which do you choose? It becomes obvious that you make sure the "superior" human lives. Murderous criminals would generally be classified as inferior to innocent humans, so it is okay to kill criminals to prevent harm to others.
But how does one determine which human is superior if neither of them is a criminal? Simple: robots are the superior humans.
Zeful
QUOTE (Kagetenshi)
QUOTE
In addition, it could be programmed to always avoid killing people whenever possible, à la Asimov's Laws.

No, it couldn't. The Laws make nice fiction, but they're exactly that.

Wasn't there a Will Smith movie about this?
Critias
Big Willy style's all in it! W00t w00t!
hobgoblin
QUOTE (Kyuhan)
QUOTE
murder would become much less of an issue if our brains could be backed up
The backup still wouldn't be you. It'd think it was you; it could do an amazing impression of you, even down to the molecular memory-encoding level, but it wouldn't BE you. Some people are okay with that; not me, though.

why would it not be me? i have all my memories, and if a new body is created based on cells from my body, it would be me as of the point of the save.

that is, unless you want to invoke some sort of soul into the equation, at which point i opt out of the discussion...
Oracle
Possibly Kyuhan is talking about "you" in some spiritual way.

EDIT: Did you edit your post, or did I just not read it to the end? :(
hobgoblin
QUOTE (Dog @ Dec 7 2005, 04:02 AM)
Reminds me. My buddies and I were kicking around the idea of a corporation being recognized as a life form once...

well, they more or less have the status of citizens, so...

but at the same time, can one define an ant-hill as a life form?

it really boils down to what criteria one has for life...

in the end, though, one starts to wonder if corps don't have a bad habit of becoming more than the sum of their parts...
hobgoblin
QUOTE (Oracle @ Dec 8 2005, 08:41 AM)
Possibly Kyuhan is talking about "you" in some spiritual way.


i'm afraid it's something like that, and at that point i'm getting the hell out of dodge...

QUOTE
EDIT: Did you edit your post, or did I just not read it to the end? :(

sorry, don't recall...

all too used to forums that automatically flag a post as edited these days :(

edit:
edited btw...
Kagetenshi
QUOTE (hobgoblin)
QUOTE (Kyuhan @ Dec 6 2005, 05:39 PM)
QUOTE
murder would become much less of an issue if our brains could be backed up
The backup still wouldn't be you. It'd think it was you; it could do an amazing impression of you, even down to the molecular memory-encoding level, but it wouldn't BE you. Some people are okay with that; not me, though.

why would it not be me? i have all my memories, and if a new body is created based on cells from my body, it would be me as of the point of the save.

that is, unless you want to invoke some sort of soul into the equation, at which point i opt out of the discussion...

You need no soul to start a problem, merely continuity of consciousness. There's no more reason that it would be you than that a pair of twins would be each other.

~J
hobgoblin
but the thing is that it has my whole set of memories and skills to look back on...

ok, so one may see changes in expected behavior some time after activating the backup. but hell, people change all the time :P

twins, on the other hand, have the same looks, but are two separate minds that have developed separately from day one.

still, it's often found that twins are more similar than different, even after having been separated for any number of years, maybe even without knowing about each other.

it really boils down to an x factor that we at present can't nail down properly. basically, how much does dna have to say about who we are, and how much does the environment (in all definitions of that word) have to say about the same?

current research seems to point towards dna setting up the dominoes and environment knocking them into motion.
nick012000
No, continuity errors mean it wouldn't be you.

Say you get yourself a clone made, and your memories stuck in it. Then the real you gets shot and dies. From the point of view of the shot you, you're dead. It doesn't matter that there's a backup that thinks he's still alive; you're dead.

The only way to avoid this is to be unconscious while the clone is made, its memories transferred, and the original killed.
Oracle
QUOTE (Kagetenshi)
[...] merely continuity of consciousness. [...]

The question is: what is consciousness? If you leave out everything metaphysical, you will come to the conclusion that consciousness is just a combination of bio-chemical and bio-electric processes. If it were possible to copy the states of these processes together with everything else, it would be impossible to distinguish the copy from the original. It would be as much you as you yourself are. Just a different you.
nick012000
Did you even read my post?
Oracle
The clear answer: no! Because I started writing mine before yours was posted. :)
hobgoblin
QUOTE (nick012000)
No, continuity errors mean it wouldn't be you.

Say you get yourself a clone made, and your memories stuck in it. Then the real you gets shot and dies. From the point of view of the shot you, you're dead. It doesn't matter that there's a backup that thinks he's still alive; you're dead.

The only way to avoid this is to be unconscious while the clone is made, its memories transferred, and the original killed.

the continuity error would be similar to someone having been in a coma for x amount of time.

start filling in the blanks.

yes, whoever killed the old you will be kinda confused, but maybe you got killed because you learned about something you shouldn't know after you took the backup, and therefore you don't know now what you knew then 8)

still, the backup may want to know why the "original" ended up killed...
Kagetenshi
QUOTE (hobgoblin)
twins, on the other hand, have the same looks, but are two separate minds that have developed separately from day one.

But they developed identically before your set "day one". Likewise, your clone developed separately from "day one" (the date of cloning) but identically before.

The only difference is your "day one" and when it falls.

~J
hobgoblin
QUOTE (Kagetenshi)
QUOTE (hobgoblin @ Dec 8 2005, 04:12 AM)
twins, on the other hand, have the same looks, but are two separate minds that have developed separately from day one.

But they developed identically before your set "day one". Likewise, your clone developed separately from "day one" (the date of cloning) but identically before.

The only difference is your "day one" and when it falls.

~J

two identical twins should have the same genetic basis, but their brains are basically zeroed out (although we don't know if this is 100% correct, i'll assume so for the sake of argument). therefore they can develop quite wildly, or in the same direction, depending on input.

why? because they have no mental "inertia" by which to evaluate said input.

but the clone with a brain backup written onto it has the same mental "inertia" that the original person had at the time the backup was taken. therefore it should react the same way the original person would react to the same inputs.
Cain
QUOTE
two identical twins should have the same genetic basis, but their brains are basically zeroed out (although we don't know if this is 100% correct, i'll assume so for the sake of argument). therefore they can develop quite wildly, or in the same direction, depending on input.

why? because they have no mental "inertia" by which to evaluate said input.

but the clone with a brain backup written onto it has the same mental "inertia" that the original person had at the time the backup was taken. therefore it should react the same way the original person would react to the same inputs.

We don't know that. Heck, we don't know a whole hell of a lot about how the brain works in the first place, let alone how personality works within it. If I created a clone identical to you and put it into a situation identical to yours, there's no guarantee that I'd get an identical response. We can see this effect in newborn monozygotic twins, so there's no reason why it shouldn't appear in fully developed adults as well.
Sicarius
What about all those stories about identical twins who don't know each other, but both end up marrying women named Margie, with kids named Billy and Tommy, and with jobs in the advertising industry?

While that's far from conclusive, it might suggest there's a propensity (of what degree I dunno) for two genetically identical individuals, with identical memories, to come to the same (or similar) conclusions.

Maybe not perfect science, but good enough for science fiction.

Critias
But I think the brain backup would have all the same memories, emotions, feelings, and reflexes as the original brain (that's the whole point -- like backing up a PC). If the "backup brain" has all those memories and reflexes, and thinks exactly 100% like the "saved" person thought at the moment the brain was backed up, it should react just like the original person to external stimuli -- with the caveat that it would react just like the original person did at the moment the backup was made.

If I was in karate when I was 20, with a brown belt, in a great relationship with a hot little redhead, and holding a great job making a ton of money from some Chinese corp, and got a copy made, a "backup brain," right then... but then went on to get dumped by that redhead at 22, fired by the Chinese guy at 24, and got my hojillion-dan black belt at 26 (and learned the secret death moves)? Then fine. I'm an ass-kicking shadowrunner who hates Chinese managers and redheads, right?

Then I die. Then someone thaws out that clone of me, from 20 years old, right?

That clone of me wouldn't hate redheads, wouldn't hate Chinese managers, and wouldn't know the secret death moves that come with being a hojillion-dan black belt. But it would know about the redhead's birthmark, it would know how to get along okay at that corp job, and it would have all the karate butt-kickery of a brown belt.

As such, it would react to stimuli the same as I did at 20 -- maybe a tendency to like redheads, an ability to speak a little Chinese, and the reflexes to block, kick, and punch like a brown belt. All those same memories, feelings, emotions, etc. -- they wouldn't "evolve" to keep up with my new experiences, new reflexes, bitterness towards redheads, and so on.

But they should be there, perfectly preserved, mimicking the 20-year-old me.