Full Version: AI question
Dumpshock Forums > Discussion > Shadowrun
AKWeaponsSpecialist
So, should an AI wish to (for whatever reason), could it copy its own source code onto another node, thereby creating an identical AI? And if it can do so, could it also find another (willing) donor AI to create a hybrid offspring program?
remmus
Unless written very differently, an AI in Shadowrun, for all its intelligence, still thinks like a machine. It doesn't operate under the same concepts as a living being, including things like reproduction and "passing on its legacy."
Namelessjoe
I think if it were an e-ghost or an evolved BTL, the AI might want to reproduce, or at least copulate... I think there isn't really anything stopping reproduction. Didn't the splintering of Deus (the arcology AI, I think) make splinter AIs that tried to recompile the original?
Brazilian_Shinobi
Aside from an academic/curiosity viewpoint, I don't see why an AI would want to have "offspring." I could see it copying itself into a secure place or something, but that would just be cloning. Possibly the AI could even get together with another AI and use some kind of genetic algorithm to create a third, new AI that merges the two codebases.
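For the curious, that "genetic algorithm" merge could be sketched like this (purely illustrative; all names are hypothetical, and it treats each parent program as a flat byte sequence, which is a gross simplification of anything a real AI would be):

```python
import random

def crossover(parent_a, parent_b):
    """Uniform crossover: each 'gene' is taken from one parent at random."""
    return [a if random.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]

def mutate(genome, rate=0.01):
    """Rarely replace a gene outright, to introduce novelty."""
    return [random.randrange(256) if random.random() < rate else g
            for g in genome]

# Two 'parent' programs represented as equal-length byte sequences.
parent_a = [random.randrange(256) for _ in range(64)]
parent_b = [random.randrange(256) for _ in range(64)]

# The 'child' shares material from both parents, plus a little mutation.
child = mutate(crossover(parent_a, parent_b))
```

The catch, as the rest of the thread argues, is that the child would just be a program; nothing here makes it sentient.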
Jericho Alar
At least in SR3, AIs could only be created by incubating the source code on a sufficiently powerful host, with sufficient data input, for a sufficiently long period of time (i.e., possible, but prohibitively so).

In fact, Renraku was the only megacorporation able to pull it off, and even then it wasn't really intentional...

The military managed to do it once too, sort of, also unintentionally.

So there's no indication in Shadowrun that simply copying the source would create a new entity. It would appear to recreate the same entity; at least with respect to how Deus/Morgan seem to work, duplications of the source are still the same consciousness, just in multiple places...

...this might have something to do with Naming from the ED continuity.
Matsci
IIRC, either Unwired, Runner's Companion, or Running Wild covered this. Copying an AI ends up with a non-sentient version of the code that the AI emerged from. Copying an AI that evolved from a stealth program leads to a stealth program, frex.
Brazilian_Shinobi
So, AIs are some sort of "magical" consciousness that comes from the Resonant/Dissonant Realms?
Jericho Alar
QUOTE (Brazilian_Shinobi @ Nov 18 2009, 02:12 PM) *
So, AIs are some sort of "magical" consciousness that comes from the Resonant/Dissonant Realms?


No.

if I make a copy of your DNA is it sentient?
Brazilian_Shinobi
QUOTE (Jericho Alar @ Nov 18 2009, 05:26 PM) *
No.

if I make a copy of your DNA is it sentient?


If you put it into the right vessel and wait, yeah!
I understand your point of view, though. But since we are talking about creatures that live in the virtual realm instead of the real one, where their code is exactly their "soul", then yeah, copying it would create a copy of the creature.
Ol' Scratch
As others have basically said, AIs aren't simply code. The sentience they gain occurs because of some hitherto-unknown X-Factor™ (there's actually a term they use for it, but I'll be damned if I can remember what it is). It's occurring more and more frequently as the years roll by, but the programs that evolve to full sentience are varied and completely random. Your toaster has as much of a chance of becoming sentient as, say, a rating 12 knowsoft cluster composed of the collected knowledge of all humanity. If it were just a matter of copying the code, every toaster the world round would be sentient.
Karoline
QUOTE (Brazilian_Shinobi @ Nov 18 2009, 02:12 PM) *
So, AIs are some sort of "magical" consciousness that comes from the Resonant/Dissonant Realms?


More or less. The fluff text says the process by which a program becomes an AI isn't really understood. Presumably (given that you can't just Ctrl+C, Ctrl+V your way to an army of AIs) there is something connected to Resonance involved in their creation, not simply some property of their source code. This also shows in the fact that they tax their home node, and enhance it, in a way completely inconsistent with any other program in existence.

I'd guess it isn't a consciousness that comes from Resonance, though, so much as Resonance warping the program; sort of like how magical critters aren't from the astral plane, but are simply warped by the magic from it.
Karoline
*cough*
Traul
As for nuyen or the lack of software piracy, you need to come up with some handwaving about the Matrix hardware for this to work. While I agree that AIs cannot be described only by their code, anything that exists in the binary computers we know can be fully described by its state: code, execution pointer, and memory snapshot. It is not about how the AI awoke or what it is made of; it is about what its home node can express.

My magic word for that is quantum computing: memory in the 2070s is internally randomized, and only certain quantum states can lead to an AI awakening. How those states occur (and, most importantly, how often they occur compared to their expected frequency in a purely random case) leaves room for any kind of supernatural mojo. One could also make those states impossible to duplicate: any attempt would destroy the original.
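That "code, execution pointer and memory snapshot" triple can be made concrete with a toy machine (hypothetical sketch; Python standing in for whatever the Matrix runs on). Copying all three mid-run yields a second process that resumes from exactly the same point:

```python
import copy

class TinyVM:
    """A toy machine whose entire runtime state is explicit: the program
    (code), an execution pointer (pc), and memory."""
    def __init__(self, code):
        self.code = code      # the program: a list of (op, args) tuples
        self.pc = 0           # execution pointer
        self.memory = {}      # memory snapshot lives here

    def step(self):
        op, args = self.code[self.pc]
        if op == "set":
            self.memory[args[0]] = args[1]
        elif op == "add":
            self.memory[args[0]] += args[1]
        self.pc += 1

    def snapshot(self):
        # Copying code + pc + memory captures the whole running process.
        return copy.deepcopy(self)

program = [("set", ("x", 1)), ("add", ("x", 2)), ("add", ("x", 3))]
vm = TinyVM(program)
vm.step()                   # run partway...
clone = vm.snapshot()       # ...then duplicate the full state mid-run
vm.step(); vm.step()        # the original finishes
clone.step(); clone.step()  # the copy resumes from the same point
print(vm.memory["x"], clone.memory["x"])  # 6 6
```

Both copies arrive at the same result; after the snapshot they would only diverge if fed different input, which is exactly the "same AI, now in two places" problem.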
Karoline
QUOTE (Traul @ Nov 18 2009, 05:51 PM) *
One could also make those states impossible to duplicate: any attempt would destroy the original.


Well, if you're dealing with quantum states, it isn't simply impossible to duplicate them; it is impossible to even know them, since measuring them collapses the superposition. Basically, quantum mechanics holds that a particle doesn't just happen to be in spot A or spot B; it actually exists as a probability of being in spot A or being in spot B. By discovering which spot it is -actually- in, you leave it in that spot with 100% probability, and thus it is no longer the same as it was before you looked.
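A toy model of that collapse (purely illustrative, not real quantum mechanics; the class name and probabilities are made up):

```python
import random

class ToyQubit:
    """Exists as a probability of 'spot A' vs 'spot B' until observed."""
    def __init__(self, p_a=0.5):
        self.p_a = p_a          # probability of being found in spot A
        self.outcome = None     # None = not yet measured

    def measure(self):
        if self.outcome is None:
            # Observation forces a definite result...
            self.outcome = "A" if random.random() < self.p_a else "B"
            # ...and afterwards the state is that result with certainty.
            self.p_a = 1.0 if self.outcome == "A" else 0.0
        return self.outcome

q = ToyQubit(p_a=0.3)
first = q.measure()
# Every later measurement agrees: the original probabilities are gone.
repeats = [q.measure() for _ in range(10)]
```

Before the first `measure()` there is nothing definite to copy; after it, the interesting part of the state has already been destroyed.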
Mordinvan
QUOTE (Jericho Alar @ Nov 18 2009, 01:26 PM) *
No.

if I make a copy of your DNA is it sentient?

Depends on whether the DNA is allowed sufficient run time or not.
If yes, and in the proper environment, then yes, you do get a sentient creature.
Also, ALL an AI is, is stored on the computer; if all of it is copied, you should get an exact copy of the AI.
Traul
Maybe I should not have used the phrase "quantum state," as it has another meaning in physics. What I meant was the computational state of this quantum memory, no matter how it is defined or how it is stored on the physical device. There is a layer of engineering to turn quantum properties into whatever properties you want your system to have, and there is still plenty to discover there.
Karoline
QUOTE (Traul @ Nov 18 2009, 06:23 PM) *
Maybe I should not have used the phrase "quantum state," as it has another meaning in physics. What I meant was the computational state of this quantum memory, no matter how it is defined or how it is stored on the physical device. There is a layer of engineering to turn quantum properties into whatever properties you want your system to have, and there is still plenty to discover there.


Oh, right, that kind of quantum. spin.gif

I like my theory that computers somehow use quantum mechanics to operate, and that's how they manage to create AIs; that, or the Resonance thing.

I think all this leads to an answer for the original question: no, an AI couldn't reproduce in any direct way and create a new AI. It could perhaps somehow combine its code with another AI (or any other program) and hope that the resulting program eventually becomes an AI, then claim it as its 'son' or 'daughter', but that seems unlikely to me.

Now, I could envision two AIs falling in love and adopting a younger AI as a 'child' of sorts, but any sort of actual procreation seems unlikely.

Oh, and as for Daedalus (or Deus, or whatever he was called) having splinters of his program that were AIs, I'd imagine that has something to do with him being on an utterly different level than the AIs the rules provide for.
Jericho Alar
QUOTE (Brazilian_Shinobi @ Nov 18 2009, 04:03 PM) *
If you put it into the right vessel and wait, yeah!
I understand your point of view, though. But since we are talking about creatures that live in the virtual realm instead of the real one, where their code is exactly their "soul", then yeah, copying it would create a copy of the creature.


QUOTE (Mordinvan @ Nov 18 2009, 06:12 PM) *
Depends on whether the DNA is allowed sufficient run time or not.
If yes, and in the proper environment, then yes, you do get a sentient creature.
Also, ALL an AI is, is stored on the computer; if all of it is copied, you should get an exact copy of the AI.


QUOTE (Myself @ Nov 18 2009, 12:58 PM) *
At least in SR3, AIs could only be created by incubating the source code on a sufficiently powerful host, with sufficient data input, for a sufficiently long period of time.


(Setting aside metaphysical arguments over whether we have souls or not.) Source code is not Deus's 'soul' any more than my DNA is mine. My DNA is not my consciousness; it is merely a blueprint for the design that leads to my consciousness after a sufficient period of incubation and stimulus. This is amply proven, at least in SR3 canon, by Morgan (Megaera), Deus, and Megaera/Deus several times. Deus could transfer himself, but it was a massive undertaking and involved moving his executing program, not his source.

You guys got the point otherwise, though; I think you just missed who in the thread in particular was making it nyahnyah.gif
Jericho Alar
QUOTE (Traul @ Nov 18 2009, 05:51 PM) *
As for nuyen or the lack of software piracy, you need to come up with some handwaving about the Matrix hardware for this to work. While I agree that AIs cannot be described only by their code, anything that exists in the binary computers we know can be fully described by its state: code, execution pointer, and memory snapshot. It is not about how the AI awoke or what it is made of; it is about what its home node can express.

My magic word for that is quantum computing: memory in the 2070s is internally randomized, and only certain quantum states can lead to an AI awakening. How those states occur (and, most importantly, how often they occur compared to their expected frequency in a purely random case) leaves room for any kind of supernatural mojo. One could also make those states impossible to duplicate: any attempt would destroy the original.


Science hasn't quite advanced to the point where it's possible to determine yet, but it may be that biological entities can also be fully described by their state (state being somewhat more complex, but minimally including internal chemistry, to cover the execution pointer and memory snapshot, and obviously DNA as source code).

Resuming is obviously somewhat more complicated with biologicals, but you get the point. Generally, the 'spark' that generates the AI is experiential stimulus on a program that self-modifies during runtime (something we generally -don't- do today, despite having the technical capability*), combined with mind-staggeringly powerful hardware and a continuous runtime measured in years.

As for quantum computing: IF it becomes popular (using qubits for storage, for instance), it will only be so if the storage methods are deterministic; non-deterministic RAM is useless RAM unless you're using it only for random number generation.

*Because it creates the potential for non-deterministic behavior in programs, which is generally discouraged, since we currently use programs for predominantly deterministic tasks.
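The "self-modifies during runtime" idea can be sketched like so (hypothetical toy; here, arbitrarily, every even-numbered stimulus rewrites the program's own handler, so two copies of identical source diverge with their individual histories):

```python
class SelfModifying:
    """Identical source; behavior depends on accumulated runtime history."""
    def __init__(self):
        self.handler = lambda x: x          # initial behavior: identity

    def experience(self, stimulus):
        if stimulus % 2 == 0:               # some stimuli trigger a rewrite
            old = self.handler
            # The program replaces its own handler, layering in the stimulus.
            self.handler = lambda x, old=old, s=stimulus: old(x) + s
        return self.handler(stimulus)

a, b = SelfModifying(), SelfModifying()
for s in [2, 3, 4]:
    a.experience(s)
for s in [1, 5, 7]:
    b.experience(s)

# Same 'source code', different experiences, different behavior:
print(a.handler(0), b.handler(0))   # 6 0
```

Copying either object's *source* gets you the identity handler back; only copying its accumulated runtime state reproduces what it has become.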
Ol' Scratch
An easier example, I think, is just your average PC and operating system. Leave it on long enough and it'll start acting weird: memory leaks, crashes, etc. All seemingly random and hard to reproduce, and unless it's a specific glitch (as opposed to one that just comes from being on for so long), it's completely cured by a simple reboot. You could copy the program as much as you wanted, but the weird memory leaks and crashes wouldn't happen just by running the copy. It's all the other little things that occur during the course of its operation that create the weird glitches.
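The analogy in code (a hypothetical sketch: a cache that only ever grows over uptime; copying the program gives you the clean code, not the accumulated cruft):

```python
class Service:
    """Toy long-uptime process: runtime state that only ever grows."""
    def __init__(self):
        self.cache = []               # lives for the lifetime of the process

    def handle(self, request):
        self.cache.append(request)    # never evicted: a slow leak
        return len(self.cache)

svc = Service()
for r in range(1000):                 # a long 'uptime' accumulates state
    svc.handle(r)

fresh_copy = Service()                # 'copying the program' starts clean
print(len(svc.cache), len(fresh_copy.cache))  # 1000 0
```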

And in the end, that's what an AI is. A freaky glitch.
Karoline
QUOTE (Dr. Funkenstein @ Nov 18 2009, 08:10 PM) *
And in the end, that's what an AI is. A freaky glitch.


If I ever have a hacker type that dislikes AIs for some reason, I'll have to remember that one biggrin.gif
Traul
Once again, it does not work if you copy only the program, but it does work if you copy the program code and its running memory. Your OS does that all the time to switch between running processes.

If this were possible for AIs, the new AI would not even need to "grow": since all the information is contained in the state, what you would get is a copy of the same AI with the same memories, experiences, and views on everything. From there, the two would start to live their separate lives.

QUOTE
As for quantum computing: IF it becomes popular (using qubits for storage, for instance), it will only be so if the storage methods are deterministic; non-deterministic RAM is useless RAM unless you're using it only for random number generation.


That is only if the computing model is still based on deterministic logic. It might not be the only one possible, even if we don't know how to do without it for now. Especially if it helps the plot wink.gif
Karoline
QUOTE (Traul @ Nov 18 2009, 08:36 PM) *
That is only if the computing model is still based on deterministic logic. It might not be the only one possible, even if we don't know how to do without it for now. Especially if it helps the plot wink.gif


"Full engines!"
"Engines at full sir. We are now moving at the speed of plot sir."
Neraph
QUOTE (Matsci @ Nov 18 2009, 12:19 PM) *
IIRC, either Unwired, Runner's Companion, or Running Wild covered this. Copying an AI ends up with a non-sentient version of the code that the AI emerged from. Copying an AI that evolved from a stealth program leads to a stealth program, frex.

Actually, Runner's Companion said you can't copy an AI's source code. I'll get a page number sometime.
Jericho Alar
QUOTE (Traul @ Nov 18 2009, 08:36 PM) *
That is only if the computing model is still based on deterministic logic. It might not be the only one possible, even if we don't know how to do without it for now. Especially if it helps the plot wink.gif


Non-deterministic hardware would generally be a scary proposition if you wanted to actually calculate anything nyahnyah.gif
Brazilian_Shinobi
Yeah, OK, I'll buy that. If we start using quantum computers in a non-deterministic environment, you're sure to get some unexpected freaky states. And I'll totally steal the "that's what an AI is. A freaky glitch." quote rotate.gif
Godwyn
"that's what an AI is. A freaky glitch." - Now saved to internal memory because it is that awesome. Though in our current campaign I have an AI player living in my internal commlink . . .
Dumpshock Forums © 2001-2012