> Can a decker make an "AI"? Well, I hope so -.^
Cynic project
post Dec 22 2004, 10:56 PM
Post #1


I am so sorry, my internet went stupid... please forget this post. And if someone can delete this thread, please do.
Walknuki
post Dec 22 2004, 10:56 PM
Post #2


Maybe.
Cray74
post Dec 22 2004, 11:17 PM
Post #3


Agent ratings roughly equate to Rigger 3's robotic pilot ratings. A rating 4-5 Agent is therefore comparable to a well-trained human in some intellectual respects.

However, it's relatively easy for a decker to program a rating 6, 8, or even 10 agent. With a good program plan and programming suite, it won't take years to code, either, and the resulting Agent should be (using the Rigger 3 robotic pilot scale) pretty darn smart.

Maybe a high level Agent doesn't sit back and say, "I think, therefore I am," and thus isn't a True AI, but it's smart in a brute force problem solving fashion, enough to be considered intelligent by other standards.

In other words, above a certain level, whether the answer is "yes" or "no" depends on what you define as "artificially intelligent."
Cynic project
post Dec 22 2004, 11:23 PM
Post #4


So using the parameters that I set up, the answer is yes, a shadowrunner can make an AI.

"By 'AI' I mean a program that is capable of learning from outside stimuli."

It is arguable whether the makers of Shadowrun mean an uber AI with god-like abilities, i.e. Deus, when they say AI.
mfb
post Dec 22 2004, 11:45 PM
Post #5


well, sorta. i believe you can make a frame or agent that incorporates the Cascading option; that's basically learning from outside stimuli. beyond that, no; what you're talking about is effectively an S-K, and lone deckers can't make those.
sidartha
post Dec 23 2004, 01:14 AM
Post #6


Mr Woodchuck and I just had this conversation a few days ago, weird eh?
What we came up with was that an AI in Shadowrun, by the demigod definition, has to be a VERY high rating SK, pushing twenty on a scale from one to ten.
It has to be unique in its creation or duties, unlike your run-of-the-mill go-get-the-info SK.
It has to display an emotion as its X factor.
For instance, Deus displayed pride, Morgan displayed love, and Mirage displayed compassion.


So far eleven megacorps working for two years haven't been able to reproduce AIs beyond the first three.
If you want to give your players that kind of power, be my guest. Just remember the Arcology ;)
Cray74
post Dec 23 2004, 03:15 PM
Post #7


QUOTE (Cynic project)
So using the parameters that I set up, the answer is yes, a shadowrunner can make an AI.

"By 'AI' I mean a program that is capable of learning from outside stimuli."


By that definition, quite a few real life programs are already AI.
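
For illustration only, here is a minimal hypothetical sketch in Python (invented for this post, nothing from the Shadowrun rules or any real product) of a program that "learns from outside stimuli" in that narrow sense: an online perceptron that changes its behavior as feedback comes in.

CODE
# Minimal sketch of a program that "learns from outside stimuli":
# an online perceptron that adjusts its weights after every observation.
# Purely illustrative; all names and data here are made up.

def train_step(weights, bias, inputs, label, lr=0.1):
    """Update weights from one (inputs, label) observation; label is +1 or -1."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    prediction = 1 if activation >= 0 else -1
    if prediction != label:  # learn only from mistakes
        weights = [w + lr * label * x for w, x in zip(weights, inputs)]
        bias += lr * label
    return weights, bias

# "Outside stimuli": a stream of observations with feedback attached.
stream = [([1.0, 0.0], 1), ([0.0, 1.0], -1), ([0.9, 0.1], 1), ([0.2, 0.8], -1)]
weights, bias = [0.0, 0.0], 0.0
for inputs, label in stream:
    weights, bias = train_step(weights, bias, inputs, label)
print(weights, bias)  # the program's behavior has changed because of the stimuli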

QUOTE
It is arguable whether the makers of Shadowrun mean an uber AI with god-like abilities, i.e. Deus, when they say AI.


QUOTE
So far eleven megacorps working for two years haven't been able to reproduce AIs beyond the first three.


Those are goofy cinematic AIs with Super Powers. S-Ks and AIs in Shadowrun are marked by super control of the Matrix beyond the ken of deckers and even the ability to manipulate human brains to produce Otaku. Bleh.

If your interest is NOT in god-like beings, but rather simply thinking programs, I think the bar is set much lower. A high rating Agent should be able to make that leap to, "I think, therefore I am," with a little experience and polish.

Raising emotions to some high pedestal beyond the ken of normal machines is Hollywood-influenced thinking. Agent software that experiences a "priority shift in tasking due to threatened self-dissolution by IC" has just experienced fear of being killed and is responding by getting ready to fight. Programs with positive feedback loops to encourage certain learning behaviors (like an Agent learning a creator's habits) experience what amounts to pleasure in their success.
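
To make the "priority shift" and "positive feedback loop" ideas concrete, here is a minimal hypothetical sketch in Python (invented names, not a rules mechanic): an agent that reweights its task queue when it detects a threat, and reinforces whichever defensive tactic has worked before.

CODE
# Hypothetical sketch of the two behaviors described above:
# (1) a priority shift toward self-preservation when a threat appears, and
# (2) a positive feedback loop that reinforces tactics that have succeeded.
import random

class ToyAgent:
    def __init__(self):
        self.tasks = {"search_data": 1.0, "defend_self": 0.1}
        self.tactic_weights = {"evade": 1.0, "counterattack": 1.0}

    def perceive(self, threat_detected):
        # "Fear": self-preservation jumps to the top of the task queue.
        if threat_detected:
            self.tasks["defend_self"] = 10.0

    def act(self):
        task = max(self.tasks, key=self.tasks.get)
        if task != "defend_self":
            return task, None
        # Pick a defensive tactic, weighted by past success.
        tactics = list(self.tactic_weights)
        weights = [self.tactic_weights[t] for t in tactics]
        return task, random.choices(tactics, weights=weights)[0]

    def feedback(self, tactic, succeeded):
        # "Pleasure": success makes the same tactic more likely next time.
        if tactic and succeeded:
            self.tactic_weights[tactic] *= 1.5

agent = ToyAgent()
agent.perceive(threat_detected=True)
task, tactic = agent.act()
agent.feedback(tactic, succeeded=True)
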
Toptomcat
post Dec 23 2004, 06:46 PM
Post #8


With high skill, luck, Karma, a drek-hot deck, a talented programming team, a spark of genius, and plenty of player motivation, yes, if only a minor or flawed one.
That's my philosophy, anyway, when GMing: 'don't say no, say how hard.'
Cray74
post Dec 23 2004, 07:05 PM
Post #9


QUOTE (Toptomcat)
With high skill, luck, Karma, a drek-hot deck, a talented programming team, a spark of genius, and plenty of player motivation, yes, if only a minor or flawed one.
That's my philosophy, anyway, when GMing- 'don't say no, say how hard.'

So...what would the dread be of an AI, assuming it has no extra powers beyond that of an Agent or Smart Frame?

It'd kind of be like a Free Spirit or Ally, but without the Matrix sorcery powers, right?

Or maybe just a contact/ally?
mfb
post Dec 23 2004, 07:10 PM
Post #10


i dunno. if all of the megas can't get one working on purpose, i don't see a single decker doing it. multiple tests at TN 25-30--that's not a "no", technically.
BitBasher
post Dec 23 2004, 07:41 PM
Post #11


I don't see a single decker ever having access to an ultraviolet host for a few years to leave code running, which is a requirement. Much less someone SINless, without the millions and millions of nuyen needed to rent the processing power of a billion-nuyen mainframe.
Cray74
post Dec 23 2004, 07:51 PM
Post #12


QUOTE (mfb)
i dunno. if all of the megas can't get one working on purpose, i don't see a single decker doing it.

Well, yeah, but look at the deities-in-a-box the megacorps try to make.

What if the decker's goal is a human-in-a-box?
mfb
post Dec 23 2004, 07:53 PM
Post #13


the corps haven't managed that, either.
Cray74
post Dec 23 2004, 08:20 PM
Post #14


QUOTE (mfb)
the corps haven't managed that, either.

Meh. By the time you get to a rating 10 Agent, you've got a program smarter than most humans. What's the difference if it also asks a few existential questions?
Moon-Hawk
post Dec 23 2004, 08:35 PM
Post #15


QUOTE (Cray74)
QUOTE (mfb @ Dec 23 2004, 07:53 PM)
the corps haven't managed that, either.

Meh. By the time you get to a rating 10 Agent, you've got a program smarter than most humans. What's the difference if it also asks a few existential questions?

Well, maybe a very, very high rated agent is capable of going "AI". The point is, it would need to run for years before it ever thought to ask an existential question, and even then would need some sort of X-factor to get it thinking along those lines.
mfb
post Dec 23 2004, 08:52 PM
Post #16


well, for one, it's not going to act like a program that's under the decker's control. it stops being an agent that the character uploads, and becomes a contact.

also, AI is one of the Big Mysteries in SR. if you allow anyone with a high programming skill to put one together anytime they want, it diminishes the mystery.

now, if you're giving a high-end agent or S-K a 'personality' of sorts, i'm all for that. one of my otaku has a daemon, rating 8, named Furious George. i run him like a separate character; the otaku gives him a job, and he's smart enough to go out and do it himself. but he's not actually intelligent--just programmed to seem that way.
Moon-Hawk
post Dec 23 2004, 09:09 PM
Post #17


So can a decker create an AI? I think that depends on the degree of intention.
Let me explain; no, there's too much; let me sum up:
Could a decker set out to program an AI, write a program, then have it be an AI? No way.
Could a decker write a sophisticated, adaptive program like an Agent, SK, or Daemon that could, someday, after months or years of run time and a mysterious X-factor become an AI? Sure, why not? It'd be fun.
Could a decker manipulate the environment of said program to increase the likelihood of it becoming an AI? I dunno, maybe.

But as for just making an adaptive program that learns from experience, that's easy; that's just an Agent.
Kagetenshi
post Dec 23 2004, 09:33 PM
Post #18


QUOTE (sidartha)
It has to display an emotion as its X factor.

Ugh. The X-factor, whatever it is, is a cause rather than an effect; displaying an emotion could be indicative of the X-factor having been triggered, but I can't imagine how the emotion would be the cause. Also agreed with Cray on the topic of emotions and their "specialness".

~J
mfb
post Dec 23 2004, 09:41 PM
Post #19


i disagree. for one, re-prioritizing in order to maximize self-preservation isn't necessarily fear--more likely, the program simply has a standing order to avoid destruction. now, a program that doesn't have a standing order to preserve itself, that changes its actions in order to avoid danger? that's fear, and that is something special and cool.
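
To make the distinction concrete, here is a hypothetical Python sketch (invented for illustration, not from the rules): the first program avoids danger only because a standing order tells it to; the second has no such order, but ends up avoiding danger anyway because it learns from the damage it has taken.

CODE
# Hypothetical contrast between a standing order and learned avoidance.

def scripted_agent(state):
    # Hard-coded standing order: if threatened, always flee.
    return "flee" if state["threat"] else "work"

class LearningAgent:
    """No built-in rule about danger; it just comes to prefer actions
    that have produced less damage in the past."""
    def __init__(self):
        self.avg_damage = {"work": 0.0, "flee": 0.0}
        self.counts = {"work": 0, "flee": 0}

    def choose(self):
        # Prefer whichever action has historically hurt less.
        return min(self.avg_damage, key=self.avg_damage.get)

    def observe(self, action, damage):
        self.counts[action] += 1
        n = self.counts[action]
        self.avg_damage[action] += (damage - self.avg_damage[action]) / n

agent = LearningAgent()
# After a few painful experiences while working under attack...
for _ in range(3):
    agent.observe("work", damage=5.0)
agent.observe("flee", damage=0.0)
print(agent.choose())  # -> "flee", though no rule ever told it to avoid danger
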
SirKodiak
post Dec 25 2004, 04:57 AM
Post #20


QUOTE
i disagree. for one, re-prioritizing in order to maximize self-preservation isn't necessarily fear--more likely, the program simply has a standing order to avoid destruction.


What you're getting into here is the claim that there is a difference between something that acts exactly like fear, and fear itself. This is where you get into the big philosophical question behind things like the Turing Test. If it quacks like a duck, walks like a duck, and acts like a duck, if it is indistinguishable from a duck, does that make it a duck?

Anyways, the main question here is a little vague because the definition of AI in the real world is a huge argument among researchers, and the definition of AI in Shadowrun is stupid. So, to answer a couple of more specific questions (all this being in my own opinion):

Can a Shadowrunner make a Shadowrun-style AI, an online god? No. The resources required are way beyond anything a Shadowrunner should ever see, unless you let your Shadowrunners own megacorps.

Can a Shadowrunner make an adaptive, learning program that can compete with a human for very specific tasks? Yes, these already exist now, and also exist in the Shadowrun books.

Can a Shadowrunner make a Virtual Personality, which interacts like a human being? These aren't really gone into in Shadowrun, but given the level of technology they have, this should be possible. I'd let them have it, though I'd do it by just making them a technology that exists in the world. You'd find these used instead of voice mail, instead of phone menus, instead of all the things we've stopped using people for but that currently require you to hit buttons on your phone to operate. These are easy to add because they don't really break too many things, and they make the computer scientist in me less crazy about the way computers work in Shadowrun.
SirKodiak
post Dec 25 2004, 04:57 AM
Post #21


Board went wonky on me, resulting in double post. Please ignore. Sorry!
Zeel De Mort
post Dec 25 2004, 07:30 PM
Post #22


Quotes from Matrix:

QUOTE
Agents are roughly equivalent to robots, and are capable of learning and adapting their behavior to suit new conditions.

p88

- That's not AI by any means. A high rating agent would be very advanced and could even be better than a human decker if its rating were REALLY high, but it's still nothing like an AI. Interestingly, though, Agents have no ceiling on their rating, so you could theoretically have one at rating 20 or something if your Computer (Programming) skill were also 20.

QUOTE
They [SKs] are the most complex programs written...  In game terms, programming SKs requires the use of, at minimum, a Red-10 host and programming resources equal to a half-dozen top programmers.

The frame-core rating of an SK can be any rating, with a maximum of 14.

both p147

The SK is a far better platform to develop your AI from. If it requires half a dozen top programmers (I think VR 2.0 said Computer(Programming) skill of 12 each?), then I guess, perhaps, a REALLY hot decker could do it on his own if he was good enough, spent ALL his time working on it, and had a wicked programming suite to work in. Again, if you were amazing, you could continually boost programming time on a Red-10 host to create your SK. I'm sure they'd notice something very weird was going on as you drained all their resources but, again, possible. You might even be able to get away with it if you were a bit of a matrix legend.

The last part is a bit more fuzzy. But basically you have your rating 14 SK online for months or years, experiencing new things, roaming around extremely high level hosts, hopefully getting pulled into UV environments now and then, just waiting for that final spark (i.e. GM say-so) to launch it to AI status.


In summary, in my opinion, it'd be possible for a lone decker to create an AI, but it would be ridiculously unlikely and would require extreme dedication, resources, and good fortune.
mfb
post Dec 25 2004, 08:07 PM
Post #23


QUOTE (SirKodiak)
What you're getting into here is the claim that there is a difference between something that acts exactly like fear, and fear itself.

yes, but a program designed to alter priorities to ensure its own continued existence is not at all indistinguishable from a duck. you can't dissect a duck and find the piece that makes it do things like fight for its life--with a program designed to ensure its own survival, you can.
SirKodiak
post Dec 26 2004, 02:40 AM
Post #24


QUOTE
yes, but a program designed to alter priorities to ensure its own continued existence is not at all indistinguishable from a duck. you can't dissect a duck and find the piece that makes it do things like fight for its life--with a program designed to ensure its own survival, you can.


Well, if I use a compiler that obfuscates the code so you can't decompile it, and I trash the source, then you can't point to that piece of the program either. Similarly, if our understanding of how ducks work increases, then someday I may be able to do that for ducks or even people. If I can show the complex network of electrical and chemical signals that corresponds to the self-survival behavior of ducks, does that mean they no longer feel fear?

What I think you're saying is that things which are designed can't have emotions or be aware. That's a perfectly acceptable moral or philosophical position if it's the way you want to go, but it's not one computer scientists tend to use, because it puts computers forever out of that realm, so it isn't very useful for examining computers. It also means genetically engineered organisms are in a very different class, morally, from "natural" ones. It also completely rejects the idea of a creator god.

The other basis for that way of thinking is that you've probably never actually dissected all the ducks or people in the world. Unless your way of life is very different from mine, you accept that the people around you are self-aware because they act like it, not because of their internals. If I could show you two universes, and in one a person is a normal person, driven by electrical and chemical behavior in their brain, and in the other there is a computer running the show inside their skull, and their behavior is always identical, then is one alive and the other not?
mfb
post Dec 26 2004, 06:01 AM
Post #25


what i'm saying is that a program that is designed to mimic fear isn't showing fear when it follows its programming. at best, it's a representation of fear--a painting; it has no self-awareness. in the parallel universes question, the robot is 'alive' only if it is self-aware; if it is simply following its programming, no. it's not alive. if the robot can fool people into thinking it's alive, that's because its programmer did a good job.
