Artificial Intelligence Queries...
Fortinbras
QUOTE (phlapjack77 @ Jun 13 2011, 05:06 AM) *
I think I know why you said this - DS occurs only if the user is in VR, and VR requires simsense, and AIs can't do simsense...can't find the reason AIs can't do simsense tho...

They have no brains for the ASIST to interact with. They don't experience emotions chemically.
Aria
Had another go at an 'Avatar' drone based on the Manservant. I'm trying for a little bit of robustness while still looking human/humanoid.

Manservant 2500¥
+Mimic [R:3] 15000¥
+Touch Sensors 1500¥
+Turbocharger [R:1] 900¥
+OM: Concealed Armour [R:6] (x2) 12000¥

+Response Module [R:5] 4000¥ (with the AI home node bonus, Response 5 + weapon skill shouldn't be bad, and it makes some use of Pilot Origins 1)

Total 35,900¥ (call it 36,000¥)
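
Quick sanity check on the maths (a throwaway Python sketch; prices as listed above, and the armour's x2 is assumed to already be folded into the 12000¥, since that's what makes the total line up):

# Cost check for the Manservant 'Avatar' build; prices as listed.
# Assumption: the Concealed Armour figure already includes its x2.
costs = {
    "Manservant chassis": 2500,
    "Mimic [R:3]": 15000,
    "Touch Sensors": 1500,
    "Turbocharger [R:1]": 900,
    "OM: Concealed Armour [R:6] (x2)": 12000,
    "Response Module [R:5]": 4000,
}

total = sum(costs.values())
print(f"Total: {total}¥")  # -> Total: 35900¥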

Doesn't seem overpowered to me whilst still being useful. Additional extras might be cyberarm gyromounts, etc., for a bit more combat oomph!

Comments?
phlapjack77
QUOTE (Fortinbras @ Jun 13 2011, 06:14 PM) *
They have no brains for the ASIST to interact with. They don't experience emotions chemically.

I guess it seems that at the end of the day, ASIST is just a bunch of electrical impulses, right? Electrical "signals", so to speak? So instead of transmitting those signals to a brain or whatever, those signals are transmitted directly to the AI.

At some point there must be a hardware/software interface between the simsense signals and the brain; this interface translates the signals into something the brain understands. Shouldn't an AI also be able to do this translation, possibly much more intuitively?
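
To make that concrete, here's a minimal sketch of the 'translation layer' idea; every class and method name below is invented, since nothing official describes ASIST at this level:

# Hypothetical illustration only: the same simsense stream feeding
# two different back-ends. All names here are made up.

class SimsenseFrame:
    """One tick of recorded sensorium, as raw digital data."""
    def __init__(self, payload: bytes):
        self.payload = payload

class BrainInterface:
    """ASIST-style back-end: turns digital frames into neural stimuli."""
    def deliver(self, frame: SimsenseFrame) -> str:
        return f"neural stimulus derived from {len(frame.payload)} bytes"

class AIInterface:
    """Speculative AI back-end: consumes the digital frame directly."""
    def deliver(self, frame: SimsenseFrame) -> str:
        return f"parsed {len(frame.payload)} bytes natively, no translation"

frame = SimsenseFrame(b"\x01\x02\x03\x04")
for interface in (BrainInterface(), AIInterface()):
    print(interface.deliver(frame))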
Aerospider
QUOTE (phlapjack77 @ Jun 21 2011, 06:29 AM) *
I guess it seems that at the end of the day, ASIST is just a bunch of electrical impulses, right? Electrical "signals", so to speak? So instead of transmitting those signals to a brain or whatever, those signals are transmitted directly to the AI.

At some point there must be a hardware/software interface between the simsense signals and the brain; this interface translates the signals into something the brain understands. Shouldn't an AI also be able to do this translation, possibly much more intuitively?

The key factor here is that AIs do not have a brain. Or at least, not a biological one. They understand computer language as it is, no need for translation. When you're in VR and you see the icon of an AI moving around and interacting with the sculpting, it isn't experiencing the environment as you do, in that it doesn't see, feel, hear, taste or smell the iconography. All your sensory input when you go VR is just the result of your sim module translating computer code to make it intuitively understandable by drawing parallels with the meat world (in accordance with the sculptor's design). ASIST is for people because they aren't AIs.

So in answer to your question, I doubt that an AI would be able to translate a language it understands (computer code) into a language it doesn't (sensory information) in real time more intuitively than a piece of hardware designed for the purpose (sim module) or that one would ever want to.
Aria
Unless it's an eGhost? Presumably they experience the Matrix as they always have done, and their code supports this?!?
phlapjack77
QUOTE (Aerospider @ Jun 21 2011, 07:30 PM) *
So in answer to your question, I doubt that an AI would be able to translate a language it understands (computer code) into a language it doesn't (sensory information) in real time more intuitively than a piece of hardware designed for the purpose (sim module) or that one would ever want to.

I see your point, and I'll concede it. :)

But it still doesn't sit well with me that a human receiving electronic impulses can benefit from them more than an AI could, when the AI is a being that lives in a world of pure electronic impulses. But I guess it's just a game and all that...

*edit*
QUOTE (Aerospider @ Jun 21 2011, 07:30 PM) *
ASIST is for people because they aren't AIs.

I agree totally. People need assistance to translate these electrical impulses into something they understand. AIs are able to understand them natively. So why do people get the simsense bonuses when using the Matrix, but AIs don't?

Sorry, this idea is now whirling around in my head. Feel free to ignore all this blather forthcoming...

So "simsense" is just simulated, or recorded, sensorium. Essentially all it's doing is bypassing the real-world creation of each sensation. In the real world, there is an apple. The photons from the apple strike your eyes, and your eyes transfer this signal to your brain, and you see an apple. In simsense, there are no photons, it's just the straight signal dumped right into your brain, no sensors (eyes) needed. In both cases, you see an apple ultimately because there is a signal sent to your brain that you see an apple.

If AIs can receive signals from various Sensors (routed through drones or vehicles or something), that means they're able to receive signals about the real world just like a person does. Similarly, it shouldn't matter where these signals originate. If a Sensor sees an apple, the AI in the drone will see an apple. Why then can't the AI receive the same signal from simsense, bypassing the need for a Sensor? Maybe there would need to be a simsense translation package bought for the AI or something, but otherwise...
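
Something like this, in throwaway code (purely illustrative; every name is invented):

# The argument in code: the AI's perception routine shouldn't care
# whether a signal came from a physical Sensor or a simsense file.

def sensor_feed():
    """Signal originating from a real-world sensor seeing an apple."""
    yield {"channel": "visual", "object": "apple", "source": "drone camera"}

def simsense_feed():
    """The same signal, replayed from a recorded simsense file."""
    yield {"channel": "visual", "object": "apple", "source": "sim recording"}

def ai_perceive(feed):
    for signal in feed:
        # Identical handling regardless of where the signal came from.
        print(f"AI sees: {signal['object']} (via {signal['source']})")

ai_perceive(sensor_feed())
ai_perceive(simsense_feed())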

Thanks for listening... err, reading :)
Fortinbras
As I understand simsense, the simrig records brainwaves and pulses, and the sim module and ASIST send those signals to the sensory part of your brain. In the apple scenario, it's sending an electronic pulse that stimulates the visual part of your brain. ASIST turns those simsense signals into brainwaves.
It isn't recording information like a camera or a microphone; it's recording chemical and electrical reactions in the brain.

AIs don't have brains, so that signal doesn't affect them. They can get recorded information from something like a video camera, or cybereyes with an image link, but that pulse which stimulates the brain has nothing to stimulate in an AI.

What we need is someone to map the AI "brain" and create an interface for them so they can experience real-world simsense the way metahumans experience the Matrix. That would be a cool adventure.
phlapjack77
Fortinbras, thanks - your interpretation has helped to quiet the rumblings in my head some.

And if I ever run a SR game again, I'm totally using your idea as a major plot hook smile.gif
Rubic
All an AI would need is the ability to interpret electrical signals for the data they represent. Thought has been codified enough that visual data has an electronic equivalent (visual simsense data). The AI does NOT need eyes to understand data, because at the point where the AI is dealing with it, the signal IS NOT chemical. The necessary translation to the digital has already been made, and at the digital level the AI reads the data the way a brain reads data, except that the brain needs to translate the data further. Corrupt data, IC, and feedback can and should cause damage to AIs comparable to any MEATBAG's, and each AI is inherently living in VR, as much as a full-con-borg.
phlapjack77
I'm thinking it's something like trying to read a Word document with Paint. Both are digital, but they just don't speak the same language.

As Fortinbras said, a translation package for humans has been created: the ASIST / DNI / whatever systems. Creating a translation package for AIs would be a great adventure plot.
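
A toy version of that mismatch (the 'formats' are made up, obviously):

# Both payloads are digital, but a reader built for one format gets
# nothing useful out of the other: the Word-document-in-Paint problem.

def read_as_image(data: bytes) -> str:
    if not data.startswith(b"IMG"):
        raise ValueError("not an image: can't render this")
    return "rendered picture"

doc = b"DOC" + b"hello world"   # a 'document' payload
img = b"IMG" + b"\x00\x01\x02"  # an 'image' payload

print(read_as_image(img))       # fine
try:
    read_as_image(doc)          # digital, but the wrong language
except ValueError as err:
    print(err)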
Rubic
QUOTE (phlapjack77 @ Jun 21 2011, 10:15 AM) *
I'm thinking it's something like trying to read a Word document with Paint. Both are digital, but they just don't speak the same language.

As Fortinbras said, a translation package for humans has been created: the ASIST / DNI / whatever systems. Creating a translation package for AIs would be a great adventure plot.

You're right about that... but it also raises the question of how a newly-exposed AI would react to the simulated bio-assault of Black IC or a computer virus. Computer systems generally have precautions for that, but this could turn any given AI into a sort of 'carrier', similar to HMHVV for metahumans...
phlapjack77
VERY interesting plot point..."beta" version of an AI simsense translation package at large...has some bugs and other strange behaviors to it...PCs are contacted by a Johnson to retrieve / investigate, meetings are always over their commlinks or via VR, never face-to-face...
Fortinbras
AIs are currently immune to Black IC, so a program that could translate simsense could be both a blessing and a curse.
Maybe Cerberus needs to make a reappearance...
Aerospider
QUOTE (phlapjack77 @ Jun 21 2011, 02:00 PM) *
I see your point, and I'll concede it. :)

But it still doesn't sit well with me that a human receiving electronic impulses can benefit from them more than an AI could, when the AI is a being that lives in a world of pure electronic impulses. But I guess it's just a game and all that...

*edit*

I agree totally. People need assistance to translate these electrical impulses into something they understand. AIs are able to understand them natively. So why do people get the simsense bonuses when using the Matrix, but AIs don't?

Sorry, this idea is now whirling around in my head. Feel free to ignore all this blather forthcoming...

So "simsense" is just simulated, or recorded, sensorium. Essentially all it's doing is bypassing the real-world creation of each sensation. In the real world, there is an apple. The photons from the apple strike your eyes, and your eyes transfer this signal to your brain, and you see an apple. In simsense, there are no photons, it's just the straight signal dumped right into your brain, no sensors (eyes) needed. In both cases, you see an apple ultimately because there is a signal sent to your brain that you see an apple.

If AIs can receive signals from various Sensors (routed through drones or vehicles or something), that means they're able to receive signals about the real world just like a person does. Similarly, it shouldn't matter where these signals originate. If a Sensor sees an apple, the AI in the drone will see an apple. Why then can't the AI receive the same signal from simsense, bypassing the need for a Sensor? Maybe there would need to be a simsense translation package bought for the AI or something, but otherwise...

Thanks for listening... err, reading :)

Simsense is artificial. If you have a real, physical apple, then to see it a human brain needs eyes and an AI 'brain' needs cameras. Simsense makes a human brain think it sees an apple by translating man-made code into the relevant cranial stimuli. To do the same for an AI you already have code it will understand, so you just give it that. The AI doesn't literally see through cameras any more than people do: the sensor turns photon-collisions into electronic data, and whilst that's enough for the AI, the human needs it turned into something else, even if it's back into a bunch of photons shooting out of a trideo for the eyes to then translate for the brain.
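
Schematically (step names invented for illustration):

# Both chains start from the same photon-collisions, but the human
# chain needs one extra translation hop. Purely a sketch.

def sensor(photons: str) -> str:
    return f"electronic data({photons})"        # camera: photons -> data

def sim_module(data: str) -> str:
    return f"cranial stimuli({data})"           # extra hop, humans only

apple = "photons from apple"

ai_percept = sensor(apple)                  # the AI stops here
human_percept = sim_module(sensor(apple))   # the human needs one more step

print("AI consumes:   ", ai_percept)
print("Human consumes:", human_percept)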
Aerospider
QUOTE (Fortinbras @ Jun 21 2011, 05:25 PM) *
AIs are currently immune to Black IC, so a program that could translate simsense could be both a blessing and a curse.
Maybe Cerberus needs to make a reappearance...

How would Black Hammer/Out affect an AI any differently to Attack? Matrix damage is the only damage it can suffer. You'd have to go a long way to devise a more fearsome AI-killer than Nuke.
Fortinbras
QUOTE (Aerospider @ Jun 21 2011, 12:38 PM) *
How would Black Hammer/Out affect an AI any differently to Attack? Matrix damage is the only damage it can suffer. You'd have to go a long way to devise a more fearsome AI-killer than Nuke.

QUOTE (SR4a p.233)
Black Hammer is intended as a weapon against hackers in full VR using hot sim, causing Physical damage rather than Matrix damage in cybercombat (p. 237). Against cold sim VR users, it only inflicts Stun damage. It has no effect on programs, agents, IC, sprites, or AR users.

QUOTE (Runners Companion p.89)
The only Condition Monitor that applies to a metasapient is the Matrix Condition Monitor.


Black IC does Physical or Stun damage, not Matrix damage. AIs can only take Matrix damage. AIs are immune to Black IC because they can't experience simsense.

A simsense file is a series of commands written by and sent to a metahuman brain. The ASIST allows it to be written. A Matrix entity can interpret it as a simsense file (it's not invisible) but can't experience it.

Think of it as a file that can only be read and written by a certain program, like MS Word (poor example, but I don't know computers). The brain is MS Word, and the sim module and sim rig are the computer that allow it to be written. So even if you have a computer, if you don't have MS Word (the brain) you can only read the source code, not the file.

Again, sorry for the poor metaphor, computer people, but I'm trying my best.
Rubic
QUOTE (Fortinbras @ Jun 21 2011, 01:06 PM) *
Black IC does Physical or Stun damage, not Matrix damage. AIs can only take Matrix damage. AIs are immune to Black IC because they can't experience simsense.

A simsense file is a series of commands written by and sent to a metahuman brain. The ASIST allows it to be written. A Matrix entity can interpret it as a simsense file (it's not invisible) but can't experience it.

Think of it as a file that can only be read and written by a certain program, like MS Word (poor example, but I don't know computers). The brain is MS Word, and the sim module and sim rig are the computer that allow it to be written. So even if you have a computer, if you don't have MS Word (the brain) you can only read the source code, not the file.

Again, sorry for the poor metaphor, computer people, but I'm trying my best.

I'd say it's more like the brain is Windows, while the AI is Linux. Linux has programs that can translate/process the information (sufficiently to read the data), but command codes that work in one will not work in the other. Because of this, Black Hammer/Out function like viruses for the metahuman brain (Windows), while not affecting an AI (Linux). "I understand what that attack was saying to me, but it didn't affect my code."
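
Sketching that in toy form (the payload and 'platforms' are invented, not anything from the books):

# The same attack payload is readable by both targets, but only
# 'executes' against the platform it was written for.

BLACK_HAMMER = {"format": "simsense", "target": "meat_brain", "effect": "Physical damage"}

class Target:
    def __init__(self, name: str, platform: str):
        self.name, self.platform = name, platform

    def receive(self, payload: dict) -> None:
        # Both targets can parse the payload; it's just data...
        print(f"{self.name} reads a {payload['format']} payload.")
        # ...but it only takes effect on the platform it was written for.
        if payload["target"] == self.platform:
            print(f"  -> {self.name} suffers {payload['effect']}!")
        else:
            print("  -> no effect: wrong platform.")

Target("Decker", "meat_brain").receive(BLACK_HAMMER)
Target("AI", "code").receive(BLACK_HAMMER)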
Aerospider
QUOTE (Fortinbras @ Jun 21 2011, 07:06 PM) *
Black IC does Physical or Stun damage, not Matrix damage. AIs can only take Matrix damage. AIs are immune to Black IC because they can't experience simsense.

A simsense file is a series of commands written by and sent to a metahuman brain. The ASIST allows it to be written. A Matrix entity can interpret it as a simsense file (it's not invisible) but can't experience it.

Think of it as a file that can only be read and written by a certain program, like MS Word (poor example, but I don't know computers). The brain is MS Word, and the sim module and sim rig are the computer that allow it to be written. So even if you have a computer, if you don't have MS Word (the brain) you can only read the source code, not the file.

Again, sorry for the poor metaphor, computer people, but I'm trying my best.

Dude, I do know this stuff and hope that perhaps my posts in this thread have helped explain it for others. But I must have misunderstood you because your previous post reads as though you were speculating with others about creating an adventure campaign around some hypothetical AI simsense system that would somehow make Black programs a problem for them.
Fortinbras
Yeah. Doing so would totally change the current rules as written. Sort of like what Dreamchipper did with simsense, what Dragon Hunt did with dracoforms using VR, or what Tempo did with mundanes viewing the astral.
It would be a total game changer, which is why I think it would be interesting.

I think it would be more interesting if the PCs get to decide whether to destroy the thing or put it on the Matrix for all to see, but that's just me.