Full Version: Datajacks with ASIST Converter
Dumpshock Forums > Discussion > Shadowrun
Cray74
On page 19 of Man & Machine, there is an option for Datajacks: the ASIST converter. This lets the datajack translate raw simsense signals from the Matrix and whatnot.

Can this also supply visual (and other) simsense signals from other sources, like to replace the display portion of a Smartlink or to replay video from a slotted chip or something? i.e., can it do the job of an Image Link and subdermal speakers?
Ancient History
An ASIST means you can process simsense signals, not video or audio data (there's a difference).
TheScamp
Also, simsense data is rather overwhelming; it doesn't simply overlay your field of view, it pretty much becomes your field of view. You can technically see past the data and through your regular eyes, but at a major penalty, so its usefulness as a display link substitute is somewhat lacking.
Cray74
QUOTE (Ancient History)
An ASIST means you can process simsense signals, not video or audio data (there's a difference).

So the simsense signals from the eyes and ears of an actor in the simsense movie you're watching, though they carry visual and audio data, are not visual and audio data?

Do you have a rules or book reference showing that an ASIST interface cannot be used to play back plain video and/or audio? It would seem that ASIST, the breakthrough for machine-brain interfacing, is found in limited forms in the Image Link, Transducer, Smartlink, and other cybernetic means of modifying your sensory data.

Deckers that get into a slave node controlling security cameras can look at the video and audio signals from those cameras in virtual windows, right?

QUOTE (TheScamp)
Also, simsense data is rather overwhelming; it doesn't simply overlay your field of view, it pretty much becomes your field of view


That's fine, for the most part. If I was playing back a video clip or writing a report on my wrist computer on a long flight, I wouldn't mind my visual and auditory perceptions dominated by the movie or report.
Ancient History
Simsense is NOT simply video and audio data. That's the point behind simsense: it contains full sensory and emotional data. You're not watching a movie like you would with image link and subdermal speakers, you're experiencing the movie. Very different data format, and what makes things like decking and rigging possible.
Siege
Yes! Someone else is thinking the same thing!

Seriously though, if simsense can manifest as visual or audio input ("as someone seeing something"), why couldn't that be distorted to represent a video screen?

As for differing formats, couldn't a standard audio/visual signal be processed into a limited form of simsense?

Instead of a full sensory override, how about limited sensory hallucinations that overlay existing senses?

-Siege
BlackSmith
ASIST hijacks your sense(s), while an image link/display link adds external data to your regular field of vision.

It's like when you're looking at a picture hanging on the wall, and you place another picture over one corner of it.
Using ASIST to experience the same picture would make you feel, taste, hear and see only that picture, and nothing else.
Siege
True, for standard ASIST.

However, what if we limited that ASIST feed to exclude tactile, olfactory and audio signals and _include_ only highly specific elements of the visual feed?

-Siege
Cray74
QUOTE (Ancient History)
Simsense is NOT simply video and audio data.  That's the point behind simsense: it contains full sensory and emotional data. 

So, by that logic, a rigger riding a drone gets everything, including the drone's emotions?

And a rigger in "captain's chair mode" is receiving a full immersion experience of all the drones in his network?

And a decker that stops at a Matrix site that offers showings of old 2D black-and-white movies gets the full immersion experience of sight, sound, taste, and emotions out of those old B&W movies?

And a decker examining an ethereal, tasteless, scentless, inert object (representing a file, for example) in the matrix is going to get some sort of emotional, tactile, taste and smell sensation off it?

Riiiight.

Those are obviously cases where simsense signals do not consist of all senses. A rigger riding drones does not experience the drone's emotions, because it has none. A rigger in captain's chair mode does not get blasted with all the input from all his drones. And a B&W film shown on the Matrix does not magically gain full sensory and emotional data.

Since there are obviously cases where simsense does not come with data from every sense, reasonably, an ASIST converter can take signals for a few senses (visual and audio) and pump them into those senses of the user.
BlackSmith
QUOTE (Siege)
However, what if we limited that ASIST feed to exclude tactile, olfactory and audio signals and include only highly specific elements of the visual feed?

Audio would be quite easy (just let through only certain frequencies), but please tell me how you can limit your visual feed? I have only limited medical education, so I can't figure out a way to limit a visual feed.
Siege
Keeping in mind that this is all hypothetical technology...

What if the simsense feed only has one signal? Say, a visual component? And that visual component happens to be, say, the Mona Lisa.

That's it, just the Mona Lisa. We use the modified ASIST to embed the image of the Mona Lisa as a slightly translucent image floating in our field of vision. The signal or image doesn't have to be exact -- we've all seen television with bad signals or bad pictures of things we know to be different in real life.

-Siege

Cray74
QUOTE (BlackSmith)
Audio would be quite easy (just let through only certain frequencies), but please tell me how you can limit your visual feed? I have only limited medical education, so I can't figure out a way to limit a visual feed.

That's easy, SR cyberware applies limited visual signals all the time:

1) Crosshairs from smartlinks don't completely obscure your vision
2) Retinal clocks put numbers up without completely obscuring your vision
3) Presumably, Image links could put up limited "windows" of videos over your vision, blocking out a portion of your view
4) As Siege suggested, translucent overlays

etc.

I'm going to ask Fanpro. From this limited sampling of responses, there doesn't seem to be any hard and fast rules answers on the board.
Siege
There isn't -- I tinkered with the same thing, using simsense and a trode rig with strictly external hardware, but there was never a definitive response either way.

I'll be interested to hear what they say.

-Siege
BitBasher
QUOTE
And a rigger in "captain's chair mode" is receiving a full immersion experience of all the drones in his network?
No, because ASIST isn't needed for captain's chair mode; that can be done with just a datajack.
Kagetenshi
QUOTE
So, by that logic, a rigger riding a drone gets everything, including the drone's emotions?


Yes, exactly.

QUOTE
And a rigger in "captain's chair mode" is receiving a full immersion experience of all the drones in his network?


Not quite; they make it clear that Captain's Chair is very much not being jumped into a large number of drones. I see what you mean, though, and more or less the answer IMO would be yes.

QUOTE
And a decker that stops at a Matrix site that offers showings of old 2D black-and-white movies gets the full immersion experience of sight, sound, taste, and emotions out of those old B&W movies?


Yes, definitely.

QUOTE
And a decker examining an ethereal, tasteless, scentless, inert object (representing a file, for example) in the matrix is going to get some sort of emotional, tactile, taste and smell sensation off it?


Yep.

The thing is, some of these things are going to be nonexistent. The rigger doesn't stop receiving the emotional track; it's just that the emotional track is blank. Same with the black-and-white movie example. It's just like watching a silent movie on your home VCR: your VCR doesn't stop reading sound data, it's just that the sound data happens to be blank.
As I see it, the difference between simsense and visual data a la image links, retinal clocks, etc. is that the former goes into your brain and overrides the information coming from the various nerves, while the latter just appends information to the signal riding along the optic nerve.

~J
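The "blank track" model above, together with Siege's proposed selective feed, can be sketched as a toy data structure. This is purely illustrative: the thread's model is simsense as a fixed set of per-sense channels that are always present but may be blank, and a modified ASIST that blanks every track the user opts out of. All names here are invented for the example.

```python
from dataclasses import dataclass, field

# The sense channels the thread discusses; a signal always carries
# all of them, even when a source (a drone, a B&W film) has nothing
# to put on some of them.
SENSES = ("sight", "sound", "touch", "smell", "taste", "emotion")


@dataclass
class SimsenseSignal:
    # Every channel exists; None means "track present but blank",
    # like the blank emotional track of a rigger's drone feed.
    tracks: dict = field(default_factory=lambda: {s: None for s in SENSES})

    def set_track(self, sense, data):
        self.tracks[sense] = data

    def limited_feed(self, keep):
        # Siege's hypothetical modified ASIST: pass through only the
        # chosen senses, blanking everything else rather than
        # removing the channels.
        out = SimsenseSignal()
        for s in SENSES:
            out.tracks[s] = self.tracks[s] if s in keep else None
        return out


# A B&W silent movie: only the sight track carries data, but the
# sound/taste/emotion channels still exist (blank), per Kagetenshi's
# silent-movie-on-a-VCR analogy.
movie = SimsenseSignal()
movie.set_track("sight", "b&w frames")

# A limited feed that keeps only sight and sound.
limited = movie.limited_feed(keep={"sight", "sound"})
```

The point of the sketch is just that "no taste data" and "no taste channel" are different claims: the channel set is fixed, and filtering blanks tracks instead of deleting them.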
Drain Brain
Or, to continue the metaphor, ASIST changes your TV channel from 1 (your normal senses) to 2 (whatever is being piped in), whereas an image link, smartlink, display link, retinal clock -- whatever -- is like setting up a second TV next to the original so you can see both channel 1 and channel 2 at once.

hehehe
BlackSmith
QUOTE (Cray74)
That's easy, SR cyberware applies limited visual signals all the time:

1) Crosshairs from smartlinks don't completely obscure your vision
2) Retinal clocks put numbers up without completely obscuring your vision
3) Presumably, Image links could put up limited "windows" of videos over your vision, blocking out a portion of your view
4) As Siege suggested, translucent overlays

etc.

I'm going to ask Fanpro. From this limited sampling of responses, there doesn't seem to be any hard and fast rules answers on the board.

Yes, that hardware is specially designed for it, but ASIST is not.

Like I said...
ASIST is designed to override/hijack your senses and block out all their other output.
A display/image link adds something to an already existing sensation.

I'll try a different angle.
Say you are eating breakfast.
A display/image link adds some flavor to it and changes it a bit.
ASIST changes the whole breakfast, so instead of cereal you're eating eggs and bacon.

And ASIST does not have "bad" images. Your visual sense is stolen by the ASIST interface, so even if you keep your eyes open you simply don't see anything.
Like in the movie The Matrix: all data comes straight to the brain, skipping the real senses entirely.

Thus ASIST could not blend images, sounds, or anything else from both worlds without hugely distracting the user.
Siege
B. Smith, you're assuming that ASIST must be an "all or nothing" gamble.

Either you get _everything_ from the simsense signal or you get nothing.

Cray and I think that instead of _everything_, a character can opt to receive only certain parts of the signal.

Or the signal itself may not include all possible tracks such as sound, smell, taste, etc.

Another hypothetical -- the general simsense tech can be modified in the manner described -- to weave select signals (feeds, tracks) into existing perceptions.

It seems reasonable, given what we know about this pseudo-tech. That being said, we'll wait to see what the official ruling is on the matter.

-Siege
TinkerGnome
QUOTE (Siege)
B. Smith, you're assuming that ASIST must be an "all or nothing" gamble.

I'd be interested in hearing an example where this isn't the case, actually. The only simsense I've seen in the books is a pretty full sensory overlay that gives you a +8 TN modifier to all actions if you're getting dual feeds for some reason.

Cray's first three examples are all retinal modifications which project the image into your normal field of vision, so they're not really examples of partial simsense.
Siege
That's the point of this -- we're taking existing (fictional) tech and experimenting with what can and can't be done.

If the official ruling says "simsense is all or nothing" then the idea is interesting but impossible. But as there hasn't been an official ruling as yet...

Yes, the retinal displays (image, display links) could do all these things.

But we're proposing another way of doing it -- one that hasn't been considered before. Why use electric refrigerators when we could still use blocks of ice?

Saying, "well, it's never been done before" doesn't rule out "it can't be done".

-Siege
Entropy Kid
Don't have any books with me, so some of my rule references might be off. Having said that, I've always interpreted a smartlink as making the gun part of the shooter. Goggles give a -1 TN while a full smartlink gives -2 because of this difference; the limited simsense rig is what enables it. I personally don't think smartlinks have an eye display (a house ruling) and that it's the simsense rig that displays the crosshair, in the shooter's brain and not the eyes. If it were actually the eyes, then why the extra TN reduction for a cyber smartlink over the goggles? They're doing the same thing.

The examples given describe ASIST as all-or-nothing sensory-wise, and that's how I think a normal ASIST converter would work. I don't, however, see any reason why one couldn't be designed to display limited and less distracting information. It would cost less if it were like the system in a smartlink, or more if like a cyberdeck with selective display. The 'ware would also have to be connected to its input source (I'm guessing a datajack, but who knows).
Kagetenshi
But they do have an eye display. Man and Machine explicitly states it.

~J
Glyph
Actually, Siege, I think the ASIST link is probably "all or nothing". It is a full immersion of sensory data, which prevents other actions unless you also have an RAS override. It is far different from a retinal display.

It is primarily used by otaku to operate in the Matrix without a deck, and is mainly useful for such characters. Similarly, the invoked memory stimulator is only of use to cyberzombies. Both technologies have a lot of potential for additional uses, but that, IMO, should fall under the category of technological breakthroughs that occur during the campaign, rather than at the start of play.

**EDIT**
By the way, if you look at the components of a smartlink in Man & Machine, you will see that it includes both a retinal display and a limited simrig.
Siege
I'd also point out there are a lot of logical extensions of tech that aren't covered by the relatively few sourcebooks released.

Which means a character could make a case for non-standard gear that people working in the profession might take for granted on a daily basis.

-Siege
Sonomancer
..... Ehm, 'scuse me if I'm wrong, but ASIST seems to just be a conductor that makes raw data comprehensible when piped into the brain. It's the RAS override that does the sense hijacking.
A person can jack into a car with a datajack and wind up with all sorts of visual overlay, right? A VCR, on the other hand, takes the next step into total immersion.
Entropy Kid
QUOTE
It's the RAS override that does the sense hijacking.
Pg. 21 of M&M has the description of the RAS override, which, you're right, is why there is a +8 TN; without it there is only "up to a +4 modifier." According to the description, it limits input from the real senses and movement. It's built into most sim devices (VCRs, cyberdecks, etc.), but something designed to work like a display link either wouldn't have it or would let the user turn it off.

QUOTE
But they do have an eye display. Man and Machine explicitly states it.
I know (pg. 33 M&M); that's why I specified that my interpretation of smartlinks was a house rule.
Kagetenshi
Ah. I'd somehow managed to miss that note.

~J
Cain
The RAS override is the key difference, I think. As Entropy kid pointed out, most simsense devices have one built in, to prevent people from moving around while using simsense. That seems to lend credence to the theory that ASIST usage overrides the existing senses.

Meanwhile, Kagetenshi managed to hit my other points. If you're watching a silent TV show on your plasma-screen TV, you're still receiving a sound channel; it's just blank. If you're looking at something that has no audio/olfactory/tactile data, that doesn't mean you're not using the bandwidth to receive all that data; it's just that those signals are idle.
BlackSmith
I was worried for a sec here...
Like I said, the RAS override shuts down your muscle nerves to keep you from walking around the house while in the Matrix, but nothing more.
ASIST handles the sense feed, but it does not interact with your muscle system. Your brain sends a message to punch, and in the Matrix you hit the IC, but without the RAS you're hitting the wall at home at the same time; because of the ASIST, you won't feel the pain and broken bones in your hand.

And I was never saying that everything DOES have a taste or other sensation.
If it has one but you are lacking it, you are not getting the full 100% effect, so you can't respond 100% to the situation.

IF it were something other than "all or nothing," how effective would it be then?
Because without the direct neural feed, you are turning some of your senses "cold" to get the same data.

Replacing your limited simrig with ASIST might work, but then you'd need modifications; otherwise you would only feel what the gun "feels," or what you feel.
Limited simsense enables you to feel both at the same time.

Who said this conversation is hypothetical?
Everyone here has experienced ASIST.
No, I'm not kidding.
Right now you're using your eyes to read this and your ears to hear that music.
3... 2... 1...
You're asleep.
You're dreaming about having PERFECT sex with a PERFECT woman.
...While your eyes see complete black, all images are fed directly from your imagination to your experience system, specifically to your sight/touch/sound/smell.
You still move yourself, but your SENSES are overridden by the dream.
Moving your hand/taking a leak/farting is NOT a sense; touching/smelling/hearing something IS.

...And that's the wonderful world of ASIST.
Thanks for your attention.
Next time, put your RAS on so you might have dry pants.
Siege
ASIST is technology used to convert, communicate or interpret signals based on organic perceptions.

So no, this discussion is very hypothetical unless you're admitting to having cyber-augments.

-Siege
BlackSmith
... convert, communicate or interpret signals based on organic perceptions.

Your memory is converted into something you can feel and touch and taste while you sleep.

You got a problem with cyberlegs?
Siege
Sleep and dreaming are still organic processes -- ASIST might be a comparable parallel, but it's not the same thing.

And while modern prosthetics are marvels, cyberlegs they're not. Unless they've improved nerve-splicing and artificial stimulation while I wasn't looking.

Which, to be fair, is a distinct possibility.

-Siege
BlackSmith
OK.
I'm ready to hear your version of it.
If the way the senses transport their info to the brain, and how it's processed there, has changed while I wasn't looking, I'd like to hear your version.
Siege
ASIST is artificial -- sleep is not.

-Siege
BlackSmith
And if you are in a coma?
BitBasher
A coma is still your body's own reaction to some abnormal stimulus.

Let's face it: this is a GM's call, because it isn't addressed either way in the rules. I lean towards no, because ASIST is, by the acronym, Artificial Sensory Input. It is actually injected senses.
BlackSmith
A coma is artificial sleep.
You don't have brain activity, but your body still reacts to sense impulses.

ASIST does just the opposite: it shuts down the senses and translates brain activity into data.
CanvasBack
Basically, this is a discussion about simsense, right? Because when it comes down to it, deckers and riggers are having a simsense experience when they do what they do. The nature of their tasks and their hardware (plus software, for deckers) pretty much dictates what they get out of the experience. In either case, you're not going to be doing much else. Total immersion.

Think about the difference between regular simsense entertainment chips and BTLs. Certain governments have a problem with BTLs because their high sensory input can be very dangerous/addictive. They scale back the input so that, hopefully, after the experience is over you'll return to being a content SINner who pays his taxes instead of a crazy lunatic who's a drain on the system. Aside from the government, the only things limiting how much sensory information you get are the actual production quality (how tricked out was the simrig the actor/actress was equipped with? How good was the sim-star?) and the quality of your own chip reader. The whole idea of chipping an experience is to sit back and enjoy the ride; even if you were able to, why would you want to get up and do other stuff? With BTLs it's pretty much impossible anyway. Again, total immersion. But it's scalable.

On the other hand, with a datajack you can jack into your car or truck and have access to a virtual dashboard that allows for "hands-free driving," without having to go all the way in and assume the body of the vehicle like a rigger with a VCR. Is the virtual dashboard simsense? It sort of has to be, since it isn't actually there... Not total immersion.

So, for my 2 nuyen, it all depends on the nature of the hardware. Simsense is a scalable technology that can work with other systems in some instances, but its normal use is a total-immersion experience, and too much input can fry your brain.
Kagetenshi
It's simulated sense, but it isn't SimSense™.

~J
Siege
That's an interesting point:

What is the difference between "simulated sense" and "Simsense"?

The virtual dashboard only requires a datajack (or a trode rig) without the driver having an image link or a display link.

-Siege
BitBasher
Because the expensive datajack adaptation for the car already has an ASIST converter built in, so you don't need a cyber one, thus allowing people with minimal cyber to use that interface.

EDIT: Also, IIRC, the ASIST converter for a datajack just allows UMC code from the Matrix, which is not actually an ASIST feed, to be converted into an ASIST feed for the user to interface with. This is needed because when jacking into a cyberdeck, remote deck, vehicle with an adapter, or simchip, the incoming signal is already an ASIST signal that the brain can interpret via datajack. In the case of decking like an otaku, the Matrix signal is not natively ASIST; normally your deck interprets it into that. It's naturally just data your mind cannot interpret, and that's what the ASIST converter handles in that case.

That word, I do not think it means what you think it means.
Kagetenshi
"Simulated sense": sight is a sense. The datajack port is overlaying sights that aren't there.

~J
Arz
I think this topic is fast broadening beyond its stated intent. So I'll add more.

Many of the tasks you are talking about have little to do with the intent of this very vague device, the ASIST converter, which is intended solely for otaku, much the same as the Memory Stimulator is for cyberzombies.

I agree that these tasks should be accomplished through ASIST but honestly feel that you should work on your own original implants to do them.

Along these lines I've been tinkering with a redesign similar to the Smartlink component breakdown for the implants for decking, rigging, and simsense. I'll post my old notes if anyone wants them.

I think many of your concerns should be covered in another SOTA book that isn't just a reprint of old ideas. Shadowrun needs to move forward.
Siege
ARGH! "Princess Bride" quotes where one doesn't expect to find "Princess Bride" quotes!

OK, so the technology does exist for this "virtual sense" to be accessed through either a datajack or a trode rig.

And this signal can be highly specific -- in the example, the car "virtual dashboard" without including things like the car's emotional context at the time.

So the answer is essentially, yes: you can replace the image link/display link with external hardware routed through either a datajack or a trode rig.

-Siege

This is a "lo-fi" version of our main content. To view the full version with more information, formatting and images, please click here.
Dumpshock Forums © 2001-2012