QUOTE (Ka_ge2020 @ Feb 5 2021, 12:30 AM)

Because I'm trying to get my head around the technology from a setting-first principle. The mechanics should reflect the setting and reinforce it, not provide their own reality just because.
But they do, except where you are unwilling to accept VR for rigging. It's not "reasons", and it's not "just because": it's based on popular cyberpunk tropes and, in the case of most 'ware, on consultations with actual experts on the topics during the game's early days.
QUOTE
This is why I'm way more interested in the lore than the mechanics. The lore defines how it works, the mechanics define what dice you roll and, with Shadowrun, there's a lot of momentum in how mechanics represent things in the setting that seem based more on what was before than what might necessarily make more sense.
You are right, Shadowrun is a massive beast, especially in terms of lore. Most of the time (and up to a certain point that I won't rehash, because it's a different one for different people) I feel that it's internally logically consistent in how that is translated into mechanics.
You and your group seem very intent on the details, but I have learned in my time as a roleplayer that this is often when fun takes a backseat and obsession becomes the name of the game. This isn't meant as an accusation or judgement on you, merely as a caveat.
There are many points where game crunch breaks down when confronted with game fluff. HP systems like D&D's come to mind, and how Coup de Grâce works (or doesn't); the metaphysical nature of Edge or Karma Pool dice; and, of course, the wireless Matrix (which is completely bonkers from an IT point of view), or the wireless bonuses in SR5, which apparently work exactly the opposite of how they were intended and in any case can't really be quantified with real-world physics.
The point I'm trying to make is: at some point, marrying fluff and crunch (or rather, scrutinizing that marriage too closely) makes the logic unravel. Some gamified mechanics just need to be accepted, or enjoyment will suffer. If the suspension of disbelief isn't sufficient, any piece of media entertainment breaks down.
QUOTE
SR4 seems to indicate that there is a difference between the two as it offers the difference between "something" and hot sim/full immersion. I thought that I had seen a negative modifier on AR control but that might also be one of the issues with juggling numerous systems. (I'm currently working with 4-5 to get this game running.)
It's possible you saw the distraction modifier on other tests. If you're busy juggling four drones on virtual monitors in front of your eyes, you might be distracted from the sharp bend coming up.
Here's how that works: in AR, you can imagine the viewports of those example drones floating in your field of vision, so real-world input is noticeably reduced. If you do the same in VR, you are not distracted: you have a direct brain connection with no detour via your eyes, and the same viewports are just another "sense". A rigger doing that can fully drive his car in VR with the corresponding bonuses (different as they may be, depending on edition) and the VR initiative, and command his drones (keyword: Captain's Chair mode) without being distracted by it. That is, apart from the necessity of spending actions on those things, which is why it can still be useful to have more crew, e.g. a gunner, in a vehicle. It's just another system that is part of the body at that point, an additional limb, if you like. It's completely natural, but if you want to flex it, you need to pay a bit of attention to it.
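If you want that distraction idea in concrete terms, here's a tiny sketch. The -2 penalty, the mode strings, and the function name are my own illustrative placeholders, not quoted rules:

```python
# Toy sketch of the distraction idea above: open drone viewports
# penalize a dice pool in AR but not in VR. The -2 value and the
# function name are illustrative placeholders, not rules text.

def driving_pool(base_pool: int, mode: str, open_viewports: int) -> int:
    """Dice pool for a driving test while monitoring drone feeds."""
    distracted = mode == "ar" and open_viewports > 0
    return max(0, base_pool + (-2 if distracted else 0))

print(driving_pool(10, "ar", 4))   # distracted by the floating feeds
print(driving_pool(10, "vr", 4))   # same feeds, no distraction in VR
```

The point of the asymmetry: in VR the feeds arrive as a native sense, so no pool gets reduced.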
QUOTE
With that said, I had imagined that for the most part that at least "secondary" vehicles/drones would be interfaced through AR to issue commands (etc.) and only when you needed to hot-sim would you truly jump into VR.
You almost got it right, yes. You can control a drone or swarm of drones in AR, with the drawback of not having the corresponding additional initiative passes, and thus less efficiency (unless you also have wired reflexes, a synaptic accelerator, drugs, or similar "real-world" initiative enhancers). You give them commands by spending actions, either as a group or individually (by spending more actions), and you can jump into full VR to ASSUME DIRECT CONTROL (Mass Effect 2). However, going full (cold or hot) sim doesn't remove your ability to command the other drones this way. It is better in almost every way, except for the long-term addiction issues and the vulnerability of exposing your meat brain to the potentially deadly feedback from damage taken by your vehicle.
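To make the action-economy trade-off concrete, a toy comparison, assuming SR4-style initiative passes (1 for the meat body in AR, 2 in cold sim, 3 in hot sim) and two Simple Actions per pass. Treat the numbers as illustrative, not as rules citations:

```python
# Toy comparison of rigger control modes, using SR4-style numbers:
# 1 initiative pass in AR, 2 in cold sim, 3 in hot sim, with two
# Simple Actions per pass. Illustrative only, not quoted rules.

BASE_PASSES = {"ar": 1, "cold_sim": 2, "hot_sim": 3}

def passes(mode: str, enhancer_passes: int = 0) -> int:
    # Wired reflexes and similar enhancers speed up the meat body,
    # so in this sketch they only apply in AR mode.
    return BASE_PASSES[mode] + (enhancer_passes if mode == "ar" else 0)

def simple_actions(mode: str, enhancer_passes: int = 0) -> int:
    return 2 * passes(mode, enhancer_passes)

for m in ("ar", "cold_sim", "hot_sim"):
    print(m, simple_actions(m))
# An AR rigger with wired reflexes 2 closes the gap on hot sim:
print("ar + wired 2", simple_actions("ar", 2))
```

This is why the AR rigger without "real-world" initiative enhancers simply gets fewer chances per combat turn to issue orders.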
QUOTE
From a lore perspective, one imagines that the AR perspective is based on having to physically interact with perceived icons (AROs?) rather than just "think" your way through it?
Yes and no. To see, hear, and touch AR, you usually need corresponding systems: AR gloves, headphones or earbuds with an audio link, or goggles, glasses, or contacts with an image link as peripherals to physically interact with those AROs.
Or you have those systems implanted: an audio link in your (cyber-)ears, an image link in your (cyber-)eyes. A datajack as a DNI source then lets you interact with them by just thinking about it, much like a transducer could (before SR4) turn thoughts into communication. A datajack (from SR4 onward) includes that tech, so you can compose text messages, touch AROs, and place calls by thought alone, without speaking.
QUOTE
It's not a bad analogy, though, because it's essentially differentiating between AR view and FPV view (as it were; more direct control of a single object).
Glad you got exactly what I meant. Where it breaks down is that the viewport of a particular drone would, of course, be the FPV of that particular drone, and in hot sim it becomes your own FPV.
QUOTE
If you hot sim into a vehicle/drone, what happens to your ability to act with the other drones? Are you leaving them to their own "AI" software (or "dogbrain" is how I believe SR4 refers to it)?
Basically, yes. Whenever you give commands to drones via AR (or in VR in command mode), you only give them basic orders like "attack that" or "follow me", and the dogbrain (i.e. the Pilot autosoft, the Sharpshooter autosoft, etc.) takes care of the details. You can also directly control a drone in AR, of course, but the drone will usually be faster on its own if you don't have initiative enhancers.
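A rough sketch of the "dogbrain" division of labor: the rigger issues a short order, and the drone's Pilot/autosoft layer fills in the details. The class, the order strings, and the autosoft mapping are my own illustration, not game terms beyond what the books name:

```python
# Toy sketch of the "dogbrain" idea: the rigger issues a high-level
# order, and the drone's Pilot/autosoft layer fills in the details.
# Class layout and strings are my own illustration, not rules text.

class Drone:
    def __init__(self, name: str, pilot_rating: int):
        self.name = name
        self.pilot_rating = pilot_rating  # the "dogbrain" quality

    def command(self, order: str) -> str:
        # The Pilot program translates a vague order into behavior;
        # a higher rating means a smarter interpretation of it.
        plans = {
            "attack that": "acquires target and fires (Sharpshooter autosoft)",
            "follow me": "plots escort course (Maneuver autosoft)",
        }
        detail = plans.get(order, "hovers, confused (order not understood)")
        return f"{self.name} (Pilot {self.pilot_rating}): {detail}"

swarm = [Drone("Doberman", 3), Drone("Lynx", 4)]
# One action can command the whole group with the same order:
for line in (d.command("follow me") for d in swarm):
    print(line)
```

The fallback branch is the point: a dogbrain given an order outside its autosofts does something dumb, which is why riggers jump in for the tricky parts.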
QUOTE
Basically because of the almost mystical qualities associated with "hot sim" and how it relates to what is happening in the meat world.
But it's not mystical. It's a well-understood technology in the fluff, explained with actual examples from our real world. Unlike magic (which is also mostly internally consistent, thanks to pretty strict lore rules), cyberware and virtual-reality interaction is, in most cases, a fairly realistic extrapolation of current technology. The question is usually not how it logically works, but how much it costs in money and Essence compared to the benefits the system in question provides.
QUOTE
Both don't seem to address actual functionality behind the 1980s mythology about how VR will dominate the world. (Which, in the real world, seems to be the complete opposite.)
Well, the real world has made huge strides in VR technology in the last 4-5 years. I personally own an Oculus Rift, and it's a big pair of goggles. However, it is now pretty comfortable, and I can wear it for hours. Naturally, I can't run around with it, because I'm blind to the real world while wearing it. My own movement impulses aren't overridden, though; visual input is simply replaced by the screens in front of my eyes, and outside audio is dampened while the headphones provide louder input. Noise-cancelling tech could potentially remove outside audio entirely in the future.
Nevertheless, it is still a clunky, wired thing that needs to be plugged directly into my computer and requires some sensors set up around the play space to fully function. Furthermore, it is a pain to use a keyboard and mouse with it, so I need a good HOTAS joystick or the hand controllers (which are already pretty intuitive and comparable to AR gloves in SR), plus voice commands via an additional software tool.
Today, you can buy a smaller one that's even more comfortable to wear, with better screen resolution, better heat management, and no wires at all. I see the trend continuing in the very near future (especially once the current entry cost of about 300 bucks comes down), and with direct brain interfaces in development right now, that may indeed be how we access games and the internet in a few decades.
Also, regarding AR development, keep in mind that Google Glass didn't fail because it didn't work. It is a hugely useful tool in work and lab conditions, but wearing it and constantly recording with it in everyday circumstances made you a "glasshole". It wasn't accepted because society shied away from it (rightly, IMO).
On the other hand, it is now being normalized that instagrammers and other influencers constantly record stuff around them. That is also a trend that may, in the future, lead to more acceptance for things like AR glasses.
Shadowrun also had a bit of a period where no new tech was developed in-setting, so its technology isn't as far ahead of our current level as the 60 years of the official timeline might imply.
QUOTE
For Decking this is probably partially related to the notion that live hacking in real time is actually a thing.
Could you elaborate on what you mean by that?