[RL] More Direct Neural Interface Babysteps
Kyuhan
I know this has been around for a while (it says so in the article as well), but updates are always way past cool, especially with the potential for skillsofts that are mentioned:

http://www.physorg.com/news7746.html
hobgoblin
The skillsoft part is just speculation...

Reading limb movements isn't really hard at all compared to writing stuff back to the brain in such a way that it can be stored and used. At least that's my take on it; I would love to be proven wrong.

Another question: when will we get artificial limbs that are as easy to control as this is?

I wonder, though: does the user have to actually focus on the task he is doing for the computer to pick up the traffic? As in, think "pointer up, left, click"? Or does he just try to move his arm and the pointer moves instead?

Also, how big is the computer that does the translation between thought and commands? Can it be turned into a specialized chip that can be built into all kinds of devices, so that you just have an implant with, say, a wifi kind of link to the devices? And how would one go about changing from device to device?
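For what it's worth, the "translation" in these cursor demos is basically a decoder: take binned firing rates from the electrode array and map them to a 2D cursor velocity. A minimal sketch of that idea (assuming a simple linear filter fit by least squares, with made-up numbers; not whatever the actual system runs):

[code]
import numpy as np

# Toy sketch of the "thought -> command" step (assumed linear decoder, not the
# real device's algorithm): binned firing rates from N electrodes are mapped to
# a 2D cursor velocity by a matrix fit with least squares.

rng = np.random.default_rng(0)
n_channels, n_bins = 96, 2000                     # e.g. a 96-electrode array, 2000 time bins

# Fake calibration data: known cursor velocities plus noisy firing rates that encode them.
true_velocity = rng.standard_normal((n_bins, 2))  # (vx, vy) per time bin
tuning = rng.standard_normal((2, n_channels))     # how each channel responds to velocity
rates = true_velocity @ tuning + 0.5 * rng.standard_normal((n_bins, n_channels))

# "Training" = fit the decoder matrix W so that rates @ W approximates velocity.
W, *_ = np.linalg.lstsq(rates, true_velocity, rcond=None)

# "Runtime" = each new bin of firing rates becomes a cursor command.
new_rates = rng.standard_normal(n_channels)
vx, vy = new_rates @ W
print(f"decoded cursor velocity: ({vx:+.2f}, {vy:+.2f})")
[/code]

If the translation really is on that order of number crunching, the runtime part is tiny and could plausibly live on a small embedded chip; the awkward part would be the per-user, per-device calibration.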

Still, it's nice to see this move from the monkey to the man. :D