the skillsoft part is just speculation...
reading limb movements isn't really hard at all compared to writing stuff back to the brain in such a way that it can be stored and used. at least that's my take on it, I would love to be proven wrong.
another question is, when will we get artificial limbs that are as easy to control as this is?
I wonder though, does the user have to actually focus on the task he is doing for the computer to pick up the traffic? as in, think "pointer up, left, click"? or does he just try to move his arm and the pointer moves instead?
also, how big is the computer that does the translation between thought and commands? and can it be turned into a specialized chip that can be built into all kinds of devices, so that you just have an implant with, say, a wifi kind of link to the devices? and how would one go about switching from device to device?
still, it's nice to see this move from the monkey to the man