The Deka Arm
Athenor
So 60 Minutes, after football, ran a segment on the Deka arm.

Developed by the guy who did the Segway and his team, it is a prosthetic arm that attaches via airbags and is controlled using pressure pads in the sole of the foot. It gives force feedback for pressure sensitivity using a vibration motor (much like a cell phone's) mounted where the prosthetic meets the arm stump.
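Just to picture it, here's a super rough sketch of what a control loop like that might look like; all the sensor names, ranges, and mappings are my own guesses, not anything DEKA has published:

CODE
# Very rough, invented sketch of a foot-pad -> arm command loop with
# vibration feedback; the sensor names, ranges, and mappings are guesses,
# not how the DEKA arm actually works internally.

def read_foot_pads():
    """Return (forward_back, left_right) pad pressures in 0.0..1.0 from the insole."""
    return 0.6, 0.2  # placeholder: a real system would poll pressure sensors here

def read_grip_force():
    """Return measured fingertip force in newtons (made-up scale)."""
    return 4.5

def control_step(dead_zone=0.1, max_speed=1.0, max_force=20.0):
    fb, lr = read_foot_pads()

    # map pad pressure to hand-close and wrist-rotate speeds, ignoring
    # small accidental pressure below the dead zone
    close_speed = max_speed * (fb - dead_zone) / (1 - dead_zone) if fb > dead_zone else 0.0
    rotate_speed = max_speed * (lr - dead_zone) / (1 - dead_zone) if lr > dead_zone else 0.0

    # proportional haptic feedback: harder grip -> stronger vibration at the socket
    vibration = min(read_grip_force() / max_force, 1.0)
    return close_speed, rotate_speed, vibration

if __name__ == "__main__":
    print(control_step())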

To say this was impressive would be an understatement. The fact that they also showed powered leg prosthetics and, most importantly, neurologically controlled prosthetics was even cooler.


So yeah. It's not ASIST-powered DNI using nanites and golden circuitry... But that doesn't matter. I was actually really excited, both in terms of "Cyberlimbs" and for the people I know who are missing limbs.


Does anyone else know more about these? The story said they just entered clinical testing at the VA (the Deka is a DARPA project), so I figured more information would be out there.
Kerenshara
I saw the piece, and I had exactly the same response as the reporter (and felt like an idiot for aping the comment), but I think I can be forgiven: "No way".

I always figured that once we had a sufficiently "dexterous" hand/wrist, they would take the time to map the nerve junctions. As it stands, the existing arms are too primitive to make it worth it. But with independent finger functionality and force feedback, now we have REASON to spend the time and money on the mapping.

Incidentally, I caught a bit on a legit "cybereye" a couple of months back. Currently, it's something like only 16 pixels of resolution, but it's helping a blind woman make out shapes. Once the data is back on the neural mapping, they'll take it up an order of magnitude. Simple black-and-white VCR-level resolution in true bionic eyes could be as little as a decade away, and the hard part's been the theory. It should speed up from here.
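For a sense of scale, here's a throwaway numpy snippet that squashes a stand-in camera frame down to a 16-pixel grid versus one roughly an order of magnitude finer; the grid sizes are just my own reading of the numbers in the piece:

CODE
# Throwaway illustration of pixel counts, nothing more: average a stand-in
# camera frame down to a 16-pixel grid, then to one roughly an order of
# magnitude finer. The grid sizes are my own reading of the numbers.
import numpy as np

def downsample(image, grid):
    """Average a square image down to a grid x grid array of brightness values."""
    h, w = image.shape
    trimmed = image[:h - h % grid, :w - w % grid]
    return trimmed.reshape(grid, trimmed.shape[0] // grid,
                           grid, trimmed.shape[1] // grid).mean(axis=(1, 3))

scene = np.random.default_rng(0).random((128, 128))  # stand-in for a camera frame
coarse = downsample(scene, 4)    # 4x4 = 16 "pixels", enough to make out rough shapes
finer = downsample(scene, 12)    # 144 pixels, roughly an order of magnitude more
print(coarse.shape, finer.shape)  # (4, 4) (12, 12)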
klavis
Here is a link that plays the segment from the show if people want to watch it.

http://www.cbsnews.com/video/watch/?id=532...ated;photovideo
Jhaiisiin
I love the one quote from the lead guy of the prosthetics division... "... because your body only has so much tolerance for gadgetry." Man, essence explanations in the works decades before it becomes an issue.
hobgoblin
QUOTE (klavis @ Sep 21 2009, 05:48 AM) *
Here is a link that plays the segment from the show if people want to watch it.

http://www.cbsnews.com/video/watch/?id=532...ated;photovideo

talk about ad density...

btw, it's a sad fact of life that most discovery and invention comes out of military needs...
hobgoblin
QUOTE (Kerenshara @ Sep 21 2009, 04:43 AM) *
I saw the piece, and I had exactly the same response as the reporter (and felt like an idiot for aping the comment), but I think I can be forgiven: "No way".

I always figured that once we had a sufficiently "dexterous" hand/wrist, they would take the time to map the nerve junctions. As it stands, the existing arms are too primitive to make it worth it. But with independent finger functionality and force feedback, now we have REASON to spend the time and money on the mapping.

Incidentally, I caught a bit on a legit "cybereye" a couple of months back. Currently, it's something like only 16 pixels of resolution, but it's helping a blind woman make out shapes. Once the data is back on the neural mapping, they'll take it up an order of magnitude. Simple black-and-white VCR-level resolution in true bionic eyes could be as little as a decade away, and the hard part's been the theory. It should speed up from here.

the sad thing about neural mapping is that it has to be done on a per-person basis...

this is because, while the general areas are the same, the exact signals will vary from person to person.

at best, one can come up with a generic connector and then tell the user to run through some kind of training scenario while the electronics read the signals...
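roughly this sort of flow, as a purely made-up python sketch (the gesture list, features, and classifier are stand-ins I picked for illustration, not anything from an actual prosthetics lab):

CODE
# made-up sketch of a per-user "training scenario": prompt the user to attempt
# each movement, record their signals, then fit a decoder just for them.
# the gesture list, features, and classifier are arbitrary stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

GESTURES = ["rest", "open_hand", "close_hand", "rotate_wrist"]

def record_window(gesture, n_channels=8, n_samples=200):
    # placeholder for a window of electrode data captured while the user
    # attempts 'gesture'; synthetic noise shifted per gesture so the demo runs
    return np.random.normal(size=(n_samples, n_channels)) + GESTURES.index(gesture)

def features(window):
    # trivial per-channel mean/variance features; real systems use richer ones
    return np.concatenate([window.mean(axis=0), window.var(axis=0)])

def calibrate(reps_per_gesture=20):
    X, y = [], []
    for label, gesture in enumerate(GESTURES):
        for _ in range(reps_per_gesture):
            X.append(features(record_window(gesture)))
            y.append(label)
    return LogisticRegression(max_iter=1000).fit(np.array(X), np.array(y))

decoder = calibrate()
guess = decoder.predict([features(record_window("close_hand"))])[0]
print("decoded:", GESTURES[guess])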
Khyron
QUOTE (hobgoblin @ Sep 21 2009, 03:42 AM) *
the sad thing about neural mapping is that it has to be done on a per-person basis...

this is because, while the general areas are the same, the exact signals will vary from person to person.

at best, one can come up with a generic connector and then tell the user to run through some kind of training scenario while the electronics read the signals...


If it's the difference between having a near-useless hook arm and having an arm that works, even a multi-year training program would be more than acceptable.
hobgoblin
yep, but going from that to performance-enhancing replacements seems far...
overcannon
Not too far, I would think, for certain aspects. Strength is just an issue of motor power, but agility would definitely be much more difficult to handle. Besides, the 6th World won't dawn for another three years. :)
The Overlord
I saw the piece a while back and my reaction remains the same: Awesome! As cheesy as this sounds, it warms my heart to see the true potential of giving limbs back to people who have lost them. It also makes me wonder what such inventions could give to people who still have their limbs but want more. Oh, and just as a side note, the guy leading the development of the arms is Dean Kamen, and I have actually met him in person a few times through his FIRST (For Inspiration and Recognition of Science and Technology) competition. The man is awesome!
Sponge
QUOTE (hobgoblin @ Sep 21 2009, 04:42 AM) *
the sad thing about neural mapping is that it has to be done on a per person basis...

this as while the general areas are the same, the exact signals will vary from person to person.

at best, one can come up with a generic connector, and then tell the user to do some kind of training scenario while the electronics reads the signals...


Sounds like the kind of thing that could be done on the tech side rather than the human side, with some neural network software or similar - training the limb to read the human's signals would probably be significantly faster than training the human to signal the limb correctly.
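Something like this toy loop, say; the data, network size, and update scheme below are all invented just to show the idea of the software adapting to the user rather than the other way around:

CODE
# Toy version of "train the software, not the person": a small neural net that
# keeps adapting to whatever signals this particular user happens to produce.
# The data, network size, and update scheme are all invented for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor

def stream_of_examples(n_batches=50, n_channels=16):
    """Yield (signals, intended_movement) pairs, e.g. collected whenever the
    user performs a prompted or self-reported movement."""
    rng = np.random.default_rng(0)
    true_map = rng.normal(size=(n_channels, 3))      # pretend "ground truth" mapping
    for _ in range(n_batches):
        signals = rng.normal(size=(32, n_channels))  # a batch of signal windows
        yield signals, signals @ true_map            # intended wrist/hand commands

decoder = MLPRegressor(hidden_layer_sizes=(32,), learning_rate_init=1e-3)

for signals, intended in stream_of_examples():
    # incremental update: the limb's software keeps re-fitting itself to the user
    decoder.partial_fit(signals, intended)

print("decoder adapted over", decoder.n_iter_, "update batches")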
Demonseed Elite
QUOTE (The Overlord @ Sep 21 2009, 10:06 AM) *
Oh, and just as a side note, the guy leading the development of the arms is Dean Kamen, and I have actually met him in person a few times through his FIRST (For Inspiration and Recognition of Science and Technology) competition. The man is awesome!


He really is. I'm pretty amazed that cyberlimbs are being developed pretty much down the street from where I live (DEKA is located right near me).
Kerenshara
QUOTE (Sponge @ Sep 21 2009, 10:54 AM) *
Sounds like the kind of thing that could be done on the tech side rather than the human side, with some neural network software or similar - training the limb to read the human's signals would probably be significantly faster than training the human to signal the limb correctly.

This is essentially what I expect to see happen. The big hurdle is going to be figuring out HOW the brain fundamentally processes information from the senses and/or how it sends commands to the muscle groups. The second should actually be pretty uniform across the (meta)human condition once you get into the white matter, because all muscles respond about the same, so it stands to reason that at some point there's the biological equivalent of a DAC which turns the "commands" into actual control inputs.

But the first one, figuring out how the brain handles the ADC conversion, is the bear. Once we get the actual "how" worked out, it should be a matter of sending appropriate "nominal" signals and adjusting the software until the person can perceive things correctly. I wouldn't be surprised to discover that it's more a question of "moving the graph to match" the person's idiosyncrasies rather than having to materially change the underlying equations all that much. But that's just a WAG.
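A toy version of what I mean, with all the shapes and numbers made up; the point is only that the generic model stays put and you fit a small per-user correction on top of it:

CODE
# Toy version of "move the graph to match": keep one generic decoder fixed and
# only fit a small per-user linear correction on top of its output.
# All shapes, signals, and numbers here are made up for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_outputs = 16, 3

# pretend this generic mapping was worked out once, across many people
generic_W = rng.normal(size=(n_channels, n_outputs))

def generic_decode(signals):
    return signals @ generic_W

# this particular user's signals run hotter and sit at a different baseline
user_signals = rng.normal(size=(100, n_channels))
true_outputs = (user_signals * 1.3 + 0.4) @ generic_W

# fit only a per-user correction (a few coefficients per output channel),
# leaving the generic model itself untouched
nominal = generic_decode(user_signals)
A = np.hstack([nominal, np.ones((len(nominal), 1))])
coef, *_ = np.linalg.lstsq(A, true_outputs, rcond=None)
corrected = A @ coef

print("mean error before:", np.abs(nominal - true_outputs).mean())
print("mean error after :", np.abs(corrected - true_outputs).mean())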
hobgoblin
hmm, I recall reading about them rewiring working arm nerves to small chest muscles, and then having sensors pick that up and control the arm that way...
Kerenshara
QUOTE (hobgoblin @ Sep 26 2009, 03:14 PM) *
hmm, I recall reading about them rewiring working arm nerves to small chest muscles, and then having sensors pick that up and control the arm that way...

That's the current - essentially analog - technology. It's using the meat side to drive the 'ware. I'm talking about using the thing they were doing at the end, where he was "thinking about having his old hand doing something" and reading the resulting impulses. The next step is to read the BRAIN signals rather than the muscular cues.
hobgoblin
and then feed the brain data from sensors along the limb to complete the experience...

still, going directly to the brain should probably be a last resort, as its 3D layout must surely result in a poor signal-to-noise ratio; going in at the shoulder or similar may be a better option...
Kerenshara
QUOTE (hobgoblin @ Sep 26 2009, 04:18 PM) *
and then feed the brain data from sensors along the limb to complete the experience...

still, going directly to the brain should probably be a last resort, as its 3D layout must surely result in a poor signal-to-noise ratio; going in at the shoulder or similar may be a better option...

ANY kind of pure machine-to-nervous-system interface would be a breakthrough. Right now, everything is muscle-related, be it pressure, tension or electrical capacitance.
This is a "lo-fi" version of our main content. To view the full version with more information, formatting and images, please click here.
Dumpshock Forums © 2001-2012