QUOTE (tagz @ Mar 4 2010, 05:01 PM)
And because I like to argue when I'm bored...
Ahhh, but just because they are both using photon reflection does not make them the same. They use VERY different frequencies, and that plays a
huge role in vision.
Visible light is in the frequency range of 400 THz - 789 THz.
"Near Infrared" (which I'm 99% sure the thermographic heat vision is based on) has a range of 120 THz - 400 THz.
Low-light vision is based on a cat's eyesight, needing only about 1/6 of the light we do to see. That is still the regular visible-light frequency range, but the cat eye has what's called a tapetum, a reflective layer behind the retina. The cyber solution could easily be to insert a man-made tapetum.
Radar operates on frequencies of 3 MHz - 10.5 GHz (UWB at the 10.5 GHz end).
http://upload.wikimedia.org/wikipedia/comm...ic-Spectrum.png
Let's compare Near Infrared to Ultra Wide Band (the closest radar band to visual frequencies).
789,000,000,000,000 Hz - high end of the visual spectrum
120,000,000,000,000 Hz - low end of near infrared
10,500,000,000 Hz - high end of radar
The GHz frequency is extremely far removed from the THz frequency.
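To put numbers on "extremely far removed", here's a quick sketch of the ratios, using the band edges quoted above (exact edges vary by source):

```python
# Rough frequency comparison (band edges from the post above; sources vary).
VISIBLE_HIGH_HZ = 789e12  # upper end of visible light, ~789 THz
NEAR_IR_LOW_HZ = 120e12   # low end of near infrared, ~120 THz
RADAR_HIGH_HZ = 10.5e9    # high end of radar / UWB, ~10.5 GHz

# Near IR sits within one order of magnitude of visible light...
print(VISIBLE_HIGH_HZ / NEAR_IR_LOW_HZ)  # ~6.6x
# ...while even the highest radar frequency is over four orders of magnitude lower.
print(NEAR_IR_LOW_HZ / RADAR_HIGH_HZ)    # ~11,400x
```

So infrared is a next-door neighbor of visible light; radar isn't even in the same city.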
Now consider this:
You record a sound clip of yourself saying "Hello there, my name is Tom". Now you slow it to half speed. Still recognizable. But let's say you slowed it to one hundredth the speed. Now it's so slow you can't make out a recognizable difference from one letter sound to the next, let alone words. Your brain can't interpret it.
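The stretch factor is easy to put in numbers. A quick sketch (the clip length and sample rate are made-up illustrative values, not from the post):

```python
# Illustrative numbers: a 2-second clip recorded at 44.1 kHz.
SAMPLE_RATE_HZ = 44_100
n_samples = int(SAMPLE_RATE_HZ * 2.0)

def stretched_duration(n_samples, sample_rate_hz, speed):
    """Duration in seconds when played back at `speed` times normal."""
    return n_samples / sample_rate_hz / speed

print(stretched_duration(n_samples, SAMPLE_RATE_HZ, 0.5))   # half speed -> 4.0 s
print(stretched_duration(n_samples, SAMPLE_RATE_HZ, 0.01))  # 1/100 speed -> 200 s
# A ~100 ms phoneme now lasts ~10 s, far outside the window the brain
# integrates as a single speech sound.
```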
Oddly enough, I'm not asking the brain to interpret it directly, just as your brain does not interpret light directly. Also, the brain runs at roughly 60 Hz, as that is about the shortest period of time an object can exist in the average human visual field and still be seen.
The radar image is going to be processed by some 'device', just as a visual image is processed by some device in some kind of eye. In a natural eye this processing occurs in four layers of neurons; in a cyber eye it occurs on some chip, to allow for graphical overlays, switching vision modes, and the like.
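The point is that every vision mode goes through the same kind of processing stage before anything reaches the optic nerve. A toy sketch of that idea (the mode names and structure are entirely hypothetical, not from any sourcebook):

```python
# Hypothetical sketch: whatever the sensor, a processing stage turns raw
# returns into a normalized frame before anything reaches the optic nerve.

def normalize(frame):
    """Scale raw sensor values into 0..255 grayscale."""
    lo, hi = min(frame), max(frame)
    span = (hi - lo) or 1
    return [round(255 * (v - lo) / span) for v in frame]

SENSORS = {
    "lowlight": normalize,  # amplified visible light
    "thermo": normalize,    # near-IR intensity
    "radar": normalize,     # radar return strength
}

def process(mode, raw_frame):
    # Same downstream path for every mode: the brain never sees the raw signal.
    return SENSORS[mode](raw_frame)

print(process("radar", [0.0, 0.25, 1.0]))  # [0, 64, 255]
```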
QUOTE
It's a similar situation. You can make the leap that the brain can interpret Near Infrared; it's just that the eyes can't normally process it, and filter it out before sending the signal to the optic nerve.
The eyes, as near as I understand, do not actually contain a receptor FOR near infrared. It would not be a matter of filtering it out, as there would be no data to filter. Also, the brain processes visual data spatially, which is to say points on the cortex can be mapped to specific regions of the retina, with the fovea taking up more space on the cortex and the periphery taking up less, proportional to their actual area on the retina. As the brain can handle color and black-and-white using the same cortex, all that would happen is that the radar system would tie into the part of the optic nerve responsible for night vision and use the brain's black-and-white processing to say "yes radar return" or "no radar return", creating a black-and-white image of what your sensor is seeing.
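The "yes radar return / no radar return" idea is literally just thresholding each point of the scan into black or white. A toy sketch with made-up values:

```python
# Toy sketch: threshold each cell of a 2D radar scan into black or white,
# the way a night-vision channel would carry a one-bit "return / no return".

def radar_to_bw(scan, threshold=0.5):
    """Turn return strengths (0..1) into a black/white character image."""
    return ["".join("#" if v >= threshold else "." for v in row) for row in scan]

scan = [
    [0.0, 0.2, 0.8],
    [0.1, 0.9, 0.7],
    [0.0, 0.1, 0.0],
]
for row in radar_to_bw(scan):
    print(row)
# ..#
# .##
# ...
```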
QUOTE
But radar is on such a different level.... I would argue that the brain's visual cortex couldn't handle it, and it may even cause brain damage if this signal was sent to it directly.
Then study some neuropsychology; you'd be surprised what the brain can do. I know I am.
QUOTE
The solution? A fictitious representation! Rather than send the very different signal directly, why not create a picture?
You mean like the eye does already?
QUOTE
That's how we currently see in those frequencies to begin with: a picture or video of the image, converted into a spectrum we can see and THEN processed by our eyes. And in our future world they have the ability to make that video appear right in your vision. But it's still not the actual image; your mind isn't receiving the actual signal.
It doesn't with the actual eyes either. There is substantial processing and filtering which occurs in natural eyes. For example, an object CAN physically emit both red and green light at the same time, but you will NEVER see a reddish-green object. The human visual system runs red and green through the same opponent channel, signaling one color in one direction and the other in the opposite direction; if both are detected at the same time, you tend to see brown, if anything at all. The eye also has a contrast-increasing feature which highlights the edges of objects and the places where colors change. What you SEE is NOT the signal your eye is receiving. Cyber eyes, with low-light, thermographic, vision displays, etc., make this problem even worse, not better.
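That contrast-increasing feature is lateral inhibition: each receptor's output gets reduced by a fraction of its neighbors' activity, which exaggerates edges. A toy one-dimensional sketch (the coefficient is illustrative, not a physiological measurement):

```python
# Toy lateral-inhibition sketch: each output is the input minus a fraction
# of the neighbors' average, which exaggerates edges (Mach bands).

def lateral_inhibition(signal, inhibition=0.5):
    out = []
    for i, v in enumerate(signal):
        left = signal[i - 1] if i > 0 else v
        right = signal[i + 1] if i < len(signal) - 1 else v
        out.append(v - inhibition * (left + right) / 2)
    return out

# A step edge: uniform dark region, then uniform bright region.
step = [1, 1, 1, 5, 5, 5]
print(lateral_inhibition(step))
# [0.5, 0.5, -0.5, 3.5, 2.5, 2.5]
# The dark side of the edge dips below its neighbors and the bright side
# peaks above its own: the edge is exaggerated before the brain sees it.
```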
QUOTE
See the big thing here is that Thermo and low light need no conversion for the brain. They just mod the eye to allow a larger frequency band into the optic nerve. Radar, Ultrasound, and UWB need a conversion step to change that GHz band into the THz band.
Nice attempt at technobabble, but sorry, no. It doesn't work like that. As I said before, the brain only works at about 60 Hz, so changing the band like that would make no difference at all.
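To spell that out with the 60 Hz figure from earlier:

```python
# If visual perception integrates at roughly 60 Hz (the figure used above),
# the shortest frame a signal can occupy and still register is about:
PERCEPTION_RATE_HZ = 60
frame_ms = 1000 / PERCEPTION_RATE_HZ
print(round(frame_ms, 1))  # ~16.7 ms per "frame"
# Whether the sensor sampled at 10.5 GHz or 789 THz, by the time the image
# reaches the brain it has been reduced to frames on this millisecond timescale,
# so "converting the band" buys you nothing at the perception stage.
```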