Sunday, May 17, 2009

Battlefield Telepathy - Project Silent Talk


May 14th 2009
Washington DC, USA

The Pentagon's DARPA (Defense Advanced Research Projects Agency) has earmarked $4 million of next year's budget for a project called 'Silent Talk'. Its goal: "to allow user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals."

Speech exists as neural signals in the brain before it is ever spoken. DARPA wants to develop technology that extracts these signals and transmits them between soldiers, allowing for greater stealth in combat.
Electroencephalography (EEG) is the reading of electrical activity across the scalp produced by neural activity within the brain. DARPA's first step is to map an individual's EEG patterns to his or her words.

Whether the EEG patterns for speech are the same across people or differ from person to person is another question.

Even if they aren't, it may be possible to calibrate the system to a person's individual pattern, much like speech recognition software, where a new user reads a transcript aloud so the software can learn his or her voice. By reading (or merely thinking!) a predefined script that covers all the vocalizations of speech, the software may be able to map out a personal pattern.
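As a rough sketch of what such a per-user calibration might look like, assuming the raw EEG has already been turned into fixed-length feature vectors and using random numbers as stand-in data with an off-the-shelf classifier (none of which comes from DARPA's announcement):

```python
# Hypothetical per-user calibration: map EEG feature vectors to the words
# of a known script, then decode new (unspoken) words for that same user.
# Feature extraction from raw EEG is assumed to have happened already.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Calibration phase: the user "reads" a predefined script while EEG is recorded.
script_words = ["advance", "hold", "retreat", "enemy", "clear"]
n_repeats, n_features = 20, 64          # 20 repetitions per word, 64 EEG features
X_train = rng.normal(size=(len(script_words) * n_repeats, n_features))  # stand-in EEG features
y_train = np.repeat(script_words, n_repeats)

# Learn this individual's personal EEG-to-word pattern.
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Operational phase: decode a new, unspoken EEG sample into a word.
new_sample = rng.normal(size=(1, n_features))
print(decoder.predict(new_sample)[0])
```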

The next step is to transmit the words over a limited range to the receiving soldier, which is fairly simple with encrypted short-range radio such as Bluetooth.
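For that transmission hop, here is a minimal sketch using symmetric encryption over a local UDP socket as a stand-in for whatever encrypted short-range radio link would actually be used (key exchange and the radio layer itself are glossed over):

```python
# Stand-in for the radio link: encrypt the decoded word with a shared key
# and send it over UDP on the loopback interface.
import socket
from cryptography.fernet import Fernet

key = Fernet.generate_key()                      # would be pre-shared in a real deployment
cipher = Fernet(key)

# Receiving soldier's device, listening on a local UDP port.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 9999))

# Sending soldier: encrypt the decoded word and transmit it.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(cipher.encrypt(b"advance"), ("127.0.0.1", 9999))

# Receiver decrypts the packet back into the word.
packet, _ = receiver.recvfrom(4096)
print(cipher.decrypt(packet).decode())           # -> advance
```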

DARPA is famous for commissioning weird projects, from combat exoskeletons to brainwave-driven binoculars and much more, which could warrant another blog post. So keep your eyes on this blog!

Friday, May 15, 2009

Scientists Read Minds

December 11th 2008
Kyoto, Japan

Scientists from the ATR Computational Neuroscience Laboratories in Kyoto, Japan have developed a method of extracting images directly from the brain.

Using functional magnetic resonance imaging (fMRI) of a subject's visual cortex and a neural network computer application, they were able to extract an image of what the subject was seeing.

The researchers first used fMRI to map blood flow, and changes to it, in the visual cortex while the subject viewed a 10 x 10 pixel image. This was repeated until data had been collected for a set of 400 images. The 400 images and their fMRI scans were then fed into the neural network model, which extracted the pattern and built a network for that particular relationship.
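As a toy sketch of that training step, treating each fMRI scan as a flat vector of voxel responses and using synthetic data with a simple least-squares decoder in place of ATR's actual model (the voxel count and the linear mapping are my own stand-ins):

```python
# Toy stand-in: learn a mapping from fMRI voxel responses to the 100 pixels
# of a 10 x 10 image, using 400 (scan, image) training pairs.
import numpy as np

rng = np.random.default_rng(0)
n_images, n_voxels, n_pixels = 400, 300, 10 * 10

# Simulate the experiment: each black-and-white image produces a voxel response.
hidden_response = rng.normal(size=(n_pixels, n_voxels))          # unknown brain response
images = rng.integers(0, 2, size=(n_images, n_pixels)).astype(float)
scans = images @ hidden_response + 0.1 * rng.normal(size=(n_images, n_voxels))

# Training: fit one decoding weight vector per pixel from the 400 pairs.
weights, *_ = np.linalg.lstsq(scans, images, rcond=None)

# Decode a scan of an image the model has never seen (the reconstruction step
# described further below).
new_image = rng.integers(0, 2, size=(1, n_pixels)).astype(float)
new_scan = new_image @ hidden_response + 0.1 * rng.normal(size=(1, n_voxels))
reconstruction = (new_scan @ weights).clip(0, 1).round().reshape(10, 10)
print((reconstruction == new_image.reshape(10, 10)).mean())      # fraction of pixels recovered
```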

fMRI is a specialized form of MRI that measures blood flow and blood oxygenation (collectively known as hemodynamics). Hemodynamics is known to be closely linked to neural activity: active neurons need oxygen for energy, increasing blood flow in the capillaries of that localized area of the brain.

Neural networks, in computer science, are computer models loosely based on biological neural networks and are used to model complex relationships between inputs and outputs.
Once designed, a neural network first reads a data set of known inputs paired with known outputs; after reading each pair it adjusts its internal connections to incorporate it, slowly 'learning' the relationship between inputs and outputs one example at a time.
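A minimal sketch of that idea, using a single artificial neuron rather than a full multi-layer network, and a made-up input-output relationship of my own choosing:

```python
# A single artificial neuron adjusting its weights one (input, output)
# example at a time until it has learned the underlying relationship.
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.normal(size=(200, 3))
targets = inputs @ np.array([2.0, -1.0, 0.5])         # the "known outputs"

weights = np.zeros(3)                                  # the network starts knowing nothing
for _ in range(20):                                    # several passes over the data
    for x, y in zip(inputs, targets):
        prediction = weights @ x
        weights += 0.01 * (y - prediction) * x         # nudge weights toward the right answer

print(weights)                                         # approaches [2.0, -1.0, 0.5]
```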

The computer model was then used to read an fMRI scan of a subject viewing an image not in the initial set and reconstruct that image from scratch. Although the reconstructed image is rather crude, it visibly matches the image seen by the subject.



This technology opens up many new possibilities, such as treating mental disorders by allowing doctors to see a patient's thoughts and better understand the condition, or 'neural marketing' applications that read your thoughts and show you relevant adverts as you walk past a billboard.

If the technology improves, with better imaging and easier fMRI scanning, then, like any other technology, it will also raise many ethical concerns, such as privacy.