Back in September, scientists decoded words from brain signals. It’s not a matter of if, but when inter-cortical cognition grids happen. Inter-cortical communication will completely disrupt the arc of human evolution. Odds. Are. You. Are. Not. Ready. Human. Your every thought will be laid bare to all other minds on the grid. Lusts, fears, paranoia, confusion, all of it. Prior to going on-grid would be a good time to practice "judge not, lest ye be judged." Prior to going on-grid would be a good time to practice putting idle synaptic cycles to better use in order to be found useful. Prior to going on-grid would be a good time to think about what substrate independence really means, psychologically.
The Journal of Neural Engineering's September issue is publishing Greger's study showing the feasibility of translating brain signals into computer-spoken words. The University of Utah research team placed grids of tiny microelectrodes over speech centers in the brain of a volunteer with severe epileptic seizures. The man already had a craniotomy – temporary partial skull removal – so doctors could place larger, conventional electrodes to locate the source of his seizures and surgically stop them. Using the experimental microelectrodes, the scientists recorded brain signals as the patient repeatedly read each of 10 words that might be useful to a paralyzed person: yes, no, hot, cold, hungry, thirsty, hello, goodbye, more and less. Later, they tried figuring out which brain signals represented each of the 10 words. When they compared any two brain signals – such as those generated when the man said the words "yes" and "no" – they were able to distinguish brain signals for each word 76 percent to 90 percent of the time.
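The pairwise-discrimination setup described above can be sketched roughly as follows. This is illustrative only: synthetic per-channel features and a simple leave-one-out nearest-centroid classifier stand in for the actual microelectrode recordings and whatever analysis the study used, and all names and parameters here are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_trials(mean, n_trials=40, n_channels=16, noise=1.0):
    # One row per trial: a feature per microelectrode channel, drawn
    # around a word-specific mean pattern (synthetic stand-in data).
    return mean + noise * rng.standard_normal((n_trials, n_channels))

def pairwise_accuracy(trials_a, trials_b):
    # Leave-one-out nearest-centroid classification between two words:
    # hold out one trial, compute each word's mean pattern from the
    # rest, and assign the held-out trial to the nearer centroid.
    data = np.vstack([trials_a, trials_b])
    labels = np.array([0] * len(trials_a) + [1] * len(trials_b))
    correct = 0
    for i in range(len(data)):
        mask = np.ones(len(data), dtype=bool)
        mask[i] = False
        c0 = data[mask & (labels == 0)].mean(axis=0)
        c1 = data[mask & (labels == 1)].mean(axis=0)
        pred = 0 if np.linalg.norm(data[i] - c0) < np.linalg.norm(data[i] - c1) else 1
        correct += pred == labels[i]
    return correct / len(data)

# Two hypothetical words ("yes" vs. "no") with distinct mean patterns.
yes_trials = synthetic_trials(rng.standard_normal(16))
no_trials = synthetic_trials(rng.standard_normal(16))
print(f"pairwise accuracy: {pairwise_accuracy(yes_trials, no_trials):.0%}")
```

Running every word pair through a comparison like this is what yields a spread of accuracies, analogous to the 76-to-90-percent range the team reported for its two-word comparisons.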