Avenues to Substrate Independence
Ultimately, there will be diverse UX alternatives for substrate independence. The robotic substrate is certainly a fascinating option to consider, and we not only can but must immediately begin the preparatory thinking, training, and behavioral and psychological exercises for increasingly high-resolution software- and hardware-mediated experiences.
Certainly within ten years, and likely within five, we will see the convergence of the dexterity of Robonaut 2, the mobility of AIST and Kawada’s HRP-4, the quotidian autonomy of Anybots, the first-person-perspective archival capabilities of Looxcie, and the brain-machine interfaces typical of today’s prosthetic arms and legs. Add titanium-foam bones and sub-kilopascal artificial skin, possibly even hyper-augmented with thin-sheet displays as I/O devices and literal mind-reading internal Attention Management System HUDs – vastly improved versions of software like Feedly and My6Sense, designed to surface the most salient and actionable information streaming through the vastness of the Internet of Things and the ever-expanding Global Cognition Grid. All of it integrated into our 2020 Tesla-built MacAvatars, powered by Google, and designed by Apple in California. ;-)
We will not need “mind uploads” for this phase of self-guided, participatory, migratory evolution. Within ten years, we will see vastly improved, multi-function brain-machine interfaces to these devices, and the utterly immersive first-person UX will become increasingly difficult to distinguish from “real life.”
So don’t hold on too tight, Dorothy, or a hole the size of Kansas might get inadvertently ripped through your cute little bioconservative extremist hands. Or, in the words of the sub-legendary 38 Special: “Hold on loosely, but don’t let go. If you cling too tightly, you’re gonna lose control.”