How should man and machine work together? Should they inform one another from their respective embodied and digital perspectives? Should machines be proactive or reactive? Do we query them, or can they query us? If so, how?
Check out this Augmented Reality prototype.
The prototype explores a prescriptive operating system inspired by my team at Futures Design Lab in San Francisco. We hatched an idea to use artificial intelligence and reality computing to help humanity. This post builds on that idea to imagine how we might work interactively with machines, or distributed personal computers, via Augmented Reality.
What if your AR headset knew how you approach problems? What if it learned which applications you use at different parts of the day, in relation to the inputs and outputs on your communication channels and schedule? For example, the work I do is often distributed across a Word Processor, Spreadsheet, Calendar, Design Tool, Email, Chat Application, Data Repository, and on and on… Wouldn’t it be great if I didn’t have to open an application at all, and the parts of applications I need worked harmoniously and independently of me?
Today, let’s consider a hypothetical new way to work with a fun operating system concept we’ll call YOU. YOU could be a contextual Augmented Reality interface. It might surface the options you want based on your own behavior. Beyond that, it may analyze your patterns and recommend decision options to optimize how you manage your tasks.
COMPUTATION AT YOUR FINGERTIPS
When you put on your Augmented Reality headset, YOU would be with you the moment you move your hand. Using computer vision, it positions UI elements at your fingertips for the actions you take most frequently at that time of day, independent of which device you are on. No need to pick up a phone or log in to your PC; it is all right there in front of you.
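One plausible mechanism behind "the actions you take most frequently at that time of day" is a simple frequency table bucketed by hour. Here is a minimal Python sketch of that idea; the `ContextualActionRanker` class and the action names are hypothetical, not anything from an actual AR SDK:

```python
from collections import Counter, defaultdict
from datetime import datetime

class ContextualActionRanker:
    """Hypothetical sketch: learn which actions the user takes
    in each hour-of-day bucket, then surface the top ones."""

    def __init__(self):
        self.counts = defaultdict(Counter)  # hour -> Counter of actions

    def record(self, action, when):
        # Log one observed action at a given time.
        self.counts[when.hour][action] += 1

    def top_actions(self, when, k=3):
        # Return the k most frequent actions for this hour of day.
        return [action for action, _ in self.counts[when.hour].most_common(k)]

ranker = ContextualActionRanker()
morning = datetime(2024, 1, 8, 9, 0)
for action in ["email", "email", "calendar", "chat"]:
    ranker.record(action, morning)

print(ranker.top_actions(morning, k=2))  # ['email', 'calendar']
```

A real system would weight recency, location, and device context, but even this toy version captures the core loop: observe, count, rank, present.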
WORKFLOWS BASED ON YOUR CALENDAR
You look at your Calendar by touching it with your other hand. You don’t need to remember how to activate it, because YOU learns your actions. In other words, your interface performs multivariate testing on your behavior to see which interaction schema you are biologically and emotionally predisposed to. Once you activate the calendar, you see your schedule populate. You see that you have Satellite Analysis to do, because you work for NASA.
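The "multivariate testing" idea could be approximated with an epsilon-greedy bandit over candidate UI schemas: mostly show the layout with the best observed success rate, and occasionally explore the alternatives. This is a hedged sketch under that assumption; `SchemaBandit` and the schema names are invented for illustration:

```python
import random

class SchemaBandit:
    """Hypothetical epsilon-greedy test over candidate UI schemas.

    Usually shows the schema with the best observed success rate;
    with probability epsilon, explores one of the others.
    """

    def __init__(self, schemas, epsilon=0.1):
        self.epsilon = epsilon
        self.stats = {s: [0, 0] for s in schemas}  # schema -> [successes, trials]

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))  # explore
        def rate(s):
            wins, trials = self.stats[s]
            return wins / trials if trials else 0.0
        return max(self.stats, key=rate)  # exploit the best so far

    def record(self, schema, succeeded):
        self.stats[schema][1] += 1
        if succeeded:
            self.stats[schema][0] += 1

# The user activates the calendar faster with the radial layout,
# so the bandit converges on it over time.
bandit = SchemaBandit(["radial_menu", "list_menu"])
bandit.record("radial_menu", True)
bandit.record("radial_menu", True)
bandit.record("list_menu", False)
```

"Success" here could be anything measurable: activation speed, error rate, or an explicit thumbs-up from the user.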
RAISING PRESCRIPTIVE ABILITIES
This is where a new level of options gets presented to you. YOU knows you look at many charts and graphs. YOU interfaces with a satellite-reporting system for real-time updates on all 1,100 satellites in orbit at any given time. YOU would ideally present a few different charts and applications to select from, but let’s assume that step has already happened. You chose a treemap depicting the statuses of aging satellites running out of propellant, which, if not refueled, could descend back into the Earth’s atmosphere and burn up. Of all the satellites you could examine, YOU has selected one in particular for review by highlighting it in bright green.
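Behind the highlight, the prescriptive step might reduce to something as plain as flagging the craft with the least propellant below a safety threshold. A toy sketch of that ranking; the `flag_for_review` function, the threshold, and the satellite records are all made up for illustration:

```python
# Toy satellite fleet; names and propellant figures are invented.
satellites = [
    {"name": "SAT-A", "propellant_kg": 12.5},
    {"name": "SAT-B", "propellant_kg": 1.8},
    {"name": "SAT-C", "propellant_kg": 4.2},
]

def flag_for_review(sats, threshold_kg=5.0):
    """Return the satellite most urgently in need of attention:
    the lowest-propellant craft under the threshold, or None."""
    low = [s for s in sats if s["propellant_kg"] < threshold_kg]
    return min(low, key=lambda s: s["propellant_kg"]) if low else None

print(flag_for_review(satellites)["name"])  # SAT-B
```

A production system would score many signals (orbit decay rate, mission priority, refueling windows), but the shape is the same: rank, pick, highlight.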
THE WORLD IN YOUR HANDS
You touch this satellite, and a new modeling application pops up, showing you the relative angular velocity of a healthy satellite with enough propellant. From there you speak to YOU and tell it to model various outcomes.
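For a circular orbit, the angular velocity follows from standard orbital mechanics: ω = √(μ/a³), where μ is Earth's gravitational parameter and a is the orbital radius. A small sketch of the kind of calculation YOU might run under the hood:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def angular_velocity(orbital_radius_m):
    """Angular velocity (rad/s) of a circular orbit: omega = sqrt(mu / r^3)."""
    return math.sqrt(MU_EARTH / orbital_radius_m ** 3)

# A geostationary satellite orbits roughly 42,164 km from Earth's center.
omega = angular_velocity(42_164_000)
print(f"{omega:.3e} rad/s")  # ~7.292e-05: one revolution per sidereal day
```

Comparing a satellite's measured angular velocity against this healthy baseline is one simple way a modeling tool could visualize degradation.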
Prescriptive computation is already happening all around us, but it is subtle. You can notice it when Google Maps or Waze offers you a faster route, and enterprise analytics companies are integrating prescriptive features into their products.
Computation is transcending the media we experience it in. While YOU is only one vision of the future, it’s clear that the current computational paradigm is changing rapidly. The ways you interact with today’s desktop and mobile devices are going the way of the typewriter and the pager. We all have a hand in shaping how we interact with data tomorrow. Let’s imagine a world where we don’t have to sit in a chair for eight hours a day to do groundbreaking work.