With all the advances we’ve made in technology so far, I imagine that future technology will be based on what we see.
It’s not inconceivable that wearable technology will one day go beyond the wrist and glasses and find its way into people’s eyeballs. We’ve seen this concept in recent media, an example coming from the show Black Mirror.
In our very first Communications@Syracuse course, we were shown a YouTube video called HYPER-REALITY, and it essentially predicts a similar innovation.
Perhaps in this new world, everything we do will make use of our eyes. We're already seeing steps in that direction with Google Glass and virtual reality, and a lot of work is being done with augmented and mixed reality, too.
Even DJI created a headset that lets drone pilots see from their drone's perspective and control the camera by rotating and tilting their heads.
I can’t help but notice that so many of these recent technologies have been encouraging us to strap something onto our heads. There’s no doubt that a key factor in mass adoption is ease of use, and I can think of no better way to achieve that than by making the technology less intrusive — by putting it in our eyes.
This will really change the way people consume media. A new challenge it will bring is that no one will know when someone’s attention is elsewhere. Unless these lenses have an indicator that the wearer is tuned in to something else, there’s no way of knowing what content they’re accessing.
For instance, imagine you’re on a blind date, and during your initial meeting, you have the ability to research the other person’s background while you’re with them. If you’re struggling to think of conversation topics, you could quickly find ideas. You could learn the person’s likes and dislikes and steer the conversation, all without them knowing.
It’s worth noting that this might require more than just your eyes; it could require some kind of neural chip that allows your brain to navigate everything more easily.
This new technology could make it so people don’t need to carry IDs; law enforcement, SAT test proctors or anyone else who needs to verify someone’s identity could do so simply by looking at them. However, this opens a whole new can of worms regarding privacy. Could anyone scan and analyze anyone else for their information? Even a minor?
In terms of media reporting, think of all the eyewitness accounts that could be provided in countless situations. What if people in trouble could livestream immediately from their eyes and be easily located? Drones could be controlled with people’s minds and seen through their eyes, ensuring they get all the shots they need without having to use a controller.
Just as Snapchat is used right now to combine people’s posts into a complete picture of a story, the same could be done if everyone had these eye lenses.
Firefighters could use this technology with thermal vision to locate people during a fire. Police could use night vision to apprehend criminals fleeing in the dark.
There are a lot of possibilities with this technology, but there’s also a lot of risk. Right now, Apple’s facial recognition software is being scrutinized by the government, so I can only imagine how paranoid regulators will be once everyone has a media device in their eyeballs.
It’s hard to say when this might happen, but I’ll give it another decade or so before it finally catches on.
In my career, I imagine myself figuring out the best way to use this technology to tell stories. If I were doing something similar to what I do now, which is creating social media content, I imagine much of the footage I’d collect would come from people’s own perspectives, making for more engaging content.