What science fiction can teach us about the future of user interfaces

Have you ever dreamed of using an awesome device from a science-fiction movie, such as the control board from Star Trek? Well, you’re not just dreaming: science fiction has long been a great source of inspiration for designers.

According to Nathan Shedroff in Make It So, users have often become familiar with new computer interfaces through filmmakers’ imagination. By subconsciously influencing the evolution of human-machine interaction, fictional designers have helped us gradually adopt next-generation products.

Here are four examples of futuristic design that may well become part of our daily lives in a not-so-distant future.

The promising role of sonic interfaces

The Esper Machine in Blade Runner

Ever noticed that in the original Star Wars trilogy, pilots like Han Solo can hear the sound of spaceships through the vacuum of space? Even though it’s a cinematic trick, it points to a very promising technology for the future: echolocation.

Our ears have a natural intuition for the space and distances around us, and take over from our eyes when vision is limited. One example is the car proximity sensor that helps prevent impacts, but sonic interfaces have found many applications in movies.
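The principle behind those car sensors is simple enough to sketch in a few lines: emit a pulse, time the echo, and halve the round trip. This is a minimal illustration of the physics, not the API of any real sensor; the function name is ours.

```python
# A minimal sketch of the echolocation principle behind car proximity
# sensors: emit a pulse, time the echo, and halve the round trip.
# The function name is illustrative, not from any real sensor API.

SPEED_OF_SOUND = 343.0  # metres per second in air at about 20 °C

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to an obstacle, given the echo's round-trip time."""
    # Sound travels to the obstacle and back, so halve the total path.
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# An echo that returns after 10 ms puts the obstacle about 1.7 m away.
print(f"{distance_from_echo(0.010):.2f} m")
```

The same halved round-trip arithmetic underlies sonar, ultrasonic rangefinders, and the way bats navigate.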

These sound-detection products become especially powerful when paired with voice control. In Blade Runner, for example, Deckard uses his voice to control the Esper machine while analyzing investigative photographs. All he has to do is say “go forward, stop, zoom, go from 34 to 36” for the viewfinder to move and zoom in on the right spot. This technology makes it remarkably convenient to navigate a digital space without using our hands.

We’re already seeing this with connected speakers like Google Home and Amazon Echo. Voice control makes moving through an interface natural and immediate: users address the interface as they would a human being, without having to hunt for and manipulate control buttons.
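At its simplest, an Esper-style voice interface boils down to mapping recognized phrases onto actions. The sketch below shows that dispatch idea; every command name and handler here is hypothetical, and real speech recognition is out of scope.

```python
# An illustrative sketch of how an Esper-style voice interface might
# dispatch recognized phrases to viewer actions. Every command name and
# handler here is hypothetical; real speech recognition is out of scope.

def pan_forward(viewer):
    viewer["x"] += 10  # move the viewfinder forward

def halt(viewer):
    pass  # stop any ongoing motion

def zoom_in(viewer):
    viewer["zoom"] *= 2  # double the magnification

COMMANDS = {
    "go forward": pan_forward,
    "stop": halt,
    "zoom": zoom_in,
}

def handle_utterance(viewer, utterance):
    """Look up a spoken phrase and run the matching action."""
    action = COMMANDS.get(utterance.strip().lower())
    if action is None:
        raise ValueError(f"unrecognized command: {utterance!r}")
    action(viewer)

viewer = {"x": 0, "zoom": 1}
handle_utterance(viewer, "Go forward")
handle_utterance(viewer, "zoom")
print(viewer)  # {'x': 10, 'zoom': 2}
```

The appeal of this pattern is that adding a new voice command is just adding one entry to the table, which is part of why voice interfaces scale so naturally.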

Gesture and intuitive controls

Minority Report’s gesture-driven interface

Science fiction has also put forward even more intuitive types of interface, based on gesture controls.

In Minority Report, John Anderton uses hand gestures to browse through mental images: he unrolls a video interface with his hand, which helps him find clues to upcoming murders. The obvious advantage of gestural features is to make navigation extremely fast and easy. This is perhaps the most intuitive video-editing prototype ever imagined!

Of course, systems like the Xbox Kinect and Nintendo’s Wii Remote have already familiarized users with motion-sensing controls, but they have not yet accustomed them to mastering more abstract interfaces. They also show the limits of gesture commands for more precise and risky requests. In Sleep Dealer, for example, Rudy controls a flying drone by hand, yet it is hard to imagine which gesture would mean dropping a bomb or activating the video stream.

Gesture detection therefore has a bright future in speeding up the workflow of otherwise silent and static interfaces, but designers and users may not adopt it for every purpose.

Acting by thinking

Matrix’s brain-machine interface

On more speculative horizons, science-fiction movies have also offered plenty of creative ideas for brain-machine interfaces.

In The Matrix, Neo has a machine plugged into his head that creates a mental simulation. In this virtual environment, he quickly learns the fighting skills that will help him escape from the Matrix. These interfaces trigger the motor and visual neurons of his cortex to build imaginary obstacles.

From the same perspective, Star Trek features multiple forms of mind control, whether to manipulate the minds or memories of others or to control objects. In Star Trek: Voyager, Paris is able to control an entire ship with his mind. In fact, of all the applications of mind-control interfaces, spatial-displacement features seem the most viable today.

Many current brain-machine interfaces, such as the Emotiv EPOC, use electroencephalography to help move objects by reading spatially localized neural activity. What still remains theoretical is an interface that can detect more subtle thoughts and intentions. More generally, these technologies remain little known to both science-fiction creators and designers; nevertheless, they have great potential to boost human-machine interaction.

Thoughtful Human-like interfaces

Dum-E, the assisting robot in Iron Man

Science-fiction filmmakers have also long explored the impact of human-like technologies on their users, and in particular the effect of anthropomorphism on our emotions.

For example, in The Matrix, Agent Smith is a simulated agent whose goal is to destroy Neo and the awakened humans. He looks in every way like a human, but with endless agility and cruelty. Because of his very human appearance, the character feels all the more threatening and unsettling.

We see here the uncanny valley at work, where an all-too-human appearance creates an unsettling, surreal feeling. By contrast, the Oracle, who predicts Neo’s destiny and guides him while presented as an ordinary woman baking cookies, seems disarmingly modest and far less intimidating.

In Iron Man, the problem is solved by giving the robots a more animal-like appearance, which inspires trust and sympathy. Tony Stark’s assistant robot, Dum-E, barely responds to his remarks, but obeys, gesturing with its arm and camera eye. Its appearance and posture are reminiscent of a dog obeying its owner, so it immediately attracts sympathy without being judged for its limited abilities.

In the same way, R2-D2 in Star Wars attracts far more sympathy than his golden robot counterpart. Because he speaks no human language and makes no anthropomorphic gestures, we like him all the more. Zoomorphism is thus one answer to the uncanny valley: it helps users accept robots more easily, without forming excessive expectations about them.

As you can see, far from being surreal inventions, science-fiction designs offer real lessons to tomorrow’s designers. They bring new visions of human-machine interfaces and get audiences used to futuristic inventions. They also raise the question of how these new technologies affect human psychology.
