
How the Augmented Reality interface will look by 2025

I'll hardly guess anything right, but I'll try anyway.

This post is a fantasy nobody asked me for, about how the operating system interface will look in the augmented reality of 2025. The material turned out long and very subjective. Maybe I'll even guess something right, but most likely not.

Current state of affairs

We will not be looking at our phone screens any more often than we already do. The upcoming updates of iOS and Android add a feature I would call "please give me back control over the quality of my smartphone life": small OS tweaks that ask you to take a break from the screen, or offer special modes in which no push notifications arrive and the screen is not as colorful and attention-grabbing.

This will obviously mean that the time spent on Instagram or any other service drops slightly. To help with this, operating systems are also adding statistics sections, so that you can draw some conclusions and limit yourself in some way.

It's understandable: push notifications now arrive not just for important reasons but for literally anything, whether it's come back to the game, write a post, check on a friend, finish the rap battle, update the OS, or look at some tweets.

At the same time, the latest version of ARKit, the platform for working with augmented reality, causes no complaints, and neither does its Android counterpart. Virtual objects sit perfectly in space, and you can interact with them (for now through the screen). Moreover, advances in machine learning let you not only snap onto a surface or a wall, but also recognize an image and interact with it, and all of this is packed into a convenient developer kit. It practically begs you to go and build an app.

An example of combining augmented reality with object recognition (there are many of these under #arkit2 on Twitter)
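To make that "convenient developer kit" claim concrete, here is a minimal sketch (not from the original post) of an ARKit session configured for both surface detection and image recognition. The view controller, the "ARResources" asset group name and the print statement are placeholders of my own.

```swift
import ARKit
import SceneKit
import UIKit

final class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Track the world, look for horizontal/vertical surfaces
        // and for reference images bundled in the asset catalog.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        config.detectionImages = ARReferenceImage.referenceImages(
            inGroupNamed: "ARResources", bundle: .main) ?? []
        sceneView.session.run(config)
    }

    // Called whenever ARKit anchors something in the real world.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        if anchor is ARPlaneAnchor {
            // A table, floor or wall was found: content can be pinned here.
        } else if let imageAnchor = anchor as? ARImageAnchor {
            // A known picture was recognized: attach interactive content to it.
            print("Recognized image:", imageAnchor.referenceImage.name ?? "unnamed")
        }
    }
}
```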

Educational materials, in particular, work very well in AR:

School project with the same technology, source

The only really bad thing about AR right now is that nobody likes peering at a screen while waving the phone around in space. It's simply inconvenient, not to mention that it looks silly.

 

This infographic, which I'm proud of, makes it clear that current augmented reality is 90% uninteresting, since the phone screen itself is small

In general, given how good augmented reality already is on the latest smartphones (close to ideal), the market is getting closer and closer to AR glasses. Let's try to imagine what they could be like and, most importantly, what interfaces you would use with them.

Welcome to the future: AR-first 2025

Typical person from the future ☝︎

It is 2025, and AR glasses are a common thing for geeks and for workers in a few narrow fields, and of course they look like ****** (bad). People don't really go outside in them; they are still fairly bulky and still quite uncomfortable. And nobody has really standardized application interfaces for augmented reality, because there is no single marketplace: the large players react to new things slowly, waiting for the most successful solutions in this market and for demand to take shape.

That is how it went with the emergence of the mobile device market about 10 years ago, when the iPhone appeared in place of the PDA with a stylus. It seems to me it will be the same this time: a large player will come to the AR market with an application store and introduce a standard set of guidelines, which the market will then build on.

Since this is my fantasy, I can write any nonsense here. So, in 2025 some Apple-like company presents glasses that look like ordinary glasses. They are elegant enough that you can go outside in them. And there are already a number of concepts for controlling them.

Actually, I wanted to talk specifically about concepts for controlling AR information: about the most comfortable AR mechanics and about possible future guidelines, not about the ones we are all shown in the movies.

Like these, just ordinary glasses

Controlling with your hands

The first thing that particularly annoys me when movies show augmented reality is a user standing and waving their hands around, flinging windows and buttons across the room. It is anything but convenient.

Of course, you can do this, especially if you want to show something to other people wearing glasses, or for some short interaction with an object. For example, tap on a coffee machine and choose an espresso.

But in ordinary life:

An example of an AR interface from the movie Minority Report. No special gloves are needed, of course

Out of curiosity, I paused and poked at the wall with my hand for several minutes, fantasizing about interfaces. The only justification for this approach over a long interaction is when you are giving a presentation in augmented reality. Otherwise, your hands get tired pretty quickly.

A concept for controlling a smart home through HoloLens. It's easier to flip the switch than to reach for the button

Also, at the end of July Leap Motion showed their version of an interface in augmented reality. For now it is a depressing spectacle.

Imagine having to type text here. In this version, the voice input button is not even in the tap zone
My hand gets tired just from the length of this GIF

Conclusion

The basis for these interfaces should be the natural things we have been using in everyday life for thousands of years: a table, a wall, a palm (more on that later). Hand movements should be familiar and no worse than the ones we use on phones. Interaction with objects in space remains, of course, but it should be brief.

Elements of the interface in space

The second thing UI/UX designers have not yet fully figured out is, of course, the usable space, that is, how the augmented reality interface is laid out.

An example of an entertaining, but not the best, AR interface

The interface should be simple, understandable and informative. Especially one that can directly affect your life.

I think the operating systems of the glasses simply will not have an API that lets anything cover the first-person view with some kind of pop-up without the user's initiative (an interesting idea for viruses, though).

Such a mechanic could simply kill: in a car, on a bicycle, on a scooter or on foot, and user safety is a priority for any IT company. With that in mind, we can imagine that a certain "zoning", for example, maps well onto the interface.

AR space zones

This ugly mockup tries to convey the spirit of how the user's working area could be divided up. For example, it is clear that you need a small area for system information: a clock, battery status, a mini-map and the like.

A small zone will probably be allocated for important notification-type information, but there will likely be some limit on the amount of text in it, so as not to distract the person. Ideally it would do without text at all, or these "push 2.0" notifications would be thrown out entirely.

This spot would also be good for alerts about emergencies, accidents and the like: where better to write that you are, say, in a radioactive zone? The main thing is not to let games or media services into this zone.

The main working area for interacting with applications is at the bottom, on the surfaces in front of the user:

  • First, we almost always look at our feet anyway, so it's easy enough to glance down;
  • Second, we have been using tables and similar objects for thousands of years. It would be strange not to keep using them.

This ties in with the point about hands getting tired if you hold them in the air for long. In this setup, some files can hang in the air while others are scattered in a pile on the table.
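As a rough illustration of "scattered on the table", here is how today's phone-based ARKit could pin a piece of content to a detected horizontal surface with a raycast. I'm using it only as a stand-in for whatever gaze- or gesture-driven query future glasses expose; the function name and the card dimensions are my own placeholders.

```swift
import ARKit
import SceneKit
import UIKit

// Drop a simple "document card" onto the nearest detected horizontal
// surface (a table or the floor) under a given screen point.
func placeCard(in sceneView: ARSCNView, at screenPoint: CGPoint) {
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .horizontal),
          let hit = sceneView.session.raycast(query).first else { return }

    // A flat 10 x 15 cm plane standing in for an application window or a file.
    let card = SCNNode(geometry: SCNPlane(width: 0.10, height: 0.15))
    card.geometry?.firstMaterial?.diffuse.contents = UIColor.white
    card.simdTransform = hit.worldTransform  // place it at the hit point
    card.eulerAngles.x = -.pi / 2            // lay the plane flat on the surface
    sceneView.scene.rootNode.addChildNode(card)
}
```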

The most important thing is that the zone right in front of the user should be off-limits, by convention, to any mechanic the user has not invoked. By default nothing can intrude into this zone; to appear there, an application has to be launched by the user.

Of course, you can consciously place something there, or block out the image of the real world entirely by launching a game, and that way you get a normal VR headset. Not everyone realizes it, I think, but the ideal AR headset is, by default, also a VR headset.
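Nothing like this zoning exists in any shipping OS, so purely as a thought experiment, here is a tiny sketch of how such a policy could be expressed; the zone and content names are entirely made up.

```swift
// A hypothetical model of the "zoning" idea: what the OS would let into
// each zone without an explicit action from the user.
enum ARZone { case statusStrip, alertBar, workSurface, centralView }
enum ARContent { case systemInfo, emergencyAlert, application, game, advertisement }

func allowedWithoutUserAction(_ content: ARContent, in zone: ARZone) -> Bool {
    switch zone {
    case .statusStrip: return content == .systemInfo      // clock, battery, mini-map
    case .alertBar:    return content == .emergencyAlert  // no games or media services here
    case .workSurface: return false                       // apps appear only when launched
    case .centralView: return false                       // never covered without user intent
    }
}
```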

An example of informative navigation in AR. The source code is available here

That said, the ban on using the central zone should not extend to objects you want to interact with or, alas, to advertising. Most likely, such objects will get high display priority: something like "your car is here", "your taxi is here", "you can buy a transit ticket here" and "the best burgers in town, no registration or SMS, right here".

Separately, I am amused by the thought that in theory you could add your home coffee maker to your favorites and always see it, even through walls, when you are at home. Very convenient.

Conclusion

The AR operating system's interface should be simple, informative, and it should not block the view from the user's camera (head?) without permission.

This is how the working prototype from Leap Motion looks now

The most native way to control AR

It is safe to assume that the most convenient way to control augmented reality will be something that is always with you and that can be summoned quickly, exactly when you need it. It seems to me that the palm is a good direction here.

An example of a gesture for calling up the main menu, prototype

First, if both your hands are in place, your palm is always with you; besides being convenient, that is a nice bonus in itself. Second, hundreds of thousands of applications that are already written and already sitting in the app stores fit this format. Building a platform with no applications and developing it from scratch is expensive and difficult, so supporting old applications in this format could be a good idea for giants like Apple or Google.


That’s what the catalog of handsome men would look like in AR

Another plus of this approach: all the gestures people use today to interact with a screen map naturally onto this kind of AR UX. Nobody has to relearn anything; you just add a few gestures or tools for moving launched applications off the palm onto a nearby surface, or simply into the space in front of the user.

The gesture palette can be expanded: flick a photo off your hand toward a friend and they will receive a request to accept it; squeeze your hand into a fist three times and you take a screenshot of the reality around you.
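The "fist" part of this is already within reach of today's tools. Below is a rough sketch using Apple's Vision hand pose detection to tell a clenched fist from an open palm in a single camera frame; the confidence cut-offs and the distance threshold are assumptions of mine, and counting three squeezes in a row is left out.

```swift
import Vision
import CoreVideo
import CoreGraphics

// Returns true if the single detected hand in this camera frame looks like
// a clenched fist (fingertips pulled in close to the wrist).
func isFist(in pixelBuffer: CVPixelBuffer) -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])

    guard let hand = request.results?.first,
          let points = try? hand.recognizedPoints(.all),
          let wrist = points[.wrist], wrist.confidence > 0.3 else { return false }

    // Distances from each confidently detected fingertip to the wrist,
    // in normalized image coordinates.
    let tips: [VNHumanHandPoseObservation.JointName] = [.thumbTip, .indexTip, .middleTip, .ringTip, .littleTip]
    let distances = tips.compactMap { points[$0] }
        .filter { $0.confidence > 0.3 }
        .map { hypot($0.location.x - wrist.location.x, $0.location.y - wrist.location.y) }
    guard !distances.isEmpty else { return false }

    // 0.1 is an arbitrary "fingers curled in" threshold.
    return distances.reduce(0, +) / CGFloat(distances.count) < 0.1
}
```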

All these gestures are a fun thing to fantasize about. I have already tried some of them out on the street, like a madman. You would have enjoyed it: both how I applied them and how they would actually work.

I sincerely believe that the most successful and painless transition into the AR universe would happen through projections onto the palm. If you know someone who builds these interfaces and headsets, share this idea with them. I think we should stop dragging web and mobile interfaces into a completely new paradigm of the virtual world.

This may seem difficult, because you have to distinguish hands and fingers and track their movements. But these are already solved problems today, to say nothing of 2024. Here is an example from the North Star project by Leap Motion, which I wrote about above. North Star is open source, which means you can build the headset yourself right now.


An example of what you can already have
Such a mechanic could even supplant watches and other beloved, already existing wearable interfaces.

A nice example of an AR watch from Dribbble

Why do we need anything on our palms if there is a voice interface?

I am convinced that despite the development of voice interfaces, the need for graphical interfaces will not disappear. There are situations, and even whole categories of people, where saying out loud what you want is uncomfortable. I belong to that category myself: every time I talk through a headset in a public place, I feel stupid. Naturally, people like me would not whisper "launch Instagram" into their palm; for us there will be good old buttons and quick app search.

And there are a number of tasks that voice and dialog interfaces simply cannot solve. I have already written about this in detail, so I will not repeat myself.

We obviously need some kind of keyboard, something in between the Swype approach and the "finger in the air" suggested by Leap Motion. It seems to me that the way input works on phones today fits well again: you hold something in your hands, and that something types the text.

NEC thought in 2015 that this could be convenient. Obviously not: input has to be possible with both hands

I will not even try to show what a keyboard might look like in AR. It takes constant work with a prototype to figure out what is comfortable there; it is already clear that dragging a finger through the air is a so-so idea.

Advertising Opportunities

Shared AR, which Apple showed at its latest WWDC, is the thing that will undoubtedly change how we exchange information: one user can see another user's virtual world.

Video from the WWDC broadcast with an application from Lego
Shared AR is such a basic thing that I will not describe it separately. I brought it up in order to fantasize about the personalized advertising opportunities that await us all.

A beautiful world of the future, a view from 2010

In 2017, a Wisconsin court ruled that AR technology is protected by the First Amendment. This means the space of augmented reality is free and accessible to everyone, as long as the information placed in it is not false, criminal or slanderous. Otherwise it will be removed, and the author will be punished.

To paraphrase the district judge's ruling: even if you own a property, that does not mean its virtual surface belongs to you. It is a bit like GPS: when you buy a house, you do not own the signals the satellites broadcast over it.

The same author as above, the 2016 movie

There is an interesting book on this subject: Augmented Reality Law, Privacy, and Ethics. The author concludes that any surface, the buildings and streets of the real world, can lawfully be used for commercial purposes by advertising companies.

Before you get indignant and say "oh, come on, enough already", I would like to describe two things that will be available to us as users.

1. Reducing the cost of transportation

Imagine that all the trains, planes and taxis now constantly play advertising at you in their windows. You take off your glasses, thinking "enough already", but it turns out the advertising is baked right into the windows, which are themselves displays.

There will probably be an upside to such advertising: the cost of a trip should drop sharply, since nobody will agree to put up with it for nothing. Imagine an almost free autonomous taxi; you would hardly refuse to pay for it with your attention.

This idea was told to me by a venture capitalist who had talked to another venture capitalist in London, at a meeting of venture capitalists, and so on. Most likely it will reach reality someday. It will be all the more insulting if the price of the service does not go down.

A curious Toyota concept with display windows inside the car and a girl smudging them, a screenshot from the video

2. AR-adblock

Nobody will ever be able to take AdBlock away from you. The most advanced users will always wall themselves off from annoying factors, whether that is advertising or certain people.

The obvious future of the owner of AR-glasses with adblock

The concept of ignoring people was best shown in the series Black Mirror, and AR adblock concepts have existed since 2015.
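As a toy version of such an adblock, here is how today's ARKit image detection could paper over a known poster once it is recognized. The delegate class and the plain gray cover are placeholders I made up, and a real adblocker would of course need a far more general way to recognize ads than a fixed set of reference images.

```swift
import ARKit
import SceneKit
import UIKit

final class AdBlockDelegate: NSObject, ARSCNViewDelegate {
    // When a known advertisement image is recognized, cover it with a gray plane
    // of exactly the same physical size, tracked in space by ARKit.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let size = imageAnchor.referenceImage.physicalSize

        let cover = SCNPlane(width: size.width, height: size.height)
        cover.firstMaterial?.diffuse.contents = UIColor.gray

        let coverNode = SCNNode(geometry: cover)
        coverNode.eulerAngles.x = -.pi / 2   // lie flat over the detected image
        node.addChildNode(coverNode)         // the anchor node already follows the poster
    }
}
```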


A nice bonus of the headset: real-world advertising can be covered up too
The bottom line
2025 is simply awesome; I wish it would get here sooner. The main conclusion is that anyone designing AR interfaces should take into account how we have been handling objects since birth, and not hang huge menus in the air.

Besides the convenience and beauty of the AR-native applications that await us, it seems to me that web technologies will stay relevant too: just as the web once started supporting mobile interfaces (mobile-friendly is now the standard in web development), over time it will start supporting AR glasses as well, laying out buttons in a form convenient for augmented reality.

 

 
