30 UX Tips in Augmented Reality



After “50 tips for UX in virtual reality”, I have finally put together a set of tips for UX in augmented reality, concentrating on specific solutions that can make interaction with the digital layer easier and more convenient. These are primarily recommendations for building a comfortable experience in mobile AR, as the most widespread form of AR at the moment.

User comfort and safety since the start of the application


1. The user must know in advance whether they will need to stand, sit, or move around while using the application, so it is important to indicate this on the initial screens immediately after launch, or to offer a choice - for example, viewing an object in miniature on a table or at full scale placed on the floor.

2. It is important to verify the session length through testing, so that the user does not get tired of constantly holding the device up while aiming the camera. This is especially true for projects that frequently require changing the position of the phone in space.

3. Think about user safety. This seems like a fairly obvious recommendation, and it is now even a requirement for entering the game (we are talking about the various warnings that should be shown to the user when starting AR mode). But it is also important to consider safety when designing the content itself - for example, avoiding mechanics that push the user into sudden movements in real space.

4. As in VR (a comparison that will come up more than once below), it may be difficult for the user to start moving in space, so it is better to invite them to take a couple of steps toward a virtual object right at the onboarding stage.

And here it is important to avoid situations in which a person needs to step backward.

5. If possible, it is better to support both portrait and landscape modes so the user can hold the phone in whatever way suits them. In AR this is even more critical than in conventional mobile applications.

UI layout

As in virtual reality, the classification of interfaces used in computer games is quite relevant in AR - the separation of all interface elements into 4 types: diegetic, non-diegetic, spatial, and meta.

  • Diegetic elements are located inside the game world and are connected with it at the level of storytelling - for example, a map of the area in the protagonist's hand in a shooter (it is assumed that the protagonist and other characters perceive these elements just as the player does).
  • Non-diegetic elements are the classic HUD - elements drawn on the screen plane that exist outside the game world and are not perceived by its characters.
  • Spatial elements are placed in the three-dimensional space of the game world but are not part of its narrative - for example, selection outlines or navigation markers over objects.
  • Meta elements are tied to the screen plane rather than the game space, but are narratively connected with the game world - for example, the screen dimming and reddening when the protagonist takes damage.

As a rule, the fewer non-diegetic elements, the deeper the immersion. A classic example is Dead Space, where almost the entire interface is diegetic - the health indicator, for instance, is built right into the protagonist's suit.

6. It is usually recommended to place important controls that may be needed at any moment in the plane of the screen, so that the user does not have to change the position of the phone in space to reach them. For example, the button for exiting AR mode or the settings button.

But here context becomes important. Yes, in many services menu items can be detached from the AR layer for faster access, but in games, for example, menu items can be attached to physical planes - the floor or a wall - and even justified narratively, i.e. made part of the game's overall virtual environment (diegetic elements), which can contribute to deeper immersion in the virtual world.




7. Since all interaction with the augmented layer occurs through the frame of the screen, it is important not to clutter the user's view with a large number of interface elements in the screen plane when this is not necessary. For example, you can show certain menu items only on demand and, where possible, limit the number of buttons simultaneously visible on the screen during interaction with AR.

It’s better to place fixed elements along the edges of the screen, trying not to cover the central area. In some applications, subtle gradients are added at the top and bottom of the screen for better readability of these elements. The buttons themselves can be made translucent or simply not very bright.



AR Object Management Options

Object installation


8. From the very start of the application, the user must understand what is required of them to activate the augmented layer. It is important to give information step by step for each action - pointing at a marker or searching for a plane, placing an object, and so on. For greater clarity, the instructions can be animated.
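
On iOS, part of this onboarding comes out of the box: ARKit's ARCoachingOverlayView shows animated "move your phone" instructions until the stated goal is reached. A minimal sketch, assuming an existing ARSCNView called sceneView:

```swift
import ARKit

// Attach the built-in coaching overlay that animates instructions
// until a horizontal plane has been found
let coachingOverlay = ARCoachingOverlayView()
coachingOverlay.session = sceneView.session      // reuse the view's AR session
coachingOverlay.goal = .horizontalPlane          // guide the user to find a floor/table
coachingOverlay.activatesAutomatically = true    // reappears if tracking degrades
coachingOverlay.frame = sceneView.bounds
sceneView.addSubview(coachingOverlay)
```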

9. If the user needs to choose a place to install an object, you can either use an interactive marker showing where the object will be placed, positioned by moving the device, or show the object itself in an “installation” state, for example as a translucent silhouette. A great example of this is the app from Ikea.
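
A typical way to drive such a placement marker in ARKit is a raycast from the center of the screen on every frame. A sketch, where sceneView and reticleNode are assumed to exist already:

```swift
import ARKit

// Called every frame: keep the placement reticle glued to the detected surface
func updateReticle() {
    let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
    guard let query = sceneView.raycastQuery(from: center,
                                             allowing: .estimatedPlane,
                                             alignment: .horizontal),
          let result = sceneView.session.raycast(query).first else {
        reticleNode.isHidden = true   // no surface under the crosshair yet
        return
    }
    reticleNode.isHidden = false
    reticleNode.simdWorldTransform = result.worldTransform
}
```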


10. It is better to create an appearance animation for an object after installation, justifying its arrival in the real environment.
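
Even a simple scale-up with easing reads as the object “growing” out of the surface rather than popping into existence. A minimal SceneKit sketch:

```swift
import SceneKit

// Animate the freshly placed node growing out of the surface
func animateAppearance(of node: SCNNode) {
    node.scale = SCNVector3(0.01, 0.01, 0.01)   // start almost invisible
    let grow = SCNAction.scale(to: 1.0, duration: 0.3)
    grow.timingMode = .easeOut
    node.runAction(grow)
}
```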

11. If you are installing not a single object but a whole group of objects arranged as a scene (for example, a city consisting of a set of interactive buildings), add a separate button for repositioning the entire scene in space. You can see how this is implemented in Alice in Wonderland AR quest.

12. Sometimes objects of the augmented layer can be placed without any user action, immediately after a surface is detected - for example, when building a virtual environment: a forest with grass appears wherever a floor plane is found, three-dimensional patterns appear on detected walls, and so on.
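
In ARKit this maps naturally onto plane anchors: a delegate method fires whenever a new plane is detected, and content can be attached to it immediately. A sketch, with makeGrass() standing in for your own content factory:

```swift
import ARKit

extension ViewController: ARSCNViewDelegate {
    // Called automatically whenever ARKit detects a new plane
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor,
              plane.alignment == .horizontal else { return }
        node.addChildNode(makeGrass())   // hypothetical factory for the scene content
    }
}
```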

Object selection

There are several basic ways to select objects in AR:

  1. Through the touchscreen (Touch-based)
  2. Through the position of the device itself (Devices-based)
  3. Through hand gestures (Gestures-based / Air-tap).

The last option is mostly relevant for AR headsets, so below we will focus on the first two, as the most applicable to mobile AR.

13. Typically an object or interface element is selected for further interaction with a tap, and as a default this works well. Difficulties arise when there are many interactive objects on the screen and they begin to overlap each other, which can lead to false selections. Here you can additionally use the position of the mobile device in space (a sketch of the first option follows this list):

  • Splitting the screen into areas where the central one is the “focus” zone, into which active objects can fall to change state - for example, to active (highlighted) for interaction.
  • Casting a ray from the device to point at objects, similar to aiming with 3DoF and 6DoF controllers in VR.
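
A minimal version of the “focus zone” approach: project each interactive node into screen coordinates every frame and highlight the ones that fall into a central rectangle. interactiveNodes is an assumed list of the scene's interactive objects:

```swift
import ARKit

// Highlight interactive nodes that fall into a central "focus" rectangle
func updateFocusZone() {
    let focusRect = CGRect(x: sceneView.bounds.midX - 60,
                           y: sceneView.bounds.midY - 60,
                           width: 120, height: 120)
    for node in interactiveNodes {
        let projected = sceneView.projectPoint(node.worldPosition)
        let onScreen = projected.z > 0 && projected.z < 1   // in front of the camera
        let point = CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
        node.opacity = (onScreen && focusRect.contains(point)) ? 1.0 : 0.6
    }
}
```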

14. Taking into account the distance between the user and objects in the augmented layer deserves a separate item. As with VR interfaces, the space around the user can be divided into a set of ranges, for example:

- Close - up to 30 cm
- Personal - within arm's reach, up to 1 m
- Social - up to 3 m
- Public - beyond 3 m



Based on this, you can build both the interface itself around the user and a system for changing the state of interactive elements and the available interaction options depending on the user's distance from them.

For example, within a radius of 2 meters interactive objects display prompts with their names, and when the user approaches an object to arm's length it becomes highlighted, indicating that it can be picked up. This way the amount of excess information in the frame can be reduced to what is most relevant for the current position in space.
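
A sketch of such distance-driven states; InteractiveObject and its state property are assumptions, and the thresholds follow the example above:

```swift
import SceneKit
import simd

enum ProximityState { case idle, labeled, grabbable }

// Update each object's state from its distance to the camera
func updateProximityStates(cameraPosition: simd_float3, objects: [InteractiveObject]) {
    for object in objects {
        let distance = simd_distance(object.node.simdWorldPosition, cameraPosition)
        switch distance {
        case ..<1.0:  object.state = .grabbable   // arm's length: highlight, can be taken
        case ..<2.0:  object.state = .labeled     // show the name prompt
        default:      object.state = .idle        // too far: keep the frame clean
        }
    }
}
```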

15. The fact that an object is selected (for example, after a tap on it) can be confirmed not only with an outline or highlight but also - if selection implies editing the object's position - by raising it above the floor plane. This is a fairly clear signal that the object can now be moved in space, since it is visibly no longer attached to the real surface. A similar option is used in the app from Ikea mentioned above.
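
The lift itself can be a one-line SceneKit action; 5 cm is an arbitrary value that reads well at room scale, and selectedNode is assumed to be the node the user just tapped:

```swift
import SceneKit

// Raise the selected node slightly above the floor to signal "movable"
let lift = SCNAction.moveBy(x: 0, y: 0.05, z: 0, duration: 0.15)
lift.timingMode = .easeOut
selectedNode.runAction(lift)
```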

16. If the user loses sight of an object in the selected state (the object goes beyond the border of the device's screen), classic pointers at the edges of the screen work well. They can appear after a short pause following the loss of the object; the pause protects against false alarms.



The same pointer technique also applies to important scenario objects that the user is following.
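
Detecting the “lost object” moment is straightforward with a frustum test; a sketch, where lostSince and edgeArrow are assumed state:

```swift
import ARKit

var lostSince: Date?   // when the selected object left the frame

// Call once per frame to drive the edge-of-screen pointer
func updateOffscreenPointer(for node: SCNNode) {
    guard let pov = sceneView.pointOfView else { return }
    if sceneView.isNode(node, insideFrustumOf: pov) {
        lostSince = nil
        edgeArrow.isHidden = true
        return
    }
    if lostSince == nil { lostSince = Date() }
    // Short delay before showing the arrow, to avoid false alarms
    if Date().timeIntervalSince(lostSince!) > 0.5 {
        edgeArrow.isHidden = false
        // point the arrow toward the node's projected position, clamped to the screen edge
    }
}
```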

Control the position of objects in space


17. Objects that should stand on the floor can be initially restricted from vertical movement, reducing their movement in space to the floor plane - a simple drag with a finger across the screen. Intuitive and simple.

Vertical movement in this scheme can be implemented through multi-touch, or placement can be split into several steps. This will require a more detailed onboarding because of the greater complexity.
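
A sketch of floor-constrained dragging via ARKit raycasts: the ray is only allowed to hit horizontal planes, so the object can never leave the floor. sceneView and selectedNode are assumptions:

```swift
import ARKit

@objc func handlePan(_ gesture: UIPanGestureRecognizer) {
    guard let node = selectedNode else { return }   // currently selected object
    let location = gesture.location(in: sceneView)
    // Restrict the raycast to horizontal planes: vertical position stays locked
    guard let query = sceneView.raycastQuery(from: location,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .horizontal),
          let result = sceneView.session.raycast(query).first else { return }
    let t = result.worldTransform.columns.3
    node.simdWorldPosition = simd_float3(t.x, t.y, t.z)
}
```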

18. If more precise and complex manipulations with objects are needed, you can use gizmos familiar from 3D graphics editors and game engines.



19. Although less common, classic on-screen buttons for manipulating content are also used, located in the plane of the screen.

20. Closer to game scenarios is the use of the mobile device itself to move a virtual object through real space. In this case the object, attached to the camera, moves with the device until the user installs (detaches) it via a tap on the screen or a dedicated button.
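
In SceneKit terms this is just reparenting: attach the node to the camera's point of view, then return it to the world on tap while preserving its transform. A sketch:

```swift
import ARKit

// Pick up: parent the node to the camera so it follows the device
func attachToCamera(_ node: SCNNode) {
    node.removeFromParentNode()
    sceneView.pointOfView?.addChildNode(node)
    node.position = SCNVector3(0, 0, -1)   // hold it 1 m in front of the camera
}

// Drop: reparent back into the world, keeping the current world transform
func detachFromCamera(_ node: SCNNode) {
    let worldTransform = node.worldTransform
    node.removeFromParentNode()
    sceneView.scene.rootNode.addChildNode(node)
    node.transform = worldTransform
}
```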

21. Dividing space into a set of ranges can also be used to restrict where objects may be placed.

For example, you can limit the placement of a virtual object to a range from 30 centimeters to 3 meters, which prevents placing the object right at the user's feet or so far away that interacting with it becomes difficult.
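
The restriction itself is a simple clamp of the placement point along the camera-to-object direction; a sketch using the 0.3-3 m range from the example:

```swift
import simd

// Keep a placement point within 0.3–3 m of the camera
func clampPlacement(_ point: simd_float3, cameraPosition: simd_float3) -> simd_float3 {
    let offset = point - cameraPosition
    let distance = simd_length(offset)
    guard distance > 0 else { return point }
    let clamped = min(max(distance, 0.3), 3.0)
    return cameraPosition + offset / distance * clamped
}
```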

Integration into reality


22. For better integration of the digital layer into real space, it is recommended to make use of lighting estimation for the camera frame, as well as environmental reflections on virtual objects.

And if real lighting cannot be taken into account, at least add a fake shadow to objects. Even simple shadows greatly enhance the effect of a virtual object being integrated into reality.
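
ARKit provides a per-frame light estimate that can drive the scene's lighting (ARSCNView can also do this automatically via automaticallyUpdatesLighting). A manual sketch, with ambientLight as an assumed SCNLight in the scene:

```swift
import ARKit

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let estimate = frame.lightEstimate else { return }
    // Match virtual lighting to the camera's estimate of the real scene
    ambientLight.intensity = estimate.ambientIntensity            // lumens; ~1000 is neutral
    ambientLight.temperature = estimate.ambientColorTemperature   // degrees Kelvin
}
```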


23. Object occlusion is a very powerful tool for more seamless integration of AR into the real world, and it should be used if the platform on which the project is implemented allows it. We are talking about accounting for real objects that can overlap the augmented layer - currently one of the most important and active areas of development in augmented reality technology. The ability to occlude digital objects will keep improving and become more accessible.
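
On iOS, people occlusion is a one-line opt-in when the hardware supports it; a sketch:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
// Let real people in the frame occlude virtual content, if the device supports it
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
sceneView.session.run(configuration)
```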


By the way, when an object is occluded, you can use additional pointers to indicate that the virtual object is behind a real obstacle.

24. Given that the application can be used at different lighting levels and surrounded by different materials, text hints are best placed on backing plates for better readability. If such plates do not fit the overall concept, soft drop shadows can be used for the text.

25. If a virtual object is not standing on a plane but hanging in the air, its spatial position can be reinforced both with a shadow and with various visual indicators running from the object down to the floor plane - this can be relevant for intangible objects such as holograms.



26. As with VR, the virtual environment in the augmented layer is new to many users, which means it is better to reinforce all actions with interactive objects and interface elements using every available tool, including sound effects and phone vibration. Use the full range of possible tools to focus the user's attention and enhance the sense that the digital layer is real.

But be careful with vibration - Google, for example, generally does not recommend relying on it much.
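
If you do use haptics on iOS, the standard tool is a feedback generator; a light tap on selection or placement is usually enough. A sketch:

```swift
import UIKit

// A light haptic tap when the user selects or places an object
let feedback = UIImpactFeedbackGenerator(style: .light)
feedback.prepare()          // spin up the Taptic Engine to minimize latency
feedback.impactOccurred()
```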

27. Almost a copy-paste from the UX tips for VR: the user can move freely around their real room and will, with high probability, intersect virtual objects (in this case, with the phone). One way to handle this situation is to remove the “materiality” of the intersected geometry so that it does not look solid, and to stylize the moment of intersection itself so that the clipped geometry does not read as a bug:


You can also optionally stylize this by showing the internal structure of the object - not necessarily realistically; it can be exaggerated and comical if the setting allows.
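
A simple version of removing “materiality”: fade an object out as the camera gets close to it. intersectableNodes is an assumed list of objects the user can walk through:

```swift
import ARKit

// Fade objects out as the phone approaches, so clipping does not read as a bug
func fadeGeometryNearCamera() {
    guard let cameraPosition = sceneView.pointOfView?.simdWorldPosition else { return }
    for node in intersectableNodes {
        let distance = simd_distance(node.simdWorldPosition, cameraPosition)
        node.opacity = distance < 0.4 ? CGFloat(max(0.2, Double(distance) / 0.4)) : 1.0
    }
}
```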

28. When it comes to organic integration into the real world, things get much more interesting with large-scale objects.
If it fits the general concept of the project, it is best to first show a reduced copy of the object and then let the user enlarge it to real scale. This way the user controls the moment a massive object enters the frame.
Alternatively, the user can be offered a choice between interacting with objects at a reduced scale or at 1:1, as is done in the game AR Sports Basketball.

But all this does not solve the problem of fitting large objects into real space - because of their size they may simply not fit into the room the user is in. In such cases, portals can be used.


It can be a portal that you can enter in the most literal sense, as in the example above, or various massive screens placed on walls that act as windows into another reality.

29. Let's return to the topic of interface types and discuss the use of meta elements. In computer games these elements are directly related to the narrative of the game world, but spatially they are tied not to it but to the screen plane. Unlike ordinary (non-diegetic) elements, they can enhance the effect of immersion in the virtual environment. The clearest example is the dimming and reddening of the screen when the protagonist takes damage, reflecting the hero's internal state. Similar techniques can be used outside games as well, reinforcing the connection between the real and the virtual.

30. You can go further and recall the fourth wall. In AR, unlike VR, we still look at the digital layer through the frame of the phone's screen, but our spatial position is taken into account and the virtual environment is integrated into the real one. So breaking the fourth wall is not just possible but should be used: virtual characters addressing the user directly, bots reacting to the user's proximity, various simulated phone glitches, a narrative justification for the digital layer appearing in the real world, taking real-world time into account, and so on. All this can create a deeper experience, further blurring the border between the real and virtual worlds (a few examples from computer games: 1, 2).
