This workshop will show you how to:

Final Result

To complete this workshop you will need:

Languages used:

Additional resources

In addition to overlaying data and models onto the physical environment, Augmented Reality can be used to materialise immersive digital places, overlapping realities that can be explored by walking through them. One possible way to maintain the illusion of a transition between physical and digital places is to use portals.

Conceptually, using a portal in Augmented Reality allows us to visualise just a small part of the digital environment (e.g. through a door frame or a hole in the wall), while the environment is fully revealed once the user crosses the threshold of the portal itself. To achieve this illusion we can hide the digital model behind an invisible layer.

Room Environment

Create a new Unity project using the AR Core template. Switch the build platform to Android.

Import the Room environment into Unity. There are two ways to import: click and drag your FBX file into a folder in the Project panel, or go to Assets -> Import New Asset.
To be able to edit the materials, we need to extract them from the FBX file. In the Inspector window, under the Materials tab, choose Use External Materials (Legacy) from the Location dropdown menu. Unity will extract the materials and texture images into the folder of the FBX model.

Room model

DepthMask Shader

A quick and effective way to visually hide a digital model is to use a Shader that, like an invisibility cloak, lets us see behind the model without revealing the model itself.

Shader DepthMask
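
For reference, a minimal version of such a depth-mask shader might look like the sketch below (written for the Built-in Render Pipeline; the shader name is a placeholder). It writes to the depth buffer but draws no colour, so geometry rendered behind the mask is culled while the camera feed still shows through:

```shaderlab
Shader "Custom/DepthMask"
{
    SubShader
    {
        // Render before the regular geometry queue (Geometry = 2000),
        // so the mask fills the depth buffer first.
        Tags { "Queue" = "Geometry-10" }

        // Write depth but no colour: the panel itself stays invisible,
        // while anything rendered behind it fails the depth test.
        ColorMask 0
        ZWrite On

        Pass { }
    }
}
```

Assign a material using this shader to the panels surrounding the portal opening, and the room model behind them disappears until the user steps through.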

Occlusion panels

Using real-time lighting in Augmented Reality with large models can negatively impact the overall performance of the application, especially on older devices. Using the lightmapping technique, it is possible to store lighting information in a lightmap texture. Lightmaps in Unity can only be generated for static GameObjects; in Augmented Reality, however, GameObjects need to be moved around the scene and instantiated at runtime. To overcome these limits it is possible to use the Prefab Lightmapping script, which allows us to save the lightmap information to the prefab itself.
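
A simplified sketch of that idea is shown below (the full Prefab Lightmapping script also re-registers the baked lightmap textures in LightmapSettings.lightmaps; component and field names here are illustrative). Each renderer's lightmap index and UV scale/offset are stored on the prefab and reapplied when it is instantiated:

```csharp
using UnityEngine;

// Simplified sketch of the Prefab Lightmapping idea: store each renderer's
// lightmap index and UV scale/offset on the prefab at bake time, then
// reapply them when the prefab is instantiated at runtime.
public class PrefabLightmapData : MonoBehaviour
{
    [System.Serializable]
    public struct RendererInfo
    {
        public Renderer renderer;           // renderer inside the prefab
        public int lightmapIndex;           // which baked lightmap it uses
        public Vector4 lightmapScaleOffset; // its UV region in that lightmap
    }

    public RendererInfo[] rendererInfo;     // filled in when baking

    void Awake()
    {
        if (rendererInfo == null) return;
        foreach (var info in rendererInfo)
        {
            info.renderer.lightmapIndex = info.lightmapIndex;
            info.renderer.lightmapScaleOffset = info.lightmapScaleOffset;
        }
    }
}
```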

Texel scale

We can now add some lights to the scene.

Lights location

Lightmap result

The Prefab can be removed from the scene; it will keep its lightmaps once instantiated in the AR app (a sketch of runtime instantiation follows below). Build the app to test it.
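
A sketch of instantiating the room at runtime with AR Foundation (names such as RoomPlacer and roomPrefab are placeholders for your own assets):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Places the lightmapped room prefab where the user taps a detected plane.
[RequireComponent(typeof(ARRaycastManager))]
public class RoomPlacer : MonoBehaviour
{
    public GameObject roomPrefab; // the prefab carrying the baked lightmap data

    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // Place the room at the hit pose; the PrefabLightmapData component
            // on the prefab reapplies the baked lighting in Awake.
            Instantiate(roomPrefab, hits[0].pose.position, hits[0].pose.rotation);
        }
    }
}
```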

Depending on the model used, it might be necessary to rotate the model so that the door faces the user. It is usually better to rotate the object within the RoomContainer.

It is possible to mix baked and real-time lighting in the same scene.

Build the app to test the scene

Lightmap result

Unity can be used to access multiple sensors on the mobile device. The GPS location can be useful to trigger AR experiences only when the user is in a specific place. Newer versions of Android and iOS require specific permissions from the user to access the various sensors. In addition, Unity provides latitude and longitude as a float, instead of the more accurate double provided by the mobile device. To overcome this problem, it is possible to leverage a native platform plugin (Android or iOS).
The Native GPS Plugin (iOS/Android) asset provides such native plugins out-of-the-box.
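
For context, here is a minimal sketch using Unity's built-in LocationService, which shows both the runtime permission request and the float-precision limitation that the native plugin works around:

```csharp
using System.Collections;
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Reads the device location with Unity's built-in LocationService.
public class LocationReader : MonoBehaviour
{
    IEnumerator Start()
    {
#if UNITY_ANDROID
        // Newer Android versions require the fine-location runtime permission.
        if (!Permission.HasUserAuthorizedPermission(Permission.FineLocation))
            Permission.RequestUserPermission(Permission.FineLocation);
#endif
        if (!Input.location.isEnabledByUser)
            yield break; // location services disabled by the user

        Input.location.Start();
        while (Input.location.status == LocationServiceStatus.Initializing)
            yield return new WaitForSeconds(1f);

        if (Input.location.status == LocationServiceStatus.Running)
        {
            // Note: latitude/longitude are floats here, hence the precision
            // loss that the native plugin avoids by returning doubles.
            float lat = Input.location.lastData.latitude;
            float lon = Input.location.lastData.longitude;
            Debug.Log($"Lat: {lat}, Lon: {lon}");
        }
    }
}
```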

The script NativeGPSUI, provided by the asset, can be used to test the GPS location. As we are using a TextMeshPro GameObject, we need to change the references in the script from Text to TextMeshPro and add the using TMPro; directive at the top of the script. You may want to comment out the Speed value of the GPS to keep the text inside the panel.
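
The change itself is small. A hedged sketch of the modification follows (the real NativeGPSUI script ships with the asset and its member names may differ; the point is the Text -> TMP_Text swap and the commented-out speed read-out):

```csharp
using UnityEngine;
using TMPro; // added so the script can reference TextMeshPro components

// Illustrative sketch only; member names are placeholders.
public class NativeGPSUIModified : MonoBehaviour
{
    public TMP_Text locationText; // was: public Text locationText;

    public void ShowLocation(double latitude, double longitude, double speed)
    {
        locationText.text = "Latitude: " + latitude + "\nLongitude: " + longitude;
        // locationText.text += "\nSpeed: " + speed; // commented out to keep the text inside the panel
    }
}
```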

Build the app to test the scene

GPS Location

Valve's Steam Audio provides a free audio spatialiser that can be used to create immersive experiences in VR and AR environments. Compared to other add-ons, Steam Audio offers some interesting solutions for creating credible sound occlusion and reflection.
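
As a minimal example of wiring a sound into the spatialiser (assuming the Steam Audio Unity integration is installed and Steam Audio Spatializer is selected under Project Settings > Audio > Spatializer Plugin; the component name below is a placeholder):

```csharp
using UnityEngine;

// Plays a looping, fully 3D sound routed through the selected spatializer
// plugin. Occlusion and transmission settings live on the Steam Audio Source
// component added alongside the AudioSource in the Inspector.
public class SpatializedSound : MonoBehaviour
{
    public AudioClip clip; // placeholder: assign your own clip

    void Start()
    {
        var source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.spatialBlend = 1f; // fully 3D
        source.spatialize = true; // hand panning over to the spatializer plugin
        source.loop = true;
        source.Play();
    }
}
```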

Build and Run the application (or use Play mode in the Editor) to test the effects of the sound occlusion and transmission.