In addition to overlaying data and models onto the physical environment, Augmented Reality can be used to materialise immersive digital places, overlapping realities that can be explored by walking through them. To maintain the illusion of a transition between physical and digital places, one possible solution is to use portals.
Conceptually, the use of a portal in Augmented Reality allows us to visualise just a small part of the digital environment (e.g. a door frame, a hole in the wall) that is fully revealed once the user crosses the threshold of the portal itself. To achieve this illusion we can hide the digital model behind an invisible layer.
Create a new Unity project using the AR Core template, the URP (Universal Render Pipeline) and Vuforia Unity Package 10.10. Switch the Platform to Android.
Import the Room environment in Unity. There are two ways to import: click and drag your FBX file into a folder in the Project panel, or head to Assets -> Import New Asset.
To be able to edit the materials we need to extract them from the FBX file. In the Inspector window, from the Materials tab, choose Use External Materials (Legacy) from the Location dropdown menu. Unity will extract the Materials and texture images into the folder of the FBX model.
Create an empty GameObject named RoomContainer and add the prefab to it (before adding the prefab of the room to the empty object, be sure that this one is placed at position = 0, 0, 0).
A quick and effective way to visually hide a digital model is to use a Shader that, like an invisibility cloak, lets us see behind the model without revealing the model itself. Create a Standard Surface Shader named Occlusion. In the Inspector Panel, change the shader to Custom/Occlusion. Create an empty GameObject named Occlusion inside the RoomContainer GameObject and add 4 planes to it (GameObject -> 3D Object -> Plane).
Using real-time lighting in Augmented Reality with large models can negatively impact the overall performance of the application, especially on older devices. Using the lightmapping technique it is possible to store light information in a lightmap texture. Lightmaps in Unity can be generated only for static GameObjects; however, in Augmented Reality, the GameObjects need to be moved around the scene and Instantiated at runtime. To overcome these limits it is possible to use the Prefab Lightmapping script, which allows us to save lightmap information to the prefab itself.
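For reference, an occlusion shader of this kind is usually a simple depth-mask shader. The sketch below is one minimal way to write Custom/Occlusion (a hand-written ShaderLab pass rather than the Standard Surface Shader template): it writes depth but no colour, so the planes stay invisible while still hiding everything rendered behind them.

```shaderlab
Shader "Custom/Occlusion"
{
    SubShader
    {
        // Render before regular geometry so it can mask it
        Tags { "Queue" = "Geometry-1" }

        // Write depth but no colour: the surface itself is
        // invisible, yet anything drawn behind it is hidden
        ColorMask 0
        ZWrite On

        Pass {}
    }
}
```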
Place the PrefabLightmapData.cs script in your Assets -> Scripts folder (create a new folder if it does not exist), and add the script to the RoomContainer GameObject. In the Inspector Panel, check the Static checkbox. Under Baked Lightmap, the model will show a grey and white grid: this is the resolution of the lightmap; the smaller the square (or texel), the higher the quality (and the longer the computation time). In the Inspector Panel, change Mesh Renderer -> Lightmapping -> Scale in Lightmap to 3.
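The core idea behind the Prefab Lightmapping script can be sketched as follows. This is a simplified illustration, not the actual PrefabLightmapData.cs (the field names here are invented for the sketch): lightmap textures and per-renderer UV offsets are serialised into the prefab at bake time and re-registered when the prefab wakes up at runtime.

```csharp
using UnityEngine;

// Simplified sketch of the prefab-lightmapping idea: store each
// renderer's lightmap data in the prefab, then re-register the
// lightmaps and reassign the indices when instantiated at runtime.
public class PrefabLightmapSketch : MonoBehaviour
{
    [SerializeField] Renderer[] renderers;    // filled at bake time
    [SerializeField] Texture2D[] lightmaps;   // baked lightmap textures
    [SerializeField] Vector4[] scaleOffsets;  // per-renderer UV scale/offset

    void Awake()
    {
        // Append the saved lightmaps to the scene's lightmap array
        var existing = LightmapSettings.lightmaps;
        var combined = new LightmapData[existing.Length + lightmaps.Length];
        existing.CopyTo(combined, 0);
        for (int i = 0; i < lightmaps.Length; i++)
            combined[existing.Length + i] = new LightmapData { lightmapColor = lightmaps[i] };
        LightmapSettings.lightmaps = combined;

        // Point each renderer at its saved lightmap again
        for (int i = 0; i < renderers.Length; i++)
        {
            renderers[i].lightmapIndex = existing.Length; // sketch: assumes one shared lightmap
            renderers[i].lightmapScaleOffset = scaleOffsets[i];
        }
    }
}
```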
The Lightmapper by default is set to Auto Generate. It is possible to change this behaviour from the Lighting panel (Window -> Rendering -> Lighting).
We can now add some light to the scene. Create an empty GameObject named Lights inside the RoomContainer GameObject. Add an Area Light for the window, one for the door and one on the monitor. Use the Shape property in the Inspector Panel to set the size of each light. Also, pay attention that the direction of the light points towards the inside of the room. The Intensity can be adjusted to get a brighter environment. Add a Directional Light or, if one is already present, change its rotation to have the light entering the door (e.g. rotation = 30, -109, 0). Be sure that the General -> Mode of each light is set to Baked (Area Lights can only be baked lights).
In the Assets -> Prefab folder, create a prefab named Room-Light by dragging the RoomContainer GameObject into it.
Open the Window -> Rendering -> Lighting panel to set the lightmapper parameters. Most of the default values are fine. Under Lighting Settings Asset, pressing New Lighting Settings will create a new asset in a folder with the same name as the Scene. Then run Assets -> Bake Prefab Lightmap.
The Prefab can now be removed from the scene and it will keep the lightmaps once Instantiated in the AR app. Build the app to test it.
Depending on the model used, it might be necessary to rotate the model to have the door facing the user. It is usually better to rotate the object within the RoomContainer.
It is possible to mix baked and real-time lighting in the same scene. Add a real-time light to the RoomContainer (it is better to remove the LeanTouch Move, Rotate and Scale components for now). Select the light in the Inspector Panel and change Rendering -> Culling Mask to match only the same Layer; in this way the real-time light will affect just the objects belonging to the same layer. Apply the prefab Overrides, then add the prefab to the TapToPlace component, or to the Vuforia GameObject. Build the app to test the scene.
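As context for how the prefab ends up in the scene at runtime, a tap-to-place behaviour generally instantiates the assigned prefab at the pose returned by an AR raycast. The sketch below is a hypothetical version based on AR Foundation's ARRaycastManager; the actual TapToPlace component shipped with the template may differ.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch of a tap-to-place behaviour: on the first
// touch, raycast against detected planes and place the prefab there.
public class TapToPlaceSketch : MonoBehaviour
{
    public GameObject prefab;                 // e.g. the Room-Light prefab
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against detected planes and place the prefab at the hit pose
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            Instantiate(prefab, pose.position, pose.rotation);
        }
    }
}
```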
Unity can be used to access multiple sensors of the mobile device. The GPS location can be useful to trigger AR experiences only when the user is in a specific place. Newer versions of Android and iOS require specific permissions from the user to access the various sensors. In addition, Unity provides Latitude and Longitude as a float instead of the more accurate double provided by the mobile device. To overcome this problem, it is possible to leverage a native platform plugin (Android or iOS).
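On Android, for instance, the fine-location permission can be requested at runtime with Unity's built-in Permission API before any location service is started (a minimal sketch; the plugin below may also handle this step for you):

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class LocationPermission : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Ask the user for fine location access if not already granted
        if (!Permission.HasUserAuthorizedPermission(Permission.FineLocation))
            Permission.RequestUserPermission(Permission.FineLocation);
#endif
    }
}
```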
The Native GPS Plugin (iOS/Android) provides these plugins out-of-the-box. Once imported, a Native GPS Plugin (iOS/Android) folder is created in the Assets folder with a sample scene. Open the Room Container prefab in edit mode. Create an empty GameObject named GPSData at position = 1.4, 1.7, -0.65. Add a 3D Object -> TextMeshPro (install the essential assets if required) and change the Sample text to Waiting GPS...
The script NativeGPSUI can be used to test the GPS location. As we use a TextMeshPro GameObject, we need to change the references in the script from Text to TextMeshPro and add using TMPro; at the top of the script. You may want to comment out the Speed value of the GPS to keep the text within the panel.
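The edit amounts to swapping the UI type of the text field. The fragment below sketches the pattern (the field and method names are illustrative, not the actual contents of NativeGPSUI):

```csharp
using TMPro;        // added at the top of the script
using UnityEngine;

public class NativeGPSUI : MonoBehaviour
{
    // was: public Text locationText;  (UnityEngine.UI)
    public TextMeshProUGUI locationText;   // TextMeshPro replacement

    // Illustrative update method: the Speed value is left out
    // so the text fits inside the panel
    void ShowLocation(double latitude, double longitude)
    {
        locationText.text = $"Lat: {latitude}\nLon: {longitude}";
    }
}
```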
Build the app to test the scene
Valve Steam Audio provides a free audio spatialiser that can be used to create immersive experiences in VR and AR environments. Compared to other add-ons, Valve Steam Audio provides some interesting solutions to create credible sound occlusion and reflection.
In Edit -> Project Settings -> Audio, set Spatializer Plugin to Steam Audio Spatializer, and Ambisonic Decoder Plugin to Steam Audio Ambisonics.
Select the Room Container and add a new Audio -> Audio Source component. In the Inspector Panel, under AudioClip, add an audio file (wav or mp3; Freesound.org is a great resource). Then add a Steam Audio Source component.
Select the Room prefab inside the Room Container. Add a Mesh Collider with Convex: True. Add a Steam Audio Geometry component and change its material to Bricks. Add a Steam Audio Dynamic Object component and select Export Dynamic Object.
Select the AR Camera in the Hierarchy Panel and add an Audio Listener component. Build and Run the application (or use the Editor Play mode to test the effects of the sound occlusion and transmission).