
In addition to overlaying data and models onto the physical environment, Augmented Reality can be used to materialise immersive digital places, overlapping realities that can be explored by walking in them. To maintain the illusion of transition between physical and digital places, one possible solution is to use portals.
Conceptually, the use of a portal in Augmented Reality allows us to visualise just a small part of the digital environment (e.g. a door frame, a hole in the wall) that is fully revealed once the user crosses the threshold of the portal itself. To achieve this illusion we can hide the digital model behind an invisible layer.
Create a new Unity project using the AR Core template. Switch the Platform to Android.
Import the Room environment in Unity. There are two ways to import: click and drag your FBX file into a folder in the Project panel, or go to Assets -> Import New Asset.
To be able to edit the materials we need to extract them from the FBX file. In the Inspector window, from the Materials tab, choose Use External Materials (Legacy) from the Location dropdown menu. Unity will extract the materials and texture images into the folder of the FBX model.
1. Create an empty GameObject named RoomContainer in the scene
2. Drag the Room model inside the RoomContainer object (be sure that the room is placed at position = 0, 0, 0)
3. Drag the RoomContainer to the Prefab folder to create a prefab of it (we will need this later for applying light mapping)
A quick and effective way to visually hide a digital model is by using a shader that, like an invisibility cloak, lets us see behind the model without revealing the model itself.
1. Create a new Standard Surface Shader named Occlusion
2. Create a new Material and, in the Inspector Panel, change the shader to Custom/Occlusion
3. Create an empty GameObject named Occlusion inside the RoomContainer GameObject and add to it 4 planes (GameObject -> 3D Object -> Plane). Assign the Occlusion material to the planes so they hide the room model behind them
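The generated surface shader body can be replaced with a simple depth-only pass; a minimal sketch of such an occlusion shader (assuming the Built-in Render Pipeline) is:

```shaderlab
Shader "Custom/Occlusion"
{
    SubShader
    {
        // Draw before regular geometry so the planes fill the depth
        // buffer first and anything behind them is culled.
        Tags { "Queue" = "Geometry-10" }

        Pass
        {
            ZWrite On     // write depth: hides the model behind the plane
            ColorMask 0   // write no colour: the plane itself stays invisible
        }
    }
}
```

The plane writes to the depth buffer but not to the colour buffer, so the camera still shows the real world while everything behind the plane is occluded.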
Using real-time lighting in Augmented Reality with large models can negatively impact the overall performance of the application, especially on older devices. Using the lightmapping technique it is possible to store light information in a lightmap texture. Lightmaps in Unity can be generated only for static GameObjects; in Augmented Reality, however, the GameObjects need to be moved around the scene and instantiated at runtime. To overcome these limits it is possible to use the Prefab Lightmapping script, which allows us to save lightmap information to the prefab itself.
1. Add the PrefabLightmapData.cs script to your Assets -> Scripts folder (create a new folder if it does not exist)
2. Add the PrefabLightmapData component to the prefab root (the RoomContainer GameObject)
3. Select the Room model (inside the RoomContainer) and, from the top right of the Inspector Panel, check the Static checkbox
4. In the Scene view, change the draw mode to Baked Lightmap; the model will show a grey and white grid. This is the resolution of the lightmap: the smaller the square (or texel), the higher the quality (and the longer the computation time)
5. In the Inspector Panel change Mesh Renderer -> Lightmapping -> Scale in Lightmap to 3
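Conceptually, the script serialises each renderer's lightmap references into the prefab at bake time and restores them at runtime. A simplified sketch of the idea (not the actual PrefabLightmapData.cs, and ignoring directional and shadow-mask lightmaps) might look like:

```csharp
using UnityEngine;

// Simplified sketch: the arrays below are filled when the prefab is
// baked; Awake re-registers the textures and re-links each renderer,
// so an instance created at runtime keeps its baked lighting.
public class PrefabLightmapSketch : MonoBehaviour
{
    [SerializeField] Renderer[] renderers;    // renderers in the prefab
    [SerializeField] Texture2D[] lightmaps;   // their baked lightmap textures
    [SerializeField] Vector4[] scaleOffsets;  // per-renderer lightmap UVs

    void Awake()
    {
        // Append the saved lightmaps to the scene's lightmap array.
        var existing = LightmapSettings.lightmaps;
        var combined = new LightmapData[existing.Length + lightmaps.Length];
        existing.CopyTo(combined, 0);
        for (int i = 0; i < lightmaps.Length; i++)
            combined[existing.Length + i] = new LightmapData { lightmapColor = lightmaps[i] };
        LightmapSettings.lightmaps = combined;

        // Re-point each renderer at its baked lightmap.
        for (int i = 0; i < renderers.Length; i++)
        {
            renderers[i].lightmapIndex = existing.Length + i;
            renderers[i].lightmapScaleOffset = scaleOffsets[i];
        }
    }
}
```

This is why the baked prefab keeps its lighting even though it is instantiated, and therefore non-static, at runtime.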
We can now add some light to the scene
1. Create an empty GameObject named Lights inside the RoomContainer GameObject
2. Add an Area light for the window, one for the door and one on the monitor. Use the Shape property in the Inspector Panel to set the size of each light. Also, pay attention that the direction of the light points towards the inside of the room (blue arrow). Intensity can be adjusted to obtain a brighter environment
3. Add a Directional Light or, if one is present, change its rotation to have the light entering the door (e.g. rotation = 30, -109, 0). Activate the shadows from the Inspector (Soft Shadows)
4. Be sure that the General -> Mode of each light is set to Baked (Area lights can only be baked lights)
5. Select the RoomContainer and, in the Inspector Panel, press Overrides and Apply All
1. Open Window -> Rendering -> Lighting to set the lightmapper parameters. Most of the default values are fine
2. Scroll down to Lighting Settings Asset. By pressing New Lighting Settings a new asset will be created. As the lightmap is linked to the scene, name the Lighting Settings Asset with the same name as the scene
3. Bake the lightmap from Assets -> Bake Prefab Lightmap (the Generate Lighting button will not work to create the lightmap for the prefab)
The prefab can be removed from the scene; it will keep the lightmaps once instantiated in the AR app. Build the app to test it.
Depending on the model used, it might be necessary to rotate the model to have the door facing the user. It is usually better to rotate the object within the RoomContainer.
It is possible to mix baked and real-time lighting in the same scene.
1. Create a new Layer and assign it to the RoomContainer (it is better to disable or remove the LeanTouch Drag, Twist and Pinch components for now)
2. Add a real-time light, select it in the Inspector Panel and change Rendering -> Culling Mask to match only the same Layer. In this way the real-time light will affect just the objects belonging to the same layer
3. Apply the changes to the prefab (Overrides), add the prefab to the TapToPlace component
4. Build the app to test the scene
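Setting a layer in the Inspector only affects the selected object, not its children. A small hedged helper (the helper and the layer name "Room" are assumptions, not part of the tutorial assets) can assign the layer to the whole instantiated hierarchy so the light's Culling Mask keeps working:

```csharp
using UnityEngine;

// Assumed helper: apply a layer to a GameObject and all its children,
// e.g. right after the prefab is instantiated at runtime.
public static class LayerUtil
{
    public static void SetLayerRecursively(GameObject root, int layer)
    {
        root.layer = layer;
        foreach (Transform child in root.transform)
            SetLayerRecursively(child.gameObject, layer);
    }
}

// Usage after Instantiate (layer name "Room" is illustrative):
// var room = Object.Instantiate(roomPrefab, pose.position, pose.rotation);
// LayerUtil.SetLayerRecursively(room, LayerMask.NameToLayer("Room"));
```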

Unity can be used to access multiple sensors of the mobile device. The GPS location can be useful, for example, to trigger AR experiences only if the user is in a specific place. Newer versions of Android and iOS require specific permissions from the user to access the various sensors. In addition, Unity provides Latitude and Longitude as a float instead of the more accurate double provided by the mobile device. To overcome this problem, it is possible to leverage a native platform plugin (Android or iOS).
The Native GPS Plugin (iOS/Android) provides these plugins out-of-the-box:
1. After importing the asset, a Native GPS Plugin (iOS/Android) folder is created in the Assets folder with a sample scene
2. Open the Room Container prefab in edit mode
3. Create an empty GameObject named GPSData at position = 1.4, 1.7, -0.65
4. Add a 3D Object -> TextMeshPro (install the essential assets if required)
5. Change the sample text to Waiting GPS...
6. The script NativeGPSUI, provided by the asset, can be used to test the GPS location. As we use a TextMeshPro GameObject, we need to change the references in the script from Text to TextMeshPro and add using TMPro; at the top of the script. You may want to comment out the Speed value of the GPS to keep the text in the panel
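The Text-to-TextMeshPro change could look roughly like this (a sketch only: the field and method here are illustrative, the real NativeGPSUI script shipped with the plugin has its own fields and calls into the plugin's API):

```csharp
using UnityEngine;
using TMPro; // added for TextMeshPro support

// Illustrative sketch of the modified UI script, not the plugin's code.
public class GpsTextSketch : MonoBehaviour
{
    // was: public Text gpsText;  (UnityEngine.UI)
    public TextMeshPro gpsText;  // assign the GPSData text in the Inspector

    void Start()
    {
        gpsText.text = "Waiting GPS...";
    }

    // Call this with the latitude/longitude read from the native plugin;
    // doubles preserve the precision that Unity's float API loses.
    public void UpdateLocation(double latitude, double longitude)
    {
        gpsText.text = $"Lat: {latitude}\nLon: {longitude}";
    }
}
```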
Build the app to test the scene

Valve Steam Audio provides a free audio spatialiser that can be used to create immersive experiences in VR and AR environments. Compared to other add-ons, Valve Steam Audio offers some interesting solutions for creating credible sound occlusion and reflection.
1. In Edit -> Project Settings -> Audio set Spatializer Plugin to Steam Audio Spatializer, and Ambisonic Decoder Plugin to Steam Audio Ambisonics
2. Select the Room Container and add a new Audio -> Audio Source
3. In the Inspector Panel, as AudioClip add an audio file (wav or mp3, Freesound.org is a great resource)
4. Add a Steam Audio Source component
5. Select the Room prefab inside the Room Container and add a Mesh Collider (Convex: True)
6. Add a Steam Audio Geometry component and change the material to Bricks
7. Add a Steam Audio Dynamic Object component and select Export Dynamic Object
8. Select the Main Camera in the Hierarchy Panel and add an Audio Listener component
9. Build and Run the application (or use the Editor Play mode to test the effects of the sound occlusion and transmission)
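If you prefer setting up the Audio Source from code instead of the Inspector, the same configuration can be sketched as follows (values are illustrative; the component must still be routed through the Steam Audio Spatializer selected in Project Settings):

```csharp
using UnityEngine;

// Minimal sketch: configure an AudioSource for 3D spatialised playback,
// mirroring the Inspector steps above.
[RequireComponent(typeof(AudioSource))]
public class SpatialSoundSetup : MonoBehaviour
{
    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.spatialize = true;   // send the source through the spatializer plugin
        source.spatialBlend = 1f;   // fully 3D (0 would be plain 2D audio)
        source.loop = true;         // keep the ambience playing
        source.Play();
    }
}
```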