In addition to overlaying data and models onto the physical environment, Augmented Reality can be used to materialise immersive digital places, overlapping realities in outdoor spaces.
- Create a new Unity project using the AR Core template and switch the Platform to Android.
- Install the ARDK package following the documentation: open Window -> Package Manager, and in the Package Manager tab select Add package from git URL..., entering https://github.com/niantic-lightship/ardk-upm.git
- Open Lightship -> XR Plug-in Management and, in the XR Plug-in Management menu, select Niantic Lightship SDK + Google ARCore.
In Player Settings:
- Other Settings -> Rendering: uncheck Auto Graphics API. If Vulkan appears in the Graphics API list, remove it.
- Other Settings -> Identification: set the Minimum API Level to Android 7.0 'Nougat' (API Level 24) or higher.
- Other Settings -> Configuration: set the Scripting Backend to IL2CPP, then enable both ARMv7 and ARM64.

Then configure the Lightship credentials:
- Open Lightship -> Settings and click Get API Key under Credentials.
- Sign in to your Lightship account or create a new free account.
- On the Projects page, select an existing project or create a new one by clicking New Project.
- Copy the API Key by clicking the copy icon next to it.
- Return to the Lightship Settings window in Unity and paste your API Key into the API Key field.
- Check that there aren't any issues in Project Validation (it can be found in Edit -> Project Settings or Lightship -> Project Validation).
Create a new Empty scene (File -> New Scene -> Empty Scene) and Save As... Then add the essential ARFoundation objects to the Hierarchy panel:
- XR -> XR Origin (Mobile AR): this GameObject contains the Camera Offset and the Main Camera
- XR -> AR Session

Add an AROcclusionManager component to the Main Camera GameObject.

Add the Lightship WPS Unity package to the project following the steps in the Niantic Lightship Documentation: in Window -> Package Manager, click the + in the top-left corner, then select Add package from tarball...
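As an aside, the AROcclusionManager can also be added from code rather than through the Inspector. This is an illustrative sketch, not part of the original steps; it assumes a script attached to the XR Origin's Main Camera GameObject:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative alternative: add the occlusion manager from code instead of the Inspector.
// Attach this to the XR Origin's Main Camera GameObject.
public class OcclusionBootstrap : MonoBehaviour
{
    void Awake()
    {
        // Avoid adding a duplicate if the component is already present
        if (GetComponent<AROcclusionManager>() == null)
        {
            gameObject.AddComponent<AROcclusionManager>();
        }
    }
}
```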
- In the Hierarchy, select the XROrigin, then, in the Inspector, click Add Component and add an ARWorldPositioningObjectHelper to it. This will also create an ARWorldPositioningManager component.
- In the Hierarchy, expand the XROrigin and Camera Offset to expose the Main Camera, then select it.
- In the Inspector, locate Clipping Planes under Camera and set the Far value to 10000.
- In the ARWorldPositioningObjectHelper component, set the Altitude Mode to Meters above sea level (WGS84).
Create a new C# script named AddWPSObjects:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Niantic.Experimental.Lightship.AR.WorldPositioning;

public class AddWPSObjects : MonoBehaviour
{
    [SerializeField] ARWorldPositioningObjectHelper positioningHelper;

    // Start is called before the first frame update
    void Start()
    {
        // replace the coordinates here with your location
        double latitude = 51.539185696655736;
        double longitude = -0.010363110051920168;
        double altitude = 0.0; // We're using camera-relative positioning so make the cube appear at the same height as the camera

        // instantiate a cube, scale it up for visibility (make it even bigger if you need), then update its location
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale *= 2.0f;
        positioningHelper.AddOrUpdateObject(cube, latitude, longitude, altitude, Quaternion.identity);
    }
}
```
In the Hierarchy, create a new Empty GameObject by right-clicking and selecting Create Empty. Name it WPSObjects and add the script you just created to it. Drag the XR Origin (Mobile AR) to the empty Positioning Helper field.
Build and Run the app and test it outdoors. A 2-meter cube will float in mid-air at the location defined in the script.
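As a variation, the same AddOrUpdateObject call can be used again later to reposition an already-placed object. The sketch below is illustrative and not part of the tutorial; the tap-to-nudge behaviour and the class name are assumptions:

```csharp
using UnityEngine;
using Niantic.Experimental.Lightship.AR.WorldPositioning;

// Illustrative variation: keep a reference to the cube and move it at runtime.
// AddOrUpdateObject both adds new objects and updates existing ones.
public class MoveWPSObject : MonoBehaviour
{
    [SerializeField] ARWorldPositioningObjectHelper positioningHelper;

    private GameObject cube;
    private double latitude = 51.539185696655736;   // same placeholder location as above
    private double longitude = -0.010363110051920168;

    void Start()
    {
        cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale *= 2.0f;
        positioningHelper.AddOrUpdateObject(cube, latitude, longitude, 0.0, Quaternion.identity);
    }

    void Update()
    {
        // On tap, nudge the cube roughly 10 m north (about 0.0001 degrees of latitude)
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            latitude += 0.0001;
            positioningHelper.AddOrUpdateObject(cube, latitude, longitude, 0.0, Quaternion.identity);
        }
    }
}
```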
In order to add multiple elements at runtime, we can take advantage of the JSON format. First, we need to generate a GeoJSON file with the locations of the sensors or other elements that we want to visualise in our app. In this example, we are going to use the BatSensors located in the Queen Elizabeth Olympic Park.
If the GeoJSON file is not readily available, it is possible to create a new one quite easily using geojson.io. Through this online service, it is also possible to add additional properties to the locations that can be used in our application.
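For reference, a minimal GeoJSON file matching the fields that the parsing script below expects might look like the following. The values here are placeholders, not real sensor data; note that Name, Habitat and altitude are custom properties, and that GeoJSON stores coordinates as [longitude, latitude]:

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "Name": "BatSensor-01",
        "Habitat": "Wetland",
        "altitude": 0.0
      },
      "geometry": {
        "type": "Point",
        "coordinates": [-0.0103, 51.5391]
      }
    }
  ]
}
```

JsonUtility matches fields by exact name, so the property names in the file must match the C# classes (including capitalisation).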
Create a new Prefab GameObject to be used as the visualisation of the sensor. It could be a primitive such as a sphere (scale 0.2, 0.2, 0.2), or an actual 3D model. Inside the prefab, also add a TextMeshPro - Text named Info (the script below looks it up by that name), with width = 0.2, height = 0.3 and Font Size = 0.2.
In order to provide feedback to the user on the position of the closest sensor, we are going to add a simple UI:
- Create a UI -> Canvas and set the Canvas Scaler to Scale With Screen Size, with X = 1080, Y = 1920.
- Expand the Canvas and add a UI -> Text - TextMeshPro.
- Anchor it to Left Bottom, with Pos X = 0, Pos Y = 50, Pos Z = 0, width = 1080, height = 50.
- Set the alignment to Right, the text to Closest Sensor:......... Distance:............ and the Font Size to 36.
We can now replace AddWPSObjects with a new script named AddWPSObjectsList that reads the JSON file and instantiates a prefab for each location:
```csharp
using UnityEngine;
using System.Collections.Generic;
using Niantic.Experimental.Lightship.AR.WorldPositioning;
using TMPro;

public class AddWPSObjectsList : MonoBehaviour
{
    [SerializeField] ARWorldPositioningObjectHelper positioningHelper;
    [SerializeField] GameObject prefab; // Assign your prefab in the Inspector
    [SerializeField] TextAsset jsonFilePath; // Assign your GeoJSON file in the Inspector

    private List<GameObject> sensors = new List<GameObject>();
    public Camera mainCamera;
    public TextMeshProUGUI distanceText;
    private Vector3 previousCameraPosition;

    void Start()
    {
        // Initialize the previous camera position
        previousCameraPosition = mainCamera.transform.position;

        string jsonText = jsonFilePath.text;
        var json = JsonUtility.FromJson<batSensors.Root>(jsonText);

        foreach (var feature in json.features)
        {
            // GeoJSON stores coordinates as [longitude, latitude]
            double longitude = feature.geometry.coordinates[0];
            double latitude = feature.geometry.coordinates[1];
            double altitude = feature.properties.altitude;

            // Instantiate the prefab and update its location
            GameObject obj = Instantiate(prefab);
            obj.name = feature.properties.Name;
            positioningHelper.AddOrUpdateObject(obj, latitude, longitude, altitude, Quaternion.identity);
            Debug.Log("add " + obj.name);

            // Fill the label on the "Info" text child of the prefab
            obj.transform.Find("Info").GetComponent<TextMeshPro>().text =
                feature.properties.Name + "\n" + feature.properties.Habitat;
            sensors.Add(obj);
        }
    }

    void LateUpdate()
    {
        // Check if the camera has moved
        if (mainCamera.transform.position != previousCameraPosition)
        {
            // Update the previous camera position
            previousCameraPosition = mainCamera.transform.position;
            // Find the closest object and display the distance
            FindAndDisplayClosestObject();
        }
    }

    void FindAndDisplayClosestObject()
    {
        GameObject closestObject = null;
        float closestDistance = Mathf.Infinity;

        // Iterate through all instantiated sensors
        foreach (GameObject obj in sensors)
        {
            float distance = Vector3.Distance(mainCamera.transform.position, obj.transform.position);
            if (distance < closestDistance)
            {
                closestDistance = distance;
                closestObject = obj;
            }
        }

        if (closestObject != null)
        {
            // Display the distance in metres with two decimals
            distanceText.text = $"Closest Sensor: {closestObject.name} | Distance: {closestDistance:F2} m";
        }
    }
}

// Helper classes mirroring the GeoJSON structure
public class batSensors
{
    [System.Serializable]
    public class Feature
    {
        public string type;
        public Properties properties;
        public Geometry geometry;
    }

    [System.Serializable]
    public class Geometry
    {
        public string type;
        public List<double> coordinates;
    }

    [System.Serializable]
    public class Properties
    {
        public string Name;
        public string Habitat;
        public double altitude;
    }

    [System.Serializable]
    public class Root
    {
        public string type;
        public List<Feature> features;
    }
}
```
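The F2 format above always reports metres. A small helper like the following, which is illustrative and not part of the tutorial, could switch the readout to kilometres for far-away sensors:

```csharp
// Illustrative helper: format a distance given in metres, switching to km beyond 1000 m.
public static class DistanceFormat
{
    public static string Format(float metres)
    {
        return metres < 1000f
            ? $"{metres:F2} m"
            : $"{metres / 1000f:F2} km";
    }
}
// e.g. Format(12.3456f) -> "12.35 m", Format(2500f) -> "2.50 km"
```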
We can add the script to the same WPSObjects GameObject, taking care to disable the old one and to fill in the public variables. We can then test the application outdoors.
Valve Steam Audio provides a free audio spatialiser that can be used to create immersive experiences in VR and AR environments. Compared to other add-ons, Valve Steam Audio offers some interesting solutions for creating credible sound occlusion and reflection.
- In Edit -> Project Settings -> Audio, set the Spatializer Plugin to Steam Audio Spatializer, and the Ambisonic Decoder Plugin to Steam Audio Ambisonics.
- Open the BatPrefab and add a new Audio -> Audio Source in the Inspector Panel.
- In the AudioClip field, add an audio file (wav or mp3; Freesound.org is a great resource). Set Loop to True, Spatialize to True and Spatial Blend to 1.
- Add a Steam Audio Source component and enable Occlusion (True) with input Physics Based.
- Select the Main Camera in the Hierarchy Panel and add an Audio Listener component if not present.
- Build and Run the application.
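The Audio Source settings above can also be applied from a script. This is a minimal sketch mirroring the manual steps, under the assumption that the component sits on the BatPrefab and a clip is assigned in the Inspector:

```csharp
using UnityEngine;

// Illustrative alternative: configure the spatialised Audio Source from code.
public class SpatialAudioSetup : MonoBehaviour
{
    [SerializeField] AudioClip clip; // assign a wav/mp3 clip in the Inspector

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.loop = true;        // loop the ambient sound
        source.spatialize = true;  // route through the selected spatializer plugin (Steam Audio)
        source.spatialBlend = 1f;  // fully 3D
        source.Play();
    }
}
```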