Languages used: C#
Unity can be used to access multiple sensors on a mobile device. The GPS location can be useful for triggering AR experiences when the user is in a specific place. Newer versions of Android and iOS require specific permissions from the user to access various sensors. Additionally, Unity provides latitude and longitude as floats instead of the more accurate doubles provided by the mobile device. To overcome this problem, it is possible to use the Native GPS Plugin (iOS/Android):
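To get a feel for why float precision matters here, a quick back-of-the-envelope sketch (plain C#, outside Unity) shows how much accuracy can be lost when a double latitude is rounded to a float:

```csharp
using System;

class FloatPrecisionDemo
{
    static void Main()
    {
        // A typical latitude, as the device reports it (double precision)
        double latitude = 51.539185696655736;

        // What survives when the value is handed back as a float
        float latitudeAsFloat = (float)latitude;
        double errorDegrees = Math.Abs(latitude - (double)latitudeAsFloat);

        // One degree of latitude is roughly 111,320 metres
        double errorMetres = errorDegrees * 111320.0;
        Console.WriteLine($"Error: {errorMetres:F2} m");
    }
}
```

At this latitude, a float can only resolve steps of a few millionths of a degree, so the position can be off by tens of centimetres before the GPS error is even considered; this is why the plugin's double-precision values are preferable for anchoring AR content.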
- Create a new Unity project using the Unity Hub and the AR Mobile Core template.
- Once the project is ready, switch the platform to Android from File -> Build Settings.
- Save a new scene named GPSNative.
- Import the package: a Native GPS Plugin (iOS/Android) folder is created in the Assets folder with a sample scene GPSData.
- Add a UI -> TextMeshPro text element. Unity will automatically create a Canvas.
- Select the Canvas and, in the Inspector window, set the Canvas Scaler's Screen Match Mode to Expand and set the Reference Resolution to x=1080; y=1920.
- Add a Horizontal Layout Group component, set the Child Alignment to Middle Center, and enable both Control Child Size and Child Force Expand.
- Select the text element and, in the Inspector window, change the Sample text to Waiting GPS... and set the Font Size to Auto Size.
The script NativeGPSUI (we can search for it in the Project folder), provided by the imported package, can be used to test the GPS location. Add it to the GPSData GameObject. Since we are using a TextMeshProUGUI GameObject, we need to change the references in the script from Text to TextMeshProUGUI and add `using TMPro;` at the top of the script. Save the scene.

To test the script we need to build the project, as the plugin can only access the GPS on Android or iOS devices.
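The change to the script is small; assuming it exposes a UI text field (the class and field names below are illustrative, not the plugin's actual ones), it amounts to:

```csharp
using TMPro;          // add this at the top of NativeGPSUI
using UnityEngine;

public class NativeGPSUIExample : MonoBehaviour
{
    // before: public Text locationText;
    // after: the TextMeshPro equivalent of the UI Text component
    public TextMeshProUGUI locationText;

    void UpdateLabel(string message)
    {
        locationText.text = message; // the .text property works the same way
    }
}
```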
The Native GPS Plugin can be used with ARFoundation to create location-based Augmented Reality experiences. However, specialised packages such as ARDK provide a more straightforward development experience.

Install the ARDK package following the documentation. Here are the main steps to follow:
- Open Window -> Package Manager.
- In the Package Manager tab, select Add package from git URL... and enter https://github.com/niantic-lightship/ardk-upm.git
- Open the Lightship -> XR Plug-in Management menu.
- In the XR Plug-in Management menu, select Niantic Lightship SDK + Google ARCore.
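Alternatively, the same dependency can be added by hand to Packages/manifest.json (the package name below is an assumption based on the ardk-upm repository and may change between releases):

```json
{
  "dependencies": {
    "com.nianticlabs.lightship": "https://github.com/niantic-lightship/ardk-upm.git"
  }
}
```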
In Player Settings:

- Other Settings -> Rendering: uncheck Auto Graphics API. If Vulkan appears in the Graphics API list, remove it.
- Other Settings -> Identification: set the Minimum API Level to Android 7.0 'Nougat' (API Level 24) or higher.
- Other Settings -> Configuration: set the Scripting Backend to IL2CPP, then enable both ARMv7 and ARM64.

Then:

- Open Lightship -> Settings and click on Get API Key under Credentials.
- Log in to your Lightship account or create a new free account.
- Open the Projects page, then select an existing project or create a new one by clicking New Project.
- Copy the API Key by clicking the copy icon next to it.
- Return to the Lightship Settings window in Unity and paste your API Key into the API Key field.
- Check that there aren't any issues in the Project Validation (it can be found in Edit -> Project Settings or Lightship -> Project Validation).
- Create a new Empty scene (File -> New Scene -> Empty Scene) and save it with File -> Save As...
- Add to the Hierarchy panel the essential ARFoundation objects:
  - XR -> XR Origin (Mobile AR); this GameObject contains the Camera Offset and the Main Camera.
  - XR -> AR Session.
- In the Hierarchy, select the XR Origin (Mobile AR), then, in the Inspector, click Add Component and add an ARWorldPositioningObjectHelper to it. This will also create an ARWorldPositioningManager component.
- In the Hierarchy, expand the XR Origin and the Camera Offset to expose the Main Camera, then select it.
- In the Inspector, locate Clipping Planes under Camera and set the Far value to 10000.
- In the ARWorldPositioningObjectHelper component, set the Altitude Mode to Meters above sea level (WGS84).
Create a new C# script named AddWPSObjects:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Niantic.Lightship.AR.WorldPositioning;

public class AddWPSObjects : MonoBehaviour
{
    [SerializeField] ARWorldPositioningObjectHelper positioningHelper;

    // Start is called before the first frame update
    void Start()
    {
        // replace the coordinates here with your location
        double latitude = 51.539185696655736;
        double longitude = -0.010363110051920168;
        double altitude = 0.0; // We're using camera-relative positioning so make the cube appear at the same height as the camera

        // instantiate a cube, scale it up for visibility (make it even bigger if you need), then update its location
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale *= 2.0f;
        positioningHelper.AddOrUpdateObject(cube, latitude, longitude, altitude, Quaternion.identity);
    }
}
```
In the Hierarchy, create a new Empty GameObject by right-clicking and selecting Create Empty. Name it WPSObjects and add the script you just created to it. Drag the XR Origin (Mobile AR) to the empty Positioning Helper field of the script.
Build and Run the app and test it outdoors. A 2-meter cube will float in mid-air at the location defined in the script.
In order to add multiple elements at runtime, we can take advantage of the JSON format. First, we need to generate a GeoJSON file with the locations of the sensors or other elements that we want to visualise in our app. In this example, we are going to use the BatSensors located in the Queen Elizabeth Olympic Park.

If the GeoJSON file is not readily available, it is possible to create a new one quite easily using geojson.io. Through this online service, it is also possible to add additional properties to the locations that can be used in our application.
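As a reference, a minimal GeoJSON file compatible with the parsing classes used later in this tutorial looks like the sketch below (Name, Habitat and altitude are custom properties added through geojson.io; the values shown are illustrative, not real sensor data):

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "Name": "BatSensor-01",
        "Habitat": "Wetland",
        "altitude": 0.0
      },
      "geometry": {
        "type": "Point",
        "coordinates": [-0.010363110051920168, 51.539185696655736]
      }
    }
  ]
}
```

Note that GeoJSON stores coordinates as [longitude, latitude], which is why the script reads `coordinates[0]` as longitude and `coordinates[1]` as latitude.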
Create a new Prefab GameObject to be used as the visualisation of the sensor. It could be a primitive such as a sphere (scale 2 2 2) or an actual 3D model. Inside the prefab, also add a TextMeshPro - Text with width = 0.2; height = 0.3 and Font Size = 0.2; place it on top of the primitive object and name it Info (the name is also used in the script below).
In order to provide feedback to the user on the position of the closest sensor, we are going to add a simple UI:

- Create a UI -> Canvas and set the Canvas Scaler to Scale With Screen Size, with Reference Resolution X = 1080; Y = 1920 and Screen Match Mode Expand. Add a Horizontal Layout Group component.
- Add to the Canvas a UI -> Text - TextMeshPro. Anchor it to the Left Bottom, with Pos X = 0; Pos Y = 50; Pos Z = 0 and width = 1080; height = 50. Set the text Alignment to Right, the text to Closest Sensor:......... Distance:............ and the Font Size to 36.
We can now replace AddWPSObjects with a new script named AddWPSObjectsList that is going to read the JSON file and instantiate a prefab for each location:
```csharp
using UnityEngine;
using System.Collections.Generic;
using Niantic.Lightship.AR.WorldPositioning;
using TMPro;

public class AddWPSObjectsList : MonoBehaviour
{
    [SerializeField] ARWorldPositioningObjectHelper positioningHelper;
    [SerializeField] GameObject prefab; // Assign your prefab in the Inspector
    [SerializeField] TextAsset jsonFilePath; // Assign your GeoJSON file (imported as a TextAsset) in the Inspector

    private List<GameObject> sensors = new List<GameObject>();
    public Camera mainCamera;
    public TextMeshProUGUI distanceText;
    private Vector3 previousCameraPosition;

    void Start()
    {
        // Initialize the previous camera position
        previousCameraPosition = mainCamera.transform.position;

        string jsonText = jsonFilePath.text;
        var json = JsonUtility.FromJson<batSensors.Root>(jsonText);

        foreach (var feature in json.features)
        {
            double longitude = feature.geometry.coordinates[0];
            double latitude = feature.geometry.coordinates[1];
            double altitude = feature.properties.altitude;

            // Instantiate the prefab and update its location
            GameObject obj = Instantiate(prefab);
            obj.name = feature.properties.Name;
            positioningHelper.AddOrUpdateObject(obj, latitude, longitude, altitude, Quaternion.identity);
            Debug.Log("add " + obj.name);
            obj.transform.Find("Info").GetComponent<TextMeshPro>().text = feature.properties.Name + "\n" + feature.properties.Habitat;
            sensors.Add(obj);
        }
    }

    void LateUpdate()
    {
        // Recompute the closest sensor only when the camera has moved
        if (mainCamera.transform.position != previousCameraPosition)
        {
            // Update the previous camera position
            previousCameraPosition = mainCamera.transform.position;

            // Find the closest object and display the distance
            FindAndDisplayClosestObject();
        }
    }

    void FindAndDisplayClosestObject()
    {
        GameObject closestObject = null;
        float closestDistance = Mathf.Infinity;

        // Iterate through all the instantiated sensors
        foreach (GameObject obj in sensors)
        {
            float distance = Vector3.Distance(mainCamera.transform.position, obj.transform.position);
            if (distance < closestDistance)
            {
                closestDistance = distance;
                closestObject = obj;
            }
        }

        if (closestObject != null)
        {
            // Display the distance in metres with two decimals
            distanceText.text = $"Closest Sensor: {closestObject.name} | Distance: {closestDistance:F2} m";
        }
    }
}

public class batSensors
{
    [System.Serializable]
    public class Feature
    {
        public string type;
        public Properties properties;
        public Geometry geometry;
    }

    [System.Serializable]
    public class Geometry
    {
        public string type;
        public List<double> coordinates;
    }

    [System.Serializable]
    public class Properties
    {
        public string Name;
        public string Habitat;
        public double altitude;
    }

    [System.Serializable]
    public class Root
    {
        public string type;
        public List<Feature> features;
    }
}
```
We can add the script to the same WPSObjects GameObject, paying attention to disable the old one and to fill in the public variables in the Inspector.

Build and test the app outdoors (it is possible to use it indoors, but the quality of the GPS signal might not be optimal).
Valve Steam Audio provides a free audio spatialiser that can be used to create immersive experiences in VR and AR environments. Compared to other add-ons, Valve Steam Audio provides some interesting solutions to create credible sound occlusion and reflection.
- In Edit -> Project Settings -> Audio, set the Spatializer Plugin to Steam Audio Spatializer and the Ambisonic Decoder Plugin to Steam Audio Ambisonics.
- Select the BatPrefab and, in the Inspector Panel, add a new Audio -> Audio Source component. In AudioClip, add an audio file (wav or mp3; Freesound.org is a great resource), set Play On Awake and Loop to True, and set the Spatial Blend to 1 (fully 3D).
- Add a Steam Audio Source component and set Distance Attenuation to True with input Physics Based.
- Select the Main Camera in the Hierarchy Panel and check that an Audio Listener component is present.
- Build and Run the application.
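The Audio Source settings above can also be applied from code, which is handy when the prefab is instantiated at runtime; a minimal sketch (the class name is illustrative, and it assumes the component already has an AudioClip assigned):

```csharp
using UnityEngine;

// Hypothetical helper attached to the BatPrefab: mirrors the Inspector
// settings described in the steps above.
public class BatAudioSetup : MonoBehaviour
{
    void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.playOnAwake = true;  // Play On Awake = True
        source.loop = true;         // Loop = True
        source.spatialBlend = 1.0f; // fully 3D, so positional audio applies
        source.spatialize = true;   // route the source through the Steam Audio Spatializer
    }
}
```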