The onboarding experience is an important aspect of any AR app. Its main purpose is to provide users with straightforward information on how to use the app. While each application will have specific instructions, the initial steps (e.g., finding a plane, placing an object) are commonly used in any application. There are no specific rules on how the onboarding experience needs to be; the important part is that the user needs to be informed from the beginning about what they need to do to use the application (e.g., which button to press, which actions to perform).
One effective solution is to add overlaid videos and/or text to guide the user. Both uGUI and UI Toolkit can be used for displaying and controlling the user interface. In this tutorial, we are going to use the uGUI system.
The starting point of this workshop is the previous project 6: Unity AR Physical to Digital.
Before building the UI, we need to import the onboarding AR videos. Import findaplane.webm, findanimage.webm and taptoplace.webm into a subfolder (e.g. Videos) of the Assets folder.
Inside the same subfolder, create a RenderTexture named UXRenderTexture and, in the Inspector window, change the Size to 512 x 512 and the Anti-aliasing to 2 samples.
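If you prefer a scripted setup, the same RenderTexture can also be created at runtime (a minimal sketch; the asset-based workflow above is what the rest of the tutorial assumes):

```csharp
using UnityEngine;

public class CreateUXRenderTexture : MonoBehaviour
{
    // Created at runtime instead of as an asset: 512 x 512 with 2x anti-aliasing,
    // matching the Inspector settings above
    public RenderTexture uxRenderTexture;

    void Awake()
    {
        uxRenderTexture = new RenderTexture(512, 512, 0); // width, height, depth buffer
        uxRenderTexture.antiAliasing = 2; // 2 samples
        uxRenderTexture.Create();
    }
}
```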
Create a new UI -> Canvas and change the name to Onboarding. In the Inspector window, set the Canvas Scaler UI Scale Mode to Scale With Screen Size and the Reference Resolution to X: 1080, Y: 1920.
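The equivalent settings can also be applied from code (a sketch, assuming the 1080 x 1920 values refer to the Canvas Scaler's reference resolution; attach to the Onboarding Canvas):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class OnboardingCanvasSetup : MonoBehaviour
{
    void Awake()
    {
        // Scale the UI against a 1080 x 1920 portrait reference resolution
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1080, 1920);
    }
}
```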
Create a new empty GameObject inside the Canvas GameObject and name it UIRoot; this is going to be the container of the UI.
With the UIRoot selected, in the Inspector window:
- set the Rect Transform anchor preset to Stretch/Stretch and set all the properties (Top, Left, Pos Z, Right, Bottom) to 0;
- add a Vertical Layout Group component; this will help to organise the three additional UI components we need to create. Set the following parameters: Spacing: 100; Child Alignment: Middle Centre; Control Child Size: Width and Height; Child Force Expand: Height.
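For reference, the same layout values can be set from a script (a sketch mirroring the Inspector settings above; attach to UIRoot):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class UIRootLayout : MonoBehaviour
{
    void Awake()
    {
        // Same Vertical Layout Group settings as configured in the Inspector
        var layout = gameObject.AddComponent<VerticalLayoutGroup>();
        layout.spacing = 100;
        layout.childAlignment = TextAnchor.MiddleCenter;
        layout.childControlWidth = true;      // Control Child Size: Width
        layout.childControlHeight = true;     // Control Child Size: Height
        layout.childForceExpandWidth = false;
        layout.childForceExpandHeight = true; // Child Force Expand: Height
    }
}
```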
We can now add the actual components that hold the UX information, as children of UIRoot:
- an empty GameObject named Video;
- a UI -> Text - TextMeshPro named Info1;
- a UI -> Text - TextMeshPro named Info2.
The order of these components in the Hierarchy panel is important.
With both Info1 and Info2 selected, in the Inspector window:
- add a Layout Element component and set the Min Width to 1080;
- in the TextMeshPro - Text (UI) component, check Auto Size, set the Min to 70 and the Max to 100, and set the alignment to middle and centre.
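The same text settings can be applied from code if needed (a sketch mirroring the Inspector values above; attach to Info1 and Info2):

```csharp
using TMPro;
using UnityEngine;

public class InfoTextSetup : MonoBehaviour
{
    void Awake()
    {
        // Auto Size between 70 and 100, centred text, as set in the Inspector
        var tmp = GetComponent<TextMeshProUGUI>();
        tmp.enableAutoSizing = true;
        tmp.fontSizeMin = 70;
        tmp.fontSizeMax = 100;
        tmp.alignment = TextAlignmentOptions.Center; // middle and centre
    }
}
```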
Select the Info1 GameObject and, in the TextMeshPro - Text (UI) component, change the text input to:
Move your device slowly to find a surface
Similarly, for Info2, change the text to:
Looking for surfaces...
Select the Video GameObject and, in the Inspector window:
- add a Layout Element component with both Min Width and Min Height set to 450;
- add a Vertical Layout Group with: Child Alignment: Middle Centre; Control Child Size: Width and Height; Child Force Expand: Width and Height.
Finally, add to the Video GameObject a new UI -> Raw Image, name it videoUX, and in the Inspector window:
- in the Raw Image component, set the Texture to UXRenderTexture;
- add a Video Player component: set the Video Clip to findaplane, check Play On Awake and Loop, and set the Target Texture to UXRenderTexture;
- add an Aspect Ratio Fitter component with Aspect Mode: Fit In Parent and Aspect Ratio: 1;
- check that the Rect Transform is set to Stretch/Stretch and the Pivot is X: 0.5, Y: 1.
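The same wiring can be done from a script if you prefer (a sketch; it assumes the RawImage, VideoPlayer, RenderTexture and clip references are assigned in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoUXSetup : MonoBehaviour
{
    public RawImage rawImage;         // the videoUX Raw Image
    public VideoPlayer videoPlayer;   // the Video Player component
    public RenderTexture uxRenderTexture;
    public VideoClip findAPlaneClip;  // findaplane.webm

    void Awake()
    {
        // Route the video through the RenderTexture and show it on the Raw Image
        videoPlayer.clip = findAPlaneClip;
        videoPlayer.playOnAwake = true;
        videoPlayer.isLooping = true;
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = uxRenderTexture;
        rawImage.texture = uxRenderTexture;
    }
}
```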
Press Play; you should be able to see the UI and the video running in the centre of the screen.
The final step is to make the UI respond to the underlying ARFoundation system. This involves changing the text and video when the surface is found and removing the interface once the user places the object. This functionality is controlled by a new C# script.
Create a new script named uiAR:
using UnityEngine;
using UnityEngine.Video;
using UnityEngine.XR.ARFoundation;
using TMPro;

// This script needs to be added to the XR Origin, as it holds the ARPlaneManager and tapToPlace components
public class uiAR : MonoBehaviour
{
    // Video player
    public VideoPlayer videoUX;
    // Video clips
    public VideoClip v_findSurface;
    public VideoClip v_tapPlace;
    // Info texts
    public TextMeshProUGUI t_info1;
    public TextMeshProUGUI t_info2;
    // Events for found plane and content created
    private ARPlaneManager m_ARPlaneManager; // ARFoundation system
    private tapToPlace m_tapToPlace; // used to detect when the user creates the content
    bool isContentVisible = false;

    private void Awake()
    {
        videoUX.clip = v_findSurface;
        m_ARPlaneManager = GetComponent<ARPlaneManager>();
        m_ARPlaneManager.planesChanged += planeFound; // Subscribe to the event `plane is detected`
        m_tapToPlace = GetComponent<tapToPlace>();
        m_tapToPlace._contentVisibleEvent += contentVisible; // Subscribe to the event `content is created` (user tap)
    }

    void planeFound(ARPlanesChangedEventArgs args)
    {
        // Plane found: switch the video and update the texts
        videoUX.clip = v_tapPlace;
        t_info1.text = "Tap to Place";
        t_info2.text = "Surface found";
        m_ARPlaneManager.planesChanged -= planeFound; // Unsubscribe
        if (isContentVisible)
        { // Content already created: turn off UI and video
            videoUX.gameObject.SetActive(false);
            t_info1.gameObject.SetActive(false);
            t_info2.gameObject.SetActive(false);
        }
    }

    void contentVisible()
    {
        isContentVisible = true; // the content is now visible
        m_tapToPlace._contentVisibleEvent -= contentVisible; // Unsubscribe
        // Content created: turn off UI and video
        videoUX.gameObject.SetActive(false);
        t_info1.gameObject.SetActive(false);
        t_info2.gameObject.SetActive(false);
    }
}
Add the script to the XR Origin GameObject and set the public variables. In order to see the script in action, we need to build and deploy the app.
The Lean Touch Asset provides a quick and easy way to add multiple gestures to your AR project without writing (almost) any code.
Installation is a two-step process. First, download the asset from the Unity Asset Store to add it to your asset collection (there are two versions of Lean Touch; the Free version is enough for our needs).
Head to the Unity Asset Store.
Second, install it in Unity by going to Window -> Package Manager, searching under Packages: My Assets for Lean Touch, then downloading and importing it.
Add the LeanTouch GameObject by right-clicking on the Hierarchy panel and selecting Lean -> Touch.
We now need to add the touch controls to our object (ARObject); there are numerous options, and Lean Touch can be used for any application with a touch screen.
Double-click your AR Object Prefab to open it in Edit mode and click on Add Component. If you type in Lean you will see a long list of options. Our first one is Lean Selectable, and we want to tick the Self Selected option; this simply makes sure our object is automatically selected and ready to touch.
Next, add:
- a Lean Pinch Scale with Required Finger Count: 3;
- a Lean Twist Rotate Axis with Required Finger Count: 2; we are rotating around the y axis, so set Axis Y to 1;
- a Lean Drag Translate with Required Finger Count: 1.
As the Lean Drag Translate will conflict with the tapToPlace script, we can change the latter and use Lean Touch for the input tap.
Open the tapToPlace script in VSCode (tapToPlace should be attached to the XROrigin GameObject), comment out the Update function and add a new OnFingerTap(LeanFinger finger) function. The timeThreshold variable is no longer used.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.InputSystem;
using Lean.Touch;

[RequireComponent(typeof(ARRaycastManager))]
public class tapToPlaceLean : MonoBehaviour
{
    public GameObject gameObjectToInstantiate; // the Prefab GameObject to instantiate in the AR environment. To be added in the Inspector window
    private GameObject spawnedObject; // the Prefab instantiated in the scene. Used internally by the script
    private ARRaycastManager _arRaycastManager; // part of the XROrigin
    static List<ARRaycastHit> hits = new List<ARRaycastHit>();
    public float timeThreshold = 0.5f; // no longer used: previously the user had to tap and hold for at least 0.5 sec to create the content
    public bool isTouching = false;
    // Event designed to fire when content is created
    public delegate void ContentVisibleDelegate();
    public event ContentVisibleDelegate _contentVisibleEvent;

    private void Awake()
    {
        _arRaycastManager = GetComponent<ARRaycastManager>();
    }

    private void OnEnable()
    {
        LeanTouch.OnFingerTap += OnFingerTap;
    }

    private void OnDisable()
    {
        LeanTouch.OnFingerTap -= OnFingerTap;
    }

    public bool TryGetTouchPosition(out Vector2 touchPosition)
    {
        if (Touchscreen.current.primaryTouch.press.isPressed)
        {
            isTouching = true;
            touchPosition = Touchscreen.current.primaryTouch.position.ReadValue();
            return true;
        }
        touchPosition = default;
        isTouching = false;
        timeThreshold = 0;
        return false;
    }

    /*
    void Update()
    {
    [...]
    }
    */

    private void OnFingerTap(LeanFinger finger)
    {
        if (finger.TapCount == 2) // Check for a double tap
        {
            Vector2 touchPosition = finger.ScreenPosition;
            if (_arRaycastManager.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
            {
                var hitPose = hits[0].pose;
                if (spawnedObject == null)
                {
                    spawnedObject = Instantiate(gameObjectToInstantiate, hitPose.position, hitPose.rotation);
                    _contentVisibleEvent?.Invoke();
                }
                else
                {
                    spawnedObject.transform.position = hitPose.position;
                }
            }
        }
    }
}
Exit from the Prefab editing mode, select the LeanTouch GameObject and, in the Inspector panel, add a Lean Select By Finger component and set the Camera to the Main Camera in the scene.
With a double tap we will create the object, and a single tap is used to select the object.
We also need to change the text in the uiAR script to Double Tap to Place.
You should now be able to place the object with a double tap, then scale it with three fingers, rotate it with two fingers and drag it with one finger.
We are now going to play and control the animation of the gauge. In this example, the animation needs to be part of the imported FBX object.
- Create -> Animator Controller, provide a name for the controller (e.g. GaugeAnimatorController) and double-click on it to open the Animator panel.
- Right-click on an empty area of the Animator panel (not on the three existing blocks: Any State, Entry, Exit) and select Create State -> Empty. A New State block, linked with the Entry block, will be created. If it is not automatically linked, right-click on the Entry block, select Set StateMachine Default State and create the link with the New State block.
- Drag the animation into the Animator panel (not the Prefab, but the FBX imported in Unity). A new grey block with the name of the animation will appear (e.g. Scene).
- Right-click on the Any State block and Make Transition with the new grey block Scene.
- In the Parameters tab, click on the + button and add a Trigger named explodeTrigger and a Float named AnimationSpeed set to 1.1 (any value or name can be used); we will need these later.
- Select the Scene block and, in the Inspector, tick the Speed Multiplier Parameter and select AnimationSpeed from the drop-down menu.
- In the Animator panel, click on the arrow that connects Any State with Scene and, in the Inspector panel, add a new Condition and set explodeTrigger as the condition to start the animation.
The Animator Controller is now ready to be connected with a user interface. In this example we are controlling the animation using the Lean Touch asset. Specifically, we want to use a single tap on the object to start the animation (explodeTrigger), and another single tap on the object to play the animation backwards (AnimationSpeed from 1 to -1).
Add an Animator component (from Component -> Miscellaneous) to the gauge model inside the Prefab (not the parent Prefab) and set its Controller to GaugeAnimatorController.
Create a new script named animationExplode and add it to the gauge model inside the Prefab (not the parent Prefab). Open the script and add the following code blocks:
using UnityEngine;
[RequireComponent(typeof(Animator))]
public class animationExplode : MonoBehaviour
{
    Animator animator;
    float speed;
    bool isExploding = false;

    void Start()
    {
        // Get the Animator component
        animator = GetComponent<Animator>();
        speed = animator.GetFloat("AnimationSpeed");
    }
The [RequireComponent(typeof(Animator))] attribute ensures that the Animator component is added to the GameObject. The variable speed is used in this case to control the direction of the animation.
    public void Explosion()
    {
        if (isExploding == false)
        {
            // Add a Tag value to the animation block: select the animation block in the Animator Controller and set its Tag in the Inspector
            if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1"))
            {
                speed = 1;
                animator.SetFloat("AnimationSpeed", speed);
                isExploding = true;
            }
            else
            {
                animator.SetTrigger("explodeTrigger");
                isExploding = true;
            }
        }
        else
        {
            if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1"))
            {
                speed = -1;
                animator.SetFloat("AnimationSpeed", speed);
                isExploding = false;
            }
            else
            {
                speed = -1;
                animator.SetFloat("AnimationSpeed", speed);
                animator.SetTrigger("explodeTrigger");
                isExploding = false;
            }
        }
    }
The Explosion function is used to control the animation and will be triggered when the ARObject is selected using a Lean Finger Tap.
    void Update()
    {
        if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1") && animator.GetCurrentAnimatorStateInfo(0).normalizedTime > 1)
        {
            animator.Play("Scene", -1, 1);
        }
        else if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1") && animator.GetCurrentAnimatorStateInfo(0).normalizedTime < 0)
        {
            animator.Play("Scene", -1, 0);
        }
    }
}
The Update function contains a conditional to detect when the animation has finished (resetting its time to 0 or 1).
Back inside the Prefab, in the Inspector:
- in the Lean Selectable component, expand Show Unused Events, add a new event in OnSelected, drag the Gauge model inside the Prefab onto it, and select the function animationExplode.Explosion();
- check that Lean Pinch Scale, Lean Twist Rotate Axis and Lean Drag Translate all have the Gauge itself in the Required Selectable field (you should see ARObject (Lean Selectable), or the name you gave to the Gauge, in the field);
- add a Box Collider to the parent Prefab of the Gauge. Set the Size (e.g. 0.15, 0.1, 0.15) and Centre (e.g. Y: 0.1) to fit the Gauge.
Exit the Prefab Edit Mode, select the LeanTouch GameObject in the Hierarchy window and, in the Inspector window, add a Lean Finger Tap component, expand Show Unused Events and add a new event to On Finger. Drag the LeanTouch GameObject itself to the field and, in the No Function drop-down menu, select LeanSelectByFinger -> SelectScreenPosition.
You can now Build and Run your project to place the digital gauge in your environment.
Just as we can control a digital object using MQTT, we can use the same library to publish MQTT messages and therefore control physical devices or other services from the AR mobile app. In this example, we will use the AR app to control a custom website.
Open the Prefab ARObject in Edit Mode:
- Add a UI -> Canvas to the root, select it, and in the Inspector window:
  - set Canvas -> Render Mode to World Space;
  - set Pos X = 0, Pos Y = 0, Pos Z = 0, Width = 0.2 and Height = 0.2;
  - add a Horizontal Layout Group with Child Alignment set to Lower Center.
- Add a UI -> Button - TextMeshPro, rename it ButtonRow, select it, and in the Inspector window:
  - set Pos Z = -0.1, Width = 0.05, Height = 0.05 and Rotation X = 45;
  - remove the Source Image and change the colour;
  - add a Vertical Layout Group with Child Alignment set to Middle Center and only Control Child Size selected (both Width and Height).
- Expand ButtonRow and duplicate its Text (TMP) child:
  - rename the first Text (TMP) to Row and, in the Inspector window, change the Text to Row and the Font Size to 0.01;
  - rename the second Text (TMP) to RowNumber and, in the Inspector window, change the Text to 1 and the Font Size to 0.02.
- Duplicate the ButtonRow GameObject twice: in the first copy, change the names and texts to Column and columnNumber; in the second, rename the GameObject Column to Grid and change the Text accordingly.
The structure in the Hierarchy should look like this:
We now need to create a new script named mqttPublish to publish the message to the broker:
using UnityEngine;
using TMPro;

public class mqttPublish : MonoBehaviour
{
    public string tag_mqttManager = ""; // to be set in the Inspector panel. It must match the Tag of one of the mqttManager.cs GameObjects
    [Header(" Case Sensitive!!")]
    [Tooltip("the topic to publish !!Case Sensitive!! ")]
    public string topicPublish = ""; // the topic to publish to
    public mqttManager _eventSender;
    private bool _connected;
    private int row = 1;
    private int column = 1;
    private int[] colour;
    public TextMeshProUGUI rowLabel;
    public TextMeshProUGUI columnLabel;
    public string myName;
    private string messagePublish = "";

    void Awake()
    {
        if (GameObject.FindGameObjectsWithTag(tag_mqttManager).Length > 0)
        {
            _eventSender = GameObject.FindGameObjectsWithTag(tag_mqttManager)[0].gameObject.GetComponent<mqttManager>();
            _eventSender.OnConnectionSucceeded += OnConnectionSucceededHandler;
        }
        else
        {
            Debug.LogError("At least one GameObject with an mqttManager component and Tag == tag_mqttManager needs to be provided");
        }
    }

    private void OnConnectionSucceededHandler(bool connected)
    {
        _connected = true; // track whether we are connected to the MQTT broker
    }

    public void rowNumber()
    {
        row = (++row > 8) ? 1 : row; // cycle through 1-8
        rowLabel.text = row.ToString();
    }

    public void columnNumber()
    {
        column = (++column > 8) ? 1 : column; // cycle through 1-8
        columnLabel.text = column.ToString();
    }

    public void Grid()
    {
        colour = new int[3] {
            UnityEngine.Random.Range(0, 256), // Range: 0-255
            UnityEngine.Random.Range(0, 256), // Range: 0-255
            UnityEngine.Random.Range(0, 256)  // Range: 0-255
        };
        messagePublish = "{\"myName\":\"" + myName + "\",\"row\":" + row + ",\"column\":" + column + ",\"colour\":\"" + colour[0] + "," + colour[1] + "," + colour[2] + "\"}";
        if (!_connected) // publish only if connected
            return;
        if (messagePublish.Length > 0)
        {
            _eventSender.topicPublish = topicPublish;
            _eventSender.messagePublish = messagePublish;
        }
        _eventSender.Publish();
        Debug.Log("Publish " + messagePublish);
    }
}
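For reference, with myName set to "demo", row 3 and column 5, the Grid() method publishes a JSON payload of this shape (the colour values are random, so they will differ on each tap):

```json
{"myName":"demo","row":3,"column":5,"colour":"120,45,200"}
```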
Attach the script to the root of the Prefab (e.g. ARObject) and, in the Inspector:
- set tag_mqttManager to the Tag of the mqttManager GameObject;
- set the Topic Publish field;
- set the My Name field.
Before closing the Prefab Edit Mode, set the behaviour of the three buttons:
- select ButtonRow and, in the Inspector, under Button, add a new On Click event. Drag the Prefab root (e.g. ARObject, with the mqttPublish script attached) into the empty field and select the function mqttPublish -> rowNumber;
- repeat for the other two buttons, selecting mqttPublish -> columnNumber and mqttPublish -> Grid respectively.
Additionally, we need to set the user ID and password in the mqttManager and change the port accordingly. Build and Run the App.