This workshop will show you how to:

Final Result

To complete this workshop you will need:

Languages used:

Additional resources

The onboarding experience is an important aspect of any AR app. Its main purpose is to give users straightforward information on how to use the app. While each application will have its own specific instructions, the initial steps (e.g., finding a plane, placing an object) are common to almost any application. There are no strict rules on what the onboarding experience should look like; what matters is that the user is informed from the beginning about what they need to do to use the application (e.g., which button to press, which actions to perform).

Overlays video

One effective solution is to overlay videos and/or text to guide the user. Both uGUI and UI Toolkit can be used to display and control the user interface. In this tutorial, we are going to use the uGUI system.

Canvas Screen space

The starting point of this workshop is the previous project 6: Unity AR Physical to Digital.

Before building the UI, we need to import the onboarding AR videos. Import findaplane.webm, findanimage.webm and taptoplace.webm into a subfolder (e.g. Videos) of the Assets folder.

Inside the same subfolder, create a RenderTexture named UXRenderTexture and, in the Inspector window, change the size to 512 x 512 and Anti-aliasing to 2 samples.
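
If you prefer to create this asset from code, a minimal editor-script sketch (assuming the same Videos subfolder and asset name used above) could look like this:

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class createUXRenderTexture
{
    [MenuItem("Tools/Create UX RenderTexture")]
    static void Create()
    {
        //Same settings as the manual steps: 512 x 512, 2x anti-aliasing
        //Assumes the Assets/Videos folder already exists
        var rt = new RenderTexture(512, 512, 0) { antiAliasing = 2 };
        AssetDatabase.CreateAsset(rt, "Assets/Videos/UXRenderTexture.renderTexture");
        AssetDatabase.SaveAssets();
    }
}
#endif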

Render Texture

Create a new UI -> Canvas and change the name to Onboarding

Create Canvas

In the Inspector Window:

Create a new Empty Game Object inside the Canvas GameObject and name it UIRoot; this is going to be the container of the UI.

With the UIRoot selected, in the Inspector Window:

UIRoot element

We can now add the actual components that hold the UX information:

The order of these components in the Hierarchy Panel is important.

With both Info1 and Info2 selected, in the Inspector Window:

Select the Info1 GameObject and in the TextMeshPro - Text (UI) component change the text input to:

Similarly, for the Info2, change the text to:

Select the Video GameObject, and in the Inspector Window:

Finally, add a new UI -> Raw Image to the Video GameObject, name it videoUX, and in the Inspector Window:

Video Player UI
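
For reference, the same wiring between the Video Player, the RenderTexture and the Raw Image can be expressed in code. The following is a minimal sketch (assuming the clip and the assets above are assigned in the Inspector), not a required workshop step:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

//Minimal sketch of the Inspector wiring: the VideoPlayer renders into the
//RenderTexture, and the RawImage displays that texture on the Canvas
public class videoUXSetup : MonoBehaviour
{
    public VideoPlayer videoPlayer;       //on the Video GameObject
    public RenderTexture uxRenderTexture; //the UXRenderTexture created earlier
    public RawImage rawImage;             //the videoUX Raw Image

    void Start()
    {
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = uxRenderTexture;
        videoPlayer.isLooping = true;
        rawImage.texture = uxRenderTexture;
        videoPlayer.Play();
    }
}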

Press Play; you should be able to see the UI and the video running in the centre.

Play video UI

Control the UI

The final step is to make the UI respond to the underlying ARFoundation system. This involves changing the text and video when the surface is found and removing the interface once the user places the object. This functionality is controlled by a new C# script.

Create a new script named uiAR

using UnityEngine;
using UnityEngine.Video;
using UnityEngine.XR.ARFoundation;
using TMPro;

//Script needs to be added to the XROrigin GameObject, as it has the ARPlaneManager and tapToPlace components
public class uiAR : MonoBehaviour
{
    //Video player
    public VideoPlayer videoUX;
    //Video Clips
    public VideoClip v_findSurface;
    public VideoClip v_tapPlace;
    //Info texts
    public TextMeshProUGUI t_info1;
    public TextMeshProUGUI t_info2;

    // Events for found plane and content created
    private ARPlaneManager m_ARPlaneManager; //ARFoundation system
    private tapToPlace m_tapToPlace; //used to detect when the user creates the content

    bool isContentVisible = false;

    // Awake is called when the script instance is loaded
    private void Awake()
    {
        videoUX.clip = v_findSurface;
        m_ARPlaneManager = GetComponent<ARPlaneManager>();
        m_ARPlaneManager.planesChanged += planeFound; //Subscribe to the event `plane is detected`

        m_tapToPlace = GetComponent<tapToPlace>();
        m_tapToPlace._contentVisibleEvent += contentVisible; //Subscribe to the event `content is created` (user Tap)
    }

    void planeFound(ARPlanesChangedEventArgs args)
    {
        //Plane found, switch the video and text to the tap-to-place instructions
        videoUX.clip = v_tapPlace;
        t_info1.text = "Tap to Place";
        t_info2.text = "Surface found";
        m_ARPlaneManager.planesChanged -= planeFound; //Unsubscribe

        if (isContentVisible)
        {   //Content created, turn off UI and Video
            videoUX.gameObject.SetActive(false);
            t_info1.gameObject.SetActive(false);
            t_info2.gameObject.SetActive(false);
        }

    }

    void contentVisible()
    {
        isContentVisible = true; //if the content is visible
        m_tapToPlace._contentVisibleEvent -= contentVisible; //Unsubscribe

        //Content created, turn off UI and Video
        videoUX.gameObject.SetActive(false);
        t_info1.gameObject.SetActive(false);
        t_info2.gameObject.SetActive(false);
    }
}

Add the script to the XR Origin GameObject and set the public variables in the Inspector.

Play video UI

To see the script in action, we need to build and deploy the app.

Play video UI

The Lean Touch Asset provides a quick and easy way to add multiple gestures to your AR project without writing (almost) any code.

Installation is a two-step process. First, download the Unity asset (there are two versions of Lean Touch; the Free version is enough for our needs) from the Unity Asset Store to add it to your asset collection.

Head to the Unity Asset Store

Lean Touch

Second, install it in Unity by going to Window -> Package Manager

Search under Packages: My Assets for Lean Touch, then download and import it.

Install

Add the LeanTouch GameObject by right-clicking on the Hierarchy panel and selecting Lean -> Touch

We now need to add the touch controls to our object (ARObject). There are numerous options, and Lean Touch can be used for any application with a touch screen.

Double-click your AR Object Prefab to open it in Edit mode and click on Add Component. If you type in Lean you will see a long list of options. Our first one is Lean Selectable, where we want to tick the Self Selected option; this simply makes sure our object is automatically selected and ready to touch.
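
If you prefer to set these components up from code rather than the Inspector, a minimal sketch could look like the following. Note that the SelfSelected property name is an assumption based on the Self Selected checkbox shown in the Inspector; check the API of your Lean Touch version:

using UnityEngine;
using Lean.Touch;

//Optional sketch: add the Lean components to the ARObject from code instead
//of the Inspector. Attach this to the ARObject Prefab root.
public class leanSetup : MonoBehaviour
{
    void Awake()
    {
        //Equivalent of ticking Self Selected in the Inspector (property name
        //is an assumption; verify it in your Lean Touch version)
        var selectable = gameObject.AddComponent<LeanSelectable>();
        selectable.SelfSelected = true;

        //One-finger drag to move the object (added in the next step)
        gameObject.AddComponent<LeanDragTranslate>();
    }
}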

As Lean Drag Translate will conflict with the tapToPlace script, we can change the latter to use Lean Touch for the input tap.

Open the tapToPlace script in VSCode (tapToPlace should be attached to the XROrigin GameObject), comment out the Update function, and add a new OnFingerTap(LeanFinger finger) function. The timeThreshold variable is no longer used.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.InputSystem;

using Lean.Touch;


[RequireComponent(typeof(ARRaycastManager))]
public class tapToPlace : MonoBehaviour
{
    public GameObject gameObjectToInstantiate; //the Prefab GameObject to instantiate in the AR environment. To be added in the inspector window
    private GameObject spawnedObject; //the Prefab Instantiate in the scene. Used internally by the script 
    private ARRaycastManager _arRaycastManager; //part of the XROrigin

    static List<ARRaycastHit> hits = new List<ARRaycastHit>();
    public float timeThreshold = 0.5f; //no longer used now that Lean Touch handles the tap
    public bool isTouching = false;

    //Event design to fire when content is created
    public delegate void ContentVisibleDelegate();
    public event ContentVisibleDelegate _contentVisibleEvent;

    private void Awake()
    {
        _arRaycastManager = GetComponent<ARRaycastManager>();
    }

    private void OnEnable()
    {
        LeanTouch.OnFingerTap += OnFingerTap;
    }

    private void OnDisable()
    {
        LeanTouch.OnFingerTap -= OnFingerTap;
    }

    //No longer called now that Lean Touch provides the tap; kept from the original script
    public bool TryGetTouchPosition(out Vector2 touchPosition)
    {
        if (Touchscreen.current.primaryTouch.press.isPressed)
        {
            isTouching = true;
            touchPosition = Touchscreen.current.primaryTouch.position.ReadValue();
            return true;
        }
        touchPosition = default;
        isTouching = false;
        timeThreshold = 0;
        return false;
    }

    /* 
    void Update()
     {
      [...]
     }
     */
    private void OnFingerTap(LeanFinger finger)
    {
        if (finger.TapCount == 2) // Check for double tap
        {
            Vector2 touchPosition = finger.ScreenPosition;

            if (_arRaycastManager.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
            {
                var hitPose = hits[0].pose;

                if (spawnedObject == null)
                {
                    spawnedObject = Instantiate(gameObjectToInstantiate, hitPose.position, hitPose.rotation);
                    _contentVisibleEvent?.Invoke();
                }
                else
                {
                    spawnedObject.transform.position = hitPose.position;
                }
            }
        }
    }
}

Exit the Prefab editing mode, select the LeanTouch GameObject, and in the Inspector Panel:

A double tap will create the object, and a single tap will be used to select it.

We also need to change the text in the uiAR script to Double Tap to Place.

You should now be able to:

We are now going to play and control the animation of the gauge. In this example, the animation needs to be part of the imported FBX object.

FBX animation Inspector

Empty State

New State

New State

Inspector Multiplier

Conditions animation

The Animator controller is now ready to be connected to a user interface. In this example, we are controlling the animation using the Lean Touch asset. Specifically, we want a single tap on the object to start the animation (explodeTrigger), and another single tap on the object to play the animation backwards (AnimationSpeed from 1 to -1).

using UnityEngine;
[RequireComponent(typeof(Animator))]
public class animationExplode : MonoBehaviour
{
    Animator animator;
    float speed;
    bool isExploding = false;

    void Start()
    {
        //Get Animator component
        animator = GetComponent<Animator>();
        speed = animator.GetFloat("AnimationSpeed");
    }

The [RequireComponent(typeof(Animator))] attribute ensures that the Animator component is present on the GameObject. The variable speed is used in this case to control the direction of the animation.

    public void Explosion()
    {
        if (isExploding == false)
        {
            //Add a Tag value to the animation state: select the state in the Animator Controller and set its Tag in the Inspector
            if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1"))
            {
                speed = 1;
                animator.SetFloat("AnimationSpeed", speed);
                isExploding = true;
            }
            else
            {
                animator.SetTrigger("explodeTrigger");
                isExploding = true;
            }

        }

        else
        {
            if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1"))
            {
                speed = -1;
                animator.SetFloat("AnimationSpeed", speed);
                isExploding = false;
            }
            else
            {
                speed = -1;
                animator.SetFloat("AnimationSpeed", speed);
                animator.SetTrigger("explodeTrigger");
                isExploding = false;
            }

        }
    }

The Explosion function is used to control the animation, and it will be triggered when the ARObject is selected using a Lean Finger Tap.

    
    void Update()
    {
        //Hold the animation at its last frame once it has finished playing forwards...
        if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1") && animator.GetCurrentAnimatorStateInfo(0).normalizedTime > 1)
        {
            animator.Play("Scene", -1, 1);
        }
        //...and at its first frame once it has finished playing backwards
        else if (animator.GetCurrentAnimatorStateInfo(0).IsTag("1") && animator.GetCurrentAnimatorStateInfo(0).normalizedTime < 0)
        {
            animator.Play("Scene", -1, 0);
        }
    }
}

The Update function contains conditional checks to detect when the animation has finished, resetting its normalized time to 1 or 0 so it can be replayed in either direction.

Exit the Prefab Edit Mode, select the LeanTouch GameObject in the Hierarchy window, and in the Inspector window:
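
If you prefer wiring the tap in code rather than through the Inspector events, a hypothetical component along these lines could do the same job (tapToExplode is not part of the workshop assets, and it assumes the ARObject has a Collider so the physics raycast can hit it):

using UnityEngine;
using Lean.Touch;

//Hypothetical alternative to the Inspector setup: on a single tap, raycast
//from the camera through the finger position and, if the ARObject is hit,
//call its Explosion() function
public class tapToExplode : MonoBehaviour
{
    void OnEnable()  { LeanTouch.OnFingerTap += OnFingerTap; }
    void OnDisable() { LeanTouch.OnFingerTap -= OnFingerTap; }

    void OnFingerTap(LeanFinger finger)
    {
        if (finger.TapCount != 1) return; //single tap only; a double tap places the object

        Ray ray = Camera.main.ScreenPointToRay(finger.ScreenPosition);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            var explode = hit.transform.GetComponentInParent<animationExplode>();
            if (explode != null) explode.Explosion();
        }
    }
}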

You can now Build and Run your project to place the digital gauge in your environment.

Just as we can control a digital object using MQTT, we can use the same library to publish MQTT messages and therefore control physical devices or other services from the AR mobile app. In this example, we will use the AR app to control a custom website.

Open the Prefab ARObject in Edit Mode

The structure in the Hierarchy should look like this

UI for the Grid

We now need to create a new script named mqttPublish to publish the message to the broker

using UnityEngine;
using TMPro;

public class mqttPublish : MonoBehaviour
{
    public string tag_mqttManager = ""; //to be set in the Inspector panel. It must match the Tag of one of the mqttManager.cs GameObjects
    [Header("   Case Sensitive!!")]
    [Tooltip("the topic to publish !!Case Sensitive!! ")]
    public string topicPublish = ""; //the topic to publish to
    public mqttManager _eventSender;
    private bool _connected;
    private int row = 1;
    private int column = 1;
    private int[] colour;
    public TextMeshProUGUI rowLabel;
    public TextMeshProUGUI columnLabel;
    public string myName;
    private string messagePublish = "";

    void Awake()
    {
        if (GameObject.FindGameObjectsWithTag(tag_mqttManager).Length > 0)
        {
            _eventSender = GameObject.FindGameObjectsWithTag(tag_mqttManager)[0].gameObject.GetComponent<mqttManager>();
            _eventSender.OnConnectionSucceeded += OnConnectionSucceededHandler;
        }
        else
        {
            Debug.LogError("At least one GameObject with mqttManager component and Tag == tag_mqttManager needs to be provided");
        }
    }

    private void OnConnectionSucceededHandler(bool connected)
    {
        _connected = connected; //track whether we are connected to the MQTT Broker
    }

    public void rowNumber()
    {
        row = (++row > 8) ? 1 : row; //cycle the row number between 1 and 8
        rowLabel.text = row.ToString();
    }

    public void columnNumber()
    {
        column = (++column > 8) ? 1 : column; //cycle the column number between 1 and 8
        columnLabel.text = column.ToString();
    }

    public void Grid()
    {
        colour = new int[3] {
            UnityEngine.Random.Range(0, 256), // Range: 0-255
            UnityEngine.Random.Range(0, 256), // Range: 0-255
            UnityEngine.Random.Range(0, 256)  // Range: 0-255
        };

        messagePublish = "{\"myName\":\"" + myName + "\",\"row\":" + row + ",\"column\":" + column + ",\"colour\":\"" + colour[0] + "," + colour[1] + "," + colour[2] + "\"}";

        if (!_connected) //only publish if we are connected to the broker
            return;

        _eventSender.topicPublish = topicPublish;
        _eventSender.messagePublish = messagePublish;
        _eventSender.Publish();
        Debug.Log("Publish: " + messagePublish);
    }
}
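
For reference, the Grid() method above publishes a JSON payload along these lines (the values shown are illustrative and will vary at runtime):

{"myName":"aStudent","row":3,"column":5,"colour":"128,64,255"}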

Attach the script to the root of the prefab (e.g. ARObject)

UI for the Grid

Before closing the Prefab Edit Mode, set the behaviour of the three buttons

UI behaviour

Additionally, we need to set the user ID and password in the mqttManager and change the port accordingly. Build and Run the App

Final Result