This workshop will show you how to:

- upload and use AltspaceVR
- code and deploy an MRE

To complete this workshop you will need an AltspaceVR account, Unity, Visual Studio Code and Node.js (each is introduced in the sections below).

Additional resources

In addition to traditional modelling techniques, 3D photogrammetry is an excellent solution for the rapid prototyping of both small objects and entire rooms and buildings. A well-planned photogrammetry survey can speed up the conceptual phase and save time across the entire modelling process.

There are multiple software solutions, both commercial (Agisoft Metashape, Capturing Reality, Pix4D) and free or open source (Meshroom, Regard3D, COLMAP), that provide step-by-step processes to easily create digital 3D models.

The input data for these solutions are photos or videos taken with high-resolution cameras (which generally provide better results), smartphones or 360° cameras.

A series of online resources are available to start working with 3D photogrammetry, explaining which lenses to use, how to take the photos, and the optimal light conditions (e.g. Sketchfab, Agisoft). Nevertheless, a good 3D photogrammetry model is 50% technique and 50% experience and trial and error. Every object and every room has unique features, details, materials and accessibility, so the approach to use, the number of photos to take and the software settings need to be considered case by case.

The model provided for this tutorial, the UCL Flaxman Gallery, was created using Agisoft Metashape and 126 images (Sony a6400). The Gallery is a good testing environment, presenting various features, details and materials that are not always easy to capture (the result of the capture can be found on Sketchfab).

Flaxman Gallery World

Feel free to use your own digital space to complete this tutorial. Bear in mind that large environments need to be optimised in terms of the number of vertices and the resolution of the textures before being used on AltspaceVR or any other AR or VR system.

Shared experiences using AltspaceVR

AltspaceVR is a free-to-use social VR platform. Currently owned by Microsoft, it was initially developed using A-Frame and is now based on Unity3D, which allows higher performance and better use of system resources, especially in a VR context. While most of its features can also be found in other VR platforms, AltspaceVR, thanks to Mixed Reality Extensions (MRE), allows developers to introduce custom scripts and components. AltspaceVR provides an easy-to-use Unity package to upload virtual environments (Templates) and groups of digital objects (Kits) directly from the game engine.

AltspaceVR organises users' digital spaces into Universes and Worlds, Templates, Kits and MREs.

The first step is to create a World from the AltspaceVR website. From the top menu select More -> Worlds, then select My Worlds and create a Universe.

Universe

Inside the Universe, create a World. Together with the name and description, it is possible to add an image and to set the privacy level. Additional settings are available after its creation (e.g. users allowed to enter and edit the world).

World

After this setup, we can now move to Unity to create the scene that will be used as a Template for our brand new world, uploaded using the AltspaceVR Uploader package.

AltspaceVR currently uses the new Unity Universal Render Pipeline (URP). URP generally provides better performance, with a few limitations, compared to the standard rendering pipeline.

Prepare the Template

In Unity Hub create a new project using the template Universal Render Pipeline (note that, at the time of writing, Unity versions 2020.3.9f1 and 2020.3.18 are the only ones compatible with the AltspaceVR Uploader package).

World

The downloaded AltspaceVR Uploader package (*.tgz file) needs to remain accessible to Unity even after its installation, as it is used as a reference. A good location is a subfolder of the main Assets folder.

In Unity, open the Package Manager and, from the + dropdown, select Add package from tarball and select the downloaded *.tgz file.

World

A new menu item, AltspaceVR, will be available. Select AltspaceVR -> Templates to open the uploader interface. To use the plugin we need to log in with an AltspaceVR account.

World

Press Create New Template: the default browser will open the web interface (you may need to log in again). Provide a name, an image (optional) and a short description for the Template (these can also be added or changed later). Select restricted to keep the Template private.

Create a new Unity scene to upload as AltspaceVR Template.

Import all the static digital models into the scene (all the digital assets that the user will not interact with), generally the main environment and the furniture. We are going to use the Flaxman Gallery model, but you can also test the template with the CE Studio model, another digital environment created using 3D photogrammetry, or any other abstract model.

The 3D model needs colliders to prevent the user from falling outside the boundaries of the environment. These can be created automatically in Unity:

Colliders Models

for the Flaxman Gallery, in the Inspector window, set the Location field to Use External Materials (Legacy)

if needed, generate lightmaps. Realtime lights are supported but they may impact the performance of the scene (especially if multiple objects are used).

Add a plinth in the centre of the scene (e.g. a simple Unity GameObject Cube, or a 3D model downloaded from Sketchfab)

3D Models

Upload the scene

Using the AltspaceVR Uploader Templates (AltspaceVR -> Templates)

3D Models

The package will create and upload the Template to AltspaceVR. Back in the AltspaceVR web interface, we can now link the World with the Template:

Template

The web interface is also useful for checking the status of the Templates, the versions created and where (in which Worlds) they have been used.

Template.

We are now ready to test our World: in the AltspaceVR Client (2D or VR), from Main Menu -> Worlds -> My Worlds, select and enter the created World. AltspaceVR will automatically place the correct player controller for each platform (even though there are no Cameras in the Unity scene).

Loading the World

If the World has been set as public, other users can join the same session. The Client also provides features to take screenshots and to enable and disable the microphone and the personal bubble, as well as, on the right-hand side, Host tools to manage social events and World Editor tools to customise the environment directly from the Client (e.g. add objects from Kits and MREs).

Loading the World

The AltspaceVR Uploader package can also be used to create Kits, to place interactable and non-interactable objects in the AltspaceVR environment. A Kit can be used across Worlds and shared with other users. The digital objects in a Kit can be cloned, and multiple instances of the same object can be created.

To create a new Kit:

Generate Colliders

Create the Kit

To use the Kit, enter the created World using AltspaceVR Client:

Kit

Kit position

Reload the Kit

A gentle introduction to MRE

Importing custom scripts into Virtual Reality social platforms is generally very limited, mostly (but not only) for security reasons. AltspaceVR provides a system called MRE (Mixed Reality Extension) that supports custom scripts in both private and public Worlds.

In order to create and test the MRE we need to install Node.js and the VS Code extension ESLint.

Open the folder in VS Code and, from the top menu bar, open Terminal -> New Terminal. The file package.json already contains all the dependencies we need to install (at the bottom of the file). In the terminal type npm install and, when finished, npm audit fix to patch possible vulnerabilities.

To test the Hello World MRE, in the terminal type npm run build: this command compiles the source scripts, located in the folder src (server.ts and app.ts), into a new folder, built.

In the terminal type npm start: this command runs a local server, and the MRE will be accessible via WebSocket at the address and port ws://127.0.0.1:3901.

Enter your World from the AltspaceVR Client and, from the right-side UI, select World Editor -> Editor Panel, then SDK Apps -> Local Server. The MRE will load after a few seconds: an interactive cube (it changes scale when the cursor is on it and spins when clicked) with the text Hello World! on top of it.

MRE Hello world

The language used to write MREs is called TypeScript. TypeScript is built on JavaScript and is a superset of it; therefore everything that works in JavaScript will work in TypeScript as well (and should look familiar). It is a strongly typed language, and most errors can be found directly in the editor before running the code.
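As a quick illustration of that static typing (the function below is a made-up example, not part of the MRE SDK), a type annotation is enough for the editor to flag a wrong call before the code ever runs:

```typescript
// Illustrative only: a typed helper the editor can check statically.
// Restricting axis to the literal types 'x' | 'y' | 'z' means a typo
// such as 'w' is reported in the editor, not at runtime.
function rotationLabel(angle: number, axis: 'x' | 'y' | 'z'): string {
	return `rotate ${angle} deg around ${axis}`;
}

const label = rotationLabel(90, 'y');   // OK
// rotationLabel('90', 'w');            // both arguments flagged by the editor
```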

Customise the MRE

The MRE Hello World has multiple files and folders. As it is a sample project for various scenarios, some of them are not going to be used in our example and we can safely remove them. The project could also be created from scratch, but some of the settings are the same for every MRE, so it is easier to modify the existing files.

/*!
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the MIT License.
 */

import * as MRE from '@microsoft/mixed-reality-extension-sdk';
/**
 * The main class of this app. All the logic goes here.
 */
export default class MQTTGauge {
 private gaugeBody: MRE.Actor = null;
 private pointer: MRE.Actor = null;
 private assets: MRE.AssetContainer;
 private light: MRE.Actor = null;
}

We can now create the context variable in the constructor and, when the context is started, execute the function this.started() that we are going to create next.

constructor(private context: MRE.Context) {
		this.context.onStarted(() => this.started());
	}
/*
* Once the context is "started", initialize the app.
*/
	private async started() {
  
  }
// set up somewhere to store loaded assets (meshes, textures, animations, gltfs, etc.)
this.assets = new MRE.AssetContainer(this.context);

// Load the glTF models before we use them; "box" refers to the collider to apply

const gaugeBodysrc = await this.assets.loadGltf('Gauge_Body.glb', "box");
const gaugePointersrc = await this.assets.loadGltf('Pointer.glb', "box");

// spawn a copy of the Gauge Body glTF model
this.gaugeBody = MRE.Actor.CreateFromPrefab(this.context, {
  // using the data we loaded earlier
	firstPrefabFrom: gaugeBodysrc,
	// Also apply the following generic actor properties.
	actor: {
			name: 'Gauge Body',
			transform: {
				local: {
					position: { x: 0, y: -1, z: 0 },
					rotation: { x: 0, y: 0, z: 0 }
					}
				}
			}
		});

// spawn a copy of the Pointer as child of the Gauge Body glTF model
this.pointer = MRE.Actor.CreateFromPrefab(this.context, {
		// using the data we loaded earlier
		firstPrefabFrom: gaugePointersrc,
		// Also apply the following generic actor properties.
		actor: {
			name: 'Gauge Pointer',
			// Parent the pointer to the Gauge Body, so its transform is relative to the body
			parentId: this.gaugeBody.id,
			transform: {
				local: {
					position: { x: 0, y: 0, z: 0 },
						rotation: { x: 0, y: 0, z: 0 }
					}
				}
			}
		});

The script is using the 3D models of the Gauge and the Pointer in GLB format.

MRE Gauge

To change the rotation of the model, open World Editor -> Editor Panel, select the small cog/gear to see the settings, and change the X rotation value to 90. In the settings we can change the position and scale of the object, as well as its name. It is of course also possible to rotate the object from the MRE, as we will see later on.

If the model looks dark, we can add a light actor just after the this.pointer actor:

this.light = MRE.Actor.Create(this.context, {
			actor: {
				parentId: this.gaugeBody.id,
				name: 'Light',
				transform: {
					local: {
						position: { x: 0, y: 3.0, z: 0 },
					}
				},
				light: {
					color: { r: 1, g: 1, b: 1 },
					type: 'point',
					intensity: 1,
					range: 60
				},
			}
		});

This code will create a point light as a child of the Gauge.

It is also possible to spawn 3D models that are already part of our AltspaceVR Kits. To do this, we need to remove the loadGltf lines (and the 3D models from the public folder), remove async from private async started() (as await is no longer used) and, instead of MRE.Actor.CreateFromPrefab, use another method: MRE.Actor.CreateFromLibrary.

In this case the method asks for a resourceId; this value can easily be found on the AltspaceVR website:

this.gaugeBody = MRE.Actor.CreateFromLibrary(this.context, {
			resourceId: 'artifact:1846572962480653129',
			actor:{
				name:'Gauge',
				transform: {
					local: {
						position: { x: 0, y: -1, z: 0 },
						rotation:{ x: 0, y: 0, z: 0 },
					}
				}
			}
		})

		this.pointer = MRE.Actor.CreateFromLibrary(this.context, {
			resourceId: 'artifact:1846572951206363969',
			actor:{
				name:'Gauge Pointer',
				parentId: this.gaugeBody.id,
				transform: {
					local: {
						position: { x: 0, y: 0, z: 0 },
						rotation:{ x: 0, y: 0, z: 0 },
					}
				}
			}
		})

MRE from Artifact

MQTT data and animation

As MREs are written in TypeScript and run on top of Node.js, it is possible to add other external libraries and frameworks, such as MQTT.js.

Before using it we need to install the modules into the node_modules folder. There are two ways of doing this: running npm install mqtt in the terminal, or adding the packages to the dependencies section of package.json:

"dependencies": {
    "@microsoft/mixed-reality-extension-sdk": "^0.20.0",
    "@types/dotenv": "^6.1.0",
    "@types/node": "^10.3.1",
    "mqtt": "^4.2.6",
    "dotenv": "^6.2.0"
  }

Then type npm install in the terminal to add the required modules.

Once installed, in src/app.ts we need to import the modules we are going to use:

import * as MRE from '@microsoft/mixed-reality-extension-sdk';
import { IClientOptions, Client, connect, IConnackPacket } from "mqtt";
//MQTT variable
	private BROKER: string;
	private TOPIC: string;
	private client: Client;

	private valueWind: number = null; // value used to move the pointer
private mqttConnect() {
		console.log("ready");

		this.BROKER = "mqtt://URL OF THE MQTT BROKER";
		this.TOPIC = "TOPIC/TO/SUBSCRIBE";
		const opts: IClientOptions = { port: 1883 };
    //username:USER, password: PW can be added in the opts

		this.client = connect(this.BROKER, opts);
		this.client.subscribe({ [this.TOPIC]: { qos: 2 } }, (err, granted) => {
			granted.forEach(({ topic, qos }) => {
				console.log(`subscribed to ${topic} with qos=${qos}`);
			});
		}).on("message", (topic: string, payload: Buffer) => {

      //if response is JSON use
			//const responseJson = JSON.parse(payload.toString());
			
      //if response is value use
      const responseJson = payload.toString();
			console.log(responseJson);

			this.valueWind = parseFloat(responseJson);
			// rotation from 0 to 270° -> 3*(Math.PI)/2 -> 4.712 rad; multiplier: 4.712/60 -> 0.0785
			this.pointer.transform.local.rotation = MRE.Quaternion.FromEulerAngles(0, this.valueWind * ((3 * (Math.PI) / 2) / 60), 0);
			// client.end(); //uncomment to stop after 1 msg
		}).on("connect", (packet: IConnackPacket) => {
			console.log("connected!", JSON.stringify(packet));
		});
	}
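The payload handling in the callback above can be sketched in isolation. MQTT payloads arrive as Buffers and, depending on the broker, may contain either a plain value or a JSON document; a small helper could cover both cases (parseWindPayload and the wind field are illustrative names for this sketch, not part of MQTT.js):

```typescript
// Illustrative helper: extract a numeric reading from an MQTT payload.
// Accepts both a bare value ("42.5") and a JSON document ({"wind": 42.5}).
function parseWindPayload(payload: Buffer): number {
	const text = payload.toString();
	try {
		const parsed = JSON.parse(text);
		// JSON.parse also accepts bare numbers, so "42.5" lands here too
		return typeof parsed === 'number' ? parsed : Number(parsed.wind);
	} catch {
		// not valid JSON: fall back to parsing the raw string
		return parseFloat(text);
	}
}
```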

Finally, call the function this.mqttConnect(); inside private started(), just before its last curly bracket.

In the terminal we need to stop the running MRE using CTRL+C on Windows or Control+C on macOS. Then type npm run build and npm start.

The function now connects to the MQTT broker and, for each message received, changes the rotation of the pointer. A great result, but we can do better, with a smooth animation for the pointer:

MRE.Animation.AnimateTo(this.context, this.pointer, {
destination:
  {
    transform:
    {
      local:
      {
      // rotation from 0 to 270° -> 3*(Math.PI)/2 -> 4.712 rad; multiplier: 4.712/60 -> 0.0785
      rotation: MRE.Quaternion.FromEulerAngles(0, this.valueWind * ((3 * (Math.PI) / 2) / 60), 0)
      }
    }
  },
	duration: 0.5,
	easing: MRE.AnimationEaseCurves.EaseOutSine
});
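The rotation maths in the comment can be checked in isolation: the gauge face spans 270°, i.e. 3π/2 ≈ 4.712 radians, and the wind value is assumed to range from 0 to 60, so each unit corresponds to roughly 0.0785 radians (the names below are illustrative):

```typescript
// Map a wind reading (assumed range 0-60) onto the gauge's 270° sweep.
const GAUGE_SWEEP = (3 * Math.PI) / 2; // 270 degrees in radians, ~4.712
const MAX_WIND = 60;                   // assumed full-scale sensor value

function windToRadians(value: number): number {
	return value * (GAUGE_SWEEP / MAX_WIND); // ~value * 0.0785
}

// windToRadians(0)  -> 0 (pointer at rest)
// windToRadians(60) -> 4.712... (full sweep)
```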

We can control the duration of the animation and the type of easing curve. Once again, from the terminal type npm run build and then npm start to build the new MRE and start the local server.

MRE MQTT final

So far we have run the MRE from a local server, but if we want the MRE to be used by other users, we need to make it public.
Any service that provides access to WebSockets and a Node.js platform will fit our needs.
In this example we are going to use the free tier of Heroku.

Create a Herokuapp

HerokuAPP

HerokuAPP

It is possible to deploy the App using three different systems: the Heroku CLI, linking a GitHub repository, or a Container Registry. Here we will use Deploy using Heroku Git.

HerokuAPP

Link the local repository with the remote App created on Heroku.

Finally, deploy the App using:

The final command will also start the process of installing node_modules remotely.

At the end of the process, on the Heroku website we should be able to see our App at https://NAME OF THE APP.herokuapp.com/. By opening the HTTPS link of the App we will see the content of the index.html file we copied into the public folder.

To use the MRE in AltspaceVR we just need to change the address from the local server ws://127.0.0.1:3901 to wss://NAME OF THE APP.herokuapp.com (note the WSS, Secure WebSocket).
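The switch between the two endpoints can be expressed as a tiny helper (mreEndpoint and the app name are illustrative; only the two URL shapes come from the tutorial):

```typescript
// Illustrative: pick the MRE endpoint for local vs deployed servers.
function mreEndpoint(herokuApp?: string): string {
	return herokuApp
		? `wss://${herokuApp}.herokuapp.com` // Secure WebSocket on Heroku
		: 'ws://127.0.0.1:3901';             // local development server
}
```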

HerokuAPP

Final app.ts script

/*!
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the MIT License.
*/

import * as MRE from '@microsoft/mixed-reality-extension-sdk';
import { IClientOptions, Client, connect, IConnackPacket } from "mqtt";

/**
 * The main class of this app. All the logic goes here.
 */
export default class MQTTGauge {
	private gaugeBody: MRE.Actor = null;
	private pointer: MRE.Actor = null;
	private assets: MRE.AssetContainer;
	//private light: MRE.Actor = null;

	//MQTT variable
	private BROKER: string;
	private TOPIC: string;
	private client: Client;

	private valueWind: number = null; // windDir

	constructor(private context: MRE.Context) {
		this.context.onStarted(() => this.started());
	}

	/**
	 * Once the context is "started", initialize the app.
	 */
	//private async started() {
	private started() {

		// set up somewhere to store loaded assets (meshes, textures, animations, gltfs, etc.)
		this.assets = new MRE.AssetContainer(this.context);

		// Load a glTF model before we use it; "box" refers to the collider to apply
		// The started function needs to have the async
		//const gaugeBodysrc = await this.assets.loadGltf('Gauge_Body.glb', "box");
		//const gaugePointersrc = await this.assets.loadGltf('Pointer.glb', "box");

		/*
		// spawn a copy of the Gauge Body glTF model
		this.gaugeBody = MRE.Actor.CreateFromPrefab(this.context, {
		// using the data we loaded earlier
			firstPrefabFrom: gaugeBodysrc,
			// Also apply the following generic actor properties.
			actor: {
				name: 'Gauge Body',
				// Parent the glTF model to the text actor, so the transform is relative to the text
				transform: {
					local: {
						position: { x: 0, y: -1, z: 0 },
						rotation: { x: 0, y: 0, z: 0 }
					}
				}
			}
		});

		// spawn a copy of the Pointer as child of the Gauge Body glTF model
		this.rotPointer = MRE.Quaternion.FromEulerAngles(0, 0, 0);
		this.pointer = MRE.Actor.CreateFromPrefab(this.context, {
		// using the data we loaded earlier
			firstPrefabFrom: gaugePointersrc,
			// Also apply the following generic actor properties.
			actor: {
				name: 'Gauge Pointer',
				// Parent the glTF model to the text actor, so the transform is relative to the text
				parentId: this.gaugeBody.id,
				transform: {
					local: {
						position: { x: 0, y: 0, z: 0 },
						rotation: this.rotPointer,
					}
				}
			}
		});
	*/
		this.gaugeBody = MRE.Actor.CreateFromLibrary(this.context, {
			resourceId: 'artifact:1846572962480653129',
			actor:{
				name:'Gauge',
				transform: {
					local: {
						position: { x: 0, y: -1, z: 0 },
						rotation:{ x: 0, y: 0, z: 0 },
					}
				}
			}
		})

		this.pointer = MRE.Actor.CreateFromLibrary(this.context, {
			resourceId: 'artifact:1846572951206363969',
			actor:{
				name:'Gauge Pointer',
				parentId: this.gaugeBody.id,
				transform: {
					local: {
						position: { x: 0, y: 0, z: 0 },
						rotation:{ x: 0, y: 0, z: 0 },
					}
				}
			}
		})

		this.mqttConnect();
/*
		this.light = MRE.Actor.Create(this.context, {
			actor: {
				parentId: this.gaugeBody.id,
				name: 'Light',
				transform: {
					local: {
						position: { x: 0, y: 3.0, z: 0 },
					}
				},
				light: {
					color: { r: 1, g: 1, b: 1 },
					type: 'point',
					intensity: 1,
					range: 60
				},

			}
		});
*/

	}

	private mqttConnect() {
		console.log("ready");

		this.BROKER = "mqtt://URL OF THE MQTT BROKER";
		this.TOPIC = "TOPIC/TO/SUBSCRIBE";
		const opts: IClientOptions = { port: 1883 }; //to change with the port of the broker
		//username:USER, password: PW can be added in the opts

		this.client = connect(this.BROKER, opts);
		this.client.subscribe({ [this.TOPIC]: { qos: 2 } }, (err, granted) => {
			granted.forEach(({ topic, qos }) => {
				console.log(`subscribed to ${topic} with qos=${qos}`);
			});
		}).on("message", (topic: string, payload: Buffer) => {

			//if response is JSON use
			//const responseJson = JSON.parse(payload.toString());

			//if response is value use
			const responseJson = payload.toString();
			console.log(responseJson);

			this.valueWind = parseFloat(responseJson);
			// rotation from 0 to 270° -> 3*(Math.PI)/2 -> 4.712 rad; multiplier: 4.712/60 -> 0.0785
  //this.pointer.transform.local.rotation = MRE.Quaternion.FromEulerAngles(0, this.valueWind*((3*(Math.PI)/2)/60),0);

  MRE.Animation.AnimateTo(this.context, this.pointer, {
	destination:
	{
		transform:
		{
			local:
			{
			// rotation from 0 to 270° -> 3*(Math.PI)/2 -> 4.712 rad; multiplier: 4.712/60 -> 0.0785
			rotation: MRE.Quaternion.FromEulerAngles(0, this.valueWind * ((3 * (Math.PI) / 2) / 60), 0)
			}
		}
	},
				duration: 0.5,
				easing: MRE.AnimationEaseCurves.EaseOutSine
			});
			// client.end(); //uncomment to stop after 1 msg
		}).on("connect", (packet: IConnackPacket) => {
			console.log("connected!", JSON.stringify(packet));
		});
	}
}