Now that we have completed our first simple AR project in the previous tutorial, let's try to deploy our XZIMG Augmented Vision project to PC.
Accessing the Build Settings
Go to File > Build Settings.
You will see something like this
Add Current Scene to our Build
Click Add Open Scenes to add our current scene to the build.
Our scene has now been added to our build. It will be listed in Scenes In Build
Building the Application
Press the Build button
A window will appear asking where you want to store the built application. You can store it wherever is convenient; I usually create a folder called Output inside my project folder. Once you have selected your folder, press Select Folder.
The application will start building.
The completed application will be stored in the folder you selected.
Testing the application
When you run the application, this window will appear. I recommend checking the Windowed option so that it is easier to move the AR app around while it's running; otherwise it will run fullscreen. It's just a preference, though.
There it is: our application is playing happily on our PC.
Detecting Marker Found and Marker Lost events is important. We can use them to play a sound when a marker is found, play and pause video, or trigger animations; they are the backbone of many AR interaction experiences. This tutorial will show you how to do it. In the next tutorial we will start playing with audio, video, and other things, so this one is really important.
To follow this tutorial, you will need a few things:
Event handling scripts. Use mine for now; we will learn how to build one in the advanced class. Download it from here.
Download and Extract the Event Handling scripts folder
Download the scripts I provided above and unzip them. You will get a folder like so:
Drag and drop the folder into Unity
Drag and Drop the EventHandling Script folder into Assets in Project window
Replacing the default Image Detector with our new one that has Event Sender
In the Project window, go to Assets > EventHandling Script; you will see xmgAIplusEventSender and Event Receiver.
First, in the Hierarchy, select the imageDetector.
With imageDetector selected, look at the Inspector window, right-click the Xmg Augmented Image script, and click Remove Component.
Drag and drop the xmgAIplusEventSender script into the Inspector window
Your Inspector window will now look like this. Good job!
Configuring the new script
Under Object Pivot Links, change the Size to 1 and press Enter on your keyboard.
Once you press Enter, you will get something like this
Drag and drop Pivot 1 to Scene Pivot
Go to Assets > Resources, drag and drop the marker that we created in the previous tutorial to the Classifier slot
Finally, under Object Pivot Links, change the Object Real Width to 1.
That's it; we are done with the sender part. Now a few more steps for the receiver part.
Attaching the Event Receiver
In the Hierarchy, select Pivot1
With Pivot1 selected, drag and drop the Event Receiver into the Inspector
It should look like this now
Let's test it
Press Play and show our marker to the camera. The AR should work normally, and the Marker Status will show that the marker is detected.
Okay, we are all done. So what can we do with this?
With this ready, we can play a sound when the marker is found, play and pause video, trigger animations, build multiple-marker interactions, and much more. We will look at some of these in the next post.
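As a small taste of what is possible, here is a hypothetical receiver sketch. The class and method names are my own, not part of the downloaded scripts; the idea is that you wire PlayFoundSound() to a marker-found event and StopSound() to a marker-lost event in the Inspector.

```csharp
using UnityEngine;

// Hypothetical example, not part of the downloaded scripts:
// hook PlayFoundSound() to a marker-found event and StopSound()
// to a marker-lost event to get audio feedback on detection.
[RequireComponent(typeof(AudioSource))]
public class MarkerSoundResponder : MonoBehaviour
{
    AudioSource audioSource;

    void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    // Wire this to the marker-found event.
    public void PlayFoundSound()
    {
        if (!audioSource.isPlaying)
            audioSource.Play();
    }

    // Wire this to the marker-lost event.
    public void StopSound()
    {
        audioSource.Stop();
    }
}
```

The same pattern works for video players or animations: expose one public method per reaction and wire it to the matching event.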
Note: this code is no longer needed, as Vuforia now provides its own Unity events handler. If you are using the latest Vuforia, it comes built in; it looks like the picture below and works exactly the same as the one I wrote here.
The post below is kept because some people are still struggling to update their old projects. This is for you guys/gals at university; I hope you can update soon!
Do you ever wish that Vuforia event handling were as simple as using the OnClick() property of a Unity UI Button? Wish no more. Here is the code, and an explanation of how I did it.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;
using Vuforia;

public class EasyEventHandler : MonoBehaviour, ITrackableEventHandler
{
    // Exposed in the Inspector, just like a UI Button's On Click ().
    public UnityEvent onMarkerFound;
    public UnityEvent onMarkerLost;

    protected TrackableBehaviour mTrackableBehaviour;
    protected TrackableBehaviour.Status m_PreviousStatus;
    protected TrackableBehaviour.Status m_NewStatus;

    protected virtual void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    protected virtual void OnDestroy()
    {
        if (mTrackableBehaviour)
            mTrackableBehaviour.UnregisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        m_PreviousStatus = previousStatus;
        m_NewStatus = newStatus;

        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            onMarkerFound.Invoke();
        }
        else if (previousStatus == TrackableBehaviour.Status.TRACKED &&
                 newStatus == TrackableBehaviour.Status.NO_POSE)
        {
            onMarkerLost.Invoke();
        }
        else
        {
            // For the combo of previousStatus=UNKNOWN + newStatus=UNKNOWN|NOT_FOUND:
            // Vuforia is starting up, and tracking has not been found or lost yet.
            // Invoke onMarkerLost so the augmentations start out hidden.
            onMarkerLost.Invoke();
        }
    }
}
How it looks
Explanation
Creating the Unity Events and exposing them in the Inspector
This code uses UnityEngine.Events. First, I create two UnityEvent variables and make them public. This exposes the events in the Inspector, giving you that familiar On Click () list from the Unity UI Button.
public UnityEvent onMarkerFound;
public UnityEvent onMarkerLost;
Invoking the events when the marker is detected and lost
Then, in OnTrackableStateChanged, if the Image Target is detected, I call the UnityEvent's Invoke() method to fire the event. This then works just like On Click ().
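Besides wiring handlers in the Inspector, the exposed UnityEvents can also be subscribed to from code with AddListener(). A minimal sketch, assuming EasyEventHandler sits on the same GameObject (the MarkerLogger name is my own, for illustration only):

```csharp
using UnityEngine;

// Hypothetical example: subscribing to the exposed UnityEvents from
// another script instead of wiring them in the Inspector.
public class MarkerLogger : MonoBehaviour
{
    void Start()
    {
        var handler = GetComponent<EasyEventHandler>();
        handler.onMarkerFound.AddListener(() => Debug.Log("Marker found"));
        handler.onMarkerLost.AddListener(() => Debug.Log("Marker lost"));
    }
}
```

Listeners added in code and listeners wired in the Inspector both fire when Invoke() is called, so you can mix the two approaches freely.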
A note for the pros and the intermediately skilled: since this is a basic tutorial, I will be very detailed to help those who need a step-by-step explanation. It may be a bit boring for those in the know, so skim through the whole tutorial and pick out what you need, yeah?
Prerequisite
To follow this tutorial, you will need:
Unity 2019.1.5 (or above), with the Android deployment module
XZIMG Augmented Vision for Unity. Download from here