Deploying our AR to PC – XZIMG Augmented Vision

Now that we have completed our first simple AR project in the previous tutorial, let's try to deploy our XZIMG Augmented Vision project to PC.

Accessing the Build Settings

Go to File, select Build Settings

You will see something like this

Add Current Scene to our Build

Click Add Open Scenes to add our current scene to our build

Our scene has now been added to our build. It will be listed in Scenes In Build

Building the Application

Press the Build button

A window will appear asking where you want to store the built application. You can store it wherever is convenient; I usually create a folder called Output inside my project folder. Once you have selected your folder, press Select Folder.

The Application will start building

The completed application will be stored in the folder you selected

Testing the application

When you run the application, this window will appear. I recommend checking the Windowed option so that it is easier to move the AR app around while it's running; otherwise it will run fullscreen. It's just a preference though.

There it is, our application is running happily on our PC

Playing Sound on Marker Detected in XZIMG Augmented Vision

Prerequisite

In order to start this tutorial, you will need:

  1. You need to have finished the previous tutorial on detecting when a marker is detected and lost
  2. You need a sound file. You can find lots of them to download here

What type of sound format does Unity support?

It is best to go for .mp3 sounds, as in my experience .wav files sometimes have a tendency not to play properly.

Import Sound Into Unity

So you have downloaded the sound; now let's import it into Unity. Drag and drop the sound into the Assets folder in the Project window.

Using the downloaded sound

In the Hierarchy window, right-click the Main Camera, go to Audio, and click Audio Source

An Audio Source will be created under the Main Camera. It will look something like this

Select the Audio Source and, with it selected, drag and drop the sound into the Audio Clip slot.

Next, uncheck Play On Awake.
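Everything so far is Inspector work, but if you prefer setting the Audio Source up from code, a minimal sketch would look like the following. The class and field names here are my own, hypothetical ones, not part of XZIMG.

using UnityEngine;

// Hypothetical helper: configures the AudioSource on the same GameObject
// so the clip is assigned and nothing auto-plays when the scene starts.
[RequireComponent(typeof(AudioSource))]
public class MarkerSoundSetup : MonoBehaviour
{
    public AudioClip detectionClip;   // assign your imported sound here

    void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.clip = detectionClip;
        source.playOnAwake = false;   // same effect as unchecking Play On Awake
    }
}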

Triggering the sound to play when the marker is detected

Select Pivot1 and look at the Inspector window; under Event Receiver, press the “+” at On Marker Found()

You will get something like this

Drag and drop the Audio Source into the Object slot

Click the Function dropdown, go to AudioSource, and select Play()
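By the way, this dropdown simply lists the public methods of whatever object you dragged into the Object slot, so instead of calling AudioSource.Play() directly you could point the event at a small wrapper method of your own. That is handy if you want extra logic, for example not restarting the clip when it is already playing. A hypothetical sketch:

using UnityEngine;

// Hypothetical wrapper: expose one public method for the On Marker Found()
// event to call, so extra logic can be added later without rewiring the event.
public class MarkerSoundPlayer : MonoBehaviour
{
    public AudioSource markerAudio;   // drag the Audio Source here

    // Pick this method in the event's function dropdown
    public void PlayDetectionSound()
    {
        if (markerAudio != null && !markerAudio.isPlaying)
            markerAudio.Play();
    }
}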

Now let's press Play and test it out

The marker is detected and the sound is now playing

Detecting when Marker Is Detected and Lost in XZIMG Augmented Vision

Detecting Marker Detected and Lost events is important. We can use them to play a sound when the marker is found, play and pause video, or trigger animations; they are the backbone of many AR interaction experiences. This tutorial will show you how to do it. In the next tutorial we will start playing with audio, video and other things, so this is really important.

To follow this tutorial you will need a few things:

  1. You have finished the previous tutorial
  2. Event Handling scripts. Use mine for now; we will learn how to build one in the advanced class. Download them from here.

Download and Extract the Event Handling scripts folder

Download the scripts I provided above and unzip them. You will get a folder like so:

Drag and drop the folder into Unity

Drag and drop the EventHandling Script folder into Assets in the Project window

Replacing the default Image Detector script with our new one that has an Event Sender

In the Project window, go to Assets > EventHandling Script; you will see xmgAIplusEventSender and Event Receiver

Firstly, in the Hierarchy, select the imageDetector

With imageDetector selected, look at the Inspector window, right-click the Xmg Augmented Image script, and click Remove Component

Drag and drop the xmgAIplusEventSender script into the Inspector window

Your inspector window will now look like this. Good job!

Configuring the new script

Under Object Pivot Links, change the Size to 1 and press Enter on your keyboard.

Once you press Enter, you will get something like this

Drag and drop Pivot 1 to Scene Pivot

Go to Assets > Resources, drag and drop the marker that we created in the previous tutorial to the Classifier slot

Finally, under Object Pivot Links, change the Object Real Width to 1.

That’s it, we are done with the sender part; now just a few more steps for the receiver part.

Attaching the Event Receiver

In the Hierarchy, select Pivot1

With Pivot1 selected, drag and drop the Event Receiver into the Inspector

It should look like this now
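I won't go through the provided Event Receiver script line by line here (that is for the advanced class), but conceptually a receiver is just a component with public methods the sender can call whenever the marker state changes. A stripped-down, hypothetical version might look like this:

using UnityEngine;

// Hypothetical, stripped-down receiver: public methods the sender can call
// when the marker is found or lost, updating a simple status string.
public class SimpleMarkerReceiver : MonoBehaviour
{
    public string markerStatus = "No marker";

    public void OnMarkerFound()
    {
        markerStatus = "Marker is detected";
        Debug.Log(markerStatus);
    }

    public void OnMarkerLost()
    {
        markerStatus = "Marker is lost";
        Debug.Log(markerStatus);
    }
}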

Let's test it

Press Play and show our marker to the camera. Our AR should work normally, and the Marker Status will show that the marker is detected.

Okay, we are all done. So what can we do with this?

With this ready, we can use it to play a sound when the marker is found, play and pause video, trigger animations, do multiple-marker interactions and much more. We will look at some of these in the next post.

Simpler Vuforia Event Handling using Unity Events

Edit 2021 Update:

Note: this code is no longer needed, as Vuforia now provides its own Unity Events handler. So if you are using the latest Vuforia, it comes built in. It looks like the picture below and works exactly the same as the one I wrote here.

The post below is kept because some people are still struggling to update their old projects, so this is for you guys/gals in university; hope you can update soon!


Do you ever wish that Vuforia event handling was as simple as using the OnClick() property on Unity UI's Button? Wish no more, bros and sis. Here's the code, and an explanation of how I did it.

The Code

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;
using Vuforia;

public class EasyEventHandler : MonoBehaviour, ITrackableEventHandler
{
    // Exposed in the Inspector, just like On Click () on a UI Button
    public UnityEvent onMarkerFound;
    public UnityEvent onMarkerLost;

    protected TrackableBehaviour mTrackableBehaviour;
    protected TrackableBehaviour.Status m_PreviousStatus;
    protected TrackableBehaviour.Status m_NewStatus;

    protected virtual void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    protected virtual void OnDestroy()
    {
        if (mTrackableBehaviour)
            mTrackableBehaviour.UnregisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        m_PreviousStatus = previousStatus;
        m_NewStatus = newStatus;

        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            onMarkerFound.Invoke();
        }
        else if (previousStatus == TrackableBehaviour.Status.TRACKED &&
                 newStatus == TrackableBehaviour.Status.NO_POSE)
        {
            onMarkerLost.Invoke();
        }
        else
        {
            // For the combo of previousStatus = UNKNOWN + newStatus = UNKNOWN | NOT_FOUND:
            // Vuforia is starting up, but tracking has not been found or lost yet,
            // so we invoke onMarkerLost to hide the augmentations.
            onMarkerLost.Invoke();
        }
    }
}

What it looks like

Explanation

Creating the Unity Events and Exposing Them in the Inspector

This code uses UnityEngine.Events. Firstly, I create two UnityEvent variables and make them public. Doing this exposes the events, giving you that familiar On Click () thingy from the Unity UI Button.

    public UnityEvent onMarkerFound;
    public UnityEvent onMarkerLost;

Invoking the Events when marker is detected and lost

Then, in OnTrackableStateChanged, if the Image Target is detected, I use the UnityEvent's Invoke() function to fire the event, which then works just like On Click ().
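And because onMarkerFound and onMarkerLost are ordinary UnityEvents, you are not limited to the Inspector; another script can also subscribe from code. A small, hypothetical example:

using UnityEngine;

// Hypothetical listener: subscribes to EasyEventHandler's UnityEvents from
// code instead of wiring them up in the Inspector.
public class MarkerLogger : MonoBehaviour
{
    public EasyEventHandler handler;   // drag the Image Target's handler here

    void OnEnable()
    {
        if (handler == null) return;
        handler.onMarkerFound.AddListener(OnFound);
        handler.onMarkerLost.AddListener(OnLost);
    }

    void OnDisable()
    {
        if (handler == null) return;
        handler.onMarkerFound.RemoveListener(OnFound);
        handler.onMarkerLost.RemoveListener(OnLost);
    }

    void OnFound() { Debug.Log("Marker found"); }
    void OnLost()  { Debug.Log("Marker lost"); }
}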

That’s all folks…… enjoy!

Creating Basic AR using XZIMG Augmented Vision

A note for the pros and the intermediately skilled: since this is a basic tutorial, I will be very detailed to help those who need a step-by-step explanation. It may be a bit boring for those in the know, so skim through the whole tutorial and pick out what you need, yeah.

Prerequisite

In order to follow this tutorial, you will need:

  1. Unity 2019.1.5 (and above), with Android deployment module
  2. XZIMG Augmented Vision for Unity. Download from here
  3. A marker image. Use this one here
  4. A 3D model to use. You can use your own or use this
  5. A webcam or laptop with camera

Download and Unzip the XZIMG Augmented Vision SDK

Go to the download link provided in the prerequisite, log in, and download Augmented Vision 2.0.2 Free Version


The next thing you need to do is to extract or unzip the XZIMG Augmented Vision SDK. At the end of this step you should have a folder like so:


Open the sample scene

Inside the folder, go to Assets > Scenes and double-click the sample-AugmentedVision Unity scene.


Unity will start and load the project. Your screen should look something like below.

Creating your own marker

In the Project window, go to Assets > Resources. Then, drag and drop the marker image into the Project window.

Right-click the marker image, go to Create, and click XZIMG Natural Image Classifier

It will start generating the marker. You can see the progress from the notification bar.

Once done, the marker will appear in your Project window.

Marker creation process done.

Using your own marker

Select the imageDetector prefab

In the Inspector window, drag and drop the newly generated marker into the Classifier slot of the Xmg Augmented Image script.

Test Your Marker

Press the play button to test

Your marker should now work!

Changing the model

Drag and drop the model folder into Assets

Go to Assets > Character_Cactus > Mesh

Drag and drop the mesh under Pivot 1

Press play to preview the model

It is upside down, so the next thing we need to do is rotate the model and retest.
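You can simply type new rotation values into the model's Transform in the Inspector, but if you prefer fixing the orientation from code, a tiny hypothetical script works too (the exact angles depend on how your model was exported):

using UnityEngine;

// Hypothetical one-off fix: flip the model when the scene starts.
// Adjust the angles to suit how your model was exported.
public class FlipModel : MonoBehaviour
{
    void Start()
    {
        transform.localEulerAngles = new Vector3(180f, 0f, 0f);
    }
}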

Delete the cube, and we are done.

Final Result