Augmented Reality Wave Reflection

Recently, as part of our consultation project, we were asked to visualise sound waves reflecting off a wall. The AR had to work without markers, so we used ground plane tracking. This is the outcome. Using the microphone on their mobile devices, students record their own voice; the sound travels to the wall, is reflected back, and the students then hear the echoes of their own voices.
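For anyone curious how the echo itself can be driven, here is a minimal sketch of one way to do it in a Unity script, assuming the wall is already being tracked: record from the device microphone, then play the clip back after the round-trip travel time to the wall. The class, field, and method names (EchoRecorder, wall, StartRecording, StopAndEcho) are illustrative, not the actual project code.

```csharp
using UnityEngine;

// Hypothetical sketch of the echo effect: record the student's voice with the
// device microphone, then play it back after the round-trip travel time to the
// wall (2 * distance / speed of sound).
public class EchoRecorder : MonoBehaviour
{
    public Transform wall;          // the tracked wall the sound "reflects" off (assumed reference)
    public int recordSeconds = 5;   // maximum recording length
    private AudioSource audioSource;

    void Awake()
    {
        audioSource = gameObject.AddComponent<AudioSource>();
    }

    // Call this when the student presses a "record" button.
    public void StartRecording()
    {
        // null = default microphone; 44100 Hz sample rate
        audioSource.clip = Microphone.Start(null, false, recordSeconds, 44100);
    }

    // Call this when the student releases the button.
    public void StopAndEcho()
    {
        Microphone.End(null);

        // Round-trip time of the wave to the wall and back, using ~343 m/s for the speed of sound.
        float distance = Vector3.Distance(transform.position, wall.position);
        float delay = 2f * distance / 343f;

        audioSource.PlayDelayed(delay);
    }
}
```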

Playing a Sound on Marker Detection in XZIMG Augmented Vision

Prerequisite

In order to start this tutorial, you will need:

  1. To have finished the previous tutorial on detecting when a marker is found and lost
  2. A sound file. You can find plenty of sounds to download here

What type of sound format does Unity support?

In my experience it is best to go for .mp3 files, as .wav files sometimes fail to play properly.

Import Sound Into Unity

So you have downloaded the sound; now let's import it into Unity. Drag and drop the sound file into the Assets folder in the Project window.

Using the downloaded sound

In the Hierarchy window, right-click the Main Camera, then under Audio, click Audio Source.

An Audio Source will be created under the Main Camera. It will look something like this

With the Audio Source selected, drag and drop the sound into the Audio Clip slot.

Next, uncheck Play On Awake.
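If you prefer to do the same setup from a script rather than through the Inspector, the following is a rough equivalent: add an AudioSource, assign the imported clip, and disable Play On Awake. The class and field names (MarkerAudioSetup, markerSound) are just an illustration, not part of the tutorial project.

```csharp
using UnityEngine;

// Minimal sketch of the Audio Source setup done in code: add the component,
// assign the clip, and turn off Play On Awake so the sound only plays when triggered.
public class MarkerAudioSetup : MonoBehaviour
{
    public AudioClip markerSound;   // assign the imported sound in the Inspector

    void Awake()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = markerSound;
        source.playOnAwake = false; // same as unchecking Play On Awake
    }
}
```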

Triggering the sound to play when the marker is detected

Select Pivot1 and look at the Inspector window. Under Event Receiver, press the “+” at On Marker Found().

You will get something like this

Drag and drop the Audio Source into the Object slot.

Click the function dropdown, go to AudioSource, and select Play().
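As an optional alternative to wiring AudioSource.Play() directly, you could assign a small relay script to the On Marker Found() slot instead, which gives you somewhere to add extra behaviour later (for example, checking whether the clip is already playing). The names below (MarkerFoundAudio, PlayMarkerSound) are illustrative and are not part of the XZIMG package.

```csharp
using UnityEngine;

// Relay script: assign PlayMarkerSound() in the On Marker Found() event
// instead of calling AudioSource.Play() directly.
public class MarkerFoundAudio : MonoBehaviour
{
    public AudioSource markerAudio;   // drag the Audio Source here in the Inspector

    public void PlayMarkerSound()
    {
        // Only start the clip if it is not already playing.
        if (markerAudio != null && !markerAudio.isPlaying)
        {
            markerAudio.Play();
        }
    }
}
```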

Now let's press Play and test it out.

The marker is detected and the sound now plays.