Tutorial: Creating an Effect with Scripting

In this tutorial you'll learn how to use scripting in Spark AR Studio to create an interactive world effect.

We'll be recreating the Boombox with Audio sample effect, using scripting instead of the Patch Editor to add interactivity.

If this is your first time using scripting in Spark AR Studio, take a look at the Scripting Basics and Reactive Programming guides.

In this guide you'll learn about:

  1. Setting up a scene - Preparing your scene for scripting.
  2. Animating objects - Animating objects from a script.
  3. Moving objects - Allowing the person using the effect to move an object using touch gestures.
  4. Scaling objects - Allowing the person using the effect to scale an object using touch gestures.
  5. Rotating objects - Allowing the person using the effect to rotate an object using touch gestures.

To follow this tutorial, download the sample content and open the unfinished effect. We've already imported the assets you need to help you get started.


Setting up the scene

Before we can focus on scripting, we need to set up our scene. Objects can't be created within a script, so we need to add them to the scene before we can access them.

Understanding the assets

Within the Assets Panel you'll find two folders:

  • boombox_animated - This folder contains the 3D model of the boombox (boombox_animated), along with an animation (Beats), a couple of materials (dropShadow_mat and boombox_mat) and textures (Texture 0 and Texture 1) applied to the model.
  • Audio - This folder contains the boombox_loop.m4a audio file.

Adding a plane tracker

To start building the effect, add a plane tracker to your scene by:

  1. Clicking Add Object at the bottom of the Scene Panel.
  2. Selecting Plane Tracker in the menu.
  3. Clicking Insert.


After adding the plane tracker a planeTracker0 object will appear within your Scene Panel. The Simulator will prompt you to look around the new grid view that has replaced the face video.

This is a good time to resize the Simulator window by clicking and dragging from the bottom of the window to make it bigger.



Objects that are children of a plane tracker will appear when a flat surface is detected by the camera, like a table or floor. We'll be using the plane tracker to position the boombox in the world.

Adding a null object

Add a null object as a child of the plane tracker by right-clicking on the planeTracker0 object and selecting Add > Null Object.

Null objects act as empty groups and are a great way of grouping objects together, something we'll be doing in the following steps.

We're going to rename the null object to placer by right-clicking on the nullobject0 object and clicking Rename. We're calling it placer because we will use the object to place the boombox in the world.



Your Scene Panel will now look like this:

Adding the boombox

Click and drag the boombox_animated folder into the placer to add it as a child.



It will appear within the Viewport and Simulator. It isn't animated yet, but we'll add the animation in a script later.

We're also going to update the position of the boombox so that it's centered on the plane tracker. To do this select the boombox_animated object in the Scene Panel and change the z-axis Position to 0.3 in the Inspector Panel.

Adding the audio

There are a few steps to adding audio to an effect in Spark AR Studio.

First, create an audio playback controller:

  1. Click Add Asset at the bottom of the Assets Panel.
  2. Select Audio Playback Controller.


Next add the audio file to the playback controller:

  1. Select audioPlaybackController0 in the Assets Panel.
  2. In the Inspector, select boombox_loop.m4a from the Audio dropdown and check the Loop box.


The audio playback controller controls how the audio will be played. We set it to loop because we want the audio clip to play continuously.

Then, create and configure the speaker:

  1. Create a speaker by right-clicking on the placer object in the Scene Panel and selecting Add > Speaker.
  2. Select the speaker0 object in the Scene Panel and in the Inspector set the Audio dropdown to audioPlaybackController0.


The speaker emits the audio in the scene. We add it to the placer to group it with the boombox, the object that appears to be emitting the sound.

Your Scene Panel and Assets Panel will now look like this:

Adding a script

To create a script click Add Asset at the bottom of the Assets Panel and select Script.



A script0 object will then appear in the Assets Panel which you can double-click to open.

The script file will open in the default editor associated with JavaScript files on your computer.

New scripts are pre-populated with code to help you understand some of the fundamentals of scripting. For this tutorial we'll be writing the entire script step-by-step, so we can delete the existing code, leaving the script empty.

For more information on the default code, please see the Scripting Basics guide.

After deleting the code, hit save in your editor. You need to save the script after every change before returning to Spark AR Studio to see the changes take effect.

Now that the scene is set up and we've created our script, we're ready to start scripting.


Animating the boombox base

The first animation we're going to apply is to the base of the boombox. We want to give the effect of the boombox pulsing with the audio that's playing.

Loading the Scene Module

To animate the base of the boombox we first need access to objects within the scene. In scripting this is done with the Scene Module.

We load in the scene module by using this line of code:

const Scene = require('Scene');

The require() method tells the script we're looking for a module and we pass in the name as the argument to specify which one.

We store a reference to the scene module in a Scene variable which we will use to access the methods and properties of the module later.

It's best practice to capitalize module variables to distinguish them from normal variables.

Accessing the base object

You can expand the boombox_animated object to reveal the full object tree within the Scene Panel by clicking on the arrows.



The first 4 objects are meshes. The mesh is where materials are applied to determine how the boombox looks. There's also a skeleton with 3 joints, and we'll manipulate these joints to control the animation.

The base of the boombox is represented by the base_jnt object.

Now that we know the name of the object we want to access in our script, we can write the following line of code beneath the line that loads the Scene Module to reference it:

const base = Scene.root.find('base_jnt');

The root property of the Scene Module gives us access to the root of the Scene Panel tree, and the find() method of the SceneObjectBase it returns allows us to search for an object within the tree.

Again we store the resulting object in a variable for use later.

Your script will now look like this:

const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

Using the Animation Module

To animate the base we also need to load the Animation Module, which gives us access to methods and properties for animating objects:

const Animation = require('Animation');
const Scene = require('Scene');

We load the module next to where we loaded the scene module.

Animations in Spark AR Studio are made up of two components, drivers and samplers. Drivers are used to control the animation and samplers are used to specify the interpolation as well as the beginning and end values.
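The driver and sampler split can be illustrated outside Spark AR with a few lines of plain JavaScript. This is only a sketch of the idea, not Spark AR's API: the driver advances a progress value t from 0 to 1 over the duration, and a quintic ease-in sampler (the standard definition of easeInQuint; Spark AR's internal implementation may differ) maps t to a value between the start and end:

```javascript
// Sketch only, not Spark AR API. A sampler maps animation progress t
// (0 to 1, supplied by a driver) to an output value.
function easeInQuint(from, to) {
  // Quintic ease-in: progress is raised to the 5th power, so the
  // animation starts slowly and accelerates towards the end.
  return function (t) {
    return from + (to - from) * Math.pow(t, 5);
  };
}

const sampler = easeInQuint(0.9, 1); // same values as the base animation
sampler(0); // start of the animation: the base is at 90% scale
sampler(1); // end of the animation: the base is at full scale
```

With mirror set to true, the driver runs the progress from 0 to 1 and back again, which is what produces the pulsing effect.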

We want to create a time driver that uses the same values as the sample project (a duration of 0.4 seconds, looping and mirrored).

To do this we create a set of parameters, apply them to a time driver and then start the driver:

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

With the driver created we now need to create a sampler. We're going to use the same interpolation and values as the sample project so we'll use the easeInQuint() method of the SamplerFactory class and animate the base's scale from 0.9 to 1:

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

The animation can then be created by combining the driver and sampler:

const baseAnimation = Animation.animate(baseDriver,baseSampler);

Applying the animation to the base

The baseAnimation variable now holds a reference to a ScalarSignal whose value changes with the animation. We can bind that signal to the x, y and z scale values of the base to animate it:

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

We start by creating a reference to the transform of the base, which is a TransformSignal, and then bind the baseAnimation signal to its scale properties.

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

If you save your script and return to Spark AR Studio you will now see the boombox animated.




Animating the boombox speakers

We also want to apply an animation to the speakers to give them the appearance of playing the audio.

As with the base, we need to access the speakers within the script. This time we want the speaker_left_jnt and speaker_right_jnt objects from the Scene Panel:

const base = Scene.root.find('base_jnt');
const speakerLeft = Scene.root.find('speaker_left_jnt');
const speakerRight = Scene.root.find('speaker_right_jnt');

This is an opportunity to improve the performance of our script. In accessing the three objects in our scene we are calling Scene.root three times. If we store Scene.root in a variable we only have to call it once, and can use the variable as many times as we need it:

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');

Now we can create a driver, sampler and animation for the speakers, as we did for the base, changing a few of the values to match the sample project:

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

Then we'll apply the speakerAnimation signal to the x, y and z scale values of both speaker objects:

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

If you save your script and return to Spark AR Studio you will now see the speakers animated.



With the boombox animated we can now add interactivity that allows the person using the effect to move, scale and rotate the boombox.


Moving the boombox with pan gestures

To move the boombox we're going to update the location of the plane tracker using the trackPoint() method of the PlaneTracker class whenever a pan gesture is detected. A pan gesture is triggered when the person using the effect drags their finger across the screen.

These gestures are handled by the TouchGestures Module so our first step is to load the module:

const Animation = require('Animation');
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

We then need to access the planeTracker0 object in the scene:

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');

Subscribing to pan gestures

Now we can subscribe to pan gestures at the bottom of the script:

TouchGestures.onPan().subscribe(function(gesture) {
    
});

The onPan() method returns an EventSource that we subscribe to with a callback function, which will run every time a pan gesture is detected.

We also pass the gesture as an argument to the callback function which we will use when moving the plane tracker. The gesture passed when subscribing to the onPan() EventSource is a PanGesture.
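The subscribe pattern itself can be sketched in plain JavaScript. This mock EventSource is purely illustrative (Spark AR's real EventSource is provided by the runtime and has a richer API): it stores callbacks and calls each one when an event fires:

```javascript
// Sketch only: a minimal stand-in for an EventSource, to show the
// subscribe pattern used by TouchGestures.onPan().
function createEventSource() {
  const callbacks = [];
  return {
    // Register a callback to run on every event.
    subscribe: function (callback) {
      callbacks.push(callback);
    },
    // Fire an event, invoking every registered callback with it.
    emit: function (event) {
      callbacks.forEach(function (cb) { cb(event); });
    }
  };
}

const onPan = createEventSource();
const seen = [];
onPan.subscribe(function (gesture) {
  seen.push(gesture.location);
});

// Simulate a pan gesture being detected.
onPan.emit({ location: { x: 10, y: 20 }, state: 'CHANGED' });
```

In the real effect the runtime fires these events for us; we only ever write the subscribe side.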

The trackPoint() method can now be called within the callback function:

TouchGestures.onPan().subscribe(function(gesture) {
    planeTracker.trackPoint(gesture.location, gesture.state);
});

We use the location and state of the pan gesture to update the position of the plane tracker.

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

TouchGestures.onPan().subscribe(function(gesture) {
    planeTracker.trackPoint(gesture.location, gesture.state);
});

Handling capability error messages

If you hit save and return to Spark AR Studio you will see an error message appear in the console.



This is because Spark AR Studio knows we want to use the Touch Gestures capability, but we need to specify which gestures.

To do so:

  1. Click Project > Edit Properties in the menu bar.
  2. Select Capabilities.
  3. Expand the Touch Gestures capability by clicking the arrow next to it.
  4. Check the boxes next to Pan, Pinch and Rotate.
  5. Click Done.

Pinch and rotate will be used in the next steps, so we save time by adding them now.



We can now clear the console and restart the experience to verify we no longer have the error.



If we still had an error then it would appear in the console after we restarted.

Previewing on device

Now we're ready to preview the effect with the new pan functionality. The Simulator has limited support for moving the plane tracker and for advanced touch gestures, so from now on we'll preview the effect on a mobile device.

The easiest way to preview on a device is to use the Send to App option.

You will now be able to move the location of the boombox in the world by using your finger.




Scaling the boombox with pinch gestures

For scaling and rotating the boombox we're going to use the placer object. We can't reuse the base object here because its scale is already bound to the animation; setting it again would override the existing animation.

We access the placer object in the script:

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');
const placer = sceneRoot.find('placer');

Subscribing with snapshots

Store a reference to the transform at the bottom of the script and subscribe to pinch gestures:

const placerTransform = placer.transform;

TouchGestures.onPinch().subscribeWithSnapshot( {

}, function (gesture, snapshot) {

});

This time we are using the subscribeWithSnapshot() method as there are some values we want to pass through to the callback function when a pinch gesture is detected.

The values we want to pass through are the current x, y and z scale values of the placerTransform:

TouchGestures.onPinch().subscribeWithSnapshot( {
    'lastScaleX' : placerTransform.scaleX,
    'lastScaleY' : placerTransform.scaleY,
    'lastScaleZ' : placerTransform.scaleZ 
}, function (gesture, snapshot) {

});

We use these values combined with the scale of the PinchGesture being passed through to set the new scale of the placer:

TouchGestures.onPinch().subscribeWithSnapshot( {
    'lastScaleX' : placerTransform.scaleX,
    'lastScaleY' : placerTransform.scaleY,
    'lastScaleZ' : placerTransform.scaleZ 
}, function (gesture, snapshot) {
    placerTransform.scaleX = gesture.scale.mul(snapshot.lastScaleX);
    placerTransform.scaleY = gesture.scale.mul(snapshot.lastScaleY);
    placerTransform.scaleZ = gesture.scale.mul(snapshot.lastScaleZ);
});
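To see why the snapshot is needed, here's the same math in plain JavaScript (illustration only, outside Spark AR). Each pinch gesture's scale starts from 1, so multiplying by the scale captured when the gesture began makes successive pinches compound instead of resetting:

```javascript
// Sketch only: the cumulative-scale math behind the snapshot pattern.
function applyPinch(lastScale, gestureScale) {
  // Mirrors gesture.scale.mul(snapshot.lastScaleX) in the effect script.
  return gestureScale * lastScale;
}

let scale = 1;
scale = applyPinch(scale, 1.5); // first pinch scales the placer to 1.5
scale = applyPinch(scale, 2);   // second pinch compounds to 3
```

Without the snapshot, every pinch would scale relative to the original size rather than the current one, and the boombox would jump back each time a new gesture began.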

After saving, your script will look like this:

const Animation = require('Animation');
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');
const placer = sceneRoot.find('placer');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

TouchGestures.onPan().subscribe(function(gesture) {
    planeTracker.trackPoint(gesture.location, gesture.state);
});

const placerTransform = placer.transform;

TouchGestures.onPinch().subscribeWithSnapshot( {
    'lastScaleX' : placerTransform.scaleX,
    'lastScaleY' : placerTransform.scaleY,
    'lastScaleZ' : placerTransform.scaleZ 
}, function (gesture, snapshot) {
    placerTransform.scaleX = gesture.scale.mul(snapshot.lastScaleX);
    placerTransform.scaleY = gesture.scale.mul(snapshot.lastScaleY);
    placerTransform.scaleZ = gesture.scale.mul(snapshot.lastScaleZ);
});

You will now be able to scale the boombox by using two fingers and pinching on your device.


Rotating the boombox with rotation gestures

Finally, we'll allow the boombox to be rotated via the placer object by subscribing to rotation gestures at the bottom of the script:

TouchGestures.onRotate().subscribeWithSnapshot( {
    'lastRotationY' : placerTransform.rotationY,
}, function (gesture, snapshot) {
    const correctRotation = gesture.rotation.mul(-1);
    placerTransform.rotationY = correctRotation.add(snapshot.lastRotationY);
});

We use subscribeWithSnapshot() to pass the y rotation value of the placerTransform (the axis we'll be rotating around) to the callback function.

In the callback function we multiply the RotateGesture rotation by -1 before applying it to the placerTransform. We do this to make sure the rotation happens in the expected direction.
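The rotation math can be sketched the same way (plain JavaScript, illustration only): invert the gesture rotation, then add the rotation captured in the snapshot so successive gestures compound:

```javascript
// Sketch only: mirrors gesture.rotation.mul(-1).add(snapshot.lastRotationY).
function applyRotation(lastRotationY, gestureRotation) {
  return -gestureRotation + lastRotationY;
}

let rotationY = 0;
rotationY = applyRotation(rotationY, Math.PI / 2); // a quarter-turn gesture
// rotationY is now -Math.PI / 2: the inversion makes the boombox turn
// in the direction the fingers moved
```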

After saving, your finished script will look like this:

const Animation = require('Animation');
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');
const placer = sceneRoot.find('placer');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

TouchGestures.onPan().subscribe(function(gesture) {
    planeTracker.trackPoint(gesture.location, gesture.state);
});

const placerTransform = placer.transform;

TouchGestures.onPinch().subscribeWithSnapshot( {
    'lastScaleX' : placerTransform.scaleX,
    'lastScaleY' : placerTransform.scaleY,
    'lastScaleZ' : placerTransform.scaleZ 
}, function (gesture, snapshot) {
    placerTransform.scaleX = gesture.scale.mul(snapshot.lastScaleX);
    placerTransform.scaleY = gesture.scale.mul(snapshot.lastScaleY);
    placerTransform.scaleZ = gesture.scale.mul(snapshot.lastScaleZ);
});

TouchGestures.onRotate().subscribeWithSnapshot( {
    'lastRotationY' : placerTransform.rotationY,
}, function (gesture, snapshot) {
    const correctRotation = gesture.rotation.mul(-1);
    placerTransform.rotationY = correctRotation.add(snapshot.lastRotationY);
});

You will now be able to rotate the boombox by using two fingers and rotating them on your device.


Next Steps

Now that you've completed an effect using scripting, you can start thinking about customizing and publishing it. Make it your own by swapping out the 3D model, changing the sound or even using more modules from the scripting reference for further interactivity.