Tutorial: Creating an Effect with Scripting

In this tutorial you'll learn how to use scripting in Spark AR Studio to create an interactive world effect.

We'll be recreating the Boombox with Audio sample effect, using scripting instead of the Patch Editor to add interactivity.

This effect shows how you can place an animated 3D model in the real world and manipulate it with touch gestures. It also includes audio that plays on a loop.



If this is your first time using scripting in Spark AR Studio, take a look at the Scripting Basics and Reactive Programming guides.

In this guide you'll learn about:

  1. Setting up a scene - Preparing your scene for scripting.
  2. Animating objects - Animating objects from a script.
  3. Moving objects - Allowing the person using the effect to move an object using touch gestures.
  4. Scaling objects - Allowing the person using the effect to scale an object using touch gestures.
  5. Rotating objects - Allowing the person using the effect to rotate an object using touch gestures.

Downloading the project

To follow this tutorial:

  1. Download the sample content.
  2. Open the unfinished effect in Spark AR Studio v67 and above.

We've already imported the assets you need to help you get started.


Setting up the scene

Before we can focus on scripting, we need to set up our scene. Objects can't be created within a script, so we need to add them to the scene before we can access them.

Understanding the assets

Within the Assets Panel you'll find two folders:

  • boombox_animated - This folder contains the 3D model of the boombox (boombox_animated), along with an animation (Beats), a couple of materials (dropShadow_mat and boombox_mat) and textures (Texture 0 and Texture 1) applied to the model.
  • Audio - This folder contains the boombox_loop.m4a audio file.

Adding a plane tracker

We'll start building the effect by adding a plane tracker.

Objects that are children of a plane tracker will appear when a flat surface is detected by the camera, like a table or floor. We'll be using the plane tracker to position the boombox in the world.

To add a plane tracker to your scene:

  1. Click Add Object at the bottom of the Scene Panel.
  2. Select Plane Tracker in the menu.
  3. Click Insert.


After adding the plane tracker, a planeTracker0 object will appear within the Scene Panel and the Simulator will replace the face video with a grid view, prompting you to look around.

Your scene will now look like this:



You can resize the Simulator window by clicking and dragging from the bottom of the window to make it bigger.

Adding a null object

Null objects act as empty groups and are a great way of grouping objects together, something we'll be doing in the following steps.

To add a null object as a child of the plane tracker:

  1. Right-click on the planeTracker0 object.
  2. Select Add > Null Object.

After adding the null object a nullObject0 object will appear within the Scene Panel.

We're going to rename the null object to something more descriptive as we'll be referencing it in our script later. We'll call it placer because we're going to use it to place the boombox within the world.

To rename the object:

  1. Right-click on the nullObject0 object.
  2. Click Rename.
  3. Change the name to placer.

Your Scene Panel will now look like this:

Adding the boombox

Click and drag the boombox_animated folder into the placer to add it as a child.



The boombox will appear within the Viewport and Simulator. It isn't animated yet, but we'll cover that when scripting later.

We're also going to update the position of the boombox so that it's centered on the plane tracker.

To do this:

  1. Select the boombox_animated object in the Scene Panel.
  2. Change the z-axis Position to 0.3 in the Inspector Panel.

Adding the audio

There are a few steps to adding audio to an effect in Spark AR Studio. An audio playback controller needs to be created to control how the audio is played, and a speaker needs to be added to the scene to emit it.

  1. Create an audio playback controller:
    1. Click Add Asset at the bottom of the Assets Panel.
    2. Select Audio Playback Controller.
  2. Add the audio file to the playback controller:
    1. Select the audioPlaybackController0 in the Assets Panel.
    2. Select boombox_loop.m4a from the Audio dropdown and check the Loop box in the Inspector.
  3. Create and configure the speaker:
    1. Right-click on the placer object in the Scene Panel and select Add > Speaker.
    2. Select the speaker0 object in the Scene Panel.
    3. Select audioPlaybackController0 from the Audio dropdown in the Inspector.

Your Scene Panel and Assets Panel will now look like this:

Adding a script

To create a script:

  1. Click Add Asset at the bottom of the Assets Panel.
  2. Select Script.


A script0 object will then appear in the Assets Panel.

Opening the script

To open the script double-click the script0 file in the Assets Panel.

The script file will open in the default editor associated with JavaScript files on your computer.

Deleting the existing code

New scripts are pre-populated with code to help you understand some of the fundamentals of scripting (more information on which can be found in the Scripting Basics guide).

As this tutorial will cover all the aspects we need to know about scripting for this project:

  1. Delete all the existing code so the script is empty.
  2. Save your script.

When making changes, you need to save your script before returning to Spark AR Studio to see the changes take effect.

Now that the scene is set up and we've created our script, we're ready to start scripting.


Animating the boombox base

The first animation we're going to apply is to the base of the boombox. We want to give the effect of the boombox pulsing with the audio that's playing.

Loading the Scene Module

We need to load in the scene module so that we can access the base of the boombox within the scene.

The following code loads the scene module and stores it in a variable so it can be used later to access objects within the scene.

Add this code to your script:

const Scene = require('Scene');

The require() method tells the script we're looking for a module; we pass in the name of the module as the argument to specify the one we want to load.

The Scene variable now contains a reference to the Scene Module that can be used to access the module's properties, methods, classes and enums.

It's best practice to capitalize module variables to distinguish them from normal variables.

Accessing the base object

Before we can access the base object within our script we need to know what it's called in the Scene Panel.

Expand the boombox_animated object by clicking on the arrows to reveal the full object tree within the Scene Panel.



The base of the boombox is represented by the base_jnt object. The base object and the speakers beneath it are simply transforms within the 3D model.

Now that we know the object's name, we can use it in our script.

The following code finds the base_jnt object in the scene and stores it in a variable, which we'll use later to animate it.

Add this code to your script:

const base = Scene.root.find('base_jnt');

The root property of the scene module gives us access to the root of the Scene Panel tree and the find() method of the SceneObjectBase (returned by the root property) allows us to search for an object within the tree.

Your script will now look like this:

const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

Loading the Animation Module

We need to load in the animation module so that we can animate the base and speakers of the boombox within the scene.

The module is loaded in the same way as the scene module, replacing the word 'Scene' with 'Animation'.

Add the code for loading the animation module on the line above the scene module:

const Animation = require('Animation');
const Scene = require('Scene'); // Existing code

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

Creating the base animation

Creating an animation in Spark AR Studio consists of 3 steps:

  1. Creating a driver.
  2. Creating a sampler.
  3. Combining the driver and sampler to create an animation.

Creating a driver


There are two types of drivers in Spark AR Studio:

  • Time driver - Allows you to specify a duration in milliseconds for the animation along with optional parameters for looping and mirroring.
  • Value driver - Allows you to specify a ScalarSignal whose values will control the animation, clamped to a given minimum and maximum value.

For the animations in this project we'll be using time drivers.

The following code creates a time driver using a set of parameters we create beforehand:

Add this code to your script:

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);

The baseDriverParameters object specifies:

  • durationMilliseconds - The animation will last 0.4 seconds.
  • loopCount - The animation will loop indefinitely.
  • mirror - The animation will have returned to its starting value by the time it finishes a loop.

The parameters are then used to create a time driver using the timeDriver() method of the Animation Module.
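The driver itself lives inside Spark AR, but its behaviour is easy to picture. The plain-JavaScript sketch below (illustrative only, not the Spark AR API) shows the normalized progress value a looping, mirrored time driver produces over time:

```javascript
// Illustrative sketch only, NOT the Spark AR API: the progress value of a
// looping, mirrored time driver rises from 0 to 1 over the duration, then
// falls back to 0, repeating indefinitely (a triangle wave).
function mirroredProgress(elapsedMs, durationMs) {
  const phase = (elapsedMs / durationMs) % 2; // one up-down cycle spans two durations
  return phase <= 1 ? phase : 2 - phase;      // mirror the second half
}
```

With durationMilliseconds set to 400, the progress peaks at 400ms and is back to 0 at 800ms, which is what gives the pulse its back-and-forth motion.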

The time driver also needs to be instructed to start.

Add the code for starting the time driver on the line below the one creating it:

const baseDriver = Animation.timeDriver(baseDriverParameters); // Existing code
baseDriver.start();

Starting the driver at this point in the script won't have an effect until we create the animation later; we're just keeping the driver code together.

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

Creating a sampler


The sampler allows you to specify the beginning and end values of the animation as well as the easing function applied to it, altering the rate of change.

The following code creates a sampler.

Add this code to your script:

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

The samplers property of the animation module gives us access to the SamplerFactory class, which we use to set the easing function via the easeInQuint() method. We pass in 0.9 and 1 to the method to specify the start and end values of the animation.

An ease in function means the animation will start slow but end faster.
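If you're curious what the sampler computes, a quintic ease-in is simply t^5 remapped onto the start and end values. The following plain-JavaScript sketch is illustrative only (not the Spark AR API, and the exact curve Spark AR uses may differ):

```javascript
// Illustrative sketch only, NOT the Spark AR API: a quintic ease-in curve
// (t^5) remapped from the 0-1 progress range onto [start, end].
function easeInQuintSample(start, end, t) {
  return start + (end - start) * Math.pow(t, 5);
}
```

At half the animation's duration (t = 0.5) the output is only 0.903125, so most of the change happens near the end — that's what "start slow, end fast" means in practice.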

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

Creating the animation


The animation is created by combining the driver and sampler.

Add this code to your script:

const baseAnimation = Animation.animate(baseDriver,baseSampler);

The animate() method of the animation module returns a ScalarSignal that is changing between 0.9 and 1 every 0.4 seconds according to the easing function used.
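Conceptually, animate() just composes the two pieces: the driver supplies a 0-1 progress value and the sampler maps it onto the output range. A plain-JavaScript sketch (illustrative only; the function names here are hypothetical, not the Spark AR API):

```javascript
// Illustrative sketch only: composing a driver (progress over time) with a
// sampler (progress -> value) yields the animated value, the way
// Animation.animate() combines them into a ScalarSignal.
function animateSketch(progressFn, samplerFn) {
  return function (elapsedMs) {
    return samplerFn(progressFn(elapsedMs));
  };
}

// A 400ms non-looping driver composed with an ease-in sampler over [0.9, 1]:
const animatedValue = animateSketch(
  function (ms) { return Math.min(ms / 400, 1); },    // driver: progress
  function (t) { return 0.9 + 0.1 * Math.pow(t, 5); } // sampler: ease-in quint
);
```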

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

Applying the animation to the base

With the animation created we can now apply it to the base of the boombox. We'll be using it to change the scale.

The base of the boombox simply represents the 3D model as a whole. Applying the animation to the base will mean we are animating the entire object.

Getting the base transform


To access the scale of the base we first need to access its transform. The transform represents the object's transformation (position, scale and rotation) in its local coordinate system.

Add this code to your script:

const baseTransform = base.transform;

The transform property of the SceneObjectBase class (which base is an instance of) returns a TransformSignal.

Binding the animation


With access to the transform of the base we can now bind the animation to its scale properties, animating the model.

Add this code to your script:

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

The scaleX/Y/Z properties of the TransformSignal allow us to bind the baseAnimation ScalarSignal output by the animation to the scale of the base object.

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const base = Scene.root.find('base_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

Save your script and return to Spark AR Studio.

The boombox model will now be animated in the scene.




Animating the boombox speakers

The second animation we're creating will be applied to the speakers of the boombox to give them the appearance of playing the audio.

Accessing the speakers

As with the base, we need to access the speakers within the script. This time we want the speaker_left_jnt and speaker_right_jnt objects from the Scene Panel.

Add the code for accessing the speaker objects on the line beneath the base object:

const base = Scene.root.find('base_jnt'); // Existing code
const speakerLeft = Scene.root.find('speaker_left_jnt');
const speakerRight = Scene.root.find('speaker_right_jnt');

This is an opportunity to improve the performance of the script. In accessing the three objects in our scene we are calling Scene.root three times. If we store Scene.root in a variable we only have to call it once, and can use the variable as many times as we need it.

Add the following code to store the root in a variable and update the lines accessing objects:

const sceneRoot = Scene.root;

// Existing code updated
const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');

The sceneRoot variable has now replaced the individual Scene.root property calls.

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

Creating the speaker animation

Now we can create a driver, sampler and animation for the speakers like we did for the base.

For the driver we'll be using parameters similar to those of the base driver, but with a duration of 0.2 seconds.

Add this code to your script:

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

For the sampler we'll be using a different easing function and values.

Add this code to your script:

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

The values are smaller than the ones we used for the base because the speakers themselves are smaller objects. An ease out function means the animation will start fast but end slower.
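For reference, a common formulation of the ease-out elastic curve looks like this in plain JavaScript. This is an illustrative sketch only; the constants Spark AR uses internally may differ:

```javascript
// Illustrative sketch only, NOT the Spark AR API: a typical easeOutElastic
// curve, which overshoots its target and springs back, remapped onto
// [start, end].
function easeOutElasticSample(start, end, t) {
  const c4 = (2 * Math.PI) / 3;
  const eased = t === 0 ? 0
    : t === 1 ? 1
    : Math.pow(2, -10 * t) * Math.sin((t * 10 - 0.75) * c4) + 1;
  return start + (end - start) * eased;
}
```

The overshoot is what gives the speakers their springy, bouncing look as they pulse between 0.7 and 0.85.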

With the driver and sampler created we can now create the animation.

Add this code to your script:

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

Applying the animation to the speakers

We can now apply the new animation to both of the speaker objects; the animation will be used to change their scale.

Add this code to your script:

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

The transforms of the speakers are accessed before the scale properties are used to bind the speakerAnimation signal to them. This means both speakers will animate in the same way.

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

Save your script and return to Spark AR Studio.

The speakers will now be animated in the scene.




Moving the boombox with pan gestures

With the boombox animated we can now add interactivity that allows the person using the effect to position, scale and rotate it.

Loading the TouchGestures Module

We need to load in the TouchGestures module to be able to detect touch events.

Add the code for loading the TouchGestures module on the line below the scene module:

const Animation = require('Animation'); // Existing code
const Scene = require('Scene'); // Existing code
const TouchGestures = require('TouchGestures');

Accessing the plane tracker

To move the boombox we're going to update the location of the plane tracker on which the boombox sits, so we first need to access it in our script.

Add the code for finding the plane tracker beneath the lines accessing the speakers:

const base = sceneRoot.find('base_jnt'); // Existing code
const speakerLeft = sceneRoot.find('speaker_left_jnt'); // Existing code
const speakerRight = sceneRoot.find('speaker_right_jnt'); // Existing code
const planeTracker = sceneRoot.find('planeTracker0');

Subscribing to pan gestures

We want the plane tracker to move when the person using the effect drags their finger across the screen. This is referred to as a pan gesture in Spark AR Studio.

The following code subscribes to pan gestures, meaning every pan gesture detected will trigger the callback function and execute the code inside it.

Add this code to your script:

TouchGestures.onPan().subscribe(function(gesture) {
    
});

The onPan() method returns an EventSource that we then subscribe to with a callback function that will run every time a pan gesture is detected.

We also pass the gesture as an argument to the callback function which we will use when moving the plane tracker. The gesture passed when subscribing to the onPan() EventSource is a PanGesture.
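The EventSource pattern itself is straightforward. The minimal plain-JavaScript sketch below (illustrative only, not the Spark AR implementation) shows how subscribing registers a callback that runs once per emitted event, just as our callback runs once per detected pan gesture:

```javascript
// Illustrative sketch only: a minimal event source. subscribe() registers a
// callback; every emitted event invokes each registered callback with the
// event data, the way onPan() invokes ours with each PanGesture.
function createEventSource() {
  const subscribers = [];
  return {
    subscribe: function (callback) { subscribers.push(callback); },
    emit: function (event) {
      subscribers.forEach(function (callback) { callback(event); });
    }
  };
}
```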

Moving the plane tracker

Inside the callback function we can now add the code to move the plane tracker according to the location and state of the detected pan gesture.

Add the code for moving the plane tracker within the callback function:

TouchGestures.onPan().subscribe(function(gesture) { // Existing code
    planeTracker.trackPoint(gesture.location, gesture.state);
}); // Existing code

The trackPoint() method of the PlaneTracker class triggers new plane detection, updating the position of the plane tracker.

We pass the location and state of the pan gesture to the method, so the plane tracker moves to wherever the gesture is detected.

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

TouchGestures.onPan().subscribe(function(gesture) {
    planeTracker.trackPoint(gesture.location, gesture.state);
});

Save your script and return to Spark AR Studio.

Handling capability error messages

Returning to Spark AR Studio you will see an error message appear in the console:



This is because we need to specify which touch gestures we want to detect.

To specify the touch gestures we want to use:

  1. Click Project > Edit Properties from the menu bar.
  2. Select Capabilities.
  3. Expand the Touch Gestures capability by clicking the arrow next to it.
  4. Check the boxes next to Pan, Pinch and Rotate, then click Done.


Pinch and rotate will be used in the next steps so we are saving time by adding them now.

Clear the console and restart the experience to verify there is no longer an error.



If we still had an error then it would appear in the console after we restarted.

Previewing on device

Now we're ready to preview the effect with the new gesture inputs. The Simulator has limited functionality when it comes to moving the plane tracker and using advanced touch effects, so from now on we'll be previewing the experience on a mobile device.

The easiest way to preview on a device is to use the Send to App option.

To test the project on a device:

  1. Click the Send To Device button.
  2. Click the arrow next to Send to App to expand the window if it isn't already.
  3. Click Send next to either Instagram or Facebook Camera to preview within the app.


You will now be able to move the location of the boombox in the world by using your finger.


Scaling the boombox with pinch gestures

Accessing the placer

For scaling and rotating the boombox we're going to use the placer object. We need to do this because we're already using the base object for the scaling animation; if we used it again here, we would override that animation.

Add the code for finding the placer beneath the line accessing the plane tracker:

const base = sceneRoot.find('base_jnt'); // Existing code
const speakerLeft = sceneRoot.find('speaker_left_jnt'); // Existing code
const speakerRight = sceneRoot.find('speaker_right_jnt'); // Existing code
const planeTracker = sceneRoot.find('planeTracker0'); // Existing code
const placer = sceneRoot.find('placer');

Now that we have access to the placer, we can store a reference to its transform (like we did with the base) to use for updating the scale and rotation.

Add this code to your script:

const placerTransform = placer.transform;

Subscribing to pinch gestures

To scale the boombox we're going to subscribe to pinch gestures, the gesture made with two fingers moving closer together or further apart.

The following code uses a slightly different subscription method that also allows us to capture signal values at the time the gesture is detected.

Add this code to your script:

TouchGestures.onPinch().subscribeWithSnapshot( {

}, function (gesture, snapshot) {

});

The subscribeWithSnapshot() method allows us to capture additional 'snapshot' values which we will need to pass through to the callback function for updating the scale.

Passing snapshot values

The values we want to pass to the callback function at the time of the pinch gesture are the current x, y and z scale values of the placerTransform. We need to do this because we want to change the scale (increase or decrease depending on the gesture) relative to its current value.

Add the snapshot code to the subscription function:

TouchGestures.onPinch().subscribeWithSnapshot( { // Existing code
    'lastScaleX' : placerTransform.scaleX,
    'lastScaleY' : placerTransform.scaleY,
    'lastScaleZ' : placerTransform.scaleZ 
}, function (gesture, snapshot) { // Existing code

}); // Existing code

A snapshot is a dictionary of signal values that can be accessed in the callback function by name.

Scaling the placer

The current scale values passed from the snapshot can now be used in the callback function along with the scale from the pinch gesture to scale the placer.

Add the scale code within the callback function:

TouchGestures.onPinch().subscribeWithSnapshot( { // Existing code
    'lastScaleX' : placerTransform.scaleX, // Existing code
    'lastScaleY' : placerTransform.scaleY, // Existing code
    'lastScaleZ' : placerTransform.scaleZ // Existing code
}, function (gesture, snapshot) { // Existing code
    placerTransform.scaleX = gesture.scale.mul(snapshot.lastScaleX);
    placerTransform.scaleY = gesture.scale.mul(snapshot.lastScaleY);
    placerTransform.scaleZ = gesture.scale.mul(snapshot.lastScaleZ);
}); // Existing code

Multiplying the snapshot values with the scale property of the PinchGesture gives the desired scaling effect.
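The arithmetic is worth spelling out: each pinch gesture reports a scale relative to the start of that gesture, so multiplying by the snapshot value makes successive pinches compound correctly. A plain-JavaScript sketch (illustrative only, not the Spark AR API):

```javascript
// Illustrative sketch only: each pinch reports a scale relative to the start
// of that gesture, so we multiply it by the scale captured in the snapshot.
function nextScale(lastScale, gestureScale) {
  return gestureScale * lastScale;
}

// Two successive pinches compound:
let scale = 1;
scale = nextScale(scale, 2);   // first pinch doubles the size
scale = nextScale(scale, 0.5); // second pinch halves it again
```

Without the snapshot, every pinch would scale from the original size instead of the current one, and the boombox would snap back between gestures.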

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');
const placer = sceneRoot.find('placer');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

TouchGestures.onPan().subscribe(function(gesture) {
    planeTracker.trackPoint(gesture.location, gesture.state);
});

const placerTransform = placer.transform;

TouchGestures.onPinch().subscribeWithSnapshot( {
    'lastScaleX' : placerTransform.scaleX,
    'lastScaleY' : placerTransform.scaleY,
    'lastScaleZ' : placerTransform.scaleZ 
}, function (gesture, snapshot) {
    placerTransform.scaleX = gesture.scale.mul(snapshot.lastScaleX);
    placerTransform.scaleY = gesture.scale.mul(snapshot.lastScaleY);
    placerTransform.scaleZ = gesture.scale.mul(snapshot.lastScaleZ);
});

Save your script and return to Spark AR Studio.

Send the project to your app of choice and you will now be able to scale the boombox by using two fingers and pinching on your device.


Rotating the boombox with rotation gestures

Subscribing to rotation gestures

To rotate the boombox we're going to subscribe to rotation gestures, a gesture made with two fingers rotating around each other.

Add this code to your script:

TouchGestures.onRotate().subscribeWithSnapshot( {
    'lastRotationY' : placerTransform.rotationY,
}, function (gesture, snapshot) {
    const correctRotation = gesture.rotation.mul(-1);
    placerTransform.rotationY = correctRotation.add(snapshot.lastRotationY);
});

We use subscribeWithSnapshot() to pass the y rotation value of the placerTransform (the axis we'll be rotating around) to the callback function.

In the callback function we multiply the RotateGesture rotation by -1 before adding it to the placerTransform to rotate the boombox. We multiply the gesture rotation by -1 to make sure the rotation happens in the correct direction.
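As a plain-JavaScript sketch (illustrative only, not the Spark AR API), the rotation update amounts to:

```javascript
// Illustrative sketch only: the gesture's rotation is negated, then added to
// the rotation captured in the snapshot, so the model turns in the same
// direction as the fingers and successive rotations accumulate.
function nextRotationY(lastRotationY, gestureRotation) {
  const correctRotation = gestureRotation * -1; // flip the gesture's direction
  return correctRotation + lastRotationY;       // accumulate onto the snapshot value
}
```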

Your script will now look like this:

const Animation = require('Animation');
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const sceneRoot = Scene.root;

const base = sceneRoot.find('base_jnt');
const speakerLeft = sceneRoot.find('speaker_left_jnt');
const speakerRight = sceneRoot.find('speaker_right_jnt');
const planeTracker = sceneRoot.find('planeTracker0');
const placer = sceneRoot.find('placer');

const baseDriverParameters = {
    durationMilliseconds: 400,
    loopCount: Infinity,
    mirror: true
};

const baseDriver = Animation.timeDriver(baseDriverParameters);
baseDriver.start();

const baseSampler = Animation.samplers.easeInQuint(0.9,1);

const baseAnimation = Animation.animate(baseDriver,baseSampler);

const baseTransform = base.transform;

baseTransform.scaleX = baseAnimation;
baseTransform.scaleY = baseAnimation;
baseTransform.scaleZ = baseAnimation;

const speakerDriverParameters = {
    durationMilliseconds: 200,
    loopCount: Infinity,
    mirror: true
};

const speakerDriver = Animation.timeDriver(speakerDriverParameters);
speakerDriver.start();

const speakerSampler = Animation.samplers.easeOutElastic(0.7,0.85);

const speakerAnimation = Animation.animate(speakerDriver,speakerSampler);

const speakerLeftTransform = speakerLeft.transform;

speakerLeftTransform.scaleX = speakerAnimation;
speakerLeftTransform.scaleY = speakerAnimation;
speakerLeftTransform.scaleZ = speakerAnimation;

const speakerRightTransform = speakerRight.transform;

speakerRightTransform.scaleX = speakerAnimation;
speakerRightTransform.scaleY = speakerAnimation;
speakerRightTransform.scaleZ = speakerAnimation;

TouchGestures.onPan().subscribe(function(gesture) {
    planeTracker.trackPoint(gesture.location, gesture.state);
});

const placerTransform = placer.transform;

TouchGestures.onPinch().subscribeWithSnapshot( {
    'lastScaleX' : placerTransform.scaleX,
    'lastScaleY' : placerTransform.scaleY,
    'lastScaleZ' : placerTransform.scaleZ 
}, function (gesture, snapshot) {
    placerTransform.scaleX = gesture.scale.mul(snapshot.lastScaleX);
    placerTransform.scaleY = gesture.scale.mul(snapshot.lastScaleY);
    placerTransform.scaleZ = gesture.scale.mul(snapshot.lastScaleZ);
});

TouchGestures.onRotate().subscribeWithSnapshot( {
    'lastRotationY' : placerTransform.rotationY,
}, function (gesture, snapshot) {
    const correctRotation = gesture.rotation.mul(-1);
    placerTransform.rotationY = correctRotation.add(snapshot.lastRotationY);
});

Save your script and return to Spark AR Studio.

Send the project to your app of choice and you will now be able to rotate the boombox by using two fingers and rotating on your device.


Tutorial Completed

Next Steps

Now that you've completed an effect using scripting, you can start thinking about customizing and publishing it. Make it your own by swapping out the 3D model, changing the sound or even using more modules from the scripting reference for further interactivity.