gemasowa
User interaction | Sound | Interactive? | OpenSource? | Multiplayer? | Technology | Platform |
---|---|---|---|---|---|---|
Modifying sound waves through gestures. | synthesizer | YES | YES | NO | Leap Motion, ThreeJS, Web Audio API | Web |
Since ThreeJS played only a small role in this work, I encountered no problems with it, nor was I amazed by anything, so I cannot really come to any conclusion regarding this JS library. The Leap Motion, in my opinion, lacks reliability and consistency. Its infrared cameras face bottom-up and fail to recognize fingers once the hand is rotated so that the palm faces upwards. The promised gestures (such as the swipe gesture), which are said to be detectable, seem to be overwhelmed by false-positive and false-negative detections. The Web Audio API cost me quite some time to get into, and I had trouble understanding a few of the concepts it implements. When I tried to refill the buffer with fresh values on every callback, the API seemed to have threading issues with real-time synthesis (see http://webaudio.github.io/web-audio-api/#JavaScriptPerformance), which led to constant crackling sounds.
Update: As it turns out, the crackling is a result of my event handler no longer being called at some point. The variable storing the handler function gets garbage collected after a few seconds, so the buffer is no longer updated. Since the Firefox browser handles this situation as intended, I call this a bug in the Chrome browser (http://stackoverflow.com/q/24338144).
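A minimal sketch of the usual workaround for this kind of garbage collection problem: keep a long-lived reference to the ScriptProcessorNode (and thereby to its onaudioprocess handler) so the browser cannot collect it. The `createSynthNode` helper and the `persistentNodes` array are illustrative names of mine, not part of the project code.

```javascript
// Keep script processor nodes reachable from a long-lived variable so the
// browser cannot garbage collect them together with their event handlers.
var persistentNodes = [];

function createSynthNode(audioCtx, frameCount, fillBuffer) {
    var node = audioCtx.createScriptProcessor(frameCount, 1, 1);
    node.onaudioprocess = function(e) {
        // delegate the actual buffer refill to the caller-supplied function
        fillBuffer(e.outputBuffer.getChannelData(0));
    };
    persistentNodes.push(node); // the long-lived reference
    node.connect(audioCtx.destination);
    return node;
}
```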
Regarding my second field of investigation, location tracking, I encountered either technology that was not accurate enough for our needs (see WiFi-based or iBeacon-based tracking) or technology that was far over our budget (see Pozyx and STEM).
The application draws a sound as a sine wave on the user's screen and changes its appearance and sound through gestures recognized by the Leap Motion device. The recognized gestures are pinch gestures between the index finger and thumb of each hand, which modify several parameters, such as amplitude, frequency and the modulation index (beta parameter) of a frequency modulation synthesis (briefly described below). Any alteration of the wave's appearance is also reflected in the audio output, generated by the Web Audio API.
The initial frequency of the sine wave is 220 Hz. The following parameters can be changed through gestures recognized by the Leap Motion device:
- Frequency of the carrier wave (Trägerfrequenz)
- Frequency of the modulator wave (Modulatorfrequenz)
- Modulation index (Modulationsindex)
All of the listed parameters can be altered using pinch gestures. The carrier and modulator wave frequencies can be changed by pinching either the index finger or the middle finger against the thumb, both with the right hand. The modulation index is changed by pinching the index finger and thumb of the left hand. The following formula (taken from Wikipedia) describes frequency modulation synthesis.
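The formula's structure can also be read off `getSinWaveVertices` below: one output sample is computed as y(t) = A · cos(ω_c·t + I · cos(ω_m·t)), where I is the modulation index. As a sketch (the helper name `fmSample` is mine, not from the project):

```javascript
// One sample of frequency modulation synthesis:
// y(t) = amplitude * cos(wC * t + modulIndex * cos(wM * t))
function fmSample(t, amplitude, freqCarrier, freqModulator, modulIndex, sampleRate) {
    var wC = 2 * Math.PI * freqCarrier / sampleRate;   // carrier phase increment per sample
    var wM = 2 * Math.PI * freqModulator / sampleRate; // modulator phase increment per sample
    return amplitude * Math.cos(wC * t + modulIndex * Math.cos(wM * t));
}
```

With a modulation index of 0 this degenerates into a plain cosine at the carrier frequency.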
- fix the audio crackling bug
- make the second wave's FM synthesis parameters changeable
- introduce different wave types
- sawtooth
- square
- triangle
- limit parameters to useful ranges
Minor crackles in the audio output, caused by the differing wave angles the audio buffer is filled with on each iteration. The Chrome browser happens to garbage collect the event handler (onaudioprocess) responsible for recalculating the wave values and refilling the buffer, which leads to unresponsiveness and audio crackles. Firefox works, though.
These parameters are globally accessible, are used constantly by several functions, and express the current wave's state.
```javascript
// global wave parameters
var waveTypes = [getSinWaveVertices, getSquareWaveVertices, getTriangleWaveVertices, getSawtoothWaveVertices];
var currentWaveType = 0; // start with the sine wave
var getWaveVertices = waveTypes[currentWaveType];
var _freqCarrier = 220;
var _freqModulator = 220;
var _amplitude = 1;
var _modulIndex = 0;
var _carrierAngle = 0.0;
var _modulatorAngle = 0.0;
```
First, we use ThreeJS to set up the visual environment and draw the initial wave.
```javascript
// set up ThreeJS scene
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
var renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// camera position
camera.position.z = 4;
camera.position.x = 5.3;

// line object, containing the wave's vertices and the wave's color
var line = new THREE.Line(new THREE.Geometry(), new THREE.LineBasicMaterial({ color: 0x87CEFA, linewidth: 3 }));

// draw wave initially
line.geometry.vertices = getWaveVertices(_freqCarrier, _amplitude, _freqModulator, _modulIndex);
scene.add(line);
```
Now we initialize the audio environment, using an audio buffer together with the onaudioprocess event handler, which constantly recalculates the wave's values and refills the buffer.
For further information see Sound synthesis (Web Audio API)
```javascript
// set up audio output
var audioCtx = new AudioContext();
var sampleRate = audioCtx.sampleRate;
var frameCount = 1024;
var node = audioCtx.createScriptProcessor(frameCount, 1, 1);

node.onaudioprocess = function(e) {
    var data = e.outputBuffer.getChannelData(0);
    line.geometry.vertices = getWaveVertices(_freqCarrier, _amplitude, _freqModulator, _modulIndex);
    fillBufferFromWaveVertices(data, line.geometry.vertices);
    line.geometry.verticesNeedUpdate = true;
};

// connect audio
node.connect(audioCtx.destination);
```
The logic consists mainly of two loops: one to handle the drawing and one to process gesture recognition.
This loop draws the wave at 24 FPS and prints the global parameters to the user's screen.
```javascript
// ThreeJS animation loop where drawing happens
function loop() {
    var fps = 24;
    setTimeout(function() {
        document.getElementById("text").innerHTML = "Trägerfrequenz = " + _freqCarrier + ";\nModulatorfrequenz = " + _freqModulator + ";\nAmplitude = " + _amplitude + ";\nModulationsindex = " + _modulIndex + ";";
        requestAnimationFrame(loop);
        renderer.render(scene, camera);
    }, 1000 / fps);
}
loop();
```
```javascript
var currentRightPinch = null, currentLeftPinch = null, pinchingFinger = null, pinchingStartPos = null;

// main Leap loop to recognize pinch gestures
Leap.loop({ enableGestures: true }, function(frame) {
    var hand = null;
    // detect pinching hand
    frame.hands.forEach(function(currentHand, index) {
        if (currentHand.pinchStrength > 0.99) hand = currentHand;
    });
    // when a pinching hand is found
    if (hand) {
        // and the pinching start position hasn't been set yet
        if (pinchingStartPos == null) {
            pinchingFinger = getPinchingFinger(hand);
            // check whether it's the index or middle finger pinching
            if (pinchingFinger.type == 1 || pinchingFinger.type == 2) {
                // set location where pinching started
                pinchingStartPos = pinchingFinger.dipPosition;
            }
        // we are still pinching, but the start position of the pinch is already saved
        } else {
            // calculate pinching vector for the according hand
            if (hand.type == "right") {
                var deltaX = (hand.indexFinger.dipPosition[0] - pinchingStartPos[0]) * 0.010;
                var deltaY = (hand.indexFinger.dipPosition[1] - pinchingStartPos[1]) * 0.025;
                currentRightPinch = new THREE.Vector3(deltaX, deltaY, 0);
            } else {
                var deltaX = (hand.indexFinger.dipPosition[0] - pinchingStartPos[0]) * 0.001;
                var deltaY = (hand.indexFinger.dipPosition[1] - pinchingStartPos[1]) * 1.00;
                currentLeftPinch = new THREE.Vector3(deltaX, deltaY, 0);
            }
        }
    // no hand pinching
    } else {
        // if there's still a pinch recognized
        if (currentRightPinch || currentLeftPinch) {
            // cancel pinch
            currentRightPinch = null;
            currentLeftPinch = null;
            pinchingStartPos = null;
            pinchingFinger = null;
        }
    }
    // redraw wave when there's a pinch currently happening
    if (pinchingStartPos) {
        // prevent null pointer exceptions
        currentLeftPinch = currentLeftPinch ? currentLeftPinch : { x: 0, y: 0 };
        currentRightPinch = currentRightPinch ? currentRightPinch : { x: 0, y: 0 };
        // apply pinching vectors to wave parameters
        _freqCarrier += pinchingFinger.type == 1 ? currentRightPinch.x : 0;
        _freqModulator += pinchingFinger.type == 2 ? currentRightPinch.x : 0;
        _modulIndex += currentLeftPinch.x;
        //_amplitude += currentRightPinch.y;
        line.geometry.verticesNeedUpdate = true;
    }
});
```
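The per-hand scaling of the pinch movement can be isolated into a small helper. `pinchDelta` is an illustrative name of mine; the scale factors 0.010 and 0.025 match the right-hand branch of the loop above:

```javascript
// Convert a finger position relative to where the pinch started
// into scaled x/y deltas for the wave parameters.
function pinchDelta(dipPosition, startPosition, scaleX, scaleY) {
    return {
        x: (dipPosition[0] - startPosition[0]) * scaleX,
        y: (dipPosition[1] - startPosition[1]) * scaleY
    };
}
```

Keeping the scale factors as parameters makes it easy to tune how strongly a pinch movement affects each parameter per hand.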
The code above uses the following helper functions.
```javascript
var getSinWaveVertices = function(freqCarrier, amplitude, freqModulator, modulIndex) {
    var fm = new Array(frameCount);
    var vertices = [];
    var TWOPI = 2 * Math.PI;
    var circleCarrier = TWOPI * freqCarrier / sampleRate;
    var circleModulator = TWOPI * freqModulator / sampleRate;
    var curphaseCarrier = circleCarrier + _carrierAngle;
    var curphaseModulator = circleModulator + _modulatorAngle;
    for (var t = 0; t < frameCount; t++) {
        fm[t] = amplitude * Math.cos(_carrierAngle + circleCarrier * t + modulIndex * Math.cos(_modulatorAngle + circleModulator * t));
        curphaseCarrier += circleCarrier;
        curphaseModulator += circleModulator;
        if (curphaseCarrier >= TWOPI) curphaseCarrier -= TWOPI;
        if (curphaseModulator >= TWOPI) curphaseModulator -= TWOPI;
        // turn the value into a vertex
        vertices.push(new THREE.Vector3(t / 100, fm[t], 0));
    }
    _carrierAngle = curphaseCarrier;
    _modulatorAngle = curphaseModulator;
    return vertices;
}
```
```javascript
var getSquareWaveVertices = function(freqCarrier, amplitude) {
    var output = new Array(frameCount);
    var vertices = [];
    var TWOPI = 2 * Math.PI;
    var curphase = _carrierAngle;
    var curFreq = 0;
    var incr = (freqCarrier * Math.PI / sampleRate) * 2;
    var twoPiOvSr = 2 * Math.PI / sampleRate;
    var value = 0;
    for (var i = 0; i < frameCount; i++) {
        if (curFreq != freqCarrier) {
            curFreq = freqCarrier;
            incr = twoPiOvSr * freqCarrier;
        }
        // first half of the period is high, second half is low
        if (curphase <= Math.PI) {
            value = amplitude;
        } else {
            value = amplitude * -1;
        }
        curphase = curphase + incr;
        if (curphase >= TWOPI) {
            curphase = curphase - TWOPI;
        }
        if (curphase < 0.0) {
            curphase = curphase + TWOPI;
        }
        output[i] = value;
        vertices.push(new THREE.Vector3(i / 100, value, 0));
    }
    _carrierAngle = curphase;
    return vertices;
}
```
```javascript
var getTriangleWaveVertices = function(freqCarrier, amplitude) {
    var output = new Array(frameCount);
    var vertices = [];
    var TWOPI = 2 * Math.PI;
    var curphase = _carrierAngle;
    var curFreq = 0;
    var incr = (freqCarrier * Math.PI / sampleRate) * 2;
    var twoPiOvSr = 2 * Math.PI / sampleRate;
    var value = 0;
    for (var i = 0; i < frameCount; i++) {
        if (curFreq != freqCarrier) {
            curFreq = freqCarrier;
            incr = twoPiOvSr * freqCarrier;
        }
        // fold the rising sawtooth into a triangle shape
        value = 2.0 * (curphase * (1 / TWOPI)) - 1;
        if (value < 0.0) {
            value = -1 * value;
        }
        value = 2 * (value - 0.5);
        curphase = curphase + incr;
        if (curphase >= TWOPI) {
            curphase = curphase - TWOPI;
        }
        if (curphase < 0.0) {
            curphase = curphase + TWOPI;
        }
        value = value * amplitude;
        output[i] = value;
        vertices.push(new THREE.Vector3(i / 100, value, 0));
    }
    _carrierAngle = curphase;
    return vertices;
}
```
```javascript
var getSawtoothWaveVertices = function(freqCarrier, amplitude) {
    var output = new Array(frameCount);
    var vertices = [];
    var TWOPI = 2 * Math.PI;
    var curphase = _carrierAngle;
    var curFreq = 0;
    var incr = (freqCarrier * Math.PI / sampleRate) * 2;
    var twoPiOvSr = 2 * Math.PI / sampleRate;
    var value = 0;
    for (var i = 0; i < frameCount; i++) {
        if (curFreq != freqCarrier) {
            curFreq = freqCarrier;
            incr = twoPiOvSr * freqCarrier;
        }
        // map the phase linearly onto [-1, 1]
        value = (2.0 * (curphase * (1.0 / TWOPI))) - 1.0;
        curphase = curphase + incr;
        if (curphase >= TWOPI) {
            curphase = curphase - TWOPI;
        }
        if (curphase < 0.0) {
            curphase = curphase + TWOPI;
        }
        output[i] = (value * amplitude);
        vertices.push(new THREE.Vector3(i / 100, value, 0));
    }
    _carrierAngle = curphase;
    return vertices;
}
```
```javascript
var getPinchingFinger = function(hand) {
    var pincher;
    var closest = 500;
    // start at index 1 to skip the thumb (finger index 0)
    for (var f = 1; f < 5; f++) {
        var current = hand.fingers[f];
        var distance = Leap.vec3.distance(hand.thumb.tipPosition, current.tipPosition);
        if (distance < closest) {
            closest = distance;
            pincher = current;
        }
    }
    return pincher;
};
```
```javascript
var fillBufferFromWaveVertices = function(buffer, vertices) {
    var waveValues = [];
    // normalize to +/- 0.9 to leave some headroom
    var maxValue = 0;
    for (var j = 0; j < vertices.length; j++) {
        var currV = Math.abs(vertices[j].y);
        maxValue = currV > maxValue ? currV : maxValue;
    }
    for (var i = 0; i < vertices.length; i++) {
        waveValues[i] = (vertices[i].y / maxValue) * 0.9;
    }
    // fill the audio buffer, wrapping around if there are fewer values than frames
    var pos = 0;
    for (var i = 0; i < frameCount; i++) {
        pos = pos >= waveValues.length ? 0 : pos;
        buffer[i] = waveValues[pos];
        pos++;
    }
};
```
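The normalization step can be exercised in isolation. `normalizeValues` is a hypothetical standalone version of the first two loops of `fillBufferFromWaveVertices`:

```javascript
// Scale values so the largest magnitude becomes `headroom` (e.g. 0.9),
// mirroring the normalization inside fillBufferFromWaveVertices.
function normalizeValues(values, headroom) {
    var maxValue = 0;
    for (var j = 0; j < values.length; j++) {
        var current = Math.abs(values[j]);
        if (current > maxValue) maxValue = current;
    }
    return values.map(function(v) { return (v / maxValue) * headroom; });
}
```

The headroom below full scale keeps the output from clipping at the digital maximum.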