This is the ninth article in an ongoing series about game development for the Web.
In the previous article, we saw a practical demonstration of event-driven development in JavaScript. We used the mouse events defined in the DOM Level 2 Event Model to handle user input with a pointing device, and we demonstrated how to create a simple application that made use of them.
This time around, we will discuss a paradigm brought by smartphones and tablets over the last decade: touch as a form of digital input.
We will talk briefly about the state of touch on the Web and, more importantly, how to support it.
Testing Touch Input
Testing mobile experiences takes longer and makes debugging more complex, and touch input is no exception, unless, of course, your development machine has a touch-capable screen.
To work around the differences between our development machines and the devices we target when doing mobile development, browser vendors provide an option within their development tools called "device mode".
These tools allow mobile testing through emulation: device mode emulates things like screen size, touch input, and more.
Select your favorite web browser and follow the steps its documentation provides to enable these features.
Touch Input
In JavaScript, touch events are handled much like mouse events, but they are somewhat more complex. As with a mouse, you can listen for the start of a touch, its movement, its end, and so on. However, while users usually have only one mouse pointer, they may touch the screen with multiple fingers at the same time.
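Before registering touch listeners, it is common to check whether the browser exposes them at all. Here is a minimal sketch using simple feature detection (a common heuristic, not an exhaustive check, and not part of the original example):
// Basic feature detection for touch support
//
var touchSupported = "ontouchstart" in window;

if (touchSupported) {
  // Register touch listeners...
} else {
  // Fall back to the mouse events from the previous article...
}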
As of this writing, the W3C touch events specification defines four touch events:
- touchstart: A user places a touch point on the touch surface.
- touchend: A user removes a touch point from the touch surface.
- touchmove: A user moves a touch point along the touch surface.
- touchcancel: A touch point has been disrupted in an implementation-specific manner.
The registered event listeners receive an object that implements the TouchEvent interface, containing all the relevant information about the touch action performed.
The TouchEvent.changedTouches property provides a list of every point of contact that contributed to the event. Each of these points of contact is represented by a concrete implementation of the Touch interface.
Similar to MouseEvent, Touch objects have properties that define their coordinates within the viewport (the canvas) or the (browser) screen, in pixels.
Example:
// touchstart event binding
//
canvas.addEventListener("touchstart", function(event) {
  // Handle touchstart...
}, false);

// touchmove event binding
//
canvas.addEventListener("touchmove", function(event) {
  // Handle touchmove...
}, false);

// touchend event binding
//
canvas.addEventListener("touchend", function(event) {
  // Handle touchend...
}, false);
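For instance, a handler can walk the changedTouches list and read each Touch object's viewport coordinates. A minimal sketch (the logging is only for illustration):
canvas.addEventListener("touchmove", function(event) {
  // Inspect every contact point that contributed to this event
  //
  for (var i = 0; i < event.changedTouches.length; i++) {
    var touch = event.changedTouches[i];
    // clientX/clientY are viewport coordinates, in pixels
    console.log("Touch " + touch.identifier + " at (" +
      touch.clientX + ", " + touch.clientY + ")");
  }
}, false);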
Adding Constraints
Touch input opens up new possibilities for user interaction. It can be used to move objects or to define gestures with one, two, three, or more fingers.
In the following example, we will use touch input to drag an element horizontally on the screen, constrained to a given length in pixels.
Example:
// Get a reference to the slider div
//
var slider = document.getElementById("slider");
// Get a reference to the slider knob
//
var knob = document.getElementById("knob");
// Get a reference to the image
//
var image = document.getElementById("image");
// Half the knob's width, used to center the knob
// under the touch point
//
var knobMid = knob.offsetWidth / 2;
// Binding events: pass the handler function itself,
// not the result of calling it
//
slider.addEventListener("touchstart", touchXY, false);
slider.addEventListener("touchmove", touchXY, false);
function touchXY(event) {
  // Prevent the browser's default touch behavior (e.g., scrolling)
  event.preventDefault();
  // Touch position relative to the slider's left edge
  var touchX = event.touches[0].pageX - slider.offsetLeft;
  if (touchX >= 0 && touchX <= slider.offsetWidth) {
    setKnob(touchX);
  }
}
function setKnob(x) {
  // Center the knob under the touch point and clamp it
  // to the slider's bounds
  var knobX = x - knobMid;
  knobX = Math.max(knobX, 0);
  knobX = Math.min(knobX, slider.offsetWidth - knob.offsetWidth);
  if (knob) {
    knob.style.left = knobX + 'px';
  }
}
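On desktop, the same logic can be reused with the mouse events we covered in the previous article. A minimal sketch (the mouseXY handler and mouseIsDown flag are additions for illustration, not part of the original example):
// Reuse the slider logic for mouse input on desktop
//
var mouseIsDown = false;

function mouseXY(event) {
  var mouseX = event.pageX - slider.offsetLeft;
  if (mouseX >= 0 && mouseX <= slider.offsetWidth) {
    setKnob(mouseX);
  }
}

slider.addEventListener("mousedown", function(event) {
  mouseIsDown = true;
  mouseXY(event);
}, false);

slider.addEventListener("mousemove", function(event) {
  if (mouseIsDown) {
    mouseXY(event);
  }
}, false);

document.addEventListener("mouseup", function() {
  mouseIsDown = false;
}, false);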
Giving It Purpose
Let's go ahead and use touch input in our primitive Fruit Ninja clone, Bubble Ninja.
function animate() {
  // Clear the canvas
  //
  ctx.clearRect(0, 0, canvas.width, canvas.height);

  // Create a path for each bubble
  //
  for (var i = 0; i < 4; i++) {
    // Set the bubble color
    //
    ctx.strokeStyle = colors[i];

    // Random speed per bubble just because
    //
    bubble[i] += speed[i];

    // If the bubble is no longer visible, reset its position
    //
    if (bubble[i] >= canvas.height + 10) {
      bubble[i] = -10;
    }

    var y = bubble[i];
    var x = (i + 1) * 50;
    var radius = 20;

    // Draw the bubble
    //
    ctx.beginPath();
    ctx.arc(x, y, radius, 0, 2 * Math.PI);
    ctx.closePath();

    // Collision test: check every active touch point against the
    // bubble's path (touches is assumed to hold canvas-relative
    // coordinates; see the sketch after this listing)
    //
    for (var j = 0; j < touches.length; j++) {
      if (ctx.isPointInPath(touches[j].x, touches[j].y)) {
        // Reset the bubble position
        //
        bubble[i] = -30;

        // Increase the score
        //
        score++;
      }
    }

    ctx.stroke();
  }

  // Draw the score once per frame, outside the bubble loop
  //
  ctx.font = "italic 200 36px/2 Unknown Font, sans-serif";
  ctx.fillStyle = "gray";
  ctx.fillText(score, 24, canvas.height - 24);

  window.requestAnimationFrame(animate);
}
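The code above assumes a global touches array that always holds the current contact points, expressed in canvas coordinates. Here is a minimal sketch of how such an array could be maintained (the updateTouches name and the stored structure are assumptions for illustration, not part of the original example):
// Keep a list of active touch points in canvas coordinates
//
var touches = [];

function updateTouches(event) {
  event.preventDefault();
  var rect = canvas.getBoundingClientRect();
  touches = [];
  for (var i = 0; i < event.touches.length; i++) {
    touches.push({
      x: event.touches[i].clientX - rect.left,
      y: event.touches[i].clientY - rect.top
    });
  }
}

canvas.addEventListener("touchstart", updateTouches, false);
canvas.addEventListener("touchmove", updateTouches, false);
canvas.addEventListener("touchend", updateTouches, false);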
Code Examples
Working examples for this article and the rest of the series can be downloaded through the following link. The source code for these and future examples is hosted in a public repository on GitHub.
Feel free to fork and submit pull requests.
Conclusion
In this article, we discussed how JavaScript events help us handle user input through touch-capable surfaces. We talked about how to create TouchEvent listeners for HTML elements and evaluated the different scenarios involving each event type.
Have you ever implemented a touch-based input system in JavaScript? If so, what did you have to consider, and what advice would you give?