Illustration by Angela Torchio
p5.js on mobile presents unique opportunities and challenges. The core p5.js framework does an excellent job of making it easy to read data from a phone's various inputs and sensors; however, it doesn't deal with the realities of contemporary browsers' built-in gestures and security protocols. That's where this library comes in:
- Simplifies accessing phone hardware from the browser (accelerometers, gyroscopes, microphone, vibration motor)
- Simplifies disabling default phone gestures (zoom, refresh, back, etc.)
- Simplifies enabling audio output
- Simplifies using an on-screen console to display errors and debug info
This library simplifies access to the following p5.js mobile sensor and audio commands:
Touch Events:
- `touchStarted()` - Called when a touch begins
- `touchEnded()` - Called when a touch ends
Device Motion & Orientation:
- `rotationX` - Device tilt forward/backward
- `rotationY` - Device tilt left/right
- `rotationZ` - Device rotation around screen
- `accelerationX` - Acceleration left/right
- `accelerationY` - Acceleration up/down
- `accelerationZ` - Acceleration forward/back
- `deviceShaken()` - Shake detection event
- `deviceMoved()` - Movement detection event
- `setShakeThreshold()` - Set shake detection sensitivity
- `setMoveThreshold()` - Set movement detection sensitivity
Audio Input (requires p5.sound):
- `p5.AudioIn()` - Audio input object
- `getLevel()` - Current audio input level
Device support:
- iOS 13+ (Safari)
- Android 7+ (Chrome)
Browser support:
- Chrome 80+
- Safari 13+
- Firefox 75+
<!-- Minified version (recommended) -->
<script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/p5-phone.min.js"></script>
<!-- Development version (larger, with comments) -->
<!-- <script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/p5-phone.js"></script> -->
Example index.html:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Mobile p5.js App</title>
<!-- Basic CSS to remove browser defaults and align canvas -->
<style>
body {
margin: 0;
padding: 0;
overflow: hidden;
}
</style>
<!-- Load p5.js library -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.10/p5.min.js"></script>
<!-- Load p5-phone library -->
<script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/p5-phone.min.js"></script>
</head>
<body>
<!-- Load the p5.js sketch -->
<script src="sketch.js"></script>
</body>
</html>
Example sketch.js:
let mic;
let mySound;
function preload() {
// Load sound file if needed
// mySound = loadSound('assets/sound.mp3');
}
function setup() {
// Show debug panel FIRST to catch setup errors
showDebug();
createCanvas(windowWidth, windowHeight);
// Lock mobile gestures to prevent browser interference
lockGestures();
// Enable motion sensors with tap-to-start
enableGyroTap('Tap to enable motion sensors');
// Enable microphone with tap-to-start (also enables sound output)
mic = new p5.AudioIn();
enableMicTap('Tap to enable microphone');
// OR enable sound output only (no microphone input)
// enableSoundTap('Tap to enable sound');
}
function draw() {
background(220);
// Always check status before using hardware features
if (window.sensorsEnabled) {
// Use device rotation and acceleration
fill(255, 0, 0);
circle(width/2 + rotationY * 5, height/2 + rotationX * 5, 50);
}
if (window.micEnabled) {
// Use microphone input
let level = mic.getLevel();
fill(0, 255, 0);
rect(10, 10, level * 200, 20);
}
if (window.soundEnabled) {
// Safe to play sounds
// mySound.play();
}
}
// Prevent default touch behavior (optional but recommended)
function touchStarted() {
return false;
}
function touchEnded() {
return false;
}
// Essential mobile setup
lockGestures() // Prevent browser gestures (call in setup())
// Motion sensor activation
enableGyroTap(message) // Tap anywhere to enable sensors
enableGyroButton(text) // Button-based sensor activation
// Microphone activation
enableMicTap(message) // Tap anywhere to enable microphone
enableMicButton(text) // Button-based microphone activation
// Sound output activation (no microphone input)
enableSoundTap(message) // Tap anywhere to enable sound playback
enableSoundButton(text) // Button-based sound activation
// Vibration motor (Android only)
enableVibrationTap(message) // Tap anywhere to enable vibration
enableVibrationButton(text) // Button-based vibration activation
vibrate(pattern) // Trigger vibration (duration or pattern array)
stopVibration() // Stop any ongoing vibration
// Status variables (check these in your code)
window.sensorsEnabled // Boolean: true when motion sensors are active
window.micEnabled // Boolean: true when microphone is active
window.soundEnabled // Boolean: true when sound output is active
window.vibrationEnabled // Boolean: true when vibration is available (Android only)
// Debug system (enhanced in v1.4.0)
showDebug() // Show on-screen debug panel with automatic error catching
hideDebug() // Hide debug panel
toggleDebug() // Toggle panel visibility
debug(...args) // Console.log with on-screen display and timestamps
debugError(...args) // Display errors with red styling
debugWarn(...args) // Display warnings with yellow styling
debug.clear() // Clear debug messages
p5.js Namespace Support: All functions are also available as p5.prototype methods:
// You can use either syntax:
lockGestures(); // Global function (recommended)
this.lockGestures(); // p5.js instance method
// Both approaches work identically
enableGyroTap('Tap to start');
this.enableGyroTap('Tap to start');
Purpose: Check whether permissions have been granted and sensors are active.
Variables:
- `window.sensorsEnabled` - Boolean indicating if motion sensors are active
- `window.micEnabled` - Boolean indicating if microphone is active
- `window.soundEnabled` - Boolean indicating if sound output is active
- `window.vibrationEnabled` - Boolean indicating if vibration is available (Android only)
Usage:
function draw() {
// Always check before using sensor data
if (window.sensorsEnabled) {
// Safe to use rotationX, rotationY, accelerationX, etc.
let tilt = rotationX;
}
if (window.micEnabled) {
// Safe to use microphone
let audioLevel = mic.getLevel();
}
if (window.soundEnabled) {
// Safe to play sounds
mySound.play();
}
if (window.vibrationEnabled) {
// Safe to use vibration (Android only)
vibrate(50);
}
}
// You can also use them for conditional UI
function setup() {
enableGyroTap('Tap to enable motion');
// Show different instructions based on status
if (!window.sensorsEnabled) {
debug("Motion sensors not yet enabled");
}
}
Purpose: Prevents unwanted mobile browser gestures that can interfere with your p5.js app.
When to use: Call once in your setup() function after creating the canvas.
What it blocks:
- Pinch-to-zoom - Prevents users from accidentally zooming the page
- Pull-to-refresh - Stops the browser refresh gesture when pulling down
- Swipe navigation - Disables back/forward swipe gestures
- Long-press context menus - Prevents copy/paste menus from appearing
- Text selection - Stops accidental text highlighting on touch and hold
- Double-tap zoom - Eliminates double-tap to zoom behavior
function setup() {
createCanvas(windowWidth, windowHeight);
lockGestures(); // Essential for smooth mobile interaction
}
Purpose: Enable device motion and orientation sensors with user permission handling.
Commands:
- `enableGyroTap(message)` - Tap anywhere on screen to enable sensors
- `enableGyroButton(text)` - Creates a button with custom text to enable sensors
Usage:
// Tap-to-enable (recommended)
enableGyroTap('Tap to enable motion sensors');
// Button-based activation
enableGyroButton('Enable Motion');
Available p5.js Variables (when window.sensorsEnabled is true):
| Variable | Description | Range/Units |
|---|---|---|
| `rotationX` | Device tilt forward/backward | -180° to 180° |
| `rotationY` | Device tilt left/right | -180° to 180° |
| `rotationZ` | Device rotation around screen | -180° to 180° |
| `accelerationX` | Acceleration left/right | m/s² |
| `accelerationY` | Acceleration up/down | m/s² |
| `accelerationZ` | Acceleration forward/back | m/s² |
| `deviceShaken` | Shake detection event | true when shaken |
| `deviceMoved` | Movement detection event | true when moved |
Important: All motion sensor variables, including deviceShaken and deviceMoved, are only available when window.sensorsEnabled is true. Always check this status before using any motion data.
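The `setShakeThreshold()` and `setMoveThreshold()` commands listed in the overview are standard p5.js functions and can be tuned in setup(); a minimal sketch (the threshold values here are illustrative):

```javascript
function setup() {
  createCanvas(windowWidth, windowHeight);
  enableGyroTap('Tap to enable motion sensors');
  // Require a harder shake before shake detection fires (p5's default threshold is 30)
  setShakeThreshold(60);
  // Require larger movement before movement detection fires (p5's default is 0.5)
  setMoveThreshold(1.0);
}
```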
Example:
function draw() {
// CRITICAL: Always check window.sensorsEnabled first
if (window.sensorsEnabled) {
// Tilt-controlled circle
let x = width/2 + rotationY * 3;
let y = height/2 + rotationX * 3;
circle(x, y, 50);
// Shake detection - only works when sensors are enabled
if (deviceShaken) {
background(random(255), random(255), random(255));
}
// Movement detection - also requires sensors to be enabled
if (deviceMoved) {
fill(255, 0, 0);
}
} else {
// Show fallback when sensors not enabled
text('Tap to enable motion sensors', 20, 20);
}
}
Purpose: Enable device microphone with user permission handling for audio-reactive applications.
Important: Microphone examples require the p5.sound library. Add this script tag to your HTML:
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.0/addons/p5.sound.min.js"></script>
Commands:
- `enableMicTap(message)` - Tap anywhere on screen to enable microphone
- `enableMicButton(text)` - Creates a button with custom text to enable microphone
Usage:
// Tap-to-enable (recommended)
enableMicTap('Tap to enable microphone');
// Button-based activation
enableMicButton('Enable Audio');
Available p5.js Variables (when window.micEnabled is true):
| Variable | Description | Range |
|---|---|---|
| `p5.AudioIn()` | Audio input object (stored in `mic`) | Object |
| `mic.getLevel()` | Current audio input level | 0.0 to 1.0 |
Example:
let mic;
function setup() {
createCanvas(windowWidth, windowHeight);
// Create a new p5.AudioIn() instance
mic = new p5.AudioIn();
// Enable microphone with tap
enableMicTap();
}
function draw() {
if (window.micEnabled) {
// The mic object is a p5.AudioIn() instance
// Audio-reactive visualization
let level = mic.getLevel();
let size = map(level, 0, 1, 10, 200);
background(level * 255);
circle(width/2, height/2, size);
}
}
Purpose: Enable audio playback without requiring microphone input. Perfect for playing sounds, music, synthesizers, and audio effects in mobile browsers.
Important: Sound examples require the p5.sound library. Add this script tag to your HTML:
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.0/addons/p5.sound.min.js"></script>
Commands:
- `enableSoundTap(message)` - Tap anywhere on screen to enable sound playback
- `enableSoundButton(text)` - Creates a button with custom text to enable sound
Usage:
// Tap-to-enable (recommended)
enableSoundTap('Tap to enable sound');
// Button-based activation
enableSoundButton('Enable Sound');
When to use Sound vs. Microphone:
- Use `enableSound` for: Playing audio files, synthesizers, oscillators, sound effects
- Use `enableMic` for: Recording audio, audio-reactive visualizations, voice input
- Note: `enableMic` also enables sound output, so you don't need both
Example:
let mySound;
function preload() {
// Load audio file
mySound = loadSound('assets/sound.mp3');
}
function setup() {
createCanvas(windowWidth, windowHeight);
// Enable sound playback with tap
enableSoundTap('Tap to enable sound');
}
function draw() {
background(220);
if (window.soundEnabled) {
text('Tap anywhere to play sound', 20, 20);
} else {
text('Waiting for sound activation...', 20, 20);
}
}
function mousePressed() {
// Check if sound is enabled before playing
if (window.soundEnabled && !mySound.isPlaying()) {
mySound.play();
}
}
Purpose: Access the device's vibration motor for haptic feedback and tactile interactions.
- ✅ Android - Full support in Chrome and most Android browsers
- ❌ iOS - Not supported (Vibration API not available on iOS devices)
Important: The vibration feature will automatically detect if the device supports vibration. On iOS or unsupported devices, window.vibrationEnabled will be false and vibration calls will be safely ignored with console warnings.
Commands:
- `enableVibrationTap(message)` - Tap anywhere on screen to enable vibration
- `enableVibrationButton(text)` - Creates a button with custom text to enable vibration
- `vibrate(pattern)` - Trigger vibration with a duration (ms) or pattern array
- `stopVibration()` - Stop any ongoing vibration
Usage:
function setup() {
createCanvas(windowWidth, windowHeight);
// Enable vibration with tap (Android only)
enableVibrationTap('Tap to enable vibration');
// Or use a button
// enableVibrationButton('Enable Haptics');
}
function draw() {
background(220);
if (window.vibrationEnabled) {
text('Vibration ready! Tap anywhere', 20, 20);
} else {
text('Vibration not available', 20, 20);
}
}
function mousePressed() {
if (window.vibrationEnabled) {
// Simple vibration - 50ms pulse
vibrate(50);
}
}
Vibration Patterns:
// Single vibration (duration in milliseconds)
vibrate(100); // Vibrate for 100ms
// Pattern: [vibrate, pause, vibrate, pause, ...]
vibrate([100, 50, 100]); // Short-short pattern
vibrate([200, 100, 200, 100, 200]); // Triple pulse
vibrate([50, 50, 50, 50, 500]); // Quick taps then long
// Stop any ongoing vibration
stopVibration();
Common Use Cases:
// Haptic feedback for button presses
function mousePressed() {
if (window.vibrationEnabled) {
vibrate(20); // Quick tap feedback
}
}
// Touch zones with different haptic patterns
function touchStarted() {
if (window.vibrationEnabled) {
if (mouseX < width/2) {
vibrate(50); // Left side - short pulse
} else {
vibrate([50, 30, 50]); // Right side - double pulse
}
}
return false;
}
// Collision detection
function checkCollision() {
if (collision && window.vibrationEnabled) {
vibrate([100, 50, 100, 50, 200]); // Alert pattern
}
}
// Game events
function gameOver() {
if (window.vibrationEnabled) {
vibrate(500); // Long vibration for game over
}
}
Best Practices:
- Use short vibrations (20-100ms) for subtle feedback
- Use patterns for more complex haptic responses
- Always check `window.vibrationEnabled` before calling `vibrate()`
- Don't overuse - vibration can quickly drain battery
- Test on Android devices, as iOS doesn't support vibration
Purpose: Simplified camera access optimized for ML5.js machine learning models (FaceMesh, HandPose, BodyPose, etc.). Handles camera initialization, coordinate mapping, mirroring, and display modes automatically.
Key Features:
- Automatic Coordinate Mapping - ML5 keypoints automatically mapped to canvas coordinates
- Mirror Support - Handles front camera mirroring for natural interaction
- Display Modes - Multiple video sizing options (fitHeight, cover, contain, fixed)
- ML5 Optimized - Direct integration with ML5 v1.x models
- Auto-initialization - Camera starts automatically when permissions are granted
Commands:
| Function | Purpose | Parameters |
|---|---|---|
| `createPhoneCamera(active, mirror, mode)` | Create new camera instance | `active`: 'user' or 'environment'; `mirror`: true/false; `mode`: 'fitHeight', 'cover', 'contain', 'fixed' |
| `enableCameraTap(message)` | Tap to enable camera | Optional message string |
| `cam.onReady(callback)` | Execute code when camera ready | Callback function |
| `cam.mapKeypoint(keypoint)` | Map single ML5 keypoint to screen | ML5 keypoint object |
| `cam.mapKeypoints(keypoints)` | Map array of ML5 keypoints | Array of ML5 keypoints |
Properties:
| Property | Description | Type |
|---|---|---|
| `cam.ready` | Camera initialization status | Boolean |
| `cam.video` | p5.js video element | p5.Element |
| `cam.active` | Current camera ('user'/'environment') | String |
| `cam.mirror` | Mirror state | Boolean |
| `cam.mode` | Display mode | String |
| `cam.width` | Video width | Number |
| `cam.height` | Video height | Number |
Basic Setup:
let cam;
let facemesh;
let faces = [];
function setup() {
createCanvas(windowWidth, windowHeight);
// Create camera: front camera, mirrored, fit to canvas height
cam = createPhoneCamera('user', true, 'fitHeight');
// Enable camera (auto-starts if permission granted)
enableCameraTap();
// Start ML5 when camera is ready
cam.onReady(() => {
let options = {
maxFaces: 1,
refineLandmarks: false,
flipHorizontal: false // cam.mapKeypoint() handles mirroring
};
facemesh = ml5.faceMesh(options, modelLoaded);
});
}
function modelLoaded() {
// Start detection - use cam.videoElement for ML5
facemesh.detectStart(cam.videoElement, (results) => {
faces = results;
});
}
function draw() {
background(220);
// Draw camera feed
if (cam.ready) {
image(cam, 0, 0); // PhoneCamera handles positioning automatically
}
// Draw tracked face keypoints
if (faces.length > 0) {
let face = faces[0];
// Map nose tip keypoint (index 1) to screen coordinates
let nose = cam.mapKeypoint(face.keypoints[1]);
// Use coordinates for interaction
fill(255, 0, 0);
circle(nose.x, nose.y, 30);
// Map all keypoints at once
let allPoints = cam.mapKeypoints(face.keypoints);
for (let point of allPoints) {
circle(point.x, point.y, 3);
}
}
}
Display Modes:
| Mode | Behavior |
|---|---|
| `'fitHeight'` | Scale video to canvas height (default, recommended) |
| `'cover'` | Fill entire canvas (may crop video) |
| `'contain'` | Fit entire video in canvas (may show letterboxing) |
| `'fixed'` | Fixed size (set with `cam.fixedWidth`, `cam.fixedHeight`) |
Coordinate Mapping:
The mapKeypoint() and mapKeypoints() functions automatically handle:
- Video-to-canvas scaling
- Mirror transformation (for front camera)
- Offset positioning (for different display modes)
- 3D coordinates (preserves z-depth from BlazePose)
// Single keypoint
let nose = cam.mapKeypoint(face.keypoints[1]);
console.log(nose.x, nose.y, nose.z); // Screen coordinates + depth
// Multiple keypoints
let hands = cam.mapKeypoints(hand.keypoints);
hands.forEach(point => {
circle(point.x, point.y, 5);
});
ML5 Model Examples:
// FaceMesh (468 keypoints)
let faceOptions = { maxFaces: 1, refineLandmarks: false, flipHorizontal: false };
facemesh = ml5.faceMesh(faceOptions, modelLoaded);
// HandPose (21 keypoints per hand)
let handOptions = { maxHands: 2, runtime: 'mediapipe', flipHorizontal: false };
handpose = ml5.handPose(handOptions, modelLoaded);
// BodyPose (33 keypoints with 3D)
let bodyOptions = { modelType: 'MULTIPOSE_LIGHTNING', flipped: false };
bodypose = ml5.bodyPose('BlazePose', bodyOptions, modelLoaded);
Important Notes:
- Always set `flipHorizontal: false` in ML5 options (PhoneCamera handles mirroring)
- Use `cam.videoElement` (native HTML video element) when passing to ML5's `detectStart()`
- Check `cam.ready` before using video or drawing keypoints
- Call `enableCameraTap()` to handle camera permissions automatically
Purpose: Essential on-screen debugging system for mobile development where traditional browser dev tools aren't accessible. Provides automatic error catching, timestamped logging, and color-coded messages.
Why use it: Mobile browsers often hide JavaScript errors, making debugging difficult. This system displays all errors, warnings, and custom messages directly on your mobile screen with timestamps and color coding.
Commands:
| Function | Purpose | Example |
|---|---|---|
| `showDebug()` | Show debug panel and enable error catching | `showDebug()` |
| `hideDebug()` | Hide debug panel | `hideDebug()` |
| `toggleDebug()` | Toggle panel visibility | `toggleDebug()` |
| `debug(...args)` | Log messages (white text) | `debug("App started", frameRate())` |
| `debugError(...args)` | Display errors (red text) | `debugError("Connection failed")` |
| `debugWarn(...args)` | Display warnings (yellow text) | `debugWarn("Low battery")` |
| `debug.clear()` | Clear all messages | `debug.clear()` |
Key Features:
- Automatic Error Catching - JavaScript errors automatically displayed with red styling
- Error Location - Shows filename and line number for easy debugging
- Timestamps - All messages include precise timestamps
- Color Coding - Errors (red), warnings (yellow), normal messages (white)
- Mobile Optimized - Touch-friendly interface that works on small screens
- Keyboard Shortcuts - Press 'D' to toggle, 'C' to clear (when debug is enabled)
Critical Setup:
function setup() {
// IMPORTANT: Call showDebug() FIRST to catch setup errors
showDebug();
createCanvas(windowWidth, windowHeight);
// Any errors after this point will be automatically caught and displayed
}
Usage Examples:
// Basic logging
debug("Touch at:", mouseX, mouseY);
debug("Sensors enabled:", window.sensorsEnabled);
// Error handling
debugError("Failed to load image");
debugWarn("Frame rate dropping:", frameRate());
// Objects and arrays
debug("Touch points:", touches);
debug({rotation: rotationX, acceleration: accelerationX});
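One pattern worth noting: p5's asset loaders accept success and error callbacks, which pair naturally with the debug panel so failed loads show up on screen. A sketch (the asset path is hypothetical):

```javascript
let mySound;

function preload() {
  // loadSound (from p5.sound) accepts success and error callbacks
  mySound = loadSound('assets/sound.mp3',
    () => debug('Sound loaded'),
    (err) => debugError('Sound failed to load:', err)
  );
}
```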