
VisionProTeleop v2.50: External Network Connection, Isaac Lab Support, and More


🎉 VisionProTeleop v2.50 Release Notes

🌟 Highlights

This release introduces external network connectivity, native simulation streaming for MuJoCo and Isaac Lab, public dataset sharing, and a companion iOS app, making Apple Vision Pro even more useful for robotics research and development.


🆕 Major New Features

🌐 External Network Mode (Remote Connectivity)

No more same-WiFi requirement! Connect Vision Pro and your Python client from anywhere in the world.

  • Room Code System: Vision Pro generates a room code (e.g., "ABC-1234") that you use instead of an IP address
  • WebRTC with TURN Relay: Automatic NAT traversal using signaling and TURN servers
  • Seamless API: Just change ip="192.168.1.100" to ip="ABC-1234" — everything else works the same
# External network - use room code instead of IP
s = VisionProStreamer(ip="ABC-1234")  
s.configure_video(device="/dev/video0", format="v4l2")
s.start_webrtc()

🎮 Native Simulation Streaming

MuJoCo AR Streaming

Stream MuJoCo simulations directly to Vision Pro as native AR objects rendered by RealityKit:

  • Automatic MJCF → USD conversion with USDZ caching
  • Real-time pose updates via WebRTC (more efficient than video streaming)
  • Configurable world-to-AR placement with relative_to parameter
import mujoco

model = mujoco.MjModel.from_xml_path("robot.xml")
data = mujoco.MjData(model)

s.configure_mujoco("robot.xml", model, data, relative_to=[0, 0, 0.8, 90])
s.start_webrtc()
while True:
    mujoco.mj_step(model, data)
    s.update_sim()  # stream poses, not rendered frames

Isaac Lab AR Streaming 🆕

Full support for NVIDIA Isaac Lab simulations:

  • Stream Isaac Lab scenes with native RealityKit rendering
  • Multi-environment support with env_indices selection
  • Automatic USD export and caching
# env is an already-constructed Isaac Lab environment
s.configure_isaac(scene=env.scene, relative_to=[0, 0, 0.8, 90], env_indices=[0])
s.start_webrtc()

📱 Companion iOS App: Tracking Manager

A new iOS app for managing your VisionProTeleop workflow:

  • Recording Management: Browse, playback, and manage cloud recordings
  • 3D Visualization: View synchronized video + skeleton overlays
  • Camera Calibration: Perform intrinsic/extrinsic calibration with visual guidance
  • Remote Settings: Configure Vision Pro app settings from your iPhone
  • Public Sharing: Share recordings with the research community

📊 Public Dataset Sharing

Create and access community datasets of egocentric manipulation videos:

  • Share recordings via CloudKit (data stays in your cloud, just made shareable)
  • Browse and download others' public recordings
  • Full Python API for dataset access
from avp_stream.datasets import list_public_recordings, download_recording

recordings = list_public_recordings()
download_recording(recordings[0], dest_dir="./data")

🔧 New Tracking Capabilities

Enhanced Hand Tracking API

New attribute-style access for cleaner code (fully backward compatible):

data = s.get_latest()

# New API (recommended)
data.right.wrist           # (4, 4) transform
data.right.indexTip        # (4, 4) fingertip
data.right.pinch_distance  # float: thumb-index distance
data.right.wrist_roll      # float: axial rotation

# Legacy API (still works)
data["right_wrist"]        # Same data

Predictive Hand Tracking 🆕

Compensate for system/network latency with ARKit's predictive tracking:

  • Configurable prediction offset (0–100 ms)
  • Reduces perceived latency for responsive teleoperation (see the sketch below)
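
A minimal sketch of enabling prediction; the prediction_ms keyword is an assumed name for illustration, so check the package docs for the actual parameter:

# Hypothetical parameter name -- the release exposes a configurable 0-100 ms offset
s = VisionProStreamer(ip="ABC-1234", prediction_ms=50)  # ask ARKit to predict poses ~50 ms ahead
data = s.get_latest()  # returned transforms are extrapolated to mask system/network latency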

Marker & Image Tracking 🆕

Track ArUco markers and custom reference images:

markers = s.get_markers()
for marker_id, info in markers.items():
    pose = info["pose"]  # (4, 4) transform matrix

Stylus Tracking 🆕

Track the Logitech Muse stylus (requires visionOS 26.0+):

stylus = s.get_stylus()
if stylus["tip_pressed"]:
    pressure = stylus["tip_pressure"]  # 0.0-1.0

📹 Egocentric Recording Improvements

UVC Camera Support

  • Direct USB camera connection via Developer Strap
  • CAD models for 3D-printable mounting brackets in assets/adapters/
  • Full frame access without Apple Enterprise approval (see the quick camera check below)
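
Before printing a bracket and mounting the camera, it can help to confirm it works as a plain UVC device. This OpenCV snippet is a generic sanity check, not part of avp_stream:

import cv2

cap = cv2.VideoCapture(0)   # UVC cameras enumerate as standard video devices
ok, frame = cap.read()      # grab a single frame to confirm the device streams
assert ok, "no frame received from UVC camera"
cap.release()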

Camera Calibration

  • Intrinsic Calibration: Camera parameters via Tracking Manager iOS app
  • Extrinsic Calibration: Camera-to-Vision Pro transform (applied as sketched below)
  • Visual guidance and validation tools
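
Applying the extrinsic is plain matrix composition: the camera pose in the world frame is the headset pose times the camera-to-Vision Pro transform. A sketch with numpy, assuming the headset pose is exposed under a "head" key like the legacy hand keys, with an illustrative file path for the calibration result:

import numpy as np

data = s.get_latest()
T_world_head = data["head"]                    # (4, 4) headset pose in the world frame (key name assumed)
T_head_camera = np.load("extrinsic.npy")       # (4, 4) calibration output (illustrative path)
T_world_camera = T_world_head @ T_head_camera  # camera pose in the world frame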

☁️ Cloud Storage Expansion

New Storage Providers

  • Google Drive 🆕
  • Dropbox 🆕
  • iCloud Drive (existing)

Recording Contents

Recordings now include:

  • Video (H.264/H.265)
  • All tracking data (hands, head, markers, stylus)
  • Simulation data (if using MuJoCo/Isaac streaming)
  • Metadata and calibration info

📚 New Examples

Example                            Description
00_hand_streaming.py               Basic hand tracking
09_mujoco_streaming.py             MuJoCo AR simulation
10_teleop_osc_franka.py            OSC-based Franka teleoperation
11_diffik_aloha.py                 Differential IK with ALOHA robot
12_diffik_shadow_hand.py           Shadow Hand teleoperation
13_g1_freefall.py                  Unitree G1 simulation
14_kuka_allegro_streaming.py       KUKA + Allegro Hand
15_franka_visionpro_teleop.py      Complete Franka teleop example
16_spot_demo.py                    Boston Dynamics Spot
17_marker_detection_streaming.py   ArUco marker tracking
18_public_datasets.py              Public dataset browser
19_logitech_muse_stylus.py         Stylus tracking demo

🙏 Acknowledgements

Thanks to all contributors and the robotics research community for feedback and feature requests that shaped this release.


📥 Upgrade Instructions

# Python package
pip install --upgrade avp_stream

# visionOS app
#   Update from the App Store

# iOS companion app (NEW)
#   Install "Tracking Manager" from the App Store