feat: context-aware eye gaze system (initial/local)#711

Open
kidkwazine wants to merge 8 commits into BasisVR:developer from kidkwazine:eye-gaze-overhaul

Conversation

@kidkwazine
Contributor

Updates the local eye saccades to a context-aware gaze system, so eyes naturally focus on nearby players, mirrors, and cameras while falling back to improved idle movement when nothing's around.

Social gaze targeting: Eyes pick up on nearby players (scored by mutual attention, proximity, and facing direction) and cycle focus through a social pattern.
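
As a rough illustration of that scoring idea, here's a minimal Python sketch (the function name, weights, and range are illustrative assumptions, not the actual C# implementation in this PR):

```python
import math

def score_candidate(my_pos, my_forward, their_pos, their_forward,
                    max_range=4.0, w_facing=0.4, w_mutual=0.4, w_prox=0.2):
    """Score a potential gaze target by facing direction, mutual
    attention, and proximity. Positions/forwards are (x, y, z) tuples;
    forwards are unit vectors. Weights and max_range are illustrative."""
    to_them = tuple(t - m for m, t in zip(my_pos, their_pos))
    dist = math.sqrt(sum(c * c for c in to_them))
    if dist < 1e-6 or dist > max_range:
        return 0.0
    dir_to_them = tuple(c / dist for c in to_them)
    # How much we face them (dot of our forward with the direction to them),
    # and how much they face us (their forward against the reverse direction).
    facing = max(0.0, sum(a * b for a, b in zip(my_forward, dir_to_them)))
    mutual = max(0.0, -sum(a * b for a, b in zip(their_forward, dir_to_them)))
    proximity = 1.0 - dist / max_range
    return w_facing * facing + w_mutual * mutual + w_prox * proximity
```

The highest-scoring candidate would then be the one the social focus pattern cycles onto.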

BasisGazeTarget: New component you can drop on any game object and the system scores it alongside players automatically. Defaults to transform tracking but you can manually drive the focus point for dynamic cases (e.g. mirrors computing a reflected position so you look at your own reflection). Camera prefabs (local + pip) already have it wired up in this PR.
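
Conceptually, the component behaves like this toy Python analogue (names are illustrative, not the Basis API):

```python
class GazeTargetSketch:
    """Toy analogue of a gaze-target component: reports its transform
    position by default, or a manually driven focus point when one is
    set (e.g. a mirror's computed reflected eye position)."""

    def __init__(self, transform_position):
        self.transform_position = transform_position
        self.manual_focus = None

    def set_focus_point(self, point):
        # Switch to a manually driven focus point for dynamic cases.
        self.manual_focus = point

    def focus_point(self):
        return self.manual_focus if self.manual_focus is not None else self.transform_position
```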

Personality sliders: Avatar creators get two new sliders on BasisAvatar under Eye and Mouth Data: Liveliness (frequency + amplitude) and Attentiveness (comfort with eye contact). These tune the eye gaze system behavior per-avatar.
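
One plausible way two 0..1 sliders could map onto gaze parameters, sketched in Python (all numeric ranges below are made-up placeholders, not Basis's actual tuning):

```python
def apply_personality(liveliness, attentiveness):
    """Map the two 0..1 avatar sliders onto gaze parameters.
    All numeric ranges here are illustrative placeholders."""
    liveliness = min(max(liveliness, 0.0), 1.0)
    attentiveness = min(max(attentiveness, 0.0), 1.0)
    return {
        # Liveliness drives how often and how far the eyes move.
        "saccade_interval_s": 3.0 - 2.0 * liveliness,     # 3.0s down to 1.0s
        "saccade_amplitude_deg": 2.0 + 6.0 * liveliness,  # 2 up to 8 degrees
        # Attentiveness drives comfort with sustained eye contact.
        "max_contact_hold_s": 1.0 + 4.0 * attentiveness,  # 1 up to 5 seconds
        "disengage_chance": 0.5 * (1.0 - attentiveness),  # shy avatars avert more
    }
```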

The debug window is also updated to reflect the new system.
(Screenshot: basis-eye-editor)


Important

Everything here runs on the local player only right now.
Remotes still get their eye movement from the idle targets in BasisRemoteFaceManagement.

That'd be the big next step, since a lot of the impact here (avatars in conversation, looking into cameras, etc.) naturally depends on remotes being able to receive it. I figured it's best to limit this to local first, though, because you'd have a better idea of the right way to get it across the network than what I was doing.

Happy to rework stuff or talk through it, and of course feel free to tune things. No worries if it's a post-load-test thing too.
(Starting a new gig in the coming week, so responses might be a bit slower, but I wanted to send this before then!)

…le saccade improvements (local)

- Eyes naturally cycle between a nearby player's eyes and mouth with physiological timing, micro-saccade jitter, near-field adaptation, and smooth fallback to idle saccades when no one is nearby
- Targets are ranked by who's facing us, who we're facing, and proximity, with hysteresis, reaction delay, and personality-driven tuning (will expose via BasisAvatar)
- Added `BasisGazeTarget.cs`, which attaches to any GameObject to attract eye gaze (mirrors, cameras, signage, or anything). Supports transform tracking or manually driven focus points, and is scored alongside players automatically
- Mirrors compute a reflected eye position so the player naturally looks at their own reflection
- Camera prefabs (handheld and remote pip) get gaze targets so the player glances toward the lens
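
The reflected-eye-position trick for mirrors boils down to reflecting a point across the mirror plane. A hedged Python sketch (not the Basis mirror code):

```python
def reflect_point(point, plane_point, plane_normal):
    """Reflect `point` across the plane through `plane_point` with unit
    normal `plane_normal` -- the reflected eye position a mirror gaze
    target would report as its focus point."""
    # Signed distance from the point to the plane.
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    # Move the point twice that distance back through the plane.
    return tuple(p - 2.0 * d * n for p, n in zip(point, plane_normal))
```
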
…isAvatar

Allows avatar creators to configure eye gaze personality per-avatar via two new sliders. Values are piped to `BasisLocalEyeDriver` on avatar load/swap, with live preview support during play mode for real-time tuning in the editor.
Pass head rotation delta into the eye job so movements compensate for head turns (TIL way too much about the vestibulo-ocular reflex).
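
The head-compensation idea amounts to counter-rotating the eyes by the head's rotation delta, as in this simplified angles-only Python sketch (illustrative, not the eye job's actual math):

```python
def vor_compensate(eye_yaw_deg, eye_pitch_deg,
                   head_delta_yaw_deg, head_delta_pitch_deg, gain=1.0):
    """Vestibulo-ocular-reflex-style compensation: counter-rotate the
    eyes by the head's rotation delta so gaze stays on target while the
    head turns. gain=1.0 means full compensation; values illustrative."""
    return (eye_yaw_deg - gain * head_delta_yaw_deg,
            eye_pitch_deg - gain * head_delta_pitch_deg)
```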

Also:
- General tuning pass to make things feel better.
- Debug window now uses a final value from the system itself.
- Various commenting clean-up for clarity.
- Widen all attentiveness ranges
- Add gaze disengagement timer: low attentiveness periodically averts eyes even when the target is valid
- Make reaction delay personality-driven instead of hardcoded (rather than applying full personality immediately)
