feat: context-aware eye gaze system (initial/local) #711
Open
kidkwazine wants to merge 8 commits into BasisVR:developer
Conversation
…le saccade improvements (local)
- Eyes naturally cycle between a nearby player's eyes and mouth with physiological timing, micro-saccade jitter, near-field adaptation, and smooth fallback to idle saccades when no one is nearby
- Targets are ranked by who's facing us, who we're facing, and proximity, with hysteresis, reaction delay, and personality-driven tuning (will expose via BasisAvatar)
- `BasisGazeTarget.cs`, which attaches to any GameObject to attract eye gaze (mirrors, cameras, signage, or anything). Supports transform tracking or manually driven focus points. It's scored alongside players automatically
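The ranking described above can be sketched roughly like this (Python for illustration only; `score_target`, the weights, and the hysteresis bonus are all made-up names and values, not the actual `BasisLocalEyeDriver` code):

```python
import math

def score_target(my_pos, my_fwd, their_pos, their_fwd,
                 current_id, candidate_id, hysteresis_bonus=0.2):
    """Rank a gaze candidate by mutual attention and proximity (illustrative)."""
    to_them = [b - a for a, b in zip(my_pos, their_pos)]
    dist = math.sqrt(sum(c * c for c in to_them)) or 1e-6
    to_them_n = [c / dist for c in to_them]

    # Who we're facing: our forward dotted with the direction to them.
    facing_them = max(0.0, sum(f * d for f, d in zip(my_fwd, to_them_n)))
    # Who's facing us: their forward dotted with the direction back to us.
    facing_us = max(0.0, sum(f * -d for f, d in zip(their_fwd, to_them_n)))
    # Closer targets score higher; falls off with distance.
    proximity = 1.0 / (1.0 + dist)

    score = 0.4 * facing_them + 0.4 * facing_us + 0.2 * proximity
    # Hysteresis: the current target keeps a small bonus so gaze doesn't
    # flicker between two candidates with near-identical scores.
    if candidate_id == current_id:
        score += hysteresis_bonus
    return score
```

A player standing in front of us and looking back outranks one behind us, and the current target wins ties, which is the hysteresis behavior.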
- Mirrors compute a reflected eye position so the player naturally looks at their own reflection.
- Camera prefabs (handheld and remote pip) get gaze targets so the player glances toward the lens.
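The reflected eye position is standard point-across-plane mirroring; a minimal sketch (Python for illustration; the function name is hypothetical, and the real code would work with Unity vectors):

```python
def reflect_point_across_plane(point, plane_point, plane_normal):
    """Mirror a world-space point across a plane (e.g. the mirror surface).

    Looking at the reflected eye position makes the avatar appear to meet
    its own gaze in the mirror. plane_normal must be unit length.
    """
    # Signed distance from the point to the plane along the normal.
    d = sum((p - o) * n for p, o, n in zip(point, plane_point, plane_normal))
    # Step back twice that distance to land on the mirrored side.
    return tuple(p - 2.0 * d * n for p, n in zip(point, plane_normal))
```

For a mirror on the z = 0 plane, an eye at (0, 1.6, 1) reflects to (0, 1.6, -1), so gazing at the reflected point lines the eyes up with the reflection.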
…isAvatar
Allows avatar creators to configure eye gaze personality per-avatar via two new sliders. Values are piped to `BasisLocalEyeDriver` on avatar load/swap, with live preview support during play mode for real-time tuning in the editor.
Pass head rotation delta into the eye job so movements compensate for head turns (TIL way too much about the vestibulo-ocular reflex). Also:
- General tuning pass to make things feel better.
- Debug window now uses the final value from the system itself.
- Various commenting clean-up for clarity.
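The vestibulo-ocular idea reduces to counter-rotating the eyes by the head's rotation delta so world-space gaze stays put. A one-axis (yaw) sketch, purely illustrative (the actual job works on full 3D rotations):

```python
def vor_compensate(eye_yaw, head_yaw_delta):
    """VOR, 1-DOF sketch: when the head turns by some delta, the eyes
    counter-rotate by the same amount so gaze stays fixed on the target
    in world space. Hypothetical helper, not the real eye job."""
    return eye_yaw - head_yaw_delta

def world_gaze(head_yaw, eye_yaw):
    """World-space gaze direction is head yaw plus eye-in-head yaw."""
    return head_yaw + eye_yaw
```

If the head turns 10° while the eyes were 5° off-center, the compensated eye yaw keeps the world-space gaze at the same 5° it was before the turn.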
- Widen all attentiveness ranges
- Add gaze disengagement timer: low attentiveness periodically averts eyes even when the target is valid
- Make reaction delay personality-driven instead of hardcoded (rather than applying full personality immediately)
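One way the disengagement timer could map attentiveness to hold time (a sketch with made-up constants; the real ranges live in the driver's tuning):

```python
def disengagement_period(attentiveness, min_hold=2.0, max_hold=8.0):
    """Seconds of gaze held on a valid target before the avatar briefly
    averts its eyes. Low attentiveness -> short holds (frequent aversion);
    high attentiveness -> long, comfortable eye contact.
    Constants are illustrative only."""
    t = max(0.0, min(1.0, attentiveness))
    return min_hold + (max_hold - min_hold) * t
```

So a shy avatar (attentiveness near 0) breaks eye contact every couple of seconds even while the target remains valid, while an attentive one holds much longer.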
Updates the local eye saccades to context-aware gaze so eyes naturally focus on nearby players, mirrors, and cameras while falling back to improved idle movement when nothing's around.
Social gaze targeting: Eyes pick up on nearby players (scored by mutual attention, proximity, and facing direction) and cycle focus through a social pattern.
BasisGazeTarget: New component you can drop on any game object and the system scores it alongside players automatically. Defaults to transform tracking, but you can manually drive the focus point for dynamic cases (e.g. mirrors computing a reflected position so you look at your own reflection). Camera prefabs (local + pip) already have it wired up in this PR.
Personality sliders: Avatar creators get two new sliders on BasisAvatar under Eye and Mouth Data: Liveliness (frequency + amplitude) and Attentiveness (comfort with eye contact). These tune the eye gaze system behavior per-avatar.
Debug window is also updated to reflect changes in the new system
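To make the two sliders concrete, here is one hypothetical mapping from the 0..1 slider values onto driver parameters (every name and range here is invented for illustration; the actual parameters in `BasisLocalEyeDriver` will differ):

```python
def apply_personality(liveliness, attentiveness):
    """Map the two 0..1 avatar sliders onto illustrative gaze parameters."""
    l = max(0.0, min(1.0, liveliness))
    a = max(0.0, min(1.0, attentiveness))
    return {
        # Liveliness drives how often and how far idle saccades move.
        "saccades_per_second": 0.5 + 2.5 * l,
        "saccade_amplitude_deg": 2.0 + 8.0 * l,
        # Attentiveness drives comfort with sustained eye contact and
        # how quickly the eyes react to a new target.
        "eye_contact_hold_seconds": 2.0 + 6.0 * a,
        "reaction_delay_seconds": 0.4 - 0.25 * a,
    }
```

A lively avatar saccades more often and more widely; an attentive one holds eye contact longer and reacts faster to a new target.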

Important
Everything here runs on the local player only right now.
Remotes still get their eye movement from the idle targets in BasisRemoteFaceManagement. That'd be the big next step, since a lot of the impact here (avatars in conversation, looking into cameras, etc.) naturally depends on remotes being able to receive it, but it seemed best to limit this to local first because you'd have a better idea of the best way to get it across than what I was doing.
Happy to rework stuff or talk through it, and of course feel free to tune things. No worries if it's a post-load-test thing too.
(Starting new gig in the coming week so responses might be a bit slower, but wanted to send this before then!)