How I cheated my way to VR-friendly UI in Godot by raycasting into a mesh, projecting into a viewport -- and then faking mouse events.
For the Godot VR Toolkit I wanted proper GUI interaction in VR without rewriting everything or reinventing the wheel. So I kept using normal Godot Control nodes and made them think they were being used with a mouse:
- A Viewport hosts a regular 2D UI.
- ViewportToMesh.gd renders that viewport onto a 3D mesh in the world.
- GuiInteraction.gd casts rays from VR controllers into that mesh.
- GuiFinger.gd tracks a fingertip collider for “poking” the UI.
- Both inject InputEventMouseMotion / InputEventMouseButton events, so the UI believes a mouse is moving and clicking.

Result: 3D VR hands, laser pointer, hovering, clicking, dragging – but under the hood it’s still just mouse events and a viewport.
The implementation can be found on GitHub.
Godot is solid and pleasant to work with for 2D GUIs: Control hierarchies, themes, focus handling, signals – all the usual desktop UI goodies.
VR, on the other hand, lives entirely in 3D: controllers/hand meshes, raycasts, spatial interactions and depth perception. What you don’t have is a mouse pointer.
Traditionally, user interface interactions in VR are handled by casting a ray from the controller (like a “laser pointer”) or by directly pinching or touching objects with a virtual hand/finger. Naively, you could teach your Controls to understand “rays” and “fingers” directly. Both approaches quickly turn into a rabbit hole of “oh right, I also need hover, drag, scroll, focus, keyboard…”.
So instead I went for a “cheaper” trick that reuses Godot’s existing 2D GUI system as much as possible:
What if we keep the entire 2D GUI stack as-is and just pretend we’re a mouse?
That leads to a very simple mental model:
As far as the GUI is concerned, “the mouse moved to (x, y) and clicked.” Everything in between is just coordinate transformations and synthetic input events.
The three main scripts in play are:
ViewportToMesh.gd
Renders a 2D Viewport (with Controls) onto a 3D mesh, including UV coordinate handling.
GuiInteraction.gd
Handles ray-based interaction: a “laser pointer” from the controller that hits the GUI mesh.
GuiFinger.gd
Handles fingertip interaction: a collider on the tip of a VR hand that can “press” UI buttons directly.
In a slightly simplified picture:
flowchart LR
H[VR Hand / Controller]
H --> R[Raycast]
H --> G[Finger Collider]
R -->|collide| M[GUI Mesh]
G -->|collide| M[GUI Mesh]
M -->|interpret collision point| M
M -->|pass faked mouse event| C[Viewport]
C --> E[Godot GUI system]
The resulting faked mouse events are then processed by Godot’s normal GUI system, triggering hover states, button presses, etc.
The starting point is trivial – rendering a UI to a texture and showing it on a mesh:
ViewportToMesh needs:

- a SpatialMaterial that takes a viewport texture (Albedo -> Texture -> New ViewportTexture),
- a scene (PackedScene) to assign as the Viewport node,
- UV mapping where (0,0) in UV corresponds to (0,0) in the viewport, and (1,1) to (width, height),
- an Area and CollisionShape on the mesh to match the viewport size (e.g. a plane of size width x height) for querying collisions (a rough wiring sketch follows the tree below).

As a node hierarchy this might look like:
Spatial (World)
├── ViewportToMesh (MeshInstance)
│ ├── Area
│ │ └── CollisionShape
│ └── Viewport (instanced PackedScene)
...
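As a rough idea of the wiring, here is a minimal sketch under my own assumptions – the node names, the exported property, and the unshaded material flag are illustrative, not the toolkit’s exact code:

```gdscript
# Hedged sketch: instance the 2D GUI into the viewport and use the
# viewport's texture as the mesh's albedo (Godot 3 API).
extends MeshInstance

export(PackedScene) var gui_scene

onready var viewport = $Viewport

func _ready():
	# Instance the 2D GUI scene into the viewport.
	viewport.add_child(gui_scene.instance())
	# Feed the viewport's texture into the mesh's material.
	var material = SpatialMaterial.new()
	material.albedo_texture = viewport.get_texture()
	material.flags_unshaded = true
	set_surface_material(0, material)
```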
Conceptually, when a ray hits the mesh, we get a 3D collision point that can be mapped to a UV coordinate on the quad and, from there, to a pixel in the viewport.
The ray-based interaction lives in GuiInteraction.gd: a ray originates from the VR controller (or camera) and is cast into the world every frame. This gives a classic laser-pointer style UI.
The interesting bit is not the raycast itself (that’s standard Godot), but what happens in _send_mouse_motion and _send_mouse_button – that’s where we fake mouse events.
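In Godot 3 terms, the per-frame part could look roughly like this – a hedged sketch, not the exact GuiInteraction.gd code; the RayCast node setup, the device id 0, and the assumption that the hit Area exposes ray_interaction_input() (shown later) are mine:

```gdscript
# Hedged sketch of the laser-pointer loop. Assumed setup: this script
# sits on an enabled RayCast attached to the VR controller.
extends RayCast

func _physics_process(_delta):
	if is_colliding():
		var panel = get_collider()  # the Area on the GUI mesh
		if panel.has_method("ray_interaction_input"):
			_send_mouse_motion(panel, get_collision_point())

func _send_mouse_motion(panel, hit_point):
	# Device id 0 is a placeholder for "this controller".
	panel.ray_interaction_input(hit_point, InputEventMouseMotion, 0)

func _send_mouse_button(panel, hit_point, pressed):
	# Would be called when the controller trigger changes state.
	panel.ray_interaction_input(hit_point, InputEventMouseButton, 0, pressed)
```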
GuiFinger.gd is the second interaction mode: direct hand interaction.
Instead of a laser pointer, we have a small collider on the tip of the VR hand’s index finger that can “press” UI buttons directly.

The flow is essentially: the fingertip collider overlaps the GUI mesh’s Area, the contact point is converted to viewport coordinates exactly like a ray hit, and a fake mouse press/release is sent when the finger enters and leaves the panel (see the sketch below).
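A minimal sketch of that idea, assuming the fingertip is an Area and reusing the same ray_interaction_input() entry point – the signal wiring and the press-on-enter/release-on-exit policy are my assumptions:

```gdscript
# Hedged sketch: fingertip "poking", assumed to live on an Area
# attached to the tip of the index finger.
extends Area

func _ready():
	connect("area_entered", self, "_on_area_entered")
	connect("area_exited", self, "_on_area_exited")

func _on_area_entered(panel):
	if panel.has_method("ray_interaction_input"):
		# Touching the panel counts as a left-button press.
		panel.ray_interaction_input(global_transform.origin,
				InputEventMouseButton, 0, true)

func _on_area_exited(panel):
	if panel.has_method("ray_interaction_input"):
		# Pulling the finger back releases the button.
		panel.ray_interaction_input(global_transform.origin,
				InputEventMouseButton, 0, false)
```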
Where the ray pointer feels a bit like a laser pointer/remote, finger interaction feels like actually poking the UI – but internally, both boil down to the same InputEventMouse machinery.
Now the fun part: lying to Godot’s GUI.
The goal: from the GUI’s perspective, nothing special is happening. It just sees mouse movement and clicks.
func ray_interaction_input(position3D: Vector3, event_type, device_id, pressed=null):
This function turns a 3D hit position on the GUI mesh into a 2D mouse-like input event for a viewport.
- position3D: where the ray (or fingertip) hit the GUI in world space.
- event_type: which kind of mouse event to create (InputEventMouseMotion or InputEventMouseButton).
- device_id: which input device (controller) this event belongs to.
- pressed: only relevant for button events, indicates down/up.

The rest of the function transforms that 3D point into viewport coordinates, builds the event, and injects it into the viewport so the Control nodes react to it.

position3D = area.global_transform.affine_inverse() * position3D
var position2D = Vector2(position3D.x, position3D.z)
- area.global_transform.affine_inverse() converts the world-space hit position into the local space of area (the GUI panel / collision area).
- We then keep only (x, z) and throw away the third dimension, ending up with a 2D point on the panel: position2D.

At this point, position2D still uses the panel’s own local units, centered on its origin.
position2D.x += quad_mesh_size.x / 2
position2D.y += quad_mesh_size.y / 2
The quad is centered on (0, 0) with extents ±quad_mesh_size/2. Adding half of quad_mesh_size in both directions shifts the origin so that (0, 0) is one corner of the quad and (quad_mesh_size.x, quad_mesh_size.y) is the opposite corner.

position2D.x = position2D.x / quad_mesh_size.x
position2D.y = position2D.y / quad_mesh_size.y
Dividing by the quad size normalizes both axes:

- 0.0 -> start of the axis,
- 1.0 -> end of the axis.

So position2D is now in the range (0..1, 0..1) and basically matches the UVs of the quad.
position2D.x = position2D.x * viewport.size.x
position2D.y = position2D.y * viewport.size.y
Multiplying by the viewport size turns those normalized coordinates into pixels:

- (0, 0) is the top-left corner of the viewport.
- (viewport.size.x, viewport.size.y) is the bottom-right.

This is exactly the coordinate system Godot’s GUI expects for mouse events.
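To make the whole chain concrete, here is a quick worked example with made-up sizes (a 2 x 1 quad and a 1024 x 512 viewport – hypothetical numbers, not from the toolkit):

```gdscript
# Worked example of the transform chain (hypothetical sizes).
func _transform_example():
	var quad_mesh_size = Vector2(2, 1)
	var viewport_size = Vector2(1024, 512)

	var p = Vector2(0.5, -0.25)   # local hit on the panel
	p += quad_mesh_size / 2       # -> (1.5, 0.25): origin moved to the corner
	p /= quad_mesh_size           # -> (0.75, 0.25): normalized, matches the UVs
	p *= viewport_size            # -> (768, 128): viewport pixels
	print(p)                      # prints (768, 128)
```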
var event = event_type.new()
Here we instantiate whichever mouse event type was passed in (InputEventMouseMotion or InputEventMouseButton).
if event is InputEventMouseMotion:
    if last_pos2D == null:
        event.relative = Vector2(0, 0)
    else:
        event.relative = position2D - last_pos2D
- If there is no previous position yet, relative is (0, 0).
- Otherwise we subtract last_pos2D from the new position for a movement delta.

elif event is InputEventMouseButton:
    event.button_index = 1
    event.pressed = pressed
- button_index = 1 -> left mouse button.
- pressed -> uses the argument passed into the function to differentiate between button-down and button-up.

last_pos2D = position2D
event.position = position2D
event.global_position = position2D
event.device = device_id
viewport.input(event)
- last_pos2D is updated so the next motion event can compute a correct relative delta.
- event.position and event.global_position are both set to the computed viewport coordinates (for GUI, these are typically the same).
- event.device identifies which controller generated this event.
- viewport.input(event) injects the event into the viewport’s input pipeline.

From this point on, Godot treats it like a normal mouse event:
- Control nodes react to hover and clicks.

The nice part is that dragging a slider via a VR ray is literally the same code path as dragging it with a real mouse.
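Assembled from the snippets above, the full function looks roughly like this (area, quad_mesh_size, viewport, and last_pos2D are members of the surrounding script; treat this as a reconstruction rather than a verbatim copy):

```gdscript
var last_pos2D = null

func ray_interaction_input(position3D: Vector3, event_type, device_id, pressed=null):
    # World space -> local space of the GUI panel.
    position3D = area.global_transform.affine_inverse() * position3D
    # The panel lies in the XZ plane, so drop the third axis.
    var position2D = Vector2(position3D.x, position3D.z)
    # Move the origin from the quad's center to its corner.
    position2D.x += quad_mesh_size.x / 2
    position2D.y += quad_mesh_size.y / 2
    # Normalize to 0..1 (matches the quad's UVs).
    position2D.x = position2D.x / quad_mesh_size.x
    position2D.y = position2D.y / quad_mesh_size.y
    # Scale up to viewport pixels.
    position2D.x = position2D.x * viewport.size.x
    position2D.y = position2D.y * viewport.size.y

    var event = event_type.new()
    if event is InputEventMouseMotion:
        if last_pos2D == null:
            event.relative = Vector2(0, 0)
        else:
            event.relative = position2D - last_pos2D
    elif event is InputEventMouseButton:
        event.button_index = 1   # left mouse button
        event.pressed = pressed

    last_pos2D = position2D
    event.position = position2D
    event.global_position = position2D
    event.device = device_id
    viewport.input(event)
```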
This approach works surprisingly well, but it’s not magic. A few gotchas and design decisions:
If you bend or distort your mesh (curved UI panels), you have to ensure that every collision point can still be mapped back to the correct UV coordinate, and from there to the right viewport pixel.
ViewportToMesh acts as the contract: its job is to make sure “UV -> viewport pixel” stays consistent.
Mouse events are only half the story: text input needs a keyboard, too. The same trick applies – inject InputEventKey events from a VR keyboard into the viewport.

The nice thing: you don’t have to change how LineEdit or TextEdit work, you just feed them events.
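A minimal sketch of that idea (the send_key helper and its parameters are mine, not part of the toolkit):

```gdscript
# Hypothetical helper: forward one key press/release from a VR
# keyboard into the GUI viewport.
func send_key(viewport: Viewport, scancode: int, pressed: bool):
	var event = InputEventKey.new()
	event.scancode = scancode   # e.g. KEY_A
	event.pressed = pressed
	viewport.input(event)
```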
In VR you might have two controllers, two hands, or even several fingers pointing at the GUI at the same time.

The mouse model assumes a single pointer. In practice, I make a conscious decision: only one interactor at a time acts as “the mouse”.
For more complex setups you could emulate multiple mice by tagging events, but that quickly diverges from Godot’s standard assumptions.
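If you did want to go down that road, the existing device field would be the natural hook – a speculative sketch, not something the toolkit does:

```gdscript
# Speculative: tag events per controller via the device id so a custom
# Control could tell the pointers apart. Godot's built-in Controls
# still assume a single mouse, so this only helps your own widgets.
const DEVICE_LEFT_HAND = 0
const DEVICE_RIGHT_HAND = 1

func on_ray_hit(panel, hit_point, is_left_hand):
	var device = DEVICE_LEFT_HAND if is_left_hand else DEVICE_RIGHT_HAND
	panel.ray_interaction_input(hit_point, InputEventMouseMotion, device)
```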
Not every VR interaction needs to pretend to be a mouse: the emulation pays off only where Godot’s 2D GUI machinery already does the heavy lifting. The guiding principle:
Start from the existing machinery, then adapt the edges.
By bending the inputs to look like mouse events, I can reuse Godot’s entire 2D GUI stack – hovering, clicking, dragging, focus – without modifying the Control nodes themselves.
The triad of ViewportToMesh.gd, GuiInteraction.gd, and GuiFinger.gd is essentially just: render a viewport onto a mesh, find where a ray or finger hits it, and translate that hit into fake mouse events.
From the user’s perspective it feels like “of course I can point at that button and press it”.
From Godot’s perspective, they just moved a mouse and clicked.
And that’s exactly the kind of cheating I enjoy.