Unity - how to get raw x/y position of gaze as a function of the screen?



  •

    Our SDK (C/C++/C#/Python, as of v0.16.1) exposes 2D gaze coordinates via fove_Headset_getGazeVectors2D.

    The Unity plugin uses an older SDK and doesn't currently expose this, but it's fairly easy to do manually. In general, you just need to project the gaze direction vector through the camera's projection matrix.

    In Unity there are helpers for this, namely Camera.WorldToScreenPoint(). All you need to do is fetch the camera (you may need to do this at runtime, since I think FoveInterface2 generates the cameras on the fly), then generate a world point by offsetting the camera position by the gaze direction.

    gazeScreenPoint = cam.WorldToScreenPoint(cam.transform.position + cam.transform.TransformDirection(gazeDirection3D));

    Then you have a screen point and you can scale it as needed (e.g. if you need it from 0 to 1, divide by the resolution of the camera).
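    Putting those steps together, here's a minimal sketch. The class and method names are illustrative; it assumes `eyeCamera` has been assigned at runtime (e.g. after FoveInterface2 has created its cameras) and that `gazeDirection3D` is the gaze direction in the camera's local space:

    ```csharp
    using UnityEngine;

    public class GazeScreenPoint : MonoBehaviour
    {
        // Assumed to be assigned at runtime once the plugin has created its cameras.
        public Camera eyeCamera;

        // gazeDirection3D: gaze direction in the camera's local space.
        public Vector2 GetGazeScreenPoint(Vector3 gazeDirection3D)
        {
            // Convert the local gaze direction into world space, then offset the
            // camera position by it to get a world point along the gaze ray.
            Vector3 worldPoint = eyeCamera.transform.position
                               + eyeCamera.transform.TransformDirection(gazeDirection3D);

            // Project the world point into screen space (pixels).
            return eyeCamera.WorldToScreenPoint(worldPoint);
        }
    }
    ```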

    Hopefully that helps. If you get it working, post some code for anyone else happening along this post. If you have trouble I'll hop into Unity later and see if I can write a bit of sample code as well.

  •

    As far as filtering, there is some filtering in our eye tracking system. There is no way to disable it, but we keep it very minimal as we strive for low latency.

  • stack wolll

    Just as you said, you're able to use WorldToScreenPoint to get a Vector2. However, how would you scale this to be a percentage of the screen size in the HMD? Screen.width is the actual game window width, not the dimensions within the HMD, so converting the Vector2 screen position to a percentage doesn't seem to work correctly...

  •

    You should use pixelWidth/pixelHeight from the eye camera (the same camera used for WorldToScreenPoint) to scale to a ratio. Screen.width returns the Unity window size on the monitor (I think), so it's not the right thing.
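    For example, a small sketch (assuming `eyeCamera` is the camera you passed to WorldToScreenPoint and `screenPoint` is the result):

    ```csharp
    // Divide by the camera's own render dimensions, not Screen.width/height,
    // to get a 0..1 ratio within the HMD's view.
    Vector2 normalized = new Vector2(
        screenPoint.x / eyeCamera.pixelWidth,
        screenPoint.y / eyeCamera.pixelHeight);
    ```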

    This will get you the right ratio within the field of view of the eye camera. That angle comes from the service (currently set to 95 degrees for FOVE0). Note that each user's effective field of view differs depending on how close the lens is to their eye, which depends on how the headset sits on their face, but the eye camera's angle is what the rest of the FOVE system uses and is the correct value to use if you're doing picking in 2D or other such 2D math.

    That said, we do recommend using 3D math where possible if your application is 3D, and more or less any VR application should be. This avoids extra conversions, since you can raycast directly into the 3D world with the 3D gaze vectors, draw overlays at given points in 3D, etc.
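    As a sketch of the 3D approach (the variable names are illustrative; it assumes you already have the gaze ray's origin and direction in world space, which is what the gaze-vector APIs provide):

    ```csharp
    using UnityEngine;

    // gazeOrigin / gazeDirectionWorld: the gaze ray in world space.
    Ray gazeRay = new Ray(gazeOrigin, gazeDirectionWorld);
    RaycastHit hit;
    if (Physics.Raycast(gazeRay, out hit, 100f))
    {
        // hit.collider is the object the user is looking at.
        Debug.Log("Looking at: " + hit.collider.name);
    }
    ```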
