Displaying FOVE Eyes in Unity
I'm trying to set something up to display the FOVE eye cameras directly in a Unity scene. I want my users to be able to see the eye camera view without needing to use the debug tool.
I can get them to display in a scene using a standard webcam setup (WebCamTexture.Play() on a RawImage component, see below for exact code), but this interferes with the eye tracking. Either the eye cameras can play in the scene, or they will show in the debug tool, but not both. When the cameras play in the scene, eye tracking gets disabled ([0,0,1] for both eye vectors), and the eye camera view shows a black screen in the debug tool.
Is there any way to display the eye cameras directly in the scene while still keeping eye-tracking enabled?
Edit: Code I'm currently using:
using UnityEngine;
using UnityEngine.UI;

public class DisplayFoveEyes : MonoBehaviour
{
    public RawImage webCamImage;

    void Start()
    {
        // Open the FOVE eye camera as an ordinary webcam device.
        WebCamTexture webCamTexture = new WebCamTexture("FOVE Eyes");
        webCamImage.texture = webCamTexture;
        webCamImage.material.mainTexture = webCamTexture;
        webCamTexture.Play();
    }
}
-
Official comment
Hi Sam,
Only one thing can grab the camera at once. If you grab it, then we can't and thus no eye tracking can happen. So essentially this is not supported yet.
We have a C++ research API, which is not fully complete, but it allows getting the camera images from the service for any purpose, including display, while eye tracking is running.
However, you will have to bind this to Unity which is a bit of work. We've implemented this for the next major update to the Unity plugin, so once that comes out (no specific date currently), this will be supported.
One thing to note is that image output is considered a research feature (instead of a game feature). We don't maintain backwards compatibility for research features. If you use the eye images from that API, your application will not work against future versions of the SDK. This is fine as normally researchers can control the environment they run in (as opposed to games where your users may be running various versions).
-
Hi Jeff,
Thanks for the info, although it's a bit disappointing. Hopefully this feature becomes fully supported in the future. The ability to display and manipulate (replay, slow down, etc.) the eye video would be of huge added value for us, but losing future support and features is too risky a trade-off.
-
We're hoping to get the updates out within the next week or two, once we squash a couple of final bugs and get a clean QA pass. A major update to the Unity plugin is coming as well, alongside an add-in we've written specifically for displaying eye images inside Unity.
I was just looking at the code from 0.14.1 and the Unity plugin 2.1.2, and it looks like the groundwork classes are already there that you could use on your own.
Try calling `FoveInterfaceBase.GetFVRHeadset().GetResearchInterface()` to get a reference to the low-level FOVE (research) interface bindings object. With that, you need to enable eye images with:
`researchInterface.RegisterResearchCapabilities(EFVR_ResearchCapabilities.EyeImage)`
After that, you should be able to call `researchInterface.GetEyeImage(out eyeImageContainer)`, which takes a `FVRHeadset.Research.BitmapImage` reference and writes the image into it.
These last two methods return error codes that you should probably check as well, or else you might end up with missing or wrong data.
Finally, to actually get the image bytes properly, you'll need to call `eyeImageContainer.ImageData`, which returns an `SFVR_Buffer` object that contains an IntPtr and length (in number of bytes). That should be what you need to marshal an array of bytes into C#.
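Putting those steps together, a rough sketch might look like the following. The type and method names are taken from the post above (Unity plugin 2.1.2); the error-code enum, the buffer's field names (`data`, `length`), and the assumption that the buffer holds an encoded image are my guesses, so check them against your installed SDK:

```
using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Sketch only: registers the eye-image capability, then pulls one eye
// image per frame via the research API and uploads it into a Texture2D.
public class EyeImageSketch : MonoBehaviour
{
    private Texture2D eyeTexture;
    private byte[] pixelBuffer;

    void Start()
    {
        var research = FoveInterfaceBase.GetFVRHeadset().GetResearchInterface();

        // Ask the service to start producing eye images; check the error code.
        var err = research.RegisterResearchCapabilities(EFVR_ResearchCapabilities.EyeImage);
        if (err != EFVR_ErrorCode.None)  // enum name assumed
            Debug.LogError("Could not register eye-image capability: " + err);
    }

    void Update()
    {
        var research = FoveInterfaceBase.GetFVRHeadset().GetResearchInterface();

        FVRHeadset.Research.BitmapImage image;
        if (research.GetEyeImage(out image) != EFVR_ErrorCode.None)
            return; // no frame available yet, or the call failed

        // ImageData is an SFVR_Buffer: an IntPtr plus a length in bytes.
        var buffer = image.ImageData;
        int length = (int)buffer.length;          // field name assumed
        if (pixelBuffer == null || pixelBuffer.Length != length)
            pixelBuffer = new byte[length];
        Marshal.Copy(buffer.data, pixelBuffer, 0, length); // field name assumed

        // If the bytes are an encoded image (BMP/JPEG), LoadImage decodes
        // them; if they turn out to be raw pixels, use LoadRawTextureData.
        if (eyeTexture == null)
            eyeTexture = new Texture2D(2, 2);
        eyeTexture.LoadImage(pixelBuffer);
    }
}
```

From there you can assign `eyeTexture` to a `RawImage` the same way the WebCamTexture was assigned in the question.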
Cheers!
-
On which point are you struggling?
Due to my project's policy I'm not allowed to share the source code, but it's basically what Scott wrote put together. At the end you just need to convert the byte[] into a JPG/PNG or your desired format.
Converting byte[] to jpg/png is pretty easy: https://stackoverflow.com/questions/8946846/converting-a-byte-array-to-png-jpg
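In Unity specifically the conversion is one call: assuming the bytes from the research API are an encoded image (PNG/JPEG/BMP), `Texture2D.LoadImage` decodes them directly, so you don't need any external imaging library. A minimal sketch, where `eyeImageBytes` is the marshaled byte[] and `webCamImage` is a `RawImage` as in the original question:

```
// Decode an encoded byte[] into a Unity texture and show it on a RawImage.
Texture2D tex = new Texture2D(2, 2);   // initial size is replaced by LoadImage
if (tex.LoadImage(eyeImageBytes))      // returns false if the data can't be decoded
    webCamImage.texture = tex;
```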
Hannes -
We have some internal Unity projects that display the eye images, so I poked around and found some code and pasted it here:
https://gist.github.com/jbfove/029a2bb44a8037cffc45e622b7aee75e
I don't know if that will work out of the box because it may be depending on some other internal stuff, but it gets the general idea across, namely how to pull data from the research API and load it as a texture into Unity.
The C# API was updated in v0.16.0, though the general concepts are unchanged. The next version of the Unity plugin, which will be a large update, will use the new API. Once that's out, I'll make a point to post an official example of displaying the eye feed in Unity.
However, it will be a little while before we can finish the next version, so hopefully the code I pasted above, and help from Hannes and others can get you up and running!