In my view, security and data privacy concerns have been largely ignored in VR so far.
In the long run, however, these topics will attract much more public attention,
up to the point where missing security and data privacy features become a show stopper.
Don't get me wrong, I want xR technologies to be a success.
That's why I would like to raise these concerns today,
hoping that it is easier to address them early on than to retrofit them later.
Even with the current limited API possibilities, there are already some concerns:
a) Protection of the positional tracking camera image.
DocOk has shown that by brightening an image from the Rift CV1 positional tracking sensor,
you can easily identify everything that the sensor sees in the room.
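To make the risk concrete, here is a minimal sketch of that kind of brightening, assuming the raw sensor frame is available as an 8-bit grayscale array (the function name and gamma value are my own illustration, not DocOk's actual method):

```python
import numpy as np

def brighten(frame: np.ndarray, gamma: float = 0.3) -> np.ndarray:
    """Apply gamma correction to a dark 8-bit frame.

    gamma < 1 brightens: room features that are nearly black in the
    raw sensor image become clearly visible after the transform.
    """
    normalized = frame.astype(np.float32) / 255.0
    corrected = np.power(normalized, gamma)
    return (corrected * 255.0).astype(np.uint8)

# A dark synthetic "sensor frame": all pixel values near zero.
dark = np.full((4, 4), 10, dtype=np.uint8)
bright = brighten(dark)  # pixel value 10 becomes roughly 96
```

A few lines of NumPy are enough, which is exactly the point: if an application can read the raw frames, hiding the room contents in plain darkness offers no protection at all.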
b) Protection of the eye camera images:
The eye images might be good enough for medical purposes.
That might enable remote medical diagnosis and training software.
However, you may want your personal health data to be accessible only to people you trust.
c) Protection of HMD movement data and eye data:
I don't know what insights can be derived from HMD head movement data.
But I'm sure more and more body parts will get tracked in the future,
allowing all kinds of analyses.
When you look at some higher-level API functions, the concerns increase:
a) User profiles
b) Mental workload measurement
c) Heatmapping integrated with the game engine analytics
d) Intent detection (how long and how often you look at things)
e) Mood detection (facial muscle movements, eyeline shape, heartbeat, breath)
f) Capturing of biometric data (IPD, iris, pupil, ...)
g) Authentication, authorization, and accounting (AAA)
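To illustrate how little data point (d) needs, here is a toy dwell-time tracker; the object names and the fixed 60 Hz frame rate are hypothetical, not any actual FOVE API:

```python
from collections import defaultdict

def dwell_times(gaze_samples, frame_dt=1 / 60):
    """Accumulate per-object gaze dwell time from a per-frame stream of
    hit-test results (the name of the object looked at, or None).

    From nothing more than "which object was looked at each frame",
    an observer can rank a user's interests.
    """
    totals = defaultdict(float)
    for obj in gaze_samples:
        if obj is not None:
            totals[obj] += frame_dt
    return dict(totals)

# Hypothetical session: 60 frames on an ad poster, 30 on a door.
samples = ["ad_poster"] * 60 + [None] * 10 + ["door"] * 30
profile = dwell_times(samples)
```

A per-frame object name is all it takes, so even a "harmless" gaze hit-test API can leak an interest profile unless access to it is gated.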
I'm fully aware that as a startup, FOVE has to keep a laser-sharp focus.
However, I just wanted to start a discussion, so that things hopefully start moving
in a direction where the above concerns are taken into account.