Tracking users in web VR

web VR user tracking

When we wanted to get user feedback on our application, we prepared a demo version and a semi-structured interview with participants. However, we also wanted to log the users' movement and what they are generally looking at in VR in real time. You could use a wired VR headset for this purpose, such as Oculus Link, Rift or HTC Vive, but since we had built our app as a web experience, we wanted to stay in the online realm. The main advantage is that this way you can do remote user research while the participant simply stays at home. The tool is also an ideal supplement for a product feasibility study or GUI evaluation.

In the time of Covid-19 this can prove a practical approach. It can also be used to retrieve positional data from brand-specific ecosystems such as Oculus or Vive without dealing with the hardware-specific APIs of Facebook and others.

Logging user movement and rotation in web VR online

Our solution was to write a small component for A-Frame that communicates with a remote server via websockets and sends the user's position and rotation as a string with delimiters.

You can download the component for free from the Trick the Ear GitHub repository. You can use it like so:

<html>
  <head>
    <!-- A-Frame itself must be loaded before the component script -->
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
    <script src="https://cdn.jsdelivr.net/gh/trackme518/tricktheear/logVR/a-frame/logCamera.js"></script>
  </head>
  <body>
    <a-scene>
      <a-entity camera look-controls wasd-controls rotation-reader="ip: ws://10.0.0.18:8025/track; position: true; rotation: true; interval: 100;"></a-entity>
    </a-scene>
  </body>
</html>

By writing position: false; you track rotation only, and vice versa. interval is the time in milliseconds between messages, and ip is a string with your server address.
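For illustration, here is a minimal sketch of what such a component could look like. This is not the published logCamera.js (which may differ in details such as degree/radian handling), and formatSample is a hypothetical helper name:

```javascript
// Build one ';'-delimited log line from position, rotation and a timestamp.
function formatSample(pos, rot, t) {
  return [pos.x, pos.y, pos.z, rot.x, rot.y, rot.z, t].join(';');
}

// Register the component only when A-Frame is present (i.e. in a browser).
if (typeof AFRAME !== 'undefined') {
  AFRAME.registerComponent('rotation-reader', {
    schema: {
      ip:       { type: 'string', default: 'ws://localhost:8025/track' },
      position: { type: 'boolean', default: true },
      rotation: { type: 'boolean', default: true },
      interval: { type: 'number', default: 100 }  // ms between messages
    },
    init: function () {
      this.socket = new WebSocket(this.data.ip);
      this.start = Date.now();
      this.last = 0;
    },
    tick: function (time) {
      // Throttle sending to the configured interval.
      if (time - this.last < this.data.interval) { return; }
      this.last = time;
      if (this.socket.readyState !== WebSocket.OPEN) { return; }
      var pos = this.el.object3D.position;
      var rot = this.el.object3D.rotation;  // Euler angles, in radians here
      this.socket.send(formatSample(
        this.data.position ? pos : { x: 0, y: 0, z: 0 },
        this.data.rotation ? rot : { x: 0, y: 0, z: 0 },
        Date.now() - this.start));
    }
  });
}
```

Disabled channels are sent as zeros in this sketch so that the receiver always gets the same number of fields per line.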

User tracking in web VR with A-Frame

An additional advantage is that the communication is based on vanilla websockets, so you can run it directly in the browser. We implemented the receiver in Java Processing 3.5.4 with the help of the websockets library, but you can implement it in any software that can handle websockets. Sample output of the incoming data:

-0.31928554801974135;1.6;-0.11819450292053287;0.024000000000000195;0.28200000000000036;0;12681
-0.31928554801974135;1.6;-0.11819450292053287;0.002000000000000196;0.5780000000000003;0;12798
-0.31928554801974135;1.6;-0.11819450292053287;-0.017999999999999808;1.0280000000000002;0;12914

Each line is a separate entry. All values are split by the delimiter ";". The first three values are the x, y, z components of the position, the next three are the x, y, z components of the rotation, and the last value is a timestamp in milliseconds since the program started.
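Given that layout, a received line can be parsed back into structured values like this (parseSample is a hypothetical helper for illustration, not part of the published receiver):

```javascript
// Parse one ';'-delimited log line into position, rotation and timestamp.
function parseSample(line) {
  const v = line.split(';').map(Number);
  return {
    position:  { x: v[0], y: v[1], z: v[2] },
    rotation:  { x: v[3], y: v[4], z: v[5] },
    timestamp: v[6]  // ms since the sender started
  };
}

// Parse the first line of the sample output above.
const sample = parseSample(
  '-0.31928554801974135;1.6;-0.11819450292053287;0.024000000000000195;0.28200000000000036;0;12681');
```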

Our receiver visualizes the movement in 3D space with a mouse-controlled camera and records the incoming data to a text file saved in a local folder for further processing in Excel and the like.

If you are on 64-bit Windows, you can simply download the latest build of our receiver and run logVR.exe – no installation is needed. If you are on another platform such as Linux or macOS, you need to build the app from the provided source – you will need to download Processing 3.5.4 and the libraries used – websockets and PeasyCam.

You can start recording the incoming data by pressing "r" on the keyboard; press "r" again to stop. The data will be saved in the data folder with the current date in the file name. We have also added a heatmap image as an output – it is generated automatically while recording.
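The heatmap idea can be sketched as a simple 2D histogram over the floor plane: each recorded x/z position increments the grid cell it falls into. The actual image is produced by the Processing receiver; binPositions and all parameters here are hypothetical illustration:

```javascript
// Count how many position samples fall into each cell of a size x size grid
// covering the square [min, max] x [min, max] on the floor (x/z) plane.
function binPositions(samples, size, min, max) {
  const grid = Array.from({ length: size }, () => new Array(size).fill(0));
  const scale = size / (max - min);
  for (const s of samples) {
    // Clamp to the grid so samples outside [min, max] land on the border.
    const col = Math.min(size - 1, Math.max(0, Math.floor((s.x - min) * scale)));
    const row = Math.min(size - 1, Math.max(0, Math.floor((s.z - min) * scale)));
    grid[row][col] += 1;
  }
  return grid;
}

// Two samples near the origin, one near a corner, on a 4x4 grid over [-1, 1].
const grid = binPositions(
  [{ x: -0.32, z: -0.12 }, { x: -0.32, z: -0.12 }, { x: 0.9, z: 0.9 }],
  4, -1, 1);
```

Rendering is then a matter of mapping each cell count to a color, which Processing can do per pixel.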