Interview with Robert Merki, Director of Product at Cognitive VR
Vancouver-based CognitiveVR helps content creators and developers understand how people interact with their products by tracking user movements and visualizing behaviour on a mesh map with its SceneExplorer plugin. This shows which parts users pay the most attention to, where they get stuck, and exactly where a crash happens.
It can also tell you how users actually feel by logging their reactions in exit polls and recording their spoken thoughts. Robert Merki, Director of Product, will share his insights on Measuring Behavior in VR at VRAR With The Best, Dec 3–4, but first was kind enough to answer some of our questions.
Q Your SceneExplorer tool allows developers to analyse their experience's setup, UX and bugs using a simple engine plugin. What's a typical build problem it solves?
The typical problems we see fall into three main categories: performance, user confusion, and active feedback.
Performance: we’re able to show developers their users’ in-app performance both at scale and per user session. If a particular user has feedback about performance, a developer can quickly check on the exact user session and figure out what caused the performance issue. The developer can then go back to the web dashboard and cross reference performance metrics with certain hardware configurations.
User confusion: the beauty of VR is its ability to mimic real life, which unfortunately removes a significant amount of direction from developers. Users often get confused or lost in virtual worlds despite the best intentions of developers. Our platform allows developers to track and flag different confusion points for their users, as well as track the ways people organically explore their virtual worlds.
User feedback: qualitative feedback is incredibly important for VR experiences — and we’ve built a way for developers to receive in-game feedback directly from their users. We allow developers to place a prefab object in their scenes which, when prompted, gives users the ability to answer a simple survey question as well as leave voice feedback (recorded via the headset microphone). A developer can see all of these feedback items on their dashboard and visually replay user sessions based on this vocal and qualitative feedback.
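As a rough illustration of the feedback item described above, one might model each in-game response as a record that ties a survey answer and a voice clip back to the session it came from. The field names here are assumptions for the sketch, not the actual CognitiveVR schema.

```python
# Illustrative data shape (assumed, not the real ExitPoll schema): one
# in-game feedback item a developer could use to jump to a session replay.
def make_feedback(session_id, question, answer, voice_clip_path=None):
    """Link a survey answer and optional voice clip to a user session."""
    return {
        "session_id": session_id,       # key used to replay the session
        "question": question,
        "answer": answer,
        "voice_clip": voice_clip_path,  # recorded via headset microphone
    }
```

Keying every item on a session identifier is what makes the "replay the session behind this comment" workflow possible.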
Q More importantly, your sophisticated tech tracks the user and reveals vital information: whether they get lost, what they don’t like, and what they look at during the experience. How do you track the user’s gaze?
Right now, we track the center point of the headset forward position. This is generally a very useful way to track gaze, as the field of view (FOV) of most headsets is relatively small, so people generally don’t move their eyes away from center.
Retinal tracking is something we’re experimenting with, and we’re finding a lot of success with it. It allows us to start tracking things like emotion and reactions to specific virtual objects. As soon as eye tracking goes mainstream, we will support it at scale.
We also track player positions and events, which become serialized and timecoded. This allows us to build a full replay of a user’s session inside of SceneExplorer.
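The serialized, timecoded samples described above are what make a full session replay possible: sort by timestamp and step through. A minimal sketch, assuming a JSON wire format and illustrative field names (not the actual SceneExplorer format):

```python
# Hypothetical sketch of timecoded session snapshots for replay.
import json

def make_snapshot(t, position, rotation, event=None):
    """One timecoded sample: where the player was, plus any event fired."""
    return {"time": t, "position": list(position),
            "rotation": list(rotation), "event": event}

def serialize_session(snapshots):
    """Serialize a session, time-ordered, so it can be replayed later."""
    return json.dumps(sorted(snapshots, key=lambda s: s["time"]))
```

A replay tool would deserialize this list and interpolate player position between consecutive samples.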
Q What are the core analytics you gather on the CognitiveVR platform and how does this help developers?
Our core analytics tracking includes metadata like hardware information, app performance, and geolocation. This allows our customers to correlate different hardware configurations to performance. Developers immediately gain the ability to test — at scale — how their app performs across different graphics cards and processors.
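The hardware-to-performance correlation described above amounts to grouping sessions by hardware and aggregating a performance metric. A toy sketch, with an assumed record shape rather than the real dashboard query:

```python
# Illustrative sketch (assumed data shape, not the actual CognitiveVR
# backend): average frame rate per GPU model across many sessions.
from collections import defaultdict

def fps_by_gpu(sessions):
    """sessions: iterable of {'gpu': str, 'avg_fps': float} records."""
    totals = defaultdict(lambda: [0.0, 0])
    for s in sessions:
        totals[s["gpu"]][0] += s["avg_fps"]
        totals[s["gpu"]][1] += 1
    return {gpu: total / n for gpu, (total, n) in totals.items()}
```

The same grouping works for any metadata dimension — processor, headset model, or geolocation.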
Geolocation and custom events allow developers to optimize localizations of their VR apps. Developers can quickly see which countries or cities perform better given certain variables and custom events. We’ve seen app popularity spike when our customers pay special attention to localization improvements.
Q Vital to the feedback loop, how much of the intelligence is taken on board, and what are creators’ reactions to the data?
Developers — especially VR developers — have very little time to learn new dashboards. For this reason, a lot of our process is automated. After a few lines of code to ping our servers upon user events, the workload is on our backend to provide insights for developers.
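To give a sense of what "a few lines of code to ping our servers upon user events" might look like, here is a hedged sketch. The class name, endpoint, payload shape, and batching threshold are all assumptions for illustration, not the real SDK API.

```python
# Hypothetical event pinger (not the CognitiveVR SDK): queue user events
# and POST them to an analytics endpoint in batches.
import json
import urllib.request

class EventPinger:
    def __init__(self, endpoint, batch_size=10):
        self.endpoint = endpoint      # assumed ingestion URL
        self.batch_size = batch_size
        self.pending = []

    def record(self, name, properties=None):
        """Queue an event; flush automatically once the batch is full."""
        self.pending.append({"name": name, "properties": properties or {}})
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        """Send all queued events in one request."""
        if not self.pending:
            return
        body = json.dumps(self.pending).encode("utf-8")
        req = urllib.request.Request(
            self.endpoint, data=body,
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)  # fire-and-forget for this sketch
        self.pending = []
```

With instrumentation this thin on the client, the heavy lifting — aggregation and insight generation — stays on the backend, as described above.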
Furthermore, our dashboard provides query-building tools that make accessing this data painless, scalable, and efficient. So far, creators are extremely pleased with how deep they can dig with ease.
Q Personally we love how ExitPoll is allowing people to leave comments at the end of their experience — what’s been the most surprising reaction?
The interesting thing during our ExitPoll beta testing has been how obvious the insights become once you have direct vocal feedback from a user. Once you scroll back in time from the point of vocal feedback, you get this funny feeling of “duh!” every single time you look through a user’s session!
Some of the comments have been interesting. People really seem to love how easy it is to give direct developer feedback right away. The overwhelming majority of users choose to engage with these feedback questions — which is not something you see in most internet feedback tools.
Q You recently tweeted that VR needs more developers — what do you think is holding people back?
There are two specific issues holding back an increase in VR developers.
The first is hardware. A complete Vive or Oculus Rift setup requires both an expensive headset and a pricey gaming PC. This is inaccessible to many people — especially younger people, who may have the free time to experiment outside of school or college. For this reason, mobile VR seems to be a better route to develop for, considering most people have smartphones. Unfortunately, mobile VR is also somewhat fragmented between GearVR, Google Cardboard, and Google Daydream. This is not insurmountable, but it is certainly a factor.
The second item holding back VR is the disproportionate ratio of excitement to development. Virtual reality is a medium that requires a lot of experimentation, and the more developers that exist in the ecosystem the better. The excitement around virtual reality unfortunately surpasses the willingness to sit down and dig into the toolsets. Both Unreal Engine and Unity have excellent developer communities and are very accessible to get started with. I would like to see more people ambitiously creating new VR experiences instead of simply dreaming about future possibilities.
Sitting down and building 3D worlds that you can actually explore is one of the most transformative experiences you can have with virtual reality, and I think many who are timid about joining the ranks of the VR development community should make the leap of faith.
Q How is Cognitive VR supporting Indie Developers?
We have been extremely lucky to meet enthusiastic indie developers who are eager to beta test our software. These relationships have allowed us to continuously test and refactor our product with instant feedback, while providing those developers with free versions of our products to help them improve as well. It has been a win-win situation for us all.
Moreover, we sponsor and attend VR hackathons all along the western coast of North America. This has allowed us to keep our ear to the ground with regards to VR development practices and process. We love the indie VR dev community!
Q What advice would you give to developers wanting to create some amazing experiences?
Virtual reality is full of unknowns, so you must close the feedback loop as fast as you can. Building amazing content is about listening to what customers want and applying those findings to your vision. If you’re not listening to user feedback, it’s time to start!
Q Are you looking forward to speaking at VRAR With The Best in December?
Absolutely! The line-up of speakers and attendees is extremely high quality. These types of events are what we need for the next generation of VR developers to take shape!