Interface 360

Contact Information

Jeremy Rook: rookjere@msu.edu

Jinghan Ni: nijingha@msu.edu


Challenge

360-degree content is becoming more prevalent. As VR headsets grow in popularity, content creators have more incentive to build immersive experiences for the platform. However, a considerable number of people still will not interact with VR content. How can users who are unable to access 360-degree content via VR headsets interact with it in an immersive and enjoyable way? Currently, these users are restricted to click-dragging a mouse to look around a 360-degree space, which we believe is time-consuming, imprecise, and not nearly as enjoyable as the VR experience.

Goal

Our goal is to create an interface using Arduino and Grove that approaches web-based 360-degree content in a novel way: provide interaction with content normally accessed via VR headsets through more conventional hardware by emulating the mouse click-drag command used in web browsers to explore 360-degree content, and give users an easy way to navigate to major 360-degree content sites.
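As a rough illustration of the click-drag emulation, the sketch below maps tilt readings to mouse movement using the standard Arduino Mouse library. This is a minimal sketch rather than our exact firmware: it assumes a Leonardo-class board (required for USB mouse emulation), an analog two-axis tilt input on pins A0/A1, and placeholder dead-zone and speed constants.

    #include <Mouse.h>

    const int X_PIN = A0;        // assumed analog tilt input, x axis
    const int Y_PIN = A1;        // assumed analog tilt input, y axis
    const int CENTER = 512;      // neutral reading on a 10-bit ADC
    const int DEAD_ZONE = 40;    // ignore small wobbles around center
    const int SPEED_DIV = 50;    // scales tilt into pixels per update

    void setup() {
      Mouse.begin();
    }

    void loop() {
      int dx = analogRead(X_PIN) - CENTER;
      int dy = analogRead(Y_PIN) - CENTER;

      // Only drag while the device is tilted past the dead zone.
      if (abs(dx) > DEAD_ZONE || abs(dy) > DEAD_ZONE) {
        Mouse.press(MOUSE_LEFT);                       // hold the button, like a click-drag
        Mouse.move(dx / SPEED_DIV, dy / SPEED_DIV, 0); // pan the 360-degree view
      } else {
        Mouse.release(MOUSE_LEFT);                     // level device = release the drag
      }
      delay(20);
    }

Because the drag is held for as long as the device is tilted, the 360-degree player pans continuously, which is closer to the head-tracking feel of a VR headset than discrete click-drags.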


Design

We started our project with research into the field we were interested in. Through this research we noticed that as VR headsets become more popular, content creators have more incentive to build immersive experiences for the platform, yet a considerable number of people still will not interact with VR content. These insights led us to a how-can-we statement: "How can we let users who may be unable to access 360-degree content via VR headsets interact with it in an immersive and enjoyable way?"

We then used a high-concept paper to finalize our problem statement, design objectives, key features, audience, and a specific user persona. To translate our design objectives into the actual interface, we discussed what key features the interface should have, what kind of feedback we wanted to create for the user, and how the interface might change their 360-degree content viewing experience.

We then moved on to coding with Arduino. During this process we realized how important it is to test the range of the accelerometer's x and y axes and to give appropriate vibration feedback for each situation. After trying different approaches and failing many times, our interface's performance improved considerably: it now provides vibration feedback when the user tilts the interface up and down or left and right.
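The snippet below sketches the kind of range testing and feedback logic described above. It is illustrative rather than our final code: the pin assignments and the calibrated min/max thresholds are assumptions, and it treats the accelerometer as a simple analog input driving a Grove vibration motor on a digital pin.

    const int TILT_X = A0;       // assumed analog accelerometer, x axis
    const int TILT_Y = A1;       // assumed analog accelerometer, y axis
    const int VIB_PIN = 4;       // assumed Grove vibration motor pin

    // Calibrated limits, found by logging raw readings while tilting
    // the device to its extremes (values here are placeholders).
    const int X_MIN = 300, X_MAX = 720;
    const int Y_MIN = 290, Y_MAX = 730;

    void setup() {
      pinMode(VIB_PIN, OUTPUT);
      Serial.begin(9600);        // log raw values to find the real range
    }

    void loop() {
      int x = analogRead(TILT_X);
      int y = analogRead(TILT_Y);
      Serial.print(x); Serial.print('\t'); Serial.println(y);

      // Buzz when the user tilts past either axis limit, so they can
      // feel the edge of the usable range without looking at the device.
      bool atLimit = (x <= X_MIN || x >= X_MAX || y <= Y_MIN || y >= Y_MAX);
      digitalWrite(VIB_PIN, atLimit ? HIGH : LOW);
      delay(20);
    }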

We also used Tinkercad and a MakerBot Mini to design and print 3D models for the interface. We designed two different models to test for the best usability: one inspired by the Wii controller and the other by the HTC Vive controller, both of which are widely used in the gaming industry. After adjusting some details and features, we attached the Arduino board, buttons, and vibration-feedback device to the model. We then designed a survey, defined several user tasks, and prepared a set of interview questions to gather feedback. At the final stage, we evaluated our testing results and summarized our data and user feedback. The data we collected informed our revision stage and gave us a lot of inspiration for future work.


User Experience Analysis

We invited five subjects to participate in a guided, task-based session with the new interface device. All five participants completed a survey rating their experience with a 360-degree video under three different control methods: a mouse, a keyboard, and Interface360. Afterwards, four of them were interviewed to better understand their sentiment about and experience with the device.

Figure: task completion times for each control method (x-axis: clockwise completion time; y-axis: counter-clockwise completion time)

Participants were asked to navigate from a set point in a 360-degree video back to that same point by rotating completely around the scene. Each participant did this twice per interface: once in a clockwise direction and once counter-clockwise. The figure above shows the results of the task.

Mouse and keyboard trials were regularly completed in less time, indicating that participants had a harder time controlling Interface360 than conventional control methods. The interviews confirmed this, along with other compelling results.

While participants were enthusiastic about how fun Interface360 was, the more familiar devices were ultimately preferred, regardless of previous experience with VR and 360-degree content. Participants generally agreed that the device felt good in the hand, but there were too many hardware issues to consider it a full replacement for conventional controls. That said, users enjoyed the more immersive nature of the Interface360 experience.

These findings will help shape future iterations of this prototype.


Future Work

According to our testing data and the feedback from our testers, the one thing that clearly needs improvement is the tilting accuracy for the "ups" and "downs." We want to make sure the user is in full control of the interface when viewing 360-degree content, and we hope it can eventually match or exceed the performance of the mouse and keyboard.

Another avenue for future work is redesigning our user-test tasks to encourage fuller exploration of the interface; for example, rather than completing set movements, users could be asked to find specific objects in the video. In future testing we will also recruit testers of varying experience levels, genders, and ages, which will give us a clearer picture of our user group.

As for the device itself, because we want the interface to be fun and meaningful at the same time, we are considering adding exploration buttons that let users switch between different pieces of 360-degree content and multitask while exploring a site, such as viewing comments or adjusting video volume.
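One hypothetical way to implement such exploration buttons, again assuming a Leonardo-class board, is to send keyboard shortcuts with the standard Arduino Keyboard library. The pin choices, debounce delay, and shortcut mappings below are assumptions for illustration (Shift+N and the up arrow are YouTube player shortcuts; other sites would need different keys).

    #include <Keyboard.h>

    const int NEXT_PIN = 2;      // hypothetical "next video" button
    const int VOL_UP_PIN = 3;    // hypothetical "volume up" button

    void setup() {
      pinMode(NEXT_PIN, INPUT_PULLUP);
      pinMode(VOL_UP_PIN, INPUT_PULLUP);
      Keyboard.begin();
    }

    void loop() {
      // Shift+N skips to the next video in YouTube's player.
      if (digitalRead(NEXT_PIN) == LOW) {
        Keyboard.press(KEY_LEFT_SHIFT);
        Keyboard.write('N');
        Keyboard.releaseAll();
        delay(300);              // crude debounce
      }
      // The up arrow raises the volume while the player is focused.
      if (digitalRead(VOL_UP_PIN) == LOW) {
        Keyboard.write(KEY_UP_ARROW);
        delay(300);
      }
    }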


Project Files