Preparing for Prototyping on the PufferSphere

After successfully assembling the Pufferfish spherical display (PufferSphere) and getting the software up and running, we began working toward transitioning our existing tabletop prototype onto the PufferSphere. Our main goal is to build a touch- and gesture-interactive prototype that promotes science learning through data visualization. We will explore how users interact with the prototype on the PufferSphere to understand which gestures feel most natural to them, so that we can design better interactive exhibits.

To start the development process, the TIDESS development team met with our collaborators at Pufferfish Ltd. During the meeting, we demonstrated our current (tabletop) prototype. Following the demonstration, Pufferfish gave an overview of the PufferSphere system, including which gestures it currently supports and how to create interactive applications for the PufferSphere using the PufferPrime software development kit (SDK). To begin prototyping, we built our first PufferSphere application, which simply displayed a video capture of our existing tabletop prototype's visuals on the PufferSphere at the correct aspect ratio (2:1) (Figure 1). This application helped us better understand how well our existing prototype (originally designed for a flat-screen tabletop display) fits on the PufferSphere and what modifications our existing OpenExhibits prototype requires to work on the PufferSphere. We also discussed tweaking interface aesthetics, such as color, font style, and font size, to better suit the spherical form factor.

Figure 1: Tabletop prototype video capture on the PufferSphere
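As an aside on the 2:1 requirement: a 2:1 frame is what you get when 360 degrees of longitude map to the image width and 180 degrees of latitude to its height (an equirectangular layout). Assuming that layout, here is a minimal sketch of the pixel-to-sphere mapping; the function and example values are our own illustration, not part of the PufferPrime SDK:

```python
def equirect_to_sphere(x, y, width, height):
    """Map a pixel in a 2:1 equirectangular image to sphere coordinates.

    Assumes the image spans 360 degrees of longitude horizontally and
    180 degrees of latitude vertically, which is why the source video
    needs a 2:1 aspect ratio to avoid distortion on the sphere.
    """
    lon = (x / width) * 360.0 - 180.0   # -180..180 degrees
    lat = 90.0 - (y / height) * 180.0   # 90 (top) .. -90 (bottom)
    return lat, lon

# Example: the center pixel of a 2048x1024 frame lands at latitude 0,
# longitude 0, i.e., on the sphere's equator.
print(equirect_to_sphere(1024, 512, 2048, 1024))  # (0.0, 0.0)
```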

The overarching goal of our project is to investigate new types of touch interactions (gestures) that support exploration of the content displayed on the sphere. To facilitate our investigation, we need the sphere's software to recognize and log users' gestures. Currently, the PufferSphere's default behavior is to interpret only the tap gesture, and it does not log every touch it detects. In addition, any dragging or swiping gesture on the sphere is interpreted only as a rotation of the entire displayed scene; for our purposes, we need the sphere to recognize and respond to more complex gestures (such as a long-tap, a one-finger drag, or a two-finger drag). To support these gestures, Pufferfish built a Touch Forwarding application programming interface (API) for us. This API gives us access to touch events, including touch position in terms of latitude and longitude (the location on the sphere) and touch velocity. It will eventually help us define our own gesture library and try out different gestures in the prototype to better understand what feels natural to users. We are in the process of trying out this new API feature.
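As a rough illustration of what such a gesture library might look like, here is a minimal sketch that classifies a completed single-finger touch trace into a tap, long-tap, or drag. Every name here (TouchEvent, the thresholds, classify_touch) is a placeholder of ours, not part of Pufferfish's Touch Forwarding API:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TouchEvent:
    """One touch sample: position on the sphere plus a timestamp.

    Hypothetical stand-in for the events the Touch Forwarding API
    reports (latitude/longitude in degrees, time in seconds).
    """
    lat: float
    lon: float
    t: float

# Illustrative thresholds; real values would be tuned from user studies.
LONG_TAP_SECONDS = 0.5
DRAG_DEGREES = 2.0

def classify_touch(trace: List[TouchEvent]) -> str:
    """Classify a finished one-finger trace as 'tap', 'long-tap', or 'drag'."""
    duration = trace[-1].t - trace[0].t
    # Angular movement, approximated with simple lat/lon deltas rather
    # than true great-circle distance, for brevity.
    moved = max(abs(e.lat - trace[0].lat) + abs(e.lon - trace[0].lon)
                for e in trace)
    if moved > DRAG_DEGREES:
        return "drag"
    return "long-tap" if duration > LONG_TAP_SECONDS else "tap"

# Example: a finger that rests in place for 0.8 s is a long-tap.
trace = [TouchEvent(10.0, 45.0, 0.0), TouchEvent(10.1, 45.0, 0.8)]
print(classify_touch(trace))  # long-tap
```

A multi-finger gesture such as a two-finger drag would extend this idea by grouping traces by touch ID and comparing their joint motion.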

I am a second-year Ph.D. student in the Human-Centered Computing Ph.D. program at the University of Florida. I am thoroughly enjoying working on the PufferSphere and developing new applications for the interface. I am excited to continue learning about the nuances of designing applications for spherical displays, and I look forward to using the sphere for my dissertation research.


by Nikita Soni

Posted: February 1, 2018


Tags: TIDESS Project

