
FLUX: Touchless Touchscreen

Team

Ziyang Fang,

Olivia Brandel,

Christine Lambert,

Dr. Andrea Goncher

Timeline

Aug 2020 - May 2021

Role

Product Designer

Software Engineer

UX Researcher

Skills

Competitive Analysis

Survey Design and Analysis

Prototyping

Usability Testing

Agile Development

Problem Statement

Museums often use touchscreens to improve gallery accessibility, but the COVID-19 pandemic made these shared-touch experiences unsafe to offer. The Harn seeks to revive the experience as a touch-free interactive digital system for the Korean Gallery Experience.

Goals

The objective of the Touchless Touchscreen project is to enable the touchless navigation of a touch-based museum exhibit. Some goals set in the scope of work include:

  • Any user should be able to navigate the experience regardless of physical appearance or disability.

  • The system should prioritize the user experience and be easy to program and use.

  • Since the Harn will display this prototype in a gallery, it must appear aesthetically finished and not be hazardous to passersby.

Design Process

1. Research

The Intuiface system currently used by the Harn is code-free by design, so there is no way to alter the program directly with code. This limits how closely the touchless touchscreen prototype can integrate with the current system.

  • A way around this is to have the prototype interact with Intuiface by selecting a coordinate on the touchscreen rather than an element in the Intuiface experience design (see the sketch after this list).

  • The Harn is also open to selecting a new system to replace Intuiface; this is a secondary option if needed.

  • A comparable product that offers gesture detection is GesTrack 3D, which allows users to interact with on-screen media using simple hand motions. While this product fits the exhibit's needs, it is not available at an economical price.
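
To make the coordinate-based workaround concrete, here is a minimal sketch, assuming Python with the pyautogui library for synthetic input; the coordinates and helper name are hypothetical, not the project's actual code.

```python
# Hypothetical sketch of the coordinate workaround: rather than hooking into
# the code-free Intuiface experience, inject an OS-level click at the screen
# position of a target element. Intuiface receives it as an ordinary touch.
import pyautogui

def tap(x: int, y: int) -> None:
    """Simulate a touch by clicking at an absolute screen coordinate."""
    pyautogui.click(x=x, y=y)

if __name__ == "__main__":
    # Example: "press" a button assumed to sit at (960, 540) on a 1080p display.
    tap(960, 540)
```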

The Intuiface Player used by the Harn

Our alternative solutions based on the research

After weighing factors such as the Harn's environment, hardware limitations, software development challenges, and other relevant constraints, we decided to pursue a design direction based on gesture recognition.

2. Prototype Inspection

Hardware Architecture

The major components of the Azure Kinect DK all-in-one hardware used in this project are a 12 MP RGB camera and a 1 MP depth camera. These cameras enable thorough motion detection and gesture recognition.

Software Architecture

The software follows a standard layered architecture. One implementation option is Google's open-source gesture recognition package built on TensorFlow Lite and MediaPipe, which recognizes a variety of one-handed gestures.
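
As a rough illustration, the snippet below sketches how hand landmarks from the MediaPipe Hands solution can feed a simple gesture heuristic; the finger-counting rule is ours for illustration and is not the package's own classifier, and the webcam stands in for the Kinect's RGB stream.

```python
# Minimal sketch of landmark-based gesture sensing with MediaPipe Hands.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def count_extended_fingers(hand_landmarks) -> int:
    """Rough heuristic: a finger counts as extended when its tip sits above
    its PIP joint in image coordinates (y grows downward)."""
    tips = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
    pips = [6, 10, 14, 18]   # corresponding PIP joints
    lm = hand_landmarks.landmark
    return sum(lm[t].y < lm[p].y for t, p in zip(tips, pips))

cap = cv2.VideoCapture(0)  # stand-in for the Azure Kinect's RGB camera
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            n = count_extended_fingers(results.multi_hand_landmarks[0])
            print("extended fingers:", n)
cap.release()
```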

3. System Design

UX Prototype

We had two prototyping stages: User Experience (UX) prototyping and functional prototyping. The former was a minimally functional, highly visual prototype built to highlight the UX of the system rather than its technical performance. We presented this prototype on 17 November 2020.

Specifications

We also designed the specifications for the project. These specifications are testable, quantifiable, and directly related to the primary user of the device, the museumgoer.

4. Build

System Architecture Summary

The primary input source is an Azure Kinect DK’s RGB camera. A Python program utilizes the camera data to track the user’s hand shape and location relative to the camera. The system interprets the hand data into gestures and uses these gestures to control the Windows 10 cursor. 

Given this approach, the system can navigate more than just Intuiface: it can operate any software that runs on a Windows 10 PC.
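
A minimal sketch of that landmark-to-cursor mapping, assuming the index fingertip drives the pointer and using pyautogui as a stand-in for whatever Windows cursor API the program actually calls:

```python
# Illustrative mapping from a normalized hand landmark to the Windows cursor.
# pyautogui is an assumption; the project's actual cursor API may differ.
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()

def move_cursor(norm_x: float, norm_y: float) -> None:
    """Map normalized image coordinates in [0, 1] (e.g. a MediaPipe
    index-fingertip landmark) to screen pixels and move the cursor there."""
    x = min(max(norm_x, 0.0), 1.0) * SCREEN_W
    y = min(max(norm_y, 0.0), 1.0) * SCREEN_H
    pyautogui.moveTo(x, y)
```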

Functional Prototype

After building the system, we took it to the museum for on-site functional prototype testing to ensure that the prototype was usable and ready for further user testing.

5. Test

User Testing

We ran two sets of user testing: informal walk-up tests and formal user tests. For the formal user tests, we developed written test plans.

The team implemented both qualitative and behavioral tests to understand perceived intuitiveness. These tests also gave the team ideas for developing the user experience in Intuiface.

During the informal tests, FLUX set up the prototype in an open area of the Harn so passing museumgoers and staff could try the system. The usability testing focused on accessibility, clear instructions, and minimal hand movement. The purpose of both test sets was to:

  • Determine whether the system was easy and logical to use

  • Determine whether the gestures required too much effort to move the cursor

Testing Finding 1: Gesture

In user testing, we presented users with two different hand gesture sets. The final prototype uses the “one-finger” and “two-finger” gestures as click and drag, respectively. This change reduced misdetections by 33% compared to the gesture set that included the “fist” gesture for click.
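
To make the click/drag mapping concrete, here is an illustrative sketch, assuming a finger count from a recognizer like the earlier MediaPipe heuristic; the state handling is simplified relative to the real prototype.

```python
# Illustrative mapping of the final gesture set to cursor actions:
# one extended finger clicks, two extended fingers hold the button for a drag.
import pyautogui

def apply_gesture(finger_count: int, dragging: bool) -> bool:
    """Apply the recognized gesture and return the updated dragging state."""
    if finger_count == 1:
        if dragging:
            pyautogui.mouseUp()    # end any drag before clicking
        pyautogui.click()
        return False
    if finger_count == 2:
        if not dragging:
            pyautogui.mouseDown()  # start the drag; cursor moves continue it
        return True
    if dragging:                   # any other hand shape releases the drag
        pyautogui.mouseUp()
    return False
```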

Testing Finding 2: Frame Size

During user testing, we noticed some misclicks caused by a user's other hand, which we diagnosed using the backend debugger window. In the final iteration, the user sits on a bench with their other hand resting on their lap. Keeping that hand out of the frame prevents misdetection.
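
A hedged sketch of a software-side complement to this fix: cropping the camera frame to a region of interest before hand tracking, so a resting hand near the bottom edge never reaches the detector. The crop fractions below are illustrative assumptions, not the project's tuned values.

```python
# Crop the frame to a region of interest before running hand tracking.
import numpy as np

def crop_roi(frame: np.ndarray, top: float = 0.0, bottom: float = 0.75) -> np.ndarray:
    """Keep only the vertical band between `top` and `bottom` (fractions of
    frame height) so hands resting near the bottom edge are never seen."""
    h = frame.shape[0]
    return frame[int(h * top):int(h * bottom), :]
```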

Result

After five rounds of user testing and iteration, we delivered our final prototype to the Harn, along with a detailed manual and report. The final functional prototype meets all the specifications (based on the testing results) set in the Scope of Work provided by the Harn.

Tutorial

To help users learn how to use gesture controls, we made a tutorial and placed it at the beginning of the exhibition experience.

Video

Poster

Next Steps

For future software developments, the team could:

  1. Improve the front end of the program with a graphical user interface (GUI),

  2. Continue to train the gesture recognition algorithm,

  3. Look for more efficient mouse/touchpoint (multi-input) control methods.

Credits
