Abstract
Architects are increasingly adopting gesture-based interaction to create responsive, engaging, and inspiring spaces. However, there are no established theories providing guidelines for linking gestures and spatial movement. This thesis critiques the prevailing approach in human-building interaction (HBI), in which users learn "cookie-cutter" gestures from gesture elicitation studies to interact with the built environment. Instead, I argue that these gestures should emerge naturally through mutual convergence between users and intelligent architecture, so that they can be customized to different user groups. In this thesis, I demonstrate how gestures can emerge through the interaction between user and architecture by developing an interactive art installation driven by a machine learning algorithm, and I confirm its potential to increase curiosity and engagement through a user study. This thesis also compares these emergent gestures with the "cookie-cutter" gestures and discusses the implications for the future of intelligent responsive architecture.