A multiscale framework for mixed reality walking tours
Mixed Reality experiences, which blend physical and virtual objects, have become commonplace on handheld computing devices. One common application of these technologies is the cultural heritage "walking tour." Such tours provide information about the surrounding environment in a variety of contexts, suited to the needs and interests of different groups of participants. Using the familiar "campus tour" as a canonical example, this dissertation investigates the technical and cognitive processes involved in transferring such a tour from its physical, analog form into Mixed Reality. Drawing on the concept of spatial scale from cognitive geography, this work identifies the creation and maintenance of continuity across different scales of spatial experience as being of paramount importance to successful Mixed Reality walking tours. The concepts of scale transitions, coordination of representations across scales, and scale-matching are shown to be essential to maintaining this continuity of experience. Specific techniques that embody these concepts are discussed and demonstrated in a number of Mixed Reality examples, including a successful deployment of a Mixed Reality tour of the Georgia Tech campus. The potential for a "Language of Mixed Reality" based on the concepts outlined in this work is also discussed, and a general framework, called the Mixed Reality Scale Framework, is shown to meet the necessary criteria for a cognitive theory of Human-Centered Computing in the context of Mixed Reality.