Smart Park Data Table

 

Do interactive urban data visualisation tools improve our shared understanding of how cities are used?

 

Partners

University College London – Centre for Advanced Spatial Analysis

 


Deployment of an urban data visualisation tool on seven platforms with varying degrees of immersion and interactivity, allowing comparison between the different tools and their suitability for visualising urban data.

 


Augmented Reality technologies proved easier to deploy, requiring only a single mobile device, and non-specialist members of the public found them more immersive, easier to use, and more intuitive for navigating 3D space.

 


Virtual Reality approaches required specialist equipment to deploy (a headset and/or controllers), leaving users less familiar with the controls, but non-specialist members of the public found them better overall for understanding urban datasets.

 

The Smart Park project aims to help people understand how cities are used through a combination of state-of-the-art interactive visualisation methods and urban data sets. It examines the potential of Augmented and Virtual Reality tools as platforms for urban visualisations that communicate real-time data generated by Internet of Things devices, with the aim of opening urban data to the wider public. The project focuses primarily on Queen Elizabeth Olympic Park (QEOP) as a case study, collecting and visualising real-time data sets relating to the park.

Multiple data sets are brought together to create a virtual view of the park as captured through urban data. Sources include open and publicly available datasets, such as transport data (via Transport for London's web API), weather conditions (Wunderground web API) and social media; data generated by sensors deployed in the park (smart bat monitors); and simulated data. These datasets are overlaid on a highly detailed 3D map of QEOP using the Virtual London (Vilo) platform, an interactive 3D urban data visualisation platform developed at CASA. With this setup, information is visualised within its environment, providing meaningful context for observers and helping them better understand urban data.
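As an illustration of how such a feed might be polled and packaged as a visualisation layer, the Python sketch below pulls live arrivals from Transport for London's public Unified API. The stop ID, response field names and layer schema are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch: poll TfL's public StopPoint/Arrivals endpoint and wrap the
# result as one named "data layer" record for a 3D visualisation.
# The stop ID, field names and layer schema below are illustrative placeholders.
import requests

TFL_ARRIVALS_URL = "https://api.tfl.gov.uk/StopPoint/{stop_id}/Arrivals"

def fetch_transport_layer(stop_id: str) -> dict:
    """Fetch live arrival predictions and wrap them as a single layer."""
    response = requests.get(TFL_ARRIVALS_URL.format(stop_id=stop_id), timeout=10)
    response.raise_for_status()
    arrivals = response.json()

    return {
        "layer": "transport",  # layer name used later to toggle visibility
        "records": [
            {
                "line": a.get("lineName"),
                "destination": a.get("destinationName"),
                "seconds_to_arrival": a.get("timeToStation"),
            }
            for a in arrivals
        ],
    }

if __name__ == "__main__":
    # Hypothetical stop ID near QEOP; substitute a real NaPTAN/StopPoint ID.
    layer = fetch_transport_layer("490G00008660")
    print(f"{layer['layer']}: {len(layer['records'])} arrival predictions")
```

A similar wrapper could be written for the weather and sensor feeds, with each source contributing one layer to the same virtual view.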

Multiple visualisations were developed, exploring a wide range of approaches with varying degrees of interactivity and immersion, from passive data monitoring to interactive data manipulation. The QEOP Data Table, developed at CASA, uses an overhead projector to project data onto a table cut in the shape of the park, cycling through layers that highlight current conditions in the park and providing ample context with minimal interaction. Augmented Reality (AR) tools were used to add interactivity and immersion to the visualisation, allowing users to overlay digital data on surfaces using different AR technologies (Apple's ARKit, Google Tango, Microsoft's HoloLens). The AR versions were implemented on handheld or head-mounted devices, and allowed users to view 3D geometry overlaid on the device's camera feed and to toggle data layers on and off. Finally, two fully immersive Virtual Reality (VR) versions were developed using the HTC Vive and Google Daydream, which allowed users to navigate within virtual 3D space and explore QEOP from multiple perspectives. Current work in progress focuses on combining the benefits of the AR and VR approaches in a single application (VR Binoculars), which overlays virtual representations of data on large areas of the park viewed from a distance, giving viewers an overview of information about the park while they look at the actual park.
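The two interaction patterns described above, passive cycling through layers on the Data Table and explicit layer toggling in the AR versions, can be sketched as a simple layer manager. The Python sketch below is illustrative only: class and method names are assumptions, and the actual applications were built on the Vilo platform rather than as a standalone script.

```python
# Minimal sketch of the layer-cycling / layer-toggling behaviour described above.
# Names and structure are illustrative, not taken from the project's codebase.
from dataclasses import dataclass, field
from itertools import cycle

@dataclass
class LayerManager:
    layers: list[str]                       # e.g. ["transport", "weather", "bats"]
    visible: set[str] = field(default_factory=set)

    def toggle(self, name: str) -> None:
        """AR-style interaction: switch a single layer on or off."""
        if name in self.visible:
            self.visible.discard(name)
        else:
            self.visible.add(name)

    def cycler(self):
        """Data Table-style behaviour: show one layer at a time, in a loop."""
        for name in cycle(self.layers):
            self.visible = {name}
            yield name

manager = LayerManager(["transport", "weather", "bats"])
manager.toggle("weather")                # interactive toggle, as in the AR versions
show_next = manager.cycler()
print(next(show_next), next(show_next))  # passive cycling, as on the Data Table
```

The same layer records shown in the earlier sketch would be the objects switched on and off here, whichever platform renders them.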