Ji, Haohao, Qing, Linbo, Han, Longmei, Wang, Zhengyong, Cheng, Yongqiang and Peng, Yonghong ORCID: https://orcid.org/0000-0002-5508-1819 (2021) A new data-enabled intelligence framework for evaluating urban space perception. ISPRS International Journal of Geo-Information, 10 (6). 400.
Published version, available under a Creative Commons Attribution license.
Abstract
The urban environment has a great impact on the wellbeing of citizens, and it is of great significance to understand how citizens perceive and evaluate places across a large-scale urban region and to provide scientific evidence to support human-centered urban planning for a better urban environment. Existing studies for assessing urban perception have primarily relied on low-efficiency methods, which also result in low evaluation accuracy. Furthermore, there is a lack of sophisticated understanding of how to correlate urban perception with the built environment and other socio-economic data, which limits their application in supporting urban planning. In this study, a new data-enabled intelligence framework for evaluating human perceptions of urban space is proposed. Specifically, a novel classification-then-regression strategy based on a deep convolutional neural network and a random-forest algorithm is proposed. The proposed approach has been applied to evaluate the perceptions of Beijing and Chengdu against six perceptual criteria. Meanwhile, multi-source data were employed to investigate the associations between human perceptions and indicators of the built environment and socio-economic conditions, including visual elements, facility attributes and socio-economic indicators. Experimental results show that the proposed framework can effectively evaluate urban perceptions. The associations between urban perceptions and the visual elements, facility attributes and the socio-economic dimension have also been identified, which can provide substantial input to guide urban planning for a better urban space.
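To illustrate the classification-then-regression strategy mentioned in the abstract, the sketch below pairs a CNN image classifier with a random-forest regressor: the CNN assigns each street-view image to a coarse perception class and supplies deep features, and the random forest then regresses a continuous perception score from those features. This is a minimal, hypothetical example; the ResNet-18 backbone, the three perception classes, the feature construction and the RandomForestRegressor settings are assumptions for illustration, not the authors' published implementation.

```python
# Minimal sketch of a classification-then-regression pipeline for
# perception scoring. Stage 1: a CNN classifies street-view images into
# coarse perception classes and yields deep features. Stage 2: a random
# forest regresses a continuous perception score from those features.
# The backbone, class count and score range are illustrative assumptions.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as transforms
from sklearn.ensemble import RandomForestRegressor
from PIL import Image

# Stage 1: ResNet-18 with a replaced head for three coarse perception
# classes (e.g. low / medium / high) for one perceptual criterion.
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
num_classes = 3
cnn.fc = torch.nn.Linear(cnn.fc.in_features, num_classes)
cnn.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def extract_features(image_paths):
    """Return per-image feature vectors: penultimate-layer activations
    concatenated with the classifier's class probabilities."""
    backbone = torch.nn.Sequential(*list(cnn.children())[:-1])
    feats, probs = [], []
    with torch.no_grad():
        for path in image_paths:
            x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
            f = backbone(x).flatten(1)            # (1, 512) deep features
            p = torch.softmax(cnn.fc(f), dim=1)   # (1, num_classes) probabilities
            feats.append(f.numpy().ravel())
            probs.append(p.numpy().ravel())
    return np.hstack([np.array(feats), np.array(probs)])

# Stage 2: random-forest regression from the CNN features to a
# continuous perception score (e.g. on a 0-10 scale).
def train_regressor(features, scores):
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(features, scores)
    return rf
```

In use, the classifier would first be fine-tuned on labelled perception data for each of the six perceptual criteria, after which `extract_features` and `train_regressor` (hypothetical names) produce the continuous scores that are subsequently correlated with visual elements, facility attributes and socio-economic indicators.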