Application of machine learning for integrated flood risk assessment: Case study of Hurricane Harvey in Houston, Texas
Abstract. Flood risk, encompassing hazard, exposure, and vulnerability, is defined in terms of potential losses. Machine learning techniques have gained traction among researchers as a way to handle multi-variable flood risk models and capture non-linear relationships. However, the focus has primarily been on flood hazard prediction rather than comprehensive risk assessment and damage estimation, so experiments that combine the elements of risk with such methods are needed. To address this gap, this study applied the Random Forest algorithm to analyze the relationships between the physical flood damage caused by Hurricane Harvey in 2017 in Houston, Texas, and selected hazard-, exposure-, and vulnerability-related variables. The analysis identified poorly drained soils as the primary contributor to the losses, followed by population density and the proportion of medium-intensity developed land. The study also examined possible reasons for the unexpectedly low importance of social vulnerability factors, which runs counter to expectations from the environmental justice literature. These findings can help planners and stakeholders better understand the underlying causes of flood risk. Future research can extend this methodology by incorporating additional factors related to climate change.
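The variable-ranking approach the abstract describes can be illustrated with a minimal sketch, not the authors' actual code: a scikit-learn `RandomForestRegressor` fitted on synthetic stand-in data, with `feature_importances_` used to rank hypothetical damage drivers. The feature names and the synthetic damage signal below are assumptions chosen only to mirror the study's variable types.

```python
# Minimal sketch (not the study's code): ranking flood-damage drivers
# via Random Forest feature importances on synthetic stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors loosely mirroring the study's variable types
feature_names = ["poorly_drained_soil", "population_density", "medium_intensity_dev"]
X = rng.random((n, 3))
# Synthetic damage signal dominated by the first predictor (assumption,
# made so the importance ranking in this toy example is interpretable)
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.1, n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# Impurity-based importances sum to 1; higher means more predictive power
ranking = sorted(zip(feature_names, model.feature_importances_),
                 key=lambda t: t[1], reverse=True)
for name, importance in ranking:
    print(f"{name}: {importance:.3f}")
```

In practice the study's predictors would be spatial layers (soil drainage class, census population density, land-cover ratios) joined to damage records per spatial unit; impurity-based importances can also be cross-checked with permutation importance, which is less biased toward high-cardinality features.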
Status: final response (author comments only)