Disaster Tweets Classification in Disaster Response using Bidirectional Encoder Representations from Transformer (BERT)
A. K. Ningsih, A. Hadiana
TLDR
BERT (Bidirectional Encoder Representations from Transformers) is a deep learning model developed by Google; this paper applies it to a dataset of disaster tweets to help rescue and emergency responders establish effective knowledge management techniques for a rapidly evolving disaster environment.
Abstract
The omnipresence of smartphones allows people to report an emergency in real time. In times of crisis, Twitter has become a major communication platform, and as a result more organizations are interested in monitoring Twitter programmatically. Although governments and emergency management agencies work together through their respective national disaster-response systems, the sentiments of the people affected during and after a catastrophe determine the effectiveness of the disaster response and recovery process. In recent years, machine-learning-based sentiment analysis of Twitter has become a common research subject. However, detecting such tweets is often difficult because of ambiguity in the language of the tweets: it is not always apparent whether a person's words announce a catastrophe. BERT (Bidirectional Encoder Representations from Transformers) is a deep learning model developed by Google. Since Google open-sourced it, many researchers and companies have adopted it and applied it to a wide range of text classification tasks. In this paper, we therefore apply BERT to a dataset of disaster tweets. This research will help rescue and emergency responders establish effective knowledge management techniques for a rapidly evolving disaster environment.
