Neoteric Advancements in Neural Automatic Text Summarization: A Comprehensive Survey
Sukriti Bohra, Manisha Kumari, Sandeep Mandia, Kuldeep Singh
TLDR
The model architectures, including pre-processing and feature extraction, are discussed in detail, and the competitive efficiency of widely recognized ATS architectures on different datasets is compared.
Abstract
Natural Language Processing (NLP) aids computers in understanding human language well enough to converse naturally. Text summarization is a useful and important part of NLP: it is the process of creating an organized, logical, and concise summary of a large text document. This paper offers an extensive analysis of current developments in neural automatic text summarization (ATS). The model architectures, including pre-processing and feature extraction, are discussed in detail. The paper also compares the competitive efficiency of widely recognized ATS architectures on different datasets. For each technique, we discuss the aspects that researchers could draw on to develop newer and more efficient algorithms for the text summarization problem. Finally, we outline directions for future work and conclude the study.
