Scope and Challenges in Conversational AI using Transformer Models

Arighna Chakraborty, Asoke Nath

2021 · DOI: 10.32628/cseit217696
International Journal of Scientific Research in Computer Science Engineering and Information Technology · 0 Citations

TLDR

The authors compare the models discussed in terms of efficiency and accuracy, and also examine the scope and challenges of Transformer models.

Abstract

Conversational AI is an interesting problem in the field of Natural Language Processing, combining language understanding with machine learning. The field has seen many advances, with each new model architecture capable of processing more data, optimising and executing more efficiently, handling more parameters, and achieving higher accuracy. This paper discusses trends and advances in natural language processing and conversational AI: RNNs and RNN-based architectures such as LSTMs, Sequence-to-Sequence models, and finally Transformer networks, the current state of the art in NLP and conversational AI. The authors compare the models discussed in terms of efficiency and accuracy, and also examine the scope and challenges of Transformer models.
