A Context-Centric Chatbot for Cryptocurrency Using the Bidirectional Encoder Representations from Transformers Neural Networks

Authors: Qitao Xie, Qingquan Zhang, Xiaofei Zhang, Di Tian, Ruixuan Wen, Ting Zhu, Ping Yi, Xin Li

Abstract:

Inspired by the recent rise of digital currency, we are building a question-answering system on the subject of cryptocurrency using Bidirectional Encoder Representations from Transformers (BERT). The motivation behind this work is to assist digital-currency investors by directing them to the corresponding knowledge bases that can offer them help and by increasing querying speed. BERT, one of the newest language models in natural language processing, was investigated to improve the quality of the generated responses. We studied different combinations of hyperparameters of the BERT model to obtain the best-fitting responses. Further, we created an intelligent chatbot for cryptocurrency using BERT. A chatbot using BERT shows great potential as a cryptocurrency market tool. We show that the BERT neural network generalizes well to other tasks by applying it successfully to the cryptocurrency domain.

Keywords: BERT, chatbot, cryptocurrency, deep learning.
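The abstract's core idea, directing a user's question to the best-matching entry in a knowledge base with BERT, can be illustrated with a minimal retrieval-style sketch. Everything below is an assumption for illustration only: the Hugging Face transformers package, the bert-base-uncased checkpoint, the tiny FAQ dictionary, and the mean-pooling cosine-similarity scheme are not taken from the paper, which does not publish its code, corpus, or chosen hyperparameters.

import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical two-entry knowledge base; the paper's actual corpus is not shown.
FAQ = {
    "What is Bitcoin?": "Bitcoin is a decentralized digital currency launched in 2009.",
    "What is a blockchain?": "A blockchain is an append-only distributed ledger of transactions.",
}

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(texts):
    """Return one mean-pooled BERT vector per input sentence."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (batch, seq_len, 768)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # zero out padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

def answer(query):
    """Route the query to the FAQ entry whose question embeds closest to it."""
    questions = list(FAQ)
    vectors = embed(questions + [query])
    scores = torch.nn.functional.cosine_similarity(vectors[:-1], vectors[-1:])
    return FAQ[questions[int(scores.argmax())]]

print(answer("Tell me about bitcoin"))  # should route to the Bitcoin entry

Note that this sketch uses the off-the-shelf checkpoint with no fine-tuning, whereas the paper reports searching over combinations of BERT hyperparameters to improve response quality.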

