{"title":"A Context-Centric Chatbot for Cryptocurrency Using the Bidirectional Encoder Representations from Transformers Neural Networks","authors":"Qitao Xie, Qingquan Zhang, Xiaofei Zhang, Di Tian, Ruixuan Wen, Ting Zhu, Ping Yi, Xin Li","volume":170,"journal":"International Journal of Economics and Management Engineering","pagesStart":150,"pagesEnd":157,"ISSN":"1307-6892","URL":"https:\/\/publications.waset.org\/pdf\/10011865","abstract":"Inspired by the recent rise of digital currency, we build a question answering system for the cryptocurrency domain using Bidirectional Encoder Representations from Transformers (BERT). The motivation behind this work is to assist digital currency investors by directing them to the relevant knowledge bases and by increasing querying speed. BERT, one of the newest language models in natural language processing, was investigated to improve the quality of the generated responses. We studied different combinations of hyperparameters of the BERT model to obtain the best-fitting responses. Further, we created an intelligent chatbot for cryptocurrency using BERT. A chatbot using BERT shows great potential as a cryptocurrency market tool. We show that the BERT model generalizes well to other tasks by applying
it successfully to cryptocurrency.","references":"[1] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova. \u201cBERT: Pre-training of deep bidirectional transformers for language understanding.\u201d arXiv preprint arXiv:1810.04805, 2018.\r\n[2] Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. \u201cDistributed representations of words and phrases and their compositionality.\u201d Proceedings of the 26th International Conference on Neural Information Processing Systems, Vol. 2, pp. 3111-3119, 2013.\r\n[3] Jeffrey Pennington, Richard Socher, and Christopher Manning. \u201cGloVe: Global vectors for word representation.\u201d Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1532-1543, 2014.\r\n[4] Jeremy Howard and Sebastian Ruder. \u201cUniversal language model fine-tuning for text classification.\u201d arXiv:1801.06146 [cs.CL], 2018.\r\n[5] Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, and Luke Zettlemoyer. \u201cDeep contextualized word representations.\u201d arXiv preprint arXiv:1802.05365, 2018.\r\n[6] Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever. \u201cImproving language understanding by generative pre-training.\u201d https:\/\/s3-us-west-2.amazonaws.com\/openai-assets\/research-covers\/language-unsupervised\/language understanding paper.pdf, 2018.\r\n[7] Asa Cooper Stickland and Iain Murray. \u201cBERT and PALs: Projected attention layers for efficient adaptation in multi-task learning.\u201d arXiv:1902.02671 [cs.LG], 2019.\r\n[8] Sina J. Semnani, Kaushik Ram Sadagopan, and Fatma Tlili. \u201cBERT-A: Fine-tuning BERT with Adapters and Data Augmentation.\u201d https:\/\/web.stanford.edu\/class\/archive\/cs\/cs224n\/cs224n.1194\/reports\/default\/15848417.pdf, 2019.\r\n[9] Michael L. Mauldin. \u201cChatterbots, TinyMUDs, and the Turing test: Entering the Loebner Prize competition.\u201d AAAI, Vol. 94, 1994.\r\n[10] Manish Dudharejia. \u201cChatbots are the next big platform.\u201d Entrepreneur, www.entrepreneur.com\/article\/298600, Oct. 4, 2017.\r\n[11] Joseph Weizenbaum. \u201cELIZA: a computer program for the study of natural language communication between man and machine.\u201d Communications of the ACM 9.1, pp. 36-45, 1966.\r\n[12] Anbang Xu, et al. \u201cA new chatbot for customer service on social media.\u201d Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, ACM, 2017.\r\n[13] Christopher D. Manning, Prabhakar Raghavan, and Hinrich Schütze. \u201cIntroduction to Information Retrieval.\u201d Cambridge University Press, 2008.\r\n[14] Ameer Rosic. \u201cWhat is Cryptocurrency: Everything you need to know.\u201d Blockgeeks, blockgeeks.com\/guides\/what-is-cryptocurrency, accessed Sept. 13, 2018.\r\n[15] Market cap, 2020. https:\/\/www.ccn.com\/cryptocurrencies-are-looking-bullish-as-market-cap-shatters-6-month-high\/\r\n[16] Qitao Xie, Dayuan Tan, Ting Zhu, Qingquan Zhang, Sheng Xiao, Junyu Wang, Beibei Li, Lei Sun, and Ping Yi. \u201cChatbot Application on Cryptocurrency.\u201d IEEE Conference on Computational Intelligence for Financial Engineering & Economics, 2019.\r\n[17] J. Devlin and M.-W. Chang. \u201cOpen Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing.\u201d Google AI Blog, https:\/\/ai.googleblog.com\/2018\/11\/open-sourcing-bert-state-of-art-pre.html, 2018 (accessed Feb. 2, 2020).\r\n[18] \u201cThe Transformer for language translation.\u201d https:\/\/www.youtube.com\/watch?v=KzfyftiH7R8&t=1022s (accessed July 7, 2020).\r\n[19] S. Sharan. \u201cSmaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT.\u201d https:\/\/medium.com\/huggingface\/distilbert-8cf3380435b5, 2019 (accessed Feb. 7, 2020).\r\n[20] Jianfeng Gao, Michel Galley, and Lihong Li. \u201cNeural Approaches to Conversational AI: Question Answering, Task-oriented Dialogues and Social Chatbots.\u201d Information Retrieval, Vol. 13, No. 2-3, pp. 127-298, 2019.\r\n[21] Dumitru Erhan, Yoshua Bengio, Aaron Courville, Pierre-Antoine Manzagol, Pascal Vincent, and Samy Bengio. \u201cWhy Does Unsupervised Pre-training Help Deep Learning?\u201d Journal of Machine Learning Research 11, pp. 625-660, 2010.\r\n[22] Jonas Mueller and Aditya Thyagarajan. \u201cSiamese recurrent architectures for learning sentence similarity.\u201d Thirtieth AAAI Conference on Artificial Intelligence, 2016.\r\n[23] Jason Brownlee. \u201cDifference Between a Batch and an Epoch in a Neural Network.\u201d https:\/\/machinelearningmastery.com\/difference-between-a-batch-and-an-epoch\/, 2018.\r\n[24] V. I. Levenshtein. \u201cBinary Codes Capable of Correcting Deletions, Insertions and Reversals.\u201d https:\/\/nymity.ch\/sybilhunting\/pdf\/Levenshtein1966a.pdf, 1966.","publisher":"World Academy of Science, Engineering and Technology","index":"Open Science Index 170, 2021"}