
DOI: 10.14489/vkit.2019.03.pp.026-034

Karpov N. V., Demidovskij A. V.
COMPARISON OF MACHINE LEARNING METHODS FOR PREDICTING THE INTENTION OF AN ONLINE DISCUSSION PARTICIPANT
(pp. 26-34)

Abstract. The paper studies patterns in how the intentions of users' messages change in a social network. An original dataset is used, in which user dialogs were collected from a social network and each message was labeled with one of 25 intention types. Modern machine learning methods that analyze the elements of a sequence and predict the following ones are reviewed. A computational experiment on predicting the user's next intention in the course of the discourse was carried out on the selected dataset, and the prediction accuracy of each algorithm is evaluated.

Keywords: sequence prediction; social network analysis; machine learning; artificial neural networks.

 

Karpov N. V., Demidovskij A. V.
COMPARING MACHINE LEARNING METHODS APPLIED TO INTENT PREDICTION IN ONLINE DISCUSSIONS
(pp. 26-34)

Abstract. The popularity of social networks makes them an attractive field for analyzing users' behavior, for example on the basis of the intentions behind their posts and comments. According to linguistic theory, only 25 types of intentions exist, and they can be joined into 5 supergroups. We use a dataset of directed graphs whose nodes store the author's intention, the text of the post in the social network “Vkontakte”, and other information. Each graph is split into linked lists of nodes from the root to each leaf (13 156 sequences in our dataset), so that intention prediction becomes a sequence prediction task. We analyzed 14 traditional and ANN (Artificial Neural Network) approaches that address this task and propose to solve it with original modifications of the CNN (Convolutional Neural Network) and LSTM (Long Short-Term Memory) architectures. All posts are translated into embeddings, which are then used as inputs for the NN (Neural Network). To benchmark the proposed solution, we used the existing implementations of the traditional algorithms in the SPMF (Sequential Pattern Mining Framework) library and implemented our solution (open-source repository) in the Keras framework. According to the experiments, CPT+ (Compact Prediction Tree) and our proposed RNN (Recurrent Neural Network) outperform the other alternatives. Predicting supergroups is also more accurate than predicting individual intentions. Finally, the context in the dialogs is lost quickly, which allows decreasing the context size while keeping accuracy at an appropriate level. Although the RNN is on par with CPT+, it is more flexible in terms of adding new features to be used for prediction, and task-specific tuning of hyper-parameters is expected to provide an additional gain. We consider the latter a direction for further research.
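The splitting of each dialog graph into root-to-leaf sequences described in the abstract can be sketched as follows. This is a minimal illustration with an invented toy dialog, not the authors' implementation:

```python
# Hypothetical sketch: split a dialog tree into root-to-leaf paths.
# Node ids and the toy dialog below are invented for illustration;
# in the paper each node carries an intention label and post text.

def root_to_leaf_sequences(tree, root):
    """Enumerate every root-to-leaf path in a directed tree.

    `tree` maps a node id to the list of its children; `root` is the
    id of the opening post. Each returned path is one training
    sequence for the next-intention prediction task.
    """
    children = tree.get(root, [])
    if not children:                      # a leaf: the path ends here
        return [[root]]
    paths = []
    for child in children:
        for tail in root_to_leaf_sequences(tree, child):
            paths.append([root] + tail)
    return paths

# Toy dialog: post 0 has two reply branches.
dialog = {0: [1, 2], 1: [3]}
print(root_to_leaf_sequences(dialog, 0))  # [[0, 1, 3], [0, 2]]
```

A graph with many leaves thus yields many overlapping sequences, which is how a single dialog contributes several training examples to the 13 156-sequence dataset.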

Keywords: sequence prediction; social network analysis; machine learning; artificial neural networks.
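As a hedged illustration of the sequence-prediction setting the paper compares methods on (this is neither CPT+ nor the authors' RNN, just a first-order frequency baseline over intention labels, with invented toy labels):

```python
from collections import Counter, defaultdict

def train_bigram_model(sequences):
    """Count label -> next-label transitions over training sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(model, label):
    """Most frequent successor of `label`, or None if unseen."""
    if label not in model:
        return None
    return model[label].most_common(1)[0][0]

# Toy intention labels; the paper distinguishes 25 intention types.
seqs = [["question", "answer", "thanks"],
        ["question", "answer", "question", "answer"]]
model = train_bigram_model(seqs)
print(predict_next(model, "question"))  # "answer"
```

Stronger methods such as CPT+ or an LSTM differ from this baseline mainly in how much preceding context they exploit; the paper's finding that dialog context is lost quickly explains why even short contexts keep accuracy at an appropriate level.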



N. V. Karpov, A. V. Demidovskij (Nizhny Novgorod Branch of the National Research University “Higher School of Economics”, Nizhny Novgorod, Russia)



1. Karpov N., Demidovskij A., Malafeev A. (2017). Development of a Model to Predict Intention Using Deep Learning. Supplementary Proceedings of the Sixth International Conference on Analysis of Images, Social Networks and Texts (AIST 2017), pp. 69-78. Moscow.
2. Gueniche T., Fournier-Viger P., Tseng V. S. (2013). Compact Prediction Tree: A Lossless Model for Accurate Sequence Prediction. In: Motoda H. et al. (Eds.) Advanced Data Mining and Applications (ADMA 2013). Lecture Notes in Computer Science, Vol. 8347, pp. 177-188. doi: 10.1007/978-3-642-53917-6_16
3. Gueniche T. et al. (2015). CPT+: Decreasing the Time/Space Complexity of the Compact Prediction Tree. In: Cao T. et al. (Eds.) Advances in Knowledge Discovery and Data Mining (PAKDD 2015). Lecture Notes in Computer Science, Vol. 9078, pp. 625-636. doi: 10.1007/978-3-319-18032-8_49
4. Cleary J. G., Witten I. H. (1984). Data Compression Using Adaptive Coding and Partial String Matching. IEEE Transactions on Communications, Vol. COM-32, (4), pp. 396-402.
5. Padmanabhan V. N., Mogul J. C. (1996). Using Predictive Prefetching to Improve World Wide Web Latency. ACM SIGCOMM Computer Communication Review, Vol. 26, (3), pp. 22-36. doi: 10.1145/235160.235164
6. Pitkow J., Pirolli P. (1999). Mining Longest Repeated Subsequences to Predict World Wide Web Surfing. Proceedings of the 2nd USENIX Symposium on Internet Technologies and Systems (USITS'99), pp. 139-150. Boulder, Colorado.
7. Laird P., Saul R. (1994). Discrete Sequence Prediction and Its Applications. Machine Learning, Vol. 15, (1), pp. 43-68. Available at: http://jmvidal.cse.sc.edu/library/laird94a.pdf (Accessed: 21.01.2019).
8. Willems F. M., Shtarkov Y. M., Tjalkens T. J. (1995). The Context-Tree Weighting Method: Basic Properties. IEEE Transactions on Information Theory, Vol. 41, (3), pp. 653-664. doi: 10.1109/18.382012
9. Ron D., Singer Y., Tishby N. (1996). The Power of Amnesia: Learning Probabilistic Automata with Variable Memory Length. Machine Learning, Vol. 25, (2-3), pp. 117-149.
10. Zhao Y. et al. (2017). Sequence Prediction Using Neural Network Classifiers. Proceedings of the International Conference on Grammatical Inference, PMLR, Vol. 57, pp. 164-169. Available at: http://proceedings.mlr.press/v57/zhao16.pdf (Accessed: 21.01.2019).
11. Sun R., Giles C. L. (2001). Sequence Learning: From Recognition and Prediction to Sequential Decision Making. IEEE Intelligent Systems, Vol. 16, (4), pp. 67-70.
12. Pérez-Ortiz J. A., Calera-Rubio J., Forcada M. L. (2001). Online Symbolic-Sequence Prediction with Discrete-Time Recurrent Neural Networks. International Conference on Artificial Neural Networks, Vol. 2130, pp. 719-724.
13. Tax N. (2018). Human Activity Prediction in Smart Home Environments with LSTM Neural Networks. The 14th International Conference on Intelligent Environments (IE'18). Rome, Italy. doi: 10.1109/IE.2018.00014
14. Hochreiter S., Schmidhuber J. (1997). Long Short-Term Memory. Neural Computation, Vol. 9, (8), pp. 1735-1780. Available at: http://www.bioinf.jku.at/publications/older/2604.pdf (Accessed: 21.01.2019).
15. Radina N. K. (2016). Intent Analysis of Online Discussions (on the Example of Commenting on Materials from the Internet Portal InoSMI.ru). Mediaskop, (4). Available at: http://www.mediascope.ru/2238 (Accessed: 21.01.2019). [in Russian]
16. SPMF: Sequential Pattern Mining Framework: An Open-Source Data Mining Library. Available at: http://www.philippe-fournier-viger.com/spmf/ (Accessed: 21.01.2019).
17. Keras: The Python Deep Learning Library. Available at: https://keras.io (Accessed: 21.01.2019).
18. DL Intent Analysis: demid5111/dialog-intent-dl. GitHub. Available at: https://github.com/demid5111/dialog-intent-dl (Accessed: 21.01.2019).



This article is available in electronic format (PDF).

The cost of a single article is 350 rubles (including 18 % VAT). Within a few days of placing an order, an invoice and a receipt for payment at a bank will be sent to the e-mail address you specify.

Once the payment has been received in the publisher's account, the electronic version of the article will be sent to you by e-mail.

To order the article, copy its DOI:

10.14489/vkit.2019.03.pp.026-034

and fill out the form. By submitting the form you consent to the processing of your personal data.

 

 

 