MULTI-ARMED BANDIT-BASED ADAPTIVE CONTROL OF ADVERTISING IN SOCIAL NETWORKS

Authors

Olena SHTOVBA, Vinnytsia National Technical University

DOI:

https://doi.org/10.31649/ins.2024.1.83.92

Keywords:

advertising, advertising control, adaptation, social network, multi-armed bandit

Abstract

In a rapidly changing marketing environment, the success of an advertising campaign requires minimizing the time from the advertising idea to the implementation of communication. There is a need to advertise, study, and adjust communication processes simultaneously. Accordingly, a sequence of decisions must be made that, taken together, ensures the right balance between learning and earning. One of the most convenient adaptation mechanisms for practical implementation is the two-armed bandit scheme. A two-armed bandit is a model of repeated choices between two alternatives with an a priori unknown distribution of payoffs between them. Each alternative is associated with one arm of the bandit. As the choices are made, the probability distribution of payoffs is identified step by step and used to exploit the better alternative. In practice, there may be more alternatives, in which case the multiple-decision problem is mapped to the multi-armed bandit scheme. A new multi-armed bandit-based model for advertising control in social networks is proposed. The idea is that multiple advertisements are displayed simultaneously, and the display frequency of each depends on its performance, measured by the level of viewer retention up to the contact frame. Based on the retention rates, a certain share of resources is redistributed among the commercials on a daily basis. Simple and effective formulas for the reassignment of resources are proposed, taking into account both the current unevenness of retention rates and the statistics of video demonstrations. A simulation of the control model for a 35-day campaign with three commercials is performed. The performance of the control model is demonstrated both under steady-state conditions and in the case of a viral message insertion. The phase trajectories show how the system's response to the insertion of a viral message changes with different control parameters.
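
The following is a minimal Python sketch of the general idea described in the abstract: several commercials share a daily display budget, and a fixed share of that budget is reallocated each day according to observed viewer-retention rates. The proportional update rule, the retention values, the daily budget, and the parameter alpha are illustrative assumptions, not the paper's own reassignment formulas.

import numpy as np

# Illustrative sketch only: the paper's specific reassignment formulas are not
# reproduced here. A simple proportional (bandit-style) rule is assumed, in
# which a share `alpha` of the display budget is redistributed daily
# according to the observed retention rates of each commercial.

rng = np.random.default_rng(seed=1)

n_ads = 3                                      # three commercials, as in the simulated campaign
horizon = 35                                   # 35-day campaign
alpha = 0.2                                    # share of resources redistributed each day (assumed)
true_retention = np.array([0.30, 0.45, 0.60])  # hypothetical retention-to-contact-frame rates
daily_budget = 10_000                          # hypothetical impressions per day

shares = np.full(n_ads, 1.0 / n_ads)           # equal initial display shares

for day in range(horizon):
    impressions = np.round(shares * daily_budget).astype(int)
    # Observed retention is a noisy estimate of the true rate.
    retained = rng.binomial(impressions, true_retention)
    observed = np.where(impressions > 0, retained / np.maximum(impressions, 1), 0.0)

    # Redistribute a share alpha of the budget proportionally to observed retention,
    # keeping the remaining (1 - alpha) at the current allocation.
    target = observed / observed.sum() if observed.sum() > 0 else shares
    shares = (1 - alpha) * shares + alpha * target

print("final display shares:", np.round(shares, 3))

Under this assumed rule, alpha plays the role of a control parameter: larger values make the allocation react faster to retention changes (for example, a viral message), while smaller values keep the allocation closer to steady state.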

Author Biography

Olena SHTOVBA, Vinnytsia National Technical University

PhD in Economics, Associate Professor, Associate Professor of the Department of Management, Marketing, and Economics

References

Feit, E. M., & Berman, R. (2019). Test & roll: Profit-maximizing A/B tests. Marketing Science, 38(6), 1038–1058. https://doi.org/10.1287/mksc.2019.1194

Holland, J. H. (1992). Adaptation in natural and artificial systems: An introductory analysis with applications to biology, control, and artificial intelligence. MIT Press.

Rothschild, M. (1974). A two-armed bandit theory of market pricing. Journal of Economic Theory, 9(2), 185–202. https://doi.org/10.1016/0022-0531(74)90066-0

Scott, S. L. (2015). Multi-armed bandit experiments in the online service economy. Applied Stochastic Models in Business and Industry, 31(1), 37–45. https://doi.org/10.1002/asmb.2104

Schwartz, E. M., Bradlow, E. T., & Fader, P. S. (2017). Customer acquisition via display advertising using multi-armed bandit experiments. Marketing Science, 36(4), 500–522. https://doi.org/10.1287/mksc.2016.1023

Russo, D. J., Van Roy, B., Kazerouni, A., Osband, I., & Wen, Z. (2018). A tutorial on Thompson sampling. Foundations and Trends in Machine Learning. Now Publishers Inc. https://doi.org/10.1561/2200000070

Lykouris, T., Mirrokni, V., & Leme, R. P. (2020). Bandits with adversarial scaling. In 37th International Conference on Machine Learning, ICML 2020 (pp. 6467–6477). International Machine Learning Society (IMLS).

Iacob, A., Cautis, B., & Maniu, S. (2022). Contextual bandits for advertising campaigns: A diffusion-model independent approach. In Proceedings of the 2022 SIAM International Conference on Data Mining, SDM 2022 (pp. 513–521). Society for Industrial and Applied Mathematics. https://doi.org/10.1137/1.9781611977172.58

Aramayo, N., Schiappacasse, M., & Goic, M. (2023). A multiarmed bandit approach for house ads recommendations. Marketing Science, 42(2), 271–292. https://doi.org/10.1287/mksc.2022.1378

Published

2024-03-27

How to Cite

SHTOVBA, O. (2024). MULTI-ARMED BANDIT-BASED ADAPTIVE CONTROL OF ADVERTISING IN SOCIAL NETWORKS. Innovation and Sustainability, (1), 83–92. https://doi.org/10.31649/ins.2024.1.83.92

Issue

Section

Articles
