Advertisements (ads) often include strongly emotional content to leave a lasting impression on the viewer. This work (i) compiles an affective ad dataset capable of evoking coherent emotions across users, as determined from the affective opinions of five experts and 14 annotators; (ii) explores the efficacy of convolutional neural network (CNN) features for encoding emotions, and observes through extensive experimentation that CNN features outperform low-level audio-visual emotion descriptors; and (iii) demonstrates, via a study involving 17 users, how enhanced affect prediction facilitates computational advertising and leads to a better viewing experience when watching an online video stream embedded with ads. We model ad emotions based on subjective human opinions as well as objective multimodal features, and show how effectively modeling ad emotions can positively impact a real-life application.
The annotations can be downloaded here.
Shukla, A., Gullapuram, S. S., Katti, H., Yadati, K., Kankanhalli, M., & Subramanian, R. (2017, October). Affect recognition in ads with application to computational advertising. In Proceedings of the 25th ACM International Conference on Multimedia (ACM MM) (pp. 1148-1156).
Shukla, A., Gullapuram, S. S., Katti, H., Yadati, K., Kankanhalli, M., & Subramanian, R. (2017, November). Evaluating content-centric vs. user-centric ad affect recognition. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI) (pp. 402-410).