Lecture: The Effect of Slow Motion Video on Consumer Inference
Speaker: Yunlu Yin, Junior Associate Research Fellow, Fudan University
Host: Fengyan Cai, Associate Professor, Antai College of Economics and Management, Shanghai Jiao Tong University
Venue: Room A303, Antai Building, Xuhui Campus, Shanghai Jiao Tong University
Video advertisements often show actors and influence agents consuming and enjoying products in slow motion. By prolonging depictions of influence agents' consumption utility, slow motion cinematographic effects ostensibly enhance social proof and signal product qualities that are otherwise difficult to infer visually (e.g., pleasant tastes, smells, and haptic sensations). Seven studies—including an eye-tracking study, a Facebook Ads field experiment, and lab and online experiments, all using real ads across diverse contexts—demonstrate that slow motion (vs. natural speed) can backfire and undercut product appeal by making the influence agent's behavior seem more intentional and extrinsically motivated. The authors rule out several alternative explanations by showing that the effect attenuates for individuals with lower intentionality bias, is mitigated under cognitive load, and reverses when ads use non-human influence agents. The authors conclude by highlighting the potential for cross-pollination between research on visual information processing and social cognition, particularly in contexts such as persuasion and trust, and discuss managerial implications for visual marketing, especially on digital and social platforms.
Yunlu Yin is an assistant professor of marketing at the School of Management, Fudan University. He received his PhD in marketing from The University of Hong Kong and is a visiting scholar at the Yale Institute for Network Science, Yale University. Employing interdisciplinary methodologies—including lab and field experiments, empirical analysis of large-scale digital trace data, and neuro-cognitive tools—his research focuses on (1) the cognitive underpinnings of sensory and media marketing and (2) the biological drivers of consumption. His research appears in a diverse range of journals, including Journal of Marketing Research, NeuroImage, eLife, and Journal of Neuroscience.