From Aristotle to AI: Exploring the convergence of deepfakes and persuasion and their societal consequences
Abstract
From the foundational tenets of Aristotle’s rhetoric to the digital complexities of today’s AI-driven technologies, the path of persuasive communication has involved a wide variety of tools and tactics. At the center of this technological evolution are ‘deepfakes’: advanced AI-generated videos that are almost indistinguishable from real content. This study uses critical discourse analysis to show how deepfakes not only exemplify Aristotle’s principles of persuasion but also amplify them in ways audiences can scarcely detect, raising concerns about the spread of misinformation and about distrust in the media and political discourse. By juxtaposing classical rhetoric with contemporary AI technology, this paper sheds light on this evolving form of persuasion, prompting reflection on its implications and putting forward suggestions for an increasingly AI-infused communication space.
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
References
- Agarwal, S., Farid, H., Gu, Y., He, M., Nagano, K., & Li, H. (2019). Protecting World Leaders Against Deep Fakes. In CVPR Workshops, 1, 38–45.
- Aristotle. (1984). The Nicomachean Ethics. (W. D. Ross, Trans.). In J. Barnes (Ed.), The Complete Works of Aristotle: The Revised Oxford Translation (Vol. 2). Princeton, NJ: Princeton University Press.
- Baumlin, J. S., & Meyer, C. A. (2018). Positioning Ethos in/for the Twenty-First Century: An Introduction to Histories of Ethos. Humanities, 7(3), 78. https://doi.org/10.3390/h7030078
- Beattie, A., Edwards, A. P., & Edwards, C. (2020). A Bot and a Smile: Interpersonal Impressions of Chatbots and Humans Using Emoji in Computer-mediated Communication. Communication Studies, 71(3), 409–427. https://doi.org/10.1080/10510974.2020.1725082
- Brader, T. (2005). Striking a Responsive Chord: How Political Ads Motivate and Persuade Voters by Appealing to Emotions. American Journal of Political Science, 49(2), 388–405. https://doi.org/10.1111/j.0092-5853.2005.00130.x
- Braet, A. C. (1992). Ethos, pathos and logos in Aristotle’s Rhetoric: A re-examination. Argumentation, 6(3), 307–320. https://doi.org/10.1007/bf00154696
- Brummett, B. S. (2022). Rhetoric in Popular Culture. Sage Publications.
- Burke, K. (1964). Shakespearean Persuasion. The Antioch Review, 24(1), 19. https://doi.org/10.2307/4610566
- Chesney, R., & Citron, D. K. (2018). Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3213954
- Clayton, E. W. (2004). The Audience for Aristotle’s Rhetoric. Rhetorica, 22(2), 183–203. https://doi.org/10.1525/rh.2004.22.2.183
- Cobb, M. D., & Kuklinski, J. H. (1997). Changing Minds: Political Arguments and Political Persuasion. American Journal of Political Science, 41(1), 88. https://doi.org/10.2307/2111710
- Dehnert, M., & Mongeau, P. A. (2022). Persuasion in the Age of Artificial Intelligence (AI): Theories and Complications of AI-Based Persuasion. Human Communication Research, 48(3), 386–403. https://doi.org/10.1093/hcr/hqac006
- Fairclough, N. (2013). Critical Discourse Analysis. Routledge. https://doi.org/10.4324/9781315834368
- Figueira, Á., & Oliveira, L. (2017). The current state of fake news: challenges and opportunities. Procedia Computer Science, 121, 817–825. https://doi.org/10.1016/j.procs.2017.11.106
- Fletcher, J. (2018). Deepfakes, Artificial Intelligence, and Some Kind of Dystopia: The New Faces of Online Post-Fact Performance. Theatre Journal, 70(4), 455–471. https://doi.org/10.1353/tj.2018.0097
- Fortenbaugh, W. W. (1992). Aristotle on Persuasion Through Character. Rhetorica, 10(3), 207–244. https://doi.org/10.1525/rh.1992.10.3.207
- Garver, E. (1996). The Political Irrelevance of Aristotle’s “Rhetoric.” Philosophy & Rhetoric, 29(2), 179–199.
- Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., & Bengio, Y. (2020). Generative adversarial networks. Communications of the ACM, 63(11), 139–144. https://doi.org/10.1145/3422622
- Halloran, S. M. (1975). On the End of Rhetoric, Classical and Modern. College English, 36(6), 621. https://doi.org/10.2307/374944
- Ingram, J. (2009). Political Emotions: Aristotle and the Symphony of Reason and Emotion. Philosophy & Rhetoric, 42(1), 92–95. https://doi.org/10.2307/25655340
- Kennedy, G. A. (2007). Aristotle, On Rhetoric: A Theory of Civic Discourse. Oxford University Press.
- Kietzmann, J., Lee, L. W., McCarthy, I. P., & Kietzmann, T. C. (2020). Deepfakes: Trick or treat? Business Horizons, 63(2), 135–146. https://doi.org/10.1016/j.bushor.2019.11.006
- Ling, H., & Björling, E. (2020). Sharing Stress With a Robot: What Would a Robot Say? Human-Machine Communication, 1, 133–158. https://doi.org/10.30658/hmc.1.8
- Machin, D., & Mayr, A. (2012). How to Do Critical Discourse Analysis: A Multimodal Introduction. Sage.
- Marwick, A. E., & Lewis, R. (2017). Media Manipulation and Disinformation Online. Data & Society Research Institute.
- Mayer, R. E., & Moreno, R. (2003). Nine Ways to Reduce Cognitive Load in Multimedia Learning. Educational Psychologist, 38(1), 43–52. https://doi.org/10.1207/s15326985ep3801_6
- Mirhady, D. C., & Garver, E. (1996). Aristotle’s “Rhetoric”: An Art of Character. Phoenix, 50(2), 179. https://doi.org/10.2307/1192712
- Nichols, M. P. (1987). Aristotle’s Defense of Rhetoric. The Journal of Politics, 49(3), 657–677. https://doi.org/10.2307/2131273
- O’Sullivan, D. (2023, May 2). [Tweet]. Twitter. https://twitter.com/donie/status/1653556187487502336?s=20
- Qayyum, A., Qadir, J., Janjua, M. U., & Sher, F. (2019). Using Blockchain to Rein in the New Post-Truth World and Check the Spread of Fake News. IT Professional, 21(4), 16–24. https://doi.org/10.1109/mitp.2019.2910503
- Rhodes, S. C. (2021). Filter Bubbles, Echo Chambers, and Fake News: How Social Media Conditions Individuals to Be Less Critical of Political Misinformation. Political Communication, 39(1), 1–22. https://doi.org/10.1080/10584609.2021.1910887
- Rossler, A., Cozzolino, D., Verdoliva, L., Riess, C., Thies, J., & Niessner, M. (2019). FaceForensics++: Learning to Detect Manipulated Facial Images. 2019 IEEE/CVF International Conference on Computer Vision (ICCV), 1–11. https://doi.org/10.1109/iccv.2019.00009
- Stenberg, G. (2006). Conceptual and perceptual factors in the picture superiority effect. European Journal of Cognitive Psychology, 18(6), 813–847. https://doi.org/10.1080/09541440500412361
- Sundar, S. S. (2008). The MAIN model: A heuristic approach to understanding technology effects on credibility. In Digital Media, Youth, and Credibility (pp. 73–100). MIT Press. https://doi.org/10.1162/dmal.9780262562324.073
- Sunstein, C. R. (2017). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press. https://doi.org/10.1515/9781400884711
- Ternovski, J., Kalla, J., & Aronow, P. M. (2021). Deepfake Warnings for Political Videos Increase Disbelief but Do Not Improve Discernment: Evidence from Two Experiments. https://doi.org/10.31219/osf.io/dta97
- Thompson, A. (2023, May 25). [Tweet]. Twitter. https://twitter.com/AlexThomp/status/1650817703584645122?s=20
- Vaccari, C., & Chadwick, A. (2020). Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News. Social Media + Society, 6(1). https://doi.org/10.1177/2056305120903408
- van Dijk, T. A. (1993). Principles of Critical Discourse Analysis. Discourse & Society, 4(2), 249–283. https://doi.org/10.1177/0957926593004002006
- Vincent, J., & Fortunati, L. (2009). Electronic Emotion: The Mediation of Emotion via Information and Communication Technologies.
- Weber, C. (2012). Emotions, Campaigns, and Political Participation. Political Research Quarterly, 66(2), 414–428. https://doi.org/10.1177/1065912912449697
- Westerlund, M. (2019). The Emergence of Deepfake Technology: A Review. Technology Innovation Management Review, 9(11), 39–52. https://doi.org/10.22215/timreview/1282
- Williams, J. D. (2013). An Introduction to Classical Rhetoric: Essential Readings. Wiley-Blackwell.
- Wodak, R., & Meyer, M. (2015). Methods of Critical Discourse Studies. SAGE.
- Yumak, Z., Ren, J., Thalmann, N. M., & Yuan, J. (2014). Modelling Multi-Party Interactions among Virtual Characters, Robots, and Humans. Presence: Teleoperators and Virtual Environments, 23(2), 172–190. https://doi.org/10.1162/pres_a_00179