The Bulletin



When it comes to political advertising, is AI ever OK?

  • Written by Susan Grantham, Lecturer in communication, Griffith University

The Liberal National Party Queensland (LNP) has recently taken a bold step in its political strategy by employing artificial intelligence (AI) to shape public perception of the current premier, Steven Miles. This move has not only highlighted the innovative potential of AI in political campaigning but also sparked significant debate about its ethical implications.

Globally, the use of AI in political campaigns is on the rise. In recent elections worldwide, AI has been harnessed to analyse voter behaviour, craft targeted messages, and even generate persuasive content.

We saw the use of AI in the UK general election through the development of an AI-generated politician[1]. In February 2024, there was another powerful use of AI in Pakistan, when Imran Khan and his Pakistan Tehreek-e-Insaf party generated an AI video of Khan delivering a victory speech[2] written while he was in prison.

However, the LNP’s approach in Queensland marks a notable escalation in the Australian context, albeit a much more light-hearted one. The AI-generated video depicts a realistic portrayal of Miles dancing to a popular early 2000s song, with the caption:

POV: my rent is up $60 a week, my power bill is up 20%, but the premier made a sandwich on TikTok.

It aims to sway voter opinion by casting doubt on Miles’ leadership.

It’s clever, but is it ethical?

While the video is technologically impressive, it raises questions about the role of AI in political campaigning. Negative campaigning is a common strategy used worldwide, and society has come to expect negative posts or commentary from opposing parties. For instance, during the 2022 Australian federal election, the Labor Party used video editing tools to manipulate images of then prime minister Scott Morrison.

What makes the Queensland LNP example unique is the use of AI to manipulate the individual’s actual form.

The Labor Party also came under scrutiny recently for an AI-generated TikTok video featuring opposition leader Peter Dutton.

The video leverages AI to manipulate Dutton’s appearance and behaviour, and exemplifies how the technology can be used to create realistic and persuasive content.

AI’s ability to be convincing yet misleading at the same time challenges the boundaries of acceptable political debate. It also underscores the need for robust regulatory frameworks.

The Electoral Commission of Queensland has said[3] that while the state’s electoral act does not explicitly mention AI, it does cover the publication of false statements about a candidate’s character or conduct. However, political freedom of expression does allow for negative campaigning.

When politics and pop culture collide

From an election campaigning perspective, there has been a significant shift towards a more lighthearted and culturally relevant approach. Short-form video platforms are an excellent way to engage a generation of people who may not yet be politically aligned.

These platforms are exceedingly powerful tools. But platforms like TikTok are driven by algorithms, requiring content to be crafted to capture the algorithm’s interest. One effective strategy to achieve this is incorporating elements of popular culture and current trends. This can transform[4] a serious topic into more entertainment-driven content.

Consequently, for politicians, governments, and large organisations to use these platforms effectively, they must adopt these popular culture methods, regardless of the seriousness of the topics being addressed. This has resulted in a rising trend of “politainment[5]” by political figures.

However, politicians are also increasingly engaging with these platforms to develop a sense of authenticity. In Queensland, the two party leaders are using personal accounts to portray themselves as “ordinary” Australians. The techniques they use to do this centre on domestic tasks such as cooking. A connection to food has been seen internationally, particularly in Italy[6], but is a relatively new approach in Australia.

Scott Morrison used to delight in showcasing[7] his cooking skills. However, this was not always to positive effect.

Ultimately, digital manipulation for strategic purposes is nothing new for political parties. However, the question remains whether there should be rules governing the use of AI in election campaigns.

AI is mostly fine – but it should be clearly labelled as such

While freedom of speech in political campaigning is crucial, clear identification of AI use is essential to maintain transparency and trust. Restricting official accounts might push AI-generated content to more unofficial, harder-to-regulate sources, complicating the issue further.

The case in Queensland highlights the opportunities and challenges of integrating advanced technologies into political campaigns. As AI continues to evolve, its role in shaping political landscapes will grow.

Political parties, regulators, and the public must navigate this terrain carefully, ensuring that the integrity of democratic processes is upheld while embracing the innovative potential of AI.

References

  1. ^ AI-generated politician (theconversation.com)
  2. ^ AI video of Khan to deliver a victory speech (www.politico.eu)
  3. ^ has said (www.theguardian.com)
  4. ^ This can transform (www.tandfonline.com)
  5. ^ politainment (www.cogitatiopress.com)
  6. ^ particularly in Italy (doi.org)
  7. ^ used to delight in showcasing (www.theguardian.com)

Read more https://theconversation.com/when-it-comes-to-political-advertising-is-ai-ever-ok-235323

The Conversation