Survey reveals concerns over potential AI abuse in US presidential election

Jun 03, 2024 | Hi-network.com

A recent survey conducted by the Elon University Poll and the Imagining the Digital Future Center at Elon University has revealed widespread concerns among American adults regarding the impact of AI on the upcoming presidential election. According to the survey, more than three-fourths of respondents believe that abuses involving AI systems will influence the election outcome. Specifically, 73% of respondents fear AI will be used to manipulate social media, while 70% anticipate the spread of fake information through AI-generated content like deepfakes.

The survey also highlights concerns about targeted AI manipulation aimed at dissuading certain voters from participating in the election, with 62% of respondents expressing apprehension about this possibility. Overall, 78% of Americans anticipate at least one form of AI abuse affecting the election, and over half believe all three identified forms are likely to occur. Lee Rainie, director of Elon University's Imagining the Digital Future Center, notes that US voters expect to face significant challenges navigating AI-driven misinformation and voter-manipulation tactics during the campaign.

The survey also underscores a strong consensus on accountability for political candidates who maliciously alter or fake photos, videos, or audio files. A resounding 93% of respondents believe such candidates should face punishment, with opinions split between removal from office (46%) and criminal prosecution (36%). In addition, the survey reveals doubts about the public's ability to discern faked media: 69% of respondents lack confidence in most voters' ability to detect altered content.

Hot Tags: Artificial Intelligence, Content policy, Sociocultural, US elections and digital policy