Frederica Freyberg:
Political groups are expected to spend $423 million on campaign ads in Wisconsin this year, with $60 million in ads for the presidential race alone reserved for airtime between now and Election Day. We may be used to attack ads stretching the truth or twisting facts, but a new concern comes with the growing use of artificial intelligence, or AI, to create an entirely new false reality. “Here & Now” senior political reporter Zac Schultz has the story.
Zac Schultz:
Watch just about any local newscast in Wisconsin and when the station cuts to break, many of the commercials will be campaign ads.
TV announcer:
Thanks a lot, Kamala.
Steve Lavin:
Political [ads] will occupy a large percentage of our available inventory between now and the election and have for many months.
Zac Schultz:
Steve Lavin is the general manager of WBAY-TV in Green Bay. In a purple state, campaign ads are the best way to reach the last few voters who haven’t already made up their minds.
Steve Lavin:
Well, let’s be honest, the majority of the public has already decided the way that they’re going to vote, and these ads are just going after that little fringe, the 5 to 7% of undecideds, really undecideds in the middle.
Zac Schultz:
Most of the controversies over campaign ads come when one side demands their opponent retract an ad due to inaccuracies or technicalities. Lavin says TV stations have lawyers to deal with those issues.
Steve Lavin:
I’ll never pull an ad myself because number one, I would rather the other side — if there’s falsehoods in an ad, the other side has every bit of right to actually answer those by buying more ads, right?
Zac Schultz:
A new concern over ads has to do with the growing use of artificial intelligence, or AI, to generate images that look real. Earlier this year, the legislature passed a bill that requires any campaign ad using AI to disclose that fact.
Steve Lavin:
Any time they’re using any type of AI, they’re supposed to disclose it.
Zac Schultz:
Not disclosing comes with a $1,000 penalty, but that penalty would be paid by the group running the ad. An amendment to the bill made sure broadcasters are not liable if AI is used and not disclosed.
Michael Wagner:
There is a bit of responsibility on the side of the broadcasters, but it’s also extremely hard to police.
Zac Schultz:
Mike Wagner is a professor of journalism at the University of Wisconsin who has studied the use of AI in political speech.
Michael Wagner:
How can they know for sure the video was AI generated? How can they know for sure the script was AI generated?
Zac Schultz:
Lavin says so far, there’s no evidence of any AI in campaign ads.
Steve Lavin:
Since that law went into place, I have not seen or heard an ad that has used that disclaimer yet. So either — if it’s being used, nobody’s disclosing it, or they’ve determined that the penalties are so high that they’re just not going to use AI at all.
Michael Wagner:
I think the real danger with AI in this election is not in campaign advertisements. It’s in social media posts that go viral.
Zac Schultz:
We’ve already seen examples of this.
Michael Wagner:
Using AI as a boogeyman, so something happens and the other side says, “Oh, that must be AI. Can’t possibly be real.”
Zac Schultz:
In August, Kamala Harris held a rally at an airport hangar in Detroit, Michigan. Donald Trump falsely claimed photos of the event used AI to make the crowd look bigger.
Garlin Gilchrist:
I spoke at that rally. I spoke to all 15,000 of those people. They are real.
Zac Schultz:
Garlin Gilchrist is Michigan’s lieutenant governor.
Garlin Gilchrist:
Donald Trump was so insecure about that crowd size that he had to find a way to try to delegitimize it. And so by playing on the fears of people and saying it was artificial intelligence and fake, that’s all he knows how to do is play on people’s fears.
Zac Schultz:
Wagner says beyond making AI a boogeyman, there’s another way AI can be abused.
Michael Wagner:
The other is that a candidate picks up on a post that uses AI and treats it as true, which has also happened where former President Trump shared information that Taylor Swift had endorsed him, which she had not.
Zac Schultz:
Neither AI incident seems to have affected the race for president. Taylor Swift later endorsed Kamala Harris.
Kamala Harris:
When we fight, we win.
Zac Schultz:
Who has proven her large crowds are real.
Mark Pocan:
What a crowd. You know, Donald Trump says Democrats can only have large crowds because of AI.
Zac Schultz:
AI wasn’t even involved in the biggest lie of the campaign so far.
Donald Trump:
They’re eating the dogs.
Zac Schultz:
When Donald Trump falsely claimed Haitian immigrants were stealing pets and eating them in Springfield, Ohio, the source of the misinformation was a Facebook post. No AI involved at all.
Michael Wagner:
So when these kinds of things happen too, especially when the candidates themselves pick it up and share it, those things are going to take on a life of their own in really remarkable and fast ways that are hard to regulate.
Zac Schultz:
The targets for misinformation are the same as the audience for campaign ads.
Michael Wagner:
Low information voters who are paying attention at the last minute are often susceptible to the messages because they’re new to them. They haven’t been paying attention to the race, and these things are new.
Zac Schultz:
And the solution to misinformation, whether AI generated or not, is the same as it’s always been.
Steve Lavin:
There is so much disinformation out there, whether it’s on social media or in these campaigns spreading disinformation. I think it’s up to the individual voter to determine what’s actually the truth.
Zac Schultz:
Reporting from Green Bay, I’m Zac Schultz for “Here & Now.”