Anti-Gunners at it Again With Legally Dubious AI-Generated Voice Mails to Congress
Artificial intelligence is pretty cool, all things considered, or it would be if not for some of the problems we've seen. As a science fiction fan, I can't call it true AI by any stretch of the imagination, but it's kind of interesting to have a conversation with a computer program sometimes.
But with the emergence of AI, as we currently have it, there's a problem. As with any technology, it can be misused. People have used it, for example, to create pornography of well-known people who never consented to any such thing.
Yet many people seem to love this technology, particularly as a means of trying to sway legislative opinion toward gun control.
We’ve seen it before.
It seems we’re seeing it all over again.
Activist groups are using a typical advocacy tool — voicemails to members of Congress — with a new, uncomfortable twist: They’re from the deceased victims of gun violence, generated by artificial intelligence.
TheShotline.org, a gun reform campaign by March for Our Lives and Change the REF, is asking constituents nationwide to send the AI-generated phone calls to the representatives for their zip code.
The voice memos feature six victims of gun violence, including those killed in mass shootings, those lost to suicide, and people like 15-year-old Ethan Song, whose accidental death resulted from an unsecured gun in Connecticut in 2018. The digital rendition of Ethan's voice briefly explains his passion for helping animals and people and his inability to "help anyone in need anymore." It quickly segues into a plea to lawmakers to "finally do something to protect kids from guns" and a warning to members of Congress that if gun reform bills aren't passed, they face the risk of being voted out.
The advocacy organizations hope the chilling recordings will lead to the passage of federal gun reform, specifically an assault weapons ban.
“I want these politicians to sit there and listen,” said Brett Cross, a father of one of the victims featured on Shotline.org. “I want them to imagine that that’s their children’s voices, because they didn’t do anything to prevent countless children being slaughtered.”
Except there’s one massive problem beyond the fact that it’s just disgusting.
It’s likely also illegal.
The FCC has ruled that using AI to recreate someone's voice in order to make a robocall, and that's exactly what's happening here, is an illegal act. This can and should be prosecuted.
I’m not a big fan of regulations, even of AI, but the truth of the matter is that this is wrong. The law is pretty clear, and when someone is trying to push for new laws, they should probably follow existing ones.
Then there’s the fact that there’s no evidence this works.
These aren’t voters, after all. They’re AI-generated voices of people who are no longer with us. The dead only vote in places like Chicago.
In all seriousness, though, I’m with Tennessee’s Rep. Tim Burchett on this. He called these calls “fraudulent,” which they are, and noted that they’re useless for creating systemic change, adding:
“My office has received these types of calls,” Burchett said in a statement to POLITICO. “If people want to make their voices heard, they should contact their elected representatives and express their concerns directly.”
So illegal and ineffective.
I’m sorry, but that kind of sums it up perfectly, doesn’t it?