When we receive a communication from someone we don’t know, we should be skeptical about it. Especially if they’re asking for money to pay for medical expenses or a trip to America to gain freedom.
It’s most likely a scam. But what if that communication seems to come from someone we know? Or someone we know of? Or even someone we trust?
Lately, some folks have been receiving calls that seem to be coming from people they’re familiar with. Some of these calls even mimic the voice of President Joe Biden.
The calls – using AI-generated voices – ask for money. And they’re working. Scams using AI voice clones reportedly cost Americans about $2.6 billion per year. The elderly are most at risk.
AI Makes It Seem Real
Today I want to talk about how prevalent these calls have become. And how careful we need to be to avoid being deceived.
One of the biggest problems with artificial intelligence (AI) is how close to reality it seems.
Sometimes it’s difficult to tell the difference. Doctored photographs and AI-generated voices are the tip of the iceberg.
Some U.S. senators are aware of the problem. They’re asking the Biden Administration to do something about it.
Bipartisan Committee Fighting Back
Mike Braun is a U.S. senator from Indiana. He is the top Republican on the Senate Special Committee on Aging.
He’s leading an effort to petition the Federal Trade Commission (FTC) on the subject. He and other senators from both parties sent the agency a letter.
They’re trying to determine what the FTC knows about AI scams targeting the elderly. And what the agency is doing to protect them.
They’re especially interested in AI technology used to replicate voices of well-known individuals.
Fake Grandson Fools Elderly Couple
The bipartisan letter was addressed to FTC Chairwoman Lina Khan. It mentioned voice clones and chatbots, including those designed to convince the elderly they’re communicating with a real person, such as a relative, close friend, or celebrity.
One elderly couple almost lost $9,400 in a scam. But their bank alerted them to the potential fraud in time.
The caller had tried to convince the couple he was their grandson. And that he needed money to make bail.
In another case, someone posing as a kidnapper used voice-cloning technology. He duplicated the sounds of a crying child and demanded ransom.
Voice-Cloning Technology Is the Culprit
Older folks were already more vulnerable to scams than others. Partly because they grew up in an era when things like this rarely happened.
And also because many elderly people don’t have the same cognitive skills they used to have. Their hearing is not what it once was, which makes a cloned voice harder to detect. One scam, for example, used a voice that sounded like movie star Jennifer Lopez.
Braun, who cited the $2.6 billion figure, said AI makes scams easier “because it’s like talking to your grandkid.”
He added that voice-cloning technology is making it much easier to fool people. Especially older individuals.
Connecticut Senator Gets in on the Act
Richard Blumenthal is a Democratic senator from Connecticut. He found a creative way to show his fellow politicians how serious the problem is.
He opened a hearing on AI by using an AI-generated voice that sounded like his. The voice was reading a script generated by artificial intelligence.
Braun said, “When you can replicate a voice to the extent I couldn’t tell if that was Senator Blumenthal or a replication… just imagine.
“That is a tool the scammers never had” at their disposal in the past. But now scammers are using it regularly.
Advice for the FTC
What do Braun and other politicians want the FTC to do about this? For one, they want the agency to update its “educational and awareness” materials.
That would help seniors understand that scammers use artificial voices to steal their money.
They also want the FTC to air public service messages to raise awareness of the problem, and to maintain a hotline people can use to file complaints about voice-cloning calls.
Braun added that it’s a red flag when the inventors of AI warn about how it might be used. He said, “I’ve never seen any new technology… where the people that created it have been more worried about how you use it.”
Biden’s ‘Voice’ Targets Voters
Recently, the Federal Communications Commission (FCC) made robocalls that use AI-generated voices illegal.
Not that this decision will stop these calls. But it could slow them down. The commission was partly influenced by political calls made in January.
The calls used a voice like Biden’s, encouraging New Hampshire voters to skip the primary election.
FCC Chairwoman Jessica Rosenworcel made this statement in a press release. “Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters.”
Protect Yourself From Scams
What should you do if someone communicates with you, asking for money? Or for personal information?
If it’s someone you know (or think you know), tell them you will call them back. Then dial the number you already have for that person, not one the caller provides. That callback may reveal the original call was a fraud.
If it’s someone you know of and trust, ask them to mail their request to you. Penalties for fraudulent solicitation through the mail are harsher than those for phone calls, so scammers are reluctant to put their pitch in writing.
If they ask for your address, don’t provide it. Anyone who genuinely knows you would already have it, so not knowing your address is a sign they’re trying to scam you.
We’d all rather be trusting than suspicious. But we also need to protect ourselves. Be on guard against scammers.