We all know people who have forked over thousands – or even tens of thousands – of dollars to scammers. Perhaps you’ve been defrauded by these criminals yourself.
Avoiding scams is about to get a whole lot tougher due to artificial intelligence (AI).
In the past, it was often obvious that it wasn’t actually the IRS emailing you and demanding immediate payment. While IRS agents aren’t necessarily known for their eloquent prose, communications from the agency usually aren’t littered with grammatical mistakes and broken English.
And many seniors who’ve been woken up at 3 a.m. by their “grandsons” pleading for cash because they’d supposedly been arrested in a foreign country were astute enough to recognize that the voice on the line wasn’t their grandchild’s.
AI could change all that.
With AI, scammers can use information about you to produce official-sounding emails that appear to come from banks or government agencies.
Software that can replicate the voices and videos of loved ones can make a convincing case that they really are in trouble and need cash. Scammers may even capture your voice to fool your bank.
And one of the biggest scams out there involves love: A scammer pretends to be romantically interested in the victim and persuades them to send money. AI will be able to take available information on the victim and create a Casanova or femme fatale who seems to know everything about them and can sweep them off their feet.
AI scams are expected to cost victims $100 billion in the next 18 months.
So here are a few ways to avoid them.
No gift cards or crypto: No legitimate business or investment in the world will ever ask you to send gift cards. Buying a gift card for a loved one or friend is fine. But if you are ever asked to purchase gift cards and send them… run – don’t walk – away. It is a scam.
The same can mostly be said about crypto. For the most part, being asked to send crypto is a giant red flag. There may be a few exceptions, such as businesses in the crypto industry. But criminals and hackers favor cryptocurrencies because payments are difficult to trace to a real person and nearly impossible to reverse. Never convert cash to crypto to send to someone.
Have some family secrets: AI voice scams are going to be highly sophisticated. With just a few snippets of a loved one’s recorded voice, fraudsters may soon be able to hold an entire conversation with you while pretending to be someone you know.
Have an in-person discussion with your family about a code word or specific family event that can be referenced if one of you receives a phone call like this.
For example…
Caller: Grandma, I’ve been arrested. Please don’t tell Dad, but I need $1,000.
You: Bobby, what’s the code word?
Or…
You: Bobby, where did we celebrate your 4th birthday?
If you use a family event or story, it should be something that can’t be found online or on social media – a detail from family lore that a criminal would have no way of knowing.
Avoid saying “yes” or “I agree”: If you don’t know who you’re speaking with, do not say the words “yes,” “I agree,” “I confirm” or anything similar. Scammers can use a recording of your voice to make requests to your financial institutions. And if they have your voice confirming a request to withdraw funds, that makes their job that much easier.
Don’t answer the phone if you don’t know the number: I never answer the phone if I don’t recognize the number. If it’s important, they’ll leave a message. It not only saves me from potential fraud but also saves me time and spares me the aggravation of dealing with cold callers, survey takers and anyone else I don’t want to talk to.
Government agencies will NOT email you if you’re in trouble: If you have a tax or Social Security issue, the government will notify you by mail. It will not email you asking you to click a link and send money. Remember, AI will know a lot about you, so these emails may look real. If you are ever in doubt, go directly to the agency’s website or call – and look up the number yourself. Don’t call a number included in the email.
Never give money to a romantic partner you haven’t met: Based on an online profile, you may think you’re communicating with a beautiful, sophisticated woman from Minneapolis, when in reality you’re talking to a 23-year-old dude in Indonesia. AI will allow these scammers to learn a lot about you and figure out what moves you emotionally.
Never, ever, ever send money to someone you haven’t met in person. Even phone calls may be AI-generated conversations. Don’t send money for them to come see you.
If your heart is bursting and you want to see them, go visit them. If they turn out not to be real, at least you’ll have a weekend to yourself in Minneapolis.
These scammers are getting more sophisticated every day. Keep your guard up.
If you have any tips on how to avoid being the victim of scams, leave them in the comments section.