As AI technology continues to advance, so do the tactics scammers use. In a recent phone scam, several people (I've seen at least three videos) were tricked into believing that a family member had been killed in a car crash, using an AI-generated clone of their relative's voice. The scammer then requested money.
While the cases above impersonated a family member, this tactic will very likely be used against businesses soon as well. Scammers will clone the voice of a CEO, manager, or director and use it to call clients, employees, or partners and ask them to make cash transfers.
It's very likely that the crypto community will see scammers developing similar ways to target us.
Here's what I advise you to do about it:
✅ Make all your family members aware of voice-cloning AI scams. Explain to the more vulnerable ones that you will never call them asking for money, and agree on a process they can use to double-check any such call they receive.
✅ If you're ever unsure about a request made to you, the easiest thing to do is hang up and video call the person making the request, or ask them a “safe question” to verify they are who they say they are.
As we move into an AI-powered world, we all need to be more vigilant, because what we see and hear might not be real.