
AI-generated scams are expected to rise significantly in 2025, making it increasingly challenging to identify fraudulent content. In the past, scams could often be detected through obvious clues like spelling errors, poor grammar, and awkward punctuation. However, advancements in AI technology have eliminated these telltale signs. AI can now create highly realistic and personalized content, including texts, images, and voice simulations, blurring the line between legitimate and fraudulent communications.
One of the oldest scams still circulating is the “grandparent scam.” In this scheme, a scammer pretends to be a distressed relative, claiming to need money urgently—often for bail or an emergency. Traditionally, the caller’s voice might not have sounded quite like the real relative, prompting questions like, “Johnny, is that you?” However, with technological advances, scammers can now use AI to clone a relative’s voice using a short audio sample, often obtained from social media. When “Johnny” calls, it sounds convincingly like him, making the scam even more believable.
Ultimately, the scammer manipulates the grandparent into sending money.
Always use caution if you are being pressured for information or to send money quickly. Scammers often try to bully victims into transferring money via mobile payment apps, wire transfers, gift cards, or money orders. Some may even request an in-person meeting to collect the money. If you get a call like this, hang up and report it immediately to local law enforcement.
Want to know more about how to spot an AI scam? Click here.
Your feedback matters. Is there information about scams or fraud you would like the Scam Spotter to cover? Please send your requests and feedback to amc@denverda.org