AI Voice Impersonation Scams on the Rise

According to the Federal Trade Commission (FTC), imposter scams, in which a criminal impersonates someone to steal money or personal information from a target, continue to be one of the most common forms of fraud in the U.S. A reported $2.6 billion was lost to imposter scammers in 2022, and impersonation tactics have grown in sophistication along with advancing technology. While some scammers still attempt impersonation via email or text, a growing number are employing Artificial Intelligence (AI) to clone a person’s voice, call targets and trick them into revealing sensitive information. Read on to learn how title and real estate industry professionals and consumers can avoid falling victim to this rising crime.

WHAT IS AI VOICE IMPERSONATION?

With AI becoming more mainstream and affordable, more people are using it in the workplace and at home. However, its accessibility also makes it a dangerous tool for criminals looking to use the technology maliciously. AI voice impersonation (also known as voice cloning) is the creation of an artificial rendering of a person’s voice using AI tools or software. The software creates a synthetic, digital copy of a unique human voice, accounting for the person’s gender, accent, speech patterns, inflection and breathing. According to a recent report from computer security software company McAfee, these copies can achieve an 85 percent match to the real voice. The same report included other alarming statistics, including that, in some cases, just three seconds of audio is all that is needed to clone a person’s voice. Scroll through any social media platform or video messaging app, and it’s easy to find recordings of someone’s voice. In fact, McAfee found that 53 percent of adults surveyed share their voice online at least once a week.

Once a voice is copied, criminals can use the cloned version to activate voice-controlled digital devices, contact clients or business associates to extract money or personal information, or even attempt virtual kidnapping. Voice cloning has become so ubiquitous that the FTC recently released a Consumer Alert warning about the rise of “family emergency” schemes that pressure parents and grandparents into wiring money to a supposed child or grandchild in distress.

HOW TO AVOID BEING TRICKED

So, how does one combat such a chillingly persuasive method of fraud, particularly as a title or real estate professional interacting with clients and customers? Following these best practices can help you avoid falling victim to an AI voice impersonation scam:

  • Stay up-to-date on cybercrime schemes. The Federal Bureau of Investigation (FBI) provides resources on common scams and crimes, and how to recognize and avoid them.
  • Consider allowing unknown numbers to go to voicemail. If you receive a call from a number you don’t recognize, allowing it to go to your voicemail gives you the opportunity to research the credibility of the caller and their request.
  • Be mindful of what you share on social media. Voice cloning scammers might go through social media and video messaging apps to find their material. Think twice before sharing anything too personal, and consider adjusting your privacy settings to allow only friends and family access to your profile.
  • Recognize red flags. Telltale signs of a scam include a call from an unknown number, a suspicious or unusual request, a sense of extreme urgency and/or requests for money through hard-to-trace routes like wire transfers, gift cards or cryptocurrency.
  • Establish a codeword or phrase with clients or colleagues. Security experts recommend this because a voice-cloning scammer won’t know the codeword, helping you confirm whether a situation is a real emergency.
  • Always stop to verify the call. Even if a call appears to come from a trusted phone number, caller ID can be spoofed. ALWAYS call the person back at a number you know to be genuine to verify the caller’s story or request. If you can’t reach them, have others try to contact them by phone as well. Don’t worry about looking silly; it’s better to be safe than sorry.
  • Stay vigilant. Scammers bank on human error and emotions like panic or confusion. Always be cautious and stick to these best practices to protect yourself.

Though AI voice impersonation may seem like something out of a dystopian nightmare, knowing the signs, staying calm and always verifying the call yourself can help thwart a potential scam.

To learn more about imposter scams in the title and real estate industry, click here and here. Information on other common cybersecurity threats can be found here, and helpful resources for preventing wire fraud are available from the American Land Title Association.