AI tools like ChatGPT, Microsoft Copilot, and Google Bard are very useful, but this new technology has a dark side that you should be concerned about.
Companies and the general public alike are starting to use AI to make things easier, but so are scammers. In addition to writing more convincing phishing emails, scammers are now using AI to create deepfakes of their victims' loved ones' voices to use in their attacks.
In addition to regular scam calls, be on the lookout for calls that appear to be from friends and family. It may seem rude to question the legitimacy of a caller in an emergency, but doing so may save you from falling victim to the new "I've been in an accident" scam, which is growing in popularity due to advances in AI.
In a new blog post, Malwarebytes elaborates on a San Francisco Chronicle article about a family that was nearly taken in by one of these AI-powered scams.
The family received a call, supposedly from their son, who said he had been in a car accident and had hurt a pregnant woman. As Malwarebytes points out, this type of scam is becoming more common, and the story is not always a car accident: the familiar voice on the other end of the phone may instead claim to have been unexpectedly hospitalized or to have suffered some other kind of tragedy.
Like online scams, this strategy is used to create a sense of urgency so that potential victims act quickly, before they have a chance to think deeply about what is actually happening. A few years ago, this type of scam was easy to spot, but with the rapid evolution of AI, this is no longer the case.
Now it is very easy for scammers to take clips from videos on social media and convincingly fake the voice of a loved one. Worse, according to FBI Special Agent Robert Tripp, who spoke to the San Francisco Chronicle, the AI tools used to fake voices are "available in the public domain for free or very cheap".
After the first call regarding the accident, a second call came in from someone posing as the legal representative of the son in question, asking for bail money. Fortunately, the family became suspicious when this purported attorney said he would send a courier to pick up the bail money.
AI-enabled scam calls are not likely to go away anytime soon. So to keep yourself, your friends, and your family from falling for these scams, you need to learn how to recognize them while making it harder for scammers to impersonate you.
For starters, don't answer calls from unfamiliar or private numbers. If you answer a call from such a number because you think it might be an important call you are expecting, take what the caller says with a grain of salt. Likewise, do not give personal or financial information to strangers over the phone.
If you receive such a call, it is a good idea to contact the family member or loved one directly to see if it is really an emergency. If you cannot reach them, try calling someone who might know where they are.
You should also notify the police immediately to prevent others from becoming victims of this type of scam. In the United States, you can report suspicious activity by contacting the FBI's Internet Crime Complaint Center (IC3) directly. There, you can file a complaint or view the FBI's FAQs for more information on the different types of scams currently making the rounds.
Like any tool, AI can be used for good or bad. Hopefully, companies and governments will find ways to use their own AI tools to combat the cybercriminals behind these scams.