Has any of this happened to you?

A video call that didn’t feel quite right. A voice message that sounded almost—but not quite—like someone you know. A request for money or sensitive info that came out of nowhere.

If yes, you’re not imagining things. Phishing scams have evolved. With AI, attackers can now generate fake voice calls, fake video chats, and deepfakes so eerily realistic they could fool even your own mother!

Scammers are getting bold these days. They might show up pretending to be your CEO approving an “urgent” payment, an IT person who suddenly needs access “right now,” or even a family member in a full-blown panic asking for help.

If you get a voice or video call from someone you think you know, but the number looks unfamiliar and they launch straight into high-pressure demands for information or money, pause. Take a breath. Channel your inner detective.

Listen for the weird stuff. Maybe the timing is off, the sentences sound like they were ironed flat, or the whole emotional vibe feels a little robotic: unnatural pauses, way-too-crisp wording that doesn’t match how that person normally talks. On video, keep an eye out for lighting that forgot to follow the laws of physics, shadows that refuse to cooperate, or a hairline and facial features that seem to be negotiating with reality. Basically, look anywhere AI still struggles to pass as a functioning human being.

To verify, ask a question only the real person would know the answer to. If anything feels even slightly off, hang up and reach out to them through a separate, trusted contact method, like a phone number you already have saved. A few seconds of healthy skepticism can save you a lot of money and even more stress.