Deep Fakes: How Someone Can Ruin Your Life With Your Stolen Personal Information

Deep fakes — ever heard of this term?

What if I told you that you’ve used the technology yourself at least once?

Yes, that’s right. Those face-swap pictures you took with Snapchat rely on the same machine-learning techniques that power deep fakes, at least on some level.

News about deep fakes is all over the internet. You may have seen countless fake videos of politicians and celebrities on YouTube that look disturbingly real.

Just a few days back, a series of fake videos showing Tom Cruise in awkward situations made headlines across the internet. With the help of machine learning, a TikToker posting under the username ‘deeptomcruise’ racked up millions of views.

Deep faking involves taking someone’s face and digitally mapping it onto another person’s body in a photo or video. It may seem harmless, but it can be quite sinister if you think about it.
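
To make the idea concrete, here is a crude, classical computer-vision sketch of that face-mapping step. It is only an illustration of the concept: real deep fakes are produced by neural networks (autoencoders or GANs) trained on many photos of the target, and the file names below are placeholders.

```python
# A crude illustration of the "face mapping" idea, NOT the neural-network
# (autoencoder/GAN) pipeline real deep fakes use. Image paths are placeholders.
import cv2
import numpy as np

# Haar cascade face detector that ships with OpenCV
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def largest_face(image):
    """Return (x, y, w, h) of the biggest detected face, or None."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return max(faces, key=lambda f: f[2] * f[3]) if len(faces) else None

source = cv2.imread("source_face.jpg")    # face to copy (placeholder)
target = cv2.imread("target_photo.jpg")   # photo to paste it onto (placeholder)

src_box, dst_box = largest_face(source), largest_face(target)
if src_box is not None and dst_box is not None:
    sx, sy, sw, sh = src_box
    dx, dy, dw, dh = dst_box
    # Resize the source face to fit the detected face region in the target
    face = cv2.resize(source[sy:sy + sh, sx:sx + sw], (dw, dh))
    # Blend the pasted face so the lighting roughly matches the target photo
    mask = 255 * np.ones(face.shape, face.dtype)
    center = (int(dx + dw // 2), int(dy + dh // 2))
    result = cv2.seamlessClone(face, target, mask, center, cv2.NORMAL_CLONE)
    cv2.imwrite("swapped.jpg", result)
```

A genuine deep fake pipeline replaces this crude paste with a learned face model, which is exactly why a handful of social-media photos can be enough raw material.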

For instance, what if someone stole your personal information and created a deep fake of you? They could quite literally ruin your life.

Beyond harmless entertainment, deep fakes are also being used to seek revenge. According to The New York Times, deep fakes pose a serious risk to women and children.

These days, it’s not uncommon to see fake face-swapped adult videos of unsuspecting victims online. According to an article published by CSO, many celebrities have expressed horror upon seeing their face mapped onto an adult star’s body.

To create a realistic adult video using deep fake technology, all someone needs is a couple of photos of the victim pulled from social media.

Yes, that is scary.

As Nina Schick, author of “Deepfakes: The Coming Infocalypse,” has said, as deep fake technology advances, very little source material will be needed to put women and children at risk.

Speaking of putting women at risk, according to BuzzFeed News, photos of more than 680,000 women have been uploaded to a bot on the messaging app Telegram to generate realistic nude images. Even though the bot uses a relatively simple deep fake algorithm, it needs only a single photo to produce a convincing computer-generated image of the victim.

These are the scary and twisted times we live in.

How Easy Is It For Someone To Steal Your Personal Information?

So it’s pretty easy to get hold of someone’s photos online, but how easy is it for someone to steal your personal information?

Let’s find out…

Identity theft is a real issue. It’s when someone uses your name, Social Security number, or date of birth for their own financial gain. With your personal information and a deep fake video of you, who knows what havoc a sadistic individual can wreak.

Despite what we like to believe, it is not all that hard for someone to steal our personal information. To put the scale of the problem in perspective, according to one cybersecurity guide, more than 2,200 cyberattacks take place every single day.

Just to give you an idea, here’s how someone can potentially steal your information.

Through Your Mail

Whether it’s junk mail or important confidential mail, thieves love to steal it right from your mailbox. By going through items like your bank statements, identity thieves can learn a lot about you. To get even more information, these crooks will sometimes submit a change-of-address request at your local post office to have your mail redirected to their address.

Through Phishing

Phishing involves stealing personal information through email. By sending an authentic-looking email that appears to come from a legitimate organization, scammers can lure victims into giving up personal information such as bank or credit card numbers and passwords.
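
Part of what makes these emails look authentic is the sender’s domain, which is often registered to be nearly identical to a real one. As a rough illustration (the trusted-domain list and the 0.8 similarity threshold below are made up for the example), a few lines of Python can flag lookalike sender domains:

```python
# Rough sketch for flagging lookalike sender domains, e.g. "paypa1.com"
# imitating "paypal.com". Trusted domains and the threshold are illustrative.
from difflib import SequenceMatcher

TRUSTED_DOMAINS = {"paypal.com", "amazon.com", "yourbank.com"}

def looks_like_phishing(sender: str) -> bool:
    """Return True if the sender's domain closely imitates a trusted one."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain in TRUSTED_DOMAINS:
        return False                      # exact match: genuinely from the org
    for trusted in TRUSTED_DOMAINS:
        similarity = SequenceMatcher(None, domain, trusted).ratio()
        if similarity > 0.8:              # close but not identical: suspicious
            return True
    return False

print(looks_like_phishing("support@paypa1.com"))   # True  (lookalike domain)
print(looks_like_phishing("support@paypal.com"))   # False (exact match)
print(looks_like_phishing("friend@example.org"))   # False (unrelated domain)
```

Real mail filters combine many more signals than this, but the principle is the same: an address that is almost, but not quite, the real thing is a warning sign.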

Through Your Wallet Or Handbag

Your wallet or purse holds your driver’s license, your ID, and other important documents, which makes it a jackpot for any crook. If your wallet or handbag gets stolen, your identity can easily be stolen with it.

How To Identify Deep Fakes

Deep fake videos and images are hard to detect. You’ll need to focus closely on minor details to figure out whether the video you’re watching is real or fake. Poorly edited videos, however, can be spotted easily.

When analyzing a suspected deep fake video, look for these signs:

  1. Awkwardly positioned or inconsistent shadows.
  2. Eyes that rarely or never blink (a rough automated blink check is sketched after this list).
  3. The original face flickering into view in some frames.
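
The blinking cue can even be checked automatically by tracking the eye aspect ratio (EAR) across video frames. Below is a minimal sketch of that idea using OpenCV and MediaPipe’s FaceMesh; the landmark indices and the 0.2 blink threshold are commonly used values but should be treated as assumptions to verify, and the video path is a placeholder.

```python
# Minimal blink-rate check using the eye aspect ratio (EAR).
# Natural footage shows regular blinks (EAR dipping below ~0.2); some deep
# fakes blink rarely or not at all. Landmark indices, the 0.2 threshold,
# and the video path are illustrative assumptions.
import cv2
import mediapipe as mp
import numpy as np

RIGHT_EYE = [33, 160, 158, 133, 153, 144]  # FaceMesh points around one eye

def eye_aspect_ratio(pts):
    """EAR = (sum of vertical lid distances) / (2 * horizontal eye width)."""
    p1, p2, p3, p4, p5, p6 = pts
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

cap = cv2.VideoCapture("suspect_video.mp4")   # placeholder path
blinks, frames, eye_open = 0, 0, True

with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as face_mesh:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames += 1
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if not result.multi_face_landmarks:
            continue
        lm = result.multi_face_landmarks[0].landmark
        pts = np.array([(lm[i].x, lm[i].y) for i in RIGHT_EYE])
        ear = eye_aspect_ratio(pts)
        if ear < 0.2 and eye_open:        # eye just closed -> count a blink
            blinks, eye_open = blinks + 1, False
        elif ear >= 0.2:
            eye_open = True

cap.release()
print(f"{blinks} blinks over {frames} frames")  # near-zero blinks in a long clip is a red flag
```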

Software that generates deep fakes keeps getting better, and detecting such videos is becoming extremely hard. Things have gotten so out of hand that the Defense Advanced Research Projects Agency (DARPA) is funding researchers to develop technology that can reliably detect deep fake videos.

There is good news, though. Researchers from Binghamton University, State University of New York, have teamed up with Intel to develop a deep fake detection tool called FakeCatcher, which reportedly achieves an accuracy rate of 99.39%.

Without an efficient deep fake detection tool, we may soon have to distrust everything we see or hear online.

Information Is Key

Deep fake videos featuring celebrities get millions of views online. For example, a video from the Jimmy Kimmel show gained 10 million views in just a couple of months. These videos are made for entertainment, and that is exactly what makes them seem harmless.

Although deep fake technology should raise concerns about its potential for harm, the comedic nature of most of this content means it is largely perceived as harmless, and that shouldn’t be the case. Let’s face it: if the person in that funny YouTube video or some obscene adult video were replaced by someone you know, things would get scary very quickly.

There is still a silver lining, though. With deep fakes constantly making news, awareness of such videos is spreading rapidly. If not the majority, at least a growing minority is aware that videos can be forged to spread false information.

Author Bio: Sebastian Riley is an independent cybersecurity consultant and writer at thevpnexperts, working to fight online censorship. He is also a passionate speaker who enjoys spending his time educating people about emerging cybersecurity threats.

