Naija Global News
    Science

    How digital forensics could prove what’s real in the age of deepfakes

By onlyplanz_80y6mt | January 24, 2026 | 8 min read

(Image credit: SmileStudioAP/Getty Images)


    Imagine this scenario. The year is 2030; deepfakes and artificial-intelligence-generated content are everywhere, and you are a member of a new profession—a reality notary. From your office, clients ask you to verify the authenticity of photos, videos, e-mails, contracts, screenshots, audio recordings, text message threads, social media posts and biometric records. People arrive desperate to protect their money, reputation and sanity—and also their freedom.

    All four are at stake on a rainy Monday when an elderly woman tells you her son has been accused of murder. She carries the evidence against him: a USB flash drive containing surveillance footage of the shooting. It is sealed in a plastic bag stapled to an affidavit, which explains that the drive contains evidence the prosecution intends to use. At the bottom is a string of numbers and letters: a cryptographic hash.

    The Sterile Lab


    Your first step isn’t to look at the video—that would be like traipsing through a crime scene. Instead you connect the drive to an offline computer with a write blocker, a hardware device that prevents any data from being written back to the drive. This is like bringing evidence into a sterile lab. The computer is where you hash the file. Cryptographic hashing, an integrity check in digital forensics, has an “avalanche effect” so that any tiny change—a deleted pixel or audio adjustment—results in an entirely different code. If you open the drive without protecting it, your computer could quietly modify metadata—information about the file—and you won’t know whether the file you received was the same one that the prosecution intends to present. When you hash the video, you get the same string of numbers and letters printed on the affidavit.
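The avalanche effect described above is easy to demonstrate with Python's standard `hashlib`; this is a minimal sketch, with the byte strings standing in for real evidence files:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of `data` as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Two inputs that differ by a single trailing byte.
original = b"surveillance footage bytes ..."
tampered = b"surveillance footage bytes .."

h1 = sha256_hex(original)
h2 = sha256_hex(tampered)

# The two digests share no visible structure: changing even one
# byte of the input changes roughly half of the output bits.
print(h1)
print(h2)
print(h1 != h2)  # True
```

Comparing the printed digests by eye is exactly what the affidavit check amounts to: if even one character differs from the string on the paper, the file is not the file.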

    Next you create a copy and hash it, checking that the codes match. Then you lock the original in a secure archive. You move the copy to a forensic workstation, where you watch the video—what appears to be security camera footage showing the woman’s adult son approaching a man in an alley, lifting a pistol and firing a shot. The video is convincing because it’s boring—no cinematic angles, no dramatic lighting. You’ve actually seen it before—it recently began circulating online, weeks after the murder. The affidavit notes the exact time the police downloaded it from a social platform.
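The copy-and-verify step can be sketched the same way: stream both files through SHA-256 and compare the digests with the one recorded on the affidavit before any analysis begins. A minimal sketch, with hypothetical file paths:

```python
import hashlib
from pathlib import Path

def hash_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large videos never need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def verify_copy(original: Path, working_copy: Path, affidavit_digest: str) -> bool:
    """The working copy is usable only if all three digests agree:
    original drive, forensic copy, and the digest printed on the affidavit."""
    return hash_file(original) == hash_file(working_copy) == affidavit_digest.lower()
```

Only after `verify_copy` returns `True` does the original go into the archive and the copy onto the workstation.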

    Watching the grainy footage, you remember why you do this. You were still at university in the mid-2020s when deepfakes went from novelty to big business. Verification firms reported a 10-fold jump in deepfakes between 2022 and 2023, and face-swap attacks surged by more than 700 percent in just six months. By 2024 a deepfake fraud attempt occurred every five minutes. You had friends whose bank accounts were emptied, and your grandparents wired thousands to a virtual-kidnapping scammer after receiving altered photos of your cousin while she traveled through Europe. You entered this profession because you saw how a single fabrication could ruin a life.

    Digital Fingerprints

The next step in analyzing the video is to run a provenance check. In 2021 the Coalition for Content Provenance and Authenticity (C2PA) was founded to develop a standard for tracking a file’s history. C2PA Content Credentials work like a passport, collecting stamps as the file moves through the world. If the video had any, you could trace its creation and modifications. But most platforms and device makers have been slow to adopt the standard, and Content Credentials are often stripped as files circulate online. In a 2025 Washington Post test, journalists attached Content Credentials to an AI-generated video, and every major platform they uploaded it to stripped the data.

    Next you open the file’s metadata, though it rarely survives online transfers. The time stamps don’t match the time of the murder. They were reset at some point—all are now listed as midnight—and the device field is blank. The software tag tells you the file was last saved by the kind of common video encoder used by social platforms. Nothing indicates the clip came directly from a surveillance system.

    When you look up the public court filings in the homicide case, you learn that the owner of the property with the security camera was slow to respond to the police request. The surveillance system was set to overwrite data every 72 hours, and by the time the police accessed it, the footage was gone. This is what made the video’s anonymous online appearance—with the murder shown from the exact angle of that security camera—a sensation.

    The Physics of Deception

You begin the Internet sleuthing that investigators call open-source intelligence, or OSINT. You instruct an AI agent to search for an earlier copy of the video. After eight minutes, it delivers results: a copy posted two hours before the police download, carrying a partial provenance record indicating that the recording was made with a phone.

The reason you are finding C2PA data at all is that companies such as Truepic and Qualcomm have developed ways for phones and cameras to cryptographically sign content at the point of capture. One thing is now clear: the video didn’t come from a security camera.

    You watch it again for physics that don’t make sense. The slowed frames pass like a flip-book. You stare at shadows, at the lines of an alley door. Then, at the edge of a wall, light that shouldn’t be there pulses. It’s not a light bulb’s flicker but a rhythmic shimmer. Someone filmed a screen.

    The flicker is the sign of two clocks out of sync. A phone camera scans the world line by line, top to bottom, many times each second, whereas a screen refreshes in cycles—60, 90 or 120 times per second. When a phone records a screen, it can capture the shimmer of the screen updating. But this still doesn’t tell you if the recorded screen showed the truth. Someone might have simply recorded the original surveillance monitor to save the footage before it was overwritten. To prove a deepfake, you have to look deeper.
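The "two clocks out of sync" intuition can be put in numbers. When a camera films a screen, the visible shimmer appears at the beat frequency, the difference between the screen's refresh rate and the camera's capture rate. A minimal sketch, with illustrative rates:

```python
def beat_frequency(refresh_hz: float, capture_hz: float) -> float:
    """Apparent flicker rate when a camera films a screen:
    the absolute difference between the two clocks."""
    return abs(refresh_hz - capture_hz)

# A 60 Hz monitor filmed by a phone capturing at 59.94 frames per
# second (NTSC timing) produces a slow shimmer near 0.06 Hz: a band
# that drifts across the frame over many seconds, which is exactly
# the kind of rhythmic pulse visible at the edge of the wall.
shimmer = beat_frequency(60.0, 59.94)
```

When the two clocks match exactly, the beat frequency is zero and the shimmer disappears, which is why some screen recordings show no flicker at all.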

    Artifacts of the Fake

    You check for watermarks now—invisible statistical patterns inside the image. For instance, SynthID is Google DeepMind’s watermark for Google-made AI content. Your software finds hints of what might be a watermark but nothing certain. Cropping, compression or filming a screen can damage watermarks, leaving only traces, like those of erased words on paper. This doesn’t mean that AI generated the whole scene; it suggests an AI system may have altered the footage before the screen was recorded.

    Next you run it through a deepfake detector like Reality Defender. The analysis flags anomalies around the shooter’s face. You break the video apart into stills. You use the InVID-WeVerify plug-in to pull clear frames and do reverse-image searches on the accused son’s face to see if it appeared in another context. Nothing comes up.

    On the drive is other evidence, including more recent footage from the same camera. The brickwork lines up with the video. This isn’t a fabricated scene.

You return to the shooter’s face. The alley’s lighting is harsh, casting a distinct grain. His jacket, his hands and the wall behind him all share that coarse digital noise, but his face doesn’t. It’s slightly smoother, as if it came from a cleaner source.

    Security cameras give moving objects a distinct blur, and their footage is compressed. The shooter has that blur and blocky quality except for his face. You watch the video again, zoomed in on only the face. The outline of the jaw jitters faintly—two layers are ever so slightly misaligned.

    The Final Calculation

    You move back to when the shooter appears. He raises the weapon in his left hand. You call the woman. She tells you her son is right-handed and sends you videos of him playing sports as a teenager.

Lastly you go to the alley. The building’s maintenance records list the camera as mounted 12 feet high. You confirm its height and measure its downward angle, using basic trigonometry to calculate the shooter’s height: three inches taller than the woman’s son.
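The height calculation can be reproduced from two angle measurements taken off the mounted camera: the angle of depression to the shooter's feet fixes the horizontal distance, and the shallower angle to the top of the head then gives the height. A minimal sketch; the 12-foot mount comes from the maintenance records, but the angles here are illustrative:

```python
import math

def subject_height(camera_height_ft: float,
                   depression_to_feet_deg: float,
                   depression_to_head_deg: float) -> float:
    """Estimate a subject's height from a fixed, downward-tilted camera.

    The angle to the feet gives the horizontal distance to the subject;
    the shallower angle to the head gives how far below the camera the
    head sits. Height = camera height minus that drop.
    """
    distance = camera_height_ft / math.tan(math.radians(depression_to_feet_deg))
    head_drop = distance * math.tan(math.radians(depression_to_head_deg))
    return camera_height_ft - head_drop

# 12 ft mount, subject about 20 ft away (illustrative angles)
estimate = subject_height(12.0, 30.96, 16.7)  # ≈ 6.0 ft
```

The estimate is only as good as the angle measurements, which is why the narrator walks the alley rather than trusting the footage alone.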

    The video makes sense now—it was made by cloning the son’s face, using an AI generator to superimpose it on the shooter and recording the screen with a phone to remove the generator’s watermark. Cleverly, whoever did this chose a phone that would generate Content Credentials, so viewers would see a cryptographically signed claim that the clip was recorded on that phone and that no edits were declared after capture. By doing this, the video’s maker essentially forged a certificate of authenticity for a lie.

    The notarized document you will send to the public defender won’t read like a thriller but like a lab report. In 2030 a “reality notary” is no longer science fiction; it is the person whose services we use to ensure that people and institutions are what they appear to be.
