Why the PBBM video going viral should raise questions, not conclusions
- Art Samaniego
- #AI, Deepfake, President Ferdinand Marcos Jr.
DECODED: TECH, TRUTH, AND THREATS
I am speaking up about this video not because it is viral or dramatic, but because it exposes a problem that should not be ignored. It raises serious questions about authenticity, manipulation, and how easily false content can pass as real when people are not looking closely.
I watched the video once, then watched it again. And then I replayed it a few more times. At first glance, everything looks normal.
That is what makes it dangerous. It is only after repeated viewing that a small but critical detail becomes visible: what looks like a harmless glitch, a sudden "jump" that most people would easily miss.
In that moment, the face suddenly disappears. Not because the person turned away. Not because something blocked the camera. It simply vanishes in an instant, like a scene in The Flash.
Anyone who has ever taken a real video knows this does not happen. Faces do not switch off. Cameras do not erase people.
This is not a harmless glitch. It is a sign of an AI-generated video.
As the co-founder of Scam Watch Pilipinas, I have a responsibility to speak up when content like this appears online. Scams today are no longer limited to fake messages, phishing links, or bogus investments.
Deepfakes have become part of the scam ecosystem. They are used to mislead, to manipulate emotions, and to make people doubt what is real.
What makes this even more concerning is that the video simply stops showing the face while the background and other elements stay intact.
As the face becomes clearer and viewers are in a better position to check its authenticity and likeness, the system creating the video appears to lose confidence. Instead of continuing to show a face that could be closely examined, it simply stops showing one at all. Real cameras do not behave this way.
Even in poor lighting, a real camera still records something, even if the image looks grainy, messy, or unpleasant. It does not selectively remove a person’s face.
In artificial intelligence, when the system can no longer confidently generate a face that looks like the person it is trying to imitate, it chooses to remove the face entirely.
This happens because the model’s top priority is likeness. A distorted or obviously wrong face would reveal the fake. Removing the face becomes the safer option.
A deepfake does not need to steal money immediately to be dangerous. It steals something just as important. Trust.
When people can no longer tell if a video is real or fake, they begin to question everything. Real victims are doubted. Real evidence is dismissed. Real people are accused of things they never did. That loss of trust affects families, schools, communities, and even institutions.
Some may argue that this is just technology or just content. That is exactly the problem. When fake videos are normalized, scammers gain cover. They can hide behind uncertainty and confusion. “That was not me” becomes easier to claim, even when the truth matters.
The sudden jump in this video is not a small detail. It is a warning sign. It shows how AI-generated content can fail in subtle ways, often only visible when someone takes the time to look closely.
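To make the idea concrete, here is a minimal Python sketch of how this kind of anomaly could be flagged automatically. It assumes you already have a per-frame record of whether a face detector found a face (for example, from OpenCV or any detector run over the clip); the function name and thresholds are hypothetical, invented for this illustration, not taken from any specific forensic tool.

```python
def find_face_dropouts(face_present, min_gap=1, max_gap=15):
    """Flag short spans where a face abruptly vanishes and then
    reappears -- the sudden "jump" described above.

    face_present: list of booleans, one per video frame; True means a
    face detector found a face in that frame.
    Returns a list of (start_frame, end_frame) spans, inclusive, where
    the face disappeared for between min_gap and max_gap frames.
    """
    dropouts = []
    gap_start = None
    for i, present in enumerate(face_present):
        if not present and gap_start is None:
            gap_start = i  # the face just vanished
        elif present and gap_start is not None:
            gap_len = i - gap_start
            # Only short gaps bounded by detections on both sides are
            # suspicious; a long absence could be a real scene change,
            # and a gap at the very start of the clip proves nothing.
            if gap_start > 0 and min_gap <= gap_len <= max_gap:
                dropouts.append((gap_start, i - 1))
            gap_start = None
    return dropouts

# Example: face visible for 10 frames, gone for 3, then back.
frames = [True] * 10 + [False] * 3 + [True] * 10
print(find_face_dropouts(frames))  # [(10, 12)]
```

A real camera losing the subject usually shows a gradual cue (motion blur, occlusion, a cut), so an instantaneous one-sided disappearance like the span above is exactly the pattern worth examining frame by frame.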
We must call these things out early, calmly, and clearly. Not to shame, not to panic, but to educate. The fight against scams now includes teaching people how to question what they see, not just what they read.
If we allow deepfakes to spread without challenge, we are not just tolerating fake videos. We are allowing the erosion of trust itself.
And once trust is gone, scams do not just succeed. They thrive.
(I shared my observation with friends in law enforcement, as well as with some of the people who believed the video and helped circulate it. I did this deliberately. I wanted to hear how trained investigators would explain the anomaly, and how ordinary viewers who trusted the clip would defend it. I am now waiting for their responses. By comparing these explanations side by side, I hope to arrive at the most credible and grounded understanding of what the video really is, and to expose where assumptions collapse when confronted with facts.)
