54. The Receipts Are Fake, The Photos Are Lies: A Small Business Owner's Survival Guide with Mounir Ibrahim
Justin Shelley (00:00)
Welcome everybody to episode 54 of Unhacked, the show for business owners who know the best way to deal with cybercrime is what? To get unhacked? No, to prevent it. I'm Justin Shelley, joined by Bryan and Mario. Each week we break down the practical moves you need to protect your business. Clear, simple, proven moves that work. Listen in, take action, and keep your businesses unhacked. Guys, let's do some quick intros. Bryan, tell me who you are, what you do, and who you do it for.
Bryan Lachapelle (00:29)
Bryan Lachapelle with B4 Networks, based out of the beautiful Niagara region, Ontario, Canada. We help business owners remove the frustrations and headaches that come with dealing with cybersecurity and technology in their business.
Justin Shelley (00:41)
Excellent. I love it. Mario, you're up.
Mario Zaki (00:44)
Mario Zaki, CEO of Mastek IT, located in New Jersey, about 15 miles away from Manhattan. We try to stay out of there as much as we can. We work with small to medium-sized businesses and keep them safe from not just Justin's Russian hackers, but all the other hackers. And we specialize in giving business owners the ability to sleep better at night.
Justin Shelley (01:12)
Listen, I've only dealt with Russians. It's my only area of expertise. I know they're everywhere, though. In fact, I think we probably have two or three of them on this show right now, to be honest. So...
Mario Zaki (01:17)
There's ones in China too. If you ask my nine-year-old son, though, they're usually not from Mexico. He doesn't think Mexicans could ever be hackers.
Justin Shelley (01:34)
I don't know where to go with that, but moving on, moving on. Like I said, Justin Shelley, CEO of Phoenix IT Advisors. And yes, I do. I protect businesses from the Russian hackers, from government fines and penalties, and from them greedy bastard attorneys that come Hoover up everything else if you do get hacked. So we want to keep you unhacked. Now, Mario, you've got us a special guest today. Tell us what's going on.
Bryan Lachapelle (01:37)
I'm gonna leave that.
Mario Zaki (01:59)
So Mounir Ibrahim, I've known this guy, he's more than just a friend of mine. I've known him for about 30 years now. He's one of my good brothers that I've known forever. We end up busting each other's chops on like a weekly basis. And when he found out that I had a podcast,
he was actually mad at me that I hadn't asked him to join. And honestly, I don't know, it's one of those things that just never clicked for me, that he'd be able to do this. But he was, you know, excited to do it, and I'm excited to have him on here. This guy's been all over, been doing so many different things, from...
Justin Shelley (02:30)
Hey, that makes two of us. Me too, by the way. Why have you been holding out? What the hell?
Mario Zaki (02:55)
He was a diplomat, you know, working in different countries for a while. But, you know, we'll get to all that stuff now.
Justin Shelley (03:05)
I want to know more about that. I'll let you guys run this, I do want to know more about the diplomat stuff.
Bryan Lachapelle (03:10)
Yeah.
Mounir (03:12)
Thanks for having me on. Mario, I've known him, like he said, for many, many years. Never get into a fantasy football league with him. He will collude his way to the top and make blockbuster trades that are impossible for anybody else. And don't sit down at a poker table with him. Other than that, he's great.
Bryan Lachapelle (03:26)
Hm.
Justin Shelley (03:33)
Other than that, he's a good guy.
Mario Zaki (03:34)
Other
than that.
Bryan Lachapelle (03:35)
Other than that.
Mario Zaki (03:38)
No, but honestly, it's a pleasure to have you on, Mounir. When you first told me that you would love to join the podcast, I was like, my God, where am I going to start with him? Right now you work with Truepic, which works deeply in preventing deepfakes and
stuff like that, which I'll have you explain in a little bit. You know, prior to that you were working for the US government as a diplomat. You did a TED talk on... you know, how long ago was that?
Mounir (04:24)
Man, that was 2019. God, that was six years ago. Six years ago, yeah.
Mario Zaki (04:27)
Yeah, you know, he's done it all. Like a week or two ago, we went out for a couple of drinks and we were just talking about this stuff for, I don't know, easily two, three hours. And it was great. So why don't you introduce yourself?
Mounir (04:53)
Yeah, thanks. Thanks guys for having me on. Mounir Ibrahim. I'm currently the chief communications officer and head of public affairs for Truepic. We're a technology company and we focus on image and data authentication. We see that as very much the antidote, the opposite side of the coin, to synthetic media and deepfakes, which we can get into. As Mario noted, I've been with Truepic now for seven years.
Prior to that, I was a foreign service officer with the US Department of State, that is, a US diplomat, for the better part of a decade. I served around the world in a variety of countries and conflict zones: Syria, Bogota, Colombia, shorter stints in Istanbul, Turkey, and my last stint was at the US Mission to the United Nations here in New York, which is where I'm based.
And it was really my time as a diplomat that led me into this world. And that might be a little counterintuitive. I do not have the technical chops that you guys do. I don't code, I've never built a computer. I still have trouble hooking up my kids' HDMI cables to the TV. But what I saw in my prior life as a diplomat is
the relevance and the importance that digital content, images and videos, played in, quite frankly, some of the highest levels of governmental and multilateral decision-making, where an image or a video from a conflict zone would lead to dramatic debate and serious life-and-death decisions. What I also saw in that time was that the
inability to authenticate, or prove, that an image or video is in fact authentic led to dramatic issues. Most specifically, when I was on the ground in Damascus, Syria in 2010, 2011, which is now ages ago but seems like yesterday to me. You may have heard of the term Arab Spring, which took over in 2010,
which was a wave of pro-democracy movements across the Middle East. And it happened in Syria. And I was the guy at the U.S. Embassy that was tracking all of this. I would go to the protests, observe them, come back and write the reports back to Washington. And I saw a lot of real stuff. I saw people being beaten, arrested, shot at. I've run from gunshots and AK-47s. And I saw this all with my own eyes.
Mounir (07:41)
We suspended operations, we shut down our embassy, and I continued to work that issue, the Syrian conflict, from Washington and a variety of other locations. But what I saw is those images and videos that people risked their lives to take with their smartphones would enter into public debate, like in the UN Security Council, only for bad actors, those who wanted to deny the reality on the ground, to just say, well, those images are fake.
Those videos are fake. Now, this is 2015, 2016, 2017, when I was at the UN. This is before AI. This is when we operated in a world of what's known as... have you ever heard the term cheap fakes?
Justin Shelley (08:26)
I have not, I saw that in your notes though, so I'm curious.
Mounir (08:29)
Yeah, so a cheap fake is largely considered a rudimentary change to an image or a video. Photoshop, right? Changing the metadata, the time or the date, cropping a face out and putting a face in. This is like old-school visual deception, which can still be highly, highly effective, by the way.
Mario Zaki (08:55)
Like me with
a six pack or something.
Mounir (08:57)
Yeah,
Justin Shelley (08:57)
No!
Mounir (08:58)
that's what I do on my wedding photos. I just do that. So this was at a time when it wasn't just as simple as entering a prompt and boom, now you have a realistic, lifelike image. So what we saw was, even back then, those arguments were highly, highly effective, where nation states would stop and stall entire investigations
Mounir (09:26)
into everything from attacks on schools and hospitals to chemical weapons use, all these things that I was dealing with, simply by calling into question the authenticity of that digital content. It drove me, and pretty much any other diplomat that worked on the issue, insane, but particularly me, because I was on the ground and I saw it happening. So don't tell me it's fake. I saw it, I ran from it. And I knew people there, I knew people that were, candidly, killed and jailed and tortured and...
Justin Shelley (09:45)
Yeah.
Mounir (09:56)
So it really drove me nuts, and I started thinking, there's gotta be a way to prove an image is real, right? Spoiler alert: there is no way to prove an image is real after it is captured. And we'll get into that. So I started looking around for some way to authenticate digital content, and I stumbled onto this company called Truepic. I see this TechCrunch article, I believe it was TechCrunch,
where it said San Diego startup raises $1 million seed funding to fight fake photos on Tinder. And here I am at the United Nations, and I find our CEO, who is now my boss, and I hit him up on LinkedIn. It's a two-person, three-person company at the time. And I said, hey, can I get your Tinder-fighting technology to the US government and to the Syrians on the ground? And he writes me back immediately and he's like, whoa, whoa, whoa.
Let's get on a phone call. And you know, what I found out is the founder and our CEO, they came from the private sector, right? They came from Goldman Sachs and the insurance world. And they saw the same problems I was seeing in life-and-death conflict zones. They saw it in insurance claims. They saw it in audits. They saw it in peer-to-peer commerce, where people were being defrauded daily simply because of fake digital content.
Mario Zaki (10:56)
Ha ha!
Bryan Lachapelle (10:56)
Hahaha!
Mounir (11:24)
And that's how I kind of made that bridge, that jump, from diplomacy into image authentication. And the one takeaway I would posit here is we rely on digital content in every aspect of life, whether it is a war crime, an online dating photo, or an insurance claim, and everything in between. We are literally making financial, personal, and even governmental decisions based
off the pixels and the MP4s we are watching. And if we cannot know the provenance, that's a key word, that means the history and the origins and the creation of that content, we are in trouble, whether personally, professionally, or from a governmental standpoint. So that's a very long, windy kind of intro there.
Mario Zaki (12:12)
Yeah. So it's not just, you know, businesses with videos that we have to protect out there, it's other businesses too, like you mentioned, insurance. I know we talked about this two weeks ago, you know, before we got too many drinks in. In the beginning of the conversation, we were talking about...
Mounir (12:31)
Yeah... before the conversation got good.
Mario Zaki (12:41)
You know, some of the customers for Truepic are insurance companies that no longer need to use people to go on site to assess the damage and stuff like that. They have it built into the platform where, you know, they can authenticate through there. Could you talk a little bit about that?
Mounir (13:00)
Yeah, sure. You know, at Truepic, we build and deploy workflow systems around our image authentication technology, right? That allows any organization or company to basically request and receive digital content, images, videos, audio, from any smartphone in the world. And that content coming back to them goes through our 30-plus
fraud prevention and authentication tests to ensure that, with what is served back to our client, they can have high integrity and high trust that this was the car accident at this date, at this time, at this location, that those pixels have not changed, and they can make an assessment in real time based on that content. So this has a variety of
utility based on different use cases, right? So one of the most basic areas of benefit is that almost every business on earth, and I'm sure the three of you know this better than me, is digitally transforming, right? Figuring out a way to do what was once arcane and very human-centric in some digital form. The pandemic just accelerated what was already happening, right?
Everything from food delivery to insurance claims to whatever it is, shopping. We used to go to a mall. Now we are doing it from our smartphone. And in that...
Mario Zaki (14:37)
Yes, tell that to the... Like
my wife. My wife.
Mounir (14:42)
The Amazon boxes, that's another conversation we have, the Amazon boxes that are on our doorstep every day. So in digital transformation, there is a very real piece that is fraud prevention, right? You want to make sure that if you're digitally transforming, you can actually trust the content you're working on. So there are a variety of customers of ours,
like companies that are merchant cash advance companies, right? They provide cash advances to businesses around the country, but they need to make sure that business actually exists, right? And normally they would kind of request some documentation and kind of hope and pray, because they work in really high-velocity, time-compressed environments.
So they didn't even have the luxury of sending out an inspector to go ensure that it exists. We can help them ensure it exists. So if Mario's Pizza Place said, I need a cash advance, they'll say, all right, Mario, we just sent you a link on your phone. That's through Truepic technology. Open it up. A camera will pop open. Take a selfie. Take a picture of your signage. Go to your point-of-sale system. Do a 30-second video walking around your pizza place.
Bryan Lachapelle (15:42)
Okay.
Mounir (16:05)
Within milliseconds, we are authenticating all of the data and the imagery, and it's going right back to the merchant cash advance company, where they can see it and have high faith that this actually is Mario, the guy I'm on the phone with just took those pictures, I see them, they're matching up to the application. Boom, you can advance them. So that's one kind of area, fraud prevention.
Bryan Lachapelle (16:25)
Okay, that was going
to be my question. The technology that you're using to take this video, you can't use existing video and authenticate it? You have to use a special app or a special process during the recording phase to authenticate it?
Mounir (16:41)
That's exactly right. It's called a controlled capture environment, right? You can't just upload pre-existing content. It is all on a forward-moving basis. You're going to snap that shutter button, and we are going to ensure that the device capturing that image does not have malware on it, does not have geo-spoofing technology, does not have anything on the operating system that would allow the injection of faulty data or information. We're going to ensure that the
pixels are captured, that photons are hitting the camera lens, and that you're not trying to pass off some synthetic data or image, that you are not holding your phone up to a camera screen or another screen and taking a picture of a picture. That's called a rebroadcast, one of the easiest ways to commit photo fraud: just take a picture of a picture and pass it off. Then there's your metadata and a variety of other checks, to ensure and give that lender faith that, okay,
Bryan Lachapelle (17:25)
Right. Yeah.
Mounir (17:38)
this really is coming from the pizza place. So that's fraud prevention; we have a variety of areas that we work on in the fraud prevention space. Now, another major benefit is efficiency. So we have a variety of partners that used to use a third party to go out to, let's say it's a different use case, an auto warranty use case.
We're heavily used in auto warranty. I don't know if you've ever had a car under warranty, you take it to the shop and they say, all right, an inspector is gonna come, it'll be about three days and we'll turn your car around. So the auto warranty company is gonna send an inspector to make sure that, I don't know, your tire or whatever it is, your carburetor, really is busted. And that, one, took about three days, and two, for every one of those inspections,
and I'm sure many of your listeners actually do this or use some sort of service, it's about one hundred and fifty dollars for the person to go out there and come back. What we do instead is they can say, okay, auto dealership, we just sent you a link, take a picture, and they can give them a list based on whatever it is, whether it's tire and wheel or whatever. And in a matter of, some people turn it around in one hour: snap, snap, snap, give the answers
Mounir (19:04)
to whatever the questions are in the app, and then they can approve that claim. So there's an efficiency there. They're lowering it from 150 to whatever the cost is, which is dramatically lower, and they're shortening the wait time from three days to, in some cases, an hour. So now you're increasing customer satisfaction and you're accelerating and driving efficiency in your own business. So I'd say image authentication as a whole has
two major areas: fraud prevention and efficiency. Now, if you were to wrap those two benefits up, it really means something else in my mind. I see that as foundational to how any business will operate in the AI era. Because we've established, and please push back if you disagree, almost every business in the world is digitally transforming, and they need to trust that digital content.
Mounir (20:02)
And without authentication, it's going to be very hard to compete if you're the slower one and the more expensive version of whatever your business is.
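For readers who want to picture what a controlled-capture check like the one Mounir describes might look like in practice, here is a minimal sketch. It is not Truepic's actual pipeline; every field name, threshold, and check here is an illustrative assumption about the kinds of signals (device attestation, mock-location flags, upload delay, location sanity, rebroadcast scoring) a capture-time verifier could combine.

```python
# Minimal sketch (not Truepic's real pipeline) of capture-time validation:
# every check runs at or right after the moment the photo is taken, and the
# submission is rejected if any fail. All names and thresholds are illustrative.
import math
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Capture:
    device_attested: bool      # OS/hardware attestation passed (no root/malware detected)
    gps_mock_enabled: bool     # mock-location / geo-spoofing setting detected
    captured_at: datetime      # timestamp reported by the capture SDK
    received_at: datetime      # timestamp when the server received the upload
    claimed_lat: float
    claimed_lon: float
    expected_lat: float        # location listed on the application being verified
    expected_lon: float
    rebroadcast_score: float   # 0..1 from a hypothetical screen-recapture detector

def _rough_km(lat1, lon1, lat2, lon2):
    # Crude equirectangular distance; good enough for a sanity check.
    dx = (lon2 - lon1) * 111.32 * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * 110.57
    return math.hypot(dx, dy)

def validate_capture(c: Capture, max_delay=timedelta(minutes=5), max_km=5.0):
    """Return (ok, reasons) for a single controlled-capture submission."""
    reasons = []
    if not c.device_attested:
        reasons.append("device failed attestation (possible malware or rooted OS)")
    if c.gps_mock_enabled:
        reasons.append("mock-location / geo-spoofing enabled")
    if c.received_at - c.captured_at > max_delay:
        reasons.append("capture-to-upload delay too large; content may be pre-existing")
    if _rough_km(c.claimed_lat, c.claimed_lon, c.expected_lat, c.expected_lon) > max_km:
        reasons.append("capture location far from the address on the application")
    if c.rebroadcast_score > 0.5:
        reasons.append("image looks like a picture of a screen (rebroadcast)")
    return (len(reasons) == 0, reasons)

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    submission = Capture(True, False, now - timedelta(seconds=20), now,
                         40.72, -74.05, 40.72, -74.06, rebroadcast_score=0.1)
    print(validate_capture(submission))   # (True, [])
```

The point of the pattern is that every test runs at the moment of capture, which is exactly what separates it from trying to authenticate an image after the fact.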
Mario Zaki (20:13)
As much as I love disagreeing with you, usually just for the sake of disagreeing with you... you know, how do we, because not everybody is able to use this technology on a daily basis, like us in IT and stuff like that, we don't always need to do that. But we had a guest on a few weeks ago who was saying that,
using certain technologies they've worked on, like reading pigments and stuff like that off your face on camera, they could authenticate that. What about... and you mentioned that without capturing it that way at the source, essentially, you can't do it, you can't authenticate it.
Mario Zaki (21:07)
Are there ways to do anything after the image is taken? Or is that it, it's taken? Like, I know you showed me, I didn't even know this, that you could actually change the metadata right on an iPhone, and a picture that was taken in New Jersey, you can make it look like it came from California in a second. Is there anything after the fact?
Mounir (21:33)
So, no. This is a good question, right? When you're talking about this, it's important to think of deepfakes and cheap fakes. So let's start with the deepfakes, excuse me, the AI. There is an entire industry of synthetic media detection. You had a brilliant lady on, I think it was last week or two weeks ago, from Intel.
She really is top-notch and she thinks about this quite a bit. And I know Intel's worked hard, I think it's FakeCatcher or something like that. There's a variety of other companies, GetReal Labs, Reality Defender, that are all working on this. I would posit, and I think most of them would agree, that synthetic media detection is
impossible in real time and at scale. Because if there were some detector where anybody could take anything off the internet, drop it into a website, and it's going to tell you, yep, it's real, no, it's fake, you would need an accuracy level of 99.9999%, and I believe the best detectors are barely touching 90%.
And that's usually on a trained data set, not an open data set. Part of the... so, a trained data set is a specific set of images, let's say it's the same, I don't know, 1 million images that you're training off constantly and constantly. So your detector is going to get smarter. An open data set is just any image that I could possibly serve it from the internet. So it's always going to see different permutations it hasn't trained on, and that's going to lower your averages.
Mario Zaki (23:06)
What's the difference? Can you tell us the difference between those?
Mm.
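A quick back-of-the-envelope on Mounir's accuracy point. The volumes and the share of fakes below are assumptions for illustration, not figures from the episode; they just show why roughly 90% accuracy is unusable for an open, internet-scale detector.

```python
# Back-of-the-envelope: why ~90% accuracy breaks down at scale.
# The daily volume and the 1%-fake assumption are illustrative only.
images_per_day = 1_000_000
fake_rate = 0.01                           # assume 1 in 100 submissions is synthetic

fakes = images_per_day * fake_rate
reals = images_per_day - fakes

for accuracy in (0.90, 0.999999):
    false_alarms = reals * (1 - accuracy)  # real images wrongly flagged as fake
    missed_fakes = fakes * (1 - accuracy)  # fakes wrongly passed as real
    print(f"accuracy {accuracy:.6f}: ~{false_alarms:,.0f} false alarms "
          f"and ~{missed_fakes:,.1f} missed fakes per day")
# At 90%, roughly 99,000 real images get flagged every day under these
# assumptions — far too many to review by hand, and too many fakes slip through.
```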
Mounir (23:30)
So part of the reason for this is in the name: artificial intelligence, or machine learning, right? One of the main ways that you're going to create synthetic media is GANs, generative adversarial networks, those two neural networks. One is synthesizing, creating the fake image, and one is detecting it. And the detection side is telling the synthesis side how it detected it, so that the synthesis side gets better and then it spits out another image.
And then the detection side will say, ha, I found this, and it's going to tell the synthesis side. And it goes a thousand times an hour, constantly. That is why synthesis is always getting better. There are other models, known as diffusion models, that operate the same way; they learn from how they created the last image. So synthesis will always be ahead of detection when it comes to AI. That is why I don't believe
you're going to see some detector that anybody can use and that delivers high accuracy. Probably the smartest person I know on this is a UC Berkeley professor named Hany Farid. He is really amazing, and I'd encourage anyone who's interested in detection to check out his stuff. I do believe detection will be useful in closed environments, like in government, for example, when they have, you know, all of their intelligence and all their information.
And a detector is just another data point, where they could say, we have a 76% confidence level that this is accurate or that this is AI, but we have all our other information. But if you're buying something online, or you're going to date someone, and the website says there's a 76% chance this is a real person or a fake person, what does that mean? So that's why I don't believe detection would work on the AI side.
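For the technically curious, here is a toy sketch of the generator-versus-detector loop Mounir is describing, a generative adversarial network. This is not any production deepfake system; it uses PyTorch and a one-dimensional toy distribution purely to show how the detector's feedback is what trains the synthesizer to get better.

```python
# Toy GAN: the "synthesizer" G learns to mimic a 1-D Gaussian because the
# "detector" D keeps telling it, via gradients, how it was caught.
import torch
import torch.nn as nn

torch.manual_seed(0)
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))   # synthesizer
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))   # detector
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0        # "authentic" samples: N(3, 0.5)
    fake = G(torch.randn(64, 8))                 # synthesized samples

    # Detector update: learn to tell real from fake.
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Synthesizer update: use the detector's feedback to make better fakes.
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(64, 1))     # "fool the detector" objective
    g_loss.backward()
    opt_g.step()

# The generated mean drifts toward 3.0 as the detector's feedback improves G.
print("generated mean:", fake.mean().item())
```

Every time the detector finds a tell, that signal flows straight back into the generator's next update, which is why, as Mounir says, synthesis tends to stay ahead of detection.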
Mounir (25:26)
Now, let's even assume there is this amazing detector that is 100% accurate. That's for synthetic media. We still didn't solve the cheap fake problem. What you and I did in the bar, Mario: literally take a picture, change the time and the date and the location, and then pass it off. If you put that in a detector, it would say, yeah, it's a real image, but I've totally deceived you, because the time, date, and location are off. Or if I Photoshopped it,
none of that is AI. So you have two threat vectors in terms of digital deception, really more: you could do it using AI and you could do it using non-AI, and both are highly effective. So at the end of the day, unless you are testing at the moment of capture, the time, the date, the location, the pixels, and the device capturing it all, you can't have any confidence in whatever detection platform you're using, in my opinion.
Mario Zaki (26:26)
See, Justin, when we talked two weeks ago, at the end of the podcast, I told you, like, oh, I guess we're not doomed after all. We thought there was some silver lining. Talking to Mounir, I think we're back to being doomed again.
Justin Shelley (26:36)
I know, we're back to doomed. We're back to doomed.
Bryan Lachapelle (26:42)
Doom and gloom.
Mounir (26:44)
I would love to, I get that question a lot and man, the AI is moving so fast.
Justin Shelley (26:52)
Here's my theory on that though: we live in a world, and have for quite some time, by the way, this is not new, this scenario where we can't trust the information we're given is not new. In fact, it goes back to the wiring of our brains. We are wired to trust in a way that connects us to other people. We want to trust. Look at religion, politics, nationalism, race, gender, all of these things. We're wired
to look at information that confirms our individual beliefs, biases, and whatever. That's a survival tactic. And now, well, if we back up to the internet, right? Social media, Google search, Yahoo, all of these things have been designed for years, decades, to feed us whatever the hell it is we want to believe. So the real danger here in my mind isn't even the technology.
It's the dome, you know, it's our brain that wants to believe this stuff. So the battle I think we're fighting is more of a psychological battle than it is a technical battle. We have to retrain our brains to stop believing every goddamn thing we see and read. Discuss.
Mario Zaki (27:51)
Yeah.
The problem is some of that fake stuff is pretty entertaining. I mean, you know, he yells at me all the time because I enjoy watching my TikTok videos. And some of it, even though in the back of my mind I know it's probably fake, it's pretty entertaining. You know, like a TikTok of Jesus talking to a 10-year-old kid and stuff like that. I'm like, I know it's not real, but it's pretty entertaining.
Justin Shelley (28:18)
yeah, it is.
Mario Zaki (28:38)
You know, that's the problem, is that people are so addicted, and they want to be entertained, and there's just more and more people doing it.
Justin Shelley (28:47)
And I would argue it's harmless
and entertainment for the most part, right? Because we know we're just being entertained. But Mounir, where you're talking about businesses trying to verify for funding, for loans, for warranties, for insurance, those are just the industries that you work in. But like, we talk about voter fraud, I'll get a little political here, right? And if you're on one side, you believe this. If you're on the other side, you believe that. And they're absolute facts,
depending on which side you're on. Like, there's no room. But what we don't talk about is the way we're manipulated through social media, through Google search, through fake videos, whatever. That, I believe, is where the real fraud takes place. And I don't know, this is a problem, and I don't know what the answer is. I'm hearing you say, Mounir, that we absolutely cannot detect truth from fiction where images are concerned unless they use your app, right?
Mario Zaki (29:18)
Mm-hmm.
Mounir (29:44)
Well, all right, a couple of things. One, I was saying in real time and at scale. Given time, there is a whole industry known as OSINT, open source investigations and intelligence. There's companies like Bellingcat, and The New York Times has one, the BBC has one, CBS News. They do amazing work. You've probably seen this,
Justin Shelley (29:51)
Okay?
Mounir (30:10)
but it takes weeks and months, where they'll look at a video and they'll say the missile came in from here, and they'll track it. So there are ways you can identify if digital content is in fact authentic or manipulated. But I would argue that while OSINT is one of the best ways, that's not real time, that's not scale. You can't base a business off of that or anything. Yes, that's open source. So there's a whole...
Justin Shelley (30:31)
Is it available to the little guy? Okay.
Mounir (30:38)
Most of these tools, there's a whole training behind them. People leverage reverse image search, and they can cut videos down into frames and reverse image search those. So it's a skill that the little guy could use, but to me that's more beneficial for a lot of these more geopolitical or social questions, like, my god, did that, I don't know, protest actually happen? And you have weeks to figure that out.
Justin Shelley (30:53)
Yeah.
Mounir (31:08)
Less so on a transaction, a financial transaction, or a time-compressed environment. To your point, Justin, that probably the real challenge is our own perceptions and our own kind of social view, yeah, I think I would agree with you there. Right? There are certain regions of the world that are less affected by
Mario Zaki (31:13)
Insurance claim or something like that.
Mounir (31:36)
misinformation. Like, Scandinavia has some of the lowest rates of that stuff really affecting those populations. They have more media literacy than other parts of the world, and we could get into curriculum and everything we're taught and how to be media literate and all of that. But I don't think the four of us will figure that all out right now. I would... no, no, please go ahead, Bryan.
Bryan Lachapelle (31:55)
Are the... go ahead.
I was gonna ask, are the hardware manufacturers of cameras and phones and different recording technologies developing hardware-based methods where they can embed either code or something into the video file that could authenticate it after the fact? Is that being developed?
Mounir (32:14)
After the fact, after.
Mario Zaki (32:14)
Truepic did.
Bryan Lachapelle (32:16)
No, like
as it's being recorded, the hardware itself is embedding something into the video file that could be authenticated after the fact.
Mounir (32:25)
So I think, okay, yes. When you said after the fact, I thought you meant detection, but what you're saying is, are there hardware manufacturers that are leaving a mark, like a watermark or a fingerprint or something like that? Yes. And actually, we worked with Qualcomm Technologies to put the first type of cryptographic hash from a chipset onto digital content. And we've proved that out,
Bryan Lachapelle (32:40)
something like that. Yeah.
Right. Okay.
Mounir (32:55)
so that it's... I do believe in this idea of digital content provenance. It's not just, you know, Truepic, we have our own system, but we helped create a standards body known as the C2PA, and I believe your last guest spoke about that, the Coalition for Content Provenance and Authenticity, where we built an interoperable standard where
Bryan Lachapelle (32:56)
So the future isn't so bleak.
Justin Shelley (33:16)
Yes.
Mounir (33:22)
all of these technologies could write and cryptographically hash and secure the metadata that is associated with digital content at the point of creation, so that it can be read from compliant system to compliant system. That's what Truepic really helped pioneer and build with Microsoft, Adobe, Google, Qualcomm, and a variety of others. So that exists, but there are challenges to that, in that...
I'll ask all of you: have you ever been on LinkedIn in the last year and seen an image with a little icon in the upper left-hand corner that has like a pin and says CR? You have seen it?
Justin Shelley (34:05)
I've
not noticed it.
Bryan Lachapelle (34:06)
I think so, but I know they embed, they change the video as it's being transmitted. Like, they modify the video.
Mounir (34:13)
Well, if you see an image with a CR in the upper left-hand corner, that is called a content credential. That is the mark of the C2PA specification. So today, and you can try this right after this, go on ChatGPT, create any image, a cow jumping over the moon, then put it onto LinkedIn.
The minute you post it, LinkedIn is going to have that mark saying this is AI-generated and it comes from ChatGPT. That is provenance. Now, to get to a fully interoperable, transparent internet, you need adoption of that spec. Much like SSL encryption is an open standard that every website uses. It used to be the lock icon, now it's like those two things that
all three of you know better than I do. That means it's SSL encrypted. You can still go on a website that's not SSL encrypted, and if you want, you can put in your credit card information, but most people probably won't do that today. And we envision a world where eventually images and digital content will flow from internet site to internet site because they have all adopted the specification, reading that, oh, this came from ChatGPT, this came from ChatGPT,
but it's going to take time for platform after platform to adopt this technology.
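As a rough illustration of the provenance idea behind content credentials, the sketch below hashes an image's pixels together with its capture metadata and signs the result, so any later platform can detect tampering. This is not the real C2PA manifest format or Truepic's implementation, just the core hash-and-sign pattern; it assumes the third-party cryptography package (pip install cryptography).

```python
# Simplified hash-and-sign provenance sketch (not the C2PA manifest format).
import json, hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def make_credential(image_bytes: bytes, metadata: dict, signer: Ed25519PrivateKey) -> dict:
    # Bind the pixels and the metadata together in one signed digest.
    payload = hashlib.sha256(image_bytes + json.dumps(metadata, sort_keys=True).encode()).digest()
    return {"metadata": metadata, "signature": signer.sign(payload).hex()}

def verify_credential(image_bytes: bytes, credential: dict, public_key) -> bool:
    payload = hashlib.sha256(
        image_bytes + json.dumps(credential["metadata"], sort_keys=True).encode()
    ).digest()
    try:
        public_key.verify(bytes.fromhex(credential["signature"]), payload)
        return True
    except InvalidSignature:
        return False   # pixels or metadata were altered after signing

key = Ed25519PrivateKey.generate()
photo = b"\x89PNG...raw image bytes..."
cred = make_credential(photo, {"generator": "ExampleCam", "captured": "2025-05-21T10:00:00Z"}, key)

print(verify_credential(photo, cred, key.public_key()))                 # True
print(verify_credential(photo + b"tampered", cred, key.public_key()))   # False
```

A real content credential embeds the signed manifest in the file itself and chains edits together, but the verification idea is the same: if the pixels or the metadata change after signing, the signature no longer checks out.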
Justin Shelley (35:42)
So let me see if I can kind of summarize; we're gonna move to wrap up. What I'm hearing is, A, there's hopelessness, fear, dread, and doom, which I would just say, we're living in the wild wild west of technology. And it happens all the time, we just iterate through new versions of it, right? The answer, potentially, or at least part of it, is going to be in standards, universal adoption of standards,
Mario Zaki (35:52)
We're doomed.
Justin Shelley (36:11)
and some version of government enforcement, correct?
Mounir (36:16)
I don't know if government enforcement is necessary on everything. We are seeing a lot of it on AI labeling, but I would argue AI labeling by itself isn't going to do the trick. If we're talking to businesses, they need to ensure that their workflows are bringing in authentic content. They cannot wait for an open standard to become the status quo. It might take years for that to happen.
Justin Shelley (36:21)
Okay.
Mounir (36:46)
But if we're talking about general society, I think eventually, at some point in the future, I cannot predict when, we'll have more transparency and provenance showing the history of digital content. It's not going to tell you it's real or it's fake. It's going to say this image came from a Sony camera, this image came from Adobe Firefly or Photoshop; you make your own decisions. I do believe we're going to go into that world.
Justin Shelley (37:08)
Right. Right.
And we're always behind on that. That's just the way this has played out. Look at the internet right now: largely every website on the internet is secure, HTTPS. You can see it right in the URL. That's relatively recent, for as long as the internet's been around. It took a while for that to be adopted, to start protecting the information that we were just blasting all over the place. Usernames and passwords, social security numbers, all this stuff was just being thrown out there unprotected, unencrypted, until relatively recently.
So I think if I were to pick up my crystal ball: we are in the wild, wild west when it comes to images, video, this content that we're creating. But I think that the dust will settle. You know, I give it three to five years. I think it settles to a large extent, not perfect, but to a large extent. What are your thoughts on that?
Mario Zaki (38:05)
But the government did do something, right? Like they just came out a couple days ago with the Take It Down Act, you know, right?
Justin Shelley (38:08)
they did. Mm-hmm.
Mm-hmm.
Mounir (38:14)
So the Take It Down Act is actually a really good piece of legislation. I actually testified at the US Senate in November, and the woman who really helped lead that was right next to me. She's from New Jersey, actually. Really smart lady; her daughter was a victim of this, actually. But what the Take It Down Act actually stipulates is that if there is a piece of NCII, non-consensual intimate imagery, or non-consensual pornography that is placed
on a platform and the platform is notified, I think they have 24 or 48 hours to take it down. Great, we all support that. That's a good thing. But that's not going to stop the deception on businesses, you know, on fake insurance claims and fake receipts. And like I said at the beginning, we've digitized everything in our lives. And there's going to be this gap, this period of time, I don't know how long, maybe many years, where
Justin Shelley (38:59)
Right.
Mm-hmm. Mm-hmm.
Mounir (39:14)
businesses are going to be operating and flying blind, because there is so much pressure to digitally transform. You don't have the luxury to operate an insurance company or a lending company today the way you did 15 years ago and just have adjusters drive everywhere. You're going to be the slowest one and the most expensive one, and you're going to lose market share. So you need to digitally transform. How do you do that on digital content that you can't trust?
Justin Shelley (39:39)
Yeah. Well, right now I think you're the answer to that problem in those specific industries. Yeah. And if you take all the precautions upfront, which, I mean, I love that it's out there, you know...
Bryan Lachapelle (39:43)
Hahaha
Mounir (39:44)
And those are the...
Mario Zaki (39:54)
But
outside of those businesses, for a company like us, a small business, you know, with a little podcast like this, any recommendations? I know there's certain things that we were talking about, like my face, you know, my body, my ass, something like that, right? What was the name of the...
Justin Shelley (40:10)
Hahaha
Mounir (40:14)
Not familiar with those, but
I would encourage any small business to audit your own procedures, right? And if there are any areas where you are relying on your clients or your partners to submit any form of digital content for your documentation, you really need to think about authentication. Or if you are relying on in-person, now it goes back to the efficiency: you're probably spending too much time or money.
At the same time, there's certainly, I'm sure you have AI experts on, there's a lot of ways AI can help your small business, and I'm sure you guys have all covered that too. So there is a flip side of that coin. But wherever you're operating and making decisions off of digital content, you really need to start thinking about authenticity. It's just too easy, man. Even things like receipts. If you have your employees submitting receipts for reimbursement, for gas, for tolls,
Mario Zaki (40:52)
Mm.
Justin Shelley (41:11)
yeah.
Mounir (41:12)
you can go on ChatGPT today and say, give me a fake, I don't know, George Washington Bridge toll receipt. Give me 20 of them and change the...
Justin Shelley (41:20)
I saw, yeah, I saw somebody do that,
a Walmart receipt, and it even checked out. I don't remember how they... yeah, it passed all the tests.
Mounir (41:29)
And like, even if it's a little bit off, if you are the reviewer and you have like a hundred people whose expense reports you've got to review, are you really gonna look at the $89 Walmart receipt with a magnifying glass? You're just gonna pass it through. I mean, there's a lot of breakdowns in the workflow.
Justin Shelley (41:37)
Yeah.
Like I said, wild wild west, guys. Let's go ahead and wrap up. We're going to go with key takeaways, and let's try to stay as far away from doom and gloom as we can, even though there is a version of it that's... we'll just fake it. All right, Bryan, if you had to summarize everything, this is it, this is all your clients and prospects are going to see. What do they need to know?
Bryan Lachapelle (42:14)
For me, it would be basically what Mounir said: if you're going to be using digital content to authenticate any kind of process in your business, just look at making sure you authenticate the information and have a system in place to be able to do that, whatever that looks like. That's all I have, sorry.
Justin Shelley (42:37)
You know, that kind of hits it on the head. Mario, what are your thoughts?
Mario Zaki (42:41)
Key takeaway for me is that if you are doing a lot of manual, in-person stuff, you know, it is time to add some digital flavor into the workflow. But do it with caution, do it with security in place, just like we say every week: do everything with security in mind.
Justin Shelley (43:08)
Yeah. Okay. Monique... Mounir. Wow, I slaughtered your name. That's how everybody's going to remember it. What's your summary, your key takeaway for business owners, you know, 15 seconds or less? What do they need to know from this whole episode?
Mounir (43:13)
Yeah.
Mario Zaki (43:13)
Yeah.
Mounir (43:23)
I would leave with one piece of optimism. We spoke about authentication in terms of efficiency and safeguarding and security. I also think you can expand your business dramatically using this. This allows you to be in more cities and more countries than you could ever get to yourself. We have lenders that used to only lend in their geographic area, because that's where they could get to. Now they can lend across the country, because now there are systems
that allow them to test, if you're lending against assets or businesses or whatever, and know that pizza place in California is legit, that car dealership in California is legit. So now you can actually expand your business using these services.
Justin Shelley (44:06)
Absolutely. And you know, that is the beauty of technology. And by the way, guys, this game's never going to change. It's been going on for as long as we can go back and look at the history of technology. Every time we invent something new, we rush out there and we use it and we get better and we make more money. And then the bad guys come in and they screw it all up for us. And then somebody like Mounir comes in and, you know, protects us from that thing. And then we move on to the next iteration.
We've been doing it forever. We're going to keep doing it. It is the wild wild west. It always will be. So believe nothing, protect yourself from everything, and use companies like Mounir's... and I blank on the name. Truepic. There we go. Jesus Christ, how's that for a sign-off? I slaughter your name pronunciation and then I forget the name of your company. Truepic. But that's okay, because we have our website, unhacked.live. Go there, guys, and you can get the real story on who Mounir is, what company he works for, and what he does.
Mario Zaki (44:45)
Chupac!
Mounir (44:51)
Now it's closed.
Justin Shelley (45:01)
You'll have a profile up on our website that you will send people to. And we've got our social media links and everything else. So guys, with that, that's it for this week and this episode of Unhacked. Let's go ahead and say our goodbyes. Bryan.
Mounir (45:05)
Thank you all.
Bryan Lachapelle (45:17)
Yeah, just hey, everybody start your cybersecurity journey today and improve a little bit every day.
Justin Shelley (45:21)
Mario.
Mario Zaki (45:23)
Guys, lock down your shit.
Justin Shelley (45:25)
Mon- Mon- Mon- Mon-
Bryan Lachapelle (45:27)
Lock it down.
Mounir (45:30)
I don't know how to follow that one. Thank
you all for having me. This was fun.
Justin Shelley (45:33)
Appreciate you being here. I am Justin. Remember, listen in, take action, and keep your businesses unhacked. See you next week.
Mario Zaki (45:41)
Thanks, Mounir.