[00:00:00] Speaker A: Cybersecurity is far more than your Facebook account being hacked or getting a virus on your phone. This isn't about better passwords or 50 ways to authenticate yourself. Everything in the world sits on top of IT infrastructure.
[00:00:19] Speaker B: This is KBKast.
[00:00:20] Speaker A: Are they completely scientific?
[00:00:21] Speaker B: As a primary target for ransomware campaigns.
[00:00:24] Speaker A: Security and testing and performance and scalability, risk and compliance, we can actually automatically take that data and use it.
[00:00:34] Speaker B: Joining me now is Skeeve Stevens, director at Future Crime Agency. And today we're discussing that cybersecurity is about to change and we're not ready for it. So, Skeeve, thanks for joining and welcome.
[00:00:45] Speaker A: Welcome. Thank you very much.
[00:00:47] Speaker B: Okay, so let's start right there. Now, I just want to say this, that you are probably one of the most out there dudes that I have met, come across. I'm really interested to interview today because you are not shy of an opinion and that's what we want.
[00:01:02] Speaker A: Just a little bit. Just a little bit. Yeah.
[00:01:04] Speaker B: Well, I mean, that documentary, I mean, you know, I was like, definitely getting on the podcast, so here we are. But I really want to start there. So cybersecurity is always changing, right? And you know, even day to day, week to week, minute to minute, depends on how granular you want to get. There's always something changing or something going on. But what's your view on how do we get ready? What do you mean by that?
[00:01:26] Speaker A: Firstly, let's take a step back. It's more meta than most people realize. Cybersecurity is far more than your Facebook account being hacked or getting a virus on your phone. This isn't about better passwords or 50 ways to authenticate yourself. Everything in the world sits on top of IT infrastructure. Look at the recent CrowdStrike global outage: essentially a typo, some bad data in an update, wreaked havoc all over the world. We're talking airlines, banks, healthcare providers across Australia and Europe. Just saying 'cyber' is like saying 'IT' these days. And the way I look at it, and I guess it's not the friendliest way to look at cyber, is through the lens of warfare. We're in the middle of a war. Some people are wearing stab vests, some people are wearing bulletproof vests, some have bulletproof windows in their cars, some have an armoured convoy with a Navy SEAL team, but not many. The problem is that a significant percentage of the population is blindly wandering down the street. They're already struggling with the basic underlying concepts of technology itself. There are still many, many people who barely even use smartphones. I was actually in the bank the other day withdrawing some money to buy a new motorcycle, and there was a woman doing a transaction, and I didn't actually think she was that old, but she was using the old-school passbooks. I had just assumed that technology disappeared 10, 15, 20 years ago. But there are still many out there not in the IT world, living with that small-country-town mentality where they leave their houses and their cars unlocked, and these are the people that are suffering the most. Every day on the current affairs TV shows you see people being scammed out of money, and the new attacks are constantly getting them. So how do you get ready when the baseline population barely understands? And what you say is right: it's always been changing from the day it started. And the sad thing is we're not in an offensive world, we're in a defensive world. Cybersecurity isn't offensive; you can't go out there and pre-emptively attack. So how do we get ready? That is a much bigger question, and it depends. Are you talking about the general population, the technically aware population, the advanced population, people in IT? Then you've got cybersecurity, and then you've got, say, law enforcement, government, military, like what's going on in Russia and Ukraine. Which level do you want to talk about being ready at? Because it's a very, very complicated space.
[00:04:16] Speaker B: Okay, so there's a couple of things in there. Number one, I agree, I would say as a rule of thumb, people just aren't aware. Totally get that. The amount of people that call me about, I think I randomly got scammed. Do you think someone's listening to my phone? Like I get that a lot from random people I may have met like once in my life. And I don't mind helping them. Right? Because obviously people are grappling to be like, I need an answer. We're not sure. But I mean wherever, wherever that question takes you. I mean, I don't mind. I mean this is your interview, so I really want to know, like, how do we sort of get ready? Because everyone, I mean from a media perspective, we're seeing content, we're seeing stuff on our side all the time, everyone's getting ready. But like, what does that mean though? Like what do we need to get to your point? Like are people wearing like, like the vests and stuff like that? How do we get that proverbial vest on? So that we're ready.
[00:05:07] Speaker A: Yeah. And this is one of the things that concerns me on a daily basis. A lot of people have come to me and said, hey Skeeve, can you train cyber people to be more aware and how to defend against the latest techniques? To be honest, they're professionals; they can figure that out through other people or themselves. I actually care more about the normal people who are getting more and more vulnerable on a daily basis. They're not as paranoid as they need to be, and to be honest, they don't really want to be. They want to live a peaceful, happy life, but everything's coming at them, and we don't have a choice. The biggest problem is that most people's understanding of cyber stops at, you know, your Facebook account and things like that, but it's far beyond that. It's entering the real world. We're talking about the Internet of Things and what's happening in your home: your smart assistants, like Alexa and Google, are entering your home. We've got cameras, door cameras like the Ring cameras that everyone's using. It's not just an online thing; our real personal, physical life is now part of the cyber problem. And it goes through cycles. To your question of how do we get ready: there's no concept of getting ready, because you can't ever be ready. As I said, it's a defensive stance. Something happens and you defend against it. A quick example might be people stealing Amazon packages off your porch, so now everyone runs out and buys a door cam. Bunnings sells all these kinds of things now, and even the most basic people can just wander in and buy one of these devices. So what happens now? The criminals level up. The thing that's big in the US and starting to hit here towards the end of the year is personal jammers. People will turn on a jammer and walk up to your house, and because the Ring-style cameras don't have local storage and go over WiFi, the jammer blocks the WiFi and you'll just have this blank part in the footage. And this, to me, is still cybersecurity. It's not just your password and logging into things and being scammed; everything around us now relates to cybersecurity, including our homes. I mean, I've just installed some smart locks here and they're all online and connected. And this is one of the things that does frustrate me a bit about the cybersecurity world, and we'll talk about that a little bit later, with the education: cyber people are still thinking about firewalls, protecting companies, protecting your data. But everyone's home essentially needs to be defended. Getting ready means knowing what's happening now and getting prepared for what's coming next. But how do we deal with the fact that 90% of the population barely understands what's happening now?
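A minimal sketch of the gap being described here, under assumed conditions: because cloud-only cameras simply stop uploading when their WiFi is jammed, a small watchdog running on the wired side of a home network is one way to at least notice the blank spot. The camera names, addresses and ports below are illustrative, not taken from any real product.

```typescript
// Hypothetical watchdog: poll each camera's local interface and alert if it
// stops responding (the "blank part" a WiFi jammer creates for cloud-only cameras).
import { Socket } from "net";

const cameras = [
  { name: "front-door", host: "192.168.1.50", port: 554 }, // assumed RTSP port
  { name: "driveway", host: "192.168.1.51", port: 554 },
];

// Try to open a TCP connection; resolve true if the camera answers in time.
function isReachable(host: string, port: number, timeoutMs = 3000): Promise<boolean> {
  return new Promise((resolve) => {
    const sock = new Socket();
    const done = (ok: boolean) => { sock.destroy(); resolve(ok); };
    sock.setTimeout(timeoutMs);
    sock.once("connect", () => done(true));
    sock.once("timeout", () => done(false));
    sock.once("error", () => done(false));
    sock.connect(port, host);
  });
}

// Every 30 seconds, check each camera and keep a local record (not a cloud one),
// so there is still evidence of the outage even while the WiFi path is jammed.
setInterval(async () => {
  for (const cam of cameras) {
    if (!(await isReachable(cam.host, cam.port))) {
      console.error(`${new Date().toISOString()} ALERT: ${cam.name} unreachable - possible jamming or outage`);
    }
  }
}, 30_000);
```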
[00:08:24] Speaker B: Okay, those are great points. Now this is interesting because, to your point around firewalls: if you're an enterprise, of course you're going to care about firewalls. But for the average person, the 90% that you alluded to, I do agree with you. But then people come back and say, yeah, but KB, who cares? I'll give you an example. Look at all the data breaches that have happened in Australia. People go, oh, but who cares? I've already been in Optus and Medibank and whoever else in between. So I feel like people may be desensitising themselves from being breached and not caring, because so much has happened. And I've asked people in the industry: do you think people are desensitised to breaches now? I think the obvious answer is yes, but then that doesn't solve the problem, to your earlier point around the 90%. So how do we close that gap? I know there are people out there trying to educate, but I'm seeing more and more of these. I think I got an email earlier about how many people in Australia are being scammed, just from pirating films and stuff like that; that was just one example. Obviously the message isn't getting through. But then I feel like people are more focused on enterprise and things like that. There are some people who focus on personal security. But I don't know. I'm just rattling off what I'm hearing and what I'm seeing, but what's your view on that?
[00:09:45] Speaker A: The problem is that's the way society functions. If a government brings out laws about something that's not an issue, let's say they tried to bring laws into Parliament banning people from walking backwards, you'd be like, but why would you do that? There's no one committing crimes walking backwards and doing silly things like that. But then let's say they know what's happening, they see what's happening in different parts of the world, and they know that in a year's time it's going to be a problem. Now, that's the situation we've got at the moment with all of these breaches. I was done in the Medibank one and I was done in the Optus one, and I've just clicked the button: yes, join the class action, why not? And I've actually looked deeply; I got copies of the actual information that was breached. When I was told that I was breached, I was like, oh yeah, another one. But when I looked at the information they had about me, I was taken aback. So what I mean by this is that the government can't proactively do things. We've all been breached. All the data's out there; all these companies are holding information about millions of us, and then it gets breached. And now the government is bringing in that new digital ID system where companies can authenticate through the government. But it's years late, after the information about nearly all Australians is already out there. How do we act pre-emptively? As I say, if as a society we could elect the right people who happened to understand what they were doing, write the laws, and create the services that were required, they would go out and create these things in advance. The proposal the government is doing now is already something that was floated before; I can't remember when, I think it was in the 80s. It was called the Australia Card, and it's like the American Social Security number. When they even talked about bringing that in, we shut them down so hard. If we'd had that, we'd be in a state now where we wouldn't be going to real estate agents with pages full of identification or opening bank accounts and giving these companies so much information; they wouldn't even need it. But back then we stopped the government doing it. Now, I'm not a hardcore pro-government person here; I'm talking about the way society seems to function. We only close the door after the horse has bolted. It's the same problem with people getting scammed: we're now bringing in scam laws, where people have their money transferred away, and they're now trying to make the banks liable, after squillions of dollars have already been bled from people. They're bringing it in after the damage has occurred, and that's like setting your earthquake standards after Christchurch happens, not before. That's just the way society functions, and do we think it's actually going to change? Now, I've given some examples here that relate to personal things, but when we talk about the corporate level, there are now new discussions about when companies do something bad, trying to prove that the actual directors knew what they were doing, whether it was lax security or something else; right now, trying to get at the actual people rather than the company is very difficult.
But they're bringing in new laws decades after companies have been screwing people and doing the wrong things. Bunnings right now sells all these home automation things, cameras and so on. They're all great, awesome tools, and I've spent much, much money there. But what about in one or two years' time, when there is a major hack and suddenly the homes of 3 million Australians can be looked into because there was a zero-day breach in some sort of new camera, and then they go, hey, let's make a law that stops people doing cameras, blah blah blah. The problem is, it's like having the Mardi Gras in Sydney but cleaning up the garbage six years later. Thankfully, they actually clean it up that night. In cybersecurity, the education is lacking and so many years behind; the laws are lacking and so many years behind. How could we ever be proactive? 'Getting ready' is probably a bad concept, in that we could never be ready for what's coming next, and what's coming next even keeps shocking people like me who deeply understand this world. So let's talk about just now, trying to understand now. Now is constantly evolving, and how do we educate everybody? It's a very difficult thing. I did a talk a few weeks ago and I used the pandemic as an interesting example. It took a pandemic, a global lockdown and a virus that was killing many people to teach the entire planet a short course in hygiene: washing hands, face masks, all the different things. The entire world was taught a short course overnight in how to defend itself against that virus. I feel like we should turn off the Internet for a week and suddenly try to tell everybody what they need to do, and you're only allowed back on once you've signed the thing and read it. Because there are so many different stages of awareness of technology in the world, there are always going to be people being scammed. And again, we're dealing with critical infrastructure, industry, commercial, small business and individuals; there are like five layers there of people that need to know this at different levels. They've only just started to try and target, which again I find mind-blowingly frustrating, the people doing things like creating deepfakes. But now those laws are becoming irrelevant because it's all AI now. Who do you go after? In the cyber war, some guy is going to go to the black market and buy an AI bot tool, give it a general objective, go make me money doing X, Y and Z, and the AI is going to evolve, come up with its own attack, and go out there and do its own thing. How do you attack and stop something that's evolving faster than we can possibly handle, changing its techniques? It's very frustrating. The best example I've seen was an AI that they taught how to play one of the games, I'm not sure if it was Fortnite or something like that. They had it playing thousands of hours a day, so many scenarios, and then they put it in with the best e-sports gamers in the world in those games, and it slaughtered them. That could be a very interesting parallel to cybersecurity people versus hackers and AI hackers.
It absolutely annihilated them, to the point that the actual human gamers started to emulate aspects of the way the AI was playing, because it was playing so far out there. How can humans deal with swarms of AI attacks? They're not going to be brain-dead DoS attacks; they're going to be very smart, very carefully designed, targeting specific things in different ways that humans are too slow to understand. And as far as I know, at a corporate and even military level, there is no AI-powered firewall defence, apart from very simple high-level scripting things, you know, IDS and IPS, intrusion detection and intrusion prevention systems, which are 'see this, do this'. I have not seen, and I have looked, an AI-powered firewall. Cisco is still selling the same stuff, the ASAs; Juniper is still selling the SRX. I checked a week ago, actually; I used to be a Cisco engineer, and I was like, oh, what's changed in the past five years? Nothing. They're the same things. There are no AI-powered firewalls that dynamically understand defence and take proactive steps, rather than just doing IPS/IDS, which responds and creates a rule: I see this, this and this, and you know what that means? It means this or that could come next, so I might create preventative rules for something that isn't happening yet. As far as I know, there are no firewalls like that yet.
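A toy sketch of the distinction being drawn here, under stated assumptions: the reactive rule matches a known signature and responds, while the pre-emptive rule blocks on a pattern that merely suggests an attack is coming. The event shape, thresholds and port choices are illustrative only, not from any vendor's product.

```typescript
// Reactive "see this, do this" rule versus a pre-emptive rule that acts on what a
// pattern implies is coming next. All names and numbers here are illustrative.

interface NetEvent { srcIp: string; dstPort: number; }

const blocked = new Set<string>();                 // toy firewall state
const portsSeen = new Map<string, Set<number>>();  // ports each source has touched

// Reactive rule: a known bad signature triggers an immediate block.
function reactiveRule(ev: NetEvent): void {
  if (ev.dstPort === 23) {                         // e.g. a Telnet probe as the "signature"
    blocked.add(ev.srcIp);
    console.log(`reactive: blocked ${ev.srcIp} (matched signature)`);
  }
}

// Pre-emptive rule: many distinct ports from one source looks like reconnaissance,
// so block before the follow-up exploit, even though nothing "bad" has matched yet.
function preemptiveRule(ev: NetEvent): void {
  const ports = portsSeen.get(ev.srcIp) ?? new Set<number>();
  ports.add(ev.dstPort);
  portsSeen.set(ev.srcIp, ports);
  if (ports.size > 20 && !blocked.has(ev.srcIp)) {
    blocked.add(ev.srcIp);
    console.log(`pre-emptive: blocked ${ev.srcIp} (scan pattern suggests an attack is coming)`);
  }
}

// Feed both rules from the same event stream.
function handle(ev: NetEvent): void {
  if (blocked.has(ev.srcIp)) return;
  reactiveRule(ev);
  preemptiveRule(ev);
}

// Example: simulate a port scan from one address; the pre-emptive rule fires
// before the signature rule ever gets a match on port 23.
for (let p = 1; p <= 25; p++) {
  handle({ srcIp: "203.0.113.9", dstPort: p });
}
```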
So the AI is in front. And I guess it's the same as AI creating all the deepfakes and videos and imaging, while we're barely trying to create laws to figure out how to defend against it. The way AI is being used everywhere across the world, it's the same in cybersecurity. And I've been talking to a few people; often, with the sort of things that you and I talk about, in social settings people just look at us and glaze over. But they're all starting to hear about deepfakes, they're all starting to get a bit scared, and I feel like they've skipped an entire generation of understanding cyber, because to me, all of this with AI and deepfakes and audio and all these different kinds of things is just a different form of cybersecurity.
And they barely understand it, much less the cyber threats we've been dealing with over the past five to ten years. And now we're throwing them into the world of AI, where in the next six to twelve months what's being put out by AI is going to be indistinguishable; we're not going to be able to tell. This is going to take cyber scamming, like those phone calls you get from an unknown spam number, and multiply it by a hundred thousand, because that's what they can generate: live communication out of data centres. They're not even going to need those Kenyan or Nigerian call centres anymore. There's going to be one guy with a program in a corner making a million calls an hour, and they're going to sound perfect. But they're also going to sound exactly like your wife, your child, your husband, your best friend. The video generation is going to be good enough that they'll make video calls to you and be your wife, your husband, your child, your best friend. How are people going to deal with what we're just on the cusp of? That is an entirely different meta level of cyber. I don't even know if the word 'cyber' still covers it. AI? AI-cyber? I'm not really sure what we should call what's about to hit us really hard. And how can anyone, all the way up to the government level and the lawmakers, be ready when they're barely coping with today?
[00:20:55] Speaker B: Yeah, I mean, look, there's a lot in there that's really interesting. I was nodding as you were speaking. I agree, the horse has bolted. One parallel I could draw while you were speaking: back in the day, people were driving around without seatbelts, and then obviously people started having car accidents and dying, and then they put the seatbelts in. Is it sort of the same: we're not going to do anything until it becomes a serious problem? Like the examples you explained before, right? It's like, oh my gosh, they're only retrofitting this stuff now, probably because it became enough of a problem. Do you think it's the same sort of thing as what you said before? Because originally, if people weren't dying in car accidents and losing their lives, we probably wouldn't have seatbelts; that's why they brought them in. And that's probably the same thing for what you're saying now, and I'm agreeing with you: we're not ahead of the curve because the horse has bolted, and it's going to be pretty hard now to catch up.
[00:21:52] Speaker A: But we can't even see what's coming. And yes, that is a very good example: people moved from horses to cars, and they drove around like crazy and died. There was literally a hard ivory or wooden steering wheel which they just went splat against. One of the things that blew my mind recently: I actually saw a car with historical number plates. Do you know that they are still legally allowed to not have seatbelts? That breaks my brain. Are they allowed on the road? Yes. Do they have to have seatbelts? No. And you're like, why do we do that to ourselves? It's a weird parallel, but yes, we don't seem to ban anything. They're banning vapes now, three to five years after they knew it was a problem. We're not stupid as a society; we know these things are a problem, but we don't stop doing them, and these fall under political and philosophical things. But, like in China, you have to go and get a licence at the police station to use the Internet. That's how they can watch you and know what you're doing. I almost think we need something like a driver's licence, where you go in and do a one-pager: is this a good password, yes or no? You get a phone call that says such-and-such, what do you do? You pass or you fail, that type of thing. Do you know how many fewer people would be scammed and at security risk if we actually did that for the general populace?
[00:23:33] Speaker B: But how many people would even pass that test? Majority of people would fail it.
[00:23:38] Speaker A: That's true. It is very true. Who would actually pass it? I know people that would just absolutely struggle. And this is the problem I've got: people to this day are still, as we said before, not wearing seatbelts, talking on their phones in the car, drink driving, still smoking, where we know all these things are bad and dangerous, but they still do it. So how can we expect them to follow basic levels of cyber hygiene? And I'm kind of at the point where I watch these current affairs shows and it's, oh, this person has been scammed and scammed and scammed. One of the biggest issues I see with the scams is that I kind of half feel sorry for them and half don't.
[00:24:24] Speaker B: I'm listening because I'm probably in the same boat as you.
[00:24:27] Speaker A: Yeah. My problem is that most things relate to greed. There was one I just watched yesterday: a woman who was about to get married was scammed out of $37,000 because she got a phone call about a crypto thing and a job, and you put the money in and you do this. If it sounds too good to be true, it is. And the amount of people bleeding away hundreds of thousands of dollars. But this is no different from what happened hundreds of years ago; they're just con artists using a more digital form of it. There's no difference between stealing someone's money and hacking an ATM by putting a skimmer on it. It's the same basic concept we've had for hundreds of years, where the bad guys take advantage of a population that is unaware of what's going on. But it doesn't help when the underlying problem is greed: you can make money doing this. Just as an example, you've got a woman in her 50s who gets caught in a romance scam online. She's been talking to a guy for three weeks, she's hopeful, and then she transfers $50,000 to help him because of a sob story. I get that we all want to take a shortcut, or we all want to believe. But while people are running these scams, AI is going to start doing the scams; the scams are going to increase a thousandfold and they're going to be better than ever before. And I really do worry. I do not have the capacity to deal with and educate the general population; that's just too many people. Even when I'm working with law enforcement and military, some organisations go, oh, can you teach these 3,000 police? And I'm like, I'll teach the teachers, but you've got to start learning and building up your own skills inside your organisation. We just don't seem to want to help ourselves. I did a talk for the Melbourne Council at Federation Square where they were talking about the worst risks in current cybersecurity. At a personal level, passwords were a problem. So what did we do? We created two-factor authentication and then we put backdoors in it. What I mean by a backdoor is: you enter your two-factor code and there's a little tiny checkbox, 'remember me for 30 days'. I don't understand why we create ways around it. Two-factor authentication is pretty cool; it levels up the requirement beyond just figuring out someone's password. But we check that little box because we're a little bit lazy, we want to make life a little bit easier for ourselves. So someone, whether it's your kid or someone who breaks into your house, gets access to your computer, and they just open your browser and they're straight in. I keep highly sensitive things, so to get into my computer it's facial recognition and fingerprint and all of that, because once I'm logged in, I'm the same, I want to save time; I have a lot of browser sessions open to a lot of very sensitive things. This is how we fail to make it safer. And there are no repercussions for the companies making these things. I'll admit to this now; I don't normally talk about it because I don't want to embarrass them, but it's been so long now. Back in the day, with ANZ Internet banking, I contacted them and said, you've got a major problem in your Internet banking. And they went, no we don't.
Blah blah, blah blah blah. So I said, can I come and show you? And I went into the bank in Melbourne, and they had the dumbest backdoor in their Internet banking: once someone had logged in and then logged out, all you did was hit back and it would go back to the login page with the password still in the field, and you just hit enter again.
[00:28:28] Speaker B: When was this?
[00:28:29] Speaker A: This was about 20 years ago. The only way they fixed it was to put a tiny bit of code in there that cleared the password.
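The exact fix from 20 years ago isn't described beyond that, but a minimal modern sketch of the same idea, assuming a login form with a hypothetical #password input, would clear the field whenever the page is shown again, including when the browser restores it from the back/forward cache (the 'hit back' case described above):

```typescript
// Clear the password field whenever the login page is (re)shown, so a restored
// page never comes back with the credential still sitting in the input.
const passwordField = document.querySelector<HTMLInputElement>("#password"); // hypothetical field id

// "pageshow" fires on normal loads and when the page is restored from the
// back/forward cache (event.persisted === true), which covers the back-button case.
window.addEventListener("pageshow", () => {
  if (passwordField) passwordField.value = "";
});

// Belt and braces: also clear it when the user navigates away.
window.addEventListener("pagehide", () => {
  if (passwordField) passwordField.value = "";
});
```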
[00:28:36] Speaker B: And this is what we're relying on.
[00:28:37] Speaker A: But this is what I mean: if there were legal repercussions for doing bad things like that. Your customers' lives rely on this, whether it's a bank account or whether you bought a Tesla. The software that people write affects lives directly, not just a little bit of money stolen here and there, and there are pretty much no legal consequences across the board to stop people screwing up. They're trying to figure out how to take Medibank to court, but what is the point of hitting them with a multibillion-dollar fine? It's just going to screw healthcare across the country. What happens if you take on Optus and give them a billion-dollar fine? It's going to screw telecommunications for the country if they go out the back door. And if we don't do anything about it, it's like the whole banking collapse in the US, where the banks basically got off scot-free. These big companies have software people writing code that directly affects our lives, and there are essentially no laws around the world that address it. How many people's lives were affected by the giant global outage with CrowdStrike? From what I last heard, it was considered to be around a hundred billion dollars' worth of damage: people being embarrassed in shops and banks, airlines and flights not being able to take off, all these different things. What's the penalty against CrowdStrike, for a start? How do you penalise a company globally? People don't seem to understand that the concept of international law only covers very small things globally; not every country in the world can take CrowdStrike to court. In a global situation you could suddenly have a hundred countries running a hundred law cases, and then what are we going to do, put them out the back door? It's very complicated. I'm a little bit confused about how we do this, because if we proactively try to do it, does that mean people stop making software, or are they just going to be more careful?
[00:30:47] Speaker B: The answer to that is who knows. But one thing you mentioned before with people, right? Like let's just look at general people. Do you think people's mentality towards cyber is like well who cares? It's not my problem. I'll give you an example.
[00:31:01] Speaker A: No one cares, correct?
[00:31:03] Speaker B: No one cares really about anything.
[00:31:04] Speaker A: Yeah, no one cares about anything until it affects them. You don't care about the death of a close family member until you have one. Your friend's mate down the road whose grandmother passed away is distraught, but nothing affects us until it affects us directly. No one cares about smoking until you get sick. No one cares about riding your motorcycle like an idiot until you come off. It's like I used to say to people: they're standing on the roof with a towel around their neck going, I can fly, and we're down there going, no you can't. And you can't tell them. You can't tell them that they're going to hurt themselves. And that's what proactively or pre-emptively creating laws about some things is like. That's where New South Wales gets accused of being a nanny state, because it pre-emptively banned a whole bunch of different things, like different types of weapons, and everyone just goes, oh, they're the fun police. So I guess the problem is, will anyone care? These people being scammed on the current affairs shows, I'm sure they watched the same shows a month before, saw someone else getting scammed, and went, that would never happen to me. And now it happens to them. How do you convince people of the implications of something when they do not believe it? That's why we have anti-smoking ads, we have don't-drink-and-drive campaigns, we have to put police on the streets, because they still do it until they get caught. And even then, stupid people keep doing
[00:32:40] Speaker B: It afterwards, the reoffenders. So okay, this is interesting, because I agree with you: at the end of the day, no one really cares about anything unless it impacts them or they've been harmed by the thing. Someone on the show before has spoken about kids touching the hot stove. Your mum says, don't touch the hot stove, Skeeve, and then eventually you touch it and burn yourself, and then you sort of learn. Now, I hate to be pessimistic, but would you say it has to get to the point where everyone has had some cybersecurity scam-related thing happen to them before people really get it? And I ask that lightly. I'll give you an example. When I was 18 years old I had just got my licence, and I hit someone. The guy wasn't injured, but I hit him from behind. That was a fair few years ago now, and I haven't rear-ended anyone since, because I had that incident when I was 18. I'm more mindful now; I don't want to do it again because I've been there before, and the problems, and then I had to deal with the insurance, it was a headache. If I weren't cognisant of it, I'd have to go through that entire ordeal again. So do you think it's going to get to that point? Because people have come to me who haven't really cared about cyber until something's happened, like, oh, my ex-boyfriend is stalking me, now I care. And look, I get it, right? Because there are other things in life that people tell me about that I don't really care about either. So everyone is in the same boat until something happens to them. But then that doesn't solve the issue. So what would you recommend? What do we do now?
[00:34:20] Speaker A: That's a very difficult thing. It's like, don't eat so many Big Macs; people don't stop until they suddenly get diabetes. We seem to need to experience something. This is just being human; it's nothing to do with technology or even cyber, this is life. You can't tell the 16-year-old girl whose boy smiles sweetly at her. The mum goes, no, no, he's bad news, he's a bikie or something, and she's like, but he's so cute. She's not going to listen. This is just who we are as humans, across the board. So I guess from my perspective we have to accept that that is the situation. We can't pre-emptively explain to everybody in the world everything that could go wrong. We've been trying to do that all through modern history and it doesn't work. So given that that's a thing, and that people are not going to listen until they're personally affected, the question is: what do we do now? There is what you might call reverse punishment, meaning if you do one of these bad things cyber-wise, your ability to have a bank account with certain banks or to use these types of services gets suspended for a year. There needs to be some sort of consequence, because the last thing you want is people losing hundreds of thousands of dollars. I don't know exactly what that looks like. Maybe, now that we've all got that new government myGov thing, it's: if you do not set your password properly and go through this three-minute exercise on how to make a good password, your Social Security is suspended, that type of thing. That's a very simplistic thing to say, because there are obviously so many people: illiterate people, poor people, homeless people, so many different situations. But ignoring the complexity of that, having repercussions is the only way to get people to do the right thing. People obey the road rules not because they want to, but because they're going to get in trouble if they don't. I mean, there are good people out there who do want to, but they don't seem to be a significant proportion of the population. It's like one of those moral questions I like: what would you do, or get away with, if all laws were suspended for a day? Most people go, well, I would..., and it's only about 10% of people who go, nothing, I'd just be myself and do the right thing. And that's the problem: the percentage of those people is small. So we do need to somehow bring in rules along the lines of: if you are part of X, Y, Z and you don't do some very basic, simple things, then you lose access to something. Or, one thing I learned about the other day: you can get a motorcycle licence in Australia and eventually ride anything you want, whereas in Japan you can ride up to about a 400cc motorcycle and then you need another licence. So maybe it's one of those things where you're not allowed to use all these online government services, Medicare and so on, unless you go into Centrelink or Services Australia, sit down like a driver's knowledge test and do a ten-minute basic thing. If you do that, then you're allowed to use all these online services, and you just put in your little QR code: I want to open a bank account with CommBank.
CommBank goes, have you done your little thing? And this is actually not that complicated. I sent a package to the United States yesterday, and the guy handed me a QR code that sent me to a government website where I had to fill out five little questions about an export declaration, just what was in the package. We have systems in place for this: anyone wanting to send a package overseas worth more than X has to do it. So clearly we have the functionality in society. Why would it be that complex to say that for any online services that involve money or the like, bank accounts and trading accounts and health and everything, you're not allowed to use them until you go to your state government, Services Australia or the RMS or whatever, and do this little bit of basic cybersecurity familiarity? That doesn't sound that burdensome on society, and it would pretty quickly educate everyone, at whatever age. Actually, I just thought about this: it could be a very interesting thing. You've got the New South Wales and South Australia social media youth conference on in the next two days; why not make the same thing for kids? If you're 14 and you want to get on the Internet and use social media before you're 16 or 18 or whatever, go down and do this little test that runs you through a few questions, which is no different from someone at 15 and 9 months getting their learner's permit. We have these solutions in place. Why not? They don't seem too burdensome on society, and it would mean that everyone online doing these things had at least done a basic test; they'd answered the three questions about what makes a good password and what a scam might look like. Because the people being scammed and having the big problems at the moment, the people most at risk, are the people that have no idea, that do not have even the most basic knowledge. It's almost like putting them on high-performance motorcycles without a licence. That's what the Internet is like for some people.
[00:40:17] Speaker B: But wouldn't you say as well that, look, I get what you're saying, but then my mind goes: people probably haven't implemented this because they're trying to hedge their bets. What I mean by this is, there was a retailer in Australia that got breached. Now, I asked them that question: don't you think you sort of hedged your bets with your customers? Because if there are too many barriers to entry, specifically for an e-commerce site, people just won't shop with them; it's too hard, I can't get in, I don't know anything about the Internet, it's all too hard.
[00:40:46] Speaker A: Right, so make it the same for all of them.
[00:40:48] Speaker B: Yes, but they hedge their bets because they want the revenue through the door. If it's too hard and it's all too difficult, people are not going to transact. So would you agree that companies are potentially gambling, like, well, let's just hedge our bets. You're probably right in theory, but let's just prey on the fact that people are at Skeeve's level and not at, you know, a year-one sort of level. So I hear what you're saying, but I could go on and on about this just from the conversations I've had.
[00:41:21] Speaker A: Well, companies are never going to spend money that they don't have to.
[00:41:25] Speaker B: Well, that's what I'm saying. They're preying on that, though; they're preying on the fact of, well, hopefully you're right, but let's gamble on it and see. Or else we wouldn't have half the things that have occurred.
[00:41:35] Speaker A: But see, the thing is, no one wants to be singled out. The biggest argument I hear, whether it's about smoking or about vapes, is: oh, but what about the alcohol people, and this group, and that group. You have to treat everyone the same. One of the things that's been frustrating me for years is your typical home routers where they just didn't bother putting IPv6 in. We have been out of IPv4 addresses for so long, and the Internet is having lots of problems because of it. The problem is there's no decent commercial case for rolling out IPv6 on a large scale. But if the government suddenly went, we're not buying anything that doesn't have this function in it; if the government said the same thing to every retailer at the same time, which they've done many times with privacy rules and all these different things; if you say everyone is bound by the same rule, then the retailers will do what they're supposed to do, they won't feel singled out, and they won't whinge about it too much. That's how you do it: you get everybody.
What I do hate is laws that do nothing.
[00:42:47] Speaker B: Give me an example of a law that does nothing.
[00:42:50] Speaker A: Laws that do nothing are things like the new deepfake laws. Firstly, it's going to be an offence to distribute deepfake or nude images of someone, and you go, that's reasonable. And then they go, it's also an offence to create them. And I'm like, create an image for myself? Technically it's not much different from me hand-drawing something, in the privacy of my own home. There are laws that just don't make much sense to me. Like the one that was prosecuted in the Victorian court about the Nazi salute. By the way, I'm completely against that sort of thing, but they've literally put a law in place about how someone moves their hand. I'm finding some of these laws just a little bit bizarre; they're telling us how we can think. There's a new law coming about online offence, and the sad thing is it seems to be an offence even to discuss the online offence law, this whole new, like, woke culture thing; laws that wrap themselves around each other and make everything complex. I don't like laws that don't seem to have any actual purpose, or that are very, very easy to circumvent. Anything that you can circumvent easily is an issue to me.
[00:44:08] Speaker B: Okay, look, this is interesting. So what is the premise, then, of putting in, to your point, laws that do nothing? Is it just to give the illusion of doing something, even if they don't make sense and contradict what they're saying?
[00:44:22] Speaker A: I spent 10 years advising politicians and that's exactly what they do.
[00:44:26] Speaker B: I'm so rattled by that.
[00:44:27] Speaker A: I've seen things that have made me walk into a corner and put my head against the wall. I was in Parliament House about five years ago, in a room where discussions were happening with some federal-level MPs, and I literally walked over to the corner of the room and put my head in the corner because they were talking absolute nonsense. And some of the laws that have been proposed over the last 10 or 15 years, and I don't know how to say this, have actually been proposed to see if the public will be upset by them at all. Like Abbott did the whole 'no one can talk about anything to do with the boats' thing. The sad part is they expected the public to push back. The public didn't. So they went ahead.
And the number of national security laws: Dutton and Turnbull did not get on, but Dutton went, I'm going to push this through, and Turnbull actually thought at times it wouldn't get through. The Australian public, if they do not understand something, do not care. I think it was about four elections ago, when broadband was coming out, 100 megabits to the home, and 'who needs this?'; the public had no idea what anyone was even talking about. There are a lot of these things that go through where, if the public doesn't clearly understand the purpose, sometimes, and I know this sounds odd, the politicians want the public to push back. It's like case law. And this sounds funny, and most people are probably unaware of it: with the ATO, if you have a questionable situation, the ATO goes, we don't agree with you, we're taking you to court. And you go, that's not very nice. And they go, no, no, you misunderstand us, we are going to pay for your defence of our case. That's because the ATO hates ambiguity. If something's ambiguous, they want it sorted out in court. So they'll go, we think this, even though it might not be clear in law, and we're going to take you to court, but we're going to pay for your defence, so they can get a ruling on it. That makes it unambiguous in the future.
[00:46:37] Speaker B: Whoa, that's crazy.
[00:46:38] Speaker A: But it's not really, because the last thing you want as a government organisation, or any organisation that's run by regulations, is to have something completely ambiguous, because it means you freeze and you get stuck. So paying someone else to run a legal case: it wasn't really against them, they don't really care, they just want to enforce the rules. But when the rules are ambiguous and blurry, like back when cryptocurrencies first came out and all these different things, the ATO just went, by the way, we're going to fund these cases because we need clarity. And that does make sense. They do this a lot with normal laws that come out: we want to bring this out, and sometimes the public just shrugs their shoulders and goes, okay. I mean, what did we do with the lockdowns? In America, it was 'you will stay at home', and the Americans went, screw you, we are not staying at home, I've got my rights, blah, blah, blah. In Australia, we were told, stay at home, here's some money to watch Netflix, and it didn't matter how right or how left you were, we went, okay.
[00:47:38] Speaker B: So then, I was thinking about that as you were talking before. Do you think, in your experience, it's better to be more of a compliant society than a non-compliant one? I get that from a government perspective they want people compliant. But just in your experience, which would be better? Because we do need people challenging stuff, right? Or else we're never going to get ahead of these cyber criminals. And we won't, I don't believe so. But I think this is interesting because you have had a, you know, crazy background.
You know, I think you're obviously well versed to answer this: which one is better?
[00:48:20] Speaker A: It's complicated because people are complicated. I really love bipartisan politics; I think it was Barack Obama who won and then appointed a Republican to a big role. The right person for the right job is always the right thing. The problem is that we can't do that, especially in Australia at the moment. We are at the point where if Albanese said it's daylight outside, Dutton would pop up and go, no it's not, that's just a different perspective. It's like they have to disagree. Turnbull was first rolled as leader of the opposition because in one week he agreed with the other side; Labor was in power, and three times in a week he went, yeah, that's reasonable, and then they rolled him because he agreed with the other side. There are so many people that will not agree just because that's the standard position, whether it's the Greens or whoever, always taking a different stance. But I don't think we disagree on what's right; we disagree on how we go about it. If somebody came up with ten points about safety for people relating to technology, it wouldn't be hard to get agreement from all sides. But could you get a law through about it? Of course not, because of the implementation. What are the arguments at the moment? There are a couple, whether it's housing or the Reserve Bank: there are a few things where all sides agree, but they don't agree on how it should be done, who gets the money, where the focus goes, the process. And this is the problem: we'll never be able to get agreement from the population on how we go about it. There are a lot of baseline concepts in society that you would agree with. No one out there in society, with highly rare exceptions, is going to think that people should be allowed to just walk up to someone else and punch them in the face. Everyone agrees on that, but they'll all disagree on the laws, the punishments, the penalties, the processes, everything surrounding it. In cybersecurity, if you ask, should anyone be allowed to be defrauded, everyone's going to go, absolutely not, everyone should be protected from fraud. How do we do that? There'll be 5,001 suggestions from 400 organisations, 350 of which you've never heard of before, not even actually representing the people they claim to represent. I'm a Christian, and the Christian groups were going down the line a few elections ago saying blah, blah, blah, and I've got gay and lesbian friends who were like, what? Same-sex marriage? I don't even want that. The problem is that these groups are just the loudest people who get to talk, and I don't know how we get to a shared view of what is safe for society. Albanese couldn't even make a statement about October 7 and the Gaza conflict saying everybody should be safe and nobody should be dying, without the Greens and the Liberals going, you shouldn't be saying that about the Palestinians, forget about them, it's just about the Jewish side. You can't even say that people shouldn't die. They all kind of agreed.
They all proposed their own versions of it, but they couldn't agree together on how to say it. We can't even get something as simple as 'we shouldn't be killing children' through Parliament; we can't even get agreement on the words and the way it's said. So how are we going to get some basic rules of operation? And I'll be honest about this: sometimes, with some things, I think you just don't ask.
I am frustrated that politics is about getting in, spending the first year or two doing some things, and then the next two years trying to get back in next time. We spend so long not doing anything while they're trying to get ready to get back in that we don't have time to get any of these things done. Any government that promises a bridge being built by 2030 doesn't mean anything, because in two, three, four years the next government can completely change it. So how do we get things in place that require years? The current government, whichever government it is today, might come up with a very good baseline of security awareness and funding for it in schools or something like that; the next government might turn it off. How do we deal with the fact that we're becoming more and more at risk from things that are scarier, more indistinguishable from reality, faster than ever and more fantastical than even the movies we watch? This technology is getting to a point where it's going to cause significant problems. The law is already freaking out right now. The problem is who writes the law. Scientists don't write the law; lawyers and politicians write the law, and they barely understand the deepfakes, what's happening in Hollywood with AI-generated characters and movies, what's real and what's not real. The militaries and the police are dealing with these problems too, with images and what's real and what's not. How do you deal with all the things that are going on? And while organisations and governments are struggling to deal with it, people are just going to be the victims in the meantime. And we've got other problems coming that are going to be equally devastating to people's lives. For example, self-driving. We know the cars are nearly ready; Tesla is ready to put them on the road. I was in meetings about self-driving, with the self-driving council in Australia, three or four years ago. Insurance companies don't know what to do. Each state still has its own rules, so a car that might be allowed to do what it's supposed to do might get to Victoria and have to change its programming rules as it crosses the border.
What happens there is going to affect everybody in a significant way, because does the car act in your favour, or in favour of the person it might hit? Who makes these moral judgments? What if a government decides that in Australia cars can only favour the person outside the car? So your car may swerve to avoid a child and hit a semi-trailer, but who's going to buy that car? And the next government might change that rule. These are the things that frustrate me, where some of these things may be common sense. And I do agree that one government may do something that isn't great and the next one should fix it, but who decides where the centre point is, and what is reasonable for society? We've got so much coming at us right now, and people barely understand technology. I always say when I'm doing my futurist talks that most people's understanding of technology is about five to six years behind now. Very few people know that you can drive a car by thought; I've got the technology sitting on my shelf here and it's been around for 12 years. There are brain-control devices. There is so much stuff out there that people do not even know exists. I've had microchip implants in both hands for about eight years and people are still like, wow, that's sci-fi. The problem is that's where the general populace sits. They don't understand AI; all they're hearing is 'AI is going to take my job' and they're freaking out.

An example of this, even in the IT world: I jumped on a LinkedIn livestream yesterday about IT careers. I was curious about what they were saying, and it was about being a web developer. For a start, there's no such thing as just 'a web developer'; you specialise in front end, back end, UX, full-stack languages, and AI is about to obliterate them. In the chat I actually put a prompt into ChatGPT saying, design me a JavaScript that will take a form and enter this and do this, blah blah blah, and I hit enter and posted the result. Web developers are done; they need to become prompt-engineering front-end something-or-others. But they haven't even thought about the training for this yet, and their entire industry is about to be obliterated. We know this from what AI can already automate, from legal research onwards, and they've already proven that some of the new AIs can produce PhD-level documents and do it in three minutes, and this is what we've got right now.

And the world doesn't understand what's about to come. They don't put the meta picture together; they see, oh look, there are lots of robots out there, Tesla's making a robot, I've got a robot dog. And I really would like to buy one of the new Unitree H1s; they're only about 20 or 30 thousand dollars, and these are genuinely humanoid robots. What happens when that's walking around with a ChatGPT-equivalent level of intelligence in it? We've got that, tick. I can get my robot dog an upgrade that's got ChatGPT in it, which I'll then be able to argue with. This is not the future. This is one of the reasons I sometimes don't call myself a futurist; I'm more of a now-ist. These are things and tech we have here today.
Ten years ago I was in Japan and I had a robot take my hand and walk me around a store, explaining things to me. Ten years ago. And it's about to converge: you'll start to see robots in our houses very, very quickly, within the next two to three years, with AI that's ramping up at a rate that has obliterated Moore's Law. It's now every six months, or some even shorter period, that the power and intensity step up. I consider myself quite knowledgeable in this space, in AI and all these different things, and honestly, I'm struggling. I'm struggling with the ten YouTube news videos about AI that I have to watch a day just to barely keep up, and that's me, someone at the cutting edge of all this. But this is the problem, right? The bad guys are going to use AI to do the attacks. Do we have any cybersecurity professionals proficient in AI defense? Do we have any education even teaching it? Nope. Do we have any products that you can actually buy?
No. So we've got a war coming and we've got no defense for it. It's the same with the Australian military. They generally have a policy of human command, meaning a remotely controlled device can't make the choice to shoot and kill on its own; it must be approved by a human. They did, however, invest in a lot of equipment that could be upgraded. But how do you defend against this? Over the last five or six years I've been involved in a few joint task force discussions: how do you defend against a swarm of a thousand drones carrying small plastic explosives? The only way you can defend against that is with a swarm of your own that's allowed to do what it needs to do. Then we enter a complicated world: what if someone takes control of your swarm and sends it into the populace, and other things like that? Everyone in this space is watching Ukraine very carefully. They're doing amazing things; they're breaking all the standard military rules. What Russia and Ukraine are doing against each other, taking drones and changing them on a daily basis. One thing I saw Russia do recently that was really something: they captured a Ukrainian tank, automated it, and were driving it around by remote control. This is not standard military, where you have procedures; this is them making it up as they go. They're like maker soldiers: let's see what we can do today. The police here are going to reach that point too, where they turn up to a fight and have to figure out that standard procedures aren't going to work. And what happens when my robot dog gets hacked and it's out of control, it's got some basic AI on board and it's running around the street zapping people? Let's say it got hacked and got out, and the police turn up at my door hours later and say, is this yours? And I go, oh, I didn't even know it had gone missing. And then they arrest me because I'm the owner of a robotic dog that went rogue. There are no laws for any of this yet, and they don't know how to create them. Who do you blame if your Tesla has a software glitch and, while self-driving, drives through a shop window and kills two people in Sydney? Is there a programmer in Texas who suddenly gets a hand on his shoulder: you did that update overnight, and by the way, your typo on this line of code has just killed 75 people around the world before we managed to stop it. Does he get a life sentence? Does he get executed? Does Texas still do executions? Maybe. Who is responsible?
[01:01:53] Speaker B: These are the questions. These are the questions. No one knows.
[01:01:55] Speaker A: Well, no one knows. And it's not just that no one knows. One of the examples I use, with liability and self-driving cars like Tesla: let's say the government said, we're not allowing self-driving cars on the road unless you show us the code. They won't, but let's say Tesla says sure and hands over the source code. Has the government got the capability to even read that code? Nope. So who do they get? Do you think any third-party consultancy is going to take that on? Is KPMG, with all its people, going to say sure, we'll take that on, we'll declare it okay, and then be liable when something bad happens? No one's going to want the liability, and it's not like anyone even has the skill. It's a known admission at the moment, right? We don't actually know how it works. The CEO of OpenAI has openly admitted they don't exactly know how their AI works. We're basing the world on something that nobody knows how it works. A little scary, but we're going to put our entire lives and infrastructure on it, and we're going to build it on the back of something we don't actually understand. And then what if it gets hacked? That's where my brain comes in, with the weaponization of technology. I look at everything that gets released and ask, how could I abuse that? Years ago I figured out how to hack a cochlear implant. I was working with some people, talking through the details, and I had a guy there who had a cochlear implant, and I said, can I test something? He said sure. I was able to generate the frequencies, which nearly knocked him out, and he was like, ouch, don't do that again. I said, well, you did agree, and he said, yeah, fair enough, but that's not happening again. Then, I think about a month afterwards, I demonstrated how to remotely kill someone by hacking a cardiac implant, a heart pacemaker. Someone said to me, how on earth did you figure out how to do that? I said, I bought one on eBay for $200. And the problem is that all these devices, whether it's a car, a pacemaker or a cochlear implant, are amazing devices made by amazing people in the medical world, but they're not cybersecurity people. On my computer and my phone I'm allowed to use whatever brand of cybersecurity software I deem fit; I have options, right? Norton, McAfee, et cetera. Can I put that into my Tesla? Can I load Norton for Tesla? Can I load something into my dog? Is there a Symantec for robot dogs? No. And these people are not cybersecurity experts. One of the things fundamental to what I talk about is cybersecurity education, and the education is a joke, it's terrible. There are still tens of thousands of roles that don't exist. There's no cybersecurity person with the skills who also understands the Internet of Things and also understands agriculture, who's going to be able to protect our farms, or mining, or manufacturing. You need to understand the industry that you're in, and those very specific professionals don't exist, or if they do, there's one guy. And that's the sad thing: we're already struggling to get there with cybersecurity in those areas. Now let's add AI to that, and that's where we're stuck.
We're probably 5% of the way into the maturity of cybersecurity knowledge and skills that we need to have, and the game has already changed: everyone who worked on configuring Ciscos and Junipers and everything like that is about to become irrelevant. I was actually testing ChatGPT to see what it could do, and it did freak me out a little, because being an ex Cisco and Juniper engineer, I thought, let me throw some questions at it. I told it: I'm getting an attack from multiple sources, I've got BGP with multiple feeds, and I've got this and this and this and this. What configuration could I apply? And it just wrote the config out, and I read it and went, oh yeah, actually that would work. Oh no. This $20-a-month product has eliminated the need for cybersecurity people in those roles, writing configuration and stopping things. All we've got to do is give the IDS, the intrusion detection or intrusion prevention system, some functionality plugged into a ChatGPT-style back end and let it defend our networks, doing what it needs to do with the data it has at the time, at a speed no human, today or ever, will be able to match. We should be able to do that now. And all these vendors are going to ship these products; I can guarantee you that maybe by about February or March they'll start to hit. They'll start to have the Juniper SRX with AI. I've got my new Galaxy S24 and they've just rolled out Galaxy AI. Those things will hit the network engineering world and the corporate world very soon, and everyone will have to buy their cybersecurity stuff all over again. That's going to annihilate the cybersecurity workforce by eliminating a whole bunch of roles that, honestly, probably aren't needed anyway. That's where an entire industry is going. But then who's left to teach everybody else?
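As a rough illustration of the "IDS plugged into an LLM back end" idea described above, a sketch might look like the following. It is not a product or the speaker's implementation: it assumes alerts have already been parsed into plain-text summaries (e.g. from Suricata or Snort logs), the model name is illustrative, and any suggested mitigation would still need human review before touching real network gear.

```python
# Sketch only: feed an IDS alert summary to an LLM and get a suggested mitigation.
# Assumes `pip install openai` and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def propose_mitigation(alert_summary: str) -> str:
    """Ask the LLM for a suggested response to one IDS alert."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a network defence assistant. Given an IDS alert, "
                    "suggest a mitigation (e.g. a firewall rule or BGP filter) "
                    "and explain the risk of applying it."
                ),
            },
            {"role": "user", "content": alert_summary},
        ],
    )
    return completion.choices[0].message.content

if __name__ == "__main__":
    alert = (
        "High rate of TCP SYN packets from 203.0.113.0/24 to port 443 "
        "on the edge router, 40k pps, started 2 minutes ago."
    )
    # Print the suggestion for a human operator; do not auto-apply it.
    print(propose_mitigation(alert))
```

The design choice that matters here is the last comment: the speed advantage comes from automating the analysis loop, but closing the loop (letting the model push config to routers unattended) is exactly the scenario the discussion warns about.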
[01:07:46] Speaker B: Look, there are a lot of questions here, and I think that's why I run the show: to get people on like yourself to share your thoughts and your insights.
[01:07:54] Speaker A: It's a nervous world. I do want to create a bunker and go and live in it.
[01:07:59] Speaker B: I couldn't think of anything worse than going to live in a bunker.
[01:08:02] Speaker A: Oh no, I just...
[01:08:04] Speaker B: I knew you were the type of dude to do something like that.
[01:08:05] Speaker A: Well, it's not that. It's more that cyber and the Internet used to be something you could get away from. You'd log off, you'd leave your phone at home. But now your car's online, this is online, everything's going to know where you go. I mean, I'm talking to you over Starlink, Elon Musk's satellite network, and I could be doing this sitting on top of Ayers Rock at this same quality. You can be anywhere. Starlink is just about to bring out the mobile phone version of this, which is going to upset an awful lot of people, because borders and things like that get messed up and governments can no longer intercept you. We're just in a very complicated world that I don't see getting better. The educational path is the problem; education is a difficult situation, because you need experts. This is what I love at the moment, right? People go, oh, I'm an AI expert. Really? What degree have you got in AI that you did over the last four years, and how many years of experience have you got doing it, to become the expert? There are only a handful of those people, the kind you can count on each hand. Everyone else is just a talented amateur like me. I'm a talented amateur; I just happen to be in the right place with the right exposure. I've got no degrees in AI or anything like that, I've just been around a long time. But people like me need to be going and helping the universities write the courses that teach the professors how to teach the students. That's two or three years to get the courses ready and three or four years to get the students graduated, so we're talking five to seven years before capable people come out. And how long has it been since AI went absolutely off? The last two or three years. So by the time these people graduate, where are we going to be then? That's the whole education structure. Up until two or three years ago we had this world of certification: I'm a CCNP, I'm a CISSP, I'm all these different things, and it was a little bit calmer. We had a baseline: if they'd got that, they could probably do that. The entire cyber world is about to be blown apart, because everyone's knowledge is about to become irrelevant, and the work moves at a speed humans can no longer work at. It's just a fun time. It's a fun time, and I'm a little scared about where it's going to go, but I'm going to sit here watching it like a slow-moving car crash.
[01:10:32] Speaker B: I have one last question, and you have to answer it quickly because I know you.
What happens now, after everything you've discussed, all of these things? What do you think happens now? You're a futurist. What would be some of your predictions on where we go from here?
[01:10:48] Speaker A: The problem is that we are not willing, as a society, to change the way we're doing things, so everything we've been doing is what we're going to keep doing, and it's going to hit us harder and harder. The giant breaches that we've been seeing are going to get worse and worse. I saw a headline at the bottom of my screen in the news yesterday that the Australian government's Department of Foreign Affairs is unhappy about the leaking of passport details in Bali. These breaches are going to get worse and worse and worse. And because of what this government wants to do about centralizing our data, so that organizations don't have to hold it, who says they're not going to get done? At some point even they are going to get breached. So we need a different way of thinking, because it's going to get worse, to the point where it actually hurts people. I have one significant worry, and it's what most people don't understand about the next couple of years of warfare. We've got wars going on, Russia and Ukraine, Israel and things like that. When China starts taking things seriously with Taiwan, people don't realize the war is going to end up in each of our houses, where we're sitting here and we lose power, or we lose gas, or things like that, because what they're going to go after is the infrastructure. Everyone thinks that war happens over there; most people don't realize the war is happening here. And I'm really nervous, because there are things happening at different levels. If you get scammed today, you go to the police and they don't know what to do. What happens when the police don't know what to do? They don't have the expertise. They might have some framework of laws, but the laws they're getting are more and more complicated. Look at the laws around domestic violence and things like that; they're now so complicated that even the prosecutors, and I know a couple of them, are not looking forward to it, and that's ordinary law about something relatively simple. Now add AI: proving it wasn't you, the AI did it, my AI-run home got out of control. These cases are going to go mad, and breaches are going to become more frequent. And just like how we started this conversation: what's going to happen when we hear about the next big giant breach? We're going to look at the TV, we're going to nod our heads, and is it going to change our behavior? Nope. Have Optus and Medibank and the others changed our behavior at all? Could they have? Is there anything we could do differently? We can't not give our documents to these organizations, otherwise we can't get anything. Do these big scary things actually enable us to change our behavior in any way? So I don't see the government's new services plan, what they're trying to do, as being the final solution. I think it's going to be helpful, but given that these organizations will be connecting to the government to verify the data, all you need is one of those networks to be hacked and the same thing happens; maybe they'll be able to access the government's data that way. It's not like we've even got proposals in place that are going to make things better. I live in hope that getting kids off social media will help them mentally, with mental health and things like that.
I think that may help the eight and nine year olds coming through; how we fix the ones that are already broken, I don't know, but I see that as potentially better. I don't know how to help the general population, though. I don't know if the government spending a billion dollars tomorrow on education would help. The only thing, as I said before, is if you start to bring in laws that control things. We have nothing like that, nothing on the table, nothing suggested, and these things get talked about for years. We've got elections coming up; has anyone mentioned it? No. Which means it's potentially not going to be on anyone's agenda for five years, and where will all this be in five years? Politics and government and elections and policy move too slowly for what's happening now. But imagine if the government suddenly jumped up tomorrow and put out all these laws. They can't even get the housing policy through the houses, and that's designed to help society; these would be designed to help society as well. And if they can't get them through, then we're stuck. We're just going to go further and further down, with more people being badly affected by the weaponization of technology. I've already done simulations and scenarios. I'm not going to go into them, because I don't want to give people ideas on how AI can be misused, but I've talked about them in a couple of closed briefings, and the response is, tell us that can't happen. And other people are like, yeah, that's actually not the future; we could go and create that today. We've got the tools to do some of these things today. It's now. This is what's going to happen. All we need is the bad people to do things worse, and it's going to happen.
[01:16:16] Speaker B: This is KBCast, the voice of cyber. Thanks for tuning in. For more industry-leading news and thought-provoking articles, visit KBI to get access today.
This episode is brought to you by MercSec, your smarter route to security talent. MercSec's executive search has helped enterprise organizations find the right people from around the world since 2012. Their on-demand talent acquisition team helps startups and mid-sized businesses scale faster and more efficiently. Find out
[email protected] today.