Episode 246 Deep Dive: David Batch | Demystifying Privacy by Design, Understanding Its Challenges, and Identifying Solutions

KBKAST

Mar 01 2024 | 00:46:23

Show Notes

In this episode, we are joined by David Batch (Executive Director, Privacy Advisory – CyberCX) as we delve into the principles of Privacy by Design and the need for organizations to prioritise privacy as a pillar of trust. From data retention challenges to the impact of data erasure on regulated industries, we navigate through the complexities of privacy management. Join us as we explore the potential catalysts for change, the role of regulatory frameworks, and the innovative solutions for empowering individuals to take control of their privacy.

David leads the Privacy Advisory Practice at CyberCX and brings to the role over 25 years of government and industry experience. Prior to entering the consulting sector, David was responsible for privacy risk management at State Street Corporation across APAC, built and led the CBA’s first dedicated privacy team, led the privacy function at NEHTA (now the ADHA) and was the online safety, security and privacy lead for https://MySpace.com in Australia/NZ. Prior to his privacy-focused roles, David was a Federal Agent with the Australian Federal Police and held a number of investigative, legal policy and training roles during his nine-year tenure with the agency.

In addition to these roles, David created the annual consumer and organisation privacy research publication, the Deloitte Australian Privacy Index, and was its principal author for four editions. David also launched the annual Australian Privacy by Design Research and Awards in 2022, for which he and the team were named ‘Cybersecurity Researcher of the Year’ by the Australian Information Security Association.

Episode Transcript

[00:00:00] Speaker A: There needs to be a conversation. The bar needs to be raised. There should be data ethics guardrails about what you will and won't do with data, what good consent looks like. Do you think that you've really got someone's permission to do what you're doing with the data? Because on page five of the privacy policy, you said you'll share with your third parties. I don't think so. I don't think that's good consent. So I think there needs to be some guardrails around that, and hopefully our law reform will raise the bar a bit there. [00:00:29] Speaker B: This is KBKast as a primary target. [00:00:33] Speaker A: For ransomware campaigns, security and testing and. [00:00:36] Speaker B: Performance risk and compliance. [00:00:38] Speaker A: We can actually automate that, take that data and use it. [00:00:46] Speaker B: Joining me today is David Batch, executive director, privacy advisory, from CyberCX. And today we're discussing navigating and implementing privacy by design. So, David, thanks for joining and welcome. [00:00:57] Speaker A: Thanks for having me, Karissa. [00:00:58] Speaker B: So I want to start very basic, because sometimes people's interpretation of privacy by design differs, and in our cyber world, we like to use acronyms like PbD. So maybe let's start there. What is it and what does it do? Just so everyone's on the same page before we start getting to some of the nuances around privacy by design. [00:01:19] Speaker A: Yeah, sure. Look, privacy by design came about as a creation of a former Ontario privacy commissioner called Ann Cavoukian. Back in the 1990s, she and her team there at the privacy commissioner's office recognized that businesses were taking very much a compliance approach to managing privacy risk. And as we know, laws, particularly as they relate to data and technology, are usually lagging behind advances in those things. So she thought, let's develop some principles that, if organizations adhere to them, they'll get privacy risk management right. So there are actually seven principles that have been developed, and they've been quite timeless. I mean, they're still applicable today, which is actually quite remarkable for something that was developed back in the nineties. But, yeah, that's what they are. And they actually have universal application right across the world. And if you look at most privacy laws around the world, which dictate what organizations should and shouldn't do with personal information, they actually work really nicely in sync with those. [00:02:24] Speaker B: So would you mind running through the seven? You don't have to go into any sort of detail, but maybe just list them out. [00:02:30] Speaker A: Yeah, sure. So the first principle is around privacy being proactive, not reactive. Before I even get into the seven, I guess, if you can just picture seven circles like a Venn diagram: there's a lot of overlap between them, but there are unique qualities to each. That first one, privacy being proactive, not reactive, is about actually taking charge of privacy risk. One thing that organizations can do to proactively manage privacy risk is having something like a privacy impact assessment process built into your product development lifecycle. Anything that you do with personal information, just making sure that you've got something there to assess risk proactively, not responding to incidents as they occur and then making expensive changes to the way you do things after the fact.
The second principle is about making privacy the default setting in what you do, and a really good example of that is something we all see. We all get cookie banners when we visit certain websites on the Internet, and often they'll ask you to opt out of tracking and marketing cookies. But if you were to apply this principle, you would actually be asking people to opt in. So having privacy as the default, and then letting people choose how much information you collect from them, particularly if it's information that isn't strictly necessary for the function that that person is seeking from you, is a really good example of having it as the default. The third principle is privacy embedded into the design. Again, this is an example of how things overlap, but if you're designing something, it's making sure that you've got privacy protections, and some of the other things that the other principles speak to, built in from the beginning. That will be achieved if you are applying default-setting kinds of approaches to things, and if you are doing privacy impact assessments before you actually finalize and build that type of thing. The fourth principle is about full functionality, and the principle here is largely that it doesn't have to be a zero sum game when it comes to privacy. You can have privacy and you can also have the full functionality of a website. A really good example would be that a lot of organizations require you to authenticate before you use their service. If you think about the ABC website, for example, you can watch that content free on the television, but they're now asking you to authenticate to go on. And if you don't authenticate, you can't use it. That's an example of zero sum: they're taking information from you so you can then actually use the service. What would be positive sum is if they gave you a choice either way. You might get some extra features because you've logged in, but that's what full functionality is about; there doesn't necessarily have to be a trade off between the two. The fifth principle is around security. So does what you've built have end-to-end information lifecycle security? Which is obviously a really important part of privacy. Not all privacy is about security, though. Some people do have that misconception: if I get security right, I've got privacy locked down. That's not entirely correct. Privacy is as much about ensuring that you're using or processing information in a way that protects people's privacy as it is about actually securing it. So that's a really important thing to note, particularly for people that work in the security domain. The sixth principle is visibility and transparency. That's largely about making sure that you always bring people along for the journey, that you're not trying to bury what you're doing in a 500 page privacy policy or something like that, that you're actually being really transparent. And you can take a risk based approach to that. There's a lot of processing that organizations do where they're collecting information about us that is fairly non-controversial. But there are some things that are innately controversial for most people, like collecting people's location, for example. We're now living in an age where that's actually very possible. Tracking people physically through the world is now very possible. So should you be very transparent about that activity? I think so. So giving people that choice and that transparency is really important. And the seventh principle is about respect for user privacy.
So, keeping things really user centric. It's difficult because there's a lot to manage here, but a really good example you'll see on, say, Google's privacy dashboard or Facebook's privacy dashboard. The really digitally native businesses have a really great dashboard for an individual to control their privacy settings. That's about, you know, managing your privacy in a user-centric way. A lot of businesses and organizations don't have the same level of capability, because they don't have the same control or visibility of their data to enable that type of technology. But those businesses do, and that's why they're the leading light when it comes to individuals being able to take control of their privacy. Irrespective of, I guess, their actual core business, which is collecting information about people, they've really enabled people to dictate how that information will be used. Whether or not they're actually applying privacy as the default setting is a really good question, but they're actually giving people greater control. So that's what the seventh principle is about, and that's all of those working together. If you apply them as intended, you'll actually end up having a really good privacy posture. [00:07:55] Speaker B: Yeah, thanks for the overview. I think that makes sense in terms of everything you've listed out. So I want to go back a step now. From my understanding, I think it was principle number two, you said, around the cookies, so the principle would be more like you're opting in. But again, going back to your website example, you're saying that websites are now asking you to opt out. So how does that sort of work? That seems almost counterintuitive to the second principle, no? [00:08:20] Speaker A: Well, the last principle was about giving people control, so asking people to opt in is also giving them control as well. So they actually work together in that regard. But asking people to opt in means that you have employed the second principle, which is privacy as the default setting. Asking people to opt out means that you're still giving them control, but you haven't had privacy as the default setting, if we're talking about marketing and tracking cookies, for example. So that's an example where either option is actually user centric. One is about having privacy as the default, and the other is privacy not as the default, if that makes sense.
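To make the second principle concrete, here is a minimal sketch in Python of what "privacy as the default setting" could look like for the cookie-banner example David gives. The `ConsentState` class and the category names are hypothetical, not any real consent-management API; the point is simply that every non-essential category starts off, and only an explicit, recorded opt-in turns it on.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical consent categories for a cookie banner.
ESSENTIAL = "essential"   # strictly necessary; no consent needed
ANALYTICS = "analytics"   # opt-in only
MARKETING = "marketing"   # opt-in only

@dataclass
class ConsentState:
    """Privacy as the default: everything non-essential starts OFF."""
    choices: dict[str, bool] = field(default_factory=lambda: {
        ESSENTIAL: True,    # required for the service to function
        ANALYTICS: False,   # default is privacy, not tracking
        MARKETING: False,
    })
    recorded_at: datetime | None = None

    def opt_in(self, category: str) -> None:
        """Only an explicit, timestamped user action enables a category."""
        if category == ESSENTIAL:
            return  # always on; nothing to opt in to
        self.choices[category] = True
        self.recorded_at = datetime.now(timezone.utc)

    def allowed(self, category: str) -> bool:
        return self.choices.get(category, False)

# A new visitor who ignores the banner is simply not tracked.
visitor = ConsentState()
assert not visitor.allowed(MARKETING)

# Tracking starts only after a deliberate opt-in.
visitor.opt_in(ANALYTICS)
assert visitor.allowed(ANALYTICS)
```

An opt-out banner would be the same structure with the analytics and marketing defaults flipped to `True`, which is exactly the inversion of the principle described above.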
[00:09:00] Speaker B: No, got it. 100%. So going back to the seventh principle, the respect. What are your thoughts on this, looking at what the large tech giants have been doing over the years? And then, going back to what you mentioned before around people being live tracked, I think Taylor Swift has spoken up in the media to say, stop tracking my private jet. So do you think people have just lost respect nowadays? [00:09:25] Speaker A: Yeah, look, I'd say the bar is lowering constantly, and that does concern me, because I don't think people realize that as the bar lowers on a basic human rights principle, which is the right to privacy, then other things start to go awry. I don't know if you've seen the documentaries made around the Cambridge Analytica scandal, but that, at its very core, was about basically an industrial scale privacy breach, industrial scale collection of information about people, where on an individual scale, it was just people answering an innocuous psychological survey. But obviously, they went beyond just the people that were answering that survey. They went out to all their friends, and they collected this data on such a scale that the data became very useful for potentially changing geopolitics. There is an argument out there, given the margins that both were won by, and I'm talking about Brexit and Trump's election, they were very, very narrow margins in both cases, and some do actually think that the campaign run by Cambridge Analytica, the disinformation and misinformation campaigns based on the data they had on these people from that source, was enough to actually swing the outcome. So that's what people aren't sort of thinking about, though. As we lower the bar around what's acceptable and what's collected and how it's used, we're actually potentially putting democracy at risk, if you want to draw that really long bow. I actually don't think it's that long a bow to draw, to be honest. But there is now empirical evidence to suggest that we are on a track to something that I don't think anyone really likes the sound of. But, yeah, I think respecting privacy as a fundamental human right is foundational to the success of our society. That's my personal feeling about the whole topic. [00:11:12] Speaker B: Yeah, that's fair. I understand. And I know we sort of worked in the same company at a point in time, and even then this was obviously a major priority for that organization. But then I have seen it. I don't like to use the word decline, but it kind of feels that way with all these things that are being said in the media. Like, imagine being Taylor Swift and someone's actually tracking your jet. I think Elon Musk had that as well. How violated people must feel. So going back to your point before around the bar lowering, do you ever think we can get the bar back up there, or do you think the horse has sort of bolted? [00:11:48] Speaker A: Well, that's a really good question. I am an optimist. I do believe that we can. I do think there'll have to be some catalysts, though, for that to happen, because I don't think it'll happen naturally. The reason I don't think it'll happen naturally is I think there's just far too much money to be made out of breaching people's privacy, effectively. That's the problem. That's the driver for the bar lowering. And we've already seen some kind of change in perception, I think, in the Australian market since a number of data breaches, well known ones, in the last couple of years. People are now really starting to question, why do you have that data about me? Or why do you still have that data about me? What were you doing with it? That type of thing. So people are actually, I think, becoming more cognizant of it. I don't think we've yet seen an incident which actually changes the game, though. There could be an incident in the future that actually exposes far more sensitive information, information that people hold really, really dear. That would just be a bridge too far if it were to happen on a sort of national scale. I think that would change the conversation, and people would actually be wanting to learn more about why organizations have their information, and withholding it more. There is this kind of, I guess, really frustrating situation. It's like this dilemma where people say they want privacy, but they're actually behaving slightly differently. Competition obviously changes things, and Facebook and Google dominate their sectors online. If they had more competition, would people go to the alternatives if they felt like their privacy was being better preserved?
So there's a few different things that I think need to happen. But not least of all, I think there needs to be a change in our regulatory framework, because for far too long, privacy hasn't been treated, I don't think, seriously by our governments. And I do think that there is a shift happening there, because the Cambridge Analytica scandal really shone a light on the power that corporations now have over government, just because, in some respects, I think it's fair to say they know more about the population than the government does. And I think governments are really starting to see that as a risk. So we're actually seeing some positive moves there. But regulation needs to catch up. I think privacy by design should be viewed as a carrot. Privacy by design is great because it increases trust in your brand. But there also needs to be a corresponding stick, because the problem is there are other carrots. There are other carrots out there to misuse, over-collect and monetize information. And for that, you need the stick. So I think that the regulatory framework really does need to improve, and not just stronger laws, but a stronger regulator, because the laws are only as good as the regulator that enforces them. No criticism of the current regulator. They do a fantastic job, but the government has not given them the resources or the funding to actually really make a difference. And that needs to change as well. I think if those things happen, and there's quite a few things in that wish list, but if those things do happen, I think that we can start to raise the bar again. [00:15:00] Speaker B: Okay, those are excellent points you raised. So I want to get into this a little bit more, going back to the consumer. We can use myself as an example, about what data you collect about me. So, in large organizations, if you call up, they have the whole, oh, if you want to understand our privacy policy, click number two, or whatever it is. So if I called up a company, whether it's a health insurance company or whoever, and said, what are you doing with my data? Do you think that the person on the other end of the phone can answer that? [00:15:25] Speaker A: No, they can't. They absolutely can't. Because there's almost no single person that could answer that on the spot in an organization, because the data gets used in so many different ways. This is a problem, but it's a problem that we've got a solution for. A really well designed privacy dashboard, one that allows you to access all the information that an organization holds about you and lets you control what they're doing with it, so you can see what you've allowed and what you haven't allowed, is actually where we need to get to. And the digital companies like Google and Facebook and quite a few of the other platforms do actually allow that. You can actually download all the information they have about you. But regular corporations aren't there yet, and there's a very simple reason why they aren't. It's because they're still struggling with understanding their data. Their systems aren't as modern; they haven't been designed in the digital age. They are retrofitting privacy into those businesses, and they still haven't caught up. So once they've got a really good understanding of their data, and really strong data governance and data management practices, that's the type of thing that they'll be able to enable.
So you can put it back in the consumer's hands to be able to, you know, determine what information you have about me, what do I want back, what do I want erased. We don't yet have a right to erasure in Australia, but that is one thing that is currently being discussed by the government as a proposed change to our law. So when that comes in, it's going to be really interesting, because I think there'll be a flood of inquiries from people to different organizations saying, I want you to erase my data. How a company is going to comply with that, that's a really good question. It's going to be either very labor intensive, or they're going to have to get better at managing their data in a scalable way. So, to answer your question, no, people can't answer that question easily. At the moment, it's all very manual, but we do need to get to a point where that is possible with any organization that's holding our information. [00:17:21] Speaker B: So I just want to press on this a little bit more, because this is interesting, because I've always been curious about this. So, hypothetically, I do call up, and someone's like, look, I'll put you onto the David Batch equivalent in the company, and whatever they tell me I probably won't be able to understand anyway, because I'm not a lawyer, I'm not a privacy expert. I'd probably have to call you up, David, to say, this is what the person said. But then how do I trust that they're managing it in the way that they say they're going to manage it? [00:17:45] Speaker A: Yeah, look, that's a really good question. I know enough to be a little bit cynical about that process. I don't know if other people do. I also know that, just because of the mess that most organizations' data is in, it's really, really hard for them to comply with those types of requests in a 100% sort of capacity. I think the answer to your question is you can't be sure that they're doing it properly. You can't be sure that they're doing it completely. [00:18:14] Speaker B: And would you also say that these companies are hoping that people like Karissa Breen aren't calling them up, asking that question? [00:18:21] Speaker A: Yeah, of course. And that's where I think when the law changes and we do have a right to erasure, for example. I mean, it's arguably there already, implied as part of what's called Australian Privacy Principle 11, where, if you no longer have a lawful basis to be holding information, your obligation is to destroy it. So if your basis to hold the information is consent from an individual, and an individual calls up and says, I want you to delete my information, they've effectively removed their consent. It's arguable that there's an implied right to erasure there already. Unless, of course, there is a competing regulatory requirement to hold the data, for example. But when it's made an express right, I know for a fact that we're going to see a lot of people calling up companies asking to erase their data. And the reason I know that is because I was working for an organization in the UK when the General Data Protection Regulation, or the GDPR, came into force. And whilst it wasn't a new right in the UK for consumers, just the fact that there was a very highly publicized new law really spurred consumers into trying to exercise their rights. So this is coming, this is absolutely coming in Australia, and organizations are going to have to comply with it.
So this is a really great time to start embedding privacy by design into everything you do, for the entire lifecycle of how you manage personal information, because if you don't, you're going to end up with a mess that just gets you bogged down in process. So it's a really great time to ensure that you're not collecting information you don't need, that you're obfuscating it in some way if it's not being used, that you're not retaining it for any longer than you need to, and that you're getting a really good view of what data you've got, why you've got it, what it's being used for, where it's being stored, and have that automated. So if you do get requests like this, they'll be very easy to comply with. But there's also another reason why organizations should do that. Privacy isn't about reducing an organization's ability to collect and use information. It's about putting the person that the information relates to in control of it. And more often than not, people do want you to do those things with their data. We also really love it when things are personalized to us by a brand that we trust. So having that really good view of data isn't just a regulatory and risk concern; there's actually a corresponding benefit. The things that you can do with really good quality data are untold, and people love that. So I think that there's a really virtuous cycle there: applying privacy by design, getting your data under control. And I think everyone wins. Businesses win, the community wins, individuals win. [00:21:11] Speaker B: So just to go back one more moment, just on the erasing of your data: what things can you erase now? I ask that question because, now, I'm not an expert, but I know with certain home loans or certain insurances, you've got to give your salary and your health and your kids' health. There's a lot of information there. So if I just said, well, actually, I don't want you guys to have that, I'm going to erase it, what would that impact look like? Because sometimes they need that information to make certain home loan calls or something like that. So again, I'm not an expert, but I'm just curious: what stuff can you erase? [00:21:46] Speaker A: So it's a general right, but it will be overridden by any regulatory or legal obligation an organization has to retain information. So when you're talking about home loans and other sorts of interactions, there are laws that require institutions to hold information for a certain period, and that would override a general right to erasure. But those periods aren't forever. They do expire. And the question is, if a company's got data, say it's a bank, and it's a home loan that was paid out ten years ago, and they've still got information from the application: why do they have it? That is the question. So in that instance, there may very well be a right to erasure, because there's no regulatory or reasonable reason that they should be holding onto that data, and holding onto it just because it might be useful in the future, or because they want to use it for marketing, is not considered reasonable. So they have to have a really good reason. That's when you can exercise your right: when it comes to marketing, or when you no longer use a service, for example, and there's no regulatory or legal requirement to retain the data. That's when you'll be able to exercise the right in full.
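As a rough illustration of how a general right to erasure interacts with competing retention obligations, here is a minimal sketch under stated assumptions: the record types, the seven-year period, and the `handle_erasure_request` helper are invented for illustration and are not a statement of what Australian law actually requires. The logic simply mirrors the answer above: erase on request unless a live legal obligation overrides it, and give the reason either way.

```python
from datetime import date, timedelta

# Hypothetical retention obligations, keyed by record type.
# Periods run from the end of the relationship (e.g. loan paid out),
# not forever, which is the point David makes about old loan data.
RETENTION_OBLIGATIONS = {
    "loan_application": timedelta(days=7 * 365),    # assumed 7 years
    "transaction_history": timedelta(days=7 * 365),
    "marketing_profile": timedelta(days=0),         # no obligation at all
}

def handle_erasure_request(record_type: str, relationship_ended: date,
                           today: date | None = None) -> tuple[bool, str]:
    """Return (erase_now, reason) for a consumer erasure request."""
    today = today or date.today()
    hold_for = RETENTION_OBLIGATIONS.get(record_type, timedelta(days=0))
    hold_until = relationship_ended + hold_for

    if today < hold_until:
        # A competing legal obligation overrides the general right.
        return False, f"must retain until {hold_until} under retention rules"
    # No lawful basis left to keep it: erase.
    return True, "no remaining obligation; data erased"

# A loan paid out ten years ago: nothing overrides the request anymore.
print(handle_erasure_request("loan_application", date(2014, 3, 1)))
# A marketing profile can be erased immediately on request.
print(handle_erasure_request("marketing_profile", date(2024, 1, 1)))
```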
[00:22:55] Speaker B: So do you think there's anyone out there that has had a home loan and then has called up ten years later to ask, hey, can you erase that? Anyone that you know? [00:23:02] Speaker A: No, that's exactly the point. It's beyond memory. But, I mean, I don't know about you, but I was one of the, I think, majority of Australians that got a letter from Latitude, and I had no idea that they actually had data of mine that was nearly 20 years old. I've never been a customer of Latitude, but I was a customer of GE Money, and I subsequently learned that they had acquired GE Money and all their data. So when I got the letter, I was questioning, why am I getting this letter? I've never been a customer of Latitude. Latitude, I think, have been around for about 13 or 14 years. So that was an example of, whilst I've never called up Latitude to have my data destroyed, in the event of a data breach it became apparent that they actually had my information, because they had to tell me under the mandatory data breach notification laws. So that's another example of how that creates risk. And with every single data breach, there are people saying to these organizations, well, they might be an existing customer, so they know full well why they're holding their information. But if they were a customer ten years ago, and we've heard those examples, they rightfully should be asking, why have you got my information still? [00:24:18] Speaker B: Such a good point. I was involved in the other two data breaches, so I can relate on that front. And in one of them, I was a previous customer, so I can understand and empathize on that front. So going into the regulation side of things: again, you said it's going to come in; we don't know, or you may know, for sure when that's going to happen. What are the specific industries that will perhaps be targeted first? One that I think is really up there in terms of privacy is retailers. [00:24:48] Speaker A: Yeah, retailers obviously have learnt that personal information is a really great way to target their marketing, and it is. It's actually a really great way to target their marketing. So they've been engaging in trying to learn as much about their customer base, or prospective customer base, as possible, which is perfectly legitimate. There's a way to do it that respects privacy and a way to do it that doesn't. And privacy by design is about positive sum, not zero sum. You don't have to have one without the other. You can have both. But in terms of the new laws that are coming in, they're actually going to apply to all organizations in Australia. So it's going to have universal application across government and industry, not just the private sector, everyone. So everyone's going to have to start to think about, how do we manage privacy more effectively? And that's where privacy by design in particular is a great place to start. If you comply with those principles, you're almost going to pass your compliance requirements with flying colors. In fact, if you're applying the principles properly, you tend to go beyond compliance. [00:25:59] Speaker B: With your experience, how many organizations, as a percentage, would you think are actually implementing privacy by design? [00:26:07] Speaker A: Well, that's really interesting. I think there's elements of organizations applying the privacy by design principles.
We know that because at CyberCX we've actually been doing research for the last couple of years in this space in particular. We're now in the third year of the Privacy by Design Awards, where we recognize companies that are actually demonstrating excellence in privacy by design. It's not universally applied; there's pockets of it. And I think it would be very hard to find an organization that's applying the principles universally across all their operations as well. But when you do see examples of it, they really stand out. [00:26:48] Speaker B: So how do companies demonstrate effective privacy? Is it internally? Is it externally? Is it both? [00:26:54] Speaker A: Yeah, that's an interesting question, because our research for the last couple of years has only really been limited to what can be observed externally, which is why we've actually moved to a new format for the Privacy by Design Awards, where companies can nominate what they're doing internally, because not every organization actually has the ability or the platform to express publicly what they're doing in this space. In previous years, we had to really look at consumer organizations, because they've got a really strong web presence and they can express certain things, particularly in the design of their web shop front, that we can observe. But yeah, there's a lot of companies out there, they might be B2B2C, that don't have a big web presence. Bricks and mortar businesses, for example, that are doing some amazing stuff that we didn't have visibility of. So this year we're asking for people to nominate, so we can learn about that and recognize it in what they do. But if we pivot back to the web shop fronts for organizations, there's a lot of things that we can actually observe that are really great examples of privacy by design. And you can tell, just by virtue of those things being there, that the organization obviously takes privacy seriously. [00:28:09] Speaker B: So what are some of the examples that come to mind? [00:28:12] Speaker A: Well, last year we gave an award to the ABC, and the ABC has probably suffered some criticism in recent years because they moved from being an open platform to being one where you had to authenticate and provide identity to be able to use their service, which was very unpopular. But one thing that they do do, and it was unique in the brands that we assessed, at least, and we looked at the top ten across eleven sectors, so looking at brands that effectively probably have 90% of market share across each of those sectors in the consumer sphere, they were actually doing something really excellent with their cookie banners. So, as I mentioned earlier in this discussion, a lot of cookie banners are designed so that you have to opt out of tracking. They've got dark patterns in them where they push you towards selecting all cookies, that type of thing. So the ABC were literally the only organization, when we did the assessment last year, that had a cookie banner where you had to opt in to being tracked for marketing and analytics purposes. And they didn't try and get you to do it by highlighting the accept-all button; they gave the options equal footing. And it wasn't trying to trick you, through what we call a dark pattern, into giving up your privacy. So that was one really, really good example. And we were thinking, why aren't more organizations doing this?
And that's because of marketing and comms teams; they're really obsessed with trying to understand who's visiting their websites. And I get that, because they invest a lot of time and effort into those shop fronts, but it is a little bit creepy that we are being tracked in everything that we do, because if you were in a physical bricks and mortar store and the shopkeeper was following you around, looking at everything you picked up, you'd get a little bit annoyed. But that's kind of what's happening online. So that was actually just a really good example of an organization taking privacy seriously with one very simple little thing. So that's a great example. We also recognized Commonwealth Bank, and I have to disclose they are a former employer of mine, but they actually went to great lengths to explain on their website how they were de-identifying data before they shared it with third parties. And we looked through the materials and thought, this is great. We actually want to reward organizations that are still participating in the digital economy, still sharing data, sometimes for good, sometimes purely for monetary gain, but doing it in a way that preserves privacy and is not revealing information about individuals. And that really stood out, because a lot of other organizations are actually just sharing information with third parties willy-nilly, sometimes for profit, sometimes for other benefits, and not really preserving privacy when they do it. I actually had a phone call just yesterday with a company that sent me some marketing in the snail mail, and I wanted to know, how did you get my information? They said, oh, we get it from partners. You agreed to it. And I said, well, okay, that's where we differ. Just because it's written in some privacy policy doesn't mean I agree to it. But that's unfortunately the state of the law at the moment. That's why we looked at Commonwealth Bank and thought, yep, you're sharing information with third parties, you're preserving privacy in the process; that should be recognized. So that's another example. And another good example was Google. Now, it was a little bit controversial for us to give Google a privacy award, for obvious reasons, because there's been a lot of negative sentiment towards Google on privacy, but they have on their website probably the gold standard for a privacy dashboard, which is really putting the user back in control of their privacy. Now, that's not to say that they aren't doing things behind the scenes that we wouldn't necessarily agree with, but if every organization gave that level of granular control to individuals over how their information is processed and used, then we would have a really great outcome from a privacy risk management point of view. So that's why we decided to recognize it, because if every organization did that, that would be absolutely fantastic. So there's some really good examples of who's doing well in this space, discrete examples. It doesn't necessarily mean that they're doing well right across all the different metrics, but we thought that they were worth recognizing, because if every business did what they're doing, it would actually raise the bar.
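The de-identification example can be sketched generically. The following is not how Commonwealth Bank actually does it (the transcript doesn't say); it is a minimal illustration of the general technique: drop direct identifiers, replace the join key with a keyed hash, and coarsen quasi-identifiers before data is shared. All field names and the key are assumptions.

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-and-keep-me-out-of-the-dataset"  # assumed, held separately

def pseudonymize(customer_id: str) -> str:
    """Keyed hash: records still join internally, but a third party
    without the key cannot reverse or re-create the identifiers."""
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()[:16]

def generalize(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers."""
    decade = (record["age"] // 10) * 10
    return {
        "id": pseudonymize(record["customer_id"]),
        "age_band": f"{decade}-{decade + 9}",            # exact age removed
        "postcode_area": record["postcode"][:2] + "xx",  # truncated, not full
        "segment": record["segment"],                    # non-identifying attribute
    }

raw = {"customer_id": "C-10293", "name": "Jane Citizen",
       "age": 34, "postcode": "2010", "segment": "home_loans"}
print(generalize(raw))  # no name, no exact age, no full postcode

# Note: hashing and truncation alone are not full de-identification;
# rare combinations of attributes can still re-identify people, which is
# why this step needs re-identification risk testing, not just code.
```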
[00:32:48] Speaker B: So just going back to the CBA example. Now, you said they've explained what that means on their website, but have they explained it in a way that someone like me is going to get? I'm not David Batch. I don't understand these things. So is it written in a way that the average person could read it and understand it, or is it still convoluted? [00:33:07] Speaker A: Yeah, no, look, I believe it is. It's hard for me to be objective there, I guess. But I actually think it's communicated really well, and it's communicated in terms of what they were trying to achieve and why they were trying to achieve it: because they take privacy seriously. And I just thought, this is a brilliant example. And if more organizations that are sharing data with third parties actually went to the trouble, not just to do this, but to explain what they've done and why they've done it, that actually means this is something they care about. [00:33:38] Speaker B: And then going to your other example, where you called up and you asked, and they said, oh, you've agreed to it. Do you think companies are doing this intentionally? Unintentionally? Do you think they're neutral about it, and your details unfortunately get sent off to a third party, and then they're calling you up to try to sell you something? Where does that sort of sit in terms of morals? [00:33:58] Speaker A: To say that companies are doing it intentionally? I just think we've got a whole new generation of people that work in the space of marketing for whom that's just the norm. That's what you do. You get data about people and you market to them. And I think that there's been a lowering of standards, which means I don't think it's intentional; I just don't think people are really thinking about it. It's interesting when I talk to people in marketing and analytics. When I say, well, how would you feel if this was your data, or your mother's data, or your father's data, or your children's data? They go, oh, we make sure that we don't get ourselves into that position. It's like, why? Because we know what's done with it. And I'm like, well, okay, so why aren't you personalizing this for other people? So it's almost like there's a disconnect. People don't view it as people's information; they view it as just information. So, yeah, it's one of those things where I think there needs to be a conversation. The bar needs to be raised. There should be data ethics guardrails about what you will and won't do with data, what good consent looks like. Do you think that you've really got someone's permission to do what you're doing with the data? Because on page five of the privacy policy, you said you'll share with your third parties. I don't think so. I don't think that's good consent. So I think there needs to be some guardrails around that, and hopefully our law reform will raise the bar a bit there. Consent, notice and consent, is broken anyway, and they are talking about putting a fair and reasonable test over information that's provided. But, yeah, I think that people are just automatically doing this and not really thinking about what they're doing. [00:35:41] Speaker B: So just going back to your awards now for a moment. If we zoom out, as we've alluded to already in this interview, there are companies who have had data breaches. How do you sort of build that trust back, to say, hey, we actually do take your privacy seriously? They sort of stuffed up originally, but now they're working towards that, because we can't keep holding these companies over a barrel for the next ten years. They made a mistake, but they've got to try to fix it. But it's a hard thing to engender, that we do good privacy and we're trying to build your trust back. Do you have any sort of advice on that front?
[00:36:09] Speaker A: Yeah, look, I do. I think there's a lot of motherhood statements made, like, we take your privacy seriously. [00:36:15] Speaker B: How? [00:36:16] Speaker A: What are you doing? What are you doing? What are you changing? Like, tell us what you're doing. That's always my advice. Just tell people exactly what you're doing to make a change. How are you changing your data handling practices? Are you investing in making a change here? Are you reviewing your privacy risk framework and operating model to understand where you can make improvements? Make those kinds of comments and commit to them. We're going to put all our staff through detailed training so that they better understand how to properly handle personal information, how to ethically handle personal information. Make some commitments, commitments that people can hold you to, but also deliver on them. And I actually think that's how you build trust: when you stop making motherhood statements and actually start saying, I'm going to do x, y, and z. [00:37:07] Speaker B: So just going back again. Yes, you're right, the whole virtue signaling of, we take your privacy seriously. And then if you're on a phone call to a client, or you're in a meeting with a client, and you're saying, well, what are you doing? How do people sort of answer that question? Or do they not answer it, because they probably don't really know? It's just sort of a sweeping statement that people lead with. [00:37:26] Speaker A: Yeah, look, to be honest, it's one of those things. Everyone's got so much to be concerned about, and privacy is just another risk domain in a very long list of risk domains that organizations have to manage. So I don't think it's been ignored intentionally, certainly not by the people that look after risk and compliance. But the conversation we typically have is, what do we need to do? What does good look like? So we will actually help clients with that. We'll go in, and we've got to understand their obligations and also their unique context, their industry, their data processing activities, to give them advice on what good looks like. But yeah, I think a lot of people are on that journey because they've just done a few little tactical things, because they know they've got to. They've written a privacy policy, and that's probably about it. Are they actually adhering to the things they're saying in their privacy policy? Well, nobody knows, because they don't really have any sort of governance control framework to measure what they're doing. So, yeah, there's a lot of uplift that needs to happen right across industry. And if you use the privacy by design principles as your guiding light, and overlay them with your specific obligations, then I think that's when things will actually start to move. That's where you'll move the dial on privacy risk management in your organization. You do have to take a principles based approach to this, and that's why our law is actually written in that way as well. Because if you get too prescriptive, a little bit like security, if you get too prescriptive, then it's not necessarily going to flex with context. [00:38:58] Speaker B: So you made a comment before, David, saying that consumers want more privacy. Do you think they even really know what that means in today's day and age? Because everyone, or most people, now operate on the Internet, they're banking online, everything's online shopping, et cetera.
So what does that statement mean for someone like you, who is a privacy expert? [00:39:18] Speaker A: I don't think people's desire for privacy has really changed between the pre-digital and the digital world. I think everyone still has the same basic desire for privacy. What constitutes privacy might have shifted slightly, but I think people have sleepwalked into this position where they didn't realize how much they were giving up. And as I sort of mentioned earlier, I think that people are starting to wake up to that fact. I think the sleepwalking is coming to an end. They're starting to wake up with the data breaches that are happening. I think people are starting to understand, there's a lot of data out there about me, and I'm not necessarily comfortable with that. So I think we are seeing a little bit of a shift. But yeah, there is that tension between people wanting what they get when they give up their information, but wanting privacy at the same time. There's a really interesting researcher, a behavioral psychologist in America called Alessandro Acquisti. He actually did some experiments on at what point people give up their privacy from a monetary point of view or a benefit point of view, and it's actually really fascinating research. But yeah, there is this dilemma: how do you have both? And the privacy by design principles do say it should be positive sum, it shouldn't be zero sum. You shouldn't have to give up your privacy for benefit. You should be able to have both. And properly applied, you can actually find a way where people have their privacy preserved, but they get what they came for. That's a really important principle, and I think it's one that has been lost a little bit. People say you can either have your privacy or not, and I don't think that's the way it needs to be. [00:40:59] Speaker B: So at what point did we give up privacy? Or is that the conundrum? Is that the question? [00:41:03] Speaker A: Well, it's interesting. So I started my privacy career working at Myspace, and I was the director of safety for Myspace in Australia and New Zealand, back in the late 2000s. Yeah, I know, it's a long time ago. It's interesting, because my role there was largely in response to some of the safety issues that the audience, which was largely children and teens, was having online, because it was a new sort of creation. People hadn't really, I guess, exposed themselves, and who they are and where they are and what they are, online before. So, yeah, it created a whole lot of safety issues. But as you'll be really aware, at that time social networks and search engines were all looking at ways to monetize the information they had, so they could actually be a business that survived. They didn't want to go down the subscription route, but they did see the opportunity: we've got this information, this really rich data set about people, that we can sell to advertisers, and we can create target audiences here. And I could see that starting to emerge when I was working there, and I'm going, wow, this is a bit dystopian. This is only going to grow bigger. And that's what got my interest in privacy, and I've stayed in the field ever since. Obviously, since the early days of Google and Myspace, which was the first large social network but has been completely eclipsed by Facebook, that's when the game started to change: Web 2.0, when we were basically putting tons of information online about ourselves and everyone started to understand the value of that.
That was when the game changed. So I think around that time, I think it's Web 2.0 that changed the game, the ability for user generated content, and ever since then, it's just been an exponential increase in data about us. [00:42:59] Speaker B: So, David, you touched on before the Australian Privacy by Design Awards. Now, is there anything you'd like to add on that? In the show notes we'll put links, et cetera, but is there anything that you'd like to just touch on again and summarize? [00:43:11] Speaker A: Yeah, sure. Look, I think privacy is one of those areas where, if you're working in a corporate role, or in data more broadly, and you're trying to do the right thing, you're applying ethics to data, some of those initiatives in organizations are actually quite hard won, because there is a tension. There are people that want to do the wrong thing with data, want to monetize information to within an inch of its existence, and it can be really, really hard to fight that force in an organization. So we've created these awards to recognize organizations, and the people in them, that are actually doing things that are exceptional, because we know how hard it is, we know how important it is, and we actually want to shine a light on the work they're doing, so other companies can go, oh, there is actually a benefit in doing this stuff. We will get recognized for it. It's popular with consumers. So that's why we've created the Privacy by Design Awards, as really the only mechanism in Australia that recognises organisations that are doing exceptional things in this space. And we really want organizations, large, small, government, private, public companies, anyone, to tell us those stories. Nominations are open right now, and we do have a panel of seven judges from across industry, government and academia who will be voting on the best examples. And it will culminate in an award ceremony, which will be held on 2 May, which we are yet to advertise. But at this point, the nominations process is open. So we want to hear from everyone. If you're doing something exceptional, or even just applying the principles generally, we want to hear about it, and we'd like to recognize you for it. [00:44:57] Speaker B: And lastly, David, are there any sort of closing comments or final thoughts you'd like to conclude our interview with today? [00:45:03] Speaker A: I just would like to say, I think privacy is underrated. I actually think it's a pillar of trust. I think the more you apply these principles, the closer you get to your customers and to your staff, and the more trust you have, the more benefits come from that. If you think about your personal relationships, you get really close to people. You learn a lot about them when they trust you, and you can't breach that trust by doing things with the information they give you that they wouldn't expect. Why would you treat your customers or employees any differently? [00:45:41] Speaker B: This is KBKast, the voice of cyber. Thanks for tuning in. For more industry leading news and thought provoking articles, visit KBI.Media to get access today. This episode is brought to you by MercSec, your smarter route to security talent. MercSec's executive search has helped enterprise organizations find the right people from around the world since 2012. Their on demand talent acquisition team helps startups and midsize businesses scale faster and more efficiently. Find out more at MercSec today.
