Hosted by Ben Thompson, the Oxford Business Podcast is a monthly podcast featuring conversations with experts in a range of fields including marketing, finance and sales.
[00:00:00] Ben Thompson: Welcome to the Oxford Business Podcast of the Oxford Business Community Network. We are recording from Story Ninety-Four at their wonderful podcast studio here in Oxford. If you are fascinated by a podcast studio, please, please get in touch with them and have a look. It is a really, really cool space, and I know I say that in every episode!
Today I'm so excited to be joined by Regina Johnys of Databasix, who's going to be sharing lots and lots of practical advice with you, the listener, all around data protection. Welcome, Regina.
[00:00:35] Regina Johnys: Hi, Ben. Thanks for having me!
[00:00:37] Ben Thompson: Thanks so much for coming. So let's jump into it and tell those listening all about Regina and indeed Databasix.
[00:00:44] Regina Johnys: Well, I founded Databasix with Kelly, and this November marks our 10th anniversary since we founded the organisation, so we're going to be having a little party to celebrate that. We founded Databasix with the idea of bringing people and data together, so we always take a really practical approach to all things data and all things data protection.
So most people probably know us for our data protection services. So we help people manage the data that they have, look at what personal data they're responsible for across an organisation, have it all nicely documented, have appropriate policies in place to protect it, and then make sure that that final piece of staff engagement is done really well and in an engaging way so that people can take it on board and know what they need to do on a day to day basis to keep data safe.
[00:01:30] Ben Thompson: Amazing. I always think it's nice to start with a little bit of the why behind a business. Why did you and Kelly set up Databasix? You are so passionate about what you do, but why did you do it?
[00:01:43] Regina Johnys: Wow. Yes, so we were both looking for a new challenge at that point in our careers. The idea came from seeing how people in the NHS and the wider public sector were using systems that were very reliant on spreadsheets, and realising that the same was true in the business sector as well, so we set up the business. Then data protection came along in the form of GDPR, or rather, it wasn't new, but it was enhanced and widened to a much bigger scope of organisations, with a big increase in fines, and we recognised that from our experience in the NHS we could take a very practical approach to implementing data protection and information governance. That felt like a really good way of shaping the business, which we'd actually set up a few years before GDPR was introduced, to engage with more organisations. We both wanted to bring that real practical element to something that can be really overwhelming and confusing for organisations.
[00:02:41] Ben Thompson: No, absolutely. And in terms of GDPR, it was certainly a buzzword, wasn't it? Why was GDPR introduced and what is GDPR?
[00:02:50] Regina Johnys: Good question. So data protection legislation has been around in the UK since the 80s, and the previous version was the Data Protection Act 1998, which was what we'd all been working to. And if you think about the state of technology in 1998, you know, I was at university then, and I wasn't sitting in Starbucks on my laptop writing my essays.
I was having to book a computer in the computer lab and hoping that I hadn't left it too late to book in and actually do the research and write the essays, and the internet was only just coming online. One of my projects was about creating a web page about, you know, reportage in France. So it was very, very different to what we have now and what developed over the subsequent years: internet banking, smartphones, being tracked all over the internet, what we do, what we look at, what we're interested in, what we're clicking on. When technology advances that quickly, it's really hard for legislation to keep up, and I think the idea of GDPR was very much about strengthening individual rights and making sure that data protection legislation was suitable for the world we now live in. And if you think about how far we've moved on since GDPR was introduced, particularly with artificial intelligence, we've already seen some of the discussions about how the legislation keeps up and makes sure we're all protected, given the way these huge algorithms and systems are using our personal data, and how we protect people's privacy through all of that.
[00:04:33] Ben Thompson: Absolutely. And, without meaning to scare people, and I know you and Kelly always put this so well, there were a lot of big figures around at the time, that you could be fined millions and millions. It shows the limits of my knowledge that I don't know the exact figure, and I think there was a lot out there about what could happen. Did that happen? And what are the risks to you as a business owner of not taking data protection seriously?
[00:04:59] Regina Johnys: Sure. So technically, under the legislation, you can be fined up to 20 million euros, which is about 17 million pounds, or 4 percent of your global annual turnover, whichever is bigger, for a data breach. For other breaches of the legislation, the technicalities, so if you didn't report a breach in time, or you didn't do a data protection impact assessment when you should have, it's half of that: 2 percent of global turnover or 10 million euros, about 8 million pounds. So those were the figures, and that's what got people's attention. Previously, under the 1998 Data Protection Act, the maximum was £500,000, and for a lot of businesses, you know, I think that was something they could factor into their figures, whereas the numbers that GDPR introduced meant it became much more of a focus point for many organisations.
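To make the "whichever is bigger" arithmetic concrete, here is a minimal sketch of the two headline tiers Regina quotes. It's illustrative only, not legal advice: actual fines are set case by case, and only the maximums are modelled.

```python
# Illustrative sketch of the GDPR maximum-fine tiers described above.
# Only the headline caps are modelled; real penalties are decided case by case.

def max_gdpr_fine(annual_global_turnover_eur: float, higher_tier: bool) -> float:
    """Return the headline maximum fine in euros.

    higher_tier=True: breaches of core principles and rights
    (up to EUR 20m or 4% of global turnover, whichever is greater).
    higher_tier=False: 'technical' breaches such as late breach reporting
    (up to EUR 10m or 2%, whichever is greater).
    """
    if higher_tier:
        return max(20_000_000, 0.04 * annual_global_turnover_eur)
    return max(10_000_000, 0.02 * annual_global_turnover_eur)

# A business turning over EUR 1bn: the percentage dominates.
print(max_gdpr_fine(1_000_000_000, higher_tier=True))  # 40000000.0
# A small business: the fixed cap dominates.
print(max_gdpr_fine(2_000_000, higher_tier=True))      # 20000000
```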
We haven't seen the ICO fine a lot of companies under GDPR. There have been some big headline figures, the likes of BA and Marriott, but companies can still challenge, so it takes time for those fines to work through the legal steps, and they'll often be reduced on challenge. For us, the bigger risk, and I think it's much more of a concern for businesses, particularly smaller businesses who maybe feel they're not going to be subject to those big fines, is the reputational risk. So many people now, because of GDPR and the training they've had within their own organisations, are aware of it. They're aware of the fact that they have rights, which maybe they were less so before, and they're more willing to exercise them. So we've certainly seen increases in data subject access requests, where somebody can go to an organisation and ask to see the data it holds about them, which is time consuming for an organisation. And breaches hit the news; cyber attacks and losses of data will hit the national news. You'll see stories routinely on the BBC and in various newspapers, and it gets people's attention, and then they get worried: well, what's happening to my data? What are you going to do with it? And then there's that question of trust. Do you trust that organisation to look after the very personal information you're giving them? Even if it's just your name and contact details, and certainly with financial data, there's a risk in that information being made available more widely. So I think organisations are more likely to lose customers from a data breach, from something adverse happening, from being cavalier with people's data, than they are to get a huge fine from the ICO, and it's that reputational damage that I think can often have a really significant impact on businesses of all sizes.
[00:07:48] Ben Thompson: Absolutely. Reputation and trust are so important in any relationship, including business. Something I'm keen to build on a little bit more is the process of GDPR and data protection. I appreciate there might not be a one-size-fits-all for every sector and every business, but what should businesses be doing under the umbrella of data protection, and how do they know if they are compliant?
[00:08:13] Regina Johnys: One thing I'd say to start off with is that if I see anyone say they're 100 percent compliant, then I question that, because I think there's always going to be something happening somewhere in your organisation that you'll need to be looking at, something drifting off course. But taking it in a phased approach, the first thing you really need to understand is: what is the personal data you have?
What are you looking to protect? So it's about being able to identify that across your organisation, and thinking about it from the different pots of people that you engage with. One of the questions that we often get is, does it apply to me? You know, I don't really have that much personal data. My challenge to that is always: do you have any staff? And as soon as they say yes, which normally they do, then you've got personal data, because you've got all that information about your employees. So map your personal data through employees, through your clients, through your suppliers, any other stakeholders that you have in your business, and then start thinking, for each of those pots of data: what do we hold?
So there's going to be names and contact details, and then it's about getting down into more of the detail and the insight that you're holding. For employees, you'll have professional history, qualifications, the appraisals you're doing with them over time, training they've been on, grievances, disciplinaries, holidays, sickness, and suddenly you start seeing the wealth of data that you're actually responsible for, and that's just one pot. It's the same with clients, and the same with any other stakeholders and data that you collect in your business.
I think once you've thought about what you've got, think about the systems it's stored in. How do you use it? What's the reason you're collecting that data? Look at where in the world it is. You know, people talk about the cloud, and generally they might not be sure where that cloud actually is, but essentially it's a server sitting somewhere in the world. So you need to know whether that somewhere protects the data, and people's rights, as well as the UK and the EU do. Once you have that really good picture and map, this is the data we've got, this is where it's stored, this is who in the business has access to it, this is who we share it with, you've got a really good basis for asking: well, where are the risks?
Where could it break down? Where could somebody get access to data they shouldn't see? Is it right that everyone in our business sees this, or should it be restricted? You know, you wouldn't want everybody in the business having access to HR data; it wouldn't be appropriate. So you start to get that picture of where your risk is, and then you can start creating the policies and processes to protect it. How do you want people to access it? How do you want people to use it? Where are you happy for them to store it? Can they store it locally on their laptops or not? And if they do, is that laptop encrypted? If it's not, what happens if you lose that laptop? You start being able to think through those risks, possibilities and potential for breaches, and then put measures in place, both from a technical perspective, so what can your IT support company do to help, what can you put in place technically to control and manage that data, and then what do you need to do to engage your people to understand their responsibilities? Because everyone within an organisation has a responsibility to protect that data in how they use it, how they store it, and to comply with those policies.
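As a rough illustration of the data-mapping exercise Regina walks through, one record per "pot" of data might capture what you hold, where it lives, who can see it, who you share it with, and why. This is a sketch only; the field names are my own, not a Databasix template.

```python
# Minimal sketch of one entry in a personal-data map. Field names and
# example values are illustrative, not a prescribed format.
from dataclasses import dataclass, field

@dataclass
class DataMapEntry:
    data_subjects: str           # whose data it is, e.g. "Employees"
    data_items: list[str]        # what we actually hold
    purpose: str                 # why we collect it
    systems: list[str]           # where it is stored
    storage_location: str        # where in the world it sits
    access: list[str]            # who in the business can see it
    shared_with: list[str] = field(default_factory=list)

employees = DataMapEntry(
    data_subjects="Employees",
    data_items=["name", "contact details", "qualifications",
                "appraisals", "grievances", "sickness records"],
    purpose="HR administration and payroll",
    systems=["HR system", "payroll provider"],
    storage_location="UK data centre",
    access=["HR team", "line managers (own reports only)"],
    shared_with=["payroll provider"],
)
```

One entry like this per group of data subjects (employees, clients, suppliers) gives you the map of risk points Regina describes: who has access, where the data sits, and where it flows.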
So as an organisation, if you can get that in place, and then make sure your staff are really well trained on their responsibilities, how they use that data, what they should and shouldn't be doing with it, you're going to be in a much better position to be compliant. And then the final piece is using all of that information to tell people what you're doing with their data, to be transparent, and that's a really important point. People have privacy policies and are often tempted to use templates, and then when you look at the template versus what they actually do, it doesn't match up, and it's very generic. So it's about being able to go through that piece of understanding, this is what we do, why we do it, how we use that data, and then translating it into something that means something for your customers and your employees. You can then be really clear and transparent: we want this data, this is why, and, in some cases, are you happy with that, if you're asking for consent. But consent isn't the only option; there might be other lawful reasons that you're collecting it, and then you can tell people: this is why, this is how we use it, this is how we protect it, this is how long we'll keep it for. So that transparency piece is really important, because more and more people are wanting to know. I will always look at a privacy policy. So for people like me out there, you want to be clear, so that I can really understand what you're going to do with my data, and I get it, and I'm confident and happy to share my data with you on that basis.
[00:12:59] Ben Thompson: Absolutely. Really interesting, actually, the point you made about a privacy policy, which really resonates with a personal situation out of sector. I'm probably going off on a tangent, sorry to those listening, but it's around insurance. My house is 10.5 metres from a river, and we pay a lot for house insurance for that reason. But with that in mind, if you go on a comparison site, and our next-door neighbour had done exactly that, you can buy insurance really, really cheap that shows a big tick next to flood cover. But actually, if you look in the small print, if you're within 50 metres of a river, the flood risk isn't covered, and there's no value in having an insurance policy if it's not going to cover you when there's a flood. Equally, that extends to a privacy policy, doesn't it? There's no value in having a privacy policy if it's not fit for purpose. Is that example I've used realistic, and does it hold true for something like a privacy policy, or the different policies you could have under data protection? Just by having one, it doesn't necessarily mean that you are protected and it's fit for purpose.
[00:14:01] Regina Johnys: Exactly that. I think you can feel like you've ticked a box, but when it comes to it, it doesn't actually help. If you've got a data protection policy, or sometimes you see it called a GDPR policy, and it doesn't tell your staff how you expect them to behave, if it doesn't tell your staff, this is how we as an organisation protect and manage our data, how do they know what you expect them to do? How do they follow that policy? Because they'll read it and think, it doesn't really make any sense; it doesn't relate to what we do. Same with a privacy policy. I've seen places where people have maybe tried to pull together various different templates and end up talking about processing data on the basis of vital interests. Now, vital interests means you can process data to save somebody's life; it's a life-and-death situation, and they weren't in a life-and-death situation, so it wasn't a relevant basis for them. And if you're telling people you're doing something with their data, or this is how you're going to use their data, and it doesn't reflect what you're actually doing, then you're not meeting that transparency element. So just by having a privacy policy or privacy notice, unless it actually relates to what you do, why you're doing it, the legal basis you're relying on, all those things you have to put in there, it may as well not be there, because you're just ticking a box and not actually fulfilling the transparency requirement.
[00:15:25] Ben Thompson: No, really good point. One of the other services you offer at Databasix is training, and I want to talk a little bit about training for your staff. So we've spoken about having the right policies in place. We've spoken about making sure that your staff know what the policies are. What can you do as a business to ensure that your staff are really well trained? And what sort of training should you be looking at?
[00:15:47] Regina Johnys: You need to be assessing where your teams are, so what level of training do they need? At the very least, you'd want to make sure that everybody is trained as they join your organisation, on induction, that GDPR and data protection is part of that, and that it's at a level that gives them that foundation.
So you want to be covering the core principles, making sure that they understand their responsibilities and can recognise what a data breach looks like and what a subject access request looks like, so that they can respond to those things if they happen. It should be relevant; it should be something they can link to and recognise. Having training that talks through the detail of the legislation, as interesting as that might be for some people like me, is not necessarily going to engage your teams. So trying to bring a practical approach that makes it relevant to your team is really important, I think, for engaging them and making them think about how it applies to them on a day-to-day basis as they do their role.
You need to make sure it's not just a one-off, tick-box exercise as well, and I think, for me, that's the thread that runs through all of this: if you approach data protection as a tick-box, we've-done-it exercise, then it's going to catch up with you at some point down the line and fall away. You want to try and embed good data protection practices into your culture. So yes, have the training, have it on induction, have an annual refresher, but then look at how you can bring it through into team meetings. How do you have a relevant discussion about data protection risks and issues?
On an ongoing basis throughout the year, one of the things we have in the team, we use Slack, is a little channel called data news, and we'll share things there: look at this breach, look at this happening, this is what's come out from the ICO. So you have a channel where you can pull stories together and see what you can learn from them. I'm always a big advocate of the page the ICO have that says "action we've taken", where you can see what they've looked at in terms of breaches, the investigations they've done, the types of companies they've looked at. Have a look at that, because if you find a company similar to your own on there, you can see the mistake they made. Have a discussion amongst your team: are any of these risks applicable to us? What do we need to do to learn from that organisation's mistake? Because learning from someone else's mistake is much better than making the same one yourself. And the thing to recognise is that your team are working with that data on a day-to-day basis, so encourage them to flag risk. If they see something and think, this puts personal data at risk, this puts our company at risk of a cyber attack or losing data, get them to raise it: I've seen this, how can we make it better? Get them to come with solutions, because they're on the front line using it. And if you're the management or data protection lead who's put policies in place, it's about looking at how they can be improved on a month-by-month basis and making sure those risks are being managed. Then the other element of training, once you've got that foundation in place, is to think about your specialist staff, so people in HR, people in IT, people in finance. They will potentially have access to more sensitive information, so you need to think about what training you can offer to support them in understanding the higher risk, what they need to do, and how they can manage and work with that more sensitive information and respond to the responsibilities they have in those roles on a day-to-day basis.
[00:19:16] Ben Thompson: No, absolutely. And what happens with human error? So say you're working with Databasix, you've really invested in policies, you've really invested in your staff training, and then somebody within your team does something wrong, whether that be human error or, and it does happen, deliberately doing the wrong thing, e.g. leaving your business and taking a list of clients. What do you do in that scenario? And I guess for the risk-averse business leader listening, where does the blame sit? Are you safe?
[00:19:52] Regina Johnys: So this will largely depend on what you've done to prepare yourself. Everything we've described that organisations should do, the one thing it doesn't prevent is mistakes happening, and we always say you need to be prepared for when a breach happens, not if. An email going to the wrong person with attachments of personal data, that's an error, and it might be a simple one that doesn't necessarily have a huge impact, but it could, depending on the information that's gone, and it can happen quite easily because of autofill. You know, it doesn't take much to send an email to the wrong person.
Every individual has a responsibility to comply with the policies and processes that the organisation has. So if those are in place, and you have an individual who, as you said, takes a list of clients with them, goes to the next place or sets up on their own in competition and contacts them all, that individual is very clearly in breach of the company's policies as well as data protection legislation. Because the company has done what it can to prevent that from happening, and that individual has made an active decision knowing it's something they shouldn't do, they can be prosecuted under the legislation for that theft of data, which is essentially what it is. There was an example, I think it was last year, of a lady who was fined for that very thing, because the company had set everything up appropriately and she had effectively stolen data. The thing you need to think about as a data protection lead is having processes in place so that people can, first, recognise when a breach has happened, what constitutes a data breach or a potential breach, and, second, will raise it with you, the whole team will raise it with you, so that you can look at it quickly and assess it.
So if something's happened, can you contain that breach? Can you stop it from being ongoing? In the case of an email, that might be trying to get it back, or getting confirmation that the data has been returned or deleted. If it's a cyber attack, can you, you know, pull the cables out of the wall so it can't spread further? Can you block access to your system? How do you stop that breach from being ongoing? Then you need to look at what's happened. Once you've contained it, you need to understand: what's been affected? Who's been affected? What's the nature of that data? And make that assessment of severity, because if people are at risk of physical or financial harm, identity theft, anything like that, even if it's only one or two people, you're more likely to have to report it to the ICO, in which case you've got 72 hours. The ICO is the Information Commissioner's Office, and they're the ones responsible for enforcing data protection legislation in the UK.
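As a tiny illustration of that 72-hour window: under GDPR the clock runs from when you become aware of the breach, and a late report must explain the reasons for the delay. This is a sketch only, with a hypothetical timestamp.

```python
# Minimal sketch of the 72-hour ICO reporting window described above.
# The clock runs from awareness of the breach; the timestamp is hypothetical.
from datetime import datetime, timedelta, timezone

became_aware = datetime(2024, 3, 4, 9, 30, tzinfo=timezone.utc)  # hypothetical
report_deadline = became_aware + timedelta(hours=72)

now = datetime.now(timezone.utc)
remaining = report_deadline - now
if remaining.total_seconds() > 0:
    print(f"Report to the ICO by {report_deadline:%Y-%m-%d %H:%M %Z} "
          f"({remaining} remaining)")
else:
    print("72-hour window has passed; report without further delay "
          "and explain the reasons for the delay.")
```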
[00:22:33] Ben Thompson: Amazing. No, really practical advice there. Let's talk about something you rightly touched on at the start of this podcast, which is protecting your brand and your reputation, and actually, the point you made that in some ways that's the biggest risk. If you do do something wrong, and there is human error, and I think we've all had the emails that have been sent to a group of people, and it's clearly something around cyber, because an email quickly follows: I'm so sorry, I've been hacked, et cetera, et cetera. How should you deal with that? So if you have done something wrong that is public, how should you deal with it? What would be your advice?
[00:23:11] Regina Johnys: So I think it's accepting the fact it's happened. Yes, you've done that email, CC instead of BCC. It's about trying to claw it back and not hiding the fact that it's happened. You need to think about how you communicate, so depending on what the breach is, be mindful that if people's information is at risk, you have to think about how you can help them protect themselves.
So some of the bigger companies may well look at employing PR or comms support, because you need to make sure the messaging going out accepts and apologises for what's happened, and also that you're managing those communications, because you can make it worse. I think there was the TalkTalk breach, way back, pre-GDPR, where the CEO didn't really manage the communications very well initially, and it didn't go down very well; you can actually make things look worse. Try to apologise, pull the information back, recognise that you've made a mistake, acknowledge that you'll be logging it in your data breach log, that you'll be investigating what happened and that you'll be learning from it. And I think that's the key: people make mistakes, but you need to learn from them and make sure they don't happen again. Customers may well, depending on the impact, accept the fact that a breach of emails has happened once. But if it continues to happen, if you continue to make the same mistake, and they see that you're maybe less focused on compliance and doing things properly, they're more likely to start thinking, this isn't right, and then they make a complaint to the Information Commissioner's Office as well. So it's about being open and honest about what's happened and the steps you're putting in place to rectify it for the longer term, so that your teams are more aware of what they need to do, more vigilant and less likely to make those errors, but also looking at the technical controls you can put in place.
So, can you put a delay on sending? In Microsoft Outlook, for example, you can delay the actual send by 30 seconds or a minute, because it's always at the point you hit send that you go, oh no, I shouldn't have done that, and that way you've got a minute to get in and actually stop it from leaving your outbox.
So there are some practical things you can do to help like that, but also, have a checklist. If you're sending out personal data via email, and again, think about other ways you might be able to share that data, think about what the checklist is beforehand. Is it in BCC? Is it in CC? Is that appropriate? Because sometimes it is appropriate to use the CC field, if you need people to talk to each other, but if it should have been BCC, that's so the email addresses don't get revealed. Try and have that double check, triple check, before you actually hit send on an email, and try not to do things in a rush. It's always when we're rushing that mistakes happen.
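As a rough sketch of the kind of pre-send check Regina describes: in practice a control like this would live in your mail client or gateway, and the domain name and threshold below are arbitrary example values.

```python
# Illustrative pre-send check: warn when external addresses sit in To/CC,
# where they would be revealed to every recipient, rather than in BCC.
# The internal domain and threshold are hypothetical example values.

INTERNAL_DOMAIN = "example.co.uk"   # hypothetical company domain
MAX_VISIBLE_EXTERNAL = 1            # arbitrary example threshold

def check_recipients(to: list[str], cc: list[str], bcc: list[str]) -> list[str]:
    warnings = []
    visible_external = [a for a in to + cc
                        if not a.endswith("@" + INTERNAL_DOMAIN)]
    if len(visible_external) > MAX_VISIBLE_EXTERNAL:
        warnings.append(
            f"{len(visible_external)} external addresses are visible in "
            "To/CC; should they be in BCC?")
    return warnings

print(check_recipients(
    to=["newsletter@example.co.uk"],
    cc=["alice@gmail.com", "bob@yahoo.com"],  # would be revealed to all
    bcc=[],
))
```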
[00:26:12] Ben Thompson: No, definitely, definitely. I'm now going to ask you a bit of a crystal ball question. So apologies in advance. What's going to be the future of data protection? And what should businesses consider now thinking about that future?
[00:26:26] Regina Johnys: Really interesting to think about. I certainly think that AI is going to have an impact on technology, and therefore on data protection. You have to think about, and I see this a lot, people getting excited by the new shiny tech without necessarily thinking about what they're putting into things like ChatGPT and its equivalents. You know, if you're putting data in there, have you read the privacy policy? Do you know what happens to it? Can you see that it's going to use your data to train its model? And then what's the risk of that popping out in somebody else's answer at some point along the way? So being mindful of what personal data is, whenever you're engaging with any kind of technology, is really important to me as these developments happen, and not being swept along with it.
It's really important that you can be innovative, that you can think about new ways of improving your business, but data protection has this concept of privacy by design and by default. What that means is that the way systems are designed and managed puts people's privacy at the heart of it, so that they don't have to make choices to turn the privacy up: the privacy starts high, and they can choose to lower it. And I think it's going to be really interesting to see how this shifts over the generations, because the generations behind ours have grown up with everything being available on social media, being public, and what's going to be the impact of that on data protection and privacy? Will that expectation of privacy still be there for generations who've grown up with a much more public persona and presence, and will they still need it?
But I saw quite an interesting story about a new digital verification scheme, trying to verify somebody's ID online so that it's linked to an actual person. It's called Worldcoin, from the same founder as the company behind ChatGPT, and it's going to be interesting to see the implications of that for privacy, because they're trying to verify people using an eye scan, an iris scan. They're giving a small amount of this cryptocurrency to individuals who go and verify their identity for a digital world ID, and they started off in developing countries. So there's a question there about whether that's actually fair: do people have a genuine choice? You're encouraging people who maybe don't have very much, with a small amount of money, to hand over personal data, and biometric data at that, which is special category data because it's your iris scan, plus facial recognition to check whether or not you're a live human and not just a fake or a scam. It's very early stages; I think it was a news story a couple of weeks ago that it was being rolled out across European countries and more countries around the world, and about 2 million people have already signed up. But it's that question again of how much you've thought about what's being captured and why, and how would it change the internet if you need a digital ID verification and it's in the hands of private companies? I always think, you've got this huge database of people; if that gets into the wrong hands, what else could that data be used for? Purposes it wasn't collected for? If somebody gets access to this huge database of individuals with verified IDs, how does that work? What are the risks? I think there's going to be more insight coming on that in the future, but yes, I think privacy is important, and maybe as you get a bit older you get a little bit more attached to keeping some things private.
[00:30:17] Ben Thompson: Absolutely. Absolutely. No, thank you so much. We're coming right to the end of the podcast, so thank you so much for sharing such insight. Just before we finish, one of the things I'd love you to do, and this is data, so only if you're willing and we've got your consent, is tell people how to reach you. If somebody is listening to this and thinks, do you know what, I need some help with data protection, I run a business and I would love to talk to Databasix, how can they get in touch with you?
[00:30:41] Regina Johnys: Well, obviously we'd love to talk to them too. We're always happy to help people, and we get excited about data, so it will be an exciting, engaging conversation, we promise. You can get in touch with us via our website, which is dbxuk.com; on there we've got a little chat function where you can reach our web team. Or you can give us a call on 01235 838507, or drop me an email at regina.johnys@dbxuk.com.
[00:31:11] Ben Thompson: Brilliant. Thank you so much. Thanks so much, Regina and thank you to you, the listener for listening to another episode of our podcast. I hope you found it really, really interesting. You've been listening to the Oxford Business Podcast of the Oxford Business Community Network recorded at the wonderful podcast studio of Story Ninety-Four.
Thank you so much to the team there for creating this great episode. So thanks again.