Technology Now

In this episode we are looking at a topic which we could spend a whole lot longer than one episode on, so we're going to make it a two-parter: cybersecurity and mental health.

We'll be looking at two aspects in particular: why our health data is particularly vulnerable and of interest to criminals, and how our mental health state affects our ability to make good cybersecurity decisions.

In this episode, we'll be discussing the first part of the equation with Catherine Knibbs, a psychotherapist and specialist in online harms.

This is Technology Now, a weekly show from Hewlett Packard Enterprise. Every week we look at a story that's been making headlines, take a look at the technology behind it, and explain why it matters to organizations and what we can learn from it. 

Do you have a question for the expert? Ask it here using this Google form: https://forms.gle/8vzFNnPa94awARHMA 

About this week's guest, Catherine Knibbs: https://www.childrenandtech.co.uk/ 

Sources and statistics cited in this episode: 

Mental health data more valuable than credit card data on the dark web: https://kevincurran.org/security/patient-data-10-15-times-more-valuable-than-credit-card-data/

Statistics on compromised health records: https://pubmed.ncbi.nlm.nih.gov/36580326/

Cybersecurity: a critical priority for digital mental health, published in the journal Frontiers in Digital Health: https://www.frontiersin.org/journals/digital-health/articles/10.3389/fdgth.2023.1242264/full#B3 

Neanderthal and human social mixing: https://www.nature.com/articles/s41598-024-70206-y 

Creators & Guests

Aubrey Lovell, Host
Michael Bird, Host

What is Technology Now?

HPE news. Tech insights. World-class innovations. We take you straight to the source — interviewing tech's foremost thought leaders and change-makers who are propelling businesses and industries forward.

Alysha Kempson-Taylor (00:10):
Hello and welcome back to Technology Now, a weekly show from Hewlett Packard Enterprise, where we take what's happening in the world and explore how it's changing the way organizations are using technology. We are your hosts, Alysha Kempson-Taylor, in again for Aubrey Lovell...

Michael Bird (00:10):
And Michael Bird.

Alysha Kempson-Taylor (00:26):
And just like last week, we are recording this in person from the Customer Innovation Center in London.

Michael Bird (00:32):
Yes, we are. And it's so lovely to be sat physically in the same room, opposite each other, so maybe we'll do more of that. Anyway, in this episode, we are looking at a topic which we could spend a whole lot longer than one episode on. So we're going to make it a two-parter: cybersecurity and mental health. Now, we're going to be looking at two aspects in particular: why our health data is particularly vulnerable and of interest to criminals, and how our mental health state affects our ability to make good cybersecurity decisions. Now, in this episode, we'll be examining the first part of the equation: why our mental health interactions are vulnerable, why they are of interest, and why all of our organizations should be paying attention to the issue. Next week, we'll be looking at how to put people in the right mental head space to make good decisions around their security, or to put safeguards in place.

Alysha Kempson-Taylor (01:31):
Yes, super important topics. So if you are the kind of person who needs to know why what's going on in the world matters to your organization, this podcast is for you. Oh, and if you haven't yet, subscribe on your podcast app of choice so you don't miss out. Right, let's get into it.

Michael Bird (01:50):
Our health data is a huge industry in and of itself. According to Statista market research, global revenue in the digital health market is projected to reach $171.90 billion in 2024, with average revenue per user of around $85. That is a huge amount, but it's not without its challenges. Personal health data is now the most valuable form of data on the dark web, even more so than credit card data.

(02:20):
That's according to research from Professor Kevin Curran at the University of Ulster, which we've linked to in the show notes. Meanwhile, in January 2021 alone, a total of 878.17 million data records were compromised worldwide. My goodness. And that number is likely to have grown significantly since. That's according to research published in JAMA Health Forum, which we've also linked to. We've got those statistics from a paper called Cybersecurity: a critical priority for digital mental health, published in the journal Frontiers in Digital Health. It's obviously a massive issue. So today, we are joined by one of the authors of that paper, Catherine Knibbs, a psychotherapist and specialist in online harms. Catherine, welcome to the show. So why is mental health data particularly valuable to cyber criminals?

Catherine Knibbs (03:15):
Certainly, the way I approach this is from an ethical and cyber-ethical perspective: that actually, the mental health services hold what I call sacrosanct data. And the reason I say that is that the level of cybersecurity required for banking protects your financial data, and that might include your address and transactions that are coming in and out. However, when you talk to a mental health practitioner, the perspective that you take is you share your secrets, your wishes, your desires, third-party information. You share your traumas, you might share your political views. And we move into a space where, I think, under GDPR that gets called sensitive data, but I'm saying it's more than sensitive. It's sacrosanct.

(04:06):
If that data is specifically targeted, it can be sold to cyber criminals who might then sell it on for ad revenue. It can be used to threaten somebody with ... If this information is not known outside of your relationship with your health practitioner, then that puts a whole heap of pressure on people. And one of the particular approaches that I often say in this space is that data that you share with a healthcare professional, whilst it's sacrosanct, it also has the most leverage for blackmail, for coercion. But also, it has the highest threat in terms of what a person might do, should that data be revealed, if that data is misused for nefarious purposes in any way.

Alysha Kempson-Taylor (04:55):
So how is it that our mental health data becomes so easily mined and so easily made available? Where do those weak points begin?

Catherine Knibbs (05:03):
So in my experience, and I've been doing this for a long time, I would say mental health professionals or healthcare professionals usually move into this role because they want to help people. That means that the kinds of personalities that move into this space are not necessarily, like myself, systems thinkers. So being a systems thinker myself, I tend to look at high-end integrity, ethical values, and what it is that I'm actually doing when it comes to the use of technology. So I have a particular approach in terms of protecting data, knowing how to protect that data. That comes from a background of being around engineering, being in the military, and certainly knowing about things like personal security, information security, and cybersecurity. Whereas many of the professionals who move into a space of wanting to help are not necessarily versed in the intricacies of technology.

(06:08):
They're not necessarily versed in data protection, the legislation, because that's almost an outside part of the training. What we're actually trained to do as healthcare professionals is how to talk to our clients and patients. And the use of technology in practice is not necessarily a core part of the training of many disciplines. Hence why myself and some of the other contributors to the article took the approach we did, because we see that there is a huge crisis that hasn't really been addressed in relation to data breaches, or data being used for ad revenue or nefarious purposes, just yet.

Alysha Kempson-Taylor (06:52):
Could you talk us through how someone might inadvertently put themselves at risk?

Catherine Knibbs (06:57):
So I will receive contact via a number of locations online. And usually, what happens is people in distress are not in a position to think about, if I email Cath and I give her all of this personal information, what's referred to as sensitive information under data protection, what's going to happen to that data? How is she going to look after it? Where will she store it? How long will she store it for? None of those questions are really in a person's mind when they're in a state of distress. And what's currently happening is ... I often use the term to vomit in terms of this is what people in distress do. They will say to you something along the lines of, "Hi, Cath, I've split up with my partner, my nephew, my niece, my friend," whoever it might be. "They're suffering with depression. They're doing this, they're doing that, they're doing the other. He said this, she said that."

(07:50):
And they put it all into one contact with you because they want you to understand where they're coming from. They want you to see their distress and they want you to help them. So certainly, on a lot of the social media platforms at the moment, this is happening in DMs. So I've had to put a caveat onto my social media platforms saying, "I cannot help you in the direct messages, and please don't send personal information through." And unfortunately, people in distress don't read that. They are not in a place to consider it. And so they share, they overshare. And unfortunately, most email service providers that practitioners use are not encrypted, not protected. For example, contact forms on websites are not necessarily secure. They're going through a third party.

(08:40):
And even that level of consideration seems outside of the practice of many practitioners, because what they do tell me is, "Well, I'm using X service, or I'm using X platform," or, "I password protect my computer." I say, "Okay, but have you considered that even if you are contacting your potential clients, customers, patients, and you're using your mobile phone and you're connected to a wi-fi network in a cafe, you are really not protecting that person's data?" It really does require that we, as professionals using technology in this space, actually have a basic understanding of cybersecurity.

Alysha Kempson-Taylor (09:21):
So what can people listening, who are making an outreach, do to protect themselves?

Catherine Knibbs (09:26):
I do say to people, please don't reach out with personal information. Please reach out and say ... For example, one of the platforms I used to use would say, "Please email me and say 'confidential,'" or, "Please email me and say, 'Email me, Cath.'" And then what I would do is provide a secure link for that person to reply to. Sadly, the service I was using at the time is no longer running.

(09:49):
So at the moment, what I suggest to people is that they reach out and they say, "Hi, I'm looking to book an appointment," or "Hi." And even in the space of where we are, I think sometimes, the website URL or somebody's email gives away what service you are delivering. But by the very fact that people visit our websites, by the very fact that they email us, it might not protect every part of their data. Generally, I would say try and find the most secure method of contacting somebody, which in today's age might just be text messaging. It might be to call the practitioner. And if you're going to send an email, keep the information that you share to a minimum.

Alysha Kempson-Taylor (10:31):
And in your opinion, do you think that there is a need for regulation in this space when it comes to health communication?

Catherine Knibbs (10:37):
In one word, yes. The reason is because ... And I'm just thinking about how the internet gets talked about. So it's the wild, wild west. Well, in that case, we need some form of legislation, and the rules and regulations are what help us form social norms. Those social norms are there to protect the people who are part of that tribe, that society, that village, so to speak, that I talked of earlier.

Alysha Kempson-Taylor (11:03):
Thanks, Catherine. There are some really amazing insights in there.

(11:09):
Now, it is time for Today I Learned, the part of the show where we take a look at something that's happening in the world that we think you should know about. Michael, what have you got for us this week?

Michael Bird (11:18):
Well, this week, I have news that scientists believe they may have discovered the exact spot where humans became the creatures we are today: mostly Homo sapiens, but with a touch of Neanderthal genetics. Somewhere between 1% and 4% for most of us.

Alysha Kempson-Taylor (11:19):
Wow.

Michael Bird (11:32):
Now, researchers from Cologne in Germany believe they may have pinpointed the spot where humans and Neanderthals first met and learnt to co-exist, at least for a while. Using a novel combination of genetic, archeological, topographical, and ecological data, the team managed to narrow down the location to the northern parts of the Zagros Mountains in northern Iran. The region has plenty of natural shelter, good resources, and the perfect climate combination. It's warm and dry in parts, which is what modern humans like. And on the other hand, it's colder, even snowy, in other pockets. Perfect for Neanderthals. The research suggests that the mountains acted as a funnel for the two groups as they migrated north and south, meaning that they were forced to interact in the mountain passes, something that's supported by archeological and DNA evidence. Evidence of both groups has previously been spotted in the area. But until now, the point where our genetics became what they are today has been shrouded in mystery. What's not yet known is exactly when it happened. The window is quite large, because it sits somewhere between 80,000 and 120,000 years ago.

Alysha Kempson-Taylor (12:48):
Wow. Awesome. Thank you for that, Michael.

(12:54):
Right. Now it's time to return to our guest, Catherine Knibbs, to talk about our digital mental health data. So Catherine, are there major differences between the ways that individual practitioners are approaching this, versus large healthcare organizations?

Catherine Knibbs (13:09):
So if we look at individual practitioners, small and medium enterprises, all the way through to large organizations, corporations, NGOs, there is a stark difference. And generally, that's because, if we think about large organizations, they can afford to employ staff who are trained in and around the IT parts of that business, for example. So often, when I'm working in this country and I'm talking to singular practitioners, they're using consumer-based video platforms to provide confidential services. And when I have a conversation with them, what often happens is they will say, "Well, this is what ..." So if I just mention the NHS, which is the largest healthcare service that we've got in the United Kingdom, they will say, "But that's what they use." Then I have to have a conversation with the practitioners and with the services saying, "Okay, but in this particular organization, they have IT staff, they have systems analysts. They provide weekly backups." If the organization has that infrastructure in place, then the employees and people who work there are held and contained within that infrastructure. Singular practitioners, on the other hand, have to be all aspects of a business whilst operating and using technology.

Alysha Kempson-Taylor (14:31):
Okay. But I suppose it's not foolproof. There have been huge leaks of healthcare data from large organizations.

Catherine Knibbs (14:37):
Yeah, we are all fallible because we're humans. I think when we talk about data and working in this environment, we often forget that data takes many different forms. And I think that's also confusing to some of the practitioners, when I explain to them about the use of mobile devices, the use of paperwork, and how we transition between those different mediums and spaces. Also, what we're actually doing when we think about, maybe, the device we're using and how we back that up ourselves, the fact that we're then using third-party services to do so, and what that actually means. So it's a really complicated landscape at the moment.

Alysha Kempson-Taylor (15:20):
And do you think among SMEs, the understanding of this issue is improving?

Catherine Knibbs (15:25):
I would say in the last five years, I am finding the same issues, the same levels of, "I didn't know that. I didn't know that people could do that. I didn't know that that would happen. Nobody told me." There's a large level of naivety, and a large amount of, "It's too complicated. I don't want to know." And if there was recommended software ... Often I get asked this question: "So what do you use, Cath? Why are you recommending that platform?" And I say, "Because at this point in time, it's a trustworthy service or it's a trustworthy platform."

(15:58):
I have yet to find a safe and secure platform for healthcare services. One that has really high levels of industry standards. There have been a few attempts. Sadly, the bit that I find really, really hard to swallow at times here is practitioners also don't want to pay out for platforms when there are free services that people can use and that make their life easy. And to give you an example of that, even one membership body said to me, "Cath, oh, don't make it complicated. We can just use [a named consumer-based video platform]." "Because we all know how to use that one," was the next sentence.

Michael Bird (16:43):
So Catherine, why should our organizations be concerned about our health data, particularly around mental health? And I guess then what happens to it?

Catherine Knibbs (16:52):
This is everybody's responsibility. If we provide a service that's within or under that umbrella of healthcare, we have to consider that the people using that system are not "end users", as we often refer to them. They are people with their own histories, with their own distress and tolerance levels. And when we are using, processing, protecting data, that data isn't just ones and zeros. It's somebody's history. It's somebody's sacrosanct data.

(17:24):
We certainly need education around technology. And what's really happened is technology has become so ubiquitous that we haven't even stopped to consider the ramifications of it. Many of us have begun using this technology without ever taking a step back, taking a moment and saying, "What is it that I'm doing? What is it that I need to think about? What are the risks? What are the dangers? What are the pitfalls? And what are the things I need to do?" Yes, we certainly need more education, legislation that helps us understand this, and most certainly training. And that's one of the points in the cybersecurity article: the recommendation that all healthcare professionals have training in data protection from the outset of their training.

Alysha Kempson-Taylor (18:17):
Perfect. Thank you so much. It's been really great to talk to you. And you can find more on the topics discussed in today's episode in the show notes.

(18:27):
Right. We are getting towards the end of the show, which means it is time for this week in history, a look at the monumental events in the world of business and technology, which have changed our lives. Michael, what was it from last week?

Michael Bird (18:39):
Well, the clue last week was: it's 1869, and this Austrian invention was a real red-letter day. Alysha, what did you think it was? Do you know what it was?

Alysha Kempson-Taylor (18:53):
I have an idea, but I don't know if I'm quite correct on this one.

Michael Bird (18:55):
Okay. Well, I'll tell you what it is and you can tell me if you've got it right. Because hold onto your stamps, it's the invention of the postcard.

Alysha Kempson-Taylor (19:03):
Ah.

Michael Bird (19:03):
No?

Alysha Kempson-Taylor (19:03):
No.

Michael Bird (19:04):
No, okay. Which was this week, 155 years ago. One doctor, Emanuel Herrmann, had suggested so-called Correspondenz-Karten, hopefully I pronounced that correctly, to provide soldiers with a cheap way to keep in touch with home. They were an immediate hit. Three million were sold in the first three months, and within a few months, the rest of the world picked up on them too. Here in Britain, 75 million were posted in the first year of use, with the U.S. following in 1873. Amazing.

Alysha Kempson-Taylor (19:40):
That is pretty cool. Very impressive. And the clue for next week, well, it is 1942. Just copy me and you'll be fine. Any ideas?

Michael Bird (19:51):
No. You know what? The last few I've really struggled with, so no. No idea.

Alysha Kempson-Taylor (19:56):
Well, that brings us to the end of Technology Now for this week. Thank you to our guest, Catherine Knibbs, psychotherapist and specialist in online harms. And to you, thank you so much for joining us.

Michael Bird (20:09):
This episode of Technology Now was hosted by Alysha Kempson-Taylor, and myself, Michael Bird. And this episode was produced by Sam Datta-Paulin, with production support from Harry Morton, Zoe Anderson, Aubrey Lovell, Alison Paisley, Alyssa Mitri and Kamila Patel.

Alysha Kempson-Taylor (20:24):
And our social editorial team is Rebecca Wissinger, Judy-Ann Goldman, Katie Guarino. And our social media designers are Alejandra Garcia, Carlos Alberto Suarez, and Amber Maldonado.

Michael Bird (20:35):
Technology Now is a Lower Street production for Hewlett Packard Enterprise. And we'll see you at the same time, same place, next week. Cheers.