Maria Palombini
Hello everyone, and welcome to season two of the IEEE SA Rethink Health Podcast Series. I’m your host, Maria Palombini, and I lead the IEEE SA Healthcare and Life Sciences Practice. The practice is a platform for multidisciplinary stakeholders from around the globe who are seeking to develop solutions for driving responsible adoption of new technologies and applications, leading to stronger security and privacy protection and universal access to quality care for all individuals. We all know cybersecurity is hot right now. There’s a lot of discussion about the challenges we’re seeing, from breaches to organizational and individual risk. And the field is constantly evolving, from policymakers getting involved to technologists and engineers trying to develop solutions as quickly as the challenges come at us. This season features conversations with experts on this growing challenge of cyber warfare, the breaches, and the technologies that are out there helping us to improve our healthcare while making our data vulnerable at the same time.
So with that, I would like to introduce you to Dr. Becky Inkster, our guest today. A little bit about Becky: she’s a neuroscientist who is passionate about everything from cell phones to genes to jewelry to hip hop, you name it. Becky seems to like it all, and she somehow integrates it into her work very seamlessly. She researches artificial intelligence and machine learning in mental healthcare, computational creativity, ethics and governance, and digital, clinical, and music-based interventions. So you’re going to find that this conversation is very enthusiastic, but also very different from what we’ve traditionally had in our other episodes. So before we get to the core of the work you’re doing, maybe you want to tell us a little bit about your work around cybersecurity, especially your passion for doing things around mental health, including both children and adults.
Becky Inkster
Absolutely. Just to build on the context that you’ve kindly set for me, I am really passionate about digital mental health, and I work very closely with a lot of different digital mental health and wellbeing providers. There’s a lot of support being offered across a wide range of age groups: VR in pediatrics, tangible interfaces and toys to support emotional development in kids, youth peer-to-peer mental health support networks, one-to-one telepsychiatry and psychotherapy, virtual companionship for the elderly to reduce loneliness, and it goes on and on. Those are a few examples across the different age ranges, but with this diversity in tech innovation, this explosion in digital mental health accelerated by COVID, there are a lot of diverse challenges from a cybersecurity perspective. Even hacking into a VR headset is very different from a cybercriminal trying to attack patient records that are FHIR compliant.
We have to think about all these different attack surfaces that are very vulnerable, especially when we work with some of the most vulnerable people. And so, given the surge in the supply and demand of digital mental health and wellbeing, I argue that cybersecurity needs to go straight to the top; it has to be one of the highest priorities, alongside privacy by design and a lot of other issues. The World Economic Forum recently released a white paper, and the word cybersecurity was only mentioned once in 71 pages, which is around 26,000 words. I think that sums up where digital mental health is in terms of thinking about cybersecurity. I really do want to make this a priority in our industry. As for the trends I’ve noticed: as you’ve mentioned, Maria, these are difficult times for healthcare, where breaches are at an all-time high.
I read one report showing that almost 900 million data records were compromised worldwide in January 2021 alone, and that’s more than in the entire year of 2017. I recognize that mental health data falls within the category of health data. We absolutely want parity of esteem, which means that we want to value mental health equally with physical health. But I argue that, from a cybersecurity perspective, mental health data needs extra attention and extra scrutiny, as it’s extremely sensitive in ways that perhaps people haven’t fully thought about. A crude example: in no country is cancer illegal, but attempting suicide is a crime or prison offense in certain countries, and even disclosures of sexual orientation or gender identity could put people at risk or in danger. The reason I mention mental health, sexual orientation, and suicide together: as one example, there was a groundbreaking report by The Trevor Project involving over 30,000 young people between 13 and 24 years old.
They found that 40% of those who identify as LGBTQ+ had seriously considered suicide in the past year when they were surveyed in 2018. So a lot of these issues are very personal, and they spread beyond what you might normally think of as a mental health concern. Another trend that I, and many people, have noticed is that cybercrime has evolved beyond just encrypting data to essentially blackmailing or extorting especially vulnerable people. A lot of people know the example of Vastaamo, but for those who don’t, this was Finland’s largest psychotherapy provider, which went bankrupt. They treated tens of thousands of patients across multiple centers in Finland, and they experienced data breaches where confidential client therapy session notes were stolen, along with other personal information. When the cybercriminals went after the provider and the provider refused to pay, the criminals started to blackmail victims directly, and this included children.
So we’re seeing this shift, or a potential trend, of going directly after vulnerable people, and disclosures of this type of sensitive information could really endanger victims and others. For example, there’s often an emergency contact, or details of another person named during a therapy session; they might have abused the individual or somehow been connected, and information gets disclosed about them too. There are things such as previous suicidal thoughts and attempts. And if someone is approached to pay a ransom while they’re vulnerable, this could trigger further issues if they have already experienced financial hardship or debt. It really goes on and on; it could be naming sexual abuse victims, abusers, et cetera. Generally speaking, globally there’s still a huge amount of stigma around mental health, and extorting vulnerable people could have a really harmful impact, one that could be life or death or trigger very serious, immediate consequences.
And then, just to tie up, there are two other trends I’ve noticed that don’t necessarily relate directly to mental health yet, but that I want to bring awareness to. The first is a possible trend relating to cybersecurity insurance: AXA is no longer covering ransomware payment reimbursements; they’ve changed their policies in France. In digital mental health, we have to be very aware of such trends. Often in our field we are small and medium-sized businesses, which could be deemed soft targets, and many of the providers are vendors to large enterprises that are just starting on their journey. I also wonder whether motivations will move beyond purely financial gain toward hacktivism and other types of targeted efforts, for example, hospitals being forced to release a patient, or cybercriminals targeting human rights organizations. What I’m trying to say is, I wonder whether the motivations for cyber attacks might become more complex rather than just financially driven.
But that’s just my own personal concern coming from mental health. And you mentioned new approaches: what new approaches am I looking at, and what areas of research? For me, a very successful approach has been to bring providers together to have that conversation, and that conversation could be about anything from cybersecurity to collecting data. Not too long ago, I brought together over 50 providers in the digital mental health and wellbeing space, and together they gathered data insights from millions of people around the world to show the impact of COVID on mental health and wellbeing. That really got me thinking that I should launch a cybersecurity project, because safety is a non-competitive issue. We’ve created this project, and we’re at the beginning of the journey, but we want it to be a huge opportunity for providers to be proactive and to examine the current state of cybersecurity in the digital mental health space, so that it can help towards creating coordinated standards and responses to cybersecurity threats and attacks within our industry.
I’ve started working with ethical hacker Alissa Knight to examine API vulnerabilities in digital mental health. I’m also working with XRSI, the X Reality Safety Initiative, supporting the development of standards for safety and security in XR environments. And one last new area of research that I’m really keen to map out: looking not just at how to fix the systems and address these threats and attacks, but at how breaches map to clinical and psychological outcomes. What is the impact on individuals of interruptions to mental health service provision, on outcomes for victims, or even for service users of the platform who may not have been directly affected? We’ve seen in previous research that in cases of heart attack, clinical outcomes worsened after a hospital data breach. So I think it’s really important to see what happens to risks of self-harm, suicide, and substance abuse, and how we can mitigate those risks, for example by having victim support centers ready to address issues of continuity of care after a breach.
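For readers who want to see what examining API vulnerabilities can look like in practice, below is a minimal sketch of a broken object-level authorization (BOLA) check, one of the most common API flaws found in health apps. It authenticates as one test user and asks whether the API will hand over another user’s therapy-note record. The base URL, paths, record IDs, and token are hypothetical placeholders, not any real provider’s API, and probes like this should only ever be run against systems you are authorized to test.

```python
# Minimal BOLA (broken object-level authorization) probe: authenticate
# as test user A, then request a record belonging to test user B and
# check whether the API wrongly returns it. All URLs, paths, IDs, and
# tokens below are hypothetical placeholders for illustration only.
import requests

BASE_URL = "https://api.example-mh-provider.test"  # hypothetical host
MY_TOKEN = "token-for-test-user-a"                 # hypothetical token
MY_RECORD_ID = 1001                                # belongs to user A
OTHER_RECORD_ID = 1002                             # belongs to user B

def fetch_note(record_id: int) -> requests.Response:
    """Request a therapy-note record using user A's credentials."""
    return requests.get(
        f"{BASE_URL}/v1/therapy-notes/{record_id}",
        headers={"Authorization": f"Bearer {MY_TOKEN}"},
        timeout=10,
    )

if __name__ == "__main__":
    own = fetch_note(MY_RECORD_ID)
    other = fetch_note(OTHER_RECORD_ID)
    print(f"own record:   HTTP {own.status_code}")
    print(f"other record: HTTP {other.status_code}")
    # A well-behaved API returns 403 or 404 for another user's record.
    if other.status_code == 200:
        print("Possible BOLA: user A can read user B's record.")
```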
Maria Palombini
That was a very powerful and insightful opening, and I think now you all know why I enjoy talking to Becky so much: she gives you a lot of great information. I can sense, and I’m sure our listeners can sense, your deep enthusiasm, passion, and motivation for your work. Maybe share with our audience a little bit about what inspires and motivates you to look at these new things, pursue all this research, and take it to the next level?
Becky Inkster
I really like trying to make connections between things that are extremely far apart and then link them all together. For me, it starts with identifying a huge blind spot, being able to take a step back, not rushing into developing technology, and just seeing the horizon of risks, what could potentially be ahead for the longer-term future. I find that very inspiring. It also occurred to me that I’m really motivated by working with digital mental health and wellbeing providers and combining this with cybersecurity experts, because both of these groups are really focused on safety: safety of data, safety of care. When you combine these two groups, as I’ve just started to experience, it’s unbelievable how you can start to get exponential output.
Maria Palombini
As many of you have noticed, Becky touched on this: when it comes to mental health, we’re starting to see more and more focused attention, much as we saw with telemedicine, as a result of the consequences of the COVID-19 pandemic, people in isolation, post-traumatic stress, and so on. We’re not over it yet, and these issues of mental health are coming more to the forefront, from children through adults 30 to 40 years old, all the way up to the older generation. With that, we even saw the US FDA put out a digital health enforcement policy around treating psychiatric disorders during the coronavirus disease 2019 public health emergency using these remote health devices. So what are some of the concerns around security, vulnerabilities, and patient privacy when it comes to the use of these technologies, when we’re talking about, as you mentioned, a very vulnerable population right now?
Becky Inkster
I think what you’ve just identified, which I can elaborate on, is a serious imbalance. We’ve seen the positive side of this surge in demand and supply of digital mental health providers, which is excellent, but with the FDA regulatory change in April 2020, many providers were spurred to move a lot faster than they had planned. A lot of providers wanted to take advantage of this wide-open door, which had never been opened before; previously it was tricky even to get your foot in the door. I think this created a really big imbalance, and the World Economic Forum, to quote them, said it really nicely: this imbalance was between time to market and time to security. It really does create an enormous attack surface filled with endless vulnerabilities for cybercriminals to exploit, especially API vulnerabilities.
I personally have witnessed this firsthand, where providers get very excited about accelerating their technology, products, and services, but don’t think further into the future about things like appropriate breach responses. Have they mapped out their responsible disclosures? Have they considered budgets and thought about victim support and safety? Have they considered GDPR and having to report breaches, and all these types of things? Quite a few, to my knowledge, really haven’t considered this at all, let alone having a budget for cybersecurity to begin with. When it comes to security and patient privacy, we’re really at the beginning of that conversation in digital mental health. That’s why it’s so important to bring in cybersecurity experts to help us, but we also have to feed back and explain just how sensitive this information is. A separate point, not on cybersecurity per se, but related: data grabbing on a historical scale and the loss of patient privacy.
Since you mentioned security: the NHS recently announced that it was going to create a database with 55 million patients’ medical histories to be shared with third parties to improve research and planning. Now, obviously that’s not an attack, I would never say that, but there are some elements that feel similar. Patients in the UK, including myself, have a very short window to try and control the privacy of our data, and we have to opt out by printing a piece of paper and sending it to our GP. Many people still aren’t even aware of this issue, that it’s involuntary sharing of their data, and not just prospective data but past data too. It’s a big issue here, and it’s not cybersecurity, but it’s still unclear who will use this data and for what purpose. So it resonates in this strange way, and it certainly feels like an invasion of privacy, taking data without consent, transparency, or public debate, especially when it includes private, sensitive data like criminal records, mental health episodes, and smoking and drinking habits. It really is everything: diagnoses of disease, dated instances of domestic violence, abortion, sexual orientation. So while it’s a very different scenario, there are a lot of similarities in these issues.
Maria Palombini
Absolutely. So for all of you out there, I guess many of you are thinking, well, we’re talking about cybersecurity for people who have access, right? And this is really the point I want to get to with Becky, because I think it’s important. It is assumed that these remote technologies, tools, and models are for hard-to-reach patients in need of mental health treatment, and advocates often emphasize that through the use of these technologies, the healthcare industry can do more by promoting the concept of self-care and democratizing patient health data. So Becky, in your research, do you find this to be true? Or do you see that it only applies to certain populations, like established versus emerging economies, or the connected versus the unconnected? Are you seeing any sparks, or lack of sparks, in this area?
Becky Inkster
I’m seeing a lot; I’m seeing almost every type of combination and variant here. I work at the extremes, with people who are severely unwell with mental illness, right across the wellbeing spectrum, and similarly with people who don’t have access, maybe through financial hardship or for other reasons, such as having left prison to re-enter the community. There are a lot of different groups that I work with. I’ve seen tech fail for very sick people. I’ve seen unconnected people excluded from things that could probably really help them, and people becoming preoccupied or overburdened by self-care responsibilities. So you see every possible angle, but I really do believe that there’s nowhere near enough support for hard-to-reach communities from a technology perspective.
And where support does exist, it’s mostly linked to depression, anxiety, and some of the more common mental health conditions. We need to go deeper and reach people who are unable to access or use technology, and we need to understand how we help people who are experiencing homelessness, for example. It’s not going to be the same solution, and it’s not as simple as just handing a phone to someone who has drug dependencies. It really isn’t that simple. On the flip side, though, because I don’t want to always see things through one lens: for eating disorder group therapy, there was research showing that digital was just as good, if not slightly better, partly through the comfort of being in your own space and not having to go to a physical place. Maybe you don’t want your body on display or to be judged in a physical space, or perhaps, if you’d been crying after a session, you don’t want to have to leave a physical space. So there are a lot of benefits too, just to balance that out.
Maria Palombini
Interesting. So it’s not a one-size-fits-all approach. We definitely have to look at where things are working, and maybe there are learning cases, right? What seems to be working for one group can be amplified and potentially help another group. That’s the beauty of the research.
Becky Inkster
Yep.
Maria Palombini
Our focus obviously is on the privacy and protection of all individuals, from children to older adults, in using these digital health toolkits and remote technologies of that nature. But when it comes to pediatrics, or the role of a guardian or caregiver with these new technologies, there always seems to be a kink in the chain, some challenge we have to figure out. I think one of the big things is around children. With end-to-end encryption and anonymization in the data chain, a parent or clinician might wish to access the content of a young person’s digital mood diary, say, for safeguarding. Does the duty of care override privacy rights? And does that negatively impact the treatment’s effectiveness? I’d love to hear your perspective on this.
Becky Inkster
I could go on for a long time about this; it’s a huge area of focus that I care deeply about, so it’s an excellent set of questions. I think the simple answer should be that duty of care overrides privacy: if someone has harmed themselves or threatens to harm themselves or others, this needs to be reported and confidentiality needs to be broken. But in digital spaces, it’s just not always that clear-cut, and it’s not as easy to protect someone, especially in anonymous settings. Obviously, parents can do things like making sure they’re aware of passwords. And I should say parental monitoring is a very good protective factor against young people developing mental health problems later in life.
So it’s extremely important for parents to be involved, but increased privacy doesn’t always equal increased protection, and there’s a strange juxtaposition there that we need to start teasing apart. I’ll give an example here in the UK: according to the National Crime Agency, one of Britain’s most prolific pedophiles would not have been brought to justice without the use of social media data. Now Facebook is planning to implement end-to-end encryption in its messaging services, and this has caused a lot of concern for police about their ability to identify predators who abuse children; it could make things easier and safer for predators and make children even more isolated. The role of parents and clinicians in staying aware of these issues is extremely important, especially when predators are pretending to be someone else, particularly someone in a position of trust, and especially on a digital mental health and wellbeing platform, where there’s already a very real risk that young people talking about their vulnerable state could disclose information that a predator could use to exploit them or take advantage of.
So I think with this juxtaposition between cybersecurity and encryption on one side and keeping everyone safe on the other, there’s a huge issue we need to tackle, because if we leave young children and young people vulnerable to predators, these adverse childhood experiences, which we call ACEs in mental health and psychiatry, are a huge predictor of poor mental health outcomes later in life. We have to be so careful about how end-to-end encryption is rolled out in digital mental health settings, if it’s rolled out at all. This juxtaposition between preventing cyber attacks and keeping data secure has to be balanced against the potential harm of not being able to monitor for signs of abuse. It’s unrelated to mental health, but in the news right now, one of the headlines is that WhatsApp announced they’re taking the Indian government to court over a controversial new law that would increase the government’s ability to monitor online activity. The law would require Facebook to remove encryption so that messages could be pulled into a database and monitored for illegal activity; you could see mental health fitting into that. That’s an ongoing issue, where Facebook said they won’t store user data in this way and have launched a legal challenge on that basis. It’s a really big issue, but in mental health in particular, we’ve got to figure out how to balance it.
Maria Palombini
I often hear about responsible data use when we’re talking about any kind of medical technology, and I know with mental healthcare we have to be extra sensitive about how we define responsible use of the data coming out of these devices. So based on your research, is there some sort of industry-defined standard as to what constitutes responsible data use, security, and privacy in developing or testing new technologies for mental healthcare?
Becky Inkster
This is just my lone voice, and this is why I want to work with so many different providers; as I mentioned, I’m trying to gather these providers together to get their views on this as well. But I’m also a co-founder of a mental health and wellbeing venture, and through that startup journey, from the inside perspective, I personally don’t feel there’s a strong sense of support in terms of how we follow industry-defined standards. Quite often we make very strong ethical judgments on our own, because we want to be ethical, not just to follow legal or industry-defined standards, but it becomes very difficult. And I think that’s part of the reason why I wanted to gather all of these providers together: to almost create our own set of standards or discoveries that could then be embedded into something of bigger significance.
There’s this tricky trade-off around data minimization. On one side, you don’t want to collect anything you don’t need, and you don’t want to keep it; make sure you’re churning through your data, and if you’re not using it, get rid of it. On the other side, you need to collect enough to tell a full, valid truth about someone’s experiences or mental health status. It really is a tricky balance: trying to support someone on their journey without running into issues with false positives or making inaccurate predictions. I worry a lot about responsible data use. Looking out at the provider space not too long ago, I was approached by a whistleblower making extremely serious allegations involving a data cover-up within a digital mental health and wellbeing space. Being responsible with data and looking to these industry-defined standards, I think standards only get you so far; how people decide to act within their provision is a very separate issue that we need to cover more. So I really do worry a lot about responsible data use. I know standards can get us to a certain level, but that’s why it’s so important for providers to come together and share their issues, to link the standards to real-world practice on the front line of digital mental health.
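To illustrate the data-minimization half of that trade-off, here is a small sketch of a retention purge job that deletes records once they age past a fixed window. The table name, schema, and 90-day window are assumptions chosen for illustration; a real retention period would be set by the provider’s legal and clinical obligations, not by the code.

```python
# Illustrative data-minimization sketch: purge mood-diary entries older
# than a fixed retention window. The database path, table name, column
# names, and the 90-day window are hypothetical placeholders.
import sqlite3

RETENTION_DAYS = 90  # assumed policy window, not a recommendation

def purge_old_entries(db_path: str) -> int:
    """Delete entries older than the retention window; return the count."""
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM mood_diary "
            "WHERE created_at < datetime('now', ?)",
            (f"-{RETENTION_DAYS} days",),
        )
        return cur.rowcount

if __name__ == "__main__":
    deleted = purge_old_entries("provider_data.db")  # hypothetical DB
    print(f"Purged {deleted} expired mood-diary entries.")
```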
Maria Palombini
Absolutely. Are we approaching a time when doctors prescribe mobile apps, games for adolescents, or virtual reality kits to treat social anxiety or mental ill-health, rather than medication or talk sessions? Yes or no?
Becky Inkster
No, and now I’ll explain. We have to tread very carefully with this type of discussion. That question reminds me of when I was interviewed by a media outlet about a hip hop therapy paper we published, and the media wanted to use the headline “stop taking your medication and just listen to hip hop.” We were horrified. We obviously didn’t let them go ahead with that headline; we just completely parted ways. Abruptly stopping medication can have such serious consequences that we have to steer away from people thinking this one thing can do it all and nothing else matters. Obviously there are increasing numbers of treatment options becoming available.
But we have to remember that mental health can be seen from a biological, psychological, or sociological perspective. With all these different factors, each person’s needs are different, and their treatment plan, or how they’re supported, will differ as a result. While all these exciting treatments are emerging, gamification and all sorts of interesting tech innovation, we have to acknowledge that each option has its own important way of contributing, and we have to fit the right pieces together differently for each person. The first point I’d want to make is that medication is still a very important option, and patients who choose to go down this route can benefit from it. Medication adherence has always been an issue in mental health, but we’re starting to see tech innovation, like digi-meds (digital medicines), trying to help patients adhere to their medication, improve outcomes, and feel better.
And within medicine, again, there are other emerging drug industry trends. We’re seeing psychedelic drugs in phase three trials to treat mental illness, combining talk therapy with medication and leading to positive outcomes as well. There’s still a lot of great work being done in that space. What I’m excited about is the evolving concept of talk therapy: making it more informal, so that chatting about your mental health while gaming with others can be very therapeutic and less prescriptive, opening conversations and dialogues about mental health with peers. Or chatbots can just be there to listen while someone types or speaks a very intense expression of how they feel. So talk or chat therapy is really evolving, and it shows a lot of promise.
An extension of that, and an area I’m interested in, is music therapy. I think this is really going to start to do great things when combined with technology, allowing people to express themselves in non-prescriptive ways, asking people how they feel but more on their terms. It might make things a little more accessible, especially for hard-to-reach groups. We can’t always use prescriptive or clinical approaches, because with things like the dark web, people want to be as far away from clinical spaces as possible when they seek support, even, for example, when looking for a suicide partner. People go where they want to go when they’re seeking support, and we have to make sure that all of these options are available and that we keep people as safe as possible. Because we’ve seen this surge in the supply and demand on the wellbeing side of things, the mild-to-moderate mental health side, it’s made me curious whether in coming years we might see a stronger push away from medical treatments, or attempts to blend and then taper medication to reduce side effects, reducing medication in treatment plans and adding other options in, like talk therapy and physiological measurement. That’s one thing I’m curious to watch as that space gets bigger and bigger.
Maria Palombini
So Becky, you touched on this in your introductory remarks: how much time, from a development and design standpoint, is being put towards cybersecurity and privacy risks? Do you find that developers have this at the top of their agenda, or is it more about ease of use for the patient, extending battery life, making the product more accessible, that kind of thing? We see a lot of focus on human factors and usability, but sometimes it feels like the attention is not so much on the cybersecurity and data protection side. Do you find it’s the same in digital mental health technologies, or are you seeing developers who are more focused on the privacy and protection side? Is that top of the agenda?
Becky Inkster
Yeah, I really wish I could say we were, but no, I think it’s exactly those issues you mentioned, as in healthcare more broadly. If anything, I’m a little worried that we are lagging even further behind in considering cybersecurity and privacy risks. Even when I work with mental health providers who have ten-plus years of experience and a lot of data that needs protecting, there are still serious issues and concerns, because the threats and the attack surface are constantly evolving. At the same time, the journey of that ten-plus-year provider can be very beneficial for providers who are just starting out. I think it’s important to work with both extremes, especially those who are just starting the journey, because that’s exactly when we can start to build in a cybersecurity culture and really get them thinking about all the different issues in that space. This whole idea of privacy by design, really thinking about it from the beginning, gives us an opportunity to go from way behind to right at the forefront.
Maria Palombini
Absolutely. So I ask this of all my guests, because there’s always so much diversity in the answers. Many have argued that regulators and policymakers should do more to require the developers and technologists behind software and connected health technologies to build in security features, whether from a privacy-by-design perspective or just an engineering perspective. So my question to you is: do you think policy needs to stand up and require it? Or is it more a combination of everything: we need more standards, we need policy to set it up, and we also need the industry to step up, come together, and help address the problem? Where’s your perspective on this?
Becky Inkster
My disappointing answer is that it’s everyone’s problem. But I will say again, as you’ve noticed, I’m really coming at this from a provider perspective, and I think providers are at the heart of it all. They’re the ones on the front line; they’re facing the real issues and making the decisions. And yet in my field, a lot of these decisions or discoveries are not being captured and either fed up to the powers that be or embedded in various decision-making processes. So my answer is that we have to start listening to providers more, because they’ve got a lot to say that could help us with these issues. And one group that wasn’t mentioned in the examples you gave, which is especially important for mental health, is people with lived experience. There’s research showing increased mental health challenges and burnout in cybersecurity, and it might be very interesting to look at dual expertise: experiential mental health and wellbeing knowledge combined with professional knowledge, this crossover between cybersecurity experts and those who have faced mental health problems themselves. I think that’s another really important source from which we can learn a great deal about what’s working, what’s not, and how we roll this out. So those are two of the angles I always like to think of, but of course it’s everyone’s problem.
Maria Palombini
What would you say is the most important call to action, whether for healthcare professionals, hospital facilities, clinicians, engineers, policymakers, or patients themselves? What’s the most important action you can impart to them to start mitigating the risk in the use of these technologies?
Becky Inkster
I don’t have an answer yet, but this is exactly the question that I want to ask at my summer conference. We’re going to be discussing that exact issue and trying to figure out how we rank the vulnerabilities and how we come up with decisive actions. We’re very fortunate to have experts like Alissa Knight who can really help us. The conference that I run is Digital Innovation in Mental Health; normally it’s held in the precincts of Westminster Abbey, but we’re obviously virtual at the moment. This is the main thing I want to cover at the conference: how to make an impact, especially from a cybersecurity perspective, while equally looking at online child protection issues and balancing the two. So while I don’t have anything to say just yet, that is really what we’re trying to tackle in the coming months and at the summer conference. I’m hoping that people now really appreciate the importance and the extra-sensitive nature of mental health data. We should have the highest standards and the most support; we’ve got to come from the back of the queue and get to the front somehow.
Maria Palombini
Excellent. So for everyone listening, please take a look at the blog post for the link to this upcoming conference in August. IEEE SA is also going to be participating, because this is an important initiative for our work here in the Healthcare and Life Sciences Practice. For all of you who may want to attend, just get involved. Becky has shared many great concepts with us, and a lot of the points she covered map to activities we have here in the IEEE SA Healthcare and Life Sciences Practice: virtual workshops (because we’re all virtual), incubator programs, and standards development projects. As many of you may have heard from our previous episodes, we have a Global Connected Healthcare Cybersecurity virtual workshop series, in which Becky participated as a facilitator in one of our virtual breakout sessions; the sessions from February, April, and June are available on demand.
The live sessions are coming in September and November as part of the five-part series, so we hope you can join us for those. Plus, we have plenty of incubator programs around wearables and medical IoT interoperability and intelligence, decentralized clinical trials, and, of course, telehealth security, privacy, and accessibility for all. So we’re covering many of the areas Becky touched on in our conversation. If you want to learn about all of these activities, you can visit IEEESA.IO/CYBER2021. We hope you come and check it out, and if you have an idea or want to get involved in any of these activities, the best way is to express your interest and tell us, so we can make sure you can bring your expertise and time to finding a solution for everyone. With that, I want to thank Becky for joining the conversation today and for her time. And I want to wish everyone to continue to stay safe and well until next time.
We are seeing rapidly growing trends toward the application of digital therapeutics for mental health conditions, accelerated in great part by the COVID-19 pandemic. This increase was further validated by the Enforcement Policy for Digital Health Devices for Treating Psychiatric Disorders During the Coronavirus Disease 2019 Public Health Emergency, issued in April 2020 by the US FDA (Food & Drug Administration).
Are we approaching a time when doctors prescribe mobile apps, games for adolescents, or virtual reality to treat social anxiety and/or mental ill-health rather than medication or talk sessions? If YES, then managing cybersecurity and privacy risks must be top of the agenda.
Listen to this episode to understand the opportunities and growing challenges in managing duty of care, security, and privacy with a highly vulnerable population of patients with an easily accessible suite of mental health digital therapeutics.
Related Resources:
- DIMH2021: Digital Innovation in Mental Health
- IEEE SA Ethical Assurance of Data-Driven Technologies for Mental Healthcare Industry Connections Program (ICAID)
- IEEE SA Healthcare and Life Sciences Practice
- IEEE SA 2021 Global Connected Healthcare Cybersecurity Virtual Workshop Series
- IEEE SA IoT Ecosystem Security Industry Connections Program
- IEEE SA Transforming the Telehealth Paradigm: Security, Privacy, Connectivity, and Accessibility for All Industry Connections Program
- IEEE SA Tech and Data Harmonization for Enabling Decentralized Clinical Trials Industry Connections Program
- IEEE Global Wearables and Medical IoT Interoperability & Intelligence (WAMIII) Program
About the Guest:
Dr. Becky Inkster
Cambridge University, UK
The Alan Turing Institute, UK
Lancet Digital Health, International Advisory Board Member
Self-Employed Neuroscientist
I am a clinical neuroscientist, seeking innovative ways to improve our understanding and treatment of mental health in the digital age. I apply measured optimism when working across artificial intelligence-enhanced mental healthcare, neuroscience, and digital-, clinical-, and music-based interventions. I am passionate about patient data privacy, cybersecurity, ethics, governance, and protecting vulnerable populations. I provide cross-sectorial guidance and leadership to numerous institutions and companies (e.g., academia, technology, human rights, mental healthcare, and government).
Follow Dr. Becky Inkster on LinkedIn.