Podcast: Contact Tracing Applications and Technologies Beyond COVID-19

Re-Think Health Podcast Series Season 1


Contact tracing technologies and applications (CTT/CTAs) have become a major focus in the new norm of COVID-19. There are many ethical and validation challenges surrounding the use of these technologies, challenges that were in play before the pandemic and will continue after it.

Maria Palombini, Director of Emerging Communities & Opportunities Development and Healthcare & Life Sciences (HLS) Practice Lead at the IEEE Standards Association (IEEE SA), interviews Ali Hessami, Innovation Director at Vega Systems Ltd and Chair and Technical Editor for the IEEE P7000 Standards, about the ethical considerations around contact tracing technologies and applications used to mitigate the spread of the pandemic while protecting personal privacy and public health.

Re-Think Health Podcast Series is part of the IEEE SA Voice program. IEEE SA Voice shares insights and perspectives from the IEEE SA community, subject matter experts, and industry leaders that are working to raise the world’s standards, drive market solutions, and much more, keeping you at the forefront of technological innovation for the benefit of humanity.

Stay up to date on the latest podcasts and HLS-related activities at IEEE SA.

Subscribe to our feed: Podbean, Apple Podcasts, Google Podcasts, Spotify, or TuneIn.

About This Episode’s Guest

Ali Hessami

Ali is a physics and electronics engineer with a track record in risk assessment and management, knowledge and talent/competence management, and the design of mission-critical systems. He has experience in safety, security, and sustainability assurance for complex products and projects, developing European and global safety/security standards, and technology ethics certification, including COVID-related contact tracing technologies. His general interests include photography, cosmology, mysticism, and culture.

Follow Ali Hessami on LinkedIn.

[toggle title=”Full Transcript” state=”close”]

Maria Palombini:
Welcome, everyone, to the next edition of IEEE SA's Re-Think Health podcast. Today we're going to be talking about the great debate in contact tracing technologies and applications: ethical considerations and protecting personal privacy and public health. I'm your host, Maria Palombini, and I'm the leader of the IEEE Standards Association Healthcare and Life Sciences practice.

You may wonder why we are doing this podcast series. There are so many new technologies, applications, and scientific breakthroughs, and, more frequently, recurrences of unexpected natural disasters such as pandemics. That makes us really have to think: how are we going to rethink the healthcare system so that we can deliver better care for you, me, and anyone who is a patient? We're all patients, so we all should have the right to better care. This series will feature guests from technology, ethicists, clinical and medical researchers, advocates, and any other committed, passionate stakeholder who is pushing the boundaries in our approach to better healthcare.

We're not just talking about bedside practice or therapy development. We're looking at any aspect that impacts our care. How do we make it better? And how do we make it universal for everyone? So, with that, today I would like to welcome Ali Hessami to our discussion. He's going to talk about contact tracing technologies and applications, a very hot topic thanks to COVID-19. They've been around for a while, but now they've come to center stage, and it's something we're all debating: public health versus personal data privacy.

So, with that, I want to first thank you, Ali, for joining us, and welcome to the series. Tell us a little bit about the work you do as Innovation Director of Vega Systems, but I know you're also heavily involved in a lot of the work in IEEE SA's ethics programs. So perhaps you can give us a little information about that.

Ali Hessami:
Of course. My role at Vega Systems has been largely focused on tackling critical problems through innovative technology and solutions. That borders on many areas, such as systems engineering, which is my general background as a physicist, as well as the knowledge and risk management practices that I've been applying over many decades in industry, on top of involvement in academic research, publication, and teaching. I got involved with the SA roughly four years ago, having been an IEEE member for over thirty-five years. With the SA, I got involved with the P7000 technology ethics standard, initially as a working group member, and was ultimately honored to accept the role of chair, to process-manage and architect an approach to ethics certification. I was given a fairly brief, free remit, and since then I've been largely involved in the Ethics Certification Program for Autonomous and Intelligent Systems, ECPAIS for short. We have been exploring what kinds of qualifications are needed for technology-embedded products and services, largely in the autonomous and intelligent category, to make them ethically and societally acceptable and to make them successful. And our latest initiative has been the application of a similar approach and thinking to contact tracing and proximity tracing risk mitigation technologies.

Maria Palombini:
So, with everything being virtual, we have great experts speaking on our podcast, but I always like to humanize the person behind the expertise. You're very passionate about this topic, and I know this because I see all the different people involved in the P7000 series and the ethics programs. Maybe you could share a little bit about what personally drives you.

Ali Hessami:
If I may indulge: I'm a follower of mysticism of the Eastern variety, and there are a couple of people I revere in my private life. These people really focused on paths to fostering harmony in society and spirituality. I can quote from my favorite work from Rumi. He says this world is in deep trouble from top to bottom, but it can be swiftly healed by the balm of love. And that, frankly, translates into what can build bridges of goodwill and understanding over the barriers we encounter in life. To me, ethics and respecting other people's differences make an excellent bridge-building environment. That's what enthused me to get involved in the P7000 series of standards, a number of them indeed. I was later greatly honored to embrace the responsibility of developing ethical certification criteria, again with the same aim: to promote and foster ethical values in technologies that are ultimately devised for the benefit of humanity.

Maria Palombini:
Great, thank you. And for our audience who may not be familiar, IEEE has a globally renowned initiative on autonomous systems. You can definitely find it on our website at https://standards.ieee.org [or a link in the resources section of this blog post], if you're interested and just as passionate as Ali and the hundreds of other individuals who are participating in those projects.

Right now, I want to get to the core of what we're here for: contact tracing technologies. We know they're in the news and everybody's talking about them thanks to COVID, but they've been around for a while; they just weren't so widespread, and most people don't know that. From your research, what does it actually mean when we say contact tracing technologies and applications, regardless of whether we're using them for an infectious disease such as COVID or some other public health matter?

Ali Hessami:
Going back to the story of how human societies, effectively and globally, have responded to the threat and the major scourge of this pandemic, we have really not been very successful. We haven't been prepared adequately. We haven't responded with sufficient insight and effectiveness. That gave us the impetus: as IEEE's tagline implies, we are really trying to contribute to the advancement of technology for the benefit of humanity. Since there was already an ongoing project on the ethical qualification of products and services, in consultation with the Managing Director of the SA we agreed that a line of work on how to make contact tracing more successful globally, as currently the only technological solution available to us to mitigate the risks of the pandemic, would be in line with our strapline and our philosophy at IEEE.

So, contact tracing, or proximity tracing as it's sometimes called, is basically using any form of technology to identify how close we have been to other contacts in any context: family, social, or general public contacts. In the event that one of those people who has been in proximity with us tests positive, having possibly caught the virus, it informs the others to take protective measures and stop the spread. So any such technology, whether it's an application that you download on your mobile device or a wearable technology, was the subject of the study that started some eleven weeks ago. In a fairly fast-track project, we managed to get to the bottom of the essential ethical criteria that apply to any such technology, not just a particular application of it, and that are necessary for public trust in these technologies, so that they will become more effective and ultimately end up saving lives.

Maria Palombini:
For sure. It's definitely something I think people are still grasping, from the common citizen to the technologist to everybody. In the last six to eight months, since contact tracing technologies started to surface, we've seen various challenges with them, whether it's some sort of central repository of information, or that they're using a Bluetooth proximity tool, so that if somebody within fifty feet has tested positive, your app would indicate that that person has it. All of these, as with anything, come with deficits. So what are some of the deficits that you're seeing in these tracing technologies or applications that you really feel the technical community, ethicists, scientists, or whoever has to come together on, to ensure that we have citizens' consent to participate in them, as well as that we can safeguard confidentiality?

Ali Hessami:
Indeed. If you look at the history, and this is only the last few months, of the successes and lack of successes in rolling out these technologies as a short-term solution while we're still waiting for vaccines against the virus, for the global community, technologically, this is one of the very few promising solutions that we have. But like any technology, this has been quickly put together as a means to an end, the end being to try and reduce contamination and reduce the spread. And this is one very simple and technically feasible solution that we don't need to do a lot of R&D on.

Of course, on the downside, there are many dimensions, but those dimensions often have to do with ethical properties, in terms of: How transparent is the way these technologies operate? Who is behind these developments? What is the mindset? Is there a clear concept of operation? Is there a reasonable attempt at ethically architecting such solutions, so that our data are not kept on central servers to be hacked and abused? Is there any form of confidence in the ecosystem behavior? It's not just the technology, it's the other players. And all of this includes how we operate and how we keep vigilance during operation, and eventually the demise and retirement of such technology, whether it's a downloadable app or an electronic product.

The technologies have some shortcomings, such as a lack of precision in proximity measurement: what does exposure mean, and how long is long enough to be considered at risk? That's because they use Bluetooth Low Energy, which is embedded in most mobile devices for other purposes, as the means of detecting proximity to other people. On the ethical side, there are matters of transparency in communicating with the citizens of the society you're trying to protect: How is the device architected? How does the overall operation work? Who are the stakeholders? Who would have access to our data? Would it be shared with other government agencies, et cetera? There's a significant lack of transparency in such matters. Governments, rushed by the desperate necessity to come up with a solution to get out of lockdown and go back to some form of normal economic activity, are rushing to adopt these technologies from private enterprises without sufficient clarity or transparency.

The second part is a lack of sufficient accountability, in the sense that we want to know who the key decision makers are, where our grievances go when things go wrong, and who the responsible people or bodies or entities are, especially since we are talking about massive, large-scale adoption and implementation of such solutions in such a hasty manner. The final aspect is matters of privacy. We know that in Europe we have protection of privacy by legislation, called GDPR. But we're talking about ethical privacy: in what manner our personal data is respected and regarded as private to us, rather than becoming some entity's property, whether it's a public entity, a government entity, etc.

So these three matters were the focus of the fast-track study that we started sometime in late June, and we managed to put out a fast interim report on what kinds of actions, behaviors, practices, and policies can safeguard against abuses of accountability, transparency, and privacy. That interim report was issued in July, and the team and myself have been working hard to complete the study. I'm pleased to announce that the study has been completed. Now we are in the final stages of putting a more comprehensive set of criteria into a report that's going to be shared by the SA under a Creative Commons attribution, non-commercial license with the whole international community, with the focus of helping governments, cities and states, and public and private duty holders to actually declare how they have gone about preserving these attributes of privacy, transparency, and accountability in the architecture and the solutions they're rolling out, providing trust and confidence for the public and more effectiveness of the technology to save lives.
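For readers curious about the proximity-measurement imprecision Ali describes, here is a minimal, hypothetical sketch of how a Bluetooth Low Energy based approach can turn received signal strength (RSSI) into a rough distance estimate and then flag an "exposure." It is not the method of any particular deployed app, nor part of the IEEE report discussed here; the path-loss constants and the two-metre / fifteen-minute thresholds are illustrative assumptions.

```python
# Illustrative sketch only: RSSI-based proximity and exposure flagging.
# The constants below are assumptions for demonstration, not values used
# by any real contact tracing system.

from dataclasses import dataclass
from typing import List

TX_POWER_AT_1M_DBM = -59.0      # assumed RSSI at one metre
PATH_LOSS_EXPONENT = 2.0        # assumed free-space-like environment
DISTANCE_THRESHOLD_M = 2.0      # assumed "close contact" distance
DURATION_THRESHOLD_S = 15 * 60  # assumed "close contact" duration


def estimate_distance_m(rssi_dbm: float) -> float:
    """Rough distance estimate from one RSSI reading (log-distance model)."""
    return 10 ** ((TX_POWER_AT_1M_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))


@dataclass
class Reading:
    timestamp_s: float  # seconds since the start of the encounter
    rssi_dbm: float


def is_exposure(readings: List[Reading]) -> bool:
    """Flag an encounter if the cumulative time spent within the distance
    threshold meets the duration threshold. Readings are assumed sorted."""
    close_seconds = 0.0
    for prev, curr in zip(readings, readings[1:]):
        if estimate_distance_m(prev.rssi_dbm) <= DISTANCE_THRESHOLD_M:
            close_seconds += curr.timestamp_s - prev.timestamp_s
    return close_seconds >= DURATION_THRESHOLD_S


if __name__ == "__main__":
    # Simulated encounter: one reading per minute, close range for 20 minutes.
    encounter = [Reading(t * 60.0, -62.0 if t < 20 else -85.0) for t in range(30)]
    print("Exposure flagged:", is_exposure(encounter))
```

Even in this toy model, small changes to the assumed path-loss constants, or to how a phone is carried, swing the distance estimate considerably, which is the lack of precision in defining "exposure" that Ali refers to.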

Maria Palombini:
I can very much sense your passion and your advocacy for this work. You already answered two of my next questions, on transparency and accountability, because I was having the same concerns and wondering how you were addressing them. Now we're coming to our final segment, which is action. We need to take action, right? The idea of Re-Think Health is that we want to actually do something to make it better. In this case, I know you've already previewed the paper, and we'll leave it to the end to tell the audience how they can get to it. But there's a lot at stake, because I believe that contact tracing technologies, thanks to COVID, may be sustained and used in other applications. So I think we need to understand what will allow them to be sustainable, but in a way that people feel they are responsibly used and we can establish trust in their use. What do you and the team feel is needed there?

Ali Hessami:
Well, when we started this work, we were all passionate about how quickly we could generate a workable, comprehensive solution for supporting governments' and private and public enterprises' efforts towards rolling out such mitigation technologies against the pandemic. Given the insufficient time and huge urgency, we decided to work with sensible parameters in terms of ethical assurance. We tapped into the work of the ethics certification program that we had developed prior to COVID. Last year we developed three sets of ethical criteria for the certification of autonomous and intelligent systems, on ethical transparency, accountability, and freedom from unethical bias. And we decided that the best solution was to adopt those criteria and configure, adapt, modify, and tailor them to the needs of contact tracing.

Fundamentally, it was built around the accountability and transparency models that we had already developed, and these models have a huge body of criteria for ethical assurance. We basically decided that, because of the urgency, we couldn't afford the luxury of a comprehensive, ground-up, greenfield study. We could build on some of the criteria we already had, and that's exactly what we have done. That's why we have two phases. Within roughly ten weeks we reached a stage where we could share some of the findings in an interim report, which we have already published, and the work has continued to completion. We're hoping that before the end of October, or in November, a final report will be shared under this Creative Commons attribution for global access and benefit. Basically, any entity that has such solutions will be aware of the aspects that are highly conducive to generating public trust. For example: don't cover up any aspect of the technology's behavior; do not include any feature that leaves it open to abuse; and on the plus side, make sure there are sufficient competencies in the design, understanding, and governance of the institutions that generate such technologies; declare their intent and concept of operation to everyone transparently; architect it in such a way that it doesn't lend itself to hacking, abuse, and loss of data; make users aware of how they are interacting with the systems; generally, make human supervision and oversight a feature of such systems rather than making them totally autonomous systems based on AI; and ultimately, manage the operational risk.

So these were the concerns that were all factored into our fast-track study. Our first report covers part of these issues, and our final report covers all of them in a fair amount of detail. Just to give you a feel, our first report has roughly fifty-five parameters for ethical transparency, accountability, and bias in contact tracing. Our second report is likely to double that.

Maria Palombini:
I'm sure there's going to be a lot more. So, on this paper on the pandemic, ethics, and contact tracing applications, are you still open for comments and feedback? Is there a timeline for that, or what's the process?

Ali Hessami:
The report was published in late July and is available for free download from the IEEE SA site. It's called The IEEE Use Case Criteria for Addressing Ethical Challenges in Transparency, Accountability, and Privacy of CTA/CTT, which stands for contact tracing applications and contact tracing technologies. Our focus is a global call for consultation.

We're sharing our insights and criteria with the global community. We believe these are a comprehensive and reasonably broad set of ethical premises that can be accepted across many cultures. The deadline for commenting was originally announced as the first of September, but recognizing that August was a holiday season, we have extended it to the eighteenth of September. So people from any institution, or as individuals, are still welcome to send comments. This was an interim report, and we are in the final stage of creating a very comprehensive set of ethical criteria by the end of October. We are also planning a webinar on the tenth of October at ten o'clock Eastern time, to explain to the global community our intent, our approach, and the value proposition of this non-commercial sharing of insights for the benefit of humanity. The experts and myself, as chair and vice chair of the ethics certification program, will be present to explain both the way we have worked together and the benefits for others, and also the rationale for why we need to invest energy and time in making technology trusted before we roll it out for the benefit of society.

Maria Palombini:
And we will feature the links to sign up for the webinar and to the paper on the IEEE SA site, where you can find the information. I believe the webinar is already open for registration, and we'll have a link there as well. So we're actually coming to our close. There's been so much great information, and we could do this for two hours. Is there any final thought you would like to impart to our audience? Something for them to think about, because it's something very current, right in front of us.

Ali Hessami:
It actually pains us to have a medicine, a solution, but no trust. Look at the lack of take-up of these technologies, which are currently the only medicine we have, if you like, against the virus that the majority of the global community is suffering from. This is one technology that could be successfully rolled out, as problematic as it is in some aspects in terms of lack of precision. Nevertheless, we have a solution, and the only failing on behalf of the global community has been the manner of explaining it and respecting public opinion.

So the final thought I would suggest is that, as a global responsibility, it is upon all of us, especially decision makers, to not just think technologically about quick fixes, but also to ask why the public should trust such quick fixes. Look at the process it takes to generate a vaccine. Why do people trust a vaccine? Because it goes through an enormous amount of verification and validation. We haven't done that successfully for contact tracing. Our attempt is to get into what makes it transparent, accountable, and respectful of people's privacy, to try and support understanding of and trust in these technologies. Then, if the majority adopt, adapt, and apply them, we will hopefully maximize protection for society and get a much faster return to normal social and economic life.

So it is really an amazing indication of the necessity of humanizing technology, of caring for society by explaining what happens to people's data, who owns it, how it's shared, what the sunset criteria for the technologies are, and so on. None of that has been done virtually anywhere that I know of, with the catastrophic consequence that in the countries that suffered most, the general public faced the only solution on the horizon that could protect them automatically, using commercial kit such as a mobile phone, and decided not to entrust their data to such technology because they didn't understand how it was going to operate, or in what way they could be victimized or put under surveillance. In some countries, statistics show that less than 3 percent of the population, even though they are suffering from the pandemic, trusted and downloaded the application. This is desperately poor in a time of crisis. We need to do much better than this. Our attempt is just one gesture towards how to do better.

Maria Palombini:
I think that's very well said. The statistics you pointed out show something that we really need, but that we don't trust, so we don't engage with it. Thank you, Ali, for this really insightful, interesting conversation. I'm so delighted that you joined me today. And I want to thank our audience for listening in, and I hope you will check out the rest of the episodes in the podcast series. We're covering everything from clinical research to wearables to decentralized health technology, everything we can to make it better for all of us.

And if you want to get involved in any of our activities, like incubator programs for decentralized health technologies, toolkits to drive adoption of decentralized clinical trials, or establishing privacy and security in the use of wireless connected medical devices (our newest program coming up will be telehealth: privacy, security, and connectivity for all in a world of pandemics), please visit our website at https://ieeesa.io/hls. Thank you again, everyone, and I look forward to you joining us on our next episode. Bye.

[/toggle]

Related Resources
