Hello, you're listening to Leadership in Extraordinary Times. I'm Peter Tufano, the Dean of the Saïd Business School at the University of Oxford. This podcast is curated from a series of live online events which began in response to the Covid-19 outbreak. It's never been more important to understand and respond to the issues shaping our future. We want to share our insights and our research from the front lines of business. And as a transformational business school, we want to promote new thinking when we need it the most, and reveal how to find a path to the opportunities of the future. Please do explore our growing library of past episodes if you haven't already.

Episode two: How can businesses use artificial intelligence responsibly?

The University of Oxford is at the cutting edge of this revolution. We live in a world in which change is driven by new technologies, and artificial intelligence, or AI, is at the heart of this. AI, predictive analytics and other digital disruptions are transforming business models, processes and customer relationships across every industry. Organisations increasingly find themselves confronting moments of existential change, and in this landscape of digital disruption, leaders need both the understanding and the tools to integrate AI into their overall strategy.

In this episode, Oxford Saïd's L'Oréal Professor of Marketing, Professor Andrew Stephen, Dr Natalia Eframova and Dr Felipe Thomaz are joined by Dr Yasmeen Ahmad, Vice President of Strategy at Teradata, to discuss the principles for ethical uses of AI in business, based on a new report from Oxford Saïd and the International Chamber of Commerce. Now, over to Professor Andrew Stephen.

Hi, and welcome to today's episode of Leadership in Extraordinary Times. Today we are talking about how businesses can use AI responsibly.
We're going to try and unpack that and make it as practical as we possibly can throughout today's discussion. I want to welcome all of you, wherever you happen to be. Thank you for joining us. And I want to extend a special welcome to our Oxford Executive Diploma in AI for Business students, some of whom are here in the building at the Saïd Business School and others who are joining us from around the world, because today is actually the first day of one of the modules of that Executive Diploma programme. And what better topic to talk about for an AI programme than responsible AI in business.

So what we're going to do over the course of the next 45 minutes or so is break down this topic. We're going to hear from Dr Yasmeen Ahmad, who is the Vice President for Strategy at Teradata, as well as Dr Natalia Eframova, who is the Teradata Research Fellow in AI and Marketing here at the Saïd Business School, working in the Oxford Future of Marketing Initiative. We'll hear from them in a little while to talk about this from both a technical and a practical standpoint.

But first, I want to bring in my colleague, Dr Felipe Thomaz, who is an Associate Professor of Marketing here at the Saïd Business School. Felipe has been heading up a project that a number of us have been working on over the last few months with the International Chamber of Commerce, drafting guidance for the private sector around responsible AI. So I thought we'd kick things off by having a chat with Felipe about this project; the report coming out of this research is, in fact, due to be released next week. So, Felipe, welcome. Thank you for joining us. Why don't you start by telling us a little bit about this project that we've been working on over the last few months with the ICC?

Thank you, Andrew. It's a very exciting project.
The challenge was: how do we synthesise what we know about responsible AI applications, ethical considerations and ethical risks in AI, and organise it in a way that is business-sensible, organised so that a manager can implement it and actually put it to use, so that they can just go ahead and take those initial steps.

All right, so what are some of those steps that need to be taken? Because part of this is very much about coming up with things that businesses can do to try and ensure that the work they're doing around AI and machine learning is responsible, is ethical, is appropriate. What are some of the, I guess, the elements in your framework?

Right. So to start out, one of the things that we had to do in order to push it towards a business-oriented and actually managerially useful approach was to force a hierarchy onto some complex ethical ideas: basically organising the questions around what is right and what is correct, and organising the trade-offs that managers are going to face, in a way that allows them to actually analyse their environment. So we're still very much in ethereal land and discussion here, but to get to a framework we are actually forcing a way for managers to make trade-offs and decisions, and assisting in those decisions. To do that, you start from the most complex ideas and work down to the simpler ideas that make up those more esoteric concepts. So ethics is essentially a combination of responsibility and accountability, and then responsibility breaks down into human centricity, fairness and the ability to be harmless in your execution. The report is going to have the full set of components available for people to dive into. But the idea is to say: how do the pieces fit together? How do we organise them in a way that says what has priority, what is more important than what, so as to allow for these trade-offs?
Now, when I take it to reality and ask: how do I do this as a manager? How do I organise my business to start leveraging these things? That's where we start to get to those practical points. And what we're trying to say is: OK, you're going to start from the technical aspect, you're going to go to your workflow, and onto that workflow you're going to overlay an ethical design component, which is essentially saying, here are the steps that I'm going to take. So, very broadly: here are my data concerns, here are my algorithmic choices, and here is my business case, this is what I'm going to use my outputs for. You map all of that out, and that overlay of ethics comes in and starts asking questions of the manager: do I have specific threats or risks or concerns arising from my data sourcing? Do I have ethical risks associated with my data cleaning and pre-processing, how I use it, how I absorb it into my company? All of those things start getting mapped, so that you have a good understanding of these potential threats to your ethics test. So you're saying: I want to do things well, and there are aspects within the design of your workflow for your AI application that can give rise to some complications. The first step, then, is to map those and identify what they are, and then we fall into some fairly robust managerial aspects, where we say: OK, what are the mitigation strategies that I can bring to bear to account for this risk exposure that I have? How do I minimise my risk exposure? And these are ethical risks, potential risks, where you can avoid doing incorrect things and then institutionalising them via AI, or just embedding them into code, as it were, and perpetuating those errors.
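To make that overlay a little more concrete, here is a minimal illustrative sketch, not taken from the report itself, of how a team might record workflow stages, the ethical risks identified at each stage, and the mitigations chosen. The stage names, risks and mitigations are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkflowStage:
    """One step of an AI workflow (e.g. data sourcing, pre-processing, business case)."""
    name: str
    risks: List[str] = field(default_factory=list)        # ethical risks identified at this stage
    mitigations: List[str] = field(default_factory=list)  # mitigation strategies chosen

    def needs_attention(self) -> bool:
        # A stage with identified risks but no mitigations should be reviewed before go-live.
        return bool(self.risks) and not self.mitigations

# Hypothetical example: a credit-scoring workflow mapped against the ethical overlay.
workflow = [
    WorkflowStage("data sourcing", risks=["consent unclear for third-party data"],
                  mitigations=["use only data with documented consent"]),
    WorkflowStage("pre-processing", risks=["dropping incomplete rows may under-represent some groups"]),
    WorkflowStage("business case", risks=["automated rejections affect real applicants"],
                  mitigations=["human review of all automated rejections"]),
]

# Revisit the map regularly, as the panel suggests, and flag stages that still need work.
for stage in workflow:
    if stage.needs_attention():
        print(f"Review needed at '{stage.name}': {stage.risks}")
```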
So this is an early mapping: understanding how it fits within what you're trying to apply, then minimising and mitigating, and living with the notion that you're going to revisit and keep coming back to this process, asking: do I have any new threats to recognise? Am I changing anything in my workflow? That is what lets you actually start delivering against the goal, which is: what is the most responsible, or what is a proper, ethical application of AI for my business?

And so, Felipe, this research that we're talking about is going to be released next week, and if anyone wants to find out more, they can visit OxfordFutureOfMarketing.com. So what I'm hearing from you is, in some sense: look, it's not about changing the way that you do everything. It's about bringing a set of responsible, ethical, accountable notions around AI and data and analytics usage into existing workflows, which I think is quite appealing. Right? It's not about overhauling everything we do for this new way of doing things, but rather finding, again, some kind of happy medium. Am I getting it right in that characterisation of the way that you're thinking?

Yeah, I mean, I think it's important for managers everywhere, really, to have that sense of terra firma, essentially saying: this is familiar ground. This is something that we do in business. There's a lot of news with AI; there's a lot of hype. But from the business sense, and for the business-case use of it, it's very useful to go back to some of those basic ideas and say: actually, I might have a new business model, I might have new capabilities, but the art and science of management behind it is still relatively stable, and I can have appropriate controls over these different components. It's a new machine, but how I manage the machinery doesn't have a whole lot of new moving parts to it.
We have some concerns that are specific to AI, and some uniqueness that we describe. And we were able to leverage a lot of pre-existing work. One of the most exciting parts of this project for me was the way we arrived at these recommendations: by actually going to the existing body of knowledge, everything that companies have published on their stances and their guidance, and governments, and NGOs. We took all of that body of knowledge, as well as the academic literature on responsibility, ethics and AI, put it all together and used our own AI machinery on it to organise it and give it a shape that puts things in perspective.

So I think what you're talking about here is quite an appealing process for businesses of all shapes and sizes to use. And I like that use of terra firma, which is business, and I want to come back to that. But it's also suggesting that businesses themselves need to be taking responsibility for responsible AI. I want to hear what you all watching think about this, so we're going to do a poll, and I'm going to invite you to take part online. We'll come back to the results a little bit later on in the programme. But the question is: who should be most responsible for ensuring that applications of AI in the private and public sectors are appropriate, ethical and responsible? Is it government? Is it the tech companies, and so on? Is it industry bodies, or the businesses themselves, maybe intergovernmental organisations like the United Nations, or individuals?
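As an aside on Felipe's earlier point about using "our own AI machinery" to organise that body of guidance: the transcript doesn't specify the method, but one common way to synthesise a large corpus of documents into themes is to vectorise the texts and cluster them. The sketch below is purely illustrative of that generic technique (TF-IDF plus k-means with scikit-learn) and makes no claim about the approach actually used for the report; the document snippets are hypothetical.

```python
# Illustrative only: cluster a corpus of AI-ethics guidance documents into themes.
# This is a generic sketch, not the method used by the Oxford/ICC team.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "Hypothetical company AI principles: fairness, accountability, transparency ...",
    "Hypothetical government guidance on automated decision making ...",
    "Hypothetical NGO statement on human oversight of algorithms ...",
    # ... a real corpus would contain hundreds of such texts
]

vectoriser = TfidfVectorizer(stop_words="english")
X = vectoriser.fit_transform(documents)

# Group the documents into a small number of themes.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for doc, label in zip(documents, kmeans.labels_):
    print(label, doc[:60])
```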
00:12:22:09 00:12:23:17 What what do you mean by that? 00:12:24:17 00:12:25:17 So that one is 00:12:26:15 00:12:28:17 kind of one of the most important 00:12:28:18 00:12:31:06 layers of the ethics considerations, 00:12:31:13 00:12:32:23 which probably shouldn't surprise 00:12:32:24 00:12:34:05 anybody in the room, right. 00:12:34:11 00:12:35:17 We're talking about ethics 00:12:36:07 00:12:37:17 and human considerations, the 00:12:37:18 00:12:39:10 consequences that impacts on 00:12:39:11 00:12:41:01 individual human individuals here. 00:12:41:03 00:12:43:05 So here what we're talking about is 00:12:43:15 00:12:45:22 a combination of actually 00:12:46:01 00:12:47:23 achievable means and 00:12:47:24 00:12:49:10 goals within the AI to deliver 00:12:49:21 00:12:51:12 on human benefits. 00:12:51:18 00:12:53:15 So one core thing inside of that 00:12:53:16 00:12:55:07 is this idea of beneficence, which 00:12:55:16 00:12:57:04 is you're going to generate 00:12:57:11 00:12:59:14 something good out of this process 00:12:59:15 00:13:00:14 for people. 00:13:00:25 00:13:02:18 It can be very broad or it can be a 00:13:02:22 00:13:04:13 good business outcome that comes out 00:13:04:14 00:13:05:13 of it. 00:13:05:17 00:13:07:18 A transparency which is often 00:13:07:19 00:13:09:11 discussed in terms of 00:13:10:01 00:13:11:21 how deploying trustworthy AI is on 00:13:12:02 00:13:13:15 the idea to have an intelligible, 00:13:14:12 00:13:15:22 understandable system. 00:13:16:15 00:13:17:24 All of these are components that 00:13:18:00 00:13:19:01 make it centred around 00:13:19:24 00:13:21:14 the human, essentially - it's almost 00:13:21:15 00:13:22:21 like using the word to define itself 00:13:22:22 00:13:25:06 - is the more 00:13:25:10 00:13:27:12 that a human can interact and 00:13:27:17 00:13:29:01 the more that the system then 00:13:29:06 00:13:31:03 appreciates that there's a human 00:13:31:05 00:13:33:01 that is going to bear a consequence 00:13:33:08 00:13:35:11 of our automated decision 00:13:35:12 00:13:36:11 making, then the more 00:13:37:07 00:13:38:15 stable you are into being a 00:13:38:16 00:13:40:17 responsible business and the more 00:13:40:18 00:13:42:04 grounded you are in the fact that 00:13:42:05 00:13:43:15 there is going to be a human cost 00:13:43:16 00:13:44:24 associated with some of the 00:13:45:09 00:13:46:18 decisions that we make inside of our 00:13:46:19 00:13:47:19 businesses. 00:13:48:01 00:13:49:06 All right, thank you so, Felipe. 00:13:50:02 00:13:51:15 Sit tight because we'll come back to 00:13:51:16 00:13:52:24 you in the Q&A a little bit 00:13:53:11 00:13:54:11 later. 00:13:54:14 00:13:55:14 So thanks, Doctor Felipe Thomaz, Associate 00:13:56:10 00:13:57:06 Professor of Marketing. And I want 00:13:57:07 00:13:59:05 to now bring in our two 00:13:59:06 00:14:00:21 panellists as I introduced before 00:14:01:09 00:14:03:09 Dr. Yasmeen Ahmad, who is 00:14:03:10 00:14:05:04 VP of Strategy at Teradata, which 00:14:05:05 00:14:07:04 is an enterprise technology 00:14:07:05 00:14:08:22 company, and Dr. Natalia Eframova, 00:14:09:18 00:14:11:01 who is a computer scientist 00:14:11:23 00:14:13:01 and is a 00:14:13:20 00:14:15:24 Research Fellow in Marketing 00:14:16:02 00:14:17:14 and AI here at the Said Business 00:14:17:15 00:14:19:17 School, and Natalia is also heavily 00:14:19:18 00:14:21:02 involved in the research work 00:14:21:13 00:14:22:23 that Felipe was just telling us 00:14:22:24 00:14:23:24 about. 00:14:24:04 00:14:25:15 So welcome to both of you. 
What I want to do, and we'll come to the poll results a little bit later as well, but first I want to come to a question that has popped up from some of our Executive AI Diploma students, which is around how well businesses are doing at the moment. So Andrew, Claire and a few others have asked this question from that group: if we think about responsible AI, where is the starting point at the moment with businesses? Are businesses already pretty responsible with this, or is there room for improvement? Yasmeen, let me go to you first.

Thanks, Andrew. That is a great question. So how are businesses thinking about this? I think it's worth reflecting on how businesses think about AI. If I just rewind back in my career, I was looking after data science and analytics teams who were working with our clients, some of the largest companies in the world, on new AI and data science techniques. A few years back, at that point, AI and some of the newer machine learning techniques were isolated to these data science functions, these centres of excellence that had been set up, whether that was in the bank or the retailer or the telco. And so it was self-constrained in some ways, because it was one group that was developing these algorithms. It was a one-stop shop for looking at how those algorithms were being developed, what kind of use cases they were being applied to, and where in the business we were leveraging those algorithms. Fast forward to today, and we really see AI being much more pervasive across the organisation. Those use cases are no longer limited to one group that's doing the development work, or to one business function.
Whether it's supply chain, customer experience, operations or fraud departments, they're all leveraging AI and analytics techniques to improve their operations and to create differentiation. That makes it really challenging for really large organisations to control how that AI is applied, and to control the decisions that AI is making. And coming back to what Felipe said about the human centricity piece: even when the decision is automated, is there a human who is ultimately responsible for the decision that's being actioned? So I think many organisations have had to move to looking at general guidelines and putting in frameworks to support the application of AI, some of them because they've seen the backlash when they get it wrong and consumers don't like AI being applied in certain circumstances. And so we're seeing that AI is becoming the responsibility of roles like the Chief Data Officer, to think about not just enabling the organisation with the tools, but also how we start to govern how AI is used and applied across the business, really pervasively across the business.

So Natalia, I want to bring you in here and take the question in a slightly different way: do we actually need this? I know you and others have worked hard on thinking about these guidelines, but why is there a need? It sort of seems like we have to keep on convincing businesses that they need to actually do these things, because they are not, perhaps, innate. So why is this a problem that we need to solve?

Thank you for the question, Andrew. It's a really, really good one, and an important one. And this is something not all businesses ask themselves: are we using the data correctly? Are we implementing AI correctly? Because the AI function is, in many cases, not central to the business.
It has a support role in the organisation. And in that case, businesses do not pay that much attention to what is happening there: how they should curate their data, how they should take care of their data, how they use it in operations. This is all very important, and AI is something new. It is not something we have used in practice for very long, and best practices are simply not there. So every business, when it comes to the question 'is my AI ethical?', has to decide that for itself. And it's not always the case that they have the resources to do that, or the education to do that, or they simply cannot find proper guidelines. So in practice, AI is as ethical as the business decides it should be, for now.

Do you think it varies, Natalia, around the world? Are some parts of the world thinking more about this than others, or thinking differently about this than others? I'll go to you first, Natalia, but I also want Yasmeen's perspective on this, given that you both have perspectives on different parts of the world. So, Natalia, what are your thoughts?

Well, of course, it's not the same. It differs a lot across geographies. From my personal experience, AI ethics is mostly developed in more developed countries with more technical businesses, and it's far less developed in smaller economies, as historically there are fewer resources and fewer opportunities for businesses to look at these problems. So in many cases AI has to comply with regulations, but in some countries it is completely unregulated, and so it's more difficult for businesses to regulate themselves, of course.

Yasmeen, from your experience working with clients and customers in lots of different parts of the world, what's your take on this?
I would have to agree with Natalia; I don't think it's consistent across the globe. In fact, I think it has a lot to do with societies and cultures and with what's acceptable and what's not. And often, for a society and culture, what's acceptable, or what consumers, individuals and citizens are willing to accept, is reflected in law. So there's typically more regulation in Europe, and a lot of regulation will often drive companies to take those steps even before the regulation is written, because there's that feeling of responsibility and a sense that it will be regulated. Whereas, having lived in Europe and now living in the US, I see the difference between those geographies. In the US there's typically less regulation; in some areas there's more innovation pushing the boundaries, and then at some point some regulation will come in. But it's not consistent across the US. And then you see, in Asia-Pacific and Australia, it really again varies by country, varies by geography, and it is linked to the legal and regulatory system, to what's accepted, to how business is governed and, crucially, to what citizens and consumers are expecting.

And actually, Yasmeen, we've talked about geography, but what about different industries?

That's a great lens to look at it through, Andrew, and I would even take it a step deeper than the industry level. It's the use case and the application level, because even within an industry there are some business use cases, some business functions and departments, where AI is more freely leveraged versus other areas in the business. And I think it comes down to what the business outcome is: how much risk are we willing to accept with that business outcome if the AI gets it wrong, or if the AI is biased?
Say we have AI applied to our supply chains. If the supply chain is not quite efficient, it might have a negative impact on the business, but nobody is going to splash that across the front pages as a biased algorithm or as unethical. However, any time you're dealing with citizens, or customer experience, or consumer applications of AI, it takes on another lens. So I think naturally some industries, say health care, or banks, or retailers, hold a lot of consumer and citizen data and are leveraging AI on that data, and so there is more scrutiny there. Whereas if you go to the manufacturing industries, maybe less so, because again, with the application of the AI outcome there are maybe fewer ethical considerations on that outcome.

So it's a good point: how close you are, I guess, to the humans, which is essentially back to Felipe's principle of human centricity. Natalia, I want to come back to you with a different question, though. How do we get businesses to really pay attention? This comes to a question that has already come up from some of our executive diploma students. If I'm a manager in some organisation, how do you convince me that this is just the right thing to do, that I really need to think deeply about this and build it into the way that we're doing things, rather than it being yet another thing that I need to, in some sense, comply with? Practically speaking, how do we get the right people in organisations to be thinking about this in, I guess, an actionable way, as opposed to a sort of token-gesture kind of way?

That's a great question. Thank you, Andrew. And if we look even deeper, we need to ask ourselves: how can any individual working with AI impact the outcome and be ethical?
Because when it comes to AI implementation, it's not only about managerial decisions. In many cases, before anything comes to a manager, there are a lot of steps that have already been done, for example data collection, or data cleaning, or simply disseminating information about how the AI is operating. So I would say it is important for every business role to think about what it is we're doing, and it's important to understand, at least at a very high level, what the AI you're working with is doing. In many cases, managers and other industry professionals don't really know very well what's happening, and it is up to higher management to educate people to the level where they understand the consequences, and not only the short-term, tomorrow's consequences, but the more longitudinal consequences: if I use this data, what will the impact be on society, or on my clients, next year, in two years, in ten years? Unfortunately, there are not many such cases discussed now, and in general management is not aware of what can go wrong. But it's probably our role as educators and institutions to build more of these cases, to say: OK, there are so many things that can go wrong, and we need to think about them now. The role of education here is super important.

So the extent of the consequences, the almost proximity to potential consequences for human beings, your customers, society and citizens, suggests how seriously, in some sense, this needs to be thought through. And I guess that's Yasmeen's point. Then, Natalia, your point is: well, actually, it's everyone's responsibility. This is not something that just has to exist in the technical parts of an organisation, with the computer scientists or data scientists and engineers, who certainly have to think about it from certain dimensions and govern it in certain ways.
It's not just for, sort of, the middle and upper management to say: hey, we need to impose these rules, these regulations, these frameworks. It's for everyone, everyone who's thinking about how data may be used, how algorithms may be used to inform or make decisions, or make recommendations, and so on and so forth, which I think is a very, very important practical perspective. It's a top-down and bottom-up, technical and non-technical set of things to think about. The other point I note, Natalia, is to never forget about what I call the law of unintended consequences. We may see that through the lens of uncertainty, but we may also see it as almost a proposition, or a challenge, to people in an organisation: OK, it's not just about what's obvious that could go wrong here, and therefore how we prevent that, but what are the less obvious things? Maybe think a little bit more out of the box about those unintended consequences. You can't have everything on your radar, but at least expanding that set of possibilities may indeed be another way to practically help with thinking about implementing responsible, ethical AI.

So I've got more questions for you, but I actually want to come to a question that was posed by one of our diploma students, Joanna, who is in the US at the moment. And it's sort of to this point, I think, about everyone being responsible. Her question is: well, who needs to be in the room? How do you get the right people, or the right personas, essentially, in the room to express their concerns, to think about perhaps these unintended consequences within an organisation? If it's not just the technical people, and it's not just, say, middle and upper management, who should have a voice in governing this within organisations? I think it's a really important question in terms of how we implement this. Yasmeen, what do you think?
That's a good question, and I think it's back to what you were saying, Andrew: it's really important that it's not just the analysts or the data scientists who are thinking about the implications, the ethics or the fairness of the algorithms being applied. In fact, it's more important than ever to have business leaders and business people in the room to discuss the implications of the analytics and algorithms being applied, because typically it's those business leaders and those business people who really understand the application and how that application will go out into the world. And so what we have found helpful is to focus your key stakeholders on the specifics of the use case you're trying to drive and what the outcome is. And when thinking about the responsible use of algorithms, it's useful to think about what happens if the algorithm gets it wrong. In analytical terms, when we're using algorithms, we often talk about false positives and false negatives. What's the chance that if we predict something to be true, it's actually not true? That's a false positive. Or a false negative: we say it's not true, but it does actually happen to be true. With this application of the algorithm, what's acceptable to us as a business? And the use case often drives that acceptability. So think about health care. We've seen use cases around AI and algorithms being applied to mammograms or CT scans, to look at whether there are abnormalities in the image. In that situation a false negative, where the algorithm said there's nothing wrong but there actually is, might not be acceptable; you need a high level of accuracy for that algorithm.
So at that point, if you're able, as a business, to express that the algorithm can run but that there's a level of accuracy that is acceptable for the business problem, that gives your data science and analytics teams a direction: OK, we can go and test algorithms in this area, but until we get to that level of specificity, and the business teams can help to define what that level of specificity is, the algorithm is not acceptable for production or real-world use. I describe that scenario because I think it's a useful way in. I liked how Felipe described the paper earlier; I got a preview of the paper that the team have been working on at Oxford with the ICC. You need a way of making these big conversations about ethics and fairness and so on more tangible, more real-world: how can you apply them in business? Getting into a little bit more of the detail around the algorithm, about what level of accuracy is acceptable, helps to frame the conversation about the implementation: how can we move forward, and what's the level of acceptability?

Thanks, Yasmeen. I think that's a really, really helpful way to think about these errors. In statistics we would call them type one and type two errors, the false positives and false negatives. And what might the consequences of those be if the algorithm is wrong, for whatever reason, whether it's the wrong algorithm, or the data is not quite right, or there's some bias for one reason or another? But then, in practical terms: if it gets it wrong, what does that mean, and what is our tolerance, or required level of precision, if you want to think about it differently? That brings it down to the sort of conversations that you can actually have.
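As a minimal, purely illustrative sketch of the kind of check Yasmeen and Andrew are describing, the snippet below computes false positive and false negative rates from confusion-matrix counts and gates a model against business-defined tolerances. The tolerance values and evaluation counts are hypothetical; in practice the business teams would set them for the specific use case.

```python
# Illustrative sketch: gate a model on business-defined error tolerances.
# The tolerance values below are hypothetical; a medical-imaging use case, for
# example, might demand a far lower false-negative tolerance than a marketing one.

def error_rates(tp: int, fp: int, tn: int, fn: int):
    """Return (false_positive_rate, false_negative_rate) from confusion-matrix counts."""
    fpr = fp / (fp + tn) if (fp + tn) else 0.0  # negative cases wrongly flagged
    fnr = fn / (fn + tp) if (fn + tp) else 0.0  # real cases the model missed
    return fpr, fnr

def acceptable_for_production(tp, fp, tn, fn, max_fpr=0.10, max_fnr=0.01):
    """A model is only cleared for real-world use if both error rates are within tolerance."""
    fpr, fnr = error_rates(tp, fp, tn, fn)
    return fpr <= max_fpr and fnr <= max_fnr

# Hypothetical evaluation counts for a screening model.
print(acceptable_for_production(tp=95, fp=40, tn=860, fn=5))   # False: misses too many real cases
print(acceptable_for_production(tp=99, fp=80, tn=820, fn=1))   # True under these tolerances
```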
And back to Joanna's question: I think the people in the room need to be the people who can really talk about what those real-life consequences would actually be. What would those errors mean to the customers, to other people who might be affected, maybe to your employees, to regulators, to whoever else might be a relevant stakeholder? So I guess the point is to have that diversity of opinion and that multi-stakeholder perspective, quote unquote, in the room to think about these things. But the question then is how you frame the questions, how you get them to think about these issues, and I think that framing of false positives and false negatives is a really useful way of thinking.

So we're about at the midpoint of our broadcast, and I just want to welcome anyone who has joined us since the beginning. You're watching Leadership in Extraordinary Times here at the Saïd Business School at the University of Oxford. I'm Professor Andrew Stephen, the L'Oréal Professor of Marketing and the Research Dean, and I have been talking with Dr Yasmeen Ahmad from Teradata and Dr Natalia Eframova from here at the Saïd Business School about how businesses can use AI responsibly.

I want to now have a look at our poll and see what you all thought about who should be responsible for responsible AI. So if we can take a look at the results, and then I'm going to see what both Natalia and Yasmeen think about this. So there's no clear frontrunner here, I suppose, but government is on top, followed by the global intergovernmental organisations, the tech companies, industry bodies, and individuals last. Quite a spread. So, to be honest, that's not exactly what I expected to see. What do you think, Natalia? What's your reaction to our poll results?

Well, that's very interesting.
I also didn't expect this, and I would say it would be amazing if a government were able to control it and provide us all with guidelines on what we should be doing. Unfortunately, for now, that's not the case. We do have some recommendations from policymakers, and I think it's great that we are moving in the same direction. But currently, probably the only hard regulations that exist are inside the companies themselves, inside the big technology organisations. Intergovernmental and international bodies are catching up slowly, and they do publish their own guidelines and sets of recommendations.

What I believe is that input from all levels is very important. From a technical perspective, the reason I believe this is difficult for governments is that they really don't have a tool, for now, to check what an AI system is doing. No one outside the organisation can check what the algorithm is doing, and even if someone wants to check the code, the problem may not be in the code but in the data, or elsewhere in the operations or the production line. So it's a very, very complex problem. And until we develop a centralised understanding and centralised guidelines for how AI should be developed, the responsibility will rest largely on the companies themselves, because they know their businesses best.

Yasmeen, what do you think?

I have to agree with Natalia there. It's such a complex problem to try and find a regulation or a framework that would cover the amount of innovation that is happening in this space at the moment. It's very hard for a government to regulate. And having said that, regulation often happens after the fact.
Regulation looks at how technologies, how digital, are being leveraged, and then looks to regulate their uses. So I think companies do have a real responsibility to make sure that they are responsible and accountable for the algorithms they're developing and how they are applying them. And as I think about organisations, I don't think it's just about strict frameworks or guidelines. All large organisations will have an ethics framework. As was mentioned earlier, the foundations are often already there in the organisation; they need to be evolved. The ethics frameworks need to evolve to take into account new risks or new factors that have come up because of AI and digital. So yes, there are existing frameworks, and they need to evolve. But we also need to look at company cultures and how a company leads through this evolution and change, because AI, digital and analytics are now affecting all aspects of our lives and all aspects of organisations. So some of the ethics around the fair, equal and unbiased use of algorithms needs to be embedded in company culture, and embedded in how people build these tools, how they apply the algorithms, and how they analyse data. To Natalia's point, it's so complex, there are so many touchpoints, and there are so many people involved in the process, that it needs to be part of the DNA of the organisation to want to make ethical use of these new technologies and new tools at its disposal.

So there's a question that's come up from Richard in my home country of Australia, in fact, who was asking about this point about innovation that both of you talked about. If we regulate too much, the worry is that we will box in the opportunities and therefore stifle innovation.
So it feels like a balance: yes, we need some guidance from governments and intergovernmental organisations, but it obviously can't cover everything, and nor, probably, should it. And because of the need for further innovation, and the fact that innovation is happening anyway, these need to be living regulations in some sense, or guidelines around that. My take on the poll result is that it's actually everyone's responsibility in one way or another. So, as I was saying, we need multiple voices in the room to think about those consequences, including the unintended ones, and the false positives and false negatives. At a higher level, we also need this to be not only something that governments do, and not only something that intergovernmental organisations do, but something that companies do as well: groups of companies or organisations, the tech companies, and so on. So I think there's a need for this to be governed, with a small g, in a very collective way, which, of course, is going to be fraught with difficulties given all those different stakeholders. But I like what I'm hearing, and that seems to be a way forward.

I want to bring Felipe back into the conversation now, because quite a few questions have been popping up. Thank you to those of you who've asked questions; I think we can start to go through them. So, Felipe, welcome back. Just quickly, before we go on to some more questions, what was your take on those poll results?

Very similar to everybody else's. I think I have a very similar read to you, in that it does seem very much like a joint responsibility: everybody has to carry some of the burden.
The other bit, going back to that first point, that governments came out as the most responsible, even by a narrow margin: a word of caution or concern here, from somebody who hasn't lived their whole life in the developed world. Not everybody's government is fantastic. Not everybody's government has your best interests in mind. So whose government is going to decide what is correct to do? I think that's something to worry about, something I put out there. Yes, regulation matters very much, to your point that there is a tension with innovation. But some companies are going to be restricted exclusively to their regional domains, and they will have just one set of legislation to worry about. A number of companies are going to exist across national boundaries, and then you run into issues where you're picking and choosing down to the bare minimum of legal requirement, rather than doing the ethical thing, or the appropriate thing for yourself and your company. You're just asking: all right, what's the bare minimum I have to do? And that's what you do when you exploit everybody; that's moving away from ethics and just saying, hey, let me make as much money as I can, as quickly as I can, until somebody catches on.

So, good point: not all governments are necessarily the right governments, I suppose you could say. And then, just to reinforce your point about large organisations, I think, as we've seen with other areas of responsibility such as environmental and social responsibility (think about ESG), large multinationals can actually have a big impact in different countries just because of their global reach.
So I think there is an important role for business, particularly businesses with an international footprint, to lead by example in a lot of this and, of course, to collaborate with those other stakeholders that we talked about. But speaking of government, there's a question from Andrew, who's one of our executive students here and who does work for a government. He's asking: should there be a hierarchy of government concern, starting with regulation for physical risk to the individual? In other words, perhaps governments think about that type of harm, while organisations think about other types of harm. I don't know. Let's start with Yasmeen, and then I want to hear what the others think about this, because I think it's a really important, practical question. Yasmeen?

That's a great question, because the implications of how algorithms are used are a multidimensional challenge: there are various types of risk that can be created. As I think about even our own organisation and our risk framework, there is a whole diverse set of categories of risk that we consider for our business, and for a government I don't see why it wouldn't be the same with AI. So as I think about different types of risk, the one that popped into my mind is a use case we did with a retailer around wastage, and we know how important sustainability is right now. In this use case, we were supporting the retailer, through AI, to reduce waste: to reduce how much grocery product was thrown away at the end of the day. But equally, vice versa, AI has also almost accelerated or amplified the fast fashion business, and other businesses that are creating a ton of waste. And so you can begin to think about AI algorithms and the different types of risk or implications they have for societies.
There is sustainability, in terms of green and environmental impacts. There are impacts on humans and people, on jobs and careers. There is impact on diversity, equality and inclusion. So if I were to think about how to put that framework together for a government, I would be thinking about those different categories. And in terms of prioritisation, there are definitely some categories that you might prioritise over and above others, those that have an impact on human life, for example. But equally, I think all of those categories are important and potentially require specialists or experts, or again business experts, who understand those areas and are able to fully think through the implications of AI, which may not be apparent initially to you as a developer or a builder of the tools or the technology platforms, because this is just such a new application of analytics in new areas.

So, another question, and this one I'm going to direct at you, Felipe. It comes from Dennis, who's here in the UK, and he's asking, or suggesting: shouldn't all innovation be governed by the purpose of the company, hopefully being mindful of human beings first, then the planet, and then finally profit, in that order? What do you think?

I do like the word "hopefully" there. In all discussions about ethics, none of it matters until there's a trade-off being made. Everybody agrees that we should protect privacy, until it's 10 per cent of your sales that goes away if you take that action; then suddenly everybody in the boardroom says, maybe we should think about this some more. So it's never a problem until it touches that money component.
And that's a large part of this initiative, this project, the research and the guidance. Ultimately, we want businesses to self-determine: yes, your business is going to decide what the best innovation is, and it will make the right, good choices. Frameworks like these are there to help people make the right choice when the time comes. If you're making a choice when there's no consequence for your business, if you're not giving something up, then you're not really facing a choice; you're just doing the right thing at no cost. I'm always more curious when it's somebody's mortgage payment, when you can't pay rent, when you lose your job because you did the right thing: that's your ethical quandary. That's the thorn in the question. And that's when it matters to have this grounding of: I know what my company stands for, I know I'm going to be backed for doing the right thing even if it's going to cost us revenue, and not just because it was legally required or because it keeps me out of jail, but because it's the right thing to do for people, for the environment, and so on.

So, again, it comes back to the fact that it can't just be one set of rules imposed from on high, whoever that happens to be, whether it's government or the organisation or some mix; you need that bottom-up element as well, where everyone has to think about this. But that's the tension too, because once you make it personal, people are different. And that's why this is messy and complex, and why we do need to be talking about these issues and finding ways to take action on them.

I'm going to go to another question now. This one's for you, Natalia. Is there a sense that algorithmic tools and models are held to a higher standard than would be applied to humans in similar situations? A bit of a human-versus-machine consideration here.
It's an interesting question, and I would say it depends on what kind of algorithm we are talking about. There are definitely some algorithms that perform much, much better than humans. Of course, they have to be scrutinised more, because they work at a very, very different level of accuracy. For some applications it's not really that important: think about automatic irrigation systems. Is it that important that the system delivers the precise amount of water, plus two or minus three millilitres? Probably not. So it really matters what kind of application we're talking about. Going back to Yasmeen's point about medical applications: if we think about human health and X-ray analysis results, is there even a single chance that it can go wrong? If something can go wrong, if the algorithm can potentially be incorrect even when that chance is tiny, should we go for that algorithm? No. It really depends on this larger hierarchy of risks, at the top of which is the human being, with their physical, financial and other assets, and then, going down, the danger or risk that the technology imposes. So I would say it's, again, a complex problem; all the problems in this discussion seem to be very complex, and it's really impossible to give a simple answer.

But we also need to think beyond the problem itself. What people often don't consider are the bigger risks. For example, when large data centres submerge their servers and so save electricity on cooling, do they always think about the ocean, about warming the water, and about how that impacts wildlife in the long term? So this is an incredibly complex system of questions that we need to ask ourselves.
And I think the undertone there is the fallibility of humans versus the algorithms, and where potential bias might come in. I think we've got time for one more question, and I want to pose it to you, Yasmeen, because it's related to that. This comes from Walid, who's in Saudi Arabia, and he's asking essentially about the use of AI in judicial decisions or in implementing laws, for example in the context of getting rid of corruption where maybe there is corruption: let the presumably non-corrupt algorithm make these decisions. How do you feel about that, and about those sorts of approaches of replacing fallible humans with the less fallible machine?

That's a very interesting question, and it does link on from the previous one. In fact, there have been algorithms, specifically in the US judicial system, that have been used, for example, to predict the likelihood of reoffending for people who have been convicted of a crime. For some of those algorithms, the predictions have been used as part of the judicial decision-making process: do you give somebody bail, and various other decisions that might be linked to that person, their life and their circumstances. And actually, I can't quote the exact paper here, but I'm happy to share it offline: some of those algorithms have been shown to be biased, because when you look at the judicial system, as you train algorithms you are potentially training them on, and they are learning, biases from the world that we live in today. And so they are reinforcing, or potentially amplifying, those biases, so that a person of a certain race, or colour, or background, or level of education is now predicted to be more likely to reoffend. And that gets very dangerous. It's a great example of this whole ethical question: we want to give every human being a fair chance in life.
But if the algorithm has learnt from real-world data that if you're of a certain background, you live in a specific zip code and you have a certain level of education, then you're likely to commit an offence, that's dangerous. We're no longer giving an individual a fair chance at life; we're stereotyping them based on the circumstances of their upbringing, their life, or where they've lived. And so this is where it gets very grey, as Natalia was mentioning. How do we think about these applications: even though you can do it, should you do it? Can you do it? Absolutely, you can train the algorithms. But do you want to apply them? Do we think it's fair, do we think it's ethical, is another question. And it's important to consider that question before just taking an algorithm's output and applying it to society. Because, in the worst case, you create a flywheel, a perpetuating situation, where a certain part of society is disadvantaged, the algorithm reinforces that, and people can't escape that cycle. So, just looking at the history of where algorithms have been applied in the judicial system, we have to be careful about how they're applied and what consequences that has for people.

I was going to say it's back to that law of unintended consequences, but we know that those shouldn't be unintended consequences now. And really, to me, this reminds us that, yes, humans might be flawed in making certain types of decisions, and maybe we might put some hope in an AI system or a set of algorithms, but they're not going to be perfect either. Really, I think it's the bringing together, what I always like to call the augmented intelligence of the humans and the machines, where we probably have a better chance of actually doing things well and reducing those types of errors that Yasmeen introduced earlier in the programme.
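To make the kind of bias check described above concrete, here is a minimal sketch, using hypothetical data, hypothetical group labels and an illustrative gap tolerance, of how one might compare false positive rates across groups before acting on an algorithm's output; the function names and the 0.2 threshold are assumptions for illustration only.

    # Hypothetical group-level error audit, in the spirit of the reoffending example above.
    def false_positive_rate(y_true, y_pred):
        """Share of true negatives that the model wrongly flags as positive."""
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        negatives = sum(1 for t in y_true if t == 0)
        return fp / negatives if negatives else 0.0

    def audit_false_positives_by_group(y_true, y_pred, groups, max_gap=0.2):
        """Compare false positive rates across groups and flag large disparities."""
        rates = {}
        for g in set(groups):
            idx = [i for i, grp in enumerate(groups) if grp == g]
            rates[g] = false_positive_rate([y_true[i] for i in idx],
                                           [y_pred[i] for i in idx])
        gap = max(rates.values()) - min(rates.values())
        return rates, gap <= max_gap  # per-group rates, and whether the disparity is within tolerance

A check like this does not settle the ethical question discussed above; it simply surfaces disparities so that the humans in the room can decide whether the application is fair.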
So, unfortunately, we're out of time. We could keep talking about this, but the clock is unforgiving. I just want to thank Yasmeen, Natalia and Felipe for spending some time with all of us today to talk about these very complex issues. I'm sure this is not the first time we've talked about them on Leadership in Extraordinary Times, and it certainly won't be the last.

My thanks to Professor Andrew Stephen, Dr Yasmeen Ahmad, Dr Natalia Eframova and Dr Felipe Thomaz. My name is Peter Tufano, and you've been listening to Leadership in Extraordinary Times, a podcast from Oxford University's Said Business School. If you've enjoyed this episode, give us a rating and a review, and subscribe to future episodes wherever you get your podcasts. In the next episode, I'll be chatting to Gillian Tett, the US editor-at-large for the Financial Times, about how anthropology, the other AI, can explain business and life. If you'd like more information about this episode and the Leadership in Extraordinary Times series, please visit OxfordAnswers.org. Until next time, thanks for listening.