Peter Tufano: Hello and welcome back to Leadership in Extraordinary Times, a podcast that brings you analysis from the front lines of business. My name is Peter Tufano and I'm the Dean of the Said Business School at the University of Oxford. This series is based on a programme of live virtual events that we've been running since the first days of the pandemic, covering topics from social innovation and social justice to female leadership and high-impact entrepreneurship. We aim to help our community respond to this period of unprecedented turmoil. As a world-leading business school, our focus is on tackling complex global challenges. Our purpose is to prepare the business leaders of today and tomorrow to make this world a better, more equitable and more just place. You can find all our previous episodes wherever you get your podcasts.

Episode six: Digital platforms, saints or sinners?

From social media to entertainment and commerce, digital platforms have become part of the fabric of our daily lives. We can even say that the world now runs on exchanges of all kinds of things on digital platforms. But are they always good for business and society, or is there a darker side to this platform economy? In this episode, we're going to join Oxford Said experts from the Faculty of Marketing, as well as from Innovation and Entrepreneurship, for a discussion of the pros and cons of digital platforms and an answer to the question of whether they are saints or sinners in our world. Chairing the discussion is Andrew Stephen, Associate Dean of Research and L'Oreal Professor of Marketing. And I'm going to hand over now to Andrew, who's going to introduce the panel.

Andrew Stephen: Hello and welcome to today's episode of Leadership in Extraordinary Times. Today, we're going to talk about digital platforms and whether they are good or bad for business, for society, and for us as human beings.
And I'm joined by three of my wonderful faculty colleagues here at the Said Business School who are all experts on the various aspects of digital platforms, and we're going to have a discussion about this very question. So I'd like to welcome Cammy Crolic, who's an Associate Professor of Marketing; Pinar Ozcan, who's a Professor of Entrepreneurship and Innovation; and Felipe Thomaz, who is also an Associate Professor of Marketing. Welcome to the three of you.

I think we're going to have a pretty interesting conversation today about various facets of digital platforms, which really encompass pretty much everything we do in our daily lives, whether it's social media and messaging and, you know, sort of consumer-facing platforms that we all maybe know, or love and hate, who knows, through to banking, through to the systems that underpin many of the services that we all rely on and our economies rely on. Digital platforms are critical. But the question we're going to address today is: is this always a good thing? And we'll come to the panel in a second. The other thing that we're going to do today is a little poll. So if you want to, give us essentially your answer to the question, digital platforms, saints or sinners: basically thumbs up or thumbs down. And we'll see those thoughts in a second.

It seems appropriate, though, that we first of all set the scene in terms of what we mean by digital platforms. And Pinar, I'm going to come to you for that question. What are digital platforms and why are they important?

Pinar Ozcan: Sure. Thank you, Andrew. And it's a pleasure to participate here. Digital platforms are a digital version of what we understand as exchange platforms. So, you know, if you grew up in a place where you had a street market, that was a platform; it was just not digital. Any place where interaction takes place is a platform.
Digital platforms are special because they have the ability to use data from each and every member, whatever platform that may be, in order to do things like matching between different sides, or simply to allow communication electronically between different members. So a digital platform is really a platform that happens in the digital realm.
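As a concrete illustration of the matching Pinar describes, here is a minimal sketch of how a platform might use the data it holds on two sides of a market to pair them up. The sides (riders and drivers), their attributes, and the proximity-based scoring rule are all invented for illustration; real platforms use far richer data and more sophisticated algorithms.

```python
from itertools import product

# Hypothetical member data a platform might hold on each side of a market;
# the names, coordinates and scoring rule are invented for illustration.
riders = {"r1": {"lat": 51.75, "lon": -1.26}, "r2": {"lat": 51.52, "lon": -0.10}}
drivers = {"d1": {"lat": 51.74, "lon": -1.25}, "d2": {"lat": 51.50, "lon": -0.12}}

def score(rider, driver):
    """Closer pairs score higher (negative squared distance)."""
    return -((rider["lat"] - driver["lat"]) ** 2 + (rider["lon"] - driver["lon"]) ** 2)

def match(side_a, side_b):
    """Greedily pair members of the two sides by descending score."""
    pairs = sorted(product(side_a, side_b),
                   key=lambda p: score(side_a[p[0]], side_b[p[1]]),
                   reverse=True)
    taken_a, taken_b, result = set(), set(), []
    for a, b in pairs:
        if a not in taken_a and b not in taken_b:
            result.append((a, b))
            taken_a.add(a)
            taken_b.add(b)
    return result

print(match(riders, drivers))  # [('r1', 'd1'), ('r2', 'd2')]
```

Greedy matching keeps the sketch short; production systems typically treat this as a large-scale optimisation problem over many more signals than location.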
Andrew Stephen: OK, so let's start by thinking about social media as digital platforms. Obviously, these are for exchanges of information, entertainment, commerce and all sorts of things. So I'm going to come to you first, Cammy. What's going on in social media with respect to this question of saints or sinners?

Cammy Crolic: Well, of course, that's a very complicated question, so there are many different ways that we can think about it. I'm sure that you all have seen, featured in the popular press and in a lot of research lately, discussion of some of the maybe 'sinner' aspects of digital and social media. We are seeing things reported in the literature such as social media use being associated with depression and, contrary to what you might think, even social isolation or feelings of loneliness. It's very interesting that some of this research is emerging showing that some of these platforms that should be facilitating connections, and encouraging people to interact with others, may be having some of these opposite effects. We also know social media use is associated with other negative psychological outcomes, such as feelings of malicious envy. Malicious envy, as typically described in the literature, is where you see something that somebody else has and you want it, but you don't want them to have it. So if a promotion happened, it would be: I don't want them to have the promotion, I wanted the promotion instead.

And so things like malicious envy and social comparison are among the outcomes of increased social media use. But I don't want to paint a totally bleak picture. We know that there are negative outcomes, but there are also positive outcomes. There's other literature showing that, at least amongst UK adolescents, people reported greater life satisfaction the more they used social media, and social media use is also associated with more and stronger social ties. In some instances, it can be a really great platform for people dealing with bereavement to find social networks of support. We also know that oftentimes it engenders stronger communities and feelings of belongingness. So we are seeing this contrast: sometimes we see these really negative psychological effects, and sometimes the literature points to more positive, or at least neutral, effects of social media use. It's obviously a complicated question and it's still somewhat open, but we do see positive and negative effects.

Andrew Stephen: I'm going to come back to you soon, Cammy, about when it is positive and when it is negative. But in the meantime I want to get to you, Felipe, and just get your take on this as well, given your research also involving social media, particularly how businesses use social media. What are your thoughts on, as Cammy said, this complex question?

Felipe Thomaz: As Cammy well said, it's rather messy, right? And I'm going to leverage Pinar's opening and say this is broad. So you have to consider also how this new development of digitised platforms has enabled all sorts of new business models. Not only are very large companies using them to reach out to customers, understand them better, generate better insights, and generate new avenues for profit and better performance.
But it's also opened up the door for really tons of SMEs, the small and mid-sized companies, to suddenly reach into a market where they never had an opportunity. One of the most interesting things, even though we vilify some of the largest platforms today because of their auction-based vending, I guess, of their space in advertising, is that it's actually democratised access to that share of voice. If you really want to say something about your business, if you're a very small company and you have to compete with a mega multinational, it would have been impossible in the age of just magazines and television. You just couldn't buy that space. But now, with this immense division of things, suddenly the smallest players have access. Now, that's no guarantee that they're going to succeed, but it allows them the chance to play, which on its own is a significant positive.

On the negative side of that, obviously, there's this flow of information, right? One of the earliest fears with social media and Web 2.0 was that business was going to lose its edge, as it were, because of the free flow of information. You wouldn't be able to discriminate, wouldn't be able to price things correctly, wouldn't be able to do much of anything that we were able to do in Web 1.0. And that fear went away, because we're actually able to capture information about consumers. That might change in the future, and there's a discussion about privacy, but a lot of value was created by the largest firms and the smallest ones in terms of generating insights and understanding people without being intrusive, without asking them questions or stopping them in the mall. It's actually generating insight by virtue of just existing in these social spaces. So even on that side, you get a lot of positives.
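The auction-based selling of advertising space that Felipe refers to is, on many large platforms, some variant of a second-price auction, in which the winner pays the runner-up's bid. The sketch below shows the single-slot case under deliberately simplified assumptions (real systems run generalised, quality-weighted versions across many slots); the bidder names and amounts are invented.

```python
def second_price_auction(bids):
    """Sealed-bid, single-slot auction: the highest bidder wins but pays
    the second-highest bid, which makes truthful bidding a sound strategy."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Invented example: a small shop can win a niche slot against bigger bidders.
bids = {"small_shop": 0.80, "multinational": 0.55, "reseller": 0.40}
print(second_price_auction(bids))  # ('small_shop', 0.55)
```

The point about democratised access survives the simplification: whoever values the slot most can win it, regardless of company size.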
Andrew Stephen: Pinar, what do you think about this?

Pinar Ozcan: Well, I completely agree, first of all, with Cammy and Felipe that there are positives and negatives. And at the individual level, it seems like the negatives should definitely not be ignored; if anything, they may outweigh the positives when it comes to social media. One of the things that's interesting to understand about platforms is that platforms need to generate data, which can be used in many ways, for example in advertising or in providing better interaction between users and providers, as with Uber, etc. But in social media platforms in particular, in order to generate data, the platform provider will want to show us content that we have to react to. And so that's why we see a bit of this polarisation and these negative effects, and people joining all sorts of things that they wouldn't otherwise join: because the platform benefits from the fact that we show a reaction to what's shown to us on the platform. So I think it's important to understand that part of the negative reaction we're having to platforms, in terms of how they affect our psychology, comes from the very nature of how the platform functions.

Andrew Stephen: It's an interesting point, you know, basically saying it holds a mirror up to us as humanity, as a society, and that therefore means we're going to get warts and all: we'll get the good, but we'll also get the bad. And as all of you were speaking, you know, the obvious other point to bring up, in terms of, say, Felipe, your terminology about information flows, is of course misinformation flows. So we'll come back to that. But I just want to remind everyone to definitely share questions with us in the chat. And let's put it up, though, and have a look at our poll, to see if there's some prevailing opinion here that we can react to. So look at this. We've got a fairly split opinion here, but maybe the opposite split to what I was expecting. Anyway, the saints are winning. Cammy, what do you think?
Cammy Crolic: Yeah, actually, I'm surprised by this as well, especially as, as I mentioned, the popular press does like to seize on some of the negative aspects. But I think what is maybe being reflected here, and what Pinar and Felipe obviously brought up as well, is the natural understanding by some of our listeners that platforms have some control over how the platforms are designed, which facilitates certain things more or less often, but that we as users also have a lot of control over the way that we choose to consume these digital platforms; social media specifically is what I'm interested in. And so I think what we're finding here is that there are a lot of positive ways that we can use these digital platforms, and maybe people are becoming a little bit more knowledgeable, or savvy, to their own benefit, and are able to use these platforms more appropriately, or at least in ways that facilitate their psychological well-being or maybe even their small businesses. So maybe we're seeing some of that.

Andrew Stephen: I want to follow up on that, because I wanted to bring in your research around well-being and social media use at this point, since it does hinge on the different ways people are using it. And I think it's a good point that you raise that perhaps as a species we're adapting and finding the good uses versus the less good uses. From your research on psychological well-being and using these platforms, what are the uses that tend to be better for people?

Cammy Crolic: Yeah, so I have some research looking at social media use and its relationship with psychological well-being. This is slightly different from life satisfaction and happiness; it has a lot more to do with where you think your life is headed and how much support you feel.
On this general concept of psychological well-being, what we find is that social media use, or increased social media use, has a small positive effect. And that should be expected because, as you can imagine, social media use is not one of the major factors affecting your psychological well-being; other life factors matter much more, like how fulfilling your work is, the closeness of your family, and a general positive outlook. So there are a lot of other, multifaceted factors that influence your psychological well-being. But we are still seeing a relationship. And most importantly, and I think what you're sort of implying, is that what we found is that the effect really is driven by the types of connections that we have, or that we make, on social media. If we're using social media to connect with people who are close friends and family members, people that we want this deep, supportive interaction with, that's what's really driving the boost in psychological well-being. So we can obviously use social media in a variety of different ways, but what seems to be most important is this connectedness with close others: truly social interaction, so to speak.

I did want to qualify this slightly: this research was conducted outside of the pandemic, so we can't maybe say as much about the present moment. What I think is coming out of the recent research is that people are using social media and other digital platforms to connect with close others, and what we're seeing is that we're sort of losing some of those weaker ties. So the people that you say hi to in the hallway, or chat with the barista when you pick up your coffee. What we're seeing is that it could be a very complicated story right now.
But when life operates as normal, social media does effectively help us connect with others, and that does boost our psychological well-being.

Andrew Stephen: So we've talked a bit about social media, and I'm sure it's going to come back once we start getting some questions from all of you who are watching. But I wanted to expand a little bit beyond, I guess, the familiar social media platforms. And I'm going to come to you, Pinar. Let's talk about other types of platforms. What are some of the ones that we don't think about as much?

Pinar Ozcan: Thanks, Andrew. I think that there are many different types of platforms that we as consumers don't realise exist. And some of these are platforms that entrepreneurs actually find really valuable. So, for example, B2B platforms are really arising in finance and in different areas, such as professional services. You can get advice much more easily from professionals through platforms now. And I was glad to see that the saints were actually ahead, and I think that maybe, if you look at it from a business point of view rather than from a social media and individual point of view, what a platform does is really democratise a market, because it allows individuals or small firms to participate and find buyers rather than, as Felipe said at the beginning, having to spend lots of marketing dollars. And the review system, of course, works in their favour: the better they do, the more visible they become on the platform. So this is a great way for entrepreneurs to actually reach consumers and end users. You can also think about the sharing economy. Sharing platforms have been around for almost 10 years now, and I have done a bit of research on them. What we see is that they can really help with understanding how we can use resources better.
So these flats or summer homes that sit idle, or bikes that don't get used, or even cars that don't get used, can really find a use through a platform. So there's actually a lot of benefit out there, and when it comes to sharing platforms, I think that benefit is actually quite high.

Andrew Stephen: So, Felipe, building off what Pinar was just saying, you also mentioned this notion of democratisation before, and I want to go with that a little bit further, because I think it's a really important feature of digital platforms in all their various forms, whether we're talking about, you know, me starting a business selling shoes on Instagram, or the B2B applications and sharing economy applications that Pinar was just talking about. So what are your thoughts on this, and where do you see it headed? Can we all basically become entrepreneurs by taking advantage of digital platforms? And if so, what do these platforms actually do for us, other than giving us an audience?

Felipe Thomaz: Yeah, I'll start there, and I might take a slightly more towards-the-centre view here; I also have to touch on my research on the true sinners. But I think one of the first things here is this recognition that it lowers some barriers to entry. So what you get is a cheaper play for you to come in and get started. I've similarly seen the rise in B2B: an increase in activities like finding suppliers and identifying commercial networks. It's just become so much easier to identify partners, to actually grow your market presence and exist in other regions, right? Market entry, and finding a local partner, has become that much easier for anybody to go and participate.
So the barriers are down, and you can imagine that for the larger, established players, the response is then going to be: what strategy do I take in order to raise other sorts of barriers if the cost of competition has gone down? How do I maintain my advantage as a massive player? And largely, some of my research shows that data control and maintaining those relationships is where they're maintaining their barriers and power.

Relatedly, Cammy brought up changes with the pandemic, and I think the work-from-home situation also plays in here, in a way. If you just bear with me for a second: we are clearly not teaching from Oxford right at this moment, right? I'm in a room in my house. Our companies are scattered and distributed. So all of our employees, at least for a year, existed and operated across these platforms, and we have to work through all of these systems in order to participate. The potential, then, is for you to be an employee anywhere, and to exist and interact in any way that you might want. So the firm is no longer necessarily this coherent, stable pillar that you have to go and travel to; you have this much more distributed system for even labour, much less advice, not to mention finding funding, and so on. So it's really potentially quite disruptive.

There's a negative that comes with this, which is leveraging the same platforms for criminal ends. They're taking advantage: it's the same tool, right? Nothing wrong with the tool itself, but humans, being the creative agents that they are, just decided: well, if I can find sources of supply for whatever agricultural product I can get at scale, why don't I use the same systems to transact in illegal wildlife, tiger skins, rhino horn, you name it? We've been able to find and identify and map that product network.
So you end up with the same advantages being given to other individuals that you perhaps don't want to have access to those performance enhancements. You just want proper, nice businesses to flourish, not crime to go along with them. But we're seeing that happen at the same pace.

Andrew Stephen: So I want to come to some questions from you around the world, and there are a couple of things we'll pick up on. First of all, this one from Laila, who mentioned the poll showing more saints than sinners, and asks: do we think that this is the perspective of consumers of these platforms, or is it coming more from a business perspective, as creators of content or users of these platforms for some of the purposes that you, Felipe and Pinar, were just talking about? Does anyone on the panel have a view? Cammy, do you think the lens we all see these things through is more of a user or consumer lens? Or do you think we're a little bit more, I guess, multifaceted in our thoughts?

Cammy Crolic: Because I have a tendency to focus on the individual consumer, I'm way too biased, I think, to offer a real opinion here. But yeah, I think that we're all kind of amateur psychologists. And it's possible... I don't know the mix of the group. I think it would be very enlightening if we had a lot of people who were regularly using these platforms for business purposes. But I guess I just naturally took the perspective of the individual consumer; that's probably belying my background more than anything else.

Andrew Stephen: Well, you are a consumer psychologist, so I would expect that. But it's quite interesting, really. I mean, we don't know. You know, I'm also not sure that we necessarily feel the negative effects ourselves as individual users or consumers unless they're really negative.
Examples would be hate speech, online bullying, those sorts of things: obviously, if we see that, it's quite visceral. But the more subtle psychological effects (and Cammy, you talked about your research showing small but detectable effects here) suggest that maybe we might be seeing the positives but not seeing the negatives as much, because we just don't feel them, or we're wearing rose-tinted glasses. I don't know. But the points that Felipe and Pinar raised from the business side provide, I thought, a pretty compelling argument for the positives here, though I don't think it's without potential confusion or cost or complexity.

And so a question came in from Sarah, thinking about the platforms as businesses. She specifically mentioned subscription business models such as Disney Plus or Hulu or Netflix, thinking of those as platforms, in this case for entertainment distribution. From that standpoint, these don't have the democratised B2B access aspect: I can't go and make Andrew's Home Movie and suddenly distribute it through Netflix. As entertaining as I'm sure that would be, I could put it on YouTube as a user-generated content platform. But we do have these sort of closed platforms, if you will, that are making markets in different ways. So what do you think about those types of platforms versus the very democratised ones? I guess there's a place for both, but what are the implications of, I guess, the more closed platform for the way we think about this? Pinar, maybe I'll go to you first and then Felipe.

Pinar Ozcan: Sounds great. So I think we need to differentiate different types of platforms here.
We have platforms where individuals or small firms can interact with one another, and those types of platforms could be for sharing or for different types of businesses, and some of them might be illegal, etc. But when it comes to platforms such as Disney Plus and Netflix, we're thinking of a platform where individuals are on the user side, but the content is to a large extent generated or controlled by the platform. And that type of platform really does create a bit of a monopolistic situation. The main reason is that the more data that platform has, the better it can tailor its content to its users. You know, Netflix has better and better programmes. Why? Because they know you, and they know what you will like. The new things being produced for Netflix, with all sorts of famous actors and actresses, are actually based on the data that Netflix has: your usage, how many minutes of a movie you watched, and so on. So I think that when you differentiate in that way, you start to see that the more data a platform has, the more it has a tendency towards monopoly. We need to set aside, obviously, the ones where individuals get to participate as providers as well. But when the content, or whatever services or products, is actually being provided by the platform, then the data really creates a monopolistic situation. That's why we don't see many platforms surviving in the same area; it's typically one or, at most, two.
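A toy illustration of Pinar's point that more usage data lets a platform tailor its content, and hence entrenches whoever holds the most data: the viewing log, catalogue, and genre-counting recommender below are all invented for illustration, and real services use far more elaborate models over vastly more data.

```python
from collections import Counter

def recommend(watch_history, catalogue, k=2):
    """Rank catalogue items by how often their genre appears in the user's
    history: the longer the history, the sharper the preference estimate."""
    genre_weight = Counter(genre for _, genre in watch_history)
    return sorted(catalogue, key=lambda item: genre_weight[item[1]], reverse=True)[:k]

# Invented viewing log of (title, genre) pairs a platform might have recorded.
history = [("A", "thriller"), ("B", "thriller"), ("C", "documentary")]
catalogue = [("New1", "thriller"), ("New2", "comedy"), ("New3", "documentary")]
print(recommend(history, catalogue))  # [('New1', 'thriller'), ('New3', 'documentary')]
```

A rival without the viewing log cannot estimate the genre weights at all, which is one way the data itself becomes the barrier Pinar describes.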
Andrew Stephen: It's a really good point, and I'm glad that you brought up data, because it has come up now in a couple of questions. You know, a LinkedIn user in London (we don't have your name, I apologise) has asked exactly about the implications when we come to think about the data that all these platforms, of all types, are collecting. They referred specifically to social media platforms, but I think we could also think about commerce platforms, because there are all sorts of contexts here where, to your point, Pinar, data is being collected, and it's a real asset for the platforms themselves. So, Felipe, I want to ask you about data, and about data ethics in this context, because you're an expert on this: you teach on our Oxford Executive Diploma in AI for Business, and you teach classes on AI and machine learning applied to marketing. So school us a little bit on these data issues when we think about this asset that the platforms themselves are generating.

Felipe Thomaz: Yeah. So let's start by simplifying it to this: it's very messy, mostly because, you can think, we're having a global discussion, right? And it's very hard to have a global discussion when you have localised, regional regulations. So even the reality of one given platform might change depending on what it's required to do with respect to data and information. With that said, that's kind of table stakes: the bare minimum you need to do to be a legitimate platform is to behave according to the local regulations. Beyond that point, we then expect that these managers are going to take a more powerful stance, let's say, take the higher moral ground and provide additional benefits to the customer. Now, an interesting thing is that we have had report after report, and academic study after academic study, showing that even though an individual might say, "I want privacy, and I want to restrict access to my information," they almost never behave in that way. So you end up with what's termed the privacy paradox, where we say we want to be private, we want to control the information, but when given the opportunity to do so, that goes away, right?
You behave like, as soon as you're given the opportunity: here's my full name, address, phone number, and all the pictures on my phone. So it's this inconsistency that's very difficult to manage and then regulate around. However, as everybody mentioned, data is a source of power for the firms: the larger the amount of data you hold, the better the systems you can run, especially as we have an ever-increasing amount of machine learning being introduced into the systems to keep you on a platform. As Pinar mentioned, the content is getting better because the system is learning, and learning takes this data. So the fight, and this is related to research I published last year that I think you're alluding to, is over this access and permission and control over data. It's not so much that customers want privacy; it's that we as a platform want to give privacy and create barriers for our competitors, so they can't take our customer data away from us, or have the same access to the data that we have. Then we can run the best learning algorithms and generate the best content as a result, and our competitors are going to be running less smart, less informed processes and be less competitive as a result.

Pinar Ozcan: Felipe, I think that's super interesting when it comes to privacy. And one of the things it would be great to add is that regulators are very much realising how data, and access to data, is really driving competition now. We did some research in banking recently, and what we're seeing, especially in the UK and EU, is that there's a concept called open banking, or PSD2; you might have heard of it. The basic concept is that in order for a new player to compete in the banking sector, they would need access to consumers' data in order to offer them better services. However, consumers wouldn't trust a newcomer with their data.
And so what we see is that regulators have created this concept of open banking, where a new player, with the consent of a consumer, can actually connect electronically to a bank in order to get access to your information, as long as you've given electronic consent, and then look at that and say: OK, based on the way that you're spending money, you really should be doing this or that in order to pay off your loan sooner. What we start to see is that regulators are understanding the importance of data, first of all, but also that consumers are really starting to make choices in terms of whether they want to share data in order to get access to better services. And interestingly, the research has found that the most activity in terms of data sharing in open banking is in lending. It makes sense to us now, but we didn't realise at the time, that when people are in need of money, they're actually ready to step a little bit out of their comfort zone and share more details in order to get a better rate. So it seems that data, data sharing and privacy issues are really transforming the banking sector as well.
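Schematically, the open-banking arrangement Pinar describes gates a third party's access to account data behind an explicit, recorded consumer consent. The sketch below models that idea only; it is not the actual PSD2 API, and all class, field, and party names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Bank:
    """Holds account data; releases it only against a recorded consent."""
    transactions: dict                           # account_id -> list of amounts
    consents: set = field(default_factory=set)   # (account_id, third_party) pairs

    def grant_consent(self, account_id, third_party):
        self.consents.add((account_id, third_party))

    def fetch_transactions(self, account_id, third_party):
        if (account_id, third_party) not in self.consents:
            raise PermissionError("no consent on record for this third party")
        return self.transactions[account_id]

def advise(bank, account_id, third_party):
    """A newcomer's service: read consented data, then suggest an action."""
    outgoings = -sum(t for t in bank.fetch_transactions(account_id, third_party) if t < 0)
    return f"monthly outgoings {outgoings}: consider overpaying your loan"

bank = Bank(transactions={"acc1": [1500, -320, -80, -45]})
bank.grant_consent("acc1", "fintech_app")   # the electronic consent step
print(advise(bank, "acc1", "fintech_app"))  # monthly outgoings 445: ...
```

The key property is that the bank, not the newcomer, enforces the consent check, which is what lets consumers extend trust to new players selectively.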
You're listening to Leadership in Extraordinary Times with me, Peter Tufano. In this episode, we're talking about digital platforms with my Oxford Said colleagues Dr. Cammy Crolic, Associate Professor of Marketing; Professor Pinar Ozcan, Professor of Entrepreneurship and Innovation; Dr. Felipe Thomaz, Associate Professor of Marketing; and Professor Andrew Stephen, Associate Dean of Research and L'Oreal Professor of Marketing, who's chairing this discussion.

The panel is now going to turn to the complex relationship between companies such as Facebook and Google and the news industry. We've seen this hit the headlines in Australia following the government's new legislation on news publishers' rights.

So are the Facebooks and Googles of this world Saints, out there to support journalism, or are they Sinners who are just stealing the local press's lunch? Felipe Thomaz is up first.

It's difficult to put them in those clean buckets. One of the concerns I have around this is the transfer of responsibility towards the platforms themselves, in the sense that they are not journalistic entities. When somebody like Facebook, for example, or Google is suddenly held responsible for user-generated or shared content, you've made them the police: the agent in this relationship with the responsibility to validate and confirm, and then censor and remove, content they deem inappropriate. And that's where I start to feel slightly uncomfortable, because at no point did I give permission to Facebook to curate content in that sense, to impose restrictions or not. That's true of all sorts of things, not just news but all the other content they've been asked to manage. I'm not sure I'm comfortable with putting them in charge of what is acceptable.

OK, Cammy, do you have a view on this?

Yes, sure. I'm going to take a slightly opposing view to Felipe, and I totally respect and understand his point; I think it's an important red flag to consider. But think about the onus that is on these social media platforms, because we know that people are disseminating news, or user-generated news, and the problem is that we often have a really good memory for the content of what we're reading but a really poor memory for the source. Especially if it is a sketchy source, it's very easy for us to remember the information but no longer attach the tag that it came from a non-credible place.
So in some sense our psychology is working against us, and we do need some help curating accurate and reliable sources of information. Whether or not we want to put Facebook in charge of that is a separate question. But we do need to realise how important it is, because we have this tendency to remember that so-and-so did something horrible, which could be totally untrue, without looking at the fact that it came from a very right- or left-wing source, or one that doesn't follow up or check its sources. So we do have a certain need for somebody to curate the content.

And should that be regulated, I suppose? This has popped up in the chat: a number of you have raised the question of regulation, which we touched on already, and brought it back to the fore. So, Pinar, I'm curious what you think. Whether we're talking about regulating social media or something more general than that, how can we best govern digital platforms so that we, in some sense, maximise the positives and minimise the potential negatives?

Sure. I think, first of all, that regulation of social media, and of content on social media, obviously needs to be there to a certain extent, and that's quite different from regulating platforms in general. When it comes to social media, I definitely agree with my fellow panellists that a big responsibility falls on the shoulders of Facebook and Google, and in a sense that also puts them in a conflict. Imagine Facebook especially: it's almost like being able to hear a million people speak at the same time, and there will be dissonance. We now have access to the opinions of many more people.
Normally I wouldn't know what my former colleagues elsewhere in the world are thinking, but now I do. In a sense that is not easy to manage, because there will be conflicting views. But then, of course, there is false information. As Cammy rightly said, we remember the information itself but not its source, and that is creating a trend of pushing us in directions we don't even understand we're being pushed in. So, just as Felipe said, it is a very messy topic, and I don't think regulators are going to figure it out very quickly, but it is definitely very important. Now, the other side of regulating platforms is regulating their monopolistic power, which, as we said before, comes from the sheer amount of data they have. A lot of platforms like Google, Facebook and Amazon have actually started to go into even regulated industries. You may not know that Google has a partnership with the NHS, where the NHS trusts them with data and in turn they provide AI-based diagnostic services. That saves lives, and it also saves the NHS a lot of money. But it shows that even in highly regulated industries these platforms are becoming, basically, the data keepers. In a recent paper we call this the digital colonisation of highly regulated industries, where the people who have access to the data are starting to rule the industry. So what the EU and other parts of the world are trying to figure out right now is: how much should these platforms be allowed to use data across industries? And if we allow that, how can we break their monopolistic power across industries? I think that's also a very messy topic, but very important for us to figure out soon.
Otherwise, we will end up with no companies other than these platforms.

It's a very good point, although I do wonder; I think the context, or the use case, matters here. If we're talking about AI-driven health diagnostic services, I probably would want a very advanced company that's very good at working with data and processing large amounts of it, such as a Google of this world, to be doing that, as opposed to Andrew's start-up in my garage. But the point is well taken: how do we balance that deep expertise, where we see the rich getting richer and the massive advantages these tech powerhouses have, and the good that they can bring, against the risk that we are putting all of our eggs in one basket and heading into a very monopolistic world, which is probably not going to be a good thing? But that's all talking about this in the regulated space. Felipe, I want to ask you about the unregulated Wild West as well, where people are anonymous. I'm thinking about your research on the dark web here, because I think it's important for us to remember that we're not just in a world where there are eyes on what's going on; there is a whole other world out there. Through that lens, what are your thoughts on the sort of issues we're talking about right now?

Yes. So the dark web gives us a very interesting counterpoint to this discussion. The dark web is where there's no regulatory eye. I mean, you might have law enforcement chasing people down, but you can think of it as a hyper-private environment, meaning there are technologies in place to really disguise you, to the best extent we can, if you follow proper procedure.
And really, any data leakage you have as a consumer is essentially your own fault: it's a mistake you made that information was made available even to the platform. So it is as good a protection as we can get. If you know about the dark web and The Onion Router and so on, that was a Navy development from back in the day on how you manage communication. So it is as secure as they could make it, as private as we can make it, and also as unregulated as we can make it, because the black markets that I study on the dark web are the ones trafficking in humans and drugs and weapons, et cetera. It's really where there's no enforcement.
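The onion routing Felipe refers to works by wrapping a message in one layer of encryption per relay, so that no single relay ever sees both who sent a packet and what it says. A conceptual sketch, using the symmetric cipher from the `cryptography` package as a stand-in for Tor's actual circuit machinery:

```python
from cryptography.fernet import Fernet

# One symmetric key per relay: a toy stand-in for the per-hop keys
# Tor negotiates when it builds a circuit.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def wrap_onion(message: bytes, keys: list) -> bytes:
    """The sender encrypts for the last relay first, then wraps each earlier
    hop around it, so every relay can peel off exactly one layer."""
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

packet = wrap_onion(b"GET hidden-service/page", relay_keys)
for i, key in enumerate(relay_keys):
    packet = Fernet(key).decrypt(packet)   # relay i removes its own layer only
    print(f"relay {i}: sees plaintext? {packet == b'GET hidden-service/page'}")
```

Only the final relay recovers the plaintext, and only the first knows the sender, which is what makes the environment 'as private as we can make it'.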
So you end up with a weird system where people are free to do as they please; it's the libertarian dream, in a sense: anything goes, I'm able to self-determine, and there's no higher power telling me exactly what to do. You get a very interesting environment in which the market itself, the systems, the platforms themselves, are incredibly efficient. They're probably better than most of the platforms we have in terms of finding information and acquiring a good; it's as easy and simple as it gets compared with the ones we have. So they function better. But there's also absolutely no protection for the consumer whatsoever: most of these platforms end up being scams, designed as very large, complex honeypots that come in and take everybody's money, and the whole system is, as you mentioned, a Wild West and a perverse lottery. So do I come away from it saying the solution is purely privacy, just giving privacy and making everything as secure and private as possible? No, it isn't, because even within that system what we see is rampant exploitation. People are getting what they want, they're existing and exchanging, but you still have mass loss of property and rights. You don't know who you're dealing with, and trust is algorithmic, as opposed to social, as we construct it there. So just forcing privacy, based on what we know from behaviour on the dark web, is not in itself a solution, even though it's something that we in this conversation have all been asking for. I think there's plenty we can learn there.
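When Felipe says trust on these markets is algorithmic rather than social, he means buyers rely on a computed reputation score instead of knowing the counterparty. A toy version of such a score (the weighting and prior here are invented purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class Review:
    rating: int       # 1 to 5 stars
    escrowed: bool    # completed through the market's escrow system

def reputation(reviews: list, prior: float = 3.0, prior_weight: float = 5.0) -> float:
    """Smoothed average: scores start pulled towards a neutral prior, so a
    brand-new seller cannot buy instant trust with a few fake 5-star reviews."""
    weights = [2.0 if r.escrowed else 1.0 for r in reviews]   # escrowed deals count double
    total = prior * prior_weight + sum(w * r.rating for w, r in zip(weights, reviews))
    return total / (prior_weight + sum(weights))

newcomer = [Review(5, False)] * 3
veteran = [Review(5, True)] * 40 + [Review(2, True)] * 2
print(f"newcomer: {reputation(newcomer):.2f}  veteran: {reputation(veteran):.2f}")
```

Even so, as Felipe notes, the operator who runs the score and the escrow can still disappear with everyone's money, which is why privacy plus algorithmic trust is not, by itself, a solution.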
I'm just reflecting on what we've been talking about for the last fifty minutes here. We've heard a lot of the good, a lot of the Saints, consistent with the poll we had earlier on, and those seem to me, in some sense, concrete things we can point to and say, yes, you can do this or you could do that. A lot of the negatives, or the Sinner elements, that we're thinking about are almost fears: we're worried, for example, about how data might be used, and therefore we look to increased privacy, or perhaps increased regulation or new types of regulation, as our safety nets. It's almost as if the bogeyman might be hiding in my bedroom cupboard or under the bed. We can all see the good stuff, and we're worried about the negative stuff. I'm going to come to you about this, Cammy, as our resident psychologist on the panel, to think about this framing we have, because the facts are the facts: billions of people, if we think about consumers, use all the major digital platforms every single day. We communicate through them, we give our data, we expect some kind of value in return, and maybe expect some privacy. But people always say they're deeply concerned about these things, and indeed they get talked about quite a lot. So are we worried too much, or is what we're talking about here actually a pretty healthy reflection of balancing the pros and the cons, the value and the risks? People use these things, but they're worried about them. How do we reconcile that, Cammy?

Yes, I think that's a really great question. I like the complexity this discussion has uncovered, because it's very reflective of what the research shows and of the picture we're looking at. I think focussing solely on Sinner, and maybe implying that the negatives are just worries, would be a little unbalanced; there's a lot of research showing there are real, true negatives associated with digital platforms. Obviously my particular interest is in social media, and we know that how we use these platforms, and even how they organise or design themselves, tends to create more positive or more negative outcomes. I think it's very important that we're cognisant of both sides, because it truly is some of both.

And Pinar, how about from the entrepreneurship standpoint? How do we reconcile these different views?

From the entrepreneurship perspective, it has really changed the competitive landscape in many industries. And in a sense it's not just a worry; we see this happening. For example, we have a wonderful incubator over at the Business School called the Creative Destruction Lab. What we see there is that when entrepreneurs come in with their data-driven ventures, with so much they can offer, one of the first things they hear from investors is: why do you think Google or Amazon couldn't do this? Couldn't they kill your business within a week?
And that is a real worry, I think. When it comes to entrepreneurship, what entrepreneurs really need to ask right now is: can I actually stand against Google or Amazon? Or imagine you're selling a particular product: once you get onto Amazon and start to become successful, won't Amazon just manufacture it themselves, as they do with many other products? We are seeing more and more Amazon brands on the platform. So I think that, from an entrepreneurship perspective, there is a real worry when it comes to the monopoly of these big tech platforms.

So I guess, from an entrepreneurship perspective, unless you're building your own platform and finding a way to build that market, it's a really fair point: you are putting a lot of trust, and the flip side of trust is risk, in what those platforms might do to your business. For me as a marketing professor, it's a good reminder that we're not just thinking of these platforms as, in essence, distribution channels. If you can build a strong brand, for example, and develop a very loyal customer base, then whatever you're offering through an Amazon or other kind of platform is less of a commodity that can be easily copied, leaving your lunch to be taken from you, so to speak. I think that brings us, or maybe segues, into some closing thoughts, because what I wanted to do is go through the panel again and ask: where do we go next with all of this? We've got these different perspectives, and we can think about the good and the bad, or the Saints and the Sinners. But how do we build a business community, and quite frankly a society, that really does optimise this? Because platforms have always been here, as marketplaces, for example.
So the concept of platforms, and certainly, in the digital world, digital platforms, is here to stay. How do we keep making the most of the many opportunities while minimising those risks? I'll go through all the panel members to get your closing thoughts here in the last few minutes. Felipe, you're first.

Just to borrow, I guess, from Pinar's last point: to me there's been this fascinating transition, if you're thinking in terms of classic business management and power-in-the-supply-chain kinds of arguments. You've had this relocation of power towards a concentration in the platforms, which wasn't necessarily the case before, when it was just manufacturing. I don't expect that's going to change. I don't expect we're suddenly going to have a shift, that we're going to wake up tomorrow and they're not going to be a powerful player, specifically because of all the things we've said about what they allow new businesses or existing businesses to leverage, and they can do it at scale today. So they're very powerful in that sense, and the counters are, as you mentioned, things like traditional marketing: creating assets that the platforms can't recreate. My takeaway is that, like a lot of great tech, this is tech: it is a tool, it is a system, and humans are the ones who determine what impact it has on other humans. You can take this and create the most enabling platform to allow people to flourish, create new businesses, and create economic and social prosperity, or you can use it to put people in a box and ship them across the country. It really can be as great as we want to make it, or as terrible as we want to make it. With that, I understand the fears, and I think it's healthy to question, but not to be afraid to use platforms.
So it's that balance: trust but verify; use it and exploit it to the best of your ability.

Thank you, Felipe. Final thoughts, Cammy?

Yes. From my perspective, this discussion has centred on Sinner and Saint platforms, but I'd like to say it's Sinner and Saint usage. For our well-being as people and as individuals, I think the best way to use these platforms is to foster deeper social connections with people we care about, and to avoid a lot of the unnecessary interactions with people we don't know, which can spur things like envy, loneliness and the social comparison that makes us feel bad about ourselves. So, really, it's about trying to understand and take control of the way we use the platforms, and also pushing the platforms to design their products better, to encourage the kind of usage that benefits us as people.

Thank you, Cammy. And Pinar, your thoughts?

I completely agree with everyone that we are going to use platforms, and we're going to use them even more in the future, so it's not a matter of how we move away from them. To me it really comes down to two things right now. The first is regulation, and not just in terms of how much data you're allowed to keep as a platform, or whether you can use it across sectors to gain monopolistic power, but also, within the platform, whether you're allowed to create behavioural nudges, such as showering someone with digital confetti when they've just made an investment; people have ended up killing themselves because of these kinds of behavioural nudges on digital platforms. So: regulating how the platform is designed and what kind of behaviour it encourages, but also regulating data storage and data sharing. The second part is education, and obviously platforms are not incentivised to educate us about what they can do to us.
So it really falls on the shoulders of educational institutions. We need to start educating our students, as young as primary school, to understand how to use these platforms, to give them choice, and to help them understand better what a platform can do to them. Otherwise, I think we are going to keep falling victim to the things the platforms want us to do.

My thanks to Pinar Ozcan, Cammy Crolic, Felipe Thomaz and Andrew Stephen. My name is Peter Tufano, and you've been listening to Leadership in Extraordinary Times, a podcast from Oxford University's Said Business School. Subscribe wherever you get your podcasts, and take a moment to rate and review us.

In the next episode, the last in the current series, we'll be exploring innovation in healthcare and why it needs to be community-centred, not patient-centred. You'll find more information about this and all the Leadership in Extraordinary Times series at oxfordanswers.org. Until next time, thanks for listening.