1 00:00:00,280 --> 00:00:04,960 Speaker 1: Yesterday the Digital Platform Services Inquiry Report was published and 2 00:00:04,960 --> 00:00:08,799 Speaker 1: it highlights that consumers, well, we just don't know how 3 00:00:08,880 --> 00:00:12,080 Speaker 1: much of our data is collected, used and shared with 4 00:00:12,200 --> 00:00:16,360 Speaker 1: data firms and other businesses. When you use a QR code, for instance, 5 00:00:16,480 --> 00:00:19,680 Speaker 1: to book into a restaurant or use a QR code 6 00:00:19,680 --> 00:00:24,240 Speaker 1: in some other way, how much of your information is downloaded 7 00:00:24,280 --> 00:00:27,200 Speaker 1: in that, and they get your details, and then what 8 00:00:27,240 --> 00:00:29,319 Speaker 1: do they do with it? Not like the old days, 9 00:00:29,320 --> 00:00:31,360 Speaker 1: where you could ring up as Mister Jones, book a 10 00:00:31,400 --> 00:00:33,960 Speaker 1: table for two or four or whatever, and rock up 11 00:00:33,960 --> 00:00:36,000 Speaker 1: and pay in cash. Job done, no one knew who 12 00:00:36,000 --> 00:00:40,239 Speaker 1: you were, Mister Jones. But these days it's a little bit 13 00:00:40,240 --> 00:00:43,199 Speaker 1: different, and that has the ACCC concerned that 14 00:00:43,320 --> 00:00:47,400 Speaker 1: consumers lack visibility and choice over data collection practices. 15 00:00:47,600 --> 00:00:50,040 Speaker 1: Deputy Chair Katrina Lowe is on the line. 16 00:00:50,159 --> 00:00:52,559 Speaker 2: Katrina, good morning. Good morning, Matthew. 17 00:00:52,680 --> 00:00:53,599 Speaker 1: What do we do about this? 18 00:00:55,360 --> 00:00:59,960 Speaker 2: Well, we have written our eighth Digital Platform Services Inquiry report 19 00:01:00,120 --> 00:01:04,920 Speaker 2: to begin by shining a light on industry practices which 20 00:01:05,200 --> 00:01:09,319 Speaker 2: haven't been well studied, frankly, in Australia.
And it's important 21 00:01:09,640 --> 00:01:13,080 Speaker 2: given both the significant role that these data firms play 22 00:01:13,200 --> 00:01:17,560 Speaker 2: in the economy and, as you say, the limited consumer 23 00:01:17,680 --> 00:01:25,080 Speaker 2: awareness regarding data collection practices. So consumers typically lack visibility, awareness, 24 00:01:25,120 --> 00:01:29,119 Speaker 2: and control over how their data is collected and used, 25 00:01:29,200 --> 00:01:34,399 Speaker 2: and we certainly want to firstly raise awareness of this fact. 26 00:01:34,720 --> 00:01:39,240 Speaker 2: But we also think that this report really underlines the 27 00:01:39,280 --> 00:01:43,160 Speaker 2: importance of a couple of reforms that are already under discussion, 28 00:01:43,680 --> 00:01:48,440 Speaker 2: namely reforms to our privacy law, but also the introduction 29 00:01:48,760 --> 00:01:52,160 Speaker 2: of what's called an unfair trade practices prohibition. 30 00:01:52,640 --> 00:01:56,320 Speaker 1: Okay. I suppose as you go through your day you're 31 00:01:56,360 --> 00:01:59,640 Speaker 1: being identified through the use of your phone multiple times, 32 00:02:00,880 --> 00:02:03,280 Speaker 1: probably hundreds of different times a day. 33 00:02:04,440 --> 00:02:07,280 Speaker 2: That's right. In essence, data can be collected in a 34 00:02:07,320 --> 00:02:11,880 Speaker 2: whole range of ways, as you say, perhaps in activities 35 00:02:11,880 --> 00:02:15,040 Speaker 2: that we're used to being able to conduct anonymously.
So 36 00:02:15,160 --> 00:02:19,799 Speaker 2: whether that's conducting a search online, whether it's ordering something 37 00:02:19,800 --> 00:02:22,720 Speaker 2: in a restaurant via a QR code, there are a 38 00:02:22,800 --> 00:02:26,160 Speaker 2: whole range of ways in which our data can be collected, 39 00:02:26,600 --> 00:02:31,080 Speaker 2: and data firms often will also be then aggregating that 40 00:02:31,280 --> 00:02:36,560 Speaker 2: data and providing services to businesses whereby they can enrich 41 00:02:36,840 --> 00:02:39,920 Speaker 2: the data that they already hold about us. And we 42 00:02:40,000 --> 00:02:45,520 Speaker 2: do see data firms often marketing the vast 43 00:02:45,600 --> 00:02:50,520 Speaker 2: number of data points that they have to businesses as 44 00:02:50,600 --> 00:02:51,840 Speaker 2: a selling point. 45 00:02:52,200 --> 00:02:54,880 Speaker 1: When you use a QR code, what are you actually 46 00:02:55,040 --> 00:02:59,079 Speaker 1: allowing the phone to do? Does it impart your personal details: 47 00:03:00,120 --> 00:03:03,680 Speaker 1: your name, your address, your phone number? Does it give 48 00:03:03,720 --> 00:03:06,200 Speaker 1: all that through the QR code? 49 00:03:06,240 --> 00:03:09,720 Speaker 2: Look, it does vary, but certainly it can provide that sort 50 00:03:09,760 --> 00:03:13,880 Speaker 2: of information. It can also provide information about the orders 51 00:03:13,919 --> 00:03:21,000 Speaker 2: that are made, and often firms will also derive profiling 52 00:03:21,040 --> 00:03:25,120 Speaker 2: information based on the choices that we make, so building 53 00:03:25,200 --> 00:03:29,120 Speaker 2: up a profile about us.
And part of the 54 00:03:29,200 --> 00:03:32,320 Speaker 2: challenge is that even if we do go to read 55 00:03:32,639 --> 00:03:37,800 Speaker 2: the privacy policy, often it will use broad and general language, 56 00:03:37,800 --> 00:03:41,400 Speaker 2: which might make it difficult for us to understand what 57 00:03:41,600 --> 00:03:45,520 Speaker 2: actually is being done with our data. One of the 58 00:03:45,600 --> 00:03:48,880 Speaker 2: data points in our report is that if consumers really 59 00:03:48,920 --> 00:03:52,880 Speaker 2: did sit down and read all of the privacy policies 60 00:03:52,880 --> 00:03:56,520 Speaker 2: that they might encounter across their range of activities in 61 00:03:56,560 --> 00:04:00,440 Speaker 2: a day, it would take an average reader forty-six 62 00:04:00,480 --> 00:04:06,760 Speaker 2: hours per month. No, no, I am not, regrettably: 63 00:04:06,920 --> 00:04:10,360 Speaker 2: it's forty-six hours per month for an average reader. 64 00:04:10,480 --> 00:04:13,440 Speaker 2: So that gives some sense of the size of the 65 00:04:13,560 --> 00:04:16,200 Speaker 2: task were people to actually sit down and do this. 66 00:04:16,680 --> 00:04:21,400 Speaker 2: And even if we did, we often have a take 67 00:04:21,440 --> 00:04:23,720 Speaker 2: it or leave it choice, that is, either give the 68 00:04:23,839 --> 00:04:26,239 Speaker 2: data, or don't access the service. 69 00:04:25,800 --> 00:04:31,039 Speaker 1: That's right, yes. So, okay: is legislation the 70 00:04:31,080 --> 00:04:34,600 Speaker 1: best way to control this, to tip the balance back 71 00:04:34,640 --> 00:04:35,679 Speaker 1: in the consumer's favor? 72 00:04:36,920 --> 00:04:39,480 Speaker 2: So, as I say, one of the reforms that we 73 00:04:39,520 --> 00:04:43,839 Speaker 2: think is very important in this context is privacy 74 00:04:43,920 --> 00:04:47,840 Speaker 2: law reform.
So one of the reforms that's under discussion 75 00:04:48,720 --> 00:04:52,520 Speaker 2: is a requirement that the collection, use, and disclosure of 76 00:04:52,600 --> 00:04:56,920 Speaker 2: personal information needs to be fair and reasonable in all 77 00:04:56,960 --> 00:05:00,839 Speaker 2: of the circumstances. So that's really saying that there shouldn't 78 00:05:00,880 --> 00:05:04,880 Speaker 2: be more data collected about us than is necessary, and 79 00:05:04,920 --> 00:05:08,680 Speaker 2: what's necessary should be decided with reference to what a 80 00:05:08,760 --> 00:05:13,120 Speaker 2: reasonable person would expect. There's quite a bit of research 81 00:05:13,200 --> 00:05:17,720 Speaker 2: that's been done around consumer attitudes to data collection, and 82 00:05:17,760 --> 00:05:21,200 Speaker 2: it's fair to say that the evidence suggests that people 83 00:05:21,320 --> 00:05:25,920 Speaker 2: are uncomfortable about the amount of their personal information that's 84 00:05:25,960 --> 00:05:30,320 Speaker 2: being collected. So, for example, a body called the Consumer 85 00:05:30,360 --> 00:05:34,760 Speaker 2: Policy Research Centre found in a survey that seventy-four 86 00:05:34,839 --> 00:05:39,200 Speaker 2: percent of people surveyed were uncomfortable with the idea of 87 00:05:39,240 --> 00:05:42,000 Speaker 2: their personal information being shared or sold. 88 00:05:42,360 --> 00:05:45,440 Speaker 1: I'm sure that's absolutely right. It's, you know, you just 89 00:05:45,560 --> 00:05:48,240 Speaker 1: wake up in the morning and there's three messages from, 90 00:05:48,720 --> 00:05:51,800 Speaker 1: allegedly, a toll road, you know, maybe Telstra and some 91 00:05:51,920 --> 00:05:54,080 Speaker 1: other organization wanting you to click on a link, and 92 00:05:54,080 --> 00:05:56,840 Speaker 1: you think, how did they get my number?
And perhaps 93 00:05:56,839 --> 00:06:00,640 Speaker 1: this is part of the answer. But anyway, it's something 94 00:06:00,640 --> 00:06:02,720 Speaker 1: that needs looking at, no doubt about it. Katrina, thank 95 00:06:02,760 --> 00:06:03,320 Speaker 1: you for your time. 96 00:06:03,880 --> 00:06:04,640 Speaker 2: You're very welcome indeed, Matty. 97 00:06:04,680 --> 00:06:07,599 Speaker 1: Katrina Lowe there, Deputy Chair of the ACCC.