Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. And you know, one thing I haven't really covered in this show is just a rundown on common acronyms and initialisms in the world of tech and what those things actually mean. Sometimes you run into these things and they can throw you for a loop. So today and in the next few episodes, we're doing a sort of glossary of tech-related terms you might encounter and what those actually mean. Not really terms, but again, those initialisms and acronyms. I will only spend a little bit of time on each of these because there's a lot to get through, so expect to hear a bit of context, not just a definition. If you just want a definition, you could pull up lists of acronyms and initialisms and see that. Now, before I jump into this, I do want to mention that this is by no means an exhaustive list of acronyms in tech.
Speaker 1: If I did that, it would pretty much completely take over this podcast for the next, like, four weeks. Instead, I've selected a bunch of acronyms and initialisms that I think are important to know, but I'm leaving out a ton of them, and I wouldn't blame anyone for saying I was being a bit arbitrary with my approach to selection. It turns out the tech world absolutely loves acronyms and initialisms, and some of them take longer to say than the full names, because, I mean, why not. Also, just for the purposes of organization, I'm going to go alphabetically through the list here, because I needed to organize this in some way and that seemed to make the most sense to me. In addition, with some of these, I have actually grouped related terms together. For example, if I just went strictly alphabetically, I would hit DRAM before RAM, which means things would get a little weird. So instead, DRAM is going to be part of a larger treatment on RAM in general.
Speaker 1: So if you feel like I've skipped over something, and I definitely have skipped over some things, just wait for all of these episodes to come out, just in case that thing shows up in a group entry, and maybe by that time you will have forgotten all about it, and I won't have to get angry messages. So let's get to it. And our first one isn't actually starting with a letter at all. It starts with the numeral two, so it's 2FA. This one means two-factor authentication, which I'm guessing most of you out there have encountered at some point. This is a means of authenticating a user, you know, saying yes, this user is who they claim to be, and they do it through, well, two factors. Those factors should belong to two different categories of things. There are three categories total. There's knowledge, so that would be stuff like a password. It's something that the user knows. There's stuff what is you, and by that I mean biometrics, like a retinal scan or a fingerprint scan or, you know, a vocal scan, that kind of thing.
Speaker 1: And then there's stuff what you own, like your cell phone or a physical token or something like that. So with two-factor authentication, you have to provide two out of the three categories in order to get access to whatever system it is you're trying to access. This could be a building, it could be a computer, it could be a specific piece of software. A pretty common version of this is a password and a one-time-use access code that the system then sends to your registered smartphone. So the idea here, of course, is that if someone were to get hold of your password, like, let's say you wrote it on a Post-it note or something, I've seen it happen, well, that person would still need to have your phone in order to access that account once they were prompted by the system. So, when implemented properly, two-factor authentication is a big boost in security. On a related note, you also have MFA, which stands for multi-factor authentication. That includes 2FA but could extend beyond 2FA for certain systems.
Speaker 1: And also, I know it can feel like a huge hassle to log into systems that require multi-factor authentication, but it really is a more secure method than using passwords alone, for example, particularly in a world where data breaches and poor security habits can lead to someone gaining unauthorized access to a computer or network. Next, we have ANSI. ANSI stands for American National Standards Institute. So if you're like me, you probably haven't traditionally spent a whole lot of time thinking about standards and where those standards come from. Now, standards are incredibly useful. They're what makes it possible for you to use stuff from totally different sources, like manufacturers, for example, or just companies in general, and have those things still work together. Imagine a world without standards. It'd be kind of like me in college. It would just be a total mess. You would be locked into ecosystems even more than you already are. Like, if every PC manufacturer went totally proprietary with their hardware, their firmware, and their software, you would be locked into that system.
Speaker 1: You would never be able to use anything from anywhere else. On a hardware-related level, like, not even computer hardware, just literal, like, hardware store hardware. Imagine that you had proprietary screws from one company and proprietary screwdrivers from a different company, and they're not at all compatible, and that you would have to have everything from the same manufacturer for it to work together. It would be a nightmare, and it would make things really difficult whenever you need to make repairs or add on to anything. So standards can apply to stuff like equipment or even processes and personnel. So standards can go far beyond just the physical stuff that we encounter or the types of software that we use. ANSI, the organization that verifies standards, dates back to nineteen eighteen. It doesn't actually establish standards itself. It's not like this organization gets together and says, we have the stone tablet that says all, uh, Phillips-head screwdrivers have to be this particular shape. It's not like that.
Speaker 1: Instead, it's an accreditation organization that evaluates and approves standards that have been developed by other entities. That process involves a lot of collaboration. So ANSI brings together all the various parties that are affected by the adoption of those standards so that they can hash it out. Ultimately, this leads to a more orderly marketplace. All that being said, there are still lots of companies that develop proprietary technologies and processes, but the vast majority of the stuff we depend upon has an underlying uniformity thanks to standards. Next, API and SDK. This is one of those where I've combined two different things. So API stands for application programming interface. SDK stands for software development kit. These two terms are frequently used together, but they are not interchangeable. An API is kind of like a software liaison. So it's a set of rules and tools that allow different pieces of software to communicate with each other. It helps developers create application software that can interoperate with some other piece of software or platform.
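One way to picture that liaison idea in code: one piece of software exposes a small, agreed-upon set of calls, and other programs interact only through those calls, never with the internals. Here's a toy Python sketch; the `ThermostatAPI` class and its methods are invented purely for illustration, not any real product's API.

```python
class ThermostatAPI:
    """The agreed-upon surface that other programs are allowed to call."""

    def __init__(self) -> None:
        self._celsius = 20.0  # internal state hidden behind the API

    def get_temperature(self) -> float:
        """Read the current setting through the public interface."""
        return self._celsius

    def set_temperature(self, celsius: float) -> None:
        """Change the setting; the API enforces its own rules."""
        if not 5.0 <= celsius <= 30.0:
            raise ValueError("temperature out of supported range")
        self._celsius = celsius

# A completely separate app only needs to know these two calls,
# not how the thermostat stores or validates anything internally.
thermostat = ThermostatAPI()
thermostat.set_temperature(22.5)
print(thermostat.get_temperature())  # 22.5
```

An SDK, by contrast, would bundle an interface like this together with documentation, sample code, and build tools.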
Speaker 1: So, for example, Facebook has an API that lets developers create apps that can tap into basic Facebook functionality. An SDK, on the other hand, is a more robust set of tools for the purposes of developing software. An SDK often has an API as part of the kit, so you can think of an API as a subset of the kind of tools that you find in an SDK. So an operating system might have an SDK, and that would allow developers to create software that could then run on top of that operating system, which, by the way, we often abbreviate to OS. So that's another bonus: OS stands for operating system. Next, AR. AR stands for augmented reality. This applies to technologies that use some form of computer-generated information to enhance our real-world experience in some way. Frequently, we think of this as a visual overlay of the world around us.
Speaker 1: So, for example, Google Glass, which had a transparent prism that was positioned in a way that was not directly in your view if you were looking straight ahead, but rather up just a little bit. So you'd glance up a bit and you could look at this prism, which was actually a screen that could display digital information. So it could give you, for example, step-by-step directions as you navigated around an environment. You would just glance up and see that, you know, in a hundred feet you would need to make a left turn. Other versions of AR use a camera to pick up on an image and then display it where you would be able to see some sort of effect related to that image. So one example of this is, like, an app where you would hold it up, your phone, you know, hold up your phone, so the camera is looking at, say, a movie poster, and then on your screen that movie poster suddenly becomes animated, and it's, you know, a pretty interesting effect. But you could have AR implementations that don't use visual elements at all.
Speaker 1: It just has to be a computer-generated sensory experience that enhances or augments what's going on in the world around you. AR is a type of mixed reality, which you can think of as existing on a spectrum. So on one end of the spectrum you have an experience that is heavily dependent upon real reality, and the computational elements are extremely light touch. And then on the other side of the spectrum you have experiences that are heavily dependent on a computer-generated reality. In fact, you could have some where the computer-generated reality is replacing almost everything of your real experience. Arguably, augmented reality started off as an entertainment experience pioneered by a guy named Morton Heilig back in the late nineteen fifties. He invented a machine called the Sensorama, the idea being that you would sit down in one of these machines. It's kind of like a, almost like a console-type thing.
Speaker 1: You would sit at it, and you would watch some form of movie or short film, and that would be augmented with other sensations. Like, the device would emit certain smells. Let's say that you're looking at video, or a film rather, of orange groves, and suddenly you can smell oranges. And you could even get haptic sensation, so tactile feedback. There would be little vibrating motors and stuff. And that was kind of the birth of augmented reality. As for the term itself, Thomas P. Caudell gets the credit for coining that term in nineteen ninety, back when he was working at Boeing. Next up, we've got ARPA slash DARPA. So ARPA, A-R-P-A, stands for Advanced Research Projects Agency, and DARPA, which is the same organization, just with a newer name, stands for Defense Advanced Research Projects Agency. That's the current name for the organization. This is the R and D arm of the United States Department of Defense. Now, this agency isn't actually a think tank filled with, like, labs and scientists playing with beakers and robots and aliens and stuff.
Speaker 1: Instead, it's an organization that's focused primarily on providing funding to other research organizations that are actually developing technologies that could potentially be useful in the cause of national defense. DARPA has played a huge role in the evolution of technologies like computer networks, autonomous cars, drone technology, and much more. ARPANET, a predecessor of the Internet, came out of an ARPA initiative, as the name implies. Many of the autonomous car projects in various companies can actually trace their history back to participants who were competing in one of the DARPA Grand Challenges for driverless vehicles. These challenges lay out really ambitious goals, and then various teams strive to achieve those while competing against other teams, and they're all going for a cash prize and, really, bragging rights for winning the whole thing.
Speaker 1: Now, it's important to remember that the chief role of DARPA is ultimately to fund projects that could potentially be used in a defense or military context, and the organization has been connected to some rather unsavory projects in the past, such as the use of herbicides in warfare, most notably Agent Orange, which is a highly toxic and carcinogenic compound that was used by the United States during the Vietnam War. Then we have ASCII, or A-S-C-I-I, which stands for American Standard Code for Information Interchange. It's a standard for how an eight-bit system represents numbers, letters, and certain symbols. So a bit is a binary digit, and it can have a value of zero or one. So each bit has two potential values, right? A bit can be either a zero or a one. If you have two bits, then you've got four potential values, which would be zero zero, zero one, one zero, or one one. When you get up to eight bits, you have two hundred fifty-six potential values.
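That doubling, where each extra bit multiplies the number of possible patterns by two, is easy to check in a few lines of Python (just a quick illustration, not anything from the episode):

```python
from itertools import product

def bit_patterns(n: int) -> list[str]:
    """Enumerate every possible value of an n-bit string."""
    return ["".join(bits) for bits in product("01", repeat=n)]

print(bit_patterns(2))        # ['00', '01', '10', '11'] — the four two-bit values
print(len(bit_patterns(8)))   # 256 — one pattern per ASCII byte value
```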
Speaker 1: The American Standards Association's X3 division created ASCII as a standard for representing various characters using binary, with these eight bits, or a byte if you prefer; a byte is eight bits. Strictly speaking, ASCII itself only defines seven-bit codes, which leaves the eighth bit of the byte to spare. The standard could then be used in bit-based computer systems, which allowed for electronic communication of these characters. So we have to remember that computers don't process language the same way we do. They typically process language in the form of machine code. Machine code, more often than not, means binary. That's something that we humans can't really handle very well. Machines can handle it very, very quickly. So ASCII was a way of translating binary into characters and vice versa. Very important when you're using computers to communicate between two different people. BASIC. All right, BASIC, believe it or not, is an acronym. It's not just a word. BASIC stands for Beginner's All-purpose Symbolic Instruction Code. It's a type of high-level programming language, and the developers of BASIC intended it to be a relatively easy-to-use programming language that computer science students could pick up pretty quickly.
Speaker 1: So what does high-level mean in this case? Well, machines, like I said, process information in machine code, and the most famous of this is binary. Machines can process binary code very, very quickly, but it's incredibly hard for humans to do the same. It's easy for me to say the letter H, for example, but if I have to look up the ASCII code for the letter H, that would be zero one one zero one zero zero zero. That's if I wanted to do a lowercase h, and an uppercase H is a totally different code. So it would very quickly become impossible for me to use this binary language to make any kind of meaningful set of instructions, like a program to run on a computer. So to make it easier for humans to program and work with computers, various very smart people have created programming languages. Now, some programming languages we refer to as being low-level languages. That means they are fairly close to machine code, and thus they're pretty hard for humans to work with.
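The lookup just described, where lowercase h and uppercase H get totally different codes, can be reproduced in a couple of lines of Python; the helper names here are my own:

```python
def char_to_bits(ch: str) -> str:
    """Render a character as its 8-bit ASCII pattern."""
    return format(ord(ch), "08b")

def bits_to_char(bits: str) -> str:
    """Recover the character that an 8-bit pattern stands for."""
    return chr(int(bits, 2))

print(char_to_bits("h"))          # 01101000 — the lowercase h code from above
print(char_to_bits("H"))          # 01001000 — uppercase H really is different
print(bits_to_char("01101000"))   # h — and the translation runs both ways
```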
Speaker 1: But other languages are more of an abstraction, and they are high-level, so they are easy, or at least easier, for humans to work with as they type out instructions to create a program. A compiler then takes that language and translates it into machine code for the computer to process. BASIC is one of the older modern computer languages, and one that goofs like yours truly messed around with when personal computers first became a thing. You can still program in BASIC, though there are far more sophisticated programming languages out there, of course. Next is BIOS, which stands for Basic Input/Output System. So if you look at a computer and you get a bit abstract, you realize there are some proverbial layers going on with your basic computer or computational device. You've got your actual circuitry, right? You've got the actual hardware through which information ultimately must be processed. These are the physical pathways that electricity can flow through. These are the transistors and the wires and processors and all that kind of stuff. But you've also got software. These are the programs that you run to create various outputs.
Speaker 1: Maybe it's a video game, maybe it's a word processor, maybe it's a web browser. These are chunks of code that respond to your input and create an output based on that. But there's got to be a layer that allows for software to interact with hardware, and that's kind of what BIOS is doing. It's a type of firmware, which is low-level software that interacts directly with the hardware. BIOS initiates the boot-up process, among other things, and the BIOS sets the boot priority, which is essentially a list that dictates the order in which processes may initiate upon a machine booting up. All right, so far we have covered only the As and the Bs, plus a number with 2FA. When we come back, we'll see what's next. It's, um, it's C, because that was a pun. I'm so sorry. We'll be right back.

Speaker 1: We're gonna start off the Cs with CAD, C-A-D. It's not just a scoundrel. It actually stands for computer-aided design, and as the name implies, this refers to the practice of using computers or computerized workstations to assist in the design of something.
Speaker 1: It could be in the design of electronics or architecture or mechanical systems or animation. I first learned about CAD approaches from my friend Michael in high school. He took a course in drafting, and that's where he first worked with CAD applications. Me? No, I never got into that, because ain't no computer that's been made that can aid me in the design enough to make something I make look good or be functional. Today, CAD is used in tons of industries, from aerospace to prosthetics to computer animation. Designers may work in a 2D format, that's two-dimensional, or they might use 3D models. It all depends on the specific implementation, you know, what they're using it for, and the program. Next is CAT, or C-A-T. In this case, I'm talking about CAT as in category, which we use to describe certain types of network cables, like Ethernet cables. These are a subset of twisted-pair cables. So let's just walk through that really quickly.
If you're familiar with electromagnetism, 324 00:21:19,840 --> 00:21:23,800 Speaker 1: you know that a current running through a conductor generates 325 00:21:23,840 --> 00:21:27,400 Speaker 1: a magnetic field, and you know that a fluctuating magnetic 326 00:21:27,440 --> 00:21:32,080 Speaker 1: field will induce a current to flow through a nearby conductor. 327 00:21:33,000 --> 00:21:34,959 Speaker 1: We can do a lot of cool stuff with that 328 00:21:35,119 --> 00:21:38,000 Speaker 1: because of that basic law of physics, but it also 329 00:21:38,080 --> 00:21:41,600 Speaker 1: means we have to take interference into account when we 330 00:21:41,640 --> 00:21:45,800 Speaker 1: build out electronics. If you had two unshielded conductors that 331 00:21:45,840 --> 00:21:49,320 Speaker 1: were near each other, the current flowing through conductor number 332 00:21:49,320 --> 00:21:53,560 Speaker 1: one would interfere with conductor number two. So one thing 333 00:21:53,600 --> 00:21:57,040 Speaker 1: you can do to limit this is you can insulate 334 00:21:57,119 --> 00:22:00,320 Speaker 1: the conductors. You can use a non conductive material to 335 00:22:00,440 --> 00:22:03,480 Speaker 1: coat those. But another thing you can do is you 336 00:22:03,480 --> 00:22:07,080 Speaker 1: can twist a pair of conductors together. That actually reduces 337 00:22:07,119 --> 00:22:11,119 Speaker 1: the interference between the two. Alexander Graham Bell discovered this 338 00:22:11,400 --> 00:22:14,639 Speaker 1: and used it when building out devices like the early telephone, 339 00:22:15,040 --> 00:22:20,440 Speaker 1: and in fact telephone wires use this particular approach. Let's 340 00:22:20,440 --> 00:22:24,280 Speaker 1: skip ahead to the nineteen nineties.
After the development of 341 00:23:24,600 --> 00:23:28,920 Speaker 1: Level one cables, which are used for telephone wires, and 342 00:23:29,080 --> 00:23:32,800 Speaker 1: Level two, which was used in early computer terminal systems, 343 00:23:32,840 --> 00:23:38,280 Speaker 1: particularly at places like IBM, twisted pair cables came in 344 00:23:38,320 --> 00:23:41,920 Speaker 1: a type called Category three or just Cat three cables, 345 00:23:42,119 --> 00:23:44,840 Speaker 1: and then went from there. Cat three cables allowed for 346 00:23:44,880 --> 00:23:48,399 Speaker 1: a bandwidth of sixteen megahertz of frequencies for the 347 00:23:48,440 --> 00:23:52,359 Speaker 1: purposes of data transmission. These days, most Ethernet cables are 348 00:23:52,400 --> 00:23:56,680 Speaker 1: actually Cat five E cables, which can transmit data at 349 00:23:56,760 --> 00:24:00,920 Speaker 1: up to gigabit speeds. There are other category cables 350 00:24:00,960 --> 00:24:03,200 Speaker 1: out there, some of which have yet to be ratified 351 00:24:03,240 --> 00:24:06,680 Speaker 1: by standards organizations. That also means that because they haven't 352 00:24:06,680 --> 00:24:12,200 Speaker 1: been ratified, there aren't that many equipment manufacturers that have created, 353 00:24:12,680 --> 00:24:17,280 Speaker 1: you know, actual devices that accept those kinds of cables, 354 00:24:17,320 --> 00:24:19,879 Speaker 1: because it could be a very expensive mistake to build 355 00:24:19,920 --> 00:24:23,680 Speaker 1: out stuff that is accepting a non standardized input. If 356 00:24:24,160 --> 00:24:30,280 Speaker 1: standards organizations never ratify specific implementations and declare them as standard, 357 00:24:31,080 --> 00:24:34,160 Speaker 1: you could end up having devices that have useless ports, 358 00:24:34,280 --> 00:24:39,800 Speaker 1: and that's just an expense that you didn't need to have.
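To put those category numbers side by side, here's a minimal lookup-table sketch. The Cat 3 and Cat 5e figures match the discussion above; the Cat 6 row is a commonly cited follow-on standard and is included here as an assumption, not something stated in the episode.

```python
# Twisted pair cable categories and their rated analog bandwidth.
# Cat 3 and Cat 5e figures come from the discussion above; Cat 6
# is an assumed, commonly cited follow-on entry for comparison.
CABLE_CATEGORIES = {
    "Cat 3":  {"bandwidth_mhz": 16,  "typical_max": "10 Mbps Ethernet"},
    "Cat 5e": {"bandwidth_mhz": 100, "typical_max": "1 Gbps Ethernet"},
    "Cat 6":  {"bandwidth_mhz": 250, "typical_max": "10 Gbps over short runs"},
}

def describe(category: str) -> str:
    """Return a one-line summary of a cable category's specs."""
    specs = CABLE_CATEGORIES[category]
    return f"{category}: {specs['bandwidth_mhz']} MHz, {specs['typical_max']}"

if __name__ == "__main__":
    for cat in CABLE_CATEGORIES:
        print(describe(cat))
```

The higher the rated bandwidth, the more signal headroom the cable has for faster data rates, which is why each new category supports faster Ethernet variants.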
Next, 359 00:23:39,840 --> 00:23:43,240 Speaker 1: we have CMOS, or C M O S. This 360 00:23:43,320 --> 00:23:47,760 Speaker 1: stands for complementary metal oxide semiconductor, which is a type 361 00:23:47,760 --> 00:23:51,840 Speaker 1: of semiconductor that says nice things about your outfit. Wait no, 362 00:23:51,920 --> 00:23:56,120 Speaker 1: I'm sorry, wait, I'm being told that's the wrong kind of complimentary. 363 00:23:56,280 --> 00:23:59,320 Speaker 1: With regard to computer chips, CMOS refers to a 364 00:23:59,400 --> 00:24:03,040 Speaker 1: chip that stores information about the hardware settings of the device. 365 00:24:03,680 --> 00:24:08,720 Speaker 1: So BIOS references CMOS when going through the booting process. 366 00:24:08,720 --> 00:24:12,159 Speaker 1: So CMOS and BIOS work together to bring a 367 00:24:12,200 --> 00:24:16,959 Speaker 1: computational device online and in proper working order upon booting up. 368 00:24:17,480 --> 00:24:21,280 Speaker 1: The memory on CMOS is dynamic, and technically it's 369 00:24:21,280 --> 00:24:24,840 Speaker 1: temporary, or volatile in other words. And if CMOS 370 00:24:24,840 --> 00:24:27,600 Speaker 1: were to ever go unpowered, like if the chip were 371 00:24:27,640 --> 00:24:30,720 Speaker 1: to be cut off from power, the memory 372 00:24:30,760 --> 00:24:33,879 Speaker 1: on that chip would just wipe out. It would be blank. 373 00:24:33,920 --> 00:24:36,879 Speaker 1: It would be erased, essentially.
But BIOS needs the 374 00:24:36,960 --> 00:24:41,359 Speaker 1: instructions from CMOS to boot properly, right? BIOS depends 375 00:24:41,440 --> 00:24:45,280 Speaker 1: on CMOS to essentially instruct the BIOS what order 376 00:24:45,359 --> 00:24:48,480 Speaker 1: to do stuff in. So, the CMOS chip relies 377 00:24:48,600 --> 00:24:51,880 Speaker 1: on a small battery to stay powered up even when 378 00:24:51,920 --> 00:24:54,240 Speaker 1: the computer itself has turned off, or if you lost 379 00:24:54,240 --> 00:24:58,000 Speaker 1: power or whatever. These batteries can last a really long time. 380 00:24:58,359 --> 00:25:02,320 Speaker 1: Ten years isn't unusual. Uh, it's typically at 381 00:25:02,400 --> 00:25:05,320 Speaker 1: least as long as the life cycle for the motherboard 382 00:25:05,480 --> 00:25:08,280 Speaker 1: of your computer. Most of the time you would actually 383 00:25:08,320 --> 00:25:11,160 Speaker 1: be ready to replace the whole device before you would 384 00:25:11,160 --> 00:25:13,840 Speaker 1: ever need to replace the CMOS battery, although there 385 00:25:13,840 --> 00:25:16,480 Speaker 1: are cases where people have had to do that. Now, 386 00:25:16,480 --> 00:25:18,840 Speaker 1: when you boot up a computer, you actually do have 387 00:25:18,880 --> 00:25:21,520 Speaker 1: a tool that allows you the option to either boot 388 00:25:21,600 --> 00:25:25,960 Speaker 1: into BIOS or CMOS. Booting into CMOS gives 389 00:25:25,960 --> 00:25:29,199 Speaker 1: you the chance to change CMOS settings, which in 390 00:25:29,240 --> 00:25:33,320 Speaker 1: turn will affect how BIOS handles the booting process in 391 00:25:33,359 --> 00:25:36,920 Speaker 1: the future. CMOS, by the way, is a PC term, 392 00:25:37,000 --> 00:25:41,919 Speaker 1: as in personal computer, not politically correct.
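The CMOS behavior described above, volatile settings kept alive by a battery, can be modeled as a toy sketch. This is purely illustrative: real CMOS and BIOS interaction happens at the firmware level, and the class name and settings here are made up for the example.

```python
class CmosStore:
    """Toy model of a battery-backed CMOS settings store.

    Illustrative only -- real CMOS is firmware-level hardware,
    and these setting names are hypothetical.
    """

    def __init__(self):
        # Settings the BIOS would consult at boot, e.g. boot priority.
        self.settings = {"boot_order": ["disk", "usb", "network"]}
        self.battery_ok = True

    def read_boot_order(self):
        """What the BIOS would fetch when deciding boot priority."""
        return self.settings.get("boot_order", [])

    def lose_power(self):
        # Volatile memory: cut power and the stored settings vanish.
        self.battery_ok = False
        self.settings = {}
```

The key behavior the sketch captures is the volatility: as long as the battery holds, the boot order survives a shutdown, but cutting power entirely wipes the store blank.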
In Apple Mac computers, 393 00:25:41,960 --> 00:25:45,600 Speaker 1: the equivalent is PRAM, or P RAM. That stands for 394 00:25:45,720 --> 00:25:49,159 Speaker 1: parameter RAM. But we're gonna talk about RAM in a 395 00:25:49,280 --> 00:25:52,440 Speaker 1: later episode, all right. I should also add that there 396 00:25:52,600 --> 00:25:55,639 Speaker 1: is another CMOS in tech, and that's the type 397 00:25:55,680 --> 00:25:59,960 Speaker 1: of active pixel sensor found in some digital cameras. 398 00:26:00,119 --> 00:26:04,120 Speaker 1: CMOS is just one type of these kinds of sensors, 399 00:26:04,160 --> 00:26:06,520 Speaker 1: and to go into how those sensors work would require 400 00:26:06,560 --> 00:26:09,800 Speaker 1: a pretty thorough explanation, and it's a full episode on its own, 401 00:26:09,800 --> 00:26:11,560 Speaker 1: so I'm going to leave that for the time being. 402 00:26:11,840 --> 00:26:14,920 Speaker 1: Just understand that there's that version too. It still stands 403 00:26:14,920 --> 00:26:16,840 Speaker 1: for the same thing, by the way. It's still complementary 404 00:26:16,880 --> 00:26:21,280 Speaker 1: metal oxide semiconductor, but it has a different purpose. Moving 405 00:26:21,320 --> 00:26:26,200 Speaker 1: on, CMS. Now, with regard to tech, CMS means content 406 00:26:26,480 --> 00:26:30,960 Speaker 1: management system. Typically, this is a framework within which users 407 00:26:31,000 --> 00:26:35,480 Speaker 1: can post, edit, and delete content, such as web content.
408 00:26:36,320 --> 00:26:39,000 Speaker 1: A CMS typically has a structure that allows for a 409 00:26:39,080 --> 00:26:42,760 Speaker 1: uniform approach to adding content to a pre existing system, 410 00:26:42,800 --> 00:26:45,679 Speaker 1: like, say, a website. That way, even someone who is 411 00:26:45,720 --> 00:26:48,639 Speaker 1: new to that environment can still post stuff that's in 412 00:26:48,680 --> 00:26:51,800 Speaker 1: line with the standards and protocols of the site. So 413 00:26:51,920 --> 00:26:55,119 Speaker 1: let's take an actual example and use my old employer, 414 00:26:55,280 --> 00:26:59,320 Speaker 1: HowStuffWorks.com. That site has a CMS 415 00:26:59,359 --> 00:27:03,040 Speaker 1: that allows people to create content in article format, including 416 00:27:03,080 --> 00:27:06,440 Speaker 1: the basics of how images show up on screen, where 417 00:27:06,480 --> 00:27:09,840 Speaker 1: captions should appear, and in what font, and all that 418 00:27:09,920 --> 00:27:12,080 Speaker 1: kind of stuff. It's sort of like a way to 419 00:27:12,160 --> 00:27:16,640 Speaker 1: create and manage templates and then post content within that template. 420 00:27:17,200 --> 00:27:19,439 Speaker 1: And back when I worked at HowStuffWorks, I 421 00:27:19,480 --> 00:27:22,320 Speaker 1: didn't have to use the CMS very much myself. I 422 00:27:22,320 --> 00:27:25,120 Speaker 1: would write my articles and then a publisher would take 423 00:27:25,119 --> 00:27:28,760 Speaker 1: the finished and edited product and then put it into 424 00:27:28,920 --> 00:27:32,880 Speaker 1: the CMS for publication. Our CMS also allowed publishers to set 425 00:27:32,880 --> 00:27:36,240 Speaker 1: a time for that publication, so that a finished piece 426 00:27:36,240 --> 00:27:39,000 Speaker 1: of copy could go live on the site at a 427 00:27:39,080 --> 00:27:44,040 Speaker 1: specific designated time.
So there was a content delivery application 428 00:27:44,320 --> 00:27:47,439 Speaker 1: or CDA, that would take the formatted content 429 00:27:47,560 --> 00:27:51,280 Speaker 1: and push it to go live. A good CMS will 430 00:27:51,320 --> 00:27:54,040 Speaker 1: have lots of features that make life easier for publishers, 431 00:27:54,119 --> 00:27:56,920 Speaker 1: like audit logs to keep track of changes that are 432 00:27:56,920 --> 00:28:00,479 Speaker 1: made to content, or a small server footprint so that 433 00:28:00,520 --> 00:28:03,400 Speaker 1: the CMS isn't taking too much space on a network. 434 00:28:03,760 --> 00:28:07,080 Speaker 1: Really good ones will have a very intuitive UI. That 435 00:28:07,119 --> 00:28:10,480 Speaker 1: stands for user interface. So a user interface is exactly 436 00:28:10,480 --> 00:28:12,040 Speaker 1: what it sounds like. It's the way in which a 437 00:28:12,160 --> 00:28:15,760 Speaker 1: user interacts with a technology. So that's another little bonus 438 00:28:15,840 --> 00:28:20,600 Speaker 1: initialism for you right there. Next is COPPA, C O 439 00:28:20,800 --> 00:28:23,680 Speaker 1: P P A. I covered this in a recent episode 440 00:28:23,720 --> 00:28:26,240 Speaker 1: of Tech Stuff, so I'm not gonna spend too much 441 00:28:26,280 --> 00:28:29,360 Speaker 1: time on it, but it stands for Children's Online Privacy 442 00:28:29,400 --> 00:28:32,960 Speaker 1: Protection Act. The US Congress passed this Act into law 443 00:28:33,000 --> 00:28:38,240 Speaker 1: in nineteen ninety eight. COPPA requires online sites and services that target 444 00:28:38,400 --> 00:28:41,280 Speaker 1: users who are under the age of thirteen to comply 445 00:28:41,440 --> 00:28:45,520 Speaker 1: with certain rules or else face civil lawsuits from entities 446 00:28:45,560 --> 00:28:49,360 Speaker 1: like the Federal Trade Commission or FTC.
Those rules state 447 00:28:49,440 --> 00:28:52,040 Speaker 1: that a site or service has to get the express 448 00:28:52,120 --> 00:28:55,840 Speaker 1: permission from a parent or guardian of a child before 449 00:28:55,920 --> 00:29:00,640 Speaker 1: they can collect that child's information. Further, the site or service 450 00:29:00,680 --> 00:29:03,600 Speaker 1: cannot collect any and all information about the kid. It 451 00:29:03,680 --> 00:29:07,200 Speaker 1: can't just like build out a comprehensive database of all 452 00:29:07,320 --> 00:29:10,640 Speaker 1: data points about that child. They are only supposed to 453 00:29:10,680 --> 00:29:14,160 Speaker 1: collect the information needed to provide whatever service it is 454 00:29:14,600 --> 00:29:18,240 Speaker 1: that the entity is providing. So, for example, if it's 455 00:29:18,280 --> 00:29:22,200 Speaker 1: a web based game, it can't be asking for all 456 00:29:22,240 --> 00:29:25,920 Speaker 1: the information about the kid's address and parents' names and 457 00:29:25,920 --> 00:29:28,080 Speaker 1: all that kind of stuff because it's not necessary in 458 00:29:28,160 --> 00:29:30,800 Speaker 1: order to just play the game. Also, these entities are 459 00:29:30,800 --> 00:29:33,800 Speaker 1: supposed to delete that information once the info is no 460 00:29:33,840 --> 00:29:38,160 Speaker 1: longer needed to provide that service. This is tied pretty 461 00:29:38,200 --> 00:29:41,320 Speaker 1: closely to the rules that the advertising industry set for 462 00:29:41,360 --> 00:29:46,440 Speaker 1: itself when it comes to marketing towards children. COPPA 463 00:29:46,520 --> 00:29:50,600 Speaker 1: made the news, uh, within YouTube circles because the platform 464 00:29:50,640 --> 00:29:54,600 Speaker 1: initiated some pretty big changes that had widespread effects on 465 00:29:54,920 --> 00:29:59,320 Speaker 1: content creators.
The short version is that YouTube requires creators 466 00:29:59,320 --> 00:30:03,200 Speaker 1: to designate whether their channels, or, on a more granular level, 467 00:30:03,720 --> 00:30:08,720 Speaker 1: their individual videos, are targeting kids specifically, and if so, 468 00:30:09,040 --> 00:30:11,720 Speaker 1: then many of the typical features that we've come to 469 00:30:11,800 --> 00:30:16,200 Speaker 1: expect on YouTube, you know, stuff like comments, uh, notifications, 470 00:30:16,280 --> 00:30:19,320 Speaker 1: merchandise links, that kind of stuff, all of that gets 471 00:30:19,400 --> 00:30:24,480 Speaker 1: turned off because of those strict rules about how sites 472 00:30:24,520 --> 00:30:28,760 Speaker 1: and services can collect information about kids or how they 473 00:30:28,760 --> 00:30:33,080 Speaker 1: can advertise to kids. So, in addition, that means personalized 474 00:30:33,120 --> 00:30:37,320 Speaker 1: ads are turned off automatically for any of those videos, or, 475 00:30:37,440 --> 00:30:40,040 Speaker 1: in the case of channels that are directed towards kids, 476 00:30:40,080 --> 00:30:43,720 Speaker 1: for the entire channel. That affects monetization. It means that 477 00:30:43,760 --> 00:30:47,440 Speaker 1: you get a lower level of revenue than you would 478 00:30:47,440 --> 00:30:51,120 Speaker 1: with personalized ads, and that means that creators will make 479 00:30:51,240 --> 00:30:54,240 Speaker 1: less money through those means. And for a lot of creators, 480 00:30:54,600 --> 00:30:58,000 Speaker 1: these changes raised questions about whether or not their channel, 481 00:30:58,640 --> 00:31:01,920 Speaker 1: which might be family friendly, might be tagged as 482 00:31:01,960 --> 00:31:05,000 Speaker 1: being explicitly targeting kids. They could say, well, no, I 483 00:31:05,040 --> 00:31:09,160 Speaker 1: don't target kids.
I mean, I don't make content that's 484 00:31:09,200 --> 00:31:13,280 Speaker 1: inappropriate for children, but I'm not specifically targeting children as 485 00:31:13,320 --> 00:31:16,000 Speaker 1: my audience. And so there are a lot of questions 486 00:31:16,000 --> 00:31:20,440 Speaker 1: about how do these different creators, you know, how do 487 00:31:20,520 --> 00:31:24,320 Speaker 1: they fit within this rule set. So far, it doesn't 488 00:31:24,320 --> 00:31:27,080 Speaker 1: seem as though these changes have turned YouTube upside down 489 00:31:27,160 --> 00:31:31,080 Speaker 1: or anything, but it is an ongoing dialogue between creators 490 00:31:31,160 --> 00:31:37,960 Speaker 1: and the platform itself. Next, we have CPA, CPC, CPL, 491 00:31:38,280 --> 00:31:42,160 Speaker 1: and CPM. So speaking of monetization, that's what all these 492 00:31:42,200 --> 00:31:46,080 Speaker 1: initialisms kind of relate to. The CP in each of 493 00:31:46,120 --> 00:31:50,960 Speaker 1: these stands for cost per. So you've got cost per 494 00:31:51,080 --> 00:31:54,680 Speaker 1: action for CPA, cost per click for CPC, 495 00:31:55,240 --> 00:31:59,160 Speaker 1: cost per lead for CPL, and cost per mille 496 00:31:59,440 --> 00:32:03,360 Speaker 1: for CPM. So all of this ties to advertising 497 00:32:03,400 --> 00:32:06,360 Speaker 1: and how ad deals are struck between content providers or 498 00:32:06,440 --> 00:32:11,719 Speaker 1: content platforms and the advertisers. So cost per action covers 499 00:32:11,760 --> 00:32:15,960 Speaker 1: an amount paid per specific action that's taken by users. 500 00:32:16,480 --> 00:32:19,480 Speaker 1: That action could be clicking on an ad, or it 501 00:32:19,520 --> 00:32:22,080 Speaker 1: could be submitting an online form, or it might go 502 00:32:22,200 --> 00:32:26,080 Speaker 1: so far as actually making a purchase.
So the agreement 503 00:32:26,120 --> 00:32:30,040 Speaker 1: here states that the advertiser will pay the content platform 504 00:32:30,280 --> 00:32:34,280 Speaker 1: a specific fee every time some user takes this very 505 00:32:34,320 --> 00:32:39,040 Speaker 1: particular action related to whatever the ad is. Cost per 506 00:32:39,120 --> 00:32:42,600 Speaker 1: click is really a subset of cost per action. It 507 00:32:42,680 --> 00:32:47,440 Speaker 1: specifically refers to the moment when someone clicks on an ad. 508 00:32:47,600 --> 00:32:50,760 Speaker 1: So this isn't just whether or not someone saw an ad. 509 00:32:51,000 --> 00:32:53,680 Speaker 1: It's not enough for it to just be an impression. 510 00:32:53,760 --> 00:32:56,440 Speaker 1: In other words, this is if a person saw the 511 00:32:56,520 --> 00:33:00,280 Speaker 1: ad and then acted by clicking through to see what 512 00:33:00,480 --> 00:33:04,400 Speaker 1: the ad links to. So Google ads in search results 513 00:33:04,480 --> 00:33:06,760 Speaker 1: typically fall into this category. If you ever do a 514 00:33:06,800 --> 00:33:09,280 Speaker 1: Google search and then you click on one of the 515 00:33:09,320 --> 00:33:12,000 Speaker 1: ad results, which are typically at the very top of 516 00:33:12,040 --> 00:33:15,680 Speaker 1: the list, that's likely counting towards a cost per click 517 00:33:15,920 --> 00:33:20,360 Speaker 1: revenue model. It doesn't affect you directly, it's just how 518 00:33:20,880 --> 00:33:25,360 Speaker 1: the money is changing hands at that advertiser platform level. 519 00:33:26,120 --> 00:33:29,040 Speaker 1: Then you have cost per lead or CPL. That's when 520 00:33:29,040 --> 00:33:32,440 Speaker 1: you're usually talking about a scenario in which someone is explicitly 521 00:33:32,600 --> 00:33:36,480 Speaker 1: signing up for an offer.
So if the ad leads 522 00:33:36,520 --> 00:33:39,600 Speaker 1: to someone signing up to get a newsletter or something 523 00:33:39,640 --> 00:33:42,880 Speaker 1: like that, that might be a cost per lead. And 524 00:33:42,960 --> 00:33:46,000 Speaker 1: like the previous examples, the advertiser will pay out a 525 00:33:46,000 --> 00:33:49,280 Speaker 1: certain amount of money for every user that actually follows 526 00:33:49,360 --> 00:33:52,760 Speaker 1: through and generates a lead. And yeah, leads in this 527 00:33:52,800 --> 00:33:57,160 Speaker 1: case essentially mean potential sales, as they typically represent someone 528 00:33:57,200 --> 00:34:00,720 Speaker 1: who is interested in a specific product or service. 529 00:34:00,760 --> 00:34:03,640 Speaker 1: And then finally you've got cost per mille. Mille 530 00:34:03,680 --> 00:34:07,640 Speaker 1: is the old Roman word for thousand, and this 531 00:34:07,720 --> 00:34:11,879 Speaker 1: is the impression model. So essentially CPM establishes a certain 532 00:34:11,920 --> 00:34:15,439 Speaker 1: amount of money that an advertiser will pay per one 533 00:34:15,560 --> 00:34:19,960 Speaker 1: thousand impressions or views. So if you've got a website 534 00:34:20,520 --> 00:34:24,919 Speaker 1: and your website displays ads that are all based on impressions, 535 00:34:25,280 --> 00:34:28,200 Speaker 1: the more people who come to visit your site, the 536 00:34:28,239 --> 00:34:31,880 Speaker 1: more money you'll make as those one thousand 537 00:34:32,120 --> 00:34:36,240 Speaker 1: impression blocks start to fill up. In addition, more popular 538 00:34:36,320 --> 00:34:40,439 Speaker 1: sites can actually require a higher CPM, so that means 539 00:34:40,440 --> 00:34:45,320 Speaker 1: that advertisers will actually pay more per one thousand impressions.
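The arithmetic behind these payout models is simple enough to sketch. The rates below are hypothetical, not real ad-market figures, and this is just a minimal illustration of how CPM and CPC payouts differ.

```python
def cpm_revenue(impressions: int, cpm_rate: float) -> float:
    """Impression-based payout: cpm_rate is paid per 1,000 impressions."""
    return impressions / 1000 * cpm_rate

def cpc_revenue(clicks: int, cpc_rate: float) -> float:
    """Click-based payout: cpc_rate is paid per individual click."""
    return clicks * cpc_rate

# A site serving 250,000 impressions at a hypothetical $4.00 CPM:
print(cpm_revenue(250_000, 4.00))  # 1000.0

# Versus 500 clicks at a hypothetical $0.50 per click:
print(cpc_revenue(500, 0.50))  # 250.0
```

Notice that under CPM, revenue scales with raw page views, which is exactly why the slideshow-style pages described later in this section exist: every extra page refresh is another impression feeding the count.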
540 00:34:45,880 --> 00:34:49,239 Speaker 1: These agreements typically have a set time limit on them, 541 00:34:49,239 --> 00:34:52,440 Speaker 1: so for example, you might have an ad deal that 542 00:34:52,520 --> 00:34:54,840 Speaker 1: lasts for three months, and at the end of the 543 00:34:54,880 --> 00:34:58,480 Speaker 1: three months you get paid according to whatever your CPM 544 00:34:58,560 --> 00:35:02,600 Speaker 1: rate is and how many people actually viewed that ad 545 00:35:02,719 --> 00:35:07,040 Speaker 1: within the three month period. CPM approaches have led to 546 00:35:07,120 --> 00:35:10,120 Speaker 1: some of the types of web pages that I personally dislike, 547 00:35:10,640 --> 00:35:14,600 Speaker 1: such as the slide show approach for listicles, where every 548 00:35:14,680 --> 00:35:17,760 Speaker 1: item on a list is its own web page. That's 549 00:35:17,800 --> 00:35:21,480 Speaker 1: done because moving from one slide to another counts as 550 00:35:21,480 --> 00:35:25,640 Speaker 1: a page refresh, which means you get another impression. So 551 00:35:25,920 --> 00:35:29,920 Speaker 1: one way web pages boost impression counts is by using 552 00:35:30,000 --> 00:35:33,719 Speaker 1: stuff like slide shows, galleries, and quizzes in order to 553 00:35:33,760 --> 00:35:36,920 Speaker 1: get those page views to go up. Also, a website 554 00:35:36,960 --> 00:35:39,359 Speaker 1: that can show that it has a higher page view 555 00:35:39,480 --> 00:35:42,640 Speaker 1: rate can demand a higher CPM rate, so it all 556 00:35:42,719 --> 00:35:45,879 Speaker 1: kind of feeds back on itself. We've got a few 557 00:35:45,880 --> 00:35:48,000 Speaker 1: more C's to go through, but I need to take 558 00:35:48,400 --> 00:35:59,399 Speaker 1: a really quick break. Okay, we're up to a big one.
559 00:35:59,560 --> 00:36:03,399 Speaker 1: CPU. This one is a basic term that maybe 560 00:36:03,480 --> 00:36:06,280 Speaker 1: all of you know, but just in case, it stands 561 00:36:06,280 --> 00:36:10,719 Speaker 1: for central processing unit. This is the logic center for 562 00:36:10,800 --> 00:36:15,200 Speaker 1: a computational device. It's the chip that performs basic operations 563 00:36:15,200 --> 00:36:19,000 Speaker 1: on data to generate results. So you could have a 564 00:36:19,040 --> 00:36:23,600 Speaker 1: program that's sending instructions and data to the CPU. Those 565 00:36:23,600 --> 00:36:27,280 Speaker 1: instructions might be as simple as add these two numbers together, 566 00:36:27,680 --> 00:36:31,040 Speaker 1: and then the CPU executes those instructions on the data 567 00:36:31,360 --> 00:36:33,960 Speaker 1: and then sends the output to wherever it's supposed to 568 00:36:33,960 --> 00:36:38,920 Speaker 1: go based on those instructions. CPUs handle general instructions, and 569 00:36:38,960 --> 00:36:42,280 Speaker 1: so they have to be pretty good at pretty much everything, 570 00:36:42,640 --> 00:36:45,080 Speaker 1: or at least they have to be passable at everything. 571 00:36:45,160 --> 00:36:49,279 Speaker 1: For a general purpose computer, they typically have an arithmetic 572 00:36:49,480 --> 00:36:53,160 Speaker 1: logic unit, or ALU, in them, which, as the 573 00:36:53,239 --> 00:36:57,920 Speaker 1: name suggests, is in charge of executing arithmetic operations on data. 574 00:36:58,640 --> 00:37:01,719 Speaker 1: ALU chips can exist on their own. They don't 575 00:37:01,760 --> 00:37:04,120 Speaker 1: have to be full CPUs, and in fact, in some 576 00:37:04,680 --> 00:37:08,279 Speaker 1: more, um, basic electronics you might just have an 577 00:37:08,440 --> 00:37:11,479 Speaker 1: ALU.
The CPU typically also has a control unit 578 00:37:11,560 --> 00:37:14,399 Speaker 1: that's in charge of coordinating things within the CPU, such 579 00:37:14,440 --> 00:37:18,280 Speaker 1: as fetching data from memory and then dictating the order 580 00:37:18,280 --> 00:37:21,600 Speaker 1: of operations that the CPU is supposed to follow. CPUs 581 00:37:21,640 --> 00:37:25,280 Speaker 1: operate at a specific rate of operations called the clock 582 00:37:25,360 --> 00:37:28,160 Speaker 1: rate or clock speed. You can think of this as 583 00:37:28,440 --> 00:37:32,920 Speaker 1: how many basic instructions the CPU is able to execute 584 00:37:33,000 --> 00:37:36,719 Speaker 1: in a second. We measure this in hertz, or cycles 585 00:37:36,840 --> 00:37:40,200 Speaker 1: per second. So a CPU that operates on the mega 586 00:37:40,280 --> 00:37:44,120 Speaker 1: hertz scale is executing basic instructions at a rate of 587 00:37:44,280 --> 00:37:47,360 Speaker 1: millions per second, though these days that would be slow. 588 00:37:47,560 --> 00:37:51,120 Speaker 1: Your basic CPUs today operate on the giga 589 00:37:51,320 --> 00:37:56,040 Speaker 1: hertz scale, so we're talking billions of basic instructions every second. Some 590 00:37:56,200 --> 00:38:01,680 Speaker 1: operations require more than one step of instructions, and the 591 00:38:01,840 --> 00:38:06,239 Speaker 1: faster the clock rate, the faster the CPU can execute instructions. 592 00:38:06,560 --> 00:38:11,000 Speaker 1: Generally speaking, the practice of overclocking refers to boosting a 593 00:38:11,080 --> 00:38:15,200 Speaker 1: CPU's clock rate beyond whatever the factory set limit for 594 00:38:15,239 --> 00:38:18,280 Speaker 1: that CPU happens to be. It's kind of like removing 595 00:38:18,280 --> 00:38:21,080 Speaker 1: any sort of limitation device from a car so that 596 00:38:21,080 --> 00:38:24,440 Speaker 1: it can actually go faster than its rated top speed.
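The clock-rate arithmetic above reduces to a quick back-of-the-envelope calculation. This is a deliberate simplification: real CPUs pipeline and overlap work, so the cycles-per-instruction value here is a hypothetical average, not a fixed hardware property.

```python
# Rough instruction throughput from a clock rate.
# Simplification: assumes a fixed average number of clock cycles
# per instruction, which real pipelined CPUs don't strictly have.

MHZ = 1_000_000          # megahertz: millions of cycles per second
GHZ = 1_000_000_000      # gigahertz: billions of cycles per second

def instructions_per_second(clock_hz: float,
                            cycles_per_instruction: float = 1.0) -> float:
    """Cycles per second divided by average cycles per instruction."""
    return clock_hz / cycles_per_instruction

# A hypothetical 3 GHz CPU averaging one cycle per instruction:
print(instructions_per_second(3 * GHZ))  # 3000000000.0

# The same clock, but with instructions averaging two cycles each:
print(instructions_per_second(3 * GHZ, cycles_per_instruction=2))  # 1500000000.0
```

This also shows why a raw clock number alone doesn't settle a performance comparison: two chips at the same clock rate can retire very different numbers of instructions per second.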
597 00:38:25,600 --> 00:38:30,560 Speaker 1: On another note, while early CPUs used a single core architecture, 598 00:38:31,000 --> 00:38:33,840 Speaker 1: it's pretty common these days for computers to have multi 599 00:38:33,920 --> 00:38:36,480 Speaker 1: core processors. You can sort of think of these as 600 00:38:37,480 --> 00:38:41,840 Speaker 1: slightly smaller CPUs that all work together. For certain types 601 00:38:41,840 --> 00:38:46,719 Speaker 1: of computational problems, the multi core approach greatly speeds up 602 00:38:46,719 --> 00:38:51,080 Speaker 1: processing by breaking those problems up into different components. This 603 00:38:51,200 --> 00:38:55,680 Speaker 1: doesn't work for every computational problem, however, and so a 604 00:38:55,760 --> 00:39:00,120 Speaker 1: multi core processor may sometimes not match a single core 605 00:39:00,239 --> 00:39:04,400 Speaker 1: processor of a similar clock rate for a specific subset 606 00:39:04,400 --> 00:39:07,520 Speaker 1: of computational problems. Getting into all of that would require 607 00:39:07,760 --> 00:39:10,359 Speaker 1: a full episode of its own, so we'll leave it for now. 608 00:39:10,680 --> 00:39:13,120 Speaker 1: Just know that most CPUs out there these days are 609 00:39:13,239 --> 00:39:17,440 Speaker 1: multi core processors, and for the vast majority of types 610 00:39:17,760 --> 00:39:22,560 Speaker 1: of software that we typical users run, that's fine. It's 611 00:39:22,600 --> 00:39:26,879 Speaker 1: perfectly cromulent, as the Simpsons would say. Next, we have 612 00:39:27,120 --> 00:39:31,680 Speaker 1: CRT. So in the context of technology, I'm 613 00:39:31,719 --> 00:39:35,680 Speaker 1: talking about cathode ray tubes. This refers to the old 614 00:39:35,800 --> 00:39:39,360 Speaker 1: style of computer monitors and displays and even television sets. 615 00:39:39,800 --> 00:39:43,680 Speaker 1: These devices are big, bulky displays.
They aren't just wide 616 00:40:43,680 --> 00:40:47,440 Speaker 1: and tall like flat panel displays. They have depth, so 617 00:40:47,480 --> 00:40:53,160 Speaker 1: in our flat screen world they look really clumsy and bulky. 618 00:40:53,239 --> 00:40:56,719 Speaker 1: They're also incredibly heavy. Oh and they also have very 619 00:40:56,760 --> 00:41:00,360 Speaker 1: powerful capacitors inside them that can hold on to a 620 00:41:00,440 --> 00:41:04,719 Speaker 1: latent electric charge that makes them potentially very dangerous if 621 00:41:04,719 --> 00:41:07,960 Speaker 1: you were to ever break one, So don't do that, 622 00:41:08,239 --> 00:41:12,160 Speaker 1: because you could get electrocuted or at least suffer a 623 00:41:12,320 --> 00:41:17,400 Speaker 1: really serious shock. Anyway, the cathode ray tube refers to 624 00:41:17,440 --> 00:41:21,320 Speaker 1: a component inside these displays that is in many ways 625 00:41:21,520 --> 00:41:24,960 Speaker 1: similar to a light bulb. So you've got a tube 626 00:41:25,040 --> 00:41:27,960 Speaker 1: inside of which is a filament that is suspended in 627 00:41:28,120 --> 00:41:32,399 Speaker 1: a vacuum. So inside the tube is a vacuum. Electricity 628 00:41:32,440 --> 00:41:35,040 Speaker 1: can flow through the filament, which then causes the filament 629 00:41:35,080 --> 00:41:38,279 Speaker 1: to start to give off electrons. Frequently, we call this 630 00:41:38,360 --> 00:41:41,319 Speaker 1: an electron gun because of how it gives off and 631 00:41:41,360 --> 00:41:46,120 Speaker 1: then directs electrons to hit the backside of a fluorescent screen.
632 00:40:46,560 --> 00:40:50,680 Speaker 1: The impact of the electrons on those fluorescent components causes 633 00:40:50,719 --> 00:40:55,200 Speaker 1: those components to, you know, fluoresce, or glow, and on 634 00:40:55,239 --> 00:40:58,439 Speaker 1: the flip side, we see those as pixels of light 635 00:40:58,560 --> 00:41:02,040 Speaker 1: on these types of displays. So these electron guns are 636 00:41:02,080 --> 00:41:07,720 Speaker 1: consistently scanning across the backs of these screens and generating 637 00:41:07,760 --> 00:41:10,680 Speaker 1: the images that we see, whether it's a television or computer 638 00:41:10,719 --> 00:41:14,080 Speaker 1: monitor or display or whatever. These days, CRTs 639 00:41:14,360 --> 00:41:16,839 Speaker 1: are a rarity. You still find them with some 640 00:41:17,000 --> 00:41:20,520 Speaker 1: legacy systems, and folks who have old working televisions may 641 00:41:20,600 --> 00:41:23,800 Speaker 1: still be using them, though with some pretty big limitations, 642 00:41:23,840 --> 00:41:26,200 Speaker 1: but for the most part they have been replaced by 643 00:41:26,200 --> 00:41:29,880 Speaker 1: other types of tech. Next, we have CSS, 644 00:41:30,160 --> 00:41:35,160 Speaker 1: that's cascading style sheets.
This is a style sheet language, 645 00:41:35,320 --> 00:41:38,960 Speaker 1: and that doesn't really help very much for most of us, 646 00:41:39,000 --> 00:41:41,840 Speaker 1: I think. But for the web it means that you 647 00:41:41,880 --> 00:41:46,800 Speaker 1: can use CSS to create the formatting style for information 648 00:41:46,840 --> 00:41:49,719 Speaker 1: that will be displayed on the web, and you can 649 00:41:49,760 --> 00:41:54,080 Speaker 1: separate the format, that is, the way that things are 650 00:41:54,520 --> 00:41:57,919 Speaker 1: displayed within a browser, and you can separate that from 651 00:41:58,000 --> 00:42:01,799 Speaker 1: the content, as in the actual stuff that's being displayed. 652 00:42:01,960 --> 00:42:03,640 Speaker 1: So in the old days, if you wanted to create 653 00:42:03,680 --> 00:42:06,759 Speaker 1: a web page, you had to code everything in. You 654 00:42:06,800 --> 00:42:09,800 Speaker 1: had to set whatever the background color of the page 655 00:42:09,840 --> 00:42:12,560 Speaker 1: was going to be, the text color, the font, 656 00:42:13,000 --> 00:42:18,600 Speaker 1: the table formats, font styles and sizes, layout styles 657 00:42:18,600 --> 00:42:21,319 Speaker 1: and more, and it was a lot of work. CSS 658 00:42:21,440 --> 00:42:23,920 Speaker 1: allows developers to create what is sort of like a 659 00:42:24,080 --> 00:42:29,960 Speaker 1: format template. Any content that uses that CSS format will 660 00:42:30,120 --> 00:42:32,319 Speaker 1: end up fitting that template once you publish it to 661 00:42:32,320 --> 00:42:35,640 Speaker 1: the web, and you could port that content to a 662 00:42:35,719 --> 00:42:39,520 Speaker 1: different CSS sheet and it would end up looking totally different. 663 00:42:40,280 --> 00:42:43,239 Speaker 1: The word cascading here is used to describe a sort 664 00:42:43,280 --> 00:42:46,120 Speaker 1: of order of operations.
You can think of it 665 00:42:46,160 --> 00:42:50,040 Speaker 1: as an if-then kind of approach, such as, if 666 00:42:50,360 --> 00:42:53,680 Speaker 1: this web page is being viewed on a mobile device, 667 00:42:54,200 --> 00:42:59,520 Speaker 1: then use this specific layout scheme. However, if the content 668 00:42:59,680 --> 00:43:02,360 Speaker 1: is being viewed through a web browser on, say, a 669 00:43:02,400 --> 00:43:07,439 Speaker 1: desktop computer, then use this other layout scheme that's optimized 670 00:43:07,520 --> 00:43:11,440 Speaker 1: for that. Since different rules might apply depending upon the 671 00:43:11,520 --> 00:43:18,280 Speaker 1: specific circumstances, the operations cascade in priority based on those circumstances. 672 00:43:18,280 --> 00:43:21,120 Speaker 1: So it's really about removing a lot of the work 673 00:43:21,239 --> 00:43:22,920 Speaker 1: that you would have to do if you were to 674 00:43:22,960 --> 00:43:26,239 Speaker 1: do all of this by hand. CSS, by the way, 675 00:43:26,320 --> 00:43:29,520 Speaker 1: is one of the foundational elements of the World Wide Web, 676 00:43:29,719 --> 00:43:32,239 Speaker 1: and we'll talk about another one in a future episode. 677 00:43:32,400 --> 00:43:35,680 Speaker 1: And spoiler alert, that one is HTML. But we've got 678 00:43:35,680 --> 00:43:37,160 Speaker 1: a long way to go. We've got a lot more 679 00:43:37,239 --> 00:43:39,920 Speaker 1: letters in the alphabet before we get to H. Next, 680 00:43:39,960 --> 00:43:43,160 Speaker 1: we have DAW, and this isn't just what I say 681 00:43:43,160 --> 00:43:45,640 Speaker 1: when I see a cute puppy dog. Well, okay, 682 00:43:46,120 --> 00:43:49,880 Speaker 1: I do also say that. No, DAW stands for 683 00:43:50,200 --> 00:43:54,680 Speaker 1: digital audio workstation. So this is what audio editors and 684 00:43:54,760 --> 00:43:58,319 Speaker 1: engineers use to work on digital audio files.
A DAW 685 00:43:58,480 --> 00:44:01,839 Speaker 1: can be a selection of physical equipment, so it can 686 00:44:01,880 --> 00:44:04,520 Speaker 1: be like a big bank of controls, complete with lots 687 00:44:04,520 --> 00:44:08,200 Speaker 1: of, you know, knobs and buttons and sliders. Or it 688 00:44:08,239 --> 00:44:11,080 Speaker 1: can consist of software in which all of those physical 689 00:44:11,120 --> 00:44:14,600 Speaker 1: controls are essentially virtualized. Or it could be a combination 690 00:44:14,640 --> 00:44:19,439 Speaker 1: of the two. Podcasters use DAWs to record and edit 691 00:44:19,480 --> 00:44:22,960 Speaker 1: their content. Most DAWs have tons of options to let 692 00:44:22,960 --> 00:44:26,719 Speaker 1: you manipulate audio files in various ways. It might mean 693 00:44:27,120 --> 00:44:31,200 Speaker 1: adding reverb to a selection, or it might mean changing 694 00:44:31,320 --> 00:44:36,120 Speaker 1: the pitch of the audio or the speed of the 695 00:44:36,160 --> 00:44:39,400 Speaker 1: audio playback. And those are just tiny examples and some 696 00:44:39,480 --> 00:44:42,480 Speaker 1: of the more overt features you'll find with DAWs. There are 697 00:44:42,520 --> 00:44:45,759 Speaker 1: some that are incredibly subtle, and you might not even 698 00:44:45,800 --> 00:44:47,680 Speaker 1: pick up on them, but, you know, producers who have 699 00:44:47,719 --> 00:44:51,000 Speaker 1: been working in the field for years will immediately recognize them. 700 00:44:51,040 --> 00:44:54,719 Speaker 1: The one DAW that most of our producers tend to use, 701 00:44:54,800 --> 00:44:57,160 Speaker 1: not all of them, but most of them, is called 702 00:44:57,280 --> 00:45:01,520 Speaker 1: Audition from Adobe.
Some producers wouldn't even call Audition 703 00:45:01,600 --> 00:45:05,920 Speaker 1: a DAW, simply because it lacks support that, say, musicians 704 00:45:05,920 --> 00:45:09,920 Speaker 1: would rely upon, such as native support for MIDI integrations. 705 00:45:10,480 --> 00:45:14,080 Speaker 1: MIDI, or M I D I, is another initialism that 706 00:45:14,120 --> 00:45:17,680 Speaker 1: we will cover in a future episode. Next, we have 707 00:45:18,200 --> 00:45:22,719 Speaker 1: DLC. This stands for downloadable content, and typically 708 00:45:22,760 --> 00:45:26,960 Speaker 1: this refers to additional content that embellishes an existing piece 709 00:45:26,960 --> 00:45:31,720 Speaker 1: of software, most notably in video games, but not exclusively. 710 00:45:32,320 --> 00:45:35,240 Speaker 1: DLC is a way for publishers to create and sell 711 00:45:35,400 --> 00:45:39,480 Speaker 1: expansions to existing pieces of software, but it doesn't require 712 00:45:39,520 --> 00:45:42,680 Speaker 1: developers to go in and create an all-new version 713 00:45:42,920 --> 00:45:46,160 Speaker 1: of the stuff, right? They can rely heavily on existing 714 00:45:46,239 --> 00:45:49,960 Speaker 1: assets to build out these additional features. So in the 715 00:45:50,080 --> 00:45:54,200 Speaker 1: video game world, you'll frequently see DLC used to flesh 716 00:45:54,280 --> 00:45:59,000 Speaker 1: out fictional worlds or create new storylines or levels or 717 00:45:59,080 --> 00:46:02,280 Speaker 1: missions for the player to follow. But sometimes DLC 718 00:46:02,400 --> 00:46:06,600 Speaker 1: might include purely cosmetic changes, or it'll include content that's, 719 00:46:06,880 --> 00:46:10,520 Speaker 1: you know, tangential to the gameplay. It doesn't actually 720 00:46:10,560 --> 00:46:14,080 Speaker 1: represent more gameplay, but just kind of augments what's already 721 00:46:14,080 --> 00:46:17,560 Speaker 1: been there.
Just like games in general, DLC can be 722 00:46:17,600 --> 00:46:20,400 Speaker 1: done really well or it can be done poorly. For 723 00:46:20,560 --> 00:46:23,799 Speaker 1: most gamers, I would say, good DLC is typically seen 724 00:46:23,840 --> 00:46:27,520 Speaker 1: as something that's priced appropriately. Typically you're talking about something 725 00:46:27,520 --> 00:46:31,160 Speaker 1: that's priced below the price for a full game and 726 00:46:31,520 --> 00:46:34,399 Speaker 1: provides a satisfying experience on top of whatever the main 727 00:46:34,480 --> 00:46:38,040 Speaker 1: game is. Bad DLC might be viewed as being too 728 00:46:38,080 --> 00:46:42,600 Speaker 1: expensive or just containing superfluous content that doesn't really add anything. 729 00:46:43,360 --> 00:46:47,080 Speaker 1: DLC can extend the life cycle of a game title. 730 00:46:47,320 --> 00:46:51,360 Speaker 1: Some games can remain relevant years after their initial release 731 00:46:51,360 --> 00:46:55,440 Speaker 1: because of DLC, and some games, like the Hitman games, 732 00:46:55,480 --> 00:46:59,160 Speaker 1: the most recent ones, for example, can create entire revenue 733 00:46:59,200 --> 00:47:04,040 Speaker 1: models based around DLC that just consistently adds new content 734 00:47:04,320 --> 00:47:07,640 Speaker 1: to an older game. Well, I think that's a good 735 00:47:07,680 --> 00:47:10,080 Speaker 1: place for us to leave off. We've got a lot 736 00:47:10,160 --> 00:47:12,919 Speaker 1: more to cover. Obviously, we're just now in the Ds and we've 737 00:47:12,920 --> 00:47:16,680 Speaker 1: still got a few Ds to go. Beyond that, we 738 00:47:16,719 --> 00:47:19,319 Speaker 1: have the rest of the alphabet.
But I think that 739 00:47:19,400 --> 00:47:22,680 Speaker 1: this is a really useful approach to kind of understanding 740 00:47:23,239 --> 00:47:27,000 Speaker 1: some terms that you're gonna encounter as you navigate the 741 00:47:27,000 --> 00:47:29,600 Speaker 1: world of tech, and not all of them are intuitive, 742 00:47:30,000 --> 00:47:32,560 Speaker 1: and a lot of them can actually lead to people 743 00:47:32,680 --> 00:47:37,000 Speaker 1: thinking that the initials stand for totally different stuff, and 744 00:47:37,040 --> 00:47:38,920 Speaker 1: it can be very confusing. So I find that this 745 00:47:39,000 --> 00:47:42,319 Speaker 1: sort of approach is good to build an understanding and 746 00:47:42,320 --> 00:47:47,160 Speaker 1: a contextualization around tech, and I always find that to 747 00:47:47,239 --> 00:47:52,440 Speaker 1: be particularly useful. So we will continue this in Wednesday's 748 00:47:52,480 --> 00:47:56,960 Speaker 1: episode and probably beyond that, because unless I just get 749 00:47:57,040 --> 00:48:02,080 Speaker 1: extremely efficient with descriptions, we'll have a lot more to 750 00:48:02,120 --> 00:48:05,120 Speaker 1: go beyond that. But I think that this is a 751 00:48:05,160 --> 00:48:09,040 Speaker 1: really useful path to go down. If you have suggestions 752 00:48:09,080 --> 00:48:11,960 Speaker 1: for topics I should cover in future episodes of tech Stuff, 753 00:48:11,960 --> 00:48:15,080 Speaker 1: whether it's elaboration on any of these terms or something 754 00:48:15,600 --> 00:48:18,120 Speaker 1: just interesting in the tech world, reach out to me. 755 00:48:18,360 --> 00:48:20,440 Speaker 1: The best place to do that is on Twitter. The 756 00:48:20,480 --> 00:48:23,920 Speaker 1: handle we use is TechStuffHSW, and 757 00:48:23,920 --> 00:48:33,080 Speaker 1: I'll talk to you again really soon. Tech Stuff is 758 00:48:33,080 --> 00:48:36,200 Speaker 1: an I Heart Radio production.
For more podcasts from I 759 00:48:36,360 --> 00:48:39,960 Speaker 1: Heart Radio, visit the I Heart Radio app, Apple Podcasts, 760 00:48:40,080 --> 00:48:42,040 Speaker 1: or wherever you listen to your favorite shows.