[Introduction] DAVID MALAN: This is CS50, Harvard University's introduction to the intellectual enterprises of computer science and the art of programming. My name is David Malan, and if you are among those in the room who are thinking, why am I in a computer science class, realize that I too felt that exact same way. In fact, my freshman year, I didn't quite get up the nerve to take this class, or computer science more generally, and that was largely because I was intimidated by it. I was a little nervous. It felt well out of my comfort zone. And I really didn't know, at the end of the day, what it actually was. But realize, if you too are feeling a little bit of that, or even if you're among those more comfortable who have dabbled in computer science or programming, that there are so many blanks we can fill in along the way so that ultimately, at the end of the semester, everyone will find themselves on the same page. And until then, rest assured that 68% of the people sitting to your left and to your right and behind and in front have never taken a CS course before, which may very well be the demographic into which you fit. But realize, too, that with such an amazing support structure, with so many office hours and sections and materials and beyond, what's ultimately important in this course is not so much where you end up relative to your classmates in week 10, our final week, but where you end up relative to yourself in week zero. And indeed, that is where we now are. As it turns out, computer scientists start counting at zero. And so over the next 11 weeks, we will take you from being among those less comfortable, or perhaps somewhere in between less comfortable and more, to feeling much more comfortable and confident and capable than that. But to get there, we need to understand what computer science really is. And this was something I didn't understand until I set foot in a room like this.
And I dare say we can distill computer science into just this picture. Computer science is about problem solving. And I know that high school courses typically kind of paint a misleading picture that it's only, and entirely, about programming, and people with their heads down in the computer lab working fairly antisocially on code. But the reality is it's all about solving problems, and very often, solving problems collaboratively, either in person or by leveraging code, programs that others have written in the past. And what does it mean to solve a problem? Well, you need inputs. So there's a problem you're trying to solve. That is the input. And you want output. You want the solution to that problem. And the sort of secret sauce of computer science is going to be everything in this proverbial black box in the middle over the next several weeks, where you begin to understand exactly what you can do with that. But in order to start solving problems, we kind of just need to decide as a group how we're going to represent these problems. And what might a problem be? Well, in this room, there's a whole bunch of people. If we wanted to take attendance or count the number of people in this room, I might need to start keeping track of how many people I see. But how do I represent the number of people I see? Well, I can do it sort of old school and just take out a piece of chalk or whatnot and say, all right, I see 1, 2, 3, 4, 5. I can do little stylistic conventions like that to save space and remind myself. 6, 7, 8, 9, 10, and so forth. Or I can, of course, just do that on my own hand. So 1, 2, 3, 4, 5, and so forth. But obviously, how high can I count on just one hand? So 5, you would think, but that's just because we haven't really thought hard enough about this problem.
It turns out that with just these five fingers, let alone these five more, I can actually count rather higher, because after all, the system I'm using, hash marks on the board or, just now, my fingers, is really just putting marks up one at a time to represent ones. But what if I actually took into account the order of my fingers and sort of permuted them, so to speak, so that it's really patterns of fingers that represent the number of people in the room, and not just the mere presence of a finger going up or down? In other words, this can remain zero. This could still be one. But what if two is not just this, the obvious? What if it's just this, raising only my second finger? What if, then, three is this? So we have 0, 1, 2, 3. That's going to lead us to four somewhat offensively. But if we jump ahead to five, I might now permute this finger and this finger up. And if I want to now represent six, I could do this. And now seven. In other words, I've expressed so many more patterns on my hand already, and if we keep doing this, I can actually represent, painfully perhaps, 32 different patterns, and therefore 32 different people, on my hands alone. Or 31 people, if I start counting at zero. So what's the relationship, and how did we even get here? Well, it turns out that computers are kind of simplistic, much like our hands here. At the end of the day, your computer is plugged into the wall or it's got a battery, so it either has or it does not have electricity. At the end of the day, that is the physical resource that drives these things and our phones and all of technology today. So if there is either electricity or not, that kind of maps nicely to no finger or yes finger. And indeed, computers, as you probably know, only speak what language? What alphabet, so to speak? Yeah. Binary. Bi meaning two.
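The finger-permuting idea above can be sketched in a few lines of Python. This is not code from the lecture, just an illustration: treat each of the five fingers as a binary digit, down for 0 and up for 1, and enumerate every possible hand.

```python
# Each of 5 fingers is one binary digit: down = 0, up = 1.
# Enumerating every combination shows how many distinct "hands" exist.
from itertools import product

patterns = list(product([0, 1], repeat=5))
print(len(patterns))  # 32 distinct patterns, i.e. the values 0 through 31
```

Five fingers, each with two states, give 2 to the 5th, or 32, patterns, which is exactly why one hand can count from 0 up to 31.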
And indeed, that refers to the fact that in binary, in computers, you only have two digits, zero and one. We humans, of course, have 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, and then we can combine those to count even higher. But computers only have 0 and 1, and then that's it. Because at the end of the day, there's actually a direct mapping between power being off and it being a zero, or power being on and it being a one, some electrons or whatever flowing from your battery or from the wall. So this is why computers tend to speak only binary: at the end of the day, it just maps really cleanly to what it is that's powering them in the first place. But how is this actually useful? If computers only have zeros and ones, how can they do anything useful? Well, think about our human world, where you might have this pattern of symbols. This is decimal, dec meaning 10, because you have 0 through 9. And this is, of course, 123. But why? If you haven't thought about this in quite some time, this is really just a pattern of three symbols, the shapes, or glyphs, 1 and 2 and 3 on the screen. But we humans, ever since grade school, have been ascribing meaning to each of these digits' positions, right? If you think back, this is the ones column, this is the tens column, this is the hundreds column, and so forth, and we could keep going. And so why does this pattern, one, two, three, mean 123? Well, it's because all of us sort of intuitively nowadays are just quickly doing in our heads 100 times 1 plus 10 times 2 plus 1 times 3, which of course gives us 100 plus 20 plus 3, and then the number we know mathematically as 123. But we're all doing this so quickly, we don't really think about it anymore. Well, computers work fundamentally the same way. They don't have as many digits, 0 through 9, as we do. They only have zeros and ones. And so if they were to store values, you're only going to see zeros and ones on the screen, but those zeros and ones just mean different things.
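The grade-school place-value arithmetic just described can be written out explicitly. A minimal sketch, again purely illustrative, not from the lecture itself:

```python
# Reading "123" the way we all do implicitly: each column is a power of 10.
value = 100 * 1 + 10 * 2 + 1 * 3
print(value)  # 123

# More generally, a digit d in position i (counting from the right, starting
# at 0) contributes d * 10**i to the total:
digits = [1, 2, 3]
value = sum(d * 10 ** i for i, d in enumerate(reversed(digits)))
print(value)  # 123
```

The second form makes the pattern explicit: swap the base 10 for a base 2 and the exact same formula reads binary, which is the step the lecture takes next.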
Instead of having a ones place, a tens, a hundreds, they're going to have a ones place, a twos place, a fours place, and then eights and 16s and beyond. Now, why? Well, 1 and 10 and 100, it turns out, are powers of 10. 10 to the 0 is technically 1. 10 to the 1 is just 10. 10 to the 2 is 100. And that's why you have ones, tens, hundreds, thousands, and so forth. Computers, analogously, are using powers of 2. Not surprising. Binary-- two. So if you only have ones, twos, and fours as your placeholders, and a computer were storing these digits-- 0, 0, 0-- that computer is presumably storing what number, so far as we humans understand it? Well, that's how a computer would store zero. If a computer is storing literally 0, 0, 0, just like in our human world, that also is 0, but that's technically because it's 4 times 0 plus 2 times 0 plus 1 times 0, which is obviously zero. Meanwhile, if a computer is storing not 0, 0, 0, but instead 0, 0, 1 in binary, what does that map to in decimal? So that's one. And now, why, if we change this to 0, 1, 0, is this two? Well, mathematically, for the exact same reasons. And so earlier, I had five fingers, but if you consider just my first three, when I did this, holding up one finger, I was representing two. And if I wanted to represent three, recall that I put up my first finger as well. And so the reason that could nicely represent three is because all I was doing with my human hand was counting in binary. And I could keep counting more and more and more. And so if I have five fingers, or five bits-- bit meaning binary digit-- I could count up, it turns out, if we do the math, as high as 31, starting at zero. It's going to be hard to physically do that, but we could. So why is this useful? At the end of the day, a computer, therefore, can represent any number of values, from 0 to 1 to 2 to 3 to some number much, much, much higher than that. All it needs is enough bits, enough zeros and ones.
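The binary place-value arithmetic above can be sketched as a small helper, a hypothetical function written for this illustration, not something from the course's own code:

```python
def bits_to_decimal(bits):
    """Interpret a list of binary digits (most significant bit first) as a decimal number."""
    value = 0
    for bit in bits:
        value = value * 2 + bit  # shift everything left one place, then add the new bit
    return value

print(bits_to_decimal([0, 0, 0]))        # 0
print(bits_to_decimal([0, 0, 1]))        # 1
print(bits_to_decimal([0, 1, 0]))        # 2
print(bits_to_decimal([1, 1, 1]))        # 7, i.e. 4 + 2 + 1
print(bits_to_decimal([1, 1, 1, 1, 1]))  # 31, the highest value five bits can hold
```

Note that the five-bit case tops out at 31, matching the hand-counting claim: 32 patterns, but the values run from 0 through 31 because counting starts at zero.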
Well, what are those bits? Well, all of us these days have in our phones sources of light, for instance. So I could actually say that this physical device right now-- might be a little hard to tell-- does have a flashlight, and it's technically off at the moment. But if I turn this flashlight on, thereby using some of the electricity, now I'm storing a one. And so the phone is on. Now, it's off. Now, it's on. And if I see-- can I borrow someone's phone real quick? May I? OK. And flashlight. How do I turn on the flashlight? Oh. Shake it. That's OK. OK. Thank you. Oh. Thank you. OK. So this is great. Now, I can count higher. Now, what number does this represent if I have two light bulbs, or two switches, on at the moment? Yeah. Three. Because I have a one and I have a two, which of course is going to end up equaling three. And if I pick up a third phone somehow, I could count even higher. Technically, if I had three light bulbs on-- one, one, one-- what would that value be? Seven. Because it's a four plus a two plus a one, and so forth. Thank you so much for the spontaneity. So why does this not lead to limitations for us? I can count in decimal as high as I want. I can now count in binary as high as I want, so long as I have enough bits. But how do I actually represent other information? Well, if I want to represent something like a letter, how do I get there? If computers only have electricity in them and they use binary to count, and yet somehow they're much more useful than just doing math-- they can have text messages and e-mails and websites and videos and more-- how do we get from zeros and ones to letters? Well, we-- yeah. Sorry. A little louder. Yeah. We just need to kind of relate the numbers to letters. In other words, all the people in this room just need to decide at
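That agreement, assigning a number to each letter, is in practice the ASCII standard (and, today, Unicode, which extends it). As an illustration, Python exposes the agreed-upon mapping directly via its built-in `ord` and `chr` functions:

```python
# ord() looks up the ASCII/Unicode number the world agreed to use for a
# character; chr() goes in the opposite direction.
print(ord("A"))  # 65, the number that, by convention, represents capital A
print(chr(66))   # B
# And 65 itself is, underneath, just a pattern of bits:
print(bin(65))   # 0b1000001
```

So a letter on screen is a number by convention, and that number is zeros and ones by the same place-value arithmetic used for counting above.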