[eerie ambience] 1999. A once-distant year that still sounded like science fiction was rapidly coming to a close as the year 2000 approached, with massive “millennium celebrations” planned to greet the 2000s with excessive partying. But looking beyond the street festivals and boundless optimism, there was an undercurrent of pessimism and anxiety surrounding a computer glitch known as the Millennium Bug, the Year 2000 Problem, or simply: Y2K. It was set to occur at the stroke of midnight, capable of instigating all manner of disruptions ranging from the mundane to the apocalyptic. And yet, once the new year arrived with few newsworthy problems, many assumed the Y2K threat had been exaggerated, or was even an outright hoax. What happened?

"With the millennium only hours away, many people are working down to the last minute to fend off the Y2K Bug."
"It's been called the world's most important extermination job: eliminating the Year 2000 computer bug. The federal government now has ten response centers across the country staffed 'round the clock."
"The chance of major dislocation in our economy, major dislocation in our standard of living, is very low. I would certainly agree that it's below 10 percent."
"When agencies are saying they're making good progress, they're 99% compliant, they're gonna be there, they have every assurance that they'll be ready in time. Lemme translate that to one phrase: they're not ready yet."
"These global issues are the direct result of an equally real human oversight many people now refer to as the Y2K, or Year 2000, problem."
"This is a problem. Things are gonna be broken, the electricity may be broken. We will have to be patient while it's being fixed. And y'know what? While it's being fixed, we might actually enjoy some family time."

This is LGR Tech Tales, where we take a look at noteworthy stories of technological inspiration, failure, and everything in-between.
This episode tells the tale of the infamous Y2K Bug and the crescendo of panic that ensued. So what was the Year 2000 Problem to begin with? Well Y2K, or “Century Date Change” as it was known for a while, can be most easily described as a calendar problem. For decades, computer programmers abbreviated the four digits of each calendar year down to the last two digits, so the year 1965 would be truncated to ‘65,’ for example. While humans had the common sense and context to understand a two-digit year, the actual computers might take this at face value, meaning that the year 2000 could be misinterpreted by a computer as the year 1900, deeming 01/01/00 to be an earlier date than 12/31/99.

In hindsight, this seems like an easily avoidable problem, but when this shortcut was first implemented it didn’t pose any immediate threat. In fact, it made complete sense! Some of the first widely-used computers in the United States, like the IBM 1401 introduced in 1959, were programmed using paper punch cards. These punch cards were a holdover from the electro-mechanical tabulating and accounting machines that computers replaced, and operators of those had often punched in 6-digit date codes in the interest of saving time and space. So when it came time to start using computer programs instead of accounting machines, the same 6-digit date code carried over as well. Then once the venerable IBM System/360 came along in 1964, its operating system continued using 6-digit dates to maintain backwards compatibility when emulating IBM 1400 programs. These abbreviated dates also had the welcome effect of saving a couple of bytes of data per punch card, slightly decreasing the number of cards per program and lowering memory requirements. Consider that a single two-kilobyte board of core memory could cost thousands of dollars, around 1 US dollar per bit. And a 5-megabyte hard disk cost tens of thousands more, so it made sense to truncate programs any way possible.
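The comparison failure described above is easy to demonstrate. Here is a minimal sketch in Python (purely illustrative, not the period COBOL) of how records stamped with the 6-digit YYMMDD date code misbehave once '00' arrives:

```python
# Two records stamped with the 6-digit YYMMDD date code described above.
new_years_eve = "991231"  # December 31, 1999
new_years_day = "000101"  # January 1, 2000

# Sorting by the date field puts the new year *before* New Year's Eve,
# because the comparison only sees '00' < '99':
print(sorted([new_years_eve, new_years_day]))
# ['000101', '991231'] -- the year 2000 sorts as if it were 1900

# Elapsed-time math (ages, interest, late fees) goes negative the same way:
years_elapsed = int(new_years_day[:2]) - int(new_years_eve[:2])
print(years_elapsed)  # -99 instead of the expected 1
```

The same arithmetic applies whether the two digits live on a punch card, in a database field, or in a BIOS clock: the missing century is the whole problem.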
Sure, 6-digit years were a tad vague, but big deal! No way anyone would be running 1960s computer programs forty years later! But that’s exactly what happened, especially in the realms of government, finance, and infrastructure, where bureaucracy was high and change was slow. And it’s not that computer programmers didn’t think ahead: this “Century Date Change,” or “CDC,” problem, as it was called, was noticed early on. All the way back in 1958, computing pioneer Bob Bemer was one of the first to address the issue publicly while he was an employee at IBM. He was working on a genealogy program in conjunction with the Mormon Church, and their researchers needed to distinguish between the years 1900, 1800, 1700, and so on. Two-digit years made this impossible, so Mr. Bemer came up with a way of programming full 4-digit year references in COBOL. Logically, the two-digit shortcut could become a problem once the year 2000 arrived, but despite Mr. Bemer repeatedly voicing his concerns for years, the practice of two-digit year entry continued.

It was considered such a non-issue that on November 1, 1968, Federal Information Processing Standards Publication 4, “The Standard For Calendar Date,” was introduced. This outlined how all information exchange between US government agencies would use the now-standard 6-digit date, ensuring its usage far into the future. And again, this didn’t go unnoticed. Another IBM employee, Peter de Jager, came to his own realization about the Year 2000 Problem in 1977 while testing a new banking system. He recognized the looming glitch, persistently voiced concern to his bosses, and like those before him was very much ignored. It simply wasn’t a big enough deal yet, and fixing so much code would be costly and time-consuming, and nobody wanted to do that. A bit more credence was finally given to the idea in 1984, with the release of “Computers in Crisis” by Jerome and Marilyn Murray.
In this book, the Murrays laid out the history of the problem, what might happen if it were left unresolved, and how to address the flaws in 6-digit date programming, complete with hundreds of pages showing examples of corrected computer code. Still, even as recognition of the problem rose among programmers and computer users heading into the 1990s, it wasn’t until the rise of fledgling internet services that the Century Date Change problem truly gained traction. Arguably, a contributing factor is the term “Y2K” itself, something far catchier than “Century Date Change.” The coining of the term is usually attributed to a programmer by the name of David Eddy, who first used “Y2K” to refer to the problem in an email sent on June 12, 1995, with the term spreading virally from there. But beyond the catchy name, what finally made governments, companies, and the public sit up and take notice of Y2K in the mid-90s? The answer is simple: panic. Never underestimate the power of public panic!

"Power outages, water outages. My main worry is the energy grid. And if we can't get power, we can't get water. So it's something that is totally unpredictable."
"I think there are individual banks that will probably go bankrupt. There are individual credit unions that will disappear over this issue. Some people will die."
"The sooner you start your preparations, the better your opportunity to get the supplies you need at reasonable prices."

News reports, websites, TV specials, radio shows, newspaper opinion pieces: the Y2K coverage grew exponentially starting in 1996 as the new millennium drew ever closer, and IT professionals warned with increasing volume of the impending problems that would occur. By 1997, many experts were declaring that a point-of-no-return “time bomb” was approaching, and who knew what chaos might ensue if things weren’t soon addressed.
Especially when it came to computers that were now a couple of decades old, like those used in electrical grids, air traffic control, social security, emergency response systems, banking and financial institutions, hospitals and hospice centers, oil refineries and gas processing, and of course, all those scary-looking nuclear power plants and missile silos. The more the news coverage increased, the more people started wondering: what if everything turned off, all at once, on January 1st? Or worse, what if the glitch made systems go haywire, dropping planes from the sky and launching nukes at random?

Soon, Century Date Change bills were enacted at state and local levels, bringing in tech firms to assist in rewriting old code. Former COBOL programmers were brought out of retirement to help fix their own programs from the 60s and 70s. Individual contractors were hired by countless institutions and businesses, starting at $1500 a day in 1997. Governments, corporations, and small businesses around the globe were finally taking Y2K dead seriously in 1998, with an estimated 300 billion to half a trillion dollars spent globally once it was all said and done.

Of course, that was just the response on an official level: all those large institutions still using decades-old hardware and software. You’ll note that personal computers haven’t really been brought up as a major concern, and with good reason: a sizable share of home computers were never going to be that affected by Y2K, relative to minicomputers and mainframes. Even going back to the early Macintosh and IBM PC systems from the 1980s, those accepted the full 4-digit year 2000, no problem. Granted, there were countless PC clones, many of which used two-digit years in the BIOS, as well as the bigger problem where your computer and OS were compatible but your older software wasn’t.
A good number of applications still only recognized the last two digits, but by the late 90s there were very few of those programs still in use, and what remained likely wasn’t controlling anything *vital* to society. Windows, Mac OS, UNIX, even MS-DOS: none of these systems were ever at much risk, especially by the mid-to-late ‘90s when the public finally started caring. And larger companies like Microsoft made sure to let users know there were updates available for their outdated software, even offering a free Year 2000 Resource CD-ROM to help users better understand the situation regarding PCs.

Still, overarching Y2K anxieties led tons of folks to get the wrong idea anyway. The prevalence of “Y2K Compliant” labels all over the place probably didn’t help, leading to a misguided notion that anything remotely computer-related could stop working. Computers themselves were of course labeled this way, but so was software that was never at risk in the first place. Bicycle Rummy? Oh sure, that’s Y2K compatible, why not? If an item used a microchip, or heck, used electricity in some way, slap a Y2K sticker on it! Cash registers, KVM switches, otoscopes, digital scales. Of course, remember to turn your computers off before midnight too, because reasons. Thanks, Best Buy.

Despite the responsibility falling largely on the shoulders of governments and corporations, rather than individuals, this didn’t stop the onslaught of folks capitalizing on the hysteria by peddling personal preparedness. There were countless Y2K books, many of them centered around post-apocalyptic survivalism and off-the-grid living. Seminars were held that pitched the need for prepping at work and at home. Y2K News Magazine was printed for a couple of years, detailing Y2K-related facts and rumors on a bi-weekly basis.
Outdoor and military supply stores bundled food, stoves, and lanterns together and sold them as “Y2K Survival Kits.” Cleverly named online stores popped up, like Y2Kmart, billing themselves as “one-stop disaster shops.” Folks extra-worried about personal safety went into stores asking retailers for, quote, “Y2K guns,” leading to a spike in US weapons sales. Amish Mennonite business owners saw a notable rise in sales due to their offerings of analog technology. People began stocking up underground bunkers and fallout shelters in numbers not seen since the height of the Cold War. Cookbooks were released focusing on food preparation in the absence of electricity and clean water. Collectibles were sold cute-ifying the Millennium Bug as a soft, plushie insect. Y2K Survival Shows were hosted on county fairgrounds, providing a kind of end-times family event. Computer hardware makers sold add-on cards promising to make your BIOS Y2K compatible. Software developers released applications for analyzing and updating PCs for potential glitches.

Certain religious leaders and churches were hailing the new millennium as the beginning of the end times as written in the book of Revelation. Mock movie posters were sold that combined mid-century horror movie tropes and Y2K apocalypse fears. Actual Y2K movies got made, like Y2K: The Movie, a 1999 made-for-TV disaster flick. Somehow Leonard Nimoy got roped into hosting an hour-long Y2K scare-a-thon released to VHS. Even musicians couldn’t resist making Y2K-themed songs, like “The Millennium Bug” by Terry Breen. And naturally, comedians, comic strips, and TV shows all joined in poking fun at the whole shebang.

"Happy new -- whaa? Oh no, it's happening!" [electrical zapping] [tires squealing, cars crashing]

So with all the years of Millennium Bug silliness and scare tactics, it’s no surprise that many folks were still expecting the worst. But once the day finally arrived and 1999 turned to 2000, Y2K was more or less a no-show.
Power kept flowing. Planes kept flying. Cars kept driving. Banks kept banking. Even computers that weren’t turned off before midnight did just fine! There were a number of issues though, even if largely limited to brief, localized interruptions. A building in South Korea lost its heating for a few hours. Several people in the UK were given incorrect medical test results due to age miscalculation. Three dialysis machines in Egypt needed to be reset at midnight. And some credit and debit card terminals delayed transactions by a few days. Not to mention a few more lighthearted blunders, like a guy who returned a rental videotape to the store only to be met with a $91,250 late fee for being 100 years late. But the vast majority of Y2K bugs were incredibly minor, mostly just digital clocks and calendars showing 1900 instead of 2000 before being promptly fixed. According to the US Senate’s Y2K aftermath report, despite all those incidents, quote, “no major problems were experienced in the U.S. or worldwide during the millennium date change.”

So, hooray! Job well done, crisis averted! Everyone thanked all those skilled programmers, right? Eh, not quite. The skeptics and the defenders alike began sounding off immediately, as exemplified by the comments posted to the BBC’s Talking Point page on January 1st. This discourse only grew over the coming years, with retrospectives analyzing all the warnings and preparation, wondering if it was all overblown. Make no mistake though: Y2K was a real problem that needed to be fixed, and people put in countless hours to fix it. But the problem with thoroughly fixing something is that, to an outsider, it looks like nothing was fixed at all. So on the one hand, the Millennium Bug was an unquestionably big problem, with a massive quantity of unseen work happening that resulted in a disaster dodged.
On the other hand, it’s clear that an unwarranted amount of fear, uncertainty, and doubt was being sown and reaped by those looking to make a quick buck. To quote tech journalist Robert Cringely: “I believe a terrific amount of Y2K fraud took place. There was a lot of money that was spent and it wasn’t visible. The question is whether the right work was done, and my guess is probably about half that money was just wasted.” Indeed, no one knows precisely how much of the prepwork was truly necessary, and how much was at best an honest overreaction, or at worst a cynical cash-grab. And unfortunately, those bad actors often received the spotlight instead of the workers grinding away fixing the actual problems. And thus Y2K ended up being more of a punchline, a cliché “end of the world” trope that the public was all too happy to move on from.

But it wasn’t an all-out hoax either. The Century Date Change problem was real, and a lot of the fixes were necessary. Additionally, computer hardware and national infrastructure updates made to prevent Y2K chaos ended up having benefits beyond the Year 2000 Problem. As just one example, New York City’s infrastructure overhaul for Y2K has been credited with helping the city deal with the 9/11 terrorist attacks. The enhancements and failsafes installed to prepare for Y2K ended up being invaluable to first responders in 2001 when power, communication, and rail lines were destroyed.

On the flip side, some of the Y2K fixes caused issues that only arose decades later, in what became known as the Y2020 Bug. It turns out up to 80% of Millennium Bug fixes relied on a method called “the pivot year” or “windowing,” making computers interpret two-digit years from ‘00’ through ‘19’ as the 2000s. Once the year 2020 came about, anything relying on this temporary fix immediately broke, including parking meters, point-of-sale terminals, and even the pro wrestling video game WWE 2K20.
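That pivot trick can be sketched in a few lines of Python. This is an illustration only: real systems picked various pivot values, and the cutoff of 20 here is chosen to match the 2020 failures the episode describes.

```python
# "Windowing" / pivot-year fix: two-digit years stay in storage, and a pivot
# decides which century each one belongs to at read time.
PIVOT = 20  # two-digit years 00-19 -> 2000s, 20-99 -> 1900s (assumed cutoff)

def expand_year(yy: int, pivot: int = PIVOT) -> int:
    """Expand a two-digit year to four digits using a pivot window."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(5))   # 2005 -- dates after the millennium now compare correctly
print(expand_year(99))  # 1999 -- and old records still resolve to the 1900s
print(expand_year(20))  # 1920 -- but the fix silently expires once '20' arrives
```

The appeal in 1999 was that no stored data had to change, only the read path; the cost was a new, quieter time bomb set for whenever the window closed.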
So sure, Y2K is easy to look back on and joke about, with all the laughable products and absurd apocalyptic predictions. But it shouldn’t be dismissed entirely as a scam either, even with all the scamming and poor decision-making that occurred. “The public perception perpetrated by the media that this was a hoax has done a great disservice to the industry,” said Peter de Jager in 2010. “Organizations did not spend $300 billion worldwide because someone said there was a problem. Nobody is that gullible. They spent $300 billion because they tested their systems with ‘00’ dates and the systems stopped working.” Let’s hope the media, the industry, and the public alike have learned their lessons then, because calendar-based computer problems continue to crop up, and there’s no telling what kind of technological silliness lies over the horizon.

[LGR-made synth beats commence] If you liked this episode of LGR Tech Tales, or if you’ve got some particularly pertinent Y2K memories to share, feel free to lemme know in the comments! Seeing that kinda thing really helps encourage more stuff like this in the future. And as always, thank you very much for watching!
LGR Tech Tales - Y2K: The Year 2000 Problem (subtitles published by 林宜悉 on January 14, 2021)