Computer movies suck.
Not like I needed to tell you that or anything, but it's true. In an era when PCs are household appliances, video game consoles sell like hotcakes, and everyone's grandma knows what the Internet is, why does Hollywood insist on making incredibly stupid computer-based movies that insult your intelligence?
I don't know. Nobody knows, except maybe the screenwriters or the Hollywood suits. Some of the errors are negligible, like referring to websites that don't exist ("Deep Impact"); other errors are intentional fantasy that is appropriate to the story's vision, and thus forgivable ("Tron"). But most of them are terribly, terribly wrong--a blatant slap in the face to those who work to improve the public's perception of computers. It's frustrating, it's degrading, and worst of all, it's just plain ignorant. Technophiles have a hard time with ignorance.
Let's look at a partial list of computer-related stuff that is always misinterpreted (or just plain ludicrous) in mainstream entertainment:
Let's stop for a bit of trivia: All of the above are in the same X Files episode written by William Gibson, father of Cyberpunk. There is an acerbic observation in there somewhere, but I'll leave you to discover it.
[MJ observation, 2004: In a frighteningly typical example of life imitating idiotic art, most of the above are actually observable on a regular basis now. Frightening, isn't it?]
Of course, we can expand well beyond that X Files episode. I'm sure you've also seen:
exxus> Find all matches for males 30 to 40 years of age who have ever used the word 'Fargus' in a phone conversation.
exxus> [21 matches]
To quote danielr@bigfoot.com: "Me and my friends still get a laugh out of a scene in 'The Fly' where Jeff Goldblum types on his computer 'computer: analyze brundlefly' - after many beeps and boops and flashing lights it explains in english something or other. I love the fact that he types in 'computer:' as if the message might mistakenly be misdirected to the dishwasher."
[MJ observation, 2004: I have to admit, a few weeks back I was nearly fooled into believing that someone had actually invented such an amazing video technology. I saw a small (120x300 pixel) picture of someone on a website that I thought I recognized, so using Internet Explorer's annoying image bar, I zoomed in on the picture. Much to my surprise, the picture was actually clearer after zooming in. I zoomed in again, and surprise, surprise, it was even clearer. I kept zooming until I could see the pores on this guy's face! I was getting all set to be amazingly impressed with this new compression technology when I realized that this 120x300 pixel picture was actually an extremely high quality, 1200x3000, 4.9 MB JPEG resized through an HTML <img> tag! I should have been suspicious when this bare-bones page took nearly a minute to load.]
It's insulting to anyone who has even a passing knowledge of computing.
Want an example of good computer fiction? WarGames. Yes, the entire second half of the film was silly, but the first half, where Matthew Broderick's character actually uses his computer, was very realistic. You can see a real CP/M machine, real 8" floppy disks (remember those? Yes, they did exist), a 300-baud acoustic-coupler modem (the kind you had to shove your phone handset into), and a character-based serial terminal. And the most realistic thing in that movie was how Matthew's character stole passwords from the school secretary's desk drawer. That is how it happens in real life, not through some super-hacker-password-cracker program. It's possible that a lot of the realism in WarGames had to do with the fact that nobody knew enough about computers yet to make it inaccurate. Now every screenwriter thinks he's a geek. Feh.
Tron also holds a special place in my heart, simply because it knows it is pure fiction--it doesn't try to hold itself to the real world, but instead uses computing as the basis for an entirely different plane of existence. The screenwriter's husband was a mainframe programmer and served as a technical consultant to the film (his influence is definitely in there--all command and program names are 4 letters long or less ;-), so what few ties they attempt to make to reality are done tastefully. And -- here's some ironic foreshadowing -- an "old-timer" has a conversation with a "suit" that is eerily similar to the current state of business computing versus hobbyist computing (or Microsoft's gestapo-like hold on the industry--take your pick). Listen to this if you don't believe me; use an MP3 player if you have to.
Roger Ebert once wrote that it was a grave mistake for trade specialists to see movies about their trade: it's impossible to suspend disbelief when you see your own field misrepresented. As a result, doctors don't like medical dramas, lawyers don't like courtroom dramas, firefighters don't like fire-fighting movies ("Backdraft", "Firestorm", etc.), and so on. It's a good point, but we live in a day and age where computers are household items. Appliances, like a television. You don't see televisions misrepresented in movies, do you? Or telephones, or cars? It is entirely possible to make decent, engaging, thrilling, and even non-fictional computer movies; read The Cuckoo's Egg if you need a proof of concept. All you need to do is stop treating your audience like they're lobotomized. How about some computer terms used correctly? Or talking computers that actually talk using a speech synthesizer (or better yet, have a legitimate need to talk using a speech synthesizer)? Or some source code on display screens that's relevant to the story? These are not unreasonable demands!
Oh, well. Even when it's right, it's wrong. In The Terminator, the assembly code you see scrolling by every so often in Arnold's red heads-up display came from the RWTS (read/write track/sector) routines in DOS 3.3 on the Apple ][ computer. Of course, I find it hard to believe that a cybernetic killing machine with lightning-quick reflexes and a complete database of human anatomy used a 1 MHz 6502 chip in his skull, but maybe he was running an Apple emulator in his spare cycles for fun.
Epilogue: "You've Got Mail", a modern take on "The Shop Around The Corner", came out shortly after this essay was written in 1998. In it, they show people communicating through (big surprise here) AOL email. For the first time ever, email text was shown in its original form on the screen, with the camera zoomed way in so that the audience could read it. It was a step in the right direction, but as of 2003 things haven't gotten much better. In fact, in Charlie's Angels, Drew Barrymore barges in on two kids playing a Final Fantasy game together. The Final Fantasy they were playing is a single-player-only game, so there was no reason for both of them to be holding controllers. Sigh...
Well, it's still happening. I may be adding little updates to this essay for the rest of my life.
November, 2003: Even TV shows are getting dumbed down. CSI: Miami, a crime drama that takes place in modern times, recently profiled a crime that involved a webcam'd peep show. While I took a ton of computer reality violations with a ton of salt (blatantly invalid IP addresses, impossibly long WiFi distances, etc.), the one that just drove me nuts was when they tried to determine what company (name, phone number, address) was behind a particular IP address. "Hold on," the computer tech says, and after churning through random IP addresses for nearly 10 seconds, we get the output of a "whois" command. What was with all the IP addresses? Why not just type the damn whois command and get the output? I can understand entertainment making complex things simple for the benefit of the audience, but when they make a simple thing complex I just want to tear my own skin off. At least that experience was mitigated by one very clever use of computers: determining the time of death. You see, when the guy was murdered, his head hit the keyboard, and he was in a word processor at the time. You figure it out :-)
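(For the record, the lookup the show stretched into ten seconds of scrolling IP addresses is a single query against a registry's whois server. Here's a minimal sketch, in Python, of what the whois command is actually doing under the hood; the server name and the documentation-only example address are just placeholders for illustration.)

# Minimal sketch: what a "whois" lookup for an IP address boils down to.
# Per the whois protocol (RFC 3912): connect to a registry's whois server
# on TCP port 43, send the query, and read the plain-text reply.
# whois.arin.net and 192.0.2.1 (a documentation-only address) are
# illustrative choices, not the show's actual query.
import socket

def whois(query, server="whois.arin.net", port=43):
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall((query + "\r\n").encode("ascii"))
        reply = []
        while True:
            data = sock.recv(4096)
            if not data:              # server closes the connection when it's done
                break
            reply.append(data)
    return b"".join(reply).decode("utf-8", errors="replace")

print(whois("192.0.2.1"))   # organization, address, phone--no IP churning required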
December, 2003: Okay, I've had it with CSI. Someone used "face manipulation software" to artificially age a young girl's face from an old photo. This in and of itself is not a problem; such software/technology is already available and in use by law enforcement. What pissed me off royally is that whenever an adjustment was made to the face, there were floppy disk drive noises and dot matrix printer noises sounding from somewhere. DOT FUCKING MATRIX NOISES. Does your computer sound like a rabid 9-pin dot matrix printer whenever you move the mouse in Photoshop? No, I didn't think so. This is so incredibly stupid that I've prepared a RealVideo clip of the segment so that you can see for yourself just how asinine this looked and sounded. See if you can also spot how all the photos used in the reconstruction had smiling, toothy mouths, while the end result had no smile or teeth whatsoever. Or the fact that it took all of about two minutes to perform the work, when in real life it takes hours or days.
October, 2005: Someone pointed me to this additional list of Computer Movie Annoyances. Cute.
January, 2010: It turns out that such wacky computer interfaces are intentional. Doesn't make it right!