A few days ago on StackOverflow, someone posted the question, “How can I keep Task Manager from killing my program?” The first comment asks a very good question: “What legitimate reason do you have for doing this?”
It reminded me of the guy who wanted to know how to make a file that can’t be edited or deleted by any means. Both of these guys seemed to have honorable intentions, but just hadn’t thought the ramifications through all that well.
As programmers, we often have near-absolute power on a computer. We have access to everything that’s not specifically denied to us by system security, and with that much power comes great responsibility. It’s important to remember what I consider the first principle of programming ethics: When you write a program for someone else to use, the computer they are running it on is their property, not yours, and your program needs to behave itself accordingly.
There’s a negative corollary to the Golden Rule that applies here: do not unto others as you would not have them do unto you. Much like a surgeon, whose power over a patient in his care is similarly near-absolute, we have the implicit responsibility to “first, do no harm.” Your program needs to behave like an invited guest in someone else’s home. You do not walk in and act like you own the place, and this has implications far beyond simply avoiding virus-like behavior.
For example, unless you’re writing for a very old computer, or for a few particularly backwards modern devices, such as the iPhone or most game consoles, your program is going to have to share the system with a bunch of other programs, and they need to execute too. This means that you need to be careful to accomplish your task while using as few system resources as possible so you don’t end up hogging resources that another program may need. If possible, keep your CPU and memory usage low. This also means, for example, that it’s a very bad idea to use a garbage collector that’s designed to build up as much garbage as possible before collecting. That’s like never putting anything in the dishwasher until you have no clean dishes left in the house. (And I’m sure some of you out there have kids that do exactly that. Doesn’t it drive you up the wall? Do you really want a program doing that on your computer? If not, don’t do it to other people’s computers.)
Then there’s “protection.” Most programmers tend to be pretty good about this sort of issue, until the idea of someone doing unauthorized things with their program comes up, and then all rational thought, not to mention consideration for ethics, goes right out the window, seemingly replaced by testosterone-driven outrage from coders who are usually rather mild and easygoing. “What?!? They’ll never get away with stealing my program!” And then they proceed to do all sorts of blatantly evil things to other people’s computers.
Like most emotional knee-jerk reactions, this approach ignores the actual facts of the matter. For example, empirical evidence suggests that, in the absence of any enforcement, almost 90% of people tend to be basically honest. That’s pretty darn good, even before you figure in the additional cost of actually providing the enforcement.
Also, copy protection simply does not work, due to a combination of two factors. First, in order for the program to actually run, or for protected data to be read by a program, there has to be a “door” in the copy protection someplace. Not a secret “backdoor,” just the ordinary variety that lets the authorized user through. Thing is, if that door exists in a computer-readable format, some user can find it and figure out how it works. There are some coders out there who can read assembly as easily as you and I can read the language of our choice, and if one of them tries to find a hole in your protection, it won’t last long. And this is a very important point.
DRM proponents often say that a copy-protection scheme doesn’t have to be perfect; just good enough to discourage casual hackers and “keep honest people honest.” The part about keeping honest people honest is nonsense, of course. Either someone is honest or they aren’t. But the thing is, so is the part about it not having to be perfect. That might have been true twenty years ago, but today a copy-protection scheme has to be absolutely perfect, because if it’s cracked once, by anyone anywhere, it’s all over. The crack will be posted online and any of those 13% or so of dishonest people out there who wants to use your program for free will have immediate access to it at the cost of just a little bit of searching.
And then there are the ethical issues involved. In any other context, an external programmer taking control of the functionality of a computer away from the computer’s owner and using it against the owner’s interests is known as computer hacking (or cracking, if you prefer to use the term “hacker” in its original, positive sense), and is quite illegal. Why all the special pleading in the case of DRM? Because there’s lost revenue involved? When did ensuring revenue become a more important consideration than not committing a crime? Follow that line of reasoning far enough and you end up with Enron and Bernie Madoff, or the Sony rootkit.
We’re beginning a new year, and it’s traditional to set resolutions. I hope all of us involved in programming will resolve to hold ourselves to the high standard of not writing code to do anything on another person’s computer that we wouldn’t want done on ours, and also to the even higher standard of actually thinking through the implications of what we try to do, to figure out whether or not it would end up doing something bad. Then maybe we’ll end up with a few fewer questions like that on StackOverflow, and I won’t have to keep posting the same answer to them.
Happy New Year, everyone!