Last Monday, during my first cup of coffee, I wrote a brief piece titled "Toward a Technology Bill of Rights." It contained a few basic ideas that had been bouncing around in my head for a while, but it was by no means exhaustive. It got picked up on Slashdot and other places, and I was deluged with e-mails, comments, criticism, and praise. Had I known the distance that piece would travel, I probably would have had another cup of coffee before writing it.
But the specifics of this idea certainly aren't set in stone, and it's supposed to create public discussion -- so that part worked. I've had more time to think about it and to read the comments and e-mails, so I think some clarification and expansion is in order.
I really liked the idea of patterning something such as this on the Five Laws of Library Science, and I think that in a perfect world, those might be sufficient. I also fully agree with the comments that the last thing we need is more laws, and that what I've outlined could either be covered by existing laws or handled by mere common sense. However, we live in an ever more litigious world, and common sense is more expensive now than ever -- and somehow even rarer. That's why I think that these rights need to be explicitly spelled out. They also need significant clarification. This whole writing of laws thing is pretty tough, believe it or not.
I should first address the anonymity article. Several commenters were quick to point out that InfoWorld requires registration to post a comment, and that would seem to be at odds with this article. I don't see it that way -- the guarantee of anonymity doesn't require that steps such as registration be banished; it simply means that it cannot be illegal to post under an assumed name. I created my first pseudonym back in the '80s when I frequented (and ran) 2,400bps dial-up BBSes. Eight characters were the limit, and few people used their real name. That's the key to Article 1. If you want your real identity to be hidden from public view (though not from administrative view in most forums), you can absolutely do so. I wrote that article because I'm convinced that there will be circumstances that lead a lawmaker to propose a bill requiring all persons to use a verified name, address, and other personal information in order to post content on the Internet. The right to select relative anonymity should always be a choice, not an infraction. Let's not forget that unless you're behind seven proxies, the administrator of just about any Internet site will be able to at least trace you back to your IP address.
As for the Net neutrality article, I didn't see many complaints other than obvious trolls. I think that the wording is a bit strong now (I've had that second cup, naturally), but I do think that we will need to guarantee unrestricted Internet access as a right, especially given the nature of monopolistic carriers in many regions of the United States.
Another comment I saw was related to the Julie Amero case and Article 3. I saw a few people saying that just because there was this one case, we shouldn't bother with this article. I couldn't disagree more. We're at a point in the digital divide where the knowledge disparity among those in positions of power (judges, district attorneys, lawmakers, and so on) is generally very large. Many of these people unknowingly subscribe to Arthur C. Clarke's comment that "Any sufficiently advanced technology is indistinguishable from magic," and simply don't have the foundation to understand how an incident such as the one that occurred on Amero's classroom computer could happen without the knowledge and consent of the teacher. That's what Article 3 was written to handle -- as with many of these rights, it functions as protection and a bridge to the next generation, for whom the fundamentals of networked computing will be as embedded as the morning newspaper is for the current one. This isn't a knock on anyone in particular, and my point should be completely obvious to anyone who's had to remove Bonzi Buddy from their parents' computer.
Article 4 was sticky to begin with, and plenty of people pointed that out. While I definitely think that something like this needs to be present, it's a very narrow path indeed. I do think that with the ever-increasing capabilities and functions of internetworked computing come ever-increasing vulnerabilities. Before it was possible to do your banking online, you didn't have to worry that your browser might become a gateway to your checking account, for instance. I would like to think (and many proprietary software companies would like you to believe) that we're out of the era of "it's amazing that this software even works," and firmly in the "now it needs to be secure and stable" phase. We've poured the foundation, built the frame, added a roof and windows, so let's get some locks on the doors and some insurance.
I should also address the comments such as this from nillo:
The writer of the malware is the responsible moral actor. This is not analogous to a product recall in which a manufacturer made a dangerous product. It is more like saying the manufacturer of tire irons is responsible for not designing a tire iron that cannot be used to bludgeon people to death. When the person who wielded the tire iron is the one who caused harm.
The way I see it, the writer of malware is a criminal, altering the function of or removing your personal property without notification or permission. I think a more accurate analogy would be this: the manufacturer of your front-door lock should be held liable if it discovers that the product can be subverted -- that anyone can get into your house using only a paperclip -- yet fails to disclose this fact, issue a fix, or both.
My main reasoning for that article was to require proprietary software companies to disclose vulnerabilities and provide timely fixes for them. If there's no visibility into the code, there's no other way of knowing whether you're at risk and generally no way to protect yourself. With great power comes great responsibility.
I didn't really see much opposition to Article 5, and it drew plenty of positive remarks. This one is a no-brainer, pure and simple. Regardless of the greater concept of the Technology Bill of Rights, this should be law already.
Article 6 was attacked by some as being too open-ended. Perhaps, but there needs to be a guarantee that when individuals purchase a song or a movie, they can listen to that song or watch that movie wherever and whenever they like. It does not allow for redistribution of that work, but merely requires that the content be unencumbered for personal use. The technological aspects of making this happen are either very simple or very difficult, as the current state of DRM shows. I think of it this way: If companies are willing to invest time and money into surreptitiously installing rootkits on your personal computer when you insert a music CD, and into suing everyone in creation, we need to fight back for our rights. Piracy is a problem, but the antidote needn't be worse than the affliction.
But there's obviously more here than just these six. I saw a few people going well beyond my original line of thinking, and offering up additional ideas. Keep them coming.