Privacy, Censorship and the Law
presented by Valerie Steeves
at University of Saskatchewan Libraries Access '98
on October 3, 1998
A year ago, a friend of mine asked me to give a special lecture to his class of third-year criminology students. We wanted to talk about the links between pornography and marketing to children. The best examples I know of are the smoking and alcohol sites. Games for 8- to 12-year-old kids are embedded in these sites - instead of jumping up to catch gold coins as in Super Mario Brothers, the kids jump up and catch a bottle of rye. What makes the sites most interesting from an academic point of view is that just underneath the links to these kids' games are links to pages like "A Picture of Debbie Naked" and "Naked Women Smoking Cigars".
I was told that I probably shouldn't bother because my friend doubted the university filter would let us access the sites. In effect, the administration had decided that its third-year university students didn't have the maturity to see this kind of material in an academic setting.
The majority of people connected to the Internet around the world still access it through a university account. We have a new medium that provides students with access to information like no other medium before it - the Net is an incredible resource for learning, dialogue and debate. And, I guess, pictures of naked women smoking. However, more and more schools are using software filters and acceptable use policies to control their students' Internet use.
Why have universities, supposed bastions of academic freedom, taken such a conservative approach to online resources? I think the first part of the answer is because they can. As you know, the Internet is the most public of media. The total transparency of open architecture has the potential of enabling people to form self-actualizing virtual communities. But this will only happen if that transparency is reciprocal. In a newsgroup, we can form self-regulating principles governing our use of free speech and respect for privacy because everyone knows what everyone else is saying. It's like a small town. Not much privacy and lots of gossip, but I know as much about my neighbours as they know about me.
But networking technology can also be used to give others - invisible others - a window into our community life in a way we haven't experienced before. And the government is the least of our worries. Reg Whitaker at York argues that Big Brother has been outsourced, because the private sector now collects, uses and sells vast amounts of electronic information about each of us - our interests, our political positions, the names and ages of our children. The government has taken the position of private sector cheerleader and busboy. John Manley, Minister of Industry, introduced data protection legislation into the House of Commons on October 1. He tells us he is proud of the fact that this new legislation will secure our position in the new global economy, and manufacture the consumer confidence which is necessary to enable companies to continue to collect and use personal information - in other words, to continue to exploit Canadian citizens as market and marketable commodities.
Our government is not concerned about protecting the social value of privacy. If it were, Mr. Manley would have heeded the overwhelming calls for a privacy charter which were heard across the country during the public consultation on privacy rights and new technologies conducted by the Parliamentary Committee on Human Rights in March 1997.
But our government isn't alone. Universities don't appear to be concerned about the privacy implications of their actions either. The fiscal climate has brought with it a shift in values and attitudes about the nature of the academic community. For example, I'm the director of the Technology and Human Rights Project at the University of Ottawa's Human Rights Research and Education Centre. We've been told, after deep cuts into our budget, that we will only survive if we become a profit centre for the university by attracting private sector partners willing to finance our research.
A few months ago, I was sitting in a meeting with a potential private sector sponsor. After reviewing some educational software we designed that his organization was willing to pay for, he vetoed distributing it to the Canadians who attended the Parliamentary Committee's public consultation on privacy rights. He said "Val, as you know, there are two approaches to privacy education - the e-com approach and the charter approach. We intend to crush the charter approach." So much for private sector support of my field of research.
The shift away from traditional academic values - academic freedom, independence, theoretical research, critical commentary - to corporate values - cost-effectiveness, efficiency, applied technologies, marketable skills, corporate funding - has real implications for the ways in which we teach, learn and provide access to information, particularly over the Internet. Let me give you an example. I met a professor in New Brunswick who was part of an early experiment in networked long distance learning. The experiment was funded by a private company. During a class discussion, she made comments which were critical of government policies. A few days later, she got a telephone call from her funder. Apparently, they had been monitoring the course by listening in over the network, and they wanted her to stop making those kinds of comments in the future. After all, they were paying for the course.
Two things are violated when there is electronic surveillance in a university environment. The first is the right to privacy. The second is the right to free speech. Academic learning depends, in large part, on the relationships between the professor and the students and among the students themselves. But surveillance destroys the trust and confidentiality which are necessary to sustain those relationships. Surveillance also has a chilling effect on free speech - people are less willing to take part in dialogue when their comments are recorded surreptitiously and then monitored by an invisible party.
Privacy and freedom of speech are human rights with a long and interconnected history. Indeed, both are often hailed as hallmarks of a free and democratic society. However, neither of these rights is an absolute one, and we have often been called upon to balance their benefits against society's need to limit harmful behaviour. That balance has not been an easy one to find. Unlike freedom of speech, privacy is not expressly protected in the Canadian Charter of Rights and Freedoms. But implicit in the various legal mechanisms we have created to protect free expression is the notion that people should only be limited in the exercise of their freedom of speech when their statements are, in fact, made in the public sphere. Conversely, our lawmakers have been loath to restrict our actions when those actions are taken in private.
For example, the Criminal Code prohibits the spread of hate propaganda and obscene materials, both of which limit our freedom of expression. The courts have maintained that those restrictions on our freedom are acceptable because our right to free speech must be balanced against society's need to protect itself from harm. But the balance between the benefits of that expression and the resulting harm to society rests, in law, upon the line dividing what we do in private and what we do in public.
Incitement of hatred, for example, is committed by someone who, "by communicating statements in any public place, incites hatred against any identifiable group". Similarly, it is an offence whenever someone "exposes to public view any obscene ... thing whatsoever". Both these laws assume that in all but the most extreme circumstances, we are free to express ourselves as we please in our own private spaces; but when we express those thoughts publicly and some harm occurs, our free speech can be curtailed.
New communications technologies have challenged this balance precisely because, in a networked world, the line between our private lives and our public actions has become blurred. The realities of technological convergence mean that we use the same technologies, and the virtual spaces they create, to chat, debate, shop, gossip, play and work. It is more difficult to effect a balance between free expression, the right to privacy and, say, the protection of children, in an environment where a child researching a history assignment on World War II in a virtual library is just one click away from a neo-Nazi recruitment site.
The difficulty is magnified by the legalistic interpretation of human rights so prevalent in our courts. On the one hand, the Supreme Court has interpreted the right to be free from unreasonable search and seizure under s. 8 of the Charter to include a right to privacy. On the other hand, that right to privacy is only present where the individual has a reasonable expectation of being secure against such a search.
The trouble is that new technologies are making it easier and easier for others to listen in to our communications, and we all know it. Accordingly, it may be difficult to argue that we have a reasonable expectation of privacy when we send e-mail, search through university catalogues, or pay our tuition over open networks, because the technology itself does not provide us with any expectation of privacy.
In addition, the courts have tended to look at the issues in isolation. In R. v. Plant, the Supreme Court had to decide whether or not the police had violated s. 8 of the Charter by accessing the city utility's mainframe computer to retrieve the accused's electrical bills. The police had received a "Crime Stoppers" tip that the accused was growing marijuana at his home. The police figured indoor cultivation would increase someone's electrical consumption. So they logged onto their own computer, telneted over to the utility's mainframe and pulled up the accused's electrical bills. Mr. Justice Sopinka argued that s. 8 of the Charter should seek to protect a biographical core of personal information - information which tends to reveal intimate details of an individual's lifestyle and personal choices. However, he concluded that the computer records in this case could not reasonably be said to reveal intimate details of the accused's life.
In isolation, that may be true. But once again, the Court failed to recognize the changing nature of data management and its impact on our privacy and our sense of freedom. Digitized transactions make it easy to maintain huge databases of information, each containing small bits of our personal lives, like our electrical bills. Even alone, these digital records may put our private choices under unwanted public scrutiny, as they did when a news reporter published a record of Judge Bork's video rentals during the hearings on his nomination to the United States Supreme Court. However, these small pieces of us do not exist in isolation. Network those databases together and data-matching software can construct a detailed, sophisticated profile of our daily lives. In other words, new information practices are dramatically changing our experience both of privacy and of public interaction. To focus on each little bit of information is to miss the point.
It is the information practices that we must question if we are to successfully incorporate new technologies into the academic environment. Convergence means our students can go wherever they want to from university computer labs. And they're likely to come across offensive content in some of the places they go. But I think we need to clearly distinguish between illegal content and offensive content.
The rush to create acceptable use policies can be explained, in part, by the fear of liability. What if a student libels someone in an online discussion? Or advocates genocide on a student web page? Can the university be held liable for the online actions of its students? Canadian courts have yet to definitively answer these questions. Early American case law suggests that service providers who make no effort to control the online actions of their customers can't be sued if one of them makes libellous statements in an online discussion. The service provider in that case is acting more like a telephone company than a publisher. We don't charge the telephone company with a criminal offence just because they supplied the phones and wires used by a hate-monger to utter a death threat in a local call. However, American courts have said that service providers who do try to control the actions of their customers are more like publishers, and publishers are legally responsible for defamatory statements they publish.
The lawyer in me says the last thing a university should do to limit liability is to come up with an acceptable use policy. Prodigy was held liable when the only "control" it exercised over its discussion forums was a software filter designed to delete messages with swear words in them. To control is to publish - and in this case, to publish is to perish.
But I think the more fundamental questions we have to ask about acceptable use policies have to do with the consequences of eliminating offensive, as opposed to illegal, content. Many, if not most, acceptable use policies attempt to control offensive dialogue and access to offensive information. But in a democracy, there is a value in being offended. Ken McVay was an unemployed mechanic in Victoria at the end of the 1980s. He came across some truly offensive anti-Semitic comments in a newsgroup. He was so offended that he went to his local library and read everything he could about the Holocaust. He figured if he was so interested, there were other people who would be interested too, so he posted everything he could find about the Holocaust and Holocaust-denial on the Internet. The result was the Nizkor Project, now the largest online collection of material in the world on the Holocaust and the actions of cyber-racists.
Democracy is not an easy process. Democratic dialogue is often difficult and uncomfortable. One of the strengths of online communications is that it is easier to see the challenges we face because hatred, sexism, homophobia, racism aren't hidden in the margins where they can grow and become more dangerous. We need to be offended by these things. That's the only way we will confront them and work to create a healthier society.
But we must be very careful about the tools we use to do this. For some reason, we tend to treat online communications differently than traditional methods of expression. Although our postal services are used by people to distribute pornography and hate literature, no one would seriously suggest that we open all our students' mail to make sure nothing illicit is going on. And yet that is precisely what many are suggesting we do in cyberspace.
The American Communications Decency Act is an excellent example of this dynamic. In July 1995, Time published an article claiming that a vast amount of pornographic material could be accessed anonymously over the Internet. Although the claims contained in the article were later discredited, the United States Congress responded by criminalizing the use of a telecommunications device to "knowingly make, create, or solicit ... any comment, request, proposal, image, or other communication which is obscene, lewd, lascivious, filthy, or indecent". In no other medium is communication so restricted, and a three-judge panel in a Philadelphia court struck down the Act in June of 1996 for contravening constitutional guarantees of free speech.
The trouble is that these constitutional protections may limit what governments do to censor expression in cyberspace, but they do not extend to the actions of private parties. Students are increasingly using the Internet to share information and debate political issues. If they can be denied access to the medium because the university administration finds their views offensive or unpopular, then that, in effect, will disenfranchise those people from the political process itself. And filtering out offensive web sites may disenfranchise others. When Deutsche Telekom cut off access to the online server Web Communications to prevent German citizens from viewing Ernst Zundel's Web pages, Deutsche Bank Securities, which also had an account with Web Communications, found itself "thrown out with the hate speech".
It is also important to question the reasoning offered in support of such broad censorship. Proponents of restricted expression often point to the vast quantity of obscene and hateful online material which can be accessed anonymously. One Canadian chief of police summed it up this way: "Everything on the Internet is becoming bigger, better, badder, uglier, stinkier, nastier, more violent, more vile, more disgusting."
In fact, the actual percentage of illicit traffic over the Net is quite small, and claims that it is impossible to track anonymous users are also somewhat exaggerated. Even anonymous remailers are not the haven they appear to be. Most pornographic files, especially those containing graphics or video, are quite large - and most remailers limit the size of files they will handle to avoid being overloaded. In any event, a remailer retains the name and IP address of the original sender on its hard drive. Because of this, Finnish courts were able to grant a search warrant in February of 1995 to retrieve the name of someone who had posted an allegedly stolen computer file in a newsgroup through a local remailer.
We have laws which protect us from illegal speech in ways which are consistent with our most deeply held values. These laws can be applied to cyber-speech in the same way they are applied in the real world without seeking to assert control over the Internet itself.
The value of free speech in a democracy is unquestioned. Accordingly, it is imperative that access to forums of public debate be as open as possible. Rather than allowing university administrators to filter communications, that same filtering software in the hands of the user can provide each of us with a vehicle to tailor our electronic communications in a way which protects us from offensive, invasive material but which also maximizes our ability to participate in free discussion. In that way, electronic communications and access to online information can continue to bring us closer to a participatory model of government - one where citizens take part in an informed debate about social, economic and political values.
As more of our informational archives are digitized, the policies that you develop to balance free speech and privacy will have a lasting impact on the kind of society we are building for the future. On Friday, I was on a panel with Alan Borovoy, who referred to librarians as the Clark Kents of the battle against censorship. May you always find cyberspace full of phone booths, not kryptonite.