Justice and accountability

What does justice mean in cyberspace? Here are some examples of situations in which questions of justice have become blurred in cyberspace, with suggestions as to how a Christian response may help to clarify them.

Shrinkwrap licences

We have already remarked that software is almost entirely opaque to the user: if it goes wrong it is seldom apparent what has gone wrong, or how to fix it. Despite this, people often have a touching faith in computers. The people who build computers and their software know how misplaced this faith often is! The fact is that large software systems are nearly as opaque to their builders as to their users. They are so big, and so complicated, that usually nobody has a clear idea how they work. Software is incredibly difficult to test, because it is impossible to try out all possible combinations of circumstances.
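To give a rough, purely illustrative sense of the scale involved (the figures below are assumptions for the sake of argument, not measurements of any real product), even a trivially small piece of code defies exhaustive testing:

    # Illustrative arithmetic only: a function taking just two 32-bit integers
    # has 2**32 possible values for each argument, so 2**64 input pairs in all.
    input_pairs = (2 ** 32) ** 2            # about 1.8e19 combinations
    tests_per_second = 10 ** 9              # assume a generous billion tests per second
    seconds_per_year = 60 * 60 * 24 * 365
    years_needed = input_pairs / (tests_per_second * seconds_per_year)
    print(f"About {years_needed:.0f} years to try every combination")  # roughly 585 years

Real programs have vastly more inputs and internal states than this, which is why testing can only ever sample the possibilities rather than cover them.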

As a result, software manufacturers often supply their software with licences that say things like 'For 90 days we warrant that the software will substantially conform to the user documentation', and 'Because software is inherently complex and may not be completely free from errors, it is your responsibility to . . .' These terms are usually accompanied by 'Opening this package signifies your agreement to these terms'. In one sense this is fair: if we use the software we accept the terms, and we don't have to use the software. But the balance of power between supplier and consumer is decisively different from that in other product areas, where manufacturers are held much more directly liable for defects in their products. Is this right?

The Bible says a great deal about the importance of protecting those who do not have the means to protect themselves. The users of computer programs are, arguably, in just that position: not only do they lack the power even to identify the true problem, but they are far more likely to blame themselves rather than the product than they would be when faced with a defective toaster or car. Yet those same users collectively constitute a market that rewards new features but does not reward improved reliability - so it is hard to blame suppliers for emphasizing the former rather than the latter.

Laws (such as the Unfair Contract Terms Act 1977 in the UK) limit the ability of suppliers to impose unfair terms on customers, and there could well be commercial advantages for companies offering better warranties, as used car dealers have discovered. (Unfortunately, in software, unlike cars, there is no obvious way to fix problems: even replacing the software may not cure them, whereas replacing a car at least cures the original car's problems.) Many businesses have discovered that it is better to anticipate a public desire for more ethical business practices than to take full 'advantage' of limited legal protection. From a Christian perspective there is a strong ethical imperative to treat the consumer as you would wish to be treated: 'love your user as yourself'. Christian software companies would be hard pressed to reach this ideal, but they could at least look for ways to move towards it.


Not shooting the messenger

Delivering information in cyberspace, rather like delivering letters in the conventional postal system, requires various intermediary operators. Each operator plays a part in the delivery of messages and the exchange of information. Who is responsible for the content of the information while it is being 'carried'? It has been suggested that Internet Service Providers should be made responsible.

In the conventional postal system, the carriers have a special legal status. Beyond the obvious reasonable care they must take, they are not held responsible for the content. It may be illegal to post certain materials (for example, radioactive substances), but it is the people who post them who are breaking the regulations or the law, not the carriers. There is clearly a spectrum of opinion about what the right approach is, and the legal issues are by no means straightforward. Moreover, laws vary from country to country. For example, in the United States the Freedom of Information provisions ensure that information can be published which remains secret in the UK. Bookshops in Britain can readily stock material that would be illegal in politically more repressive regimes; we can write 'Free Tibet', but in Tibet itself people are put in prison for saying so. So even if we could decide who was responsible, there is no international agreement about what ought not to be done.

All these uncertainties still exist despite postal systems, books, written material and so on having been around for thousands of years! Our society does not seem to know what to do; on the one hand, it wants to make some material judged nasty or secret inaccessible, yet on the other, it doesn't want just to shoot the messenger when the real problems are with either the original suppliers or the final consumers. In the physical world, there is nobody who supports an entirely liberal line that 'anything goes': it is hard to imagine anyone wishing to defend the transport of postcards, magazines, nuclear weapons, venomous snakes or hydrofluoric acid by a public service which treats them all equally! But, in effect, cyberspace currently does just this!

There are obviously good motives for restricting certain groups' access to some kinds of information. Take torture: we want to rid the world of it, and some people (for example, children and criminals) are better off not knowing much about it; yet equally, where there is evil in the world, some people - not everybody - should be able to see it and take appropriate action. So it seems unlikely that these questions can be answered by a simple, global solution.

Working towards agreed standards of professional discretion on the part of Internet Service Providers may offer some help in approaching the questions discussed here. Such standards might be - indeed, given the international legislative difficulties involved, might have to be - informally agreed and adopted. They will probably be exploited, but the point would be to make clear where the boundaries of responsibility lie.