Centre for Technology Studies
The University of Lethbridge
Lethbridge, AB T1K 3M4 Canada
Abstract As soon as a brilliant mind produces technological progress of some sort, an equally brilliant but devious mind devises ways to sabotage it. The hacker, the cyberpunk, the computer virus - which end-user has not had to worry about them? Our understanding of people's morals when it comes to computer technology is limited, and while laws may develop to reduce physical harm, social and economic reforms may be necessary to deal with the tide of legal violations committed with the help of new technology. This paper discusses some of these issues and presents several research propositions aimed at developing a moral code for cyberusers. Preliminary data indicate that cross-national differences exist and that younger respondents are more libertarian than their older counterparts. A better understanding of end-users' motivations will help in preventing breakdowns and system failures. The paper concludes with some managerial/policy implications and outlines future research avenues.
1. LITERATURE REVIEW Information systems and computer technology are expensive. System safety, data integrity and security are of particular concern. Unfortunately, whenever a safety feature is implemented, a cyberpunk may try cracking it for the sake of adventure or use his/her knowledge for personal profit. Accordingly, it has become more difficult to determine moral issues in a society where change occurs at an ever faster pace. For instance, we all know that copying software is a crime; nevertheless, some individuals have continued to pirate software for private use. In some situations, we may feel that although this is not legal, it is harmless, and we tolerate the offense by ignoring the incident. Similarly, it may be harmless to write a computer virus and play with it on one's own computer. Nevertheless, purposely or unintentionally putting such programs into the hands of people with devious or malicious interests can eventually harm other people. Consequently, a computer virus may be distributed around the world with the help of many innocent bystanders; this may occur through diskettes or other methods of data transfer, permitting the virus to spread exponentially. In this paper we focus on two issues:
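The exponential spread mentioned above can be sketched with a toy model (our own illustration, not part of the original paper): if each infected machine passes the virus to a fixed number of new machines per round of diskette sharing, infections grow geometrically.

```python
# Toy model (illustrative only, not from the paper): each infected
# machine infects `contacts_per_round` new machines per round of
# diskette sharing, so the total grows as (1 + k) ** rounds.
def infected_after(rounds: int, contacts_per_round: int = 2) -> int:
    """Number of infected machines after `rounds` rounds of sharing,
    starting from a single infected machine."""
    infected = 1
    for _ in range(rounds):
        infected += infected * contacts_per_round  # each infects k more
    return infected

print(infected_after(5))  # 3**5 = 243 machines after five rounds (k=2)
```

Even with only two contacts per round, a single infected diskette reaches hundreds of machines within a handful of exchanges, which is why "innocent bystanders" dominate the spread.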
In the midst of all this is the great majority of cyberusers, like you and me, who have no obvious malicious intent nor any interest in harming anybody else. Within each one of us cyberusers there is the potential to exhibit behaviours (e.g., wanting to copy a friend's exciting new software) which, if we give in, result in us skirting the fringes of the law. Hence, we all have some hacker, cracker and cyberpunk potential within us which, if unleashed, may cause harm to others. It seems only appropriate, therefore, that we attempt to learn and better understand how hackers, crackers, cyberpunks and cyberusers may reason and go through a moral appraisal resulting in the decision to do what is right or wrong (e.g., Davidson, Turiel & Black, 1983).
We know that strong social bonds (e.g., what would my friends think if I do this?) influence the reasoning process for deciding whether some action is appropriate or not (e.g., Matsueda & Heimer, 1987). However, we know little, if anything, about how any one of these user groups decides whether skirting the fringes of the law is just or wrong. New technology, security devices, passwords and data back-ups all improve the security of computer systems and information networks. Our increasing dependence upon these technologies, unfortunately, will exponentially raise our vulnerability to their malicious use by a few. Accordingly, unless we better understand the moral reasoning process invoked by diverse end-users, our risk of being harmed will continue to rise sharply (cf. Davidson, Turiel & Black, 1983).
Trying to explain delinquency among white and black youths in the USA, Matsueda and Heimer (1987) reported that differential association theory "is supported over social control theory." Consequently, a person's moral framework of what is right and wrong when working with computers or any technology may be explained by differential association theory. In turn, what a person considers acceptable as far as copying software, working with a virus for fun or hacking a computer system is concerned may depend largely upon his or her friends' values, behaviours and norms, or what they think is right or wrong. Hence, whom you keep company with may help explain why diverse end-user groups pursue different actions which may or may not harm others.
2. Social Bonds and Moral Development: Cyberuser Moral Code (CMC)
The literature about delinquency and moral codes outlined in the previous section suggests that we need to better understand how computer users feel about various behaviours, such as writing or playing with computer viruses, or using an encryption device or software to protect their data against misuse by others (i.e., similar to the use of scramblers to protect phone conversations against wiretapping).
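To make the scrambler analogy concrete, the following minimal sketch (our own illustration; the paper describes no specific device, and this toy XOR scheme is not a vetted cipher) shows the basic idea of symmetric scrambling: the same shared key both scrambles and unscrambles the data.

```python
import hashlib
from itertools import count

def _keystream(key: bytes):
    # Illustrative keystream derived from SHA-256 of key||counter.
    # NOT a vetted cipher - for explanation only.
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Scramble or unscramble: XOR with the keystream is its own inverse."""
    return bytes(b ^ k for b, k in zip(data, _keystream(key)))

message = b"private phone conversation"
scrambled = xor_crypt(b"shared secret", message)
# Anyone without the key sees only the scrambled bytes; with the key,
# applying the same operation restores the plaintext.
assert xor_crypt(b"shared secret", scrambled) == message
```

The point of the analogy is simply that whoever lacks the shared key (a wiretapper, a data thief) recovers nothing useful, which is why banning private encryption devices is contested.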
Research Proposition 2: Groups (i.e., based on social background and computer experience) differentiate dissimilarly between harmless and less harmless (at the fringes of the law) scenarios about computer-mediated actions across countries (e.g., using a private encryption device where such devices are actually banned versus placing a virus on a BB or EDL).
Even among affluent classes, two countries may have differing moralities. While in Brazil rights and justice are sometimes outweighed by personal ties and the ethics of community, the ethics of autonomy are important in the United States (Haidt, Koller & Dias, 1993). Similarly, software piracy as identified by Western software companies may be perceived differently by others. Verzola (1994) reasons that pirating software from USA companies somehow offsets Malaysia's human capital losses, which occur when, after completing their education, Malaysian professionals emigrate to the USA.
Other international problems arise from different laws across countries. For instance, the USA government is striving to make the dissemination of virus programs illegal. In 1994 a New Zealand youth built a plastic bomb from information posted on a BB, hurting himself and others when the bomb went off. In turn, a New Zealand MP introduced a Technology and Crimes Reform Bill in 1994 that, if made law, would halt some of the flow of obscene material by making it an offence to possess it, thereby enabling the government to disconnect BBs carrying such information. Similarly, in 1994 the Australian government established a task force which is considering alternatives for developing a regulatory system for BBs and import and export controls of information via EDLs and, most interestingly, assessing whether current law enforcement powers are adequate. This suggests that certain behaviours regarded as part of one's personal domain (e.g., the ubiquitous practice of playing with a virus) may become part of the moral domain, or possibly illegal, in some countries. Unfortunately, without coordination across countries, it will be difficult to succeed with these efforts to regulate the flow of information fairly and justly across and within borders. This suggests the last proposition which should be tested, namely:
3. PRELIMINARY RESEARCH FINDINGS AND IMPLICATIONS To test some of the propositions outlined above, we chose to use vignettes describing particular situations a person might observe, and we asked participants to respond to sets of questions pertaining to these scenarios. At the start we had 10 stories of about 25-35 words each, of which the first five dealt with a person entering a computer with and without a password, while the remaining five asked about people's moral reasoning when observing a person playing with or programming a virus. The stories started off with relatively harmless acts (e.g., entering a computer without using a password, simply for fun, even though the person has the password for the system). The preliminary testing revealed that people interpreted entering a computer without using one's password as being in the conventional domain (suggesting one should not do this, even for fun), while leaving such a "cracker" program with a friend was felt to be part of the moral domain (intrinsically harmful to others and therefore inappropriate behaviour).
The results were not as clear-cut for stories dealing with the computer virus. For instance, younger individuals were more likely to attribute certain actions to the personal domain, while older respondents felt certain actions were part of the conventional or moral domain.
Playing versus reality. The above findings left us with three stories to work with. In particular, younger individuals felt that playing with a virus was okay and the person's own decision, as long as she or he was careful not to release the virus into the "wild." Older respondents were less certain and often felt that it went beyond the personal domain or simply should not be done (it can harm others; immoral behaviour).
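The group comparison described above amounts to tallying, per group, which domain (personal, conventional or moral) respondents assigned to a vignette. The sketch below illustrates that tabulation with invented records (hypothetical data for illustration only; the paper does not report its raw responses).

```python
from collections import Counter

# Hypothetical response records for the "playing with a virus" vignette:
# (age_group, domain the respondent assigned). Invented for illustration.
responses = [
    ("younger", "personal"), ("younger", "personal"),
    ("younger", "conventional"),
    ("older", "moral"), ("older", "conventional"), ("older", "moral"),
]

def domain_counts(records, group):
    """Tally which moral domain a given group assigned to the vignette."""
    return Counter(domain for g, domain in records if g == group)

print(domain_counts(responses, "younger"))  # personal domain dominates
print(domain_counts(responses, "older"))    # conventional/moral dominate
```

Comparing such per-group distributions (e.g., with a chi-square test on the contingency table) is the standard way to test whether groups differentiate dissimilarly, as Research Proposition 2 requires.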
BBs and EDLs. During the preliminary testing we also discovered, through colleagues and associates worldwide who assessed the instrument, that one scenario should address the BB/EDL issue. It was pointed out that how a person decides, after reasoning, whether a behaviour is part of the personal, moral or conventional domain is of special interest when receiving information from abroad where it might not be banned (e.g., how to make a bomb). The preliminary data indicate that North Americans are more likely to use a libertarian approach by letting the individual decide on his or her own. In contrast, European Community members felt it inappropriate to skirt the fringes of the law; these respondents tended to reason that such information should not be passed on to a friend abroad where such information is legal. Also, North Americans were less likely to take a moralizing stand by arguing that what one country did was appropriate while what another did was not. This was surprising to us, considering that the USA federal government often attempts to impose its own goals and beliefs upon other countries by twisting arms and using heavy-handed political maneuvering (e.g., in international trade talks).
Clipper chip and encryption. The recent discussions about the USA's and other countries' efforts to establish encryption standards, which may ultimately regulate what can be used and, most importantly, ban the utilization of private devices, warranted addressing this issue. Again, the preliminary data show that USA respondents are adamant about protecting privacy and the person's freedom to choose any encryption device, including making their own. This is not surprising considering that in a 1994 TIME/CNN public opinion poll of 1,000 Americans, two thirds of the respondents felt that it was more important to protect the privacy of phone calls than to uphold law enforcement's ability to conduct wiretaps.
Differential association theory. The preliminary data also support this theory, indicating that a person's moral framework of what is right and wrong when assessing situations dealing with computer technology is affected by social bonds (e.g., friends). For instance, the way a person interpreted how friends might feel about certain things affected his or her own assessment, while full-time employees of lower perceived socioeconomic status were less tolerant of certain actions and behaviours involving cyberspace.
Preliminary data gathered through this research program suggest that people are willing to break the law in the cyberspace domain (e.g., keeping illegal copies of software for a while in order to make use of them), thereby making the enforcement of laws protecting property rights very difficult. If the public feels that certain things are part of the conventional domain only (one should not do it) but are not immoral, then governments and firms will have a very hard time enforcing these laws. Moreover, the social control of individuals may not succeed if a large group feels that certain behaviours, such as playing with viruses, are appropriate. Making these actions illegal may neither diminish the incidence of these types of actions nor reduce the current backlog of pending cases in the court system.
Encryption of passwords, together with software that can handle encrypted messages and authenticate both recipient and sender, will reduce the risk of fraudulent use of and intrusion into computer systems. As networks become more global, local laws become mere local ordinances, since other countries may not qualify certain acts as crimes nor ban them. Without further investigating how the CMC is developed, maintained and changed among the various groups such as hackers, crackers, cyberpunks and cyberusers, techno-crime and terror against private users and firms by a few will continue to wreak havoc in many information systems. Gaining a better understanding of these issues, in turn, will enable us to better understand why people behave the way they do and subsequently permit us to protect users. Issues which we consider part of the personal domain of moral theory may have to be shifted to the conventional domain, if not the moral one, in order to prevent people from doing things which could harm others if the software, technique or password lands in the lap of a malicious user. We are currently investigating these issues further and hope to shed some more light on the matters discussed here shortly.
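The two safeguards named at the start of the preceding paragraph can be sketched concretely (a minimal modern illustration of the general idea, not a technique described in the paper): storing passwords only as salted hashes, and attaching a keyed message authentication code so recipient and sender can verify each other's messages.

```python
import hashlib
import hmac
import os

def store_password(password: str):
    """Store only a salted hash, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

def sign(key: bytes, message: bytes) -> bytes:
    """Sender attaches a MAC; the recipient recomputes it with the
    shared key to authenticate the message's origin and integrity."""
    return hmac.new(key, message, hashlib.sha256).digest()

salt, stored = store_password("hunter2")
assert check_password("hunter2", salt, stored)      # correct password
assert not check_password("guess", salt, stored)    # wrong password rejected
tag = sign(b"shared-key", b"meet at noon")
assert hmac.compare_digest(tag, sign(b"shared-key", b"meet at noon"))
```

Even a compromised password file then yields no plaintext passwords, and a forged or altered message fails the MAC check, which is exactly the reduction in fraudulent use and intrusion the paragraph anticipates.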
REFERENCES Davidson, P., Turiel, E., & Black, A. (1983). The effect of stimulus familiarity on the use of criteria and justifications in children's social reasoning. British Journal of Developmental Psychology, 1, 49-65.
Gattiker, U. E., & Nelligan, T. (1988). Computerized offices in Canada and the United States: Investigating dispositional similarities and differences. Journal of Organizational Behavior, 9, 77-96.
Gattiker, U. E., Gutek, B. A., & Berger, D. E. (1988). Office technology and employees' attitudes. Social Science Computer Review, 6, 327-340.
Haidt, J., Koller, S. H., & Dias, M. G. (1993). Affect, culture, and morality, or is it wrong to eat your dog? Journal of Personality and Social Psychology, 65, 613-628.
Heckathorn, D. D. (1990). Collective sanctions and compliance norms: A formal theory of group-mediated social control. American Sociological Review, 55, 366-384.
Helms, J. E. (1992). Why is there no study of cultural equivalence in standardized cognitive ability testing? American Psychologist, 47, 1083-1101.
Hirschi, T. (1969). Causes of delinquency. Berkeley: University of California Press.
Hofstede, G. (1980). Culture's consequences. Newbury Park, CA: Sage Publications.
Logan, R., Snarey, J., & Schrader, D. (1990). Autonomous versus heteronomous moral judgment types: A longitudinal cross-cultural study. Journal of Cross-Cultural Psychology, 21, 71-89.
Matsueda, R. L., & Heimer, K. (1987). Race, family structure, and delinquency: A test of differential association and social control theories. American Sociological Review, 52, 826-840.
Nucci, L. (1981). Conceptions of personal issues: A domain distinct from moral or societal concepts. Child Development, 52, 114-121.
Nucci, L., Turiel, E., & Encarnacion-Gawrych, G. (1983). Children's social interactions and social concepts: Analyses of morality and convention in the Virgin Islands. Journal of Cross-Cultural Psychology, 14, 469-487.
Sherman, L. W., Smith, D. A., Schmidt, J. D., & Rogan, D. P. (1992). Crime, punishment, and stake in conformity: Legal and informal control of domestic violence. American Sociological Review, 57, 680-690.
Sutherland, E. H. (1947). Principles of criminology (4th ed.). Philadelphia: Lippincott.
Turiel, E. (1983). The development of social knowledge: Morality and convention. Cambridge, UK: Cambridge University Press.
Verzola, R. (1994). Software piracy: Another view. 2600 The Hacker Quarterly, 11(1), 16-17.
Retrieved from: http://www.icsa.net/library/research/tecnoc.shtml in July 1999