
Techno-crime and terror against tomorrow's organization:
What about cyberpunks?

Urs E. Gattiker and Helen Kelley

Centre for Technology Studies
The University of Lethbridge
Lethbridge, AB T1K 3M4 Canada


Abstract

As soon as a brilliant mind produces technological progress of some sort, an equally brilliant but devious mind devises ways to sabotage it. The hacker, the cyberpunk, the computer virus: which end-user has not had to worry about them? Our understanding of people's morals when it comes to computer technology is limited, and while laws may develop to reduce physical harm, social and economic reforms may be necessary to deal with the tide of legal violations committed with the help of new technology. This paper discusses some of these issues and presents research propositions aimed at developing a moral code for cyberusers. Preliminary data indicate that cross-national differences exist and that younger respondents are more libertarian than their older counterparts. A better understanding of end-users' motivations will help in preventing breakdowns and system failures. The paper concludes with some managerial/policy implications and outlines future research avenues.


1. LITERATURE REVIEW

Information systems and computer technology are expensive. System safety, data integrity and security are of particular concern. Unfortunately, whenever a safety feature is implemented, a cyberpunk may try cracking it for the sake of adventure or use his/her knowledge for personal profit. Accordingly, it has become more difficult to determine moral issues in a society where change occurs at an ever faster pace. For instance, we all know that copying software is a crime; nevertheless, some individuals have continued to pirate software for private use. In some situations, we may feel that although this is illegal it is harmless, and we tolerate the offense by ignoring the incident. Similarly, it may be harmless to write a computer virus and play with it on one's own computer. Nevertheless, purposely or unintentionally putting such programs into the hands of people with devious or malicious interests can eventually harm other people. Consequently, a computer virus may be distributed around the world with the help of many innocent bystanders; diskettes and other methods of data transfer permit the virus to spread exponentially. In this paper we focus on two issues:

1.1. Crime and Terror: The Age of Cyberspace

The law tries to prevent harm to the material or psychological interests of others. In today's world, however, a new breed of computer users has emerged. The "hacker" is somebody who derives joy from discovering ways to exceed current limitations. Thus, hackers in the original sense were explorers who solved problems and exceeded conventional limits through trial and error, in situations where there were no formal guidelines or previous models to draw from. Forms of predatory and malicious behaviour, including malicious computer intrusion, are embodied not by hackers but by "crackers." Crackers try to take a peek at data and use other hardware and software without the legal right to do so. The newest breed is the "cyberpunk," who behaves much like the cracker but, in addition, uses technology to damage, destroy or capitalize on the data found; know-how is exploited to gain more information which, in turn, increases the cyberpunk's influence, power and potential threat over others' information world.

In the midst of all this is the great majority of cyberusers, like you and me, who have no obvious malicious intent nor interest in harming anybody else. Within each of us cyberusers lies the potential to exhibit behaviours (e.g., wanting to copy a friend's exciting new software) which, if we give in, result in our skirting the fringes of the law. Hence, we all have some hacker, cracker and cyberpunk potential within us which, if unleashed, may cause harm to others. It seems only appropriate, therefore, that we attempt to learn and better understand how hackers, crackers, cyberpunks and cyberusers reason and go through a moral appraisal resulting in the decision to do what is right or wrong (e.g., Davidson, Turiel & Black, 1983).

We know that strong social bonds (e.g., what would my friends think if I did this?) influence the reasoning process for deciding whether some action is appropriate or not (e.g., Matsueda & Heimer, 1987). However, we know little if anything about how any one of these user groups decides whether skirting the fringes of the law is just or wrong. New technology, security devices, passwords and data back-ups all improve the security of computer systems and information networks. Our increasing dependence upon these technologies, unfortunately, will exponentially raise our vulnerability to their malicious use by a few. Accordingly, unless we better understand the moral reasoning process invoked by diverse end-users, our risk of being harmed will continue to rise sharply (cf. Davidson, Turiel & Black, 1983).

1.2. Differential Association and Social Control Theory

One of the challenges we face is that without context it is difficult for the actor or the observer to classify something as just or unjust (e.g., a hacker who enters a computer illegally but does not destroy any data). Sutherland's (1947) theory of differential association tries to address this issue and states that delinquency is rooted in normative conflict: modern society contains conflicting structures of norms and behaviours, as well as definitions of appropriate behaviour, that give rise to crime. At the individual level, Sutherland maintained that normative conflict is translated into individual acts of delinquency through differential association, learned through communication and primarily in intimate groups. In short, the actions my friends think justify the means will influence my reasoning. Hirschi's (1969) social control theory denies the existence of normative conflict; rather, it posits a single conventional moral order in society and assumes that the motivation for delinquency is invariant across people. Instead of asking "why do some people violate the law and social norms?," control theory asks "why do most people refrain from law and moral code violations?" Hirschi's answer is that they are dissuaded by strong bonds to conventional society, such as attachment to friends, involvement in the community and belief in what is right and wrong.

Trying to explain delinquency among white and black youths in the USA, Matsueda and Heimer (1987) reported that differential association theory "is supported over social control theory." Consequently, a person's moral framework of what is right and wrong when working with computers and any technology may be explained by differential association theory. In turn, what one considers correct as far as copying software, working with a virus for fun or hacking a computer system may depend largely upon one's friends' own values, behaviours and norms, or what they think is right or wrong. Hence, whom you keep company with may help explain why diverse end-user groups pursue different actions which may or may not harm others.

1.3. Cognitive-Developmental Approach

The above helps in assessing what external governors (e.g., social control) people may be subject to that would bring them to act legally. Additionally, we are also interested in how such external governors may influence the person's reasoning process, thereby helping us answer what brings people to act in moral or immoral ways (e.g., copying commercial software from a friend). The principal interest here is whether people assessing certain behaviours related to computer technology take a moralizing or a permissive stance toward these acts. People have notions of freedom from external constraints, ideas about the human construction of rules and laws, and ideas about who is to be included in the moral domain. Turiel (1983), Nucci (1981) and Nucci, Turiel and Encarnacion-Gawrych (1983) have developed a "domain" theory of moral development, in which development proceeds as children and adolescents sort social events into three domains of knowledge, namely the personal, moral and conventional domains. The personal domain is "outside the realm of societal regulation and moral concern" (Nucci, 1981, p. 114). Accordingly, if a person chooses to write a computer virus and then proceeds to test it with a scanner program on his/her own machine, this is not harmful to others. Acts which have interpersonal consequences but are not harmful fall within the domain of conventional knowledge. Accordingly, it is not harmful to wear jeans, but in the context of a school where the wearing of a uniform is mandatory, the pupil commits a violation of a local social convention. Similarly, downloading a computer virus program from a bulletin board (BB) or an electronic discussion list (EDL) is not necessarily harmful to others, since the virus will, if activated, affect one's own machine. However, convention may suggest not doing this because there is a potential for the virus to spread accidentally; hence the person may be playing with fire.
If the act is "intrinsically harmful" to others (e.g., violence, theft, or putting a virus onto somebody else's computer hard-drive without their explicit permission), it pertains to the moral domain. Intrinsic harm is said to be directly perceived, or inferred from direct perceptions (Turiel, 1983, pp. 41-43); as such, uploading a computer virus onto a BB or EDL for others to use entails intrinsically harmful material or psychological consequences to others if the virus falls into the hands of a malicious user. The literature suggests that because the harm is intrinsic to the act/behaviour, children and adults will reason that the act is universally wrong, even in another country (e.g., Haidt, Koller & Dias, 1993; Logan, Snarey & Schrader, 1990).

2. SOCIAL BONDS AND MORAL DEVELOPMENT: CYBERUSER MORAL CODE (CMC)

The literature about delinquency and moral codes outlined in the previous section suggests that we need to better understand how computer users feel about various behaviours, such as writing or playing with computer viruses, or using an encryption device or software to protect their data against misuse by others (i.e., similar to the use of scramblers to protect phone conversations against wiretapping).

2.1 The Additive Hypothesis

The literature suggests that both informal and legal controls deter potential offenders, and the greater either type of control, the greater the deterrence (Sherman, Smith, Schmidt & Rogan, 1992). This suggests that informal control in tandem with deterrents should reduce illegal behaviour by computer users. First, however, we need to determine how much social bonds affect the person's moral development. Research on social bonds and moral development suggests that if the computer user is sufficiently tied to society (e.g., employed and active in the community), his or her moral code is such that he or she will try to avoid the stigmatization attached to violating moral codes about, for instance, pirating software. The individual's CMC develops out of the interaction between social bonds (Matsueda & Heimer, 1987), normative control such as making a faux pas and drawing looks of disapproval from friends, co-workers and others (Heckathorn, 1990), and one's framework of morals in the personal, moral and conventional domains (Turiel, 1983; Nucci, 1981; Nucci, Turiel & Encarnacion-Gawrych, 1983). This suggests that the following proposition be tested:

2.2. Background and Personal Experience

Background variables such as socioeconomic status, age, employment status and use of computers may all affect moral development. For instance, Matsueda and Heimer (1987) reported that age and socioeconomic status affected the likelihood of becoming a delinquent in their sample of USA youths. Haidt, Koller and Dias (1993) reported that age had a significant effect upon people's moral development; for instance, children were more likely than adults to universalize their judgment (i.e., regardless of where an act is done, it is wrong to do it). Moreover, research on computers has shown that age and experience with computers affect individuals' perceptions of information technology (Gattiker, Gutek, & Berger, 1988), as does gender (e.g., Gattiker & Nelligan, 1988). If we deal with moral issues which may be part of one's personal, moral and/or conventional domain, past experience may also affect perceptions. For instance, if a computer virus has recently destroyed your hard-drive, you might feel more leery about any action/behaviour involving viruses. Accordingly, even though a person playing with a virus on his or her own machine does not harm anybody (personal domain of morals), you might feel that one should not play with a virus. Hence, your past experience leads you to place this act in the moral domain, i.e., others can be harmed if the virus is released into "the world of cyberspace." This suggests the following proposition:

Research Proposition 2: Groups (i.e., based on social background and computer experience) differentiate dissimilarly between the harmless and less harmless (at the fringes of the law) scenarios about computer-mediated actions across countries (e.g., using a private encryption device where such devices are actually banned versus placing a virus on a BB or EDL).

2.3. The Cultural Hypothesis

The increasing internationalization of business, and of cyberspace with its shrinking of geographical distances, requires that one address cultural differences. Many authors have claimed that North Americans are more individualistic than people from other nations (e.g., Hofstede, 1980). There also appears to be a high correlation between affluence and individualism (Hofstede, 1980). Affluence as perceived by respondents could have "at least as great and perhaps greater influence on the types of cultural characteristics one develops as extraneous out-of-culture definitions of one's status" (Helms, 1992, p. 1088), such as socioeconomic status as defined by income.

Even among affluent classes, two countries may have differing moralities. While in Brazil some rights and justice are outweighed by personal ties and the ethics of community, the ethics of autonomy are important in the United States (Haidt, Koller & Dias, 1993). Similarly, software piracy as identified by Western software companies may be perceived differently by others. Verzola (1994) reasons that pirating software from USA companies somehow offsets Malaysia's human capital losses, which occur when, after completing their education, Malaysian professionals emigrate to the USA.

Other international problems arise from differing laws across countries. For instance, the USA government is striving to make the dissemination of virus programs illegal. In 1994 a New Zealand youth built a plastic bomb from information posted on a BB, hurting himself and others when the bomb went off. In turn, a New Zealand MP introduced a Technology and Crimes Reform Bill in 1994 that, if made law, would halt some of the flow of obscene material by making it an offence to possess it, thereby enabling the government to disconnect BBs carrying such information. Similarly, in 1994 the Australian government established a task force which is considering alternatives for developing a regulatory system for BBs and import and export controls of information via EDLs and, most interestingly, is assessing whether current law enforcement powers are adequate. This suggests that certain behaviours that are part of one's personal domain (e.g., the ubiquitous practice of playing with a virus) may become part of the moral domain, or possibly illegal, in some countries. Unfortunately, without coordination across countries, it will be difficult for these efforts to regulate the flow of information fairly and justly across and within borders to succeed. This suggests the last proposition to be tested, namely:


3. PRELIMINARY RESEARCH FINDINGS AND IMPLICATIONS

To test some of the propositions outlined above, we chose to use vignettes describing particular situations a person might observe, and we asked participants to respond to sets of questions pertaining to these scenarios. At the start we had 10 stories of about 25-35 words each; the first five dealt with a person entering a computer with and without a password, while the remaining five asked about people's moral reasoning when observing a person playing with or programming a virus. The stories started off with relatively harmless acts (e.g., entering a computer without a password simply for fun, even though the person has the password for the system). The preliminary testing revealed that people interpreted entering a computer without using one's password as being in the conventional domain (one should not do this, even for fun), while leaving such a "cracker" program with a friend was felt to be part of the moral domain (intrinsically harmful to others, therefore inappropriate behaviour).

The results were not as obvious for stories dealing with the computer virus. For instance, younger individuals were more likely to attribute certain actions to the personal domain of morality, while older respondents felt certain actions were part of the conventional or moral domain.

Playing versus reality. The above findings left us with three stories to work with. In particular, younger individuals felt that playing with a virus was okay and the person's own decision, as long as she or he was careful not to release the virus into the "wild." Older respondents were less certain and often felt that it went beyond the personal domain or should simply not be done (it can harm others; immoral behaviour).

BBs and EDLs. During the preliminary testing, colleagues and associates worldwide who assessed the instrument pointed out that one scenario should address the BB/EDL issue: how a person decides, after reasoning, whether a behaviour is part of the personal, moral or conventional domain is of special interest when receiving information from abroad, where it might not be banned (e.g., how to make a bomb). The preliminary data indicate that North Americans are more likely to take a libertarian approach, letting the individual decide on his or her own. In contrast, European Community members felt it inappropriate to skirt the fringes of the law; these respondents tended to reason that such information should not be passed on to a friend abroad where such information is legal. Also, North Americans were less likely to take a moralizing stand by arguing that what one country did was appropriate while what another did was not. This surprised us, considering that the USA federal government often attempts to impose its own goals and beliefs upon other countries by twisting arms and using heavy-handed political maneuvering (e.g., international trade talks).

Clipper chip and encryption. The recent discussions about the USA's and other countries' efforts to establish encryption standards, which may ultimately regulate what can be used and, most importantly, ban the utilization of private devices, warranted addressing this issue. Again, the preliminary data show that USA respondents are adamant about protecting privacy and the person's freedom to choose any encryption device, including making their own. This is not surprising considering that in a 1994 TIME/CNN public opinion poll of 1,000 Americans, two thirds of the respondents felt that it was more important to protect the privacy of phone calls than to uphold law enforcement's ability to conduct wiretaps.

Differential association theory. The preliminary data also support this theory, indicating that a person's moral framework of what is right and wrong when assessing situations dealing with computer technology is affected by social bonds (e.g., friends). For instance, the way a person interpreted how friends might feel about certain things affected his or her own assessment, while full-time employees of lower perceived socioeconomic status were less tolerant of certain actions and behaviour involving cyberspace.

3.1. Implications for Management and Public Policy

The security of networks is at the core of today's strategic efforts to secure the firm's survival and future success, and cryptography is certainly one way to safeguard the organization against misuse of new technology. Ultimately, however, employees, not technology, must be at the core of the solution. Their interpretation of the personal, moral and conventional domains of knowledge about computer-mediated work, and of its do's and don'ts, plays a pivotal role. Technical structures must be coordinated with organizational and, most importantly, human resource efforts to increase security against potentially serious attacks which could harm the firm and many other organizations and individuals (e.g., a leak of national security data). As the Dutch "crackers" who penetrated a United States military computer system in the summer of 1991 demonstrated, there is no practical way to protect totally against such intruders; however, the risk of their success can be reduced greatly by employees being alert and having an understanding of the personal, moral and conventional domains of knowledge about computer-mediated work, which safeguards the firm against such attacks.

Preliminary data gathered through this research program suggest that people are willing to break the law in the cyberspace domain (e.g., keeping illegal copies of software for a while to make use of them), thereby making the enforcement of laws protecting property rights very difficult. If the public feels that certain things are part of the conventional domain only (one should not do it) but are not immoral, then governments and firms will have a very hard time enforcing these laws. Moreover, the social control of individuals may not succeed if a large group feels that certain behaviours, such as playing with viruses, are appropriate. Making these actions illegal may neither diminish the incidence of these types of actions nor reduce the current backlog of pending cases in the court system.

3.2. Implications for Research

Morality is of indisputable importance when discussing computer and network security and serves as a guide for organizing one's "life" in cyberspace. Morality is rooted in other aspects of social life, such as the social bonding and associations of the individual, which may affect his or her acts as far as computer technology is concerned. Political systems, such as capitalism and socialism, may also lead people to assess actions differently within the domains of moral development. This further suggests that cultural differences in the organization of computer-related actions by individuals, and their possible transgressions of moral standards, must be addressed. Comparing findings from USA data with those from other countries suggests that this may be the most difficult challenge for research investigating cyberspace issues.

Encryption of passwords, and software that can handle encrypted messages and authenticate both recipient and sender, will reduce the risk of fraudulent use of and intrusion into computer systems. As networks become more global, local laws become mere local ordinances, since other countries may not qualify certain acts as crimes nor ban them. Without further investigating how the CMC is developed, maintained and changed among the various groups such as hackers, crackers, cyberpunks and cyberusers, techno-crime and terror against private users and firms by a few will continue to wreak havoc in many information systems. Gaining a better understanding of these issues, in turn, will enable us to better understand why people behave the way they do and subsequently permit us to protect users. Issues which we consider part of the personal domain of moral theory may have to be shifted to the conventional domain, if not the moral one, in order to prevent people from doing things which could harm others if the software, technique or password lands in the lap of a malicious user. We are currently investigating these issues further and hope shortly to shed more light on the matters discussed here.
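The sender/recipient authentication mentioned above can be sketched in a few lines. The following Python fragment is our own minimal illustration, not part of the original study: it assumes a shared secret key (the key value and message are invented for the example) and shows how a keyed message digest lets the recipient confirm that a message came from someone holding the key and was not altered in transit; a real system would also encrypt the payload itself.

```python
import hmac
import hashlib

# Hypothetical shared secret known to both sender and recipient.
SHARED_KEY = b"a-secret-known-to-both-parties"

def sign(message: bytes) -> bytes:
    """Sender attaches a keyed digest (HMAC tag) computed over the message."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Recipient recomputes the tag; a mismatch indicates forgery or tampering."""
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(expected, tag)

msg = b"transfer file: report.txt"
tag = sign(msg)
assert verify(msg, tag)              # authentic message is accepted
assert not verify(b"tampered", tag)  # altered message is rejected
```

The design choice illustrated here is that authentication and secrecy are separate properties: the tag proves origin and integrity without hiding the message, which is why the paragraph above treats encryption and authentication as two distinct safeguards.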


REFERENCES

Davidson, P., Turiel, E., & Black, A. (1983). The effect of stimulus familiarity on the use of criteria and justifications in children's social reasoning. British Journal of Developmental Psychology, 1, 49-65.

Gattiker, U. E., & Nelligan, T. (1988). Computerized offices in Canada and the United States: Investigating dispositional similarities and differences. Journal of Organizational Behaviour, 9, 77-96.

Gattiker, U. E., Gutek, B. A., & Berger, D. E. (1988). Office technology and employees' attitudes. Social Science Computer Review, 6, 327-340.

Haidt, J., Koller, S. H., & Dias, M. G. (1993). Affect, culture, and morality, or is it wrong to eat your dog? Journal of Personality and Social Psychology, 65, 613-628.

Heckathorn, D. D. (1990). Collective sanctions and compliance norms: A formal theory of group-mediated social control. American Sociological Review, 55, 366-384.

Helms, J. E. (1992). Why is there no study of cultural equivalence in standardized cognitive ability testing? American Psychologist, 47, 1083-1101.

Hirschi, T. (1969). Causes of delinquency. Berkeley: University of California Press.

Hofstede, G. (1980). Culture's consequences. Newbury Park, CA: Sage Publications.

Logan, R., Snarey, J., & Schrader, D. (1990). Autonomous versus heteronomous moral judgment types: A longitudinal cross-cultural study. Journal of Cross-Cultural Psychology, 21, 71-89.

Matsueda, R. L., & Heimer, K. (1987). Race, family structure, and delinquency: A test of differential association and social control theories. American Sociological Review, 52, 826-840.

Nucci, L. (1981). Conceptions of personal issues: A domain distinct from moral or societal concepts. Child Development, 52, 114-121.

Nucci, L., Turiel, E., & Encarnacion-Gawrych, G. (1983). Children's social interactions and social concepts: Analyses of morality and convention in the Virgin Islands. Journal of Cross-Cultural Psychology, 14, 469-487.

Sherman, L. W., Smith, D. A., Schmidt, J. D., & Rogan, D. P. (1992). Crime, punishment, and stake in conformity: Legal and informal control of domestic violence. American Sociological Review, 57, 680-690.

Sutherland, E. H. (1947). Principles of criminology (4th ed.). Philadelphia: Lippincott.

Turiel, E. (1983). The development of social knowledge: Morality and convention. Cambridge, UK: Cambridge University Press.

Verzola, R. (1994). Software piracy: Another view. 2600 The Hacker Quarterly, 11(1), 16-17.


Retrieved from http://www.icsa.net/library/research/tecnoc.shtml, July 1999.