
Title : Legal Expert Systems: A Humanistic Critique of

: Mechanical Legal Inference

Author : Andrew Greinke <Andrew.Greinke@anu.edu.au>

Organisation : The Australian National University

Keywords : Computers and law; expert systems; artificial

: intelligence; computerised decision-support systems

Abstract :

The author surveys a wide range of computerised expert systems and shows that they invariably rely on pattern-matching and rule application strategies which have been embodied in their inference mechanisms and knowledge representations. This computational approach is argued to be unsuitable for use with law, which presents a domain of intractable complexity arising out of the need to refer to social context and human purpose in resolving legal issues. The author concludes that a better use for computation in legal applications is in the form of decision-support systems that leave legal inference to human agents.

Contact Name : The Editors, E Law

Contact Address: Murdoch University Law School, PO Box 1014,

: Canning Vale, Western Australia, 6155

Contact Phone : +61 09 360 2976

Contact Email : elaw-editors@csuvax1.murdoch.edu.au

Last Verified :

Last Updated :

Creation Date : 29 November 1994

Filename : greinke.txt

File Size : 126KB

File Type : Document

File Format : ASCII

ISSN : 1321-8247

Publication Status: Final

COPYRIGHT POLICY:

Material appearing in E Law is accepted on the basis that the

material is the original, uncopied work of the author or authors.

Authors agree to indemnify E Law for all damages, fines and costs

associated with a finding of copyright infringement by the author or

by E Law in disseminating the author's material. In almost all cases

material appearing in E Law will attract copyright protection under

the Australian _Copyright Act 1968_ and the laws of countries which

are member states of the _Berne Convention_, _Universal Copyright

Convention_ or have bi-lateral copyright agreements with Australia.

Ownership of such copyright will vest by operation of law in the

authors and/or E Law. E Law and its authors grant a license to those

accessing E Law to call up copyright materials onto their screens and

to print out a single copy for their own personal non-commercial use

subject to proper attribution of E Law and/or the authors.

EMAIL RETRIEVAL: message "get elaw-j greinke.txt"

URL

gopher://infolib.murdoch.edu.au:70/00/.ftp/pub/subj/law/jnl/elaw/refereed/

greinke.txt

ftp://infolib.murdoch.edu.au/pub/subj/law/jnl/elaw/refereed/greinke.txt

--------------------------------------------------------------

LEGAL EXPERT SYSTEMS: A HUMANISTIC CRITIQUE OF

MECHANICAL LEGAL INFERENCE

by

ANDREW GREINKE <Andrew.Greinke@anu.edu.au>

BComm (Hons) LLB (Hons) CPA

Lecturer, Department of Commerce

The Australian National University

"Suppose I am in a closed room and that people are passing in to me a

series of cards written in Chinese, a language of which I have no

knowledge; but I do possess rules for correlating one set of

squiggles with another set of squiggles so that when I pass the

appropriate card back out of the room it will look to a Chinese

observer as if I am a genuine user of the Chinese language. But I am

not; I simply do not understand Chinese; those squiggles remain just

squiggles to me." [*]

1: INTRODUCTION

Computerisation of the legal office is an ongoing process. The range

of non-legal applications now in common use include word processing,

accounting, time costing, communication and administration systems.

[1] More recently it has been demonstrated that computers can be

used as research tools, particularly in the retrieval of primary

legal materials. Prominent examples include the LEXIS, SCALE and

INFO1 databases, now familiar to many practitioners. [2] Some moves

have also been made towards "conceptual" text retrieval systems. [3]

Flushed with successes in projects such as DENDRAL, [4] PROSPECTOR,

[5], and MYCIN, [6] computer scientists have now turned to law in

order that they might "widen their range of conquests". [7] The

interaction between computers and the law has now spawned a large and

disparate discipline, boasting eight centres for Law and Informatics

in Europe, as well as growing numbers of similar centres in North

America, Japan and Australia. The "resource" of expert legal

knowledge, "often transitory, even volatile in nature", is seen as

worthy of nurture and preservation. The use of legal expert systems is

seen as capable of preserving indefinitely, and placing at the disposal

of others, the wealth of legal knowledge and expertise. [8] The idea is

not new, being anticipated by writers such as Loevinger [9] and Mehl

[10] as early as 1949.

Yet lawyers have generally greeted "legal expert systems" - seen by

some as the natural progression in the use of computers - with

apathy, ignorance or resistance. [11] This article argues that such

opposition is justified when proper regard is had to the implications

arising from the computational foundation for such systems.

It is necessary for the following analysis to clearly distinguish two

fundamentally distinct classes of computer applications to law:

decision support systems, and expert systems. [12] "Decision support

systems" are powerful research tools or "intelligent assistants"

designed to support decisions taken and advice given by human

experts. "Legal expert systems" are designed to make decisions and

provide advice as would a human expert. Richard Susskind, a British

researcher whose work [13] constitutes the major theoretical

grounding of legal expert systems, states:

"Expert systems are computer programs that have been constructed (with

the assistance of human experts) in such a way that they are capable

of functioning at the standard of (and sometimes even at a higher

standard than) human experts in given fields . . . that embody a

depth and richness of knowledge that permit them to perform at the

level of an expert."[14]

Legal expert systems are a type of knowledge based technology. With

the explosion of applications, "expert system" is quickly becoming an

imprecise term. [15] The definition used by Feigenbaum will be

acceptable for the type of systems examined in this article:

"An intelligent computer program that uses knowledge and inference

procedures to solve problems that are difficult enough to require

significant human expertise for their solution. Knowledge necessary

to perform at such a level, plus the inference procedures used, can

be thought of as a model of the expertise of the best practitioners

of the field."[16]

In terms of programming technology, the knowledge based approach has

been described as an "evolutionary change with revolutionary

consequences", [17] replacing the tradition of

data + algorithm = program

with a new architecture centred around a "knowledge base" and an

"inference engine" so that:

knowledge + inference = expert system.

In fact, there are four essential components to a fully functional

expert system:

1. the knowledge acquisition module;

2. the knowledge base;

3. the inference engine; and

4. the user interface.

Knowledge acquisition is the process of extracting knowledge from

experts. Given the difficulty involved in having experts articulate

their "intuition" in terms of a systematic process of reasoning, this

aspect is regarded as the main "bottleneck" [18] in expert systems

development. The knowledge base stores information about the subject

domain. However, this goes further than a passive collection of

records in a database. Rather it contains symbolic representations

of experts' knowledge, including definitions of domain terms,

interconnections of component entities, and cause-effect

relationships between these components. In legal expert systems this

usually consists of formalised legal rules obtained from primary and

secondary sources of law. Another layer of rules may also be

obtained from less formal knowledge not found in published

literature, [19] such as "practitioner's hand books and internal

memoranda within legal practices". [20] These heuristics [21] add

"experiential" to "academic" knowledge. [22]

An inference engine consists of search and reasoning procedures to

enable the system to find solutions, and, if necessary, provide

justifications for its answers. The nature of this inference process

is described in detail in Section 2. The user interface is critical

to the commercial success of expert systems, particularly in the

legal field, to enable lawyers with little or no expertise in

programming, to gain access to the encoded knowledge. Typically this

is in the form of prompting for information, and asking questions

with "yes", "no" and "unknown" responses.
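
The interplay of these four components can be sketched in a few lines.
The following Python fragment is illustrative only: its two
contract-law rules are invented for the example, and are not drawn from
any system discussed in this article.

```python
# A minimal illustrative shell: the knowledge base is a list of IF-THEN
# rules, the "inference engine" fires any rule whose conditions are all
# established, and the user interface asks yes/no questions for
# conditions no rule can itself establish. The rules are hypothetical.
RULES = [
    ({"offer", "acceptance", "consideration"}, "contract"),
    ({"contract", "breach"}, "damages_available"),
]
CONCLUSIONS = {conclusion for _, conclusion in RULES}

def consult(ask):
    """Forward-chain over RULES, calling ask(condition) -> bool
    ("yes"/"no") for primitive, non-derivable conditions."""
    facts, asked = set(), set()
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conclusion in facts:
                continue
            # prompt the user for primitive (non-derivable) conditions
            for c in conditions - facts:
                if c not in CONCLUSIONS and c not in asked:
                    asked.add(c)
                    if ask(c):
                        facts.add(c)
            if conditions <= facts:   # all antecedents match: rule fires
                facts.add(conclusion)
                changed = True
    return facts
```

A user who answers "yes" to everything except "breach" would see the
system derive "contract" but not "damages_available".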

Artificial intelligence, the foundation for legal expert systems, has

run up against both practical and theoretical difficulties. While

computers can beat the average human at "clever" tasks such as

playing chess, they are "impossibly stupid" over tasks taken for

granted such as speaking a language or walking across a room. [23]

This casts doubt on whether many human activities do, as some

artificial intelligence researchers suggest, consist of suppressed

computational algorithms. There has been severe criticism by those

who claim that knowledge, by its very nature, is not amenable to

representation on a computer, [24] or that they achieve no more than

simple competency. [25] Early misconceptions about the ease with

which powerful and knowledgeable systems could be built for use by

relative novices have given way to concern about real problems of

knowledge elicitation and knowledge modelling. [26] Some opponents

are convinced that the claims of artificial intelligence are

exaggerated and their objectives unreachable. [27]

Section 2 examines the nature of the inference engine, and suggests

that its deductive procedures rest in pattern matching routines. It

also explores issues of "fuzzy" and "deontic" logic. Section 3

explores the implications for knowledge representation, and questions

whether devices such as "semantic networks", "frames" and "case based

reasoning" are anything more than elaborate pattern matching

constructs. Section 4 demonstrates that the need to be amenable to a

deductive inference engine involves unacceptable distortion of law

both at a practical and theoretical level. Section 5 argues that

legal reasoning necessarily involves resort to social context and

purpose, which is not tractable within current technology. Section 6

suggests that researchers ought to abandon legal expert systems, and

instead concentrate on computerising more mechanical tasks such as

legal retrieval and litigation support. A summary and conclusion is

contained in Section 7.

2: PATTERN MATCHING AS THE CORE OF AN EXPERT SYSTEM

The core of any expert legal system is its inference engine. This

Section investigates the nature of computer inference at a basic

level, and argues that it is little more than a pattern matching

exercise. It is also argued that more sophisticated approaches, such

as fuzzy logic, and deontic logic, are no more than extensions of the

same principles.

2.1 _The Nature of Computer Inference_

Computer inference is undertaken by a simple strategy known as modus

ponens. This means that the following syllogism is assumed to be

correct:

A is true (fact)

If A is true then B is true (rule)

therefore B is true (conclusion)

Computer deduction is obtained by conditioning the consecutive

execution of instructions on matching, or failing to match, values in

storage registers. The identical syllogism is obtained by a computer

using a routine in the following terms:

1. check the value of register X1

2. compare the value of X1 to a value in register A

3. if X1 = A then:

4. change the value of register X2 to the value in register B

The computer is conditioned by the value placed in register X1,

either by the user or by satisfaction of some prior rule. A is taken

to be true if the value of X1 is equal to a particular value in

register A, representing some real world condition. The rule "if A

then B" is contained implicitly within the structure of the routine

by conditioning step 4 on the satisfaction of the condition X1 = A

(i.e. A is true). The modus ponens is completed by step 4, which

alters another register to equal a value B, thereby asserting that B

is true.
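
This register-level routine can be transcribed directly. The following
Python fragment, with illustrative register contents, makes the point
that "deduction" here is nothing more than a comparison followed by an
assignment.

```python
# The four-step routine above: a comparison of stored values followed
# by an assignment. The register contents are illustrative.
def modus_ponens(reg):
    # steps 1-3: check X1 and compare it with A (the pattern match)
    if reg["X1"] == reg["A"]:
        # step 4: set X2 to the value in B, thereby asserting "B is true"
        reg["X2"] = reg["B"]
    return reg

reg = {"X1": 1, "A": 1, "B": 2, "X2": 0}
modus_ponens(reg)   # X1 matches A, so X2 now holds B's value
```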

A programmer in BASIC or PASCAL, for instance, has some relationship

in mind between the data supplied to the program and the output to be

produced by computation. The input data are stored in the machine's

memory, and the programmer's task is to devise a sequence of

instructions to manipulate the data in accordance with the

relationships she envisages. In such a case the inference engine

constitutes the implicit algorithm contained within the sequence of

instructions. Examples of such algorithmic knowledge based systems

include Hellawell's CORPTAX, [28] CHOOSE, [29] and SEARCH, [30] all

implemented in BASIC.

2.2 _Logic Programming_

Normal programming can at best maintain logical relationships

implicitly within the program's structure. A number of researchers

are now involved in logic programming, [31] which has been seen as

the real technical breakthrough in this field. Some have extended

logic programming to the point of writing expert systems by means of

another logic based application, such as DARWIN. [32] Logic

programming allows the programmer to specify logical relationships,

not in terms of sequential instructions, but in terms of some

symbolic language. [33] It is then up to the machine to compile the

set of sequential instructions to maintain the desired relationship.

A logic programming system can be regarded as a kind of rule-based

system where the inference engine becomes a "mechanical theorem

prover", [34] a machine for answering questions of the form:

Do axioms (A0 . . . An) logically imply B?

The claimed advantages of rule-based logic systems over conventional

programs are perspicuity and modularity. [35] Perspicuity is

obtained by separating the rules (the knowledge base) from the

logical operators (the inference engine). This has important

implications for system maintenance and debugging. Modularity exists

since the knowledge is split into small and independent rules.

Legal rules, written in symbolic language, are manipulated through a

process of "forward" and "backward chaining". A set of IF-THEN

rules, constituting a "search space", [36] are compared against a set

of facts to reach a logical conclusion. [37] In an expert system

forward chaining simply involves matching the IF conditions to the

facts, according to a predetermined order, which, under the rules,

dictates a conclusion. [38] Susskind describes this as a "control

strategy" which "triggers" and "fires" the rules. [39]

Backward chaining starts with the legal conclusion and searches for

justifying antecedents in the knowledge base. In terms of

programming this technique is more difficult since the search of the

knowledge base is not along a single "path" but involves

identification of all possible rules leading to the required

conclusion. [40] In essence, it matches THEN variables with their IF

antecedents and compiles a list of the paths thus generated.

A "goal driven" expert system predetermines a conclusion and

identifies the legal arguments and reasoning that can be used in

support of that conclusion. [41] In logic programming terms this is

no more than backward chaining across the search space. These

processes of forward and backward chaining form the core of expert

systems inference procedure.

One notable feature of logic programming is the Horn clause, seen as

a suitable extension to the "simple" predicate logic already

outlined. It is of the form:

A if B0 and . . . Bn where n >= 0.

which consists of a single conclusion, A, and any number of

conditions (B0, B1, . . .Bn). For example, the Horn clause:

X is the father of Y if X is a parent of Y and X is male

is a Horn clause with one conclusion and two conditions. Factual

premises, such as "X is male" can be expressed as a Horn clause with

one conclusion and no conditions. The significance of such a clause

is that symbolic logic is not limited to IF-THEN statements, but may

be extended to IF-AND-NOT-THEN statements. In terms of actual

programming, however, the Horn clause is implemented as a bundle of

IF-THEN assertions; each condition being checked separately for a

pattern match, and the routine halting on failure to match.
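
The execution strategy just described can be sketched as follows. The
facts in this Python fragment are invented for illustration.

```python
# The "father" Horn clause from the text, executed as the article
# describes: each condition is checked separately for a pattern match
# against stored facts, and the routine halts on the first failure.
FACTS = {("parent", "tom", "ann"), ("male", "tom")}   # illustrative

def father(x, y, facts=FACTS):
    if ("parent", x, y) not in facts:   # condition: X is a parent of Y
        return False                    # halt on failure to match
    if ("male", x) not in facts:        # condition: X is male
        return False
    return True                         # every condition matched: A holds
```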

The pattern matching approach of logic programming is

not relaxed but in fact tightened by the use of "integrity

constraints", such as IF-THEN-ELSE structures, to close off the

potential for negation by failure [42] and counterfactual conditional

[43] difficulties.

Constructed on the basis of Horn clauses, PROLOG and variations have

been the platform for most logic programming projects, both in logic

and procedure. APES, [44] implemented in PROLOG, is one widely used

augmentation. [45] Using PROLOG as a symbolic logic structure

involves rendering the domain knowledge in terms of Horn clauses,

rewriting them in PROLOG syntax, and then executing the result as a

program. PROLOG may itself provide a procedural basis for expert

system platforms. Horn clauses may be backward chained as a

procedure, working from conclusions to conditions and, as a sub-task,

pattern matching each against its knowledge base, or user input. The

program statements can thereby mix conditions which express legal

rules with procedures to prompt the user for additional information.

An example is Schlobohm's system to determine "constructive ownership

of stock" under United States revenue laws. [46] The LEX [47]

project is a more sophisticated application of the same principles.

2.3 _Fuzzy Logic_

Fuzzy logic is an attempt to escape the perceived inadequacy of

binary logic. [48] Zadeh introduced the concept of the fuzzy set

[49] to provide a formal way of speaking about imprecise concepts,

such as "large" and "small". Rather than requiring precise values to

be attached to particular characteristics, a spectrum of values,

broken into categories, is used to match concepts, analogous to

concepts in cognitive psychology. [50] The object of fuzzy logic is

to convert continuous measurements into approximate discrete values.

For example, a rule of the form:

A PERSON IS A MINOR IF UNDER 18 YEARS OF AGE

can be rendered by the following simple routine:

1. check register AGE

2. if AGE < 18 then:

3. set register PERSON to value 123

where 123 represents "minor". The spectrum of values "less than 18"

is the fuzzy category. On a more complex level, matching can take

place not only between ranges of values, but fuzzy sets. In binary

logic, two concepts will be identical if and only if their membership

functions, that is, their defining characteristics, exactly coincide.

For instance, if F is a class of subsets of X, a set of

characteristics defining legal concepts, then for Y and Z:

Y is identical to Z iff fY(x) = fZ(x) for all x

Rather than matching "identical" sets, fuzzy logic matches "closely

identical" or "sufficiently close" sets. [51] To ascertain

"closeness", a probabilistic metric is constructed. For example

D(Y,Z) = Integral [{fY(x) - fZ(x)}^2 . p(x) . dx]

where p(x) is some probability distribution on X. D(Y,Z) is

therefore a metric that depends on the choice of p(x). Using these

definitions, one can test "closely identical" by inferring that:

Y is closely identical to Z iff (1 - D(Y,Z)) >= d

where d is some arbitrary threshold which can itself be used to

trigger the operation of a rule.
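
Replacing the integral with a weighted sum over a finite set of
characteristics, the closeness test can be sketched as follows. The
concepts, weights and threshold in this Python fragment are
illustrative only.

```python
# A discrete sketch of the closeness test: membership functions over a
# finite set of characteristics, a probability weight p(x) per
# characteristic, and an arbitrary threshold d. All values are invented.
def closeness(fY, fZ, p):
    """Return 1 - D(Y,Z), the integral replaced by a weighted sum."""
    D = sum(p[x] * (fY[x] - fZ[x]) ** 2 for x in p)
    return 1.0 - D

fY = {"wheels": 1.0, "engine": 1.0, "carries_people": 1.0}  # "car"
fZ = {"wheels": 1.0, "engine": 0.0, "carries_people": 1.0}  # "bicycle"
p  = {"wheels": 0.4, "engine": 0.4, "carries_people": 0.2}

d = 0.7                                  # arbitrary threshold
rule_fires = closeness(fY, fZ, p) >= d   # 0.6 < 0.7: does not fire
```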

"Fuzzy logic" is therefore one way of rendering continuous or

approximate concepts into terms amenable to computer deduction. It

is, however, no more than an extension to logic programming

techniques. Critics suggest that fuzzy logic is no more than

oversophistication of arbitrary approximation; that its appearance of

precision is spurious, and that its philosophical basis is uncertain

when applied to legal concepts. [52]

2.4 _Deontic Logic_

In legal expert systems the nature of law as a normative system, [53]

has given rise to a perceived necessity for incorporation of deontic

logic. [54] Whereas traditional and classical logics provide formal

canons for reasoning with empirical statements that have truth value,

deontic logics provide standards for reasoning with statements which

lack truth value [55] in the sense that they describe norms or

imperatives. They cannot be characterised as true or false or

logically related to each other or to statements of fact. [56]

McCarty has consistently argued for "intuitionistic" rather than

classical logic as the basis for representing legal concepts. He

sets out some theoretical suggestions, as yet unimplemented, for the

semantics of normative concepts in legal expert systems. [57]

General foundations were laid by von Wright, [58] who developed a

system of logic based on possibility and necessity. According to

McCarty, "permission" exists in the union of all states and substates

in which an action is necessarily true given the conditions of these

states. This forms the "Grand Permitted Set" or a boundary condition

for legality. [59] "Obligation" exists in the intersection of these

sets. [60] In programming terms, "permission" entails backward

chaining from the proposed action to all the states of the world.

"Obligation" then is moving forward from all states of the world to

find a common action.

McCarty designed a language called LLD which allegedly possessed

distinct advantages for legal applications in its use of action terms

and deontic language. [61] However, his unimplemented proposal is

problematic. LLD attempted to represent law in count terms, mass

terms, states, events, actions, permissions and obligations.

However, even McCarty admits that LLD failed to represent purpose,

intention, knowledge and belief. [62] Jones demonstrates that a

striking feature of McCarty's theorem is that an obligation to act

did not logically imply its permission. In particular he

demonstrates that under McCarty's analysis, one could logically

derive "permission to poison the King from an obligation to serve

him". [63] The only difference from logic programming in the

suggested implementation of LLD lies in the use of fuzzy categories.

That it does not stray too far from traditional logic programming is

obvious since LLD is constructed almost entirely from Horn clauses.

[64] McCarty therefore fails to tackle more difficult and

fundamentally philosophical problems in deontic logic. [65]

Stamper's LEGOL [66] project proposed a number of extensions to

enable the system to handle concepts such as purpose, right, duty,

judgment, privilege, and liability; [67] yet these were never

implemented. [68] His latest project, NORMA, [69] has the object of

relating all formalised symbols directly to the notions of agent,

intention, and behaviour. However, it is doubtful whether this goal

can be achieved, given that his languages are based in typical

control structures such as sequencing of rules, if-then branches, and

iteration, [70] hence easily rewritten as a logic program. [71]

Sergot suggests that both Stamper's work and McCarty's LLD have

simply taken standard semantics of logical formalism and presented

their own variant. [72]

Deontic logic may in any case be non-computational. Since the limit

to current technology ultimately lies in the mechanistic linking of

discrete relationships, the modelling of any "normative" aspect of

law will not by its very nature be amenable to computer processing:

"For there is not much sense in asking how . . . by having 255 in

register 1234567 licences coming to have the number 128 in register

450254925." [73]

Perhaps in light of this limit to technology, both the Oxford Project

[74] and the Imperial College Group [75] have avoided deontic logic.

Susskind reduced deontic logic to predicate logic by treating the

normative aspect of law as merely linguistic. [76] Deontic labels

were attached to different varieties of mechanical cause [77] and

effect. [78] Normative statements were simply rewritten into

declarative symbolic language. [79]

The important implication from this work on deontic logic is

recognition of the error in equating "logic" as understood by a

computer with "logic" as understood in wider contexts. [80] In

particular, reference can be had to MacCormick's distinction between

"formal" and "everyday" logic, with the latter being based in common

sense. [81] Researchers such as Stamper appear to be aware of such

difficulties, but find themselves constrained by the existing

technology. Whether legal reasoning can be computational is

addressed in Section 5.

3: KNOWLEDGE REPRESENTATION AND THE PROBLEM OF CLASSIFICATION

The previous Section demonstrated that the process of computer

inference was limited to an elaborate process of pattern matching.

This Section investigates the implications for knowledge

representation; in particular, that it is necessary for knowledge to

be represented in terms of IF-THEN rules. It is also argued that

more "sophisticated" representation techniques, such as "semantic

networks" are no more than elaborations of this basic structure.

3.1 _Pattern Matching and Open Texture_

To be implementable, the knowledge base must be structured so as to

be amenable to deductive inference procedures. In theory, it is

possible to use any form of symbolic logic as the representational

formalism as long as it is appropriate to a deductive inference

engine. This condition requires that the knowledge base must be in

the form of pattern matching rules. In logic programming,

computation is deduction, and the task of the programmer is therefore

to pose a problem suitable for a deductive process. [82]

The major difficulty encountered is what broadly may be termed

"semantic indeterminacy". [83] Not all legal rules are appropriate

for application in all situations. [84] Legal expert systems have

been acknowledged to be only capable of solving problems referred to

as "clear cases of the expert domain". [85] Yet what is a clear

case? The Oxford Project defined a clear or "easy" case as one

easily solved by an expert, yet hopelessly difficult for non-experts.

[86] Gardner [87] drew the distinction between hard and easy cases

by describing the latter as situations whose verdict would not be

disputed by knowledgable and rational lawyers, whereas they may

rationally disagree as to the former.

The real answer for legal expert systems lies in the nature of the

computation process. When presented with the facts of a case, the

expert system must decide whether or not a rule applies. Since "fact

situations do not await us neatly labelled, creased, and folded" [88]

the difficulty lies in subsuming particular instances under a general

rule. [89] A "hard case" is therefore one where the system fails to

match the appropriate pattern, thereby preventing a rule from firing.

As computer logic relies on pattern matching, knowledge

representation necessarily must encounter problems of classification.

[90] What is "ultimately beyond the grasp of a computer," states

Detmold, "is not complexity, but particulars". [91] This difficulty

is often termed "open texture". The notion of open texture is

obtained from Hart's analysis. In the now infamous regulation:

NO VEHICLES ARE PERMITTED IN THE PARK

the open-textured term here is vehicle. The difficulty in terms of

legal expert systems is how the program can classify an object as

being a "vehicle" falling under the rule. Hart suggests that general

words like "vehicle" must have a set of standard instances in which

no doubts are felt about its application. There must be a "core of

settled meaning", but there will be, as well, a "penumbra of debatable

cases". [92]

In terms of computation, a case is within the core of settled

meaning, and is classified as "easy" where there is a matching

pattern in the knowledge base. Cases in the penumbra of doubt are

hard, since they cannot be classified by the system. The difficulty

for legal expert systems, then, is to build a system for

classification, so that the pattern matching process can take place.

Skalak [93] suggests there are three theoretical models of

classification:

- the classical model

- the probabilistic model

- the exemplar model

All three models have been extensively used in expert systems

technology. The following analysis demonstrates that the first two

models have little to distinguish them in practical effect, and

together constitute an inadequate response to the problem of open

texture. The exemplar model has been used as justification for case

based reasoning approaches, but the term has been misused, and such

cases are argued instead to fall into the "probabilistic" model. The

exemplar model is returned to in Section 6, where it is used as the

basis for suggested development of computer applications to law.

3.2 _The Classical Model_

In the classical model, a concept is defined by necessary and

sufficient conditions. Hafner [94] suggests that these conditions

can be formally represented by knowledge structures involving

decision rule hierarchies, taxonomic hierarchies, or role structures.

Decision rule hierarchies specify the conditions under which a

concept is true or false. "Vehicle" may be defined by a set of

characteristics such as "four wheels", "engine" and so on. In

programming terms, this means that the IF antecedents are themselves

THEN consequents based on sets of prior conditions which constitute

the "definition" of a term. This quickly builds into a "decision

tree" structure. [95] Statute law, particularly statutory

definitions, is seen as particularly suitable for rendering into what

amounts to typical Horn clauses. [96] A prominent example of this

technique is the modelling of the British Nationality Act 1981. [97]

Others include the United Kingdom supplementary benefits legislation,

[98] and STATUTE, now used by some Australian government departments.

[99]

Taxonomic hierarchies define sub-types of concepts, placing different

objects into groups and sub-groups. For instance, the class

"vehicle" may have among its sub-classes "car", which in turn may be

further sub-classed into "Toyota", "sedan" and specific instances

based on model types, years, and so on. [100] Any taxonomic

hierarchy can, however, be represented in terms of a chain of

decision rule hierarchies, or IF-THEN rules. Role structures are

modelled in "frames", and "semantic networks". A semantic network

[101] is a collection of objects called "nodes". These are connected

by "links". [102] Typical links include "is a" links to represent

class-instance relationships and "has a" links to represent

part-subpart relationships. Interconnected, these may quickly build

into a complex web of relationships. A frame [103] is a subset of a

semantic network, being a representation of a single object with a

set of "slots" for the value of each property associated with the

object. All links in both semantic nets and frames are, however,

functionally equivalent to taxonomic hierarchies. Hayes demonstrates

that both semantic networks and frames are no more than elaborate

logic programs, and concludes that they hold no new insights. [104]

The semantic network may be forward and backward chained as a set of

logical rules, just as would a rendering of a set of Horn clauses in

PROLOG. [105]
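
This reduction can be made concrete. In the following Python sketch
the "is a" links are stored as plain facts, and class membership is
recovered by chaining along them just as a logic program would chain
IF-THEN rules; the taxonomy is illustrative.

```python
# "Is a" links of a semantic network stored as facts; class membership
# is recovered by chaining over them like a set of logical rules.
IS_A = {
    "camry": "toyota",   # instance "is a" sub-class (illustrative)
    "toyota": "car",     # sub-class "is a" class
    "car": "vehicle",
}

def is_a(node, cls):
    """Follow "is a" links upward until cls matches or links run out."""
    while node is not None:
        if node == cls:
            return True
        node = IS_A.get(node)
    return False
```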

For instance, in McCarty's TAXMAN project [106] the domain was

modelled in terms of objects such as corporations, individuals,

stocks, shares, transactions &c. Each object is described by a

"template", being a collection of the object's properties, such as

name, address, size, and value. These properties are then linked and

indexed. Each "bundle of assertions" constitutes the object's

"frame". [107] These structures are aimed at answering questions

such as:

"Does the taxpayer and her family have a controlling interest in the

stock of a company which is a partner in a partnership which owns an

interest in XYZ Ltd?" [108]
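A hypothetical sketch of such templates might look as follows. The object names, slots, and links here are invented for illustration and are not McCarty's actual representation; the point is only that answering the quoted question reduces to mechanically following indexed links between frames:

```python
# Hypothetical TAXMAN-style frames: each object is a "bundle of
# assertions" - a dict of slots - with ownership links between them.
# All names and slots are invented for illustration.
frames = {
    "taxpayer_family": {"type": "persons", "controls": "company_A"},
    "company_A": {"type": "corporation", "partner_in": "partnership_P"},
    "partnership_P": {"type": "partnership", "owns_interest_in": "XYZ Ltd"},
}

def linked_to(start, target):
    """Follow any slot whose value names another frame (or the target)."""
    seen, stack = set(), [start]
    while stack:
        obj = stack.pop()
        if obj == target:
            return True
        if obj in seen or obj not in frames:
            continue
        seen.add(obj)
        stack.extend(v for v in frames[obj].values() if isinstance(v, str))
    return False

print(linked_to("taxpayer_family", "XYZ Ltd"))  # True
```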

In TAXMAN 2 McCarty proposed a more elaborate semantic net based on a

"prototype-plus-deformation" model. Essentially this sets one frame

as being the default for each class of object, with incremental

modifications to slots based upon fuzzy categories. Unfortunately

the concept was never implemented. As a result, McCarty offers no

solution to algorithmic issues such as how to choose, index, and

search the space of prototypes, and their relationships to actual

cases. [109]

3.2 _The Probabilistic Model_

Some argue that legal concepts cannot be adequately represented by

definitions that state necessary and sufficient conditions. Instead

legal concepts are incurably open-textured. [110] Typically an expert

system associates some kind of "certainty factor" with every rule in

its knowledge base, obtained from probabilities, or fuzzy logic, or

some combination of the two. [111] Firstly, probabilities are used

alongside facts and rules, as a "slot" in the knowledge base. To

each fact and rule is attached a certainty factor between zero and

one. Concepts are mechanically linked, but the final output includes

a composite probability. For example:

A is true (0.8 chance)

If A is true then B is true (0.75 chance)

Therefore B is true (0.6 chance = 0.8 x 0.75)

Secondly, fuzzy logic is called into play when classification of

facts involves weighting particular features. In the former the rule

"fires", but a certainty level is attached to each fact

and rule. In the latter, the rule will only fire at defined

threshold certainties. If the weighted average of a set of

characteristics adds up to a threshold amount, the facts are classified

accordingly.
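The two mechanisms described above can be sketched together; all figures, weights, and feature names below are invented for illustration:

```python
# Sketch of the two mechanisms described above; all figures invented.

# (1) Certainty factors: the rule always "fires", and certainties are
#     multiplied along the chain, as in the worked example above.
cf_a = 0.8            # A is true (0.8 chance)
cf_rule = 0.75        # if A is true then B is true (0.75 chance)
cf_b = cf_a * cf_rule
print(round(cf_b, 2))  # 0.6

# (2) Fuzzy threshold: the rule fires only if the weighted average of
#     a set of features reaches a defined threshold certainty.
def classify(features, weights, threshold=0.5):
    score = sum(features[f] * w for f, w in weights.items()) / sum(weights.values())
    return score >= threshold

weights = {"enclosed_cabin": 2.0, "motorised": 1.0, "wheels": 1.0}
print(classify({"enclosed_cabin": 1, "motorised": 1, "wheels": 0}, weights))  # True
```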

3.3 _Case Based Reasoning_

Using precedents by induction and analogy is seen as advantageous in

overcoming apparently intractable problems of classification. [112]

However, both analogy and induction are inherently non-computational.

[113] Case based reasoning is one attempt to imitate these

techniques, allegedly based on an exemplar model. In the exemplar

model, the user is presented with prototypical instances, or "mental

images", on which to base her classification. This approach differs

significantly from the two previous models in that it is primarily

designed to leave the task of classification to the user. Most case

based legal expert systems instead use a database of examples linked

to the decision given in particular cases. When presented with a new

case for decision the system will attempt to match the case under

consideration with the examples, either stereotypical [114] or

actual, in its database to extract those which appear to be most

similar. On that basis it will attempt to predict the outcome of the

new case. For instance, Popple's SHYSTER [115] applies rules until

the meaning of some open texture concept is required. At this point

a case based reasoning mechanism attempts to resolve this

uncertainty. [116]

A matching algorithm is used to measure the similarity of cases in

terms of "case features". Each case is modelled as a frame with

significant features contained in "slots". [117] These features are

then weighted by some statistical method. [118] The object of

constructing these "similarity metrics" [119] is to retrieve the most

"on-point" cases. [120] Using weighted characteristics is described

as a "dimensional" [121] approach, such as Betzer's "3-D" system

which uses relative weights in a "procedure sweep" to fill in "gaps"

in the knowledge base. [122]

In addition to the facts, the cases themselves are often weighted in

a manner "meaningful to lawyers", usually to reflect some sense of

stare decisis. For instance, the weights might be determined by the

level of the tribunal. [123]
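A similarity metric of this kind might be sketched as follows. The features, feature weights, and court weightings here are invented assumptions, not drawn from any cited system:

```python
# Hypothetical similarity metric: cases as frames of features,
# weighted both by assumed feature importance and by a crude stare
# decisis weight for the level of the deciding tribunal.
FEATURE_WEIGHTS = {"written_contract": 3.0, "reliance": 2.0, "payment": 1.0}
COURT_WEIGHTS = {"high_court": 1.0, "appeal": 0.8, "trial": 0.5}

def similarity(new_case, precedent):
    """Weighted share of matching features, scaled by tribunal level."""
    matches = sum(w for f, w in FEATURE_WEIGHTS.items()
                  if new_case.get(f) == precedent["facts"].get(f))
    return (matches / sum(FEATURE_WEIGHTS.values())) * COURT_WEIGHTS[precedent["court"]]

precedents = [
    {"name": "Case A", "court": "high_court",
     "facts": {"written_contract": True, "reliance": True, "payment": False}},
    {"name": "Case B", "court": "trial",
     "facts": {"written_contract": True, "reliance": True, "payment": True}},
]
new = {"written_contract": True, "reliance": True, "payment": True}
best = max(precedents, key=lambda p: similarity(new, p))
print(best["name"])  # Case A
```

Note that which precedent is retrieved as most "on-point" is determined entirely by the choice of weights: here the tribunal weighting outvotes an exact factual match.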

Although "case based" reasoning allegedly possesses an advantage over

rule based systems by the elimination of complex semantic networks,

[124] it suffers from intractable theoretical obstacles. Apart from

the question of choice of a matching algorithm, without some further

theory it cannot be predicted what features of a case will turn out

to be relevant. Too often, "legally significant parameters" [125]

are facts deemed important by the programmers, [126] with no

grounding in any articulated theory, even though the utility of such

systems depends critically on the set of attributes selected. [127]

Both selection of attributes and the choice of associated weights are

therefore highly arbitrary. [128]

On this analysis, case based reasoning constitutes an extension of

the probabilistic model rather than a true exemplar model, in which

the task of classification is left to the user. The potential of

this latter model for building applications is examined in Section 5.

4: PHILOSOPHICAL IMPLICATIONS OF THE DEDUCTIVE INFERENCE ENGINE

The previous Sections have demonstrated that the task of knowledge

representation was to provide a symbolic representation of knowledge

in a form amenable to the deductive inference engine. The primary

reference point was logic programming, that is, formalisation of the

law into a set of Horn clauses. It was also argued that techniques

such as "fuzzy logic", "semantic networks", and "case based reasoning"

are no more than elaborations of logic programming, [129] and not, as

some would argue, "second generation" systems going beyond deductive

inference. [130]

This Section carries this analysis beyond the practical and into the

philosophical. It has been thought inescapable that a legal expert

system which attempts to emulate the reasoning processes of a lawyer

must embody theories of law that must in turn rest on more basic

philosophical assumptions. [131] Building a legal expert system is

thus described as not being just an exercise in computer programming,

but requires "solid and articulated" jurisprudential foundations.

[132] Researchers in this field appear, however, to have discounted

or ignored the value of close analysis of the field's theoretical

assumptions. This Section demonstrates how, to avoid theoretical

obstacles, the nature of law, its epistemological basis, and the task

of jurisprudence have all been subjected to unacceptable distortion.

4.1 _Isomorphic Representation or Distortion?_

The activity of legal knowledge representation is said to involve the

operation of interpretative processes whereby the formal sources of

part of a legal system are scrutinised and analysed, so as to be both

faithful in meaning to the original source materials, and in a form

which is computer encodeable. This principle is termed

"isomorphism".

Doubts have been raised as to whether it is possible to meet both

objectives. It is said that the knowledge engineer must desist from

imposing her own interpretations, lest she be universally condemned

for misrepresenting the law. [133] Yet it is difficult to reconcile

Susskind's claim of isomorphism with his admission that the process

is in fact one of complete "reformulation" or "rational

reconstruction". [134]

Although isomorphism requires the formalisation of rules to be

sufficiently expressive to capture their original meaning, [135]

Levesque and Brachman have demonstrated that there is a significant

trade off between the expressiveness of a system of knowledge

representation and its computational tractability. [136] Susskind

admits that it is not possible, without "extensive modification and

inconvenience" to accommodate legal knowledge within the restrictive

frameworks offered by currently available computer programming

environments. [137]

Moles provides a typical example of the modelling of British coal

insurance claims. [138] This involved taking statutes and cases and

then "translating" them into six different structures using three

separate applications into the target representation language. After

being "translated, cut up into bits, precised, further analysed into

[frames], which are then stored in another structure", he suggests it

would be a "miracle" if they were "isomorphic" to the original texts.

[139] It would appear that terms such as "isomorphism" may be no

more than "syntactic sugar" [140] used to "sweeten" the acceptability

of what must be a distorting process.

4.2 _Law as a System of Easily Interpreted Rules_

Statutory interpretation has been predominantly characterised as

involving a literal interpretation, [141] particularly in tax law,

[142] allegedly idiosyncratic in being construed both literally and

strictly. [143] In case law, legal sources cannot be as easily

"formalised" or "normalised", [144] but must to some degree be

interpreted. However, Susskind takes a dangerous step in suggesting

that the task of the knowledge engineer is to "sift the authoritative

ratio decidendi from the text", eliminating obiter dicta and other

"extraneous" [145] material. Moreover, he argues that this can be

easily extracted, not by a thorough examination of the case but by

reading the headnote alone. [146] Representations of cases in

knowledge bases are typically compressed into a "headnote" style,

including citation, court, date, facts, and holdings. [147] Cost may

also be a factor behind this characterisation of law. Susskind

suggests that knowledge engineers need only avail themselves of the

services of the legal expert to "tune" the knowledge base. [148]

This view, however, that the law is a formal rule-governed process,

ignores a great deal of learning stretching back over a century -

and more recently in the form of critical legal studies - arguing

that the law is far from determinate. [149] The law is at least an

"elastic" phenomenon in which students have traditionally been taught

and encouraged to "flip" legal argument. [150] The conception of

legal decision-making as a formal rule-governed process has been

eroded by a judicial move towards "realist" scepticism of rigid rule

structures. For example, members of the Australian High Court have

indicated a rejection of formalism and adoption of a more active

assessment of legal principles with respect to justice, fairness, and

practical efficacy. [151]

Advocates of case based reasoning attempt to accommodate realist

criticism by suggesting that fact patterns can explain legal

decisions independently of any "surface discourse" of law. [152] The

critical assumption is that judges decide even hard cases in a rule

based manner. Levi [153] supports this view in arguing that legal

reasoning, while not being purely a system of applying the law

mechanically to facts, does embody rules obtained by analysing the

similarities and differences in decided cases. Such researchers

argue analogously to theorists, such as Goodhart, who suggest

examination ought to be focussed on the facts treated as material and

immaterial by the court. [154]

Stone, however, argues that there is a critical distinction between

the ratio which explains the decision and the one which binds future

courts. More often than not, the critical facts are those treated as

material by the later court, and even if they are identifiable, they

can be stated at multiple levels of generality. [155]

Other legal systems, particularly in some parts of Europe, may be

more suited to this characterisation of law. For example, in the

Scandinavian legal system, one overall guiding principle is the

prohibition of decisions which are non liquet [156], which is

considered a serious fault. [157]

4.3 _Change_

A severe impediment to the routine use of knowledge based technology

for practical legal applications lies in the unresolved problems

associated with the "maintenance" of such systems, that is, how to

continually update the system with primary sources. [158] One group

describes how, after they had exhaustively studied over 1,000 cases

under the Canadian Unemployment Insurance Act, the Act was amended in

1990, rendering their work irrelevant. [159] Most approaches are

inadequate, either

for expressly assuming a constant state of the law, or avoiding

primary sources and instead modelling directly the heuristics of the

expert. [160]

The logic programming approach of the Imperial College group, whereby

the expert system is formalised to correspond to individual sections

of a statute, is argued to be easily modifiable. However, a fully

functioning expert system requires a layer of pertinent heuristic

knowledge to avoid a "layman's reading" of an Act. [161] Once the

formalisation is structured, explained and augmented in this way,

modifying the system is no longer straightforward. Schlobohm

suggests that, as a result, human experts would have to modify the

heuristic rules whenever the law changes, and the entire system

containing the new rules would then have to be debugged. [162]

Similarly, use of modular approaches such as the Chomexpert system

has proved inadequate. [163] It is difficult to encode statutory

rules at even the most basic level without making inappropriate

commitments as to how they will be interpreted in future. [164]

4.4 _Epistemology of Law_

Susskind suggests that law is not an abstract system of concepts and

entities distinct from the "marks on paper" that are the material

symbols of it. [165] The difference between legal expert systems and

scientific systems such as PROSPECTOR and MYCIN lies, according to

Susskind, in that scientific laws are to be "discovered" in the

empirical world in general, while legal rules can be extracted, as an

acontextual linguistic exercise, from scrutiny of formal legal

sources. Under this analysis, knowledge engineers need go no further

than the written text, hence Susskind argues that the "bottleneck" of

knowledge acquisition is effectively dissolved. This is a dangerously

narrow epistemology to adopt, [166] since researchers in this area do

not sufficiently distinguish between the writing and the meaning of

the writing. [167] In Section 5 it is argued that meaning can only

be found in a social context.

4.5 _The Nature of Jurisprudence_

Although the foregoing suggests that many theories in jurisprudence

conflict significantly with important assumptions of expert systems

technology, many of these fundamental theoretical difficulties have

been downplayed or eliminated. When faced with theories which imply,

for instance, that there is no future for expert systems, some

researchers have expressly rejected the usefulness of jurisprudence.

[168] Critical legal theory is therefore characterised as

"unacceptable". [169] Even if jurisprudence is wholly ignored by

knowledge engineers, they suggest that the only risk is that the

systems they design might be of some "inferior quality". [170]

Others "rationally reconstruct" jurisprudence into an acceptable

form. For instance, Susskind asserts that the activity of any "legal

science" is to impose order over unstructured and complex law by

recasting it into a body of structured, interconnected, coherent, and

simple rules. [171] Smith and Deedman go further and argue that the

task is to transform apparent indeterminacy into a completely

rule-governed structure. [172]

The same can be said for the portrayal of the epistemology of

jurisprudence. Just as "complex" law is recast into "simple" rules,

Susskind's task was to take "confused and perplexed"

jurisprudence, and obtain "consensus" over relevant issues. What

Susskind does to find "consensus" in legal theory is allegedly to

statistically sample the literature. [173] However, the "sample" was

limited firstly to works of analytical jurisprudence, and secondly to

British writings from the mid-1950's. [174] Perhaps unsurprisingly,

the influence of H.L.A. Hart's concept of law as a system of rules

was overwhelming. As an adjunct, Susskind further asserted that to

be "jurisprudentially impartial", that is, to embody no "contentious"

theory of law, an expert system must reason only with rules. Any

facility for reasoning with non-rule standards [175] was rejected out

of hand. A significant internal inconsistency emerges when it is

appreciated that Susskind believed that it was sufficient

justification for use of rules that this "consensus" identifies legal

rules as necessary but insufficient for legal reasoning. [176]

Perhaps the clue to why these works were chosen lies in the fact that

they constituted "the source materials with greatest potential given

the overall purpose of the project". Susskind notes that his work

was intended to "eliminate much of the future need for extensive

scrutiny of non-computationally oriented contemporary legal theory".

[177] Here the inference engine is most clearly "driving"

jurisprudence.

4.6 _Jurisprudence Turned on its Head_

Niblett claimed that "a successful expert system is likely to

contribute more to jurisprudence than the other way around". [178]

If the suggestions of researchers such as Susskind are taken

seriously, they turn jurisprudence on its head. Theory is not used

as a basis for practice, but instead implementability in technology

is used as the touchstone for accepting the truth or falsity of the

theory. A particular feature of artificial intelligence literature

is that its rigour lies not in experimental corroboration, or any

theoretical soundness, but implementability. [179] Hofstadter [180]

suggests that so long as the artificial intelligence researcher takes

care to

construct theories which can be written down as a sequence of

algorithmic or computational steps, these theories can be

implemented, thereby "confirming" the theories underlying the

process. Implementability per se leads to a self-perpetuating

methodology: since an artificial intelligence researcher will use

concepts of computational theory to construct theories, those

theories are necessarily implementable.

Legal expert systems researchers fall into this model by rejecting

"unacceptable" legal theories, and reformulating the remainder in

computational terms, to eliminate potential obstacles to the

prosperity of their research programme. This abandonment of serious

inquiry into jurisprudence by researchers into legal expert systems

may give credence to Kowalski's fears that the field may have cut

itself off as a specialist discipline and established its parameters

prematurely. [181] Brown notes that at a 1991 conference, few if any

papers questioned the basic assumptions of the field. [182]

Niblett claimed that "a successful expert system is likely to

contribute more to jurisprudence than the other way around". [183]

The foregoing demonstrates that these words ring true. Law and

jurisprudence, to form an acceptable basis for expert systems

research, have been reformulated in computational terms, to eliminate

philosophical "technicalities". [184]

4.7 _Failure to Recognise Limitations_

Leith has argued for a rejection of legal expert systems on the basis

that they simplify the law to such an unacceptable extent that they

have little or no value in legal analysis. [185] Yet while some

engineers of legal expert systems may be fully aware of the

limitations already discussed, it is not necessarily the case that

other researchers, and more importantly, the users of these programs

will also be so mindful. This article agrees with Leith's implied

suggestion that many accounts of work in this area refuse to

acknowledge that there are significant limitations. For example,

McCarty felt able to say:

"[Law] seems to be an ideal candidate for an artificial intelligence

approach: the "facts" would be represented in a lower level semantic

network, perhaps; the "law" would be represented in a higher level

semantic description; and the process of legal analysis would be

represented in a pattern-matching routine." [186]

Susskind has, however, admitted that expert systems might not be

amenable to corporate, commercial and tax law, but would be apposite

only to limited instances such as, for example, the Scottish law

relating to liability for damage caused by animals. [187] Such

limitations, often given little attention, should be made clear, and

"plausibility tricks" avoided. [188] There is a very real danger

that users will significantly overestimate the value of the analysis

they obtain from such a program, particularly in light of the wealth

of optimistic literature and when it is described as "expert".

5: LEGAL REASONING AS AN INTRACTABLY COMPLEX SYSTEM

The previous Section demonstrated how law and jurisprudence have been

unacceptably distorted to be amenable to expert systems technology.

Moles suggested that researchers have deliberately ignored

fundamental problems since they were committed to the use of a

"particular computing tool", and not to the understanding of law.

[189] This article identifies this tool as the inference engine

itself. The following section addresses the non-computational nature

of legal inference.

5.1 _Search for a Deep Model_

There has been a growing trend in legal expert systems to speak of

"deep knowledge" or "conceptual knowledge" as something distinct and

preferable to "shallow" knowledge. [190] McCarty calls for the

development in law of "deep" systems akin to CASNET [191] in which

the disease is represented as a dynamic process. [192] The depth of

a system has been described as the extent to which programmes contain

not only rules for mapping conclusions onto input scenarios, but also

a representation of the underlying causes. [193]

The Imperial College Group suggest that deep structure in legislation

is a formalisation isomorphic to that legislation, on the basis that each Horn

clause represents some clause in the legislation. [194] In addition,

case based reasoning has been described as employing a "deep

structure". [195] The advantage stemming from both of these

descriptions is that they cast deep structure into computational

terms. [196] This is another example of technology driving the

underlying theory.

On the other hand, McCarty argues that resolution of the difficulties

of open texture is related to a sense of "conceptual coherence".

[197] In addition, while theoretical approaches are emerging

to cope with problems of legal change, [198] a unifying theme is a

striving for an undefined "normative enrichment". [199] This Section

argues that deep structure is to be found in social context and

purpose, which are non-computational.

5.2 _Interpretation in a Social Context of Shared Understanding_

Law is not, as legal expert systems would portray it, self-contained

and autonomous, [200] but in fact is embedded in social and political

context. That legal concepts draw upon ordinary human experience is

precisely what makes them so difficult for an artificial intelligence

system. [201] Whenever human behaviour is analysed in terms of

rules, it must always contain a ceteris paribus condition; in

essence referring to the background of shared social practices,

interests and feelings. Even if we accept Susskind's

characterisation of law's ontology as going no further than the

"marks on paper", semantic problems will still arise since these

marks are not created in a vacuum, but are the result of purposive

social interaction, and must be so interpreted. [202]

Using one recognised example, the injunction:

DOGS MUST BE CARRIED ON THE ESCALATOR

can only be interpreted based on the understandings, for instance,

that a dog's small feet may become trapped in slots and moving parts;

that humans generally feel some concern for dogs, and therefore do

not wish to see them "mangled". Thus an adequate interpretation of

any rule requires that we locate it in a complex body of assumptions.

[203]

Minsky noted that intelligent behaviour presupposes a background of

cultural practices and institutions which must be modelled if

computer representations are to have any meaning. [204]

Wittgenstein's arguments that the meaning of language must be based

in social use and a community of users are worth rereading in the

light of Searle's Chinese room analogy. [205] How can the computer

have this sort of direct access to language? [206] Kowalski and

Sergot admit that a computer must operate by "blind" and "mechanical"

application of its internal rules. [207]

5.3 _Open Texture as an Intractable Problem_

If legal reasoning were really some "pointing" [208] or "cataloguing"

[209] procedure, Hart's suggestion that the task of legal

institutions is to approach greater refinement in definition by

adjudicating [210] on particular cases may be attractive. [211]

Open texture may then be marginalised by a progressive refinement of

categories; in computational terms, weaving a more elaborate semantic

net. To model social context in a knowledge base, however, may be an

impossible task.

Popper demonstrates that context entirely depends on point of view.

[212] Harris suggests that any view of the law must be a

phenomenological one which takes account of shifting foci of

interest. [213] The difficulty is that a great deal of social

context will not be "conscious" and expressible, but will constitute

a hidden set of assumptions on which human decisions will be based.

It is impossible to focus attention onto elements of that context

without creating a new subconscious context. Polanyi describes this

as the difference between focal and subsidiary awareness. [214]

As Berry [215] demonstrates, if people learn to perform tasks so that

important aspects of their knowledge are implicit in nature, then

knowledge engineers will be unable to extract this knowledge and

represent it in a meaningful way in an expert system. [216] Husserl,

for instance, discovered that construction of even simple "frames"

involved coping with an ever expanding "outer horizon" of knowledge.

He sadly concluded at the age of 75 that he was a "perpetual

beginner" engaged in an "infinite task". [217] This is a

fundamental difficulty with artificial intelligence in all its

applications. [218]

5.4 _Purposive Interpretation and Intention_

Hempel [219] argued that ad hoc modifications to a theory were

limited by the increased complexity of the theory and that, after a

certain threshold level of complexity was exceeded, scientists would

naturally and logically pursue simpler alternative theories. [220]

Here we may learn from science. Certain physical and chemical

systems have been discovered that display uncanny qualities of

co-operation, or organise themselves spontaneously and unpredictably

into complex forms. These systems are still subject to physical

laws, but laws that permit a more flexible and innovative type of

behaviour than the old mechanistic view of nature ever suggested.

The lesson from chaos theory is that seemingly complex systems can be

defined in terms of simple but not mathematically tractable models.

[221]

Legal reasoning is not mechanical. [222] Social context and shared

understandings can be dealt with in terms of the simple, elegant, but

non-computational model of purposive interpretation. Searle's

Chinese room analogy identifies intentionality as the benchmark of

the mental, and refutes claims that intentional mental predicates,

such as meaning, understanding, planning, and inferring, can be

attributed to a mathematical computational system. [223]

Susskind prefers to avoid purposive theories, [224] since such

theories imply that law is not simply a question of linguistic

pattern matching but instead involves examination of social practices

and human intentionality. [225] Similarly, case based reasoning is

seen as a way around having to tackle "full blown" statutory

interpretation involving legislative intent. [226]

Law is a practical enterprise, concerned to guide, influence or

control the actions of citizens. Since any action is purposive, any

philosophy of action must be a philosophy of purposes. [227] When a

court applies, say, the statutory term of our example, "vehicle", to

a particular contraption, the meaning of "vehicle" is found in an

analysis not only of the purpose of the law, but of the purpose for

which the vehicle was to be used. [228] For example, Fuller

responded to Hart in these terms:

"What would Professor Hart say if some local patriots wanted to mount

on a pedestal in the park a truck used in World War 2, while other

citizens, regarding the proposed memorial as an eye-sore, support

their stand by the "no vehicle" rule? Does this truck, in perfect

working order, fall within the core or the penumbra?" [229]

One could make similar arguments when a "NO DOGS ALLOWED" sign

confronts a seeing eye dog, or one that is stuffed or anaesthetised.

[230] It is difficult to reconcile Hart's acontextual approach to

legal interpretation with his own view of actors within the legal

system holding an internal normative view of the rules. [231]

Following a rule equals "obeying the law" only where a purposive

personal commitment has been made to a rule structure. [232]

Applying modern literary and linguistic theory to the law, [233] some

suggest that no text has meaning without the active participation of

the reader, [234] and an "interpretive community" of which the reader

is a part. [235] The use of figurative language, imagery and

metaphor is integral to legal discourse. [236] Ideological symbolism

is inescapable. [237] What counts as the relevant facts depends

entirely on context, [238] and cannot be determined by programmers ex

ante. [239] Language is the very condition of intention, and

intention is the vehicle of meaning. [240]

5.5 _Humanistic Implications_

The foregoing suggests that the law cannot be amenable to a legal

expert system, as this involves denying social context, purpose, and,

essentially, humanity. A humanistic critique would argue that if

expert systems have any degree of success in modelling

"the law", the result would be "profoundly humiliating". [241]

Weizenbaum stated that if artificial intelligence fulfils its

promises then this implies that man is merely a machine. [242] In

similar vein, the success of legal expert systems might imply that

the law itself is a machine, and that lawyers, perhaps even judges,

can be replaced by computers.

6: THE WAY FORWARD

The preceding sections have demonstrated that the use of a deductive

inference mechanism, and the consequent need for the knowledge

represented to be amenable to such an engine, will lead to

unacceptable distortion of the law, its philosophical

underpinnings, and its humanity. How are lawyers then effectively to

utilise the information technology resource? This article adopts the

basic message in Tapper's insightful 1963 piece. [243] The range of

activities for which computers ought to be used must be limited to

those which can be reproduced by machines. Tapper

tentatively describes the distinction as one between "mechanical" and

"creative" tasks. [244] If the argument of this article is

accepted, the way forward involves relocation of the inference engine

from the computer to the human user. This section explores

possibilities for "decision support systems", which present material

to the user on which she alone performs the specifically legal

reasoning. [245]

6.1 _Decision Support Systems_

Recall that in the exemplar model, the user is presented with

prototypical instances, or "mental images". This approach differs

from the other models of classification in that it is primarily

designed to leave the task of classification to the user of the

system. The reason why the user, rather than the machine, ought to

perform the legal inference is that legal reasoning is

non-computational, as Section 5 has demonstrated.

Despite growing recognition that research perhaps ought to be

oriented towards "decision support systems", such systems have been

designed to first reason with the legal data and then present such

reasoning to the user to support her conclusion. [246] This approach

is hazardous since it may predetermine the human conclusion to a

large degree. [247] To dispute the computer inference the user would

require knowledge of the area of law to a degree where the computer

would not have needed to be consulted in the first instance. [248]

Decision support systems, then, differ significantly from expert

systems in that the heart of the problem - the inference engine - is

relocated to the user of the system. Computers should then be

utilised for mechanical and time-consuming tasks for which they are

best suited. In particular, this Section suggests three significant

uses:

- structured legal information retrieval;

- calculation based on strategies; and

- litigation support and "legal econometric" systems.

6.2 _Legal Information Retrieval_

Firstly, searching for primary and secondary legal sources is a

costly and, to a significant degree, mechanical exercise. Efficient

retrieval of legal information is vitally important. Tapper

suggested in 1963 that the lack of resources available to those

operating outside provincial centres, and the concentration of

materials within large organisations, were productive of injustice

in favour of powerful

sections of the community. [249] Modern statute and case databases

have gone some way to addressing this problem.

Generally, systems such as LEXIS, SCALE, and INFO1 use Boolean

keyword search routines, [250] but these have obvious disadvantages.

[251] Some limited advances have been made with, for instance,

"Hypertext" cross-referencing, [252] and "probabilistic" elaboration

of keyword searches. [253] It has long been assumed that retrieval

based on the meaning and content of documents, indexed in terms

of legal concepts, [254] would be far more appropriate. [255] A

variety of techniques have emerged for indexing, including use of

discrimination trees, [256] and explanation based generalisation.

[257] Research is progressing towards a "hybrid" approach of linking

case databases with statutory material and legal texts. [258] Hafner

[259] has constructed a database on United States negotiable

instruments law designed to retrieve cases based on typical problems

which arise in legal disputes.
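The Boolean keyword searching these systems rely on can be sketched in a few lines. The documents and query below are invented for illustration, and real services such as LEXIS add indexing, stemming, and proximity operators:

```python
# Minimal Boolean keyword retrieval over a small, invented collection.

def tokens(text):
    """Crude keyword extraction: the set of lower-cased words."""
    return set(text.lower().split())

def boolean_search(documents, must_have, must_not_have=()):
    """Return names of documents containing every required keyword
    (AND) and none of the excluded keywords (NOT)."""
    hits = []
    for name, text in documents.items():
        words = tokens(text)
        if all(k in words for k in must_have) and \
                not any(k in words for k in must_not_have):
            hits.append(name)
    return hits

cases = {
    "Case A": "negligence duty of care latent damage",
    "Case B": "contract breach damages",
    "Case C": "negligence latent defect limitation period",
}

print(boolean_search(cases, must_have=["negligence", "latent"]))
# → ['Case A', 'Case C']
```

The obvious disadvantage noted above is visible at once: a case that speaks of "carelessness" rather than "negligence" is simply never retrieved.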

McCarty has suggested that it would be more fruitful to look at legal

argument than to develop a theory of correct legal decisions. [260]

Similarly Bench-Capon and Sergot suggest that open texture should be

handled by giving the user arguments for and against in borderline

cases. If so, a computer system will be concerned, not with the

production of a conclusion, but rather with presenting the arguments

on which the user may base her own conclusions. [261]

On this basis, Ashley and Rissland have designed a system called

HYPO, [262] which emerged from Rissland's earlier work on reasoning

by examples. [263] It does not use an inference engine for legal

analysis but instead aims for conceptual retrieval based on structure

of legal argument. [264] Its inference engine is used only for

statistical processes that decide which primary materials to

retrieve. The cases relevant to the issues identified by the user

are retrieved and arranged in terms of argument for and against a

decision in a new case. [265] The system is further supplemented by

a set of "hypothetical" cases. [266] The actual legal inference on

the basis of the material retrieved is left to the user of the

system, which distinguishes HYPO and similar text retrieval systems

from many of the case-based reasoning systems discussed earlier.
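HYPO's arrangement of precedents into arguments for and against can be caricatured as follows. The factors, cases, and scoring here are invented, and HYPO's own dimension-based analysis is considerably richer:

```python
# Invented sketch of for-and-against case arrangement, loosely in
# the spirit of HYPO; the legal inference itself is left to the user.

precedents = [
    {"name": "P1", "factors": {"secret-disclosed", "agreement-signed"},
     "won_by": "plaintiff"},
    {"name": "P2", "factors": {"secret-disclosed"}, "won_by": "defendant"},
    {"name": "P3", "factors": {"agreement-signed", "bribery"},
     "won_by": "plaintiff"},
]

def arrange(new_case_factors):
    """Retrieve precedents sharing a factor with the new case and
    split them into arguments for and against the plaintiff,
    most-on-point (most shared factors) first."""
    relevant = [p for p in precedents if p["factors"] & new_case_factors]
    relevant.sort(key=lambda p: len(p["factors"] & new_case_factors),
                  reverse=True)
    pro = [p["name"] for p in relevant if p["won_by"] == "plaintiff"]
    con = [p["name"] for p in relevant if p["won_by"] == "defendant"]
    return pro, con

pro, con = arrange({"secret-disclosed", "agreement-signed"})
print("For plaintiff:", pro)   # → ['P1', 'P3']
print("Against:", con)         # → ['P2']
```

Deciding which analogy should control, the specifically legal reasoning, remains with the user.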

6.3 _Calculations and Planning_

A second application would be to utilise the mathematical functions

of computer systems. The computer does perform inference, but it

essentially calculates outcomes based on strategies already

formulated by an expert who has himself interpreted the legal

materials. Michaelson's TAXADVISOR [267] is one example. It

calculates tax planning strategies for large estates, based on

strategies obtained from lawyers experienced in tax advice. There is

little more legal inference in this than in calculating a share

portfolio to maximise return based on a broker's personal model.

Similarly, systems have been suggested which will assist in financial

planning, for instance by forecasting retirement pensions. [268]
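The division of labour in such systems, where the expert interprets the legal materials and the machine merely calculates, can be sketched as follows. The threshold, the gift limit, and the gifting rule itself are invented and are not TAXADVISOR's actual knowledge:

```python
# Hypothetical strategy-based calculation in the TAXADVISOR mould.
# The "strategy" (gift assets above an exemption threshold) stands in
# for rules an expert would have formulated; all figures are invented.

EXEMPT_THRESHOLD = 600_000   # invented estate-tax exemption
ANNUAL_GIFT_LIMIT = 10_000   # invented tax-free gift per year

def gifting_plan(estate_value, years):
    """Calculate how much taxable excess the expert's gifting
    strategy removes over the given number of years."""
    excess = max(0, estate_value - EXEMPT_THRESHOLD)
    gifted = min(excess, ANNUAL_GIFT_LIMIT * years)
    return {"taxable_before": excess,
            "gifted": gifted,
            "taxable_after": excess - gifted}

print(gifting_plan(650_000, years=3))
# → {'taxable_before': 50000, 'gifted': 30000, 'taxable_after': 20000}
```

No legal inference occurs at run time; the interpretation of the legislation was exhausted when the expert fixed the threshold and the limit.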

6.4 _Litigation Support and Jurimetrics_

Finally, a decision support system may more clearly focus on

litigation strategies. These may be developed with the assistance of

expertise, or by techniques such as hypothesis and experiment. [269]

Such systems do in fact make inferences, but these are not inferences

of law, but inferences based on strategies already defined by

expertise or by what essentially amounts to empirical research. In that

sense, the inference procedures are extensions to the calculation and

planning examples.

One example is the LDS [270] system, implemented in ROSIE [271] by

Waterman and Peterson. It advises on whether to settle product

liability cases, and on an advisable amount, based on factors such as

abilities of the lawyers, characteristics of the parties, timing of

claim, type of loss suffered, and the probability of establishing

liability. The primary goal of LDS was not to model the law per se

but rather the actual decision making processes of lawyers and claims

adjusters in product liability litigation. Another example is SAL,

[272] intended to advise on an appropriate sum to settle asbestos

injury claims. In such systems, the computer is modelling non-legal

factors which may influence the outcome of a case, in order to assist

the lawyer in deciding her ultimate strategy. In Australia the

Government Insurance Office has developed COLOSSUS, a sophisticated

system to detect possible fraudulent personal injury claims, and tag

them for investigation by its officers. [273]
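An LDS- or SAL-style settlement adviser reduces, in caricature, to an expected-value rule of the kind expertise or empirical research might supply. The formula and every value below are invented, not those of either system:

```python
# Invented expected-value caricature of an LDS/SAL-style settlement
# adviser: the inference is arithmetic over expert-supplied factors,
# not inference about the law itself.

def exposure(damages, p_liability, timing_discount):
    """Estimated defendant exposure: probable damages discounted
    for the delay before judgment."""
    return damages * p_liability * timing_discount

def advise(offer, damages, p_liability, timing_discount=0.9):
    """Recommend that the defendant settle whenever paying the offer
    is no worse than the estimated exposure at trial."""
    return "settle" if offer <= exposure(damages, p_liability,
                                         timing_discount) else "litigate"

print(advise(offer=40_000, damages=100_000, p_liability=0.6))
# → settle  (paying 40,000 beats an expected trial loss of 54,000)
```

The factors listed in the text (abilities of the lawyers, characteristics of the parties, timing of claim, type of loss) would in practice enter as further expert-fixed multipliers of the same kind.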

Similar systems have also been suggested as aids not only in

litigation, but also in dispute resolution strategies. [274] The information

contained in such systems may as an adjunct constitute an important

resource for sociological study, such as Bain's modelling of judges'

subjective decisions on particular varieties of crime in

the United States. [275] In this case, the expert system constitutes

jurimetrics, a legal version of econometrics.

7: SUMMARY AND CONCLUSIONS

Computerisation of the legal office will continue, but the message

from this article is that researchers must be acutely aware of the

philosophical underpinnings of their work. In particular, the

usefulness of legal expert systems is severely questioned. Use of

such systems has involved an unacceptable level of distortion both of

the nature of law and of jurisprudence. This is not a case of

"carbon", [276] "biological", [277] or even "neural" [278]

chauvinism, but a demonstration that expert systems technology has

made a poor choice of domain in law. Blame has been laid for such

distortion on the core of the expert system: the pattern matching

inference engine. Legal inference, on the other hand, relies on

purpose and social context, implying that computational models of

sufficient richness are not tractable.

This article suggests that, given current limitations of computer

technology, the quest for an artificially intelligent legal adviser

is misguided. In the future, however, these limitations may be

overcome. For example, work being undertaken in parallel distributed

processing is producing significant results with respect to low level

"intelligent" processes, including perception, language, and motor

control. This is based on the assumption that intelligence emerges

from interactions of large numbers of simple processing units, and

represents a significant break away from increasingly complex

rule-based structures. [279] While this article cannot address such

future possibilities, it is suggested that the

basic pattern-matching/rule-governed principles will limit computers

for some time. It is therefore suggested that researchers instead

investigate decision support systems as a more useful alternative.

Relocation of the inference engine will mean that knowledge

representation will no longer need to be amenable to computational

inference, but to human inference. The computer's inference engine

should instead be used for searching procedures and

computation. Some possibilities have been noted.

Ardent advocates such as Tyree suggest that despite their

difficulties, legal expert systems are a cost-effective second-best

solution. The choice is portrayed not as one between human advice and

machine advice, but in an era of high costs of justice, between

machine advice and no advice at all. [280]

While economic factors are important, [281] humanistic factors must

not be forgotten. Law plays an important role in modern

civilisation. It must maintain a close relationship with the social

and political forces shaping society, and not merely regress into a

"technology", a tool to be used by competing social forces. [282]

---------------------------------------------------

ENDNOTES

* J Searle, "Minds, Brains and Programs" (1980) 3 Behavioural and

Brain Sciences 417.

1 For examples see NJ Bellord, Computers for Lawyers (Sinclair

Browne: London, 1983); and T Ruoff, The Solicitor and the Automated

Office (Sweet & Maxwell: London, 1984).

2 q.v. J Bing (ed.), Handbook of Legal Information Retrieval

(North-Holland: Amsterdam, 1984).

3 See Section VI, infra.

4 Inferring molecular structure from mass spectroscopy data; q.v.

RK Lindsay, BG Buchanan, & J Lederberg, Applications of Artificial

Intelligence for Chemical Inference: The DENDRAL Project

(McGraw-Hill: New York, 1980).

5 Advising on the location of ore deposits given geological data;

q.v. RO Duda & R Reboh, "AI and Decision Making: The PROSPECTOR

Experience" in W Reitman, Artificial Intelligence Applications for

Business (Ablex Publishing: Norwood, 1984).

6 Providing consultative advice on diagnosis and antibiotic therapy

for infectious diseases; q.v. BG Buchanan & EH Shortliffe,

Rule-Based Expert Systems: The MYCIN Experiments of the Stanford

Heuristic Programming Project (Addison-Wesley: Reading, 1984).

7 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.11.

8 Ibidem p.15.

9 L Loevinger, "Jurimetrics: The Next Step Forward" (1949) Minnesota

Law Review 33.

10 L Mehl, "Automation in the Legal World: From the Machine

Processing of Legal Information to the 'Law Machine'" in

Mechanisation of Thought Processes (HMSO: London, 1958) p.755.

11 RW Morrison, "Market Realities of Rule-Based Software for

Lawyers: Where the Rubber Meets the Road" (1989) Proceedings Second

International Conference on Artificial Intelligence and Law 33 at

p.35.

12 c.f. RA Clarke, Knowledge-Based Expert Systems (Working paper:

Department of Commerce, Australian National University, 1988) p.6.

13 Primarily RE Susskind, Expert Systems in Law: A Jurisprudential

Inquiry (Clarendon Press: Oxford, 1987).

14 Ibidem p.44; emphasis added.

15 MJ Sergot, "The Representation of Law in Computer Programs",

Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal

Applications (Academic Press: London, 1991) at p.4.

16 P Harmon & D King, Expert Systems: Artificial Intelligence in

Business (John Wiley & Sons: New York, 1985) at p.5.

17 R Forsyth, "The Anatomy of Expert Systems" Chapter Eight in M

Yazdani (ed.), Artificial Intelligence: Principles and Applications

(Chapman & Hall: London, 1986) pp.186-187.

18 R Forsyth, "The Anatomy of Expert Systems" Chapter Eight in M

Yazdani, Artificial Intelligence: Principles and Applications

(Chapman and Hall: London, 1986) p.194.

19 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.46.

20 Ibidem p.47.

21 F Hayes-Roth, DA Waterman & DB Lenat, Building Expert Systems

(Addison-Wesley: London, 1983) p.4.

22 e.g. The Latent Damage Adviser; q.v. PN Capper & RE Susskind,

Latent Damage Law - The Expert System (Butterworths: London, 1988).

23 J Vaux, "AI and Philosophy: Recreating Naive Epistemology"

Chapter Seven in KS Gill (ed.), Artificial Intelligence for Society

(John Wiley & Sons: London, 1986) p.76.

24 T Winograd & F Flores, Understanding Computers and Cognition: A

New Foundation for Design (Ablex: Norwood, 1986).

25 HL Dreyfus & SE Dreyfus, Mind over Machine (Basil Blackwell:

Oxford, 1986).

26 A Hart & DC Berry, "Expert Systems in Perspective" in DC Berry &

A Hart (eds) Expert Systems: Human Issues (MIT: Cambridge, 1990)

p.11.

27 e.g. J Weizenbaum, Computer Power and Human Reason: From Judgment

to Calculation (WH Freeman & Co: San Francisco, 1976).

28 R Hellawell, "A Computer Program for Legal Planning and Analysis:

Taxation of Stock Redemptions" (1980) 80 Columbia Law Review 1363.

See also NJ Bellord, "Tax Planning by Computer" in B Niblett (ed.),

Computer Science and Law (Cambridge University Press: New York, 1980)

p.173.

29 R Hellawell, "CHOOSE: A Computer Program for Legal Planning and

Analysis" (1981) 19 Columbia Journal of Transnational Law 339.

30 R Hellawell, "SEARCH: A Computer Program for Legal Problem

Solving" (1982) 15 Akron Law Review 635.

31 P Jackson, H Reichgelt & Fv Harmelen, Logic-Based Knowledge

Representation (MIT: Cambridge, 1989).

32 Implemented in QUINTUS PROLOG; q.v. NH Minsky & D Rozenshtein,

"System = Program + Users + Law" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 170.

33 Symbolic logic has had a profound influence in the artificial

intelligence field; for a description see I Copi, Symbolic Logic

(Macmillan: New York, 1973).

34 MJ Sergot, "A Brief Introduction to Logic Programming and Its

Applications in Law" Chapter Five in C Walter (ed.), Computer Power

and Legal Language (Quorum: London, 1988) at pp.25-27.

35 C Mellish, "Logic Programming and Expert Systems" Chapter

Nineteen in KS Gill (ed.), Artificial Intelligence for Society (John

Wiley & Sons: London, 1986) at p.211.

36 F Hayes-Roth, DA Waterman & DB Lenat, Building Expert Systems

(Addison-Wesley: London, 1983) at p.66.

37 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.208.

38 R Forsyth, "The Anatomy of Expert Systems" Chapter Eight in M

Yazdani, Artificial Intelligence: Principles and Applications

(Chapman and Hall: 1986) p.191.

39 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) pp.209-210.

40 RI Levine, DE Drang & B Edelson, Artificial Intelligence and

Expert Systems (McGraw-Hill: 1990) Chapter Six, particularly at

pp.62-65.

41 e.g. AW Koers & D Kracht, "A Goal Driven Knowledge Based System

for a Domain of Private International Law" (1991) Proceedings Third

International Conference on Artificial Intelligence and Law 81.

42 q.v. RA Kowalski, "The Treatment of Negation in Logic Programs

for Representing Legislation" (1989) Proceedings Second International

Conference on Artificial Intelligence and Law 11; P Asirelli, M De

Santis & M Martelli, "Integrity Constraints in Logic Databases"

(1985) 2 Journal of Logic Programming 221; K Eshghi & RA Kowalski,

"Abduction Compared with Negation by Failure" (1989) Proceedings of

the Sixth International Logic Programming Conference; and JW Lloyd,

EA Sonenberg and RW Topor, "Integrity Constraint Checking in

Stratified Databases" (1986) 4 Journal of Logic Programming 331.

43 TJM Bench-Capon, "Representing Counterfactual Conditionals"

(1989) Proceedings Artificial Intelligence and the Simulation of

Behaviour 51.

44 "Augmented Prolog Expert System"; q.v. MJ Sergot, "A Brief

Introduction to Logic Programming and Its Applications in Law"

Chapter Five in C Walter (ed.), Computer Power and Legal Language

(Quorum: London, 1988) at pp.34-35.

45 P Hammond & MJ Sergot, "A PROLOG Shell for Logic Based Expert

Systems" (1983) 3 Proceedings British Computer Society Expert Systems

Conference.

46 DA Schlobohm, "A PROLOG Program Which Analyses Income Tax Issues

under Section 318(a) of the Internal Revenue Code" in C Walter (ed.),

Computing Power and Legal Reasoning (West Publishing: St Paul, 1985)

p.765.

47 q.v. F Haft, RP Jones & T Wetter, "A Natural Language Based Legal

Expert System for Consultation and Tutoring - The LEX Project" (1987)

Proceedings First International Conference on Artificial Intelligence

and the Law 75.

48 C Walter, "Elements of Legal Language" Chapter Three in C Walter

(ed.), Computer Power and Legal Language (Quorum: London, 1988).

49 LA Zadeh, "Fuzzy Sets" (1965) 8 Information and Control 338.

50 E Rosch & C Mervis, "Family Resemblances: Studies in the Internal

Structure of Categories" (1975) 7 Cognitive Psychology 573.

51 M Novakowska, "Fuzzy Concepts: Their Structure and Problems of

Measurement" in MM Gupta, RK Ragade & RR Yager (eds), Advances in

Fuzzy Set Theory and Applications (North-Holland: Amsterdam, 1979) at

p.361.

52 TJM Bench-Capon & MJ Sergot, "Toward a Rule-Based Representation

of Open Texture in Law" Chapter Six in C Walter (ed.), Computer Power

and Legal Language (Quorum: London, 1988) at p.49.

53 D Berman & C Walter (ed.), "Toward a Model of Legal

Argumentation" Chapter Four in C Walter (ed.), Computer Power and

Legal Language (Quorum: London, 1988) at p.22.

54 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.225.

55 CE Alchourrón & AA Martino, "A Sketch of Logic Without Truth"

(1989) Proceedings Second International Conference on Artificial

Intelligence and Law 165 at p.166.

56 HLA Hart, "Problems of the Philosophy of the Law" in HLA Hart,

Essays in Jurisprudence and Philosophy (Clarendon Press: Oxford,

1983) p.100; and H Kelsen, "Law and Logic" in H Kelsen, Essays in

Legal and Moral Philosophy (Reidel: Dordrecht, 1973) at p.229.

57 LT McCarty, "Permissions and Obligations - an Informal

Introduction" (1983) Proceedings International Joint Conference on

Artificial Intelligence-83; LT McCarty, "Permissions and Obligations

- An Informal Introduction" in AA Martino & NF Socci (eds) Automated

Analysis of Legal Texts (North-Holland: Amsterdam, 1986). For a more

developed system on the same principles, see H-N Castañeda, "The

Basic Logic for the Interpretation of Legal Texts" in C Walter (ed.),

Computer Power and Legal Language (Quorum: London, 1988) at p.167.

58 GHv Wright, "Deontic Logic" (1951) 60 Mind 1.

59 McCarty suggests that it is helpful to think of the set as an

"oracle" to be consulted when contemplating a course of action; see

LT McCarty, "Permissions and Obligations - an Informal Introduction"

in AA Martino & NF Socci (eds) Automated Analysis of Legal Texts

(North-Holland: Amsterdam, 1986).

60 Note LT McCarty, "Permissions and Obligations - An Informal

Introduction" in AA Martino & NF Socci (eds) Automated Analysis of

Legal Texts (North-Holland: Amsterdam, 1986), Definitions 5-7.

61 LT McCarty, "Clausal Intuitionistic Logic I: Fixed-Point

Semantics" (1988) 5 Journal of Logic Programming 1; LT McCarty,

"Clausal Intuitionistic Logic II: Tableau Proof Procedures" (1988) 5

Journal of Logic Programming 93.

62 LT McCarty, "On the Role of Prototypes in Appellate Legal

Argument" (1991) Proceedings Third International Conference on

Artificial Intelligence and Law 185 at p.187.

63 AJI Jones, "On the Relationship Between Permission and

Obligation" (1987) Proceedings First International Conference on

Artificial Intelligence and Law 164 at pp.166-168.

64 LT McCarty & Rvd Meyden, "Indefinite Reasoning with Definite

Rules" (1991) Proceedings of the Twelfth International Joint

Conference on Artificial Intelligence.

65 Particularly those of consequential closure; AJI Jones & I Pörn,

"Ideality, Sub-Ideality and Deontic Logic" (1985) 2 Synthese 65.

66 R Stamper, "The LEGOL-1 Prototype System and Language" (1977) 20

The Computer Journal 102.

67 R Stamper, C Tagg, P Mason, S Cook & J Marks, "Developing the

LEGOL Semantic Grammar" in C Ciampi (ed.) Artificial Intelligence and

Legal Information Systems (North-Holland: Amsterdam, 1982) p.357.

68 R Stamper, "LEGOL: Modelling Legal Rules by Computer" in B

Niblett (ed.), Computer Science and Law (Cambridge University Press:

New York, 1980) p.45.

69 R Stamper, "A Non-Classical Logic for Law Based on the Structures

of Behaviour" in AA Martino & F Socci (eds), Automated Analysis of

Legal Texts (North-Holland: Amsterdam, 1986) p.57.

70 S Jones, "Control Structures in Legislation" in B Niblett (ed.),

Computer Science and Law (Cambridge University Press: New York, 1980)

p.157.

71 MJ Sergot, Programming Law: LEGOL as a Logic Programming Language

(Imperial College: London, 1980).

72 MJ Sergot, "The Representation of Law in Computer Programs",

Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal

Applications (Academic Press: London, 1991) at p.35.

73 IE Pratt, Epistemology and Artificial Intelligence (PhD

dissertation: Princeton, 1987) p.18; emphasis in original.

74 Susskind and Gold.

75 Including Bench-Capon, Cordingley, Forder, Frohlich, Gilbert,

Luff, Protman, Sergot, Storrs and Taylor; q.v. RN Moles, "Logic

Programming - An Assessment of Its Potential for Artificial

Intelligence Applications in Law" (1991) 2 Journal of Law and

Information Science 137 at pp.146-147.

76 RE Susskind, "The Latent Damage System" (1989) Proceedings Second

International Conference on Artificial Intelligence and Law 23 at

p.29.

77 On causality, note CG de'Bessonet & CR Cross, "Representation of

Some Aspects of Causality" in C Walter (ed.) Computing Power and

Legal Reasoning (West: St Paul, 1985) pp.205-214.

78 e.g. SR Goldman, MG Dyer & M Flowers, "Precedent-based Legal

Reasoning and Knowledge Acquisition in Contract Law: a Process Model"

(1987) Proceedings First International Conference on Artificial

Intelligence and Law 210 at pp.214-215 using Hohfeldian analysis of

rights; q.v. WN Hohfeld, "Some Fundamental Legal Conceptions As

Applied in Judicial Reasoning" (1917) 23 Yale Law Journal 16.

79 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.227.

80 P Leith, "Logic, Formal Models and Legal Reasoning" (1984)

Jurimetrics Journal 334 at pp.335-336.

81 N MacCormick, Legal Reasoning and Legal Theory (Oxford University

Press: Oxford, 1978) at p.37.

82 MJ Sergot, "A Brief Introduction to Logic Programming and Its

Applications in Law" Chapter Five in C Walter (ed.), Computer Power

and Legal Language (Quorum: London, 1988) at p.26.

83 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) pp.181-198, particularly at p.188.

84 LE Allen & CS Saxon, "Some Problems in Designing Expert Systems

to Aid Legal Reasoning" (1987) Proceedings First International

Conference on Artificial Intelligence and Law 94 at p.94.

85 RE Susskind, "The Latent Damage System" (1989) Proceedings Second

International Conference on Artificial Intelligence and Law 23 at

p.28.

86 Ibidem p.30.

87 AvdL Gardner, "Overview of an AI Approach to Legal Reasoning" in

C Walter (ed.), Computing Power and Legal Reasoning (West: St Paul,

1985) p.247.

88 HLA Hart, "Positivism and the Separation of Law and Morals"

(1958) 79 Harvard Law Review 593 at p.599.

89 G Gottlieb, The Logic of Choice: An Investigation of the Concepts

of Rule and Rationality (Allen & Unwin: London, 1968) p.17.

90 OC Jensen, The Nature of Legal Argument (Basil Blackwell: Oxford,

1957) p.16; A Wilson, "The Nature of Legal Reasoning: A Commentary

with Special Reference to Professor MacCormick's Theory" (1982) 2

Legal Studies 269 at pp.278-280.

91 MJ Detmold, The Unity of Law and Morality: A Refutation of Legal

Positivism (Routledge & Kegan Paul: London, 1984) p.15; c.f. RE

Susskind, "Detmold's Refutation of Positivism and the Computer Judge"

(1986) 49 Modern Law Review 125.

92 HLA Hart, "Positivism and the Separation of Law and Morals"

(1958) 79 Harvard Law Review 593 at p.607.

93 DB Skalak, "Taking Advantage of Models for Legal Classification"

(1989) Proceedings Second International Conference on Artificial

Intelligence and Law 234.

94 CD Hafner, "Conceptual Organisation of Case Law Knowledge Bases"

(1987) Proceedings First International Conference on Artificial

Intelligence and Law 35 at pp.36-37.

95 Note the diagram attached to RE Susskind, "The Latent Damage

System" (1989) Proceedings Second International Conference on

Artificial Intelligence and Law 23.

96 MJ Sergot, "Representing Legislation as Logic Programs" (1985) 11

Machine Intelligence 209.

97 MJ Sergot, F Sadri, RA Kowalski, F Kriwaczek, P Hammond, & HT

Cory, "The British Nationality Act as a Logic Program" (1986) 29

Communications of the ACM 370; and MJ Sergot, HT Cory, P Hammond, RA

Kowalski, F Kriwaczek, & F Sadri, "Formalisation of the British

Nationality Act" in C Arnold (ed.), Yearbook of Law, Computers and

Technology (Butterworths: London, 1986).

98 TJM Bench-Capon, GO Robinson, TW Routen & MJ Sergot, "Logic

Programming for Large Scale Applications in Law: A Formalisation of

Supplementary Benefit Legislation" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 190.

99 q.v. P Johnson & D Mead, "Legislative Knowledge Base Systems for

Public Administration: Some Practical Issues" (1991) Proceedings

Third International Conference on Artificial Intelligence and Law

74.

100 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.100.

101 The concept can be attributed to Quillian; q.v. MR Quillian,

"Word Concepts: A Theory and Simulation of Some Basic Semantic

Capabilities" (1967) 12 Behavioural Science 410.

102 WA Woods, "What's in a Link: Foundations for Semantic Networks"

in DG Bobrow & AM Collins (eds), Representation and Understanding:

Studies in Cognitive Science (Academic Press: New York, 1975) p.32.

103 M Minsky, "A Framework for Representing Knowledge" in J

Haugeland (ed.), Mind Design (MIT Press: Cambridge, 1981) p.95.

104 PJ Hayes, "The Logic of Frames" in D Metzing (ed.), Frame

Conceptions and Text Understanding (Walter de Gruyter: Berlin, 1979)

p.46.

105 MJ Sergot, "The Representation of Law in Computer Programs",

Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal

Applications (Academic Press: London, 1991) at p.48.

106 LT McCarty, "Reflections on TAXMAN: An Experiment in Artificial

Intelligence and Legal Reasoning" (1977) 90 Harvard Law Review 837;

and LT McCarty, "The TAXMAN Project: Towards a Cognitive Theory of

Legal Argument" in B Niblett (ed.), Computer Science and Law

(Cambridge University Press: New York, 1980).

107 PJ Hayes, "The Logic of Frames" in D Metzing (ed.), Frame

Conceptions and Text Understanding (de Gruyter: New York, 1979).

108 Example adapted from MJ Sergot, "The Representation of Law in

Computer Programs", Chapter One in TJM Bench-Capon, Knowledge-Based

Systems and Legal Applications (Academic Press: London, 1991) at

p.46.

109 KE Sanders, "Representing and Reasoning About Open-Textured

Predicates" (1991) Proceedings Third International Conference on

Artificial Intelligence and Law 137 at p.138.

110 LT McCarty & NS Sridharan, "The Representation of an Evolving

System of Legal Concepts II: Prototypes and Deformations" (1987)

Proceedings of the Seventh International Joint Conference on

Artificial Intelligence 246.

111 TJM Bench-Capon & MJ Sergot, "Toward a Rule-Based Representation

of Open Texture in Law" Chapter Six in C Walter (ed.), Computer Power

and Legal Language (Quorum: London, 1988) at p.47.

112 SS Weiner, "Reasoning About "Hard" Cases in Talmudic Law" (1987)

Proceedings First International Conference on Artificial Intelligence

and Law 222 at p.223.

113 P Leith, "Logic, Formal Models and Legal Reasoning" (1984)

Jurimetrics Journal 334 at p.356.

114 KA Lambert & MH Grunewald, "LESTER: Using Paradigm Cases in a

Quasi-Precedential Legal Domain" (1989) Proceedings Second

International Conference on Artificial Intelligence and Law 87.

115 J Popple, "Legal Expert Systems: The Inadequacy of a Rule-based

Approach" (1991) 23 Australian Computer Journal 11 at p.15.

116 Also note the GREBE system; q.v. LK Branting, "Representing and

Reusing Explanations of Legal Precedents" (1989) Proceedings Second

International Conference on Artificial Intelligence and Law 103.

117 RA Kowalski, "Case-based Reasoning and the Deep Structure

Approach to Knowledge Representation" (1991) Proceedings Third

International Conference on Artificial Intelligence and Law 21 at

p.23.

118 KD Ashley & EL Rissland, "Waiting on Weighting: a Symbolic Least

Commitment Approach" (1988) Proceedings American Association for

Artificial Intelligence.

119 MJ Sergot, "The Representation of Law in Computer Programs",

Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal

Applications (Academic Press: London, 1991) at p.65.

120 J Zeleznikow, "Building Intelligent Legal Tools - The IKBALS

Project" (1991) 2 Journal of Law and Information Science 165 at p.173.

121 EL Rissland & KD Ashley, "A Case-Based System for Trade Secrets

Law" (1987) Proceedings First International Conference on Artificial

Intelligence and Law 60.

122 M Betzer, "Legal Reasoning in 3-D" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 155 at

p.155.

123 RA Kowalski, "Case-based Reasoning and the Deep Structure

Approach to Knowledge Representation" (1991) Proceedings Third

International Conference on Artificial Intelligence and Law 21.

124 RA Kowalski, "Case-based Reasoning and the Deep Structure

Approach to Knowledge Representation" (1991) Proceedings Third

International Conference on Artificial Intelligence and Law 21 at

p.26.

125 G Greenleaf, A Mowbray & AL Tyree, "Expert Systems in Law: The

DATALEX Project" (1987) Proceedings First International Conference on

Artificial Intelligence and Law 9 at p.12.

126 e.g. SR Goldman, MG Dyer & M Flowers, "Precedent-based Legal

Reasoning and Knowledge Acquisition in Contract Law: a Process Model"

(1987) Proceedings First International Conference on Artificial

Intelligence and Law 210; and MT MacCrimmon, "Expert Systems in

Case-Based Law: The Hearsay Rule Adviser" (1989) Proceedings Second

International Conference on Artificial Intelligence and Law 68.

127 G Vossos, J Zeleznikow & T Dillon, "Combining Analogical and

Deductive Reasoning in Legal Knowledge Base Systems - IKBALS II" in

Cv Noortwijk, AHJ Schmidt & RGF Winkels (eds), Legal Knowledge Based

Systems: Aims for Research and Development (Koninklijke: Lelystad,

1991) p.97 at p.100.

128 The weighting scheme used by Kowalski was:

Highest level court = 70; appeal level court = 50; trial level court

= 30.

Add 10 points for trial or appeals local to the jurisdiction.

Deduct 15 points for foreign jurisdictions, except England, then 10

points.

Add 1 to 5 points if case is recent: 1986 = 1 to 1990 = 5.

See RA Kowalski, "Case-based Reasoning and the Deep Structure Approach

to Knowledge Representation" (1991) Proceedings Third International

Conference on Artificial Intelligence and Law 21.

129 G Vossos, T Dillon & J Zeleznikow, "The Use of Object Oriented

Principles to Develop Intelligent Legal Reasoning Systems" (1991) 23

Australian Computer Journal 2.

130 J Zeleznikow & D Hunter, "Rationales for the Continued

Development of Legal Expert Systems" (1992) 3 Journal of Law and

Information Science 94 at pp.102-103.

131 RE Susskind, "Expert Systems in Law: A Jurisprudential Approach

to Artificial Intelligence and Legal Reasoning" (1986) 49 Modern Law

Review 168 at p.171; see also RE Susskind, Expert Systems in Law: A

Jurisprudential Inquiry (Clarendon Press: Oxford, 1987) p.20.

132 RA Kowalski, "Case-Based Reasoning and the Deep Structure

Approach to Knowledge Representation" (1991) Proceedings Third

International Conference on Artificial Intelligence and Law 21 at

p.21.

133 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) pp.81-82.

134 Ibidem pp.21-23.

135 TJM Bench-Capon & J Forder, "Knowledge Representation for Legal

Applications" Chapter Twelve in TJM Bench-Capon, Knowledge-Based

Systems and Legal Applications (Academic Press: London, 1991) at

p.249.

136 HJ Levesque & RJ Brachman, "A Fundamental Tradeoff in Knowledge

Representation and Reasoning" Chapter Four in RJ Brachman & HJ

Levesque (eds), Readings in Knowledge Representation (Morgan

Kaufmann: Los Altos, 1985) at pp.66-67.

137 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.49.

138 q.v. TJM Bench-Capon & F Coenen, "Exploiting Isomorphism:

Development of a KBS to Support British Coal Insurance Claims" (1991)

Proceedings Third International Conference on Artificial Intelligence

and Law 62.

139 RN Moles, "Logic Programming - An Assessment of Its Potential

for Artificial Intelligence Applications in Law" (1991) 2 Journal of

Law and Information Science 137 at p.144.

140 Bench-Capon's own words; q.v. TJM Bench-Capon & J Forder,

"Knowledge Representation for Legal Applications" Chapter Twelve in

TJM Bench-Capon, Knowledge-Based Systems and Legal Applications

(Academic Press: London, 1991) at p.259.

141 e.g. C Biagioli, P Mariani & D Tiscornia, "ESPLEX: a Rule and

Conceptual Based Model for Representing Statutes" (1987) Proceedings

First International Conference on Artificial Intelligence and Law

240, and the examples previously cited.

142 e.g. DM Sherman, "A Prolog Model of the Income Tax Act of

Canada" (1987) Proceedings First International Conference on

Artificial Intelligence and Law 127; also note TAXMAN and like

projects cited.

143 B Niblett, "Computer Science and Law: An Introductory

Discussion" in B Niblett (ed.), Computer Science and Law (Cambridge

University Press: Cambridge, 1980) at pp.16-17.

144 For instance into ANF (Atomically Normalised Form) used with the

CCLIPS system (Civil Code Legal Information Processing System); q.v.

G Cross, CG de Bessonet, T Bradshaw, G Durham, R Gupta & M Nasiruddin,

"The Implementation of CCLIPS" Chapter Nine in C Walter (ed.),

Computer Power and Legal Language (Quorum: London, 1988) p.90.

145 JP Dick, "Conceptual Retrieval and Case Law" (1987) Proceedings

First International Conference on Artificial Intelligence and Law

106 at p.109; although such material may be classified as a source of

heuristics. Susskind does not address this, however.

146 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) pp.84-85; citing HLA Hart, The

Concept of Law (Clarendon Press: Oxford, 1961) p.131.

147 KE Sanders, "Representing and Reasoning About Open-Textured

Predicates" (1991) Proceedings Third International Conference on

Artificial Intelligence and Law 137 at p.142.

148 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.61; contra G Greenleaf, A Mowbray &

AL Tyree, "Expert Systems in Law: The DATALEX Project" (1987)

Proceedings First International Conference on Artificial Intelligence

and Law 9.

149 c.f. LB Solum, "On the Indeterminacy Crisis: Critiquing Critical

Dogma" (1987) 54 University of Chicago Law Review 462.

150 J Boyle, "Anatomy of a Torts Class" (1985) 34 American

University Law Review 131; see also M Kelman, "Trashing" (1984) 36

Stanford Law Review 293.

151 A Mason, "Future Directions in Australian Law" (1987) 13 Monash

Law Review 149, particularly at pp.154-155 and p.158; FG Brennan,

"Judicial Method and Public Law" (1979) 6 Monash Law Review 12; and M

McHugh, "The Law-making Function of the Judicial Process" (1988) 62

Australian Law Journal 15.

152 e.g. JC Smith and C Deedman, "The Application of Expert Systems

Technology to Case-Based Reasoning" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 84.

153 E Levi, An Introduction to Legal Reasoning (University of

Chicago Press: Chicago, 1949) pp.3-5.

154 AL Goodhart, "The Ratio Decidendi of a Case" (1930) 40 Yale Law

Journal 161.

155 J Stone, "The Ratio of the Ratio Decidendi" in Lord Lloyd & MDA

Freeman, Lloyd's Introduction to Jurisprudence (5th Ed.) (Stevens:

London, 1985) p.1164.

156 "It is unclear".

157 S Strömholm, Rätt, rättskällor och rättssystem (3rd Ed.)

(Norstedts: Stockholm, 1987) cited by P Wahlgren, "Legal Reasoning -

A Jurisprudence Description" (1989) Proceedings Second International

Conference on Artificial Intelligence and Law 147 at p.148.

158 TJM Bench-Capon & F Coenen, "Practical Application of KBS to

Law: The Crucial Role of Maintenance" in Cv Noortwijk, AHJ Schmidt &

RGF Winkels (eds), Legal Knowledge Based Systems: Aims for Research

and Development (Koninklijke: Lelystad, 1991) p.5.

159 P Bratley, J Frémont, E Mackaay & D Poulin, "Coping with Change"

(1991) Proceedings Third International Conference on Artificial

Intelligence and Law 69.

160 e.g. P Hammond, "Representation of DHSS Regulations as a Logic

Program" (1983) Proceedings of the 3rd British Computer Society

Expert Systems Conference 225; and the Estate Planning System; q.v.

DA Schlobohm & DA Waterman, "Explanation for an Expert System that

Performs Estate Planning" (1987) Proceedings First International

Conference on Artificial Intelligence and Law 18.

161 RA Kowalski & MJ Sergot, "The Use of Logical Models in Legal

Problem Solving" (1990) 3 Ratio Juris 201 at p.207.

162 DA Schlobohm & LT McCarty, "EPS II: Estate Planning With

Prototypes" (1989) Proceedings Second International Conference on

Artificial Intelligence and Law 1.

163 P Bratley, J Frémont, E Mackaay & D Poulin, "Coping with Change"

(1991) Proceedings Third International Conference on Artificial

Intelligence and Law 69.

164 AvdL Gardner, "Representing Developing Legal Doctrine" (1989)

Proceedings Second International Conference on Artificial

Intelligence and Law 16 at p.21.

165 Ibidem p.19.

166 A narrow definition of "information" is a common criticism of

modern expert systems; q.v. HL Dreyfus & SE Dreyfus, Mind over

Machine (Basil Blackwell: Oxford, 1986); T Roszak, The Cult of

Information (Pantheon: London, 1986); DR Hofstadter, Metamagical

Themas (Penguin Press: London, 1985); and T Winograd & F Flores,

Understanding Computers and Cognition: A New Foundation for Design

(Ablex: Norwood, 1986).

167 RN Moles, "Logic Programming - An Assessment of Its Potential

for Artificial Intelligence Applications in Law" (1991) 2 Journal of

Law and Information Science 137 at p.144.

168 C Biagioli, P Mariani & D Tiscornia, "ESPLEX: a Rule and

Conceptual Based Model for Representing Statutes" (1987) Proceedings

First International Conference on Artificial Intelligence and Law

240 at p.241.

169 C Smith & C Deedman, "The Application of Expert Systems

Technology to Case-Based Reasoning" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 84 at

p.87.

170 RE Susskind, "Expert Systems in Law - Out of the Research

Laboratory and into the Marketplace" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 1 at p.2.

171 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.78.

172 JC Smith & C Deedman, "The Application of Expert Systems

Technology to Case-Based Reasoning" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 84 at

p.85.

173 Approximately 50 texts and 100 articles; q.v. RE Susskind,

"Expert Systems in Law - Out of the Research Laboratory and into the

Marketplace" (1987) Proceedings First International Conference on

Artificial Intelligence and Law 1 at p.2; note the similarity to the

issue of epistemology of law.

174 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.27.

175 For instance, Dworkin's "principles"; q.v. RM Dworkin, Taking

Rights Seriously (Duckworth: London, 1977), RM Dworkin, A Matter of

Principle (Harvard University Press: London, 1985), and RM Dworkin,

Law's Empire (Fontana: London, 1986).

176 RE Susskind, "Expert Systems in Law - Out of the Research

Laboratory and into the Marketplace" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 1 at p.3.

177 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.254.

178 B Niblett, "Expert Systems for Lawyers" (1981) 29 Computers and

Law 2 at p.3.

179 PJ Hayes, "On the Differences Between Psychology and Artificial

Intelligence" in M Yazdani & A Narayanan, Artificial Intelligence:

Human Effects (Ellis Horwood: London, 1984) p.158.

180 DR Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid

(Harvester Press: New York, 1979) p.578.

181 RA Kowalski, "Leading Law Students to Uncharted Waters and

Making them Think: Teaching Artificial Intelligence and Law" (1991) 2

Journal of Law and Information Science 185 at p.187 nt.5.

182 D Brown, "The Third International Conference on Artificial

Intelligence and Law: Report and Comments" (1991) 2 Journal of Law

and Information Science 233 at p.238.

183 B Niblett, "Expert Systems for Lawyers" (1981) 29 Computers and

Law 2 at p.3.

184 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.7.

185 P Leith, "Clear Rules and Legal Expert Systems" in AA Martino &

F Socci (eds), Automated Analysis of Legal Texts (North-Holland:

Amsterdam, 1986) p.661; and P Leith, "Fundamental Errors in Legal

Logic Programming" (1986) 3 The Computer Journal 29.

186 LT McCarty, "Some Requirements for a Computer-based Legal

Consultant" (Research Report: Rutgers University, 1980) at pp.2-3,

cited in RN Moles, Definition and Rule in Legal Theory: A

Reassessment of HLA Hart and the Positivist Tradition (Basil

Blackwell: Oxford, 1987) p.269; emphasis added.

187 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.53.

188 MA Boden, Artificial Intelligence and Natural Man (Basic Books:

New York, 1977), cited in D Partridge, "Social Implications of

Artificial Intelligence" Chapter Thirteen in M Yazdani (ed.),

Artificial Intelligence: Principles and Applications (Chapman &

Hall: London, 1986) at p.326. See also D Partridge, Artificial

Intelligence: Applications in the Future of Software Engineering

(Ellis Horwood, Chichester, 1986).

189 RN Moles, "Logic Programming: An Assessment of its Potential for

Artificial Intelligence Applications in Law" (1991) 2 Journal of Law

and Information Science 137 at p.161.

190 TJM Bench-Capon, "Deep Models, Normative Reasoning and Legal

Expert Systems" (1989) Proceedings Second International Conference on

Artificial Intelligence and Law 37 at p.37.

191 Glaucoma diagnosis system.

192 LT McCarty, "Intelligent Legal Information Systems: Problems and

Prospects" in CM Campbell (ed.), Data Processing and the Law (Sweet &

Maxwell: London, 1984) p.126.

193 SS Weiner, "Reasoning About 'Hard' Cases in Talmudic Law" (1987)

Proceedings First International Conference on Artificial Intelligence

and Law 222 at p.223.

194 MJ Sergot, HT Cory, P Hammond, RA Kowalski, F Kriwaczek & F

Sadri, "Formalisation of the British Nationality Act" (1986) 2

Yearbook of Law, Computers and Technology; and TJM Bench-Capon, GO

Robinson, TW Routen & MJ Sergot, "Logic Programming for Large Scale

Applications in Law" (1987) Proceedings First International

Conference on Artificial Intelligence and Law 190.

195 JC Smith & C Deedman, "The Application of Expert Systems

Technology to Case-Based Reasoning" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 84 at

p.85.

196 RA Kowalski, "Case-Based Reasoning and the Deep Structure

Approach to Knowledge Representation" (1991) Proceedings Third

International Conference on Artificial Intelligence and Law 21 at

p.22.

197 LT McCarty & NS Sridharan, "The Representation of an Evolving

System of Legal Concepts II: Prototypes and Deformations" (1981)

Proceedings of the Seventh International Joint Conference on

Artificial Intelligence 246 at p.250.

198 D Makinson, "How to Give it Up: A Survey of Some Formal Aspects

of the Logic of Theory Change" (1985) 62 Synthese 347.

199 P Bratley, J Frémont, E Mackaay & D Poulin, "Coping with Change"

(1991) Proceedings Third International Conference on Artificial

Intelligence and Law 69 at p.74.

200 See RM Dworkin, Law's Empire (Fontana: London, 1986) pp.250-254

on the difficulty in "compartmentalization" of the law.

201 LT McCarty, "Some Requirements for a Computer-based Legal

Consultant" (Research Report: Rutgers University, 1980) cited in MJ

Sergot, "The Representation of Law in Computer Programs", Chapter One

in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications

(Academic Press: London, 1991) at pp.46-47.

202 P Leith, "Logic, Formal Models and Legal Reasoning" (1984)

Jurimetrics Journal 334 at p.356.

203 NE Simmonds, "Between Positivism and Idealism" (1991) 50

Cambridge Law Journal 308 at pp.312-313.

204 M Minsky, "A Framework for Representing Knowledge" in J

Haugeland (ed.), Mind Design (MIT Press: Cambridge, 1981) p.95 at

p.100.

205 It is also ironic in light of Hart's alleged reliance on

Wittgenstein's linguistic philosophy; q.v. Cotterrell, The Politics

of Jurisprudence: A Critical Introduction to Legal Philosophy

(Butterworths: London, 1989) pp.89-90.

206 J Vaux, "AI and Philosophy: Recreating Naive Epistemology"

Chapter Seven in KS Gill (ed.), Artificial Intelligence for Society

(John Wiley & Sons: London, 1986) p.76; q.v. L Wittgenstein,

Philosophical Investigations (Basil Blackwell: London, 1953).

207 RA Kowalski & MJ Sergot, "The Uses of Logical Models in Legal

Problem Solving" (1990) 3 Ratio Juris 201 at p.205.

208 See HLA Hart, "Definition and Theory in Jurisprudence" (1954) 70

Law Quarterly Review 37.

209 L Fuller, "Positivism and Fidelity to Law - A Reply to Professor

Hart" (1958) 71 Harvard Law Review 630 at p.666.

210 Raz's approach to adjudication shares similar characteristics;

q.v. J Raz, "The Problem about the Nature of Law" (1983) 31

University of Western Ontario Law Review 202 at pp.213-216.

211 HLA Hart, The Concept of Law (Clarendon Press: Oxford, 1961)

p.126.

212 KR Popper, Conjectures and Refutations (4th Ed.) (Routledge and

Kegan Paul: London, 1972) p.46.

213 JW Harris, Law and Legal Science (Clarendon Press: Oxford, 1979)

p.166.

214 M Polanyi, Personal Knowledge - Towards a Post-Critical

Philosophy (Routledge and Kegan Paul: London, 1958).

215 DC Berry, "The Problem of Implicit Knowledge" (1987) 4 Expert

Systems 144.

216 DC Berry & A Hart, "The Way Forward" in DC Berry & A Hart (eds)

Expert Systems: Human Issues (MIT: Cambridge, 1990) p.256.

217 E Husserl, Cartesian Meditations (Martinus Nijhoff: The Hague,

1960) pp.54-55.

218 B MacLennan, "Logic for the New AI" in JH Fetzer (ed.), Aspects

of Artificial Intelligence (Kluwer: Dordrecht, 1988) at p.163.

219 C Hempel, Philosophy of Natural Science (Prentice Hall: London,

1966).

220 A Narayanan, "Why AI Cannot be Wrong" Chapter Five in KS Gill

(ed.), Artificial Intelligence for Society (John Wiley & Sons:

London, 1986) at p.48.

221 P Davies, "Living in a Non-Material World - the New Scientific

Consciousness" (1991) The Australian (9th October) pp.18-19 at p.19.

222 RS Pound, "Mechanical Jurisprudence" (1908) 8 Columbia Law

Review 605.

223 J Searle, "Minds, Brains and Programs" (1980) 3 Behavioural and

Brain Sciences 417.

224 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry

(Clarendon Press: Oxford, 1987) p.241.

225 RHS Tur, "Positivism, Principles, and Rules" in E Attwooll (ed.),

Perspectives in Jurisprudence (University of Glasgow Press: Glasgow,

1977) at p.51.

226 EL Rissland & DB Skalak, "Interpreting Statutory Predicates"

(1989) Proceedings Second International Conference on Artificial

Intelligence and Law 46 at p.46.

227 MJ Detmold, "Law as Practical Reason" (1989) 48 Cambridge Law

Journal 436 at p.460.

228 Ibidem p.439.

229 L Fuller, "Positivism and Fidelity to Law - A Reply to Professor

Hart" (1958) 71 Harvard Law Review 630 at p.663; see also RS Summers,

"Professor Fuller on Morality and Law" in RS Summers (ed.), More

Essays on Legal Philosophy: General Assessment of Legal Philosophies

(Basil Blackwell: Oxford, 1971) at pp.117-119.

230 F Schauer, Playing by the Rules: A Philosophical Examination of

Rule-based Decision-Making in Law and in Life (Clarendon Press:

Oxford, 1991) at pp.59-60.

231 HLA Hart, The Concept of Law (Clarendon Press: Oxford, 1961)

p.56.

232 H Williamson, "Some Implications of Acceptance of Law as Rule

Structure" (1967) 3 Adelaide Law Review 18 at pp.42-43.

233 c.f. A Glass, "Interpretive Practices in Law and Literary

Criticism" (1991) 7 Australian Journal of Law & Society 16.

234 DN Herman, "Phenomenology, Structuralism, Hermeneutics, and

Legal Study: Applications of Contemporary Continental Thought to Legal

Phenomena" (1982) 36 University of Miami Law Review 379.

235 P Linzer, "Precise Meaning and Open Texture in Legal Writing and

Reading" Chapter Two in C Walter (ed.), Computer Power and Legal

Language (Quorum: London, 1988).

236 M Weait, "Swans Reflecting Elephants: Imagery and the Law"

(1992) 3 Law and Critique 59 at p.66.

237 P Gabel & P Harris, "Building Power and Breaking Images:

Critical Legal Theory and the Practice of Law" (1982-83) 11 Review of

Law & Social Change 369 at p.370.

238 HL Dreyfus, "From Micro-Worlds to Knowledge Representation: AI

at an Impasse" in J Haugeland (ed.), Mind Design (MIT Press:

Cambridge, 1981) p.161 at p.170.

239 DG Bobrow & T Winograd, "An Overview of KRL, A Knowledge

Representation Language" (1977) 1 Cognitive Science 3 at p.32.

240 C Fried, "Sonnet LXV and the 'Black Ink' of the Framer's

Intention" (1987) 100 Harvard Law Review 751 at pp.757-758.

241 J Weizenbaum, Computer Power and Human Reason: From Judgment to

Calculation (WH Freeman & Co: San Francisco, 1976) cited in D

Partridge, "Social Implications of Artificial Intelligence" Chapter

Thirteen in M Yazdani (ed.), Artificial Intelligence: Principles and

Applications (Chapman & Hall: London, 1986) at pp.330-331.

242 Contra note MA Boden, "AI and Human Freedom" in M Yazdani & A

Narayanan (eds), Artificial Intelligence: Human Effects (Ellis

Horwood: Chichester, 1984).

243 C Tapper, "Lawyers and Machines" (1963) 26 Modern Law Review

121.

244 Ibidem p.128.

245 c.f. TJM Bench-Capon, "Deep Models, Normative Reasoning and

Legal Expert Systems" (1989) Proceedings Second International

Conference on Artificial Intelligence and Law 37 at p.42.

246 J Zeleznikow, "Building Intelligent Legal Tools - The IKBALS

Project" (1991) 2 Journal of Law and Information Science 165.

247 e.g. DE Wolstenholme, "Amalgamating Regulation and Case-based

Advice Systems through Suggested Answers" (1989) Proceedings Second

International Conference on Artificial Intelligence and Law 63.

248 c.f. R Wright, "The Cybernauts have Landed" (1991) Law Institute

Journal 490 at p.491.

249 C Tapper, "Lawyers and Machines" (1963) 26 Modern Law Review 121

at p.126.

250 q.v. PJ Ward, "Computerisation of Legal Material in Australia"

(1982) 1 Journal of Law and Information Science 162.

251 J Bing, "The Text Retrieval System as a Conversation Partner" in

C Arnold (ed.) Yearbook of Law, Computers and Technology

(Butterworths: London, 1986) p.25.

252 G Greenleaf, "Australian Approaches to Computerising Law -

Innovation and Integration" (1991) 65 Australian Law Journal 677.

253 SJ Latham, "Beyond Boolean Logic: Probabilistic Approaches to

Text Retrieval" (1991) 22 The Law Librarian 157.

254 J Bing, "Legal Text Retrieval Systems: The Unsatisfactory State

of the Art" (1986) 2 Journal of Law and Information Science 1 at

pp.16-17.

255 RM Tong, CA Reid, GJ Crowe & PR Douglas, "Conceptual Legal

Document Retrieval Using the RUBRIC System" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 28; and

J Bing, "Designing Text Retrieval Systems for 'Conceptual Searching'"

(1987) Proceedings First International Conference on Artificial

Intelligence and Law 43.

256 J Kolodner, "Maintaining Organisation in a Dynamic Long-Term

Memory" (1983) 7 Cognitive Science; CD Hafner, "Conceptual

Organisation of Case Law Knowledge Bases" (1987) Proceedings First

International Conference on Artificial Intelligence and Law 35.

257 T Mitchell, "Learning and Problem Solving" (1983) Proceedings of

International Joint Conference on Artificial Intelligence.

258 DE Rose & RK Belew, "Legal Information Retrieval: A Hybrid

Approach" (1989) Proceedings Second International Conference on

Artificial Intelligence and Law 138.

259 CD Hafner, "Conceptual Organisation of Case Law Knowledge Bases"

(1987) Proceedings First International Conference on Artificial

Intelligence and Law 35.

260 LT McCarty, "On the Role of Prototypes in Appellate Legal

Argument" (1991) Proceedings Third International Conference on

Artificial Intelligence and Law 185 at p.186.

261 TJM Bench-Capon & MJ Sergot, "Toward a Rule-Based Representation

of Open Texture in Law" Chapter Six in C Walter (ed.), Computer Power

and Legal Language (Quorum: London, 1988) at p.58.

262 KD Ashley, "Toward a Computational Theory of Arguing with

Precedents: Accommodating Multiple Interpretations of Cases" (1989)

Proceedings Second International Conference on Artificial

Intelligence and Law 99; and KD Ashley & EL Rissland, "But See,

Accord: Generating 'Blue Book' Citations in HYPO" (1987) Proceedings

First International Conference on Artificial Intelligence and Law

67.

263 EL Rissland, "Examples in Legal Reasoning: Legal Hypotheticals"

(1983) Proceedings Eighth International Joint Conference on

Artificial Intelligence 90; EL Rissland & EM Soloway, "Overview of

an Example Generation System" (1980) Proceedings First Annual

National Conference on Artificial Intelligence; and EL Rissland, EM

Valcarce & KD Ashley, "Explaining and Arguing with Examples" (1984)

Proceedings National Conference on Artificial Intelligence.

264 CC Marshall, "Representing the Structure of a Legal Argument"

(1989) Proceedings Second International Conference on Artificial

Intelligence and Law 121. On the structure of legal argument see S

Toulmin, The Uses of Argument (Cambridge University Press: Cambridge,

1958); S Toulmin, RD Rieke & A Janik, An Introduction to Reasoning

(MacMillan Press: New York, 1979); and C Perelman, The Idea of

Justice and the Problem of Argument (Routledge & Kegan Paul: London,

1963).

265 KD Ashley & EL Rissland, "Toward Modelling Legal Argument" in AA

Martino & F Socci (eds), Automated Analysis of Legal Texts

(North-Holland: Amsterdam, 1986) at p.19; also KD Ashley, "Toward a

Computational Theory of Arguing with Precedents" (1989) Proceedings

Second International Conference on Artificial Intelligence and Law

93.

266 EL Rissland, "Learning How to Argue: Using Hypotheticals" (1984)

Proceedings First Annual Conference on Theoretical Issues in

Conceptual Information Processing; EL Rissland, "Argument Moves and

Hypotheticals" in C Walter (ed.), Computing Power and Legal Reasoning

(West Publishing: St Paul, 1985).

267 RH Michaelson, "An Expert System for Federal Tax Planning"

(1984) 1 Expert Systems 2.

268 e.g. the Retirement Pension Forecast and Advice System (relying

on the Aion Development System shell); q.v. S Springel-Sinclair & G

Trevena, "The DHSS Retirement Pension Forecast and Advice System" in

P Duffin (ed.) Knowledge Based Systems: Applications in

Administrative Government (Ellis Horwood: Chichester, 1988).

269 G De Jong, "Towards a Model of Conceptual Knowledge Acquisition

Through Directed Experimentation" (1983) Proceedings of International

Joint Conference on Artificial Intelligence.

270 Legal Decision-making System; q.v. DA Waterman & MA Peterson,

"Rule-based Models of Legal Expertise" (1980) Proceedings First

Annual National Conference on Artificial Intelligence 272; and DA

Waterman & MA Peterson, "Evaluating Civil Claims: An Expert Systems

Approach" (1984) Expert Systems 1.

271 q.v. DA Waterman, RH Anderson, F Hayes-Roth, P Klahr, G Martins

& SJ Rosenschein, Design of a Rule-Oriented System for Implementing

Expertise (Rand Corporation: Santa Monica, 1979).

272 System for Asbestos Litigation; q.v. DA Waterman, J Paul & MA

Peterson, "Expert Systems for Legal Decision Making" (1986) 4 Expert

Systems 212.

273 G Greenleaf, "Australian Approaches to Computerising Law -

Innovation and Integration" (1991) 65 Australian Law Journal 677 at

p.679.

274 SS Nagel & R Barczyk, "Can Computers Aid the Dispute Resolution

Process?" (1988) 71 Judicature 253.

275 q.v. WM Bain, Toward a Model of Subjective Interpretation

(Department of Commerce Research Report: Yale University, 1984) cited

in MJ Sergot, "The Representation of Law in Computer Programs"

Chapter One in TJM Bench-Capon (ed.), Knowledge-Based Systems and

Legal Applications (Academic Press: London, 1991) p.16.

276 S Torrance, "Breaking out of the Chinese Room" in M Yazdani

(ed.), Artificial Intelligence: Principles and Applications (Chapman

& Hall: London, 1986) at p.301.

277 TW Bynum, "Artificial Intelligence, Biology, and Intentional

States" (1985) 16 Metaphilosophy 355.

278 T Cuda, "Against Neural Chauvinism" (1985) 48 Philosophical

Studies 111.

279 DE Rumelhart, JL McClelland and the PDP Research Group, Parallel

Distributed Processing: Explorations in the Microstructure of

Cognition (MIT Press: Cambridge, 1986).

280 AL Tyree, "The Logic Programming Debate" (1992) 3 Journal of Law

and Information Science 111 at p.115.

281 q.v. RA Clarke, Knowledge-Based Expert Systems: Risk Factors and

Potentially Profitable Application Areas (Working paper: Department of

Commerce, Australian National University, 1988).

282 M Aultman, "Technology and the End of Law" (1972) 17 American

Journal of Jurisprudence 46 at pp.49-52.