The explicit-control evaluator of section 5.4 is a register machine whose controller interprets JavaScript programs. In this section we will see how to run JavaScript programs on a register machine whose controller is not a JavaScript interpreter.
The explicit-control evaluator machine is universal—it can carry out any computational process that can be described in JavaScript. The evaluator's controller orchestrates the use of its data paths to perform the desired computation. Thus, the evaluator's data paths are universal: They are sufficient to perform any computation we desire, given an appropriate controller.[1]
Commercial general-purpose computers are register machines organized around a collection of registers and operations that constitute an efficient and convenient universal set of data paths. The controller for a general-purpose machine is an interpreter for a register-machine language like the one we have been using. This language is called the native language of the machine, or simply machine language. Programs written in machine language are sequences of instructions that use the machine's data paths. For example, the explicit-control evaluator's instruction sequence can be thought of as a machine-language program for a general-purpose computer rather than as the controller for a specialized interpreter machine.
There are two common strategies for bridging the gap between higher-level languages and register-machine languages. The explicit-control evaluator illustrates the strategy of interpretation. An interpreter written in the native language of a machine configures the machine to execute programs written in a language (called the source language) that may differ from the native language of the machine performing the evaluation. The primitive functions of the source language are implemented as a library of subroutines written in the native language of the given machine. A program to be interpreted (called the source program) is represented as a data structure. The interpreter traverses this data structure, analyzing the source program. As it does so, it simulates the intended behavior of the source program by calling appropriate primitive subroutines from the library.
In this section, we explore the alternative strategy of compilation. A compiler for a given source language and machine translates a source program into an equivalent program (called the object program) written in the machine's native language. The compiler that we implement in this section translates programs written in JavaScript into sequences of instructions to be executed using the explicit-control evaluator machine's data paths.[2]
Compared with interpretation, compilation can provide a great increase in the efficiency of program execution, as we will explain below in the overview of the compiler. On the other hand, an interpreter provides a more powerful environment for interactive program development and debugging, because the source program being executed is available at run time to be examined and modified. In addition, because the entire library of primitives is present, new programs can be constructed and added to the system during debugging.
In view of the complementary advantages of compilation and interpretation, modern program-development environments pursue a mixed strategy. These systems are generally organized so that interpreted functions and compiled functions can call each other. This enables a programmer to compile those parts of a program that are assumed to be debugged, thus gaining the efficiency advantage of compilation, while retaining the interpretive mode of execution for those parts of the program that are in the flux of interactive development and debugging.[3] In section 5.5.7, after we have implemented the compiler, we will show how to interface it with our interpreter to produce an integrated interpreter-compiler development system.
Our compiler is much like our interpreter, both in its structure and in the function it performs. Accordingly, the mechanisms used by the compiler for analyzing components will be similar to those used by the interpreter. Moreover, to make it easy to interface compiled and interpreted code, we will design the compiler to generate code that obeys the same conventions of register usage as the interpreter: The environment will be kept in the env register, argument lists will be accumulated in argl, a function to be applied will be in fun, functions will return their answers in val, and the location to which a function should return will be kept in continue. In general, the compiler translates a source program into an object program that performs essentially the same register operations as would the interpreter in evaluating the same source program.
This description suggests a strategy for implementing a rudimentary compiler: We traverse the component in the same way the interpreter does. When we encounter a register instruction that the interpreter would perform in evaluating the component, we do not execute the instruction but instead accumulate it into a sequence. The resulting sequence of instructions will be the object code. Observe the efficiency advantage of compilation over interpretation. Each time the interpreter evaluates a component—for example, f(96, 22)—it performs the work of classifying the component (discovering that this is a function application) and testing for the end of the list of argument expressions (discovering that there are two argument expressions). With a compiler, the component is analyzed only once, when the instruction sequence is generated at compile time. The object code produced by the compiler contains only the instructions that evaluate the function expression and the two argument expressions, assemble the argument list, and apply the function (in fun) to the arguments (in argl).
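The accumulate-instead-of-execute idea can be sketched in miniature. The following is a hypothetical toy, not the book's compiler: a recursive walk over a component that, instead of performing each register operation, collects a textual instruction into a list. The component representation (numbers for literals, strings for names, an object with fun and args fields for applications) is an assumption made for this sketch only.

```javascript
// Toy sketch (hypothetical; not the book's compiler): instead of
// executing register operations, accumulate them as instruction text.
function compile_toy(component) {
    if (typeof component === "number") {
        // a literal: the interpreter would put its value in val
        return ["assign val (const " + component + ")"];
    } else if (typeof component === "string") {
        // a name: the interpreter would look it up in the environment
        return ["assign val (op lookup_symbol_value) (const " +
                component + ") (reg env)"];
    } else {
        // an application: compile the function expression and the
        // argument expressions, and splice their sequences together
        const fun_code = compile_toy(component.fun)
                             .concat(["assign fun (reg val)"]);
        const arg_code = component.args.flatMap(arg =>
            compile_toy(arg).concat(["push val onto argl"]));
        return fun_code.concat(arg_code,
                               ["apply fun to argl; result in val"]);
    }
}

const code = compile_toy({ fun: "f", args: [96, 22] });
// code is the object "program": a flat list of instruction strings,
// each produced exactly once, at compile time.
```

Note that the classification work (is this a literal, a name, an application?) happens only while building the list; running the resulting instructions would involve no such tests.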
This is the same kind of optimization we implemented in the analyzing evaluator of section 4.1.7. But there are further opportunities to gain efficiency in compiled code. As the interpreter runs, it follows a process that must be applicable to any component in the language. In contrast, a given segment of compiled code is meant to execute some particular component. This can make a big difference, for example in the use of the stack to save registers. When the interpreter evaluates a component, it must be prepared for any contingency. Before evaluating a subcomponent, the interpreter saves all registers that will be needed later, because the subcomponent might require an arbitrary evaluation. A compiler, on the other hand, can exploit the structure of the particular component it is processing to generate code that avoids unnecessary stack operations.
As a case in point, consider the application f(96, 22). Before the interpreter evaluates the function expression of the application, it prepares for this evaluation by saving the registers containing the argument expressions and the environment, whose values will be needed later. The interpreter then evaluates the function expression to obtain the result in val, restores the saved registers, and finally moves the result from val to fun. However, in the particular application we are dealing with, the function expression is the name f, whose evaluation is accomplished by the machine operation lookup_symbol_value, which does not alter any registers. The compiler that we implement in this section will take advantage of this fact and generate code that evaluates the function expression using the instruction
assign("fun", list(op("lookup_symbol_value"), constant("f"), reg("env")))
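One way to picture what such an instruction is: a piece of data built by constructors. The following minimal sketch represents the constructors assign, op, constant, reg, and list as functions producing tagged JavaScript arrays; this representation is an assumption made here for illustration, not the representation the book itself defines for its register-machine simulator.

```javascript
// Hypothetical sketch: instruction constructors as tagged arrays.
// The book's own representation is defined with its simulator.
const op       = name  => ["op", name];            // machine operation
const constant = value => ["constant", value];     // literal operand
const reg      = name  => ["reg", name];           // register operand
const list     = (...items) => items;              // operand list
const assign   = (target, value_exp) =>            // whole instruction
                     ["assign", target, value_exp];

const instr = assign("fun", list(op("lookup_symbol_value"),
                                 constant("f"), reg("env")));
// instr is now a data structure describing the instruction, ready to
// be accumulated into an object program and executed later.
```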
A compiler can also optimize access to the environment. Having analyzed the code, the compiler can in many cases know in which frame the value of a particular name will be located and access that frame directly, rather than performing the lookup_symbol_value search. We will discuss how to implement such lexical addressing in section 5.5.6. Until then, however, we will focus on the kind of register and stack optimizations described above. There are many other optimizations that can be performed by a compiler, such as coding primitive operations in line instead of using a general apply mechanism (see exercise 5.42); but we will not emphasize these here. Our main goal in this section is to illustrate the compilation process in a simplified (but still interesting) context.
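The lexical-addressing idea mentioned above can be sketched as follows; this is a hypothetical illustration under an assumed environment representation (each frame an array of values, with names erased at compile time), not the implementation developed in section 5.5.6. If compile-time analysis determines that a name lives in frame number f at position p, run-time lookup becomes a fixed walk and an indexed access instead of a search by name.

```javascript
// Hypothetical sketch of lexical addressing. Environment: a chain of
// frames, each holding an array of values (names erased at compile time).
function lexical_address_lookup(frame_number, displacement, env) {
    let frames = env;
    for (let i = 0; i < frame_number; i = i + 1) {
        frames = frames.enclosing;      // walk out frame_number frames
    }
    return frames.frame[displacement];  // direct indexed access
}

// Environment for  x => y => ...  applied to 3, then to 4:
const env = { frame: [4], enclosing: { frame: [3], enclosing: null } };

lexical_address_lookup(0, 0, env);  // y: current frame, position 0
lexical_address_lookup(1, 0, env);  // x: one frame out, position 0
```

The point is that no name comparison happens at run time; the compiler has already done that work once, when it computed the pair (frame number, displacement).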