The algorithmic state machine (ASM) is a method for designing finite state machines (FSMs) originally developed by Thomas E. Osborne at the University of California, Berkeley (UCB) beginning in 1960,[1] introduced to and implemented at Hewlett-Packard in 1968, formalized and expanded from 1967 onward, and written about by Christopher R. Clare from 1970.[2][3][4] It is used to represent diagrams of digital integrated circuits. The ASM diagram resembles a state diagram but is more structured and thus easier to understand. An ASM chart is a method of describing the sequential operations of a digital system.
The ASM method is composed of the following steps:
1. Create an algorithm, using pseudocode, to describe the desired operation of the device.
2. Convert the pseudocode into an ASM chart.
3. Design the datapath based on the ASM chart.
4. Create a detailed ASM chart based on the datapath.
5. Design the control logic based on the detailed ASM chart.
An ASM chart consists of an interconnection of four types of basic elements: state name, state box, decision box, and conditional outputs box (a brief code sketch of these elements follows the list below). An ASM state, represented as a rectangle, corresponds to one state of a regular state diagram or finite state machine. The Moore-type outputs are listed inside the box.
State Name: The name of the state is indicated inside a circle placed in the upper-left corner of the state box; alternatively, the name is written next to the box without a circle.
State Box: The outputs of the state are indicated inside the rectangular state box.
Decision Box: A diamond indicates that the stated condition or expression is to be tested and the exit path is to be chosen accordingly. The condition expression contains one or more inputs to the FSM (finite state machine). An ASM condition check, indicated by a diamond with one entry and two exit paths (for true and false), is used to conditionally transfer between two state boxes, to another decision box, or to a conditional output box.
Conditional Output Box: An oval denotes the output signals that are of Mealy type. These outputs depend not only on the state but also on the inputs to the FSM.
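As an illustration of how the four chart elements relate, the Python sketch below models a deliberately simple two-state machine. The state names, inputs, and outputs (IDLE, RUN, start, busy, load) are invented for this example and are not taken from the cited sources: the state box corresponds to the Moore output that depends only on the state, the decision box to the test of an input, and the conditional output box to the Mealy output raised only along one exit path.

```python
# Minimal ASM-style FSM sketch: one Moore output per state box,
# one decision box testing the input 'start', and one Mealy output
# asserted only on the taken branch (conditional output box).
# All names are illustrative assumptions, not from the cited sources.

def asm_step(state, start):
    """Return (next_state, moore_outputs, mealy_outputs) for one clock."""
    moore = {"busy": state == "RUN"}   # state box: output depends only on the state
    mealy = {"load": False}            # conditional output box (Mealy)

    if state == "IDLE":
        if start:                      # decision box: test an FSM input
            mealy["load"] = True       # conditional output: depends on state AND input
            next_state = "RUN"
        else:
            next_state = "IDLE"
    else:  # state == "RUN"
        next_state = "IDLE"

    return next_state, moore, mealy


# Usage: drive the machine for a few clock cycles.
state = "IDLE"
for start in (0, 1, 0):
    state, moore, mealy = asm_step(state, start)
    print(state, moore, mealy)
```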
Once the desired operation of a circuit has been described using RTL operations, the datapath components may be derived. Every unique variable that is assigned a value in the RTL program can be implemented as a register. Depending on the functional operation performed when assigning a value to a variable, the register for that variable may be implemented as a straightforward register, a shift register, a counter, or a register preceded by a combinational logic block. The combinational logic block associated with a register may implement an adder, subtractor, multiplexer, or some other type of combinational logic function.
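For example, the behavioural Python sketch below (an editorial illustration; the class names and the 8-bit width are assumptions, not from the cited sources) models the register variants mentioned above: a plain register, a counter, a shift register, and a register preceded by a combinational adder, each corresponding to one RTL variable.

```python
# Behavioural sketch of datapath register variants derived from RTL assignments.
# Widths and names are illustrative assumptions.

class Register:
    """Plain register: Q <= D on each enabled clock."""
    def __init__(self, width=8):
        self.q, self.mask = 0, (1 << width) - 1
    def clock(self, d, load=True):
        if load:
            self.q = d & self.mask
        return self.q

class Counter(Register):
    """Counter: realises an RTL assignment of the form A <- A + 1."""
    def clock(self, enable=True, **_):
        if enable:
            self.q = (self.q + 1) & self.mask
        return self.q

class ShiftRegister(Register):
    """Shift register: realises A <- shift_left(A) with a serial input bit."""
    def clock(self, serial_in=0, enable=True):
        if enable:
            self.q = ((self.q << 1) | (serial_in & 1)) & self.mask
        return self.q

class Accumulator(Register):
    """Register preceded by combinational logic (here an adder): A <- A + B."""
    def clock(self, b=0, load=True):
        if load:
            self.q = (self.q + b) & self.mask
        return self.q

# Usage: each RTL variable becomes one register instance in the datapath.
acc = Accumulator()
print(acc.clock(b=5), acc.clock(b=7))   # prints 5, then 12
```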
Once the datapath is designed, the ASM chart is converted to a detailed ASM chart. The RTL notation is replaced by signals defined in the datapath.
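As a sketch of this conversion (the signal names acc_load, alu_op, and b_select are invented for the example and are not defined by the cited sources), an RTL note such as "ACC <- ACC + B" inside a state box of the initial chart is replaced, in the detailed chart, by the datapath control signals that realise it:

```python
# Sketch: one state of a detailed ASM chart expressed as the control signals
# that drive the datapath. Signal names are illustrative assumptions.

DETAILED_CHART = {
    # state   : control-signal assertions replacing the RTL note "ACC <- ACC + B"
    "S_ADD"   : {"acc_load": 1, "alu_op": "ADD", "b_select": 1},
    # state with no register transfer: all control signals deasserted
    "S_IDLE"  : {"acc_load": 0, "alu_op": "NOP", "b_select": 0},
}

def control_signals(state):
    """Control logic: map the current state to datapath control signals."""
    return DETAILED_CHART[state]

print(control_signals("S_ADD"))
```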
The second annual IEEE Workshop on Microprocessors (now called the Asilomar Microcomputer Workshop, or AMW) was held Wednesday–Friday, April 28–30, 1976, near Monterey, California […] My Wednesday evening talk described tools that enabled a very different design methodology—Algorithmic State Machine design (ASM)—using Lyapunov state-variable mathematics, and derivative techniques pioneered at HP by Chris Clare and Dave Cochran for the spectacularly successful handheld scientific calculators (e.g., HP 35) […] My point: circuit design was no longer an element-by-element issue, but a question of "state flow" at lots of nodes—the sequential "words" of registers rather than the voltages of device pins. In effect, it argued that electronic voltages, whether analogic or switched, would "lose out" to software instructions, and "data states." Systems would be designed and analyzed for proper state sequencing rather than analogic signal distortion or digital switching times. […] I'd already seen the power of pre-publication books. Clare's insightful ASM methodology text, Designing Logic Systems Using State Machines, swept through the HP design community […] Stanford's electrical engineering department was not so sanguine, however, canceling Clare's course in 1974, saying that "it is a little bit too unconventional" […] Stanford preferred Quine–McCluskey minimization techniques. Fittingly, Mead's Caltech colleague Ivan Sutherland prepared a Scientific American article (1977) […] about the challenge microelectronics posed to computing theory and practice, noting that since most of a chip's surface was occupied by "wires" (conducting pathways) rather than "components" (transistors), decades of minimization theory in logic design had become irrelevant […](4 pages)
[…] In your April issue you published a letter by R. L. Dineley describing a simple method for treating product-of-sums logical expressions. […] An even simpler method is taught by D. A. Huffman. This method is based on recognizing that the Boolean expression will be zero when any of the factors in the product-of-sums form is zero. Plotting zeroes of factors on a Veitch diagram or Karnaugh map is as easy as locating ones for a sum-of-products expression. […] To illustrate, using Dineley's example (A+BC)(A+C): […] The zeroes resulting from A+BC will be located wherever both A and BC are zero. Therefore we locate on the map the expression A'·(BC)' (which is equal to A'B' + A'C'). Similarly the zeroes of A+C are located and plotted at A'C'. With all zeroes located, the rest of the map can be filled with ones. One can be a little more formal and work out algebraically the logical complement of the expression under consideration and then plot zeroes for that resulting expression. In a simple product-of-sums representation, however, the complementary terms can be written by inspection; or the zeroes can be plotted by inspection without writing the complete expression […] "Classical Reduction Involving Infrequently Used Variables" October 11, 1968. University of Santa Clara […] Mr. Osborne's work draws close similarity to that I presented in this article and thus, would certainly be of interest to those readers seeking further information. I understand he has done work to apply the technique of infrequent variables to the design of sequential networks constructed from Read Only Memory. Since he has not yet published anything on this area, if readers would like additional information, they can write Mr. Osborne at: […] Thomas E. Osborne […] Building 1U […] 1501 Page Mill Road […] Palo Alto, California […] Thank you for the opportunity to publish with you. […] G. W. Schultz […] Central Data Systems, Inc. […] Sunnyvale, Calif.(1 page) (NB. Osborne's method was later published by Clare.[B])
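To make the quoted zero-plotting rule concrete, the short Python check below (an editorial illustration, not part of the original letter) enumerates the truth table of (A+BC)(A+C) and confirms that its zeroes are exactly the positions where either factor is zero, which is what gets plotted on the map.

```python
from itertools import product

# Verify the quoted rule: the zeroes of a product-of-sums expression are the
# union of the zeroes of its factors, so they can be plotted factor by factor.
# Example expression from the quoted letter: (A + B*C) * (A + C).

for A, B, C in product((0, 1), repeat=3):
    f1 = A | (B & C)                      # factor A + BC; zero where A'(BC)' holds
    f2 = A | C                            # factor A + C;  zero where A'C' holds
    expr = f1 & f2
    plotted_zero = (not f1) or (not f2)   # a zero is plotted for either factor
    assert (expr == 0) == plotted_zero
    print(A, B, C, expr)
```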
[…] An important contribution to the adaptation of theory to practice was made by Schultz [20]; he draws upon the designer's basic understanding of the problem and requires him to identify the "infrequent variables." Loosely defined, these variables do not relate to all internal states, i.e., they are not needed to define every state. In essence, the infrequent variables are relevant to only a few (perhaps one or two) states or state transitions. Schultz suggests that the designer first translate the verbal problem to a state transition graph that is reduced. The internal states are encoded and then information regarding infrequent variables is added to the appropriate state transitions. A "first approximation" to flip-flop input equations is made, based only upon the frequent variables. Schultz demonstrates how these equations can subsequently be modified to incorporate transitions controlled by the infrequent variables. In Schultz's examples the infrequent variables are all input signals, but this idea also applies to internal state variable signals that may be considered "infrequent." In this case, for example, an infrequent internal state variable flip-flop might be set by a particular circumstance and reset sometime later. The output of the flip-flop may now be treated as an infrequent input variable. […](ix+1+179+3 pages)