MODELS OF MEMORY
- The Atkinson-Shiffrin Model
- The Levels-of-Processing Approach
- Tulving's Model
- The Parallel Distributed Processing Approach
THE ATKINSON-SHIFFRIN MODEL
A structural model that suggests three storage systems (places): the Sensory Store, Short-Term Memory (STM), and Long-Term Memory (LTM).
Information moves through these systems under the control of various cognitive processes (attention, rehearsal, etc.).
The distinctions among the three structures are made on the basis of four characteristics that emerged from the literature: capacity, duration, type of code, and mechanism of loss.
The model originated as a way of explaining early research findings that argued for a duplex theory.
The study by Kintsch and Buschke (1969), for example, showed that items at the beginning of a list are coded semantically, as is characteristic of LTM, while items at the end are coded acoustically, as in STM.
In more recent research, however, the distinctions marking the boundaries of the structures have begun to blur.
After the model was abandoned for a while in favor of the Levels-of-Processing approach, most researchers have returned to some form of duplex theory.
THE LEVELS-OF-PROCESSING APPROACH
This approach suggests that deeper levels of processing produce better retention than shallow levels (Craik & Lockhart, 1972).
Depth is interpreted in terms of meaning: a dimension that runs from physical characteristics, through verbal-acoustic features, to semantic content.
Leads to the distinction between elaborative rehearsal (processing for meaning) and maintenance rehearsal (rote repetition).
Explained by the fact that as you go from shallow to deep processing, more links to other elements in memory are established.
Can be understood in terms of distinctiveness and elaboration.
The theory was supported in a series of classic studies (e.g., Craik & Lockhart, 1972; Craik & Tulving, 1975).
It is also possible to interpret the literature on the self-reference effect in terms of elaboration.
The theory was quite popular for a while as a replacement for the duplex theory of memory.
It lost status as a result of a meta-systemic problem of circularity: there is no way to define depth of processing independently of the retention it is supposed to predict.
In the end, the levels-of-processing approach was reintegrated with the duplex theory.
TULVING'S MODEL
Tulving's (1972) model focuses on the nature of the material that is stored and distinguishes three kinds of memory based on content.
EPISODIC MEMORY stores information about when events happened and the relationship between those events. Relates to personal experience.
SEMANTIC MEMORY is the organized knowledge about the world. Essentially all the "facts" we have accumulated.
PROCEDURAL MEMORY involves knowing how to "do" something. Relates to skill learning, the connections between stimuli and motor behavior.
Underwood et al. (1978) tested 200 college students on 28 measures of episodic memory and 5 measures of semantic memory. Performance on the two kinds of measures was uncorrelated.
Shoben et al. (1978) found that variables known to influence semantic memory affected performance on a semantic memory task but not on an episodic memory task. However, there is roughly an equal amount of contrary evidence: Ratcliff & McKoon (1978), for example, found that episodic memory emphasizes conceptual rather than temporal relationships, contrary to Tulving's suggestion.
Ratcliff (1986) found that episodic memory could be recalled quite quickly.
Neuroscience research is likewise not conclusive.
In general, researchers are skeptical about Tulving's distinction between episodic and semantic memory, and it has not been retained.
The distinction between these two and procedural memory, however, has been retained as a useful one, supported by a considerable body of evidence.
THE PARALLEL DISTRIBUTED PROCESSING APPROACH
Cognitive processes, including memory, can be conceived of as networks in which the elements exhibit multiple links.
These networks can be abstract, with nodes in a hypothetical network connected by associative links, or biological, with neurons (or neuron-like units) linked to one another.
Based on the cognitive-science assumption that intelligent systems can be built out of dumb elements: the intelligence is in the connections.
Units in the system, through their many links, may affect other units through excitation or inhibition.
Representations (knowledge, memory) exist in the resultant patterns of activation that occur in the network.
Local processes giving rise to these patterns occur in parallel at distributed sites.
Memory storage is content-addressable (items are retrieved by their content rather than by an address).
Every new event changes the strength of the connections among the relevant units.
The last three characteristics account for our ability to reconstruct material on the basis of incomplete or faulty information.
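To make these properties concrete, the following is a minimal sketch of one classic PDP-style network, a Hopfield-type autoassociator, written in Python. The example is illustrative only (the unit counts, random seed, and function names are our own assumptions, not part of the literature summarized above); it shows Hebbian weight changes, content-addressable storage, and reconstruction from a faulty cue.

import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    # Hebbian learning: each stored pattern strengthens the links
    # between units that are active together ("every new event
    # changes the strength of the connections").
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)   # co-active units gain excitatory links
    np.fill_diagonal(w, 0)    # no self-connections
    return w / len(patterns)

def recall(w, cue, steps=10):
    # Retrieval by settling: all units update in parallel from the
    # summed excitation/inhibition arriving over their links.
    state = cue.copy()
    for _ in range(steps):
        state = np.where(w @ state >= 0, 1, -1)
    return state

# Store two random +1/-1 patterns distributed over 20 units.
patterns = rng.choice([-1, 1], size=(2, 20))
w = train(patterns)

# Corrupt 4 of the 20 units of the first pattern, then let the net settle.
cue = patterns[0].copy()
flipped = rng.choice(20, size=4, replace=False)
cue[flipped] *= -1

print("pattern recovered:", np.array_equal(recall(w, cue), patterns[0]))

Note that there is no address or lookup step: the corrupted cue is itself the "content" that addresses the memory, and with these small sizes the network usually settles back onto the stored pattern. This is how the model accounts for reconstruction from incomplete or faulty information.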
A highly abstract and theoretical approach (analogous to theoretical physics) that may be so far ahead of present methodology as to make hypothesis testing difficult.
Used mostly to explain earlier literature.
An exciting new development that we have just begun to explore.