Speaker 1: Jeroen G. W. Raaijmakers
(Yayoi Miyaji, Kobe College 宮地弥生、神戸女学院大学)
Saturday January 11, 9:50 – 11:10
Modeling implicit and explicit memory
Jeroen G.W. Raaijmakers
(University of Amsterdam, The Netherlands)
Over the past 25 years several quite successful models have been developed for episodic memory. An important characteristic of a number of these approaches is that they are general theories of memory rather than models for specific experimental paradigms such as recognition or recall of paired associates. However, only a few of these theories have attempted to explain implicit memory phenomena, although such phenomena have been the subject of many recent experiments. In this lecture I will discuss how explicit and implicit memory phenomena might be treated within a unified framework for human memory. I will argue that many implicit memory phenomena can be explained within a framework that does not rely on the assumption that implicit memory is based on a separate memory system (e.g., the perceptual representation system assumed by Schacter and many others). The framework that we have been working on in recent years is based on the assumption that implicit memory phenomena can be explained by a semantic-lexical memory system that is dynamic in nature and sensitive to contextual and perceptual aspects of the stimuli.
Speaker 2: Chizuko Izawa
Saturday January 11, 11:10 - 12:30
In search of optimum learning: Psychophysiological similarities and differences between study (S) and test (T) trials and effects of study-test (S-T) presentation programs
Chizuko Izawa
(Tulane University, U.S.A.)
Robert G. Hayden
(Tulane University, U.S.A.)
Michael Franklin
(Tulane University, U.S.A.)
Edward Katkin
(State University of New York at Stony Brook, U.S.A.)
To optimize learning, we examined general learning theories, total time (TTH), S (study) = T (test), and S-T-R (rest) presentation program hypotheses, via hitherto unexamined psychophysiological reactions to five S-T presentation programs: SSSSSSST, SSST, ST, STTT, and STTTTTTT repetitive patterns. Fifty college freshmen learned a 20-pair list while GSR (galvanic skin response) and HR (heart rate) were recorded. Learning curve analyses affirmed large differences among presentation programs. Each response measure differed significantly as a function of cycles/total time. HR and GSR revealed overall habituation from early to late acquisition stages.
New discoveries include: (a) S and T trials differed significantly in HR and GSR, respectively. Over both successive S and T trials (blocks), (b) HR remained stationary from the first to the last trial of each block, (c) while GSR declined significantly within the S or T block earlier in acquisition, it became asymptotic subsequently. (d) The main S-T program effects were very large for GSR, but smaller for HR. (e) However, S-T program effects interacted with S and T trial differentials significantly both in HR and GSR. (f) Alertness (HR) on S trials was greatest for the highest S density program but decreased generally as the T density increased. (g) The same interactions were more dramatic in GSR: A complete reversal occurred in nervous perspirations from the highest S density to the highest T density program. (h) The greater the T trial density, the more efficient learning per S trial, the higher the attention activation and comfort (less perspiration)! Izawa’s S-T-R presentation program hypothesis is well supported, while its alternatives are rejected.
Criterion-run learning curve analyses supported all-or-none learning theory, but not incremental learning theory. Most intriguingly, however, physiological data disclosed patterns that shed new light on this classic controversy.
Speaker 3: Charles J. Brainerd
(Takafumi Terasawa, Tazuko Aoki, Okayama University
寺澤孝文、青木多寿子、岡山大学)
Saturday January 11, 14:00 – 15:20
Fuzzy-trace theory and memory
Charles J. Brainerd
(University of Arizona, U.S.A.)
Fuzzy-trace theory (FTT) is a model of memory, higher reasoning processes, and the interface between the two. Early work on FTT was motivated by findings about how the validity of solutions to reasoning problems (e.g., decision making, deductive inference, quantitative judgment) is related to memory for background facts that determine which solutions are valid (e.g., the premises that authorize deductive inferences). An especially surprising datum was that reasoning accuracy proved to be largely independent of memory accuracy. Later work has focused on memory mechanisms, particularly memory falsification mechanisms, that would be needed to account for such findings. This later work will be sketched in the current presentation, whereas the reasoning side of FTT will be covered in the companion presentation by Reyna.
In false-memory research, FTT’s emphasis is on exploiting a few explanatory principles, which are empirically well-grounded, to predict and control specific memory illusions, such as semantic intrusions in free recall or false recognition of implied inferences from narratives or the Deese/Roediger/McDermott illusion. Most of this work has revolved around five principles: (1) parallel storage of verbatim traces (exact surface form of experience) and gist traces (meanings, relations, patterns); (2) dissociated retrieval of verbatim and gist traces; (3) opposing effects of verbatim and gist retrieval on false-memory responses; (4) developmental variability in storage, retrieval, and retention of verbatim and gist traces; and (5) the influence of verbatim and gist retrieval on remembering phenomenologies.
These principles have been used to predict a series of false-memory effects, with counterintuitive effects being foci of attention. Experimental evidence on five such predicted effects will be presented: (1) experimental and statistical dissociations between true and false memories; (2) long-term persistence of false memories; (3) creation of false memories via mere memory testing; (4) age increases in false-memory illusions during childhood; and (5) illusory vivid phenomenology (phantom recollection) provoked by false-memory illusions.
Speaker 4: Valerie F. Reyna
(Shigeru Ono, Tokyo Metropolitan University 小野 滋、東京都立大学)
Saturday January 11, 15:20 - 16:40
Fuzzy-trace theory, judgment, and decision-making
Valerie F. Reyna
(University of Arizona, U.S.A.)
Fuzzy-trace theory is a framework for understanding memory, reasoning, and their relationships. In contrast to either heuristics-and-biases or adaptive-ecological approaches, fuzzy-trace theory embraces inconsistencies in human reasoning by assuming opposing dual processes: Intuitive gist-based processing and analytical verbatim-based reasoning are options in a cognitive menu from which children and adults make selections, depending on the task. However, unlike traditional dual-process approaches to reasoning, intuition is assumed to be an advanced mode of thought.
Recent advances in memory research are used to construct an integrative theory of judgment and decision-making, with illustrations from real-world contexts such as medicine. A common core of theoretical principles is used to explain decision-making involving genetic counseling, informed consent, cardiovascular risk, and reducing sexual risk taking. Key principles include: (1) Reasoners encode multiple gist and verbatim representations, which confer cognitive flexibility. (2) However, reasoning operates at the least precise level of gist that the task allows, increasingly so with the development of expertise. (3) This simplified, qualitative processing is not a result of computational complexity, but is the default mode of reasoning. (4) Although simplified, qualitative processing protects against many errors, it also leads to predictable pitfalls in reasoning, and these change with development.
Results indicate that more advanced reasoners (adults and older children compared to younger children; medical students and trainees compared to specialists) process fewer dimensions of information, and process them qualitatively rather than quantitatively in order to make decisions. Rather than classifying reasoning as rational or irrational, degrees of rationality are proposed based on the processing underlying different kinds of errors across many tasks (e.g., framing tasks, syllogistic reasoning, conjunctive and disjunctive probability judgment, base-rate neglect, and others). Therefore, rationality is not an immutable trait, but changes from task to task and from one stage of development to another.
Speaker 5: Michael Humphreys
(Ryuta Iseki, University of Tsukuba 井関龍太,筑波大学)
Sunday January 12, 9:20 - 10:40
Recollection and familiarity
Michael Humphreys
(University of Queensland, Australia)
We consider evidence from a variety of sources in order to test the assumption that recollection and familiarity are present at the item level. A review of previous arguments reveals that they do not address independence at this level. Previous research also suggests that an implausible tradeoff is required if the independence assumption is to be maintained. An examination of conditional probabilities in a two-test procedure reveals that words that support recollection under one set of test instructions support familiarity under other sets. Contextual reinstatement effects are also examined. However, a failure to replicate previous findings, in spite of having more than twice as many observations, prevented us from performing a definitive test of the independence assumption.
Speaker 6: Richard M. Shiffrin
(Hideaki Shimada, University of Tsukuba 島田英昭,筑波大学)
Sunday January 12, 10:40 - 12:00
Keynote Speech
Bayesian modeling of memory and perception
Richard M. Shiffrin
(Indiana University, U.S.A.)
I present a framework for modeling memory, retrieval, perception, and their interactions. The models are inspired by Bayesian induction to determine optimal decisions, in the face of a memory system with inherently noisy storage and retrieval. The origins of the modeling enterprise precede the Bayesian approach: They begin with the Atkinson and Shiffrin article in the 1960s, emphasizing the distinction between short- and long-term memory and the control processes of short-term memory, and include the SAM modeling of Raaijmakers and Shiffrin at the start of the 1980s, which highlighted retrieval from long-term memory. The starting point for the Bayesian modeling was the Retrieving Effectively from Memory (REM) model for episodic recognition (Shiffrin & Steyvers, 1997), but it should be noted that this model was a natural outgrowth of the earlier modeling efforts and remains largely consistent with them.
The general REM framework describes: 1) the storage of episodic traces, the accumulation of these into knowledge (e.g. lexical/semantic traces in the case of words), and the changes in knowledge caused by learning; 2) the retrieval of information from episodic memory and from general knowledge; 3) decisions concerning storage, retrieval and responding. I give examples of applications to episodic recognition, and episodic cued and free recall, perceptual identification (naming, yes-no and forced choice), lexical decision, and long-term and short-term priming.
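To make the Bayesian decision idea concrete, here is a minimal toy sketch in the spirit of the REM recognition model: items are vectors of geometrically distributed feature values, traces are noisy and incomplete copies, and an "old" decision is made when the average likelihood ratio (the odds) across traces exceeds 1. The parameter values, the storage scheme, and the two-case likelihood ratio are simplifying assumptions for illustration, not the published model.

```python
import random

G = 0.4   # geometric parameter for feature values (assumed)
C = 0.8   # probability a stored feature is copied correctly (assumed)
P_STORE = 0.8  # probability a feature is stored at all (assumed)

def geometric(g):
    """Sample a feature value v >= 1 with P(v) = g * (1 - g) ** (v - 1)."""
    v = 1
    while random.random() > g:
        v += 1
    return v

def make_item(n=20):
    return [geometric(G) for _ in range(n)]

def store(item):
    """Store a noisy, incomplete episodic trace; 0 marks a missing feature."""
    trace = []
    for f in item:
        if random.random() < P_STORE:
            trace.append(f if random.random() < C else geometric(G))
        else:
            trace.append(0)
    return trace

def likelihood_ratio(probe, trace):
    """P(trace | it encodes the probe) / P(trace | it encodes another item)."""
    lr = 1.0
    for p, t in zip(probe, trace):
        if t == 0:
            continue  # missing features are uninformative
        base = G * (1 - G) ** (t - 1)  # chance probability of value t
        if t == p:
            lr *= (C + (1 - C) * base) / base  # matching feature: evidence for "old"
        else:
            lr *= (1 - C)                      # mismatching feature: evidence against
    return lr

def recognize(probe, memory):
    """Respond 'old' if the mean likelihood ratio over all traces exceeds 1."""
    odds = sum(likelihood_ratio(probe, t) for t in memory) / len(memory)
    return odds > 1.0

random.seed(1)
studied = [make_item() for _ in range(10)]
memory = [store(item) for item in studied]
hits = sum(recognize(item, memory) for item in studied)
false_alarms = sum(recognize(make_item(), memory) for _ in range(10))
```

Even this stripped-down version reproduces the qualitative signature of the framework: studied items accumulate many matching features against their own trace, so hits far outnumber false alarms despite noisy storage.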
Speaker 7: Jun Kawaguchi
Sunday January 12, 13:30 - 14:20
Interaction between memory and environment:
Automatic and intentional processes
Jun Kawaguchi
(Nagoya University, Japan)
People make use of a variety of strategies and tools to memorize everyday things and events, so that they can carry out their daily activities without trouble. For example, people may try to remember the name of a casual acquaintance by deliberately forming an image, or may write down an important appointment in a notebook. This implies that internal and external memory work together in our cognitive activities. The purpose of this study is to elucidate the relationship between internal processes (e.g., memory strategies) and the external environment (e.g., memory tools). The first part of this talk will show, via a questionnaire study, how people use these kinds of strategies. The survey shows that people mainly depend on external rather than internal memory strategies. Furthermore, the way these strategies are used may change with age. The second part of the talk reports an experiment on memory for schedules. The results suggest that retrieval of schedule items was influenced by the encoding environment (calendar format). Because this was an incidental memory experiment, encoding of the environment (the calendar) might be automatic. I will close with some comments on the interaction between internal memory processes and the external environment.
Speaker 8: Alice Healy
(Hama Watanabe, Nagoya University 渡辺はま、名古屋大学)
Sunday January 12, 14:20 - 15:40
Optimizing the speed, durability, and transferability of training
Alice F. Healy
(University of Colorado, U.S.A.)
Our research program aims to develop principles that simultaneously optimize all three characteristics of training: speed, durability, and transferability. Such simultaneous optimization would not necessarily optimize any one characteristic individually but would require instead a balanced consideration of all three characteristics. The balance of the three aspects of training is not fixed across tasks or even within a given task but rather may depend on a variety of external factors, such as stress, frustration, fatigue, rapid presentation of information, and information load, that can change from time to time. Variations in any one of these factors can affect the interaction of these aspects of training. Although many of our studies have overlapping goals, we have divided them into several major groups. The first is concerned with managing factual overload, rapidly presented information, stress, frustration, and fatigue, with an emphasis on tasks involving perceptual and motoric processing. The second addresses optimizing the balance of the three major aspects of training. The studies I will summarize illustrate our current work in these two groups. The experiments I will report from the first group involve a data entry task. They focus on the specific issue of initiating and executing response components under fatigue produced by prolonged work. These experiments demonstrate that prolonged work affects the component cognitive and motoric processes of data entry differentially and at different points in time. The experiments I will report from the second group involve a duration estimation task, in some cases coupled with a secondary articulatory suppression task. They focus on the specific issue of ways to promote transfer of training. These experiments demonstrate that learning how to estimate durations is highly specific to the conditions of training and critically depends on whether or not a secondary task is required.
Speaker 9: Nelson Cowan
(Satoru Saito, Kyoto University 齊藤智、京都大学)
Monday January 13, 9:10 – 10:30
Working-memory capacity limits in a theoretical context
Nelson Cowan
(University of Missouri, U.S.A.)
Almost every cognitive task relies upon some version of what is commonly called working memory, which can be described as the limited amount of information that can be retained temporarily in a state that is more quickly and reliably accessible than other information in the memory system. Sentence comprehension requires that information from the first part of the sentence remain available in memory for integration with the next part, mental addition requires that the partial sums be held in mind until the problem is completed, and so on. Therefore, limitations in the capacity of working memory are of special importance in carrying out analyses of task demands and in assessing individual differences in cognitive capabilities.
I will argue that there has been very little consensus on how working-memory capacity should be measured. George Miller (1956) noted that adult humans can recall about 7 items in the correct serial order, but his reference to that as a "magical" number was tongue-in-cheek, as an autobiographical essay he published clearly indicates. It was known even in 1956 that other tasks yield different estimates. In running memory span, for example, in which the end-point of the list is unpredictable, people typically recall only about 4 items.
I will sketch out conditions in which one critically important component of working-memory capacity, the contents of the focus of attention, might be measured. The conditions are those in which (1) stimulus items are familiar, (2) task demands prevent the grouping of stimulus items into higher-level units, and (3) task demands prevent rehearsal or passive memory storage faculties from augmenting performance. Under a wide variety of such circumstances, it can be shown that adults recall about 4 items. Special measures of memory capacity will be proposed, and relationships between those measures and other, more conventionally used measures of working memory will be described.
Speaker 10: Douglas Nelson
(Kazuo Mori, Shinshu University 守 一雄、信州大学)
Monday January 13, 10:30 – 11:50
Implicitly activated memories, the missing links of remembering
Douglas Nelson
(University of South Florida, U.S.A.)
Scientists in many disciplines are mapping information in their domains. Like these scientists, we have been mapping word knowledge that reveals the associative structure of specific words. We can build such a map because words remind the brain of associated words, as reading Planet reminds it of the associated word Earth. By using free association procedures, we can learn what these words are and how they are linked. Our work to date indicates that the associative structures of known words vary systematically in terms of three different features: Resonance, connectivity and set size. Resonance refers to the probability that a word’s associates produce it as an associate. Connectivity refers to links among a word’s associates, and set size refers to how many relatively strong associates there are in its set. Our interest lies not in constructing word maps, but in determining how the pre-existing associative structure of a word affects its recognition and cued recall. The broader issue lies in understanding how pre-existing knowledge influences recent episodic memory. With this issue in mind, we select words having different structures and include them in lists of similar words that participants study under varying conditions. Words with high levels of resonance and connectivity are more likely to be recognized and recalled when given related words as test cues. The size of the associative set affects cued recall, but not recognition. In cued recall, the relationship between a test cue and its related studied word is determined by pre-existing links that vary in strength, direction and directness. Recognition and cued recall processes are best understood as the result of an interaction between known and new information. Our model assumes that processing a familiar word activates related words in long-term working memory, and that disrupting attention causes forgetting by reducing access to what has been activated.
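The three structural features described above can be illustrated with a small sketch: treat free-association norms as a weighted directed graph and compute each measure for a cue word. The word lists and probabilities below are made up for illustration (they are not real norms), and the 0.05 "relatively strong" threshold is an assumption.

```python
# Toy free-association map: cue -> {associate: forward strength}.
# Values are invented for illustration, not actual free-association norms.
norms = {
    "planet": {"earth": 0.45, "mars": 0.20, "star": 0.15, "space": 0.10},
    "earth":  {"planet": 0.30, "world": 0.25, "dirt": 0.20},
    "mars":   {"planet": 0.35, "red": 0.25},
    "star":   {"sky": 0.40, "space": 0.20},
    "space":  {"star": 0.30, "planet": 0.15},
}

def set_size(word, threshold=0.05):
    """Number of relatively strong associates in the word's set."""
    return sum(1 for p in norms[word].values() if p >= threshold)

def resonance(word):
    """Mean probability that the word's associates produce it back as an associate."""
    associates = list(norms[word])
    back_strengths = [norms.get(a, {}).get(word, 0.0) for a in associates]
    return sum(back_strengths) / len(back_strengths)

def connectivity(word):
    """Number of links among the word's associates (associate-to-associate)."""
    associates = set(norms[word])
    return sum(1 for a in associates
               for b in norms.get(a, {}) if b in associates and b != a)

# For "planet": 4 strong associates; earth, mars, and space point back to it
# (resonance 0.20); star->space and space->star are associate-to-associate links.
print(set_size("planet"), resonance("planet"), connectivity("planet"))
```

The point of the sketch is that all three features are properties of the pre-existing associative graph around a word, fixed before any study list is constructed, which is what allows words to be selected for experiments by structure.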
Speaker 11: Lynn Hasher
(Etsuko Harada, Hosei University 原田悦子,法政大学)
Monday January 13, 13:20 – 14:40
It’s about time: Circadian rhythms, memory and aging
Lynn Hasher
(University of Toronto, Canada)
Circadian rhythms have been studied by biologists and largely ignored by psychologists, except for those interested in topics such as adjustment to shift work and to travel across time zones. As it happens, however, circadian rhythms play a substantial role in human cognition and attention to rhythms can inform questions from the applied (when should the school day start? when should neuropsychological assessments be done?), to basic empirical findings (how much does cognitive performance really decline as people age?), to theory (what are the differences between explicit and implicit memory?). The work reported here will focus on basic empirical findings in attention and memory and will raise surprising questions about the role of explicit retrieval in implicit memory performance.
As an overview, I will report data showing substantial age and individual differences in circadian arousal patterns across childhood, young adulthood, and old age. Data will be reported from a series of studies comparing younger and older adults on a variety of attention and memory tasks, all of which show several important findings: First, both younger and older adults show better performance at their optimal than at their nonoptimal times of day. This is seen in the degree to which distraction can be ignored, in the degree to which details can be remembered, in the degree to which there are schema-based errors in memory, and in the degree to which error tendencies can be controlled. A second important finding that can be seen throughout the data I report here is that the difference in performance between older and younger adults is likely exaggerated in the cognitive gerontology literature since, like most work in cognition, it ignores the basic fact that older and younger adults are on different circadian arousal cycles. A third important finding is that there are major differences in the circadian effects on explicit and implicit learning and memory. Whereas explicit performance is better at optimal than at nonoptimal times of day, implicit performance (both learning and memory) is better at nonoptimal times of day! These findings suggest the possibility that automatic processes are functioning at high levels even when more controlled, deliberate ones are not.