Shared Flashcard Set

Details

Title: Learning Theories
Total Cards: 54
Subject: Education
Level: Graduate
Created: 03/16/2015

Cards

Term

ACT-R

Definition

ACT-R distinguishes among three types of memory structures: declarative, procedural and working memory. Declarative memory takes the form of a semantic net linking propositions, images, and sequences by associations. Procedural memory (also long-term) represents information in the form of productions; each production has a set of conditions and actions based in declarative memory. The nodes of long-term memory all have some degree of activation and working memory is that part of long-term memory that is most highly activated.

According to ACT-R, all knowledge begins as declarative information; procedural knowledge is learned by making inferences from already existing factual knowledge. ACT-R supports three fundamental types of learning: generalization, in which productions become broader in their range of application; discrimination, in which productions become narrower in their range of application; and strengthening, in which some productions are applied more often. New productions are formed by the conjunction or disjunction of existing productions.
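Since ACT-R is a computational architecture, the structures described above lend themselves to a toy sketch. In the Python fragment below, all facts, activation values, thresholds, and production names are illustrative inventions, not part of ACT-R itself: declarative facts carry activation levels, working memory is the most activated subset, and condition-action productions fire against it.

```python
# Toy sketch of ACT-R-style memory structures (illustrative values only).
declarative = {
    "3+4=7": 0.9,                      # fact -> activation level
    "addition is commutative": 0.6,
    "7 is odd": 0.2,
}

productions = [
    # (name, conditions, action): conditions are facts that must be active
    ("answer-sum", ["3+4=7"], "respond 7"),
    ("swap-operands", ["addition is commutative"], "rewrite 4+3 as 3+4"),
]

def working_memory(decl, threshold=0.5):
    """Working memory: the most highly activated part of declarative memory."""
    return {fact for fact, activation in decl.items() if activation >= threshold}

def fire(prods, wm):
    """Fire every production whose conditions all lie in working memory."""
    return [action for _, conditions, action in prods
            if all(c in wm for c in conditions)]

print(fire(productions, working_memory(declarative)))
# → ['respond 7', 'rewrite 4+3 as 3+4']; the weakly activated fact plays no role
```

In this sketch, generalization and discrimination would correspond to loosening or tightening a production's condition list, and strengthening to raising its selection weight.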

Term

Adult Learning

Definition

Cross (1981) presents the Characteristics of Adults as Learners (CAL) model in the context of her analysis of lifelong learning programs. The model attempts to integrate other theoretical frameworks for adult learning such as andragogy (Knowles), experiential learning (Rogers), and lifespan psychology.

The CAL model consists of two classes of variables: personal characteristics and situational characteristics. Personal characteristics include: aging, life phases, and developmental stages. These three dimensions have different characteristics as far as lifelong learning is concerned. Aging results in the deterioration of certain sensory-motor abilities (e.g., eyesight, hearing, reaction time) while intelligence abilities (e.g., decision-making skills, reasoning, vocabulary) tend to improve. Life phases and developmental stages (e.g., marriage, job changes, retirement) involve a series of plateaus and transitions which may or may not be directly related to age.

Situational characteristics consist of part-time versus full-time learning, and voluntary versus compulsory learning. The administration of learning (i.e., schedules, locations, procedures) is strongly affected by the first variable; the second pertains to the self-directed, problem-centered nature of most adult learning.

Term

Algo-Heuristic Theory

Definition

Landa's theory is concerned with identifying mental processes -- conscious and especially unconscious -- that underlie expert learning, thinking and performance in any area. His methods represent a system of techniques for getting inside the mind of expert learners and performers which enable one to uncover the processes involved. Once uncovered, they are broken down into their relatively elementary components -- mental operations and knowledge units which can be viewed as psychological "atoms" and "molecules". Performing a task or solving a problem always requires a certain system of elementary knowledge units and operations.

There are classes of problems for which it is necessary to execute operations in a well structured, predefined sequence (algorithmic problems). For such problem classes, it is possible to formulate a set of precise unambiguous instructions (algorithms) as to what one should do mentally and/or physically in order to successfully solve any problem belonging to that class. There are also classes of problems (creative or heuristic problems) for which precise and unambiguous sets of instructions cannot be formulated. For such classes of problems, it is possible to formulate instructions that contain a certain degree of uncertainty (heuristics). Landa also describes semi-algorithmic and semi-heuristic problems, processes and instructions.
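The contrast between the two problem classes can be made concrete with a small, hypothetical Python example (neither function comes from Landa; they merely illustrate the distinction): an algorithmic class admits a precise instruction sequence that always succeeds, while a heuristic class admits only a rule of thumb that may fail.

```python
# Illustrative contrast (not Landa's notation or examples).

def solve_linear(a, b):
    """Algorithmic: solving a*x + b = 0 (a != 0) follows one fixed,
    unambiguous sequence of operations and always yields the answer."""
    return -b / a                      # step 1: negate b; step 2: divide by a

def guess_root(f, candidates):
    """Heuristic: 'try values near zero first' -- a plausible rule of thumb
    for finding an integer root of f, with no guarantee of success."""
    for x in sorted(candidates, key=abs):   # prefer candidates near zero
        if f(x) == 0:
            return x
    return None                        # the heuristic may simply fail

print(solve_linear(2, -6))             # → 3.0, for every instance of the class
print(guess_root(lambda x: x**2 - 9, range(-10, 11)))   # finds -3 here
```

A semi-algorithmic instruction, in these terms, would fix most of the sequence while leaving a few steps to judgment.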

The theory suggests that all cognitive activities can be analyzed into operations of an algorithmic, semi-algorithmic, heuristic, or semi-heuristic nature. Once discovered, these operations and their systems can serve as the basis for instructional strategies and methods. The theory specifies that students ought to be taught not only knowledge but the algorithms and heuristics of experts as well. They also have to be taught how to discover algorithms and heuristics on their own. Special emphasis is placed on teaching students cognitive operations, algorithms and heuristics which make up general methods of thinking (i.e., intelligence).

With respect to sequencing of instruction, Landa proposes a number of strategies, the most important of which is the "snowball" method. This method applies to teaching a system of cognitive operations by teaching the first operation, then the second which is practiced with the first, and so on.
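The snowball sequence is easy to state precisely; a minimal sketch (the operation names are placeholders):

```python
def snowball(operations):
    """Yield the operations practiced at each step of the 'snowball' method:
    every newly taught operation is practiced together with all earlier ones."""
    for i in range(1, len(operations) + 1):
        yield operations[:i]

print(list(snowball(["op1", "op2", "op3"])))
# → [['op1'], ['op1', 'op2'], ['op1', 'op2', 'op3']]
```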

Term

Andragogy

Definition

Knowles' theory of andragogy is an attempt to develop a theory specifically for adult learning. Knowles emphasizes that adults are self-directed and expect to take responsibility for decisions. Adult learning programs must accommodate this fundamental aspect.

Andragogy makes the following assumptions about the design of learning: (1) Adults need to know why they need to learn something, (2) Adults need to learn experientially, (3) Adults approach learning as problem-solving, and (4) Adults learn best when the topic is of immediate value.

In practical terms, andragogy means that instruction for adults needs to focus more on the process and less on the content being taught. Strategies such as case studies, role playing, simulations, and self-evaluation are most useful. Instructors adopt a role of facilitator or resource rather than lecturer or grader.

Term

Anchored Instruction

Definition

Anchored instruction is a major paradigm for technology-based learning that has been developed by the Cognition & Technology Group at Vanderbilt (CTGV) under the leadership of John Bransford. While many people have contributed to the theory and research of anchored instruction, Bransford is the principal spokesperson and hence the theory is attributed to him.

The initial focus of the work was on the development of interactive videodisc tools that encouraged students and teachers to pose and solve complex, realistic problems. The video materials serve as "anchors" (macro-contexts) for all subsequent learning and instruction. As explained by CTGV (1993, p. 52): "The design of these anchors was quite different from the design of videos that were typically used in education...our goal was to create interesting, realistic contexts that encouraged the active construction of knowledge by learners. Our anchors were stories rather than lectures and were designed to be explored by students and teachers." The use of interactive videodisc technology makes it possible for students to easily explore the content.

Anchored instruction is closely related to the situated learning framework (see CTGV, 1990, 1993) and also to cognitive flexibility theory in its emphasis on the use of technology-based learning.

Term

Aptitude-Treatment Interaction

Definition

Aptitude-Treatment Interaction (ATI) is the concept that some instructional strategies (treatments) are more or less effective for particular individuals depending upon their specific abilities. As a theoretical framework, ATI suggests that optimal learning results when the instruction is exactly matched to the aptitudes of the learner. It is consistent with theories of intelligence (e.g., multiple intelligences, intellect theory, triarchic theory) that suggest a multidimensional view of ability.

According to Snow (1989), the aim of ATI research is to predict educational outcomes from combinations of aptitudes and treatments. He summarizes the main conclusions of Cronbach & Snow (1977) as: (1) aptitude-treatment interactions are very common in education, (2) many ATI combinations are complex and difficult to demonstrate clearly, and (3) no particular ATI effect is sufficiently understood to be the basis for instructional practice. Furthermore, Snow identifies the lack of attention to the social aspects of learning as a serious deficiency of ATI research. He states: "Learning style differences can be linked to relatively stable person or aptitude variables, but they also vary within individuals as a function of task and situation variables." (p. 51)

Term

Attribution Theory

Definition

Attribution theory is concerned with how individuals interpret events and how this relates to their thinking and behavior. Heider (1958) was the first to propose a psychological theory of attribution, but Weiner and colleagues (e.g., Jones et al., 1972; Weiner, 1974, 1986) developed a theoretical framework that has become a major research paradigm of social psychology. Attribution theory assumes that people try to determine why people do what they do, i.e., attribute causes to behavior. A person seeking to understand why another person did something may attribute one or more causes to that behavior. A three-stage process underlies an attribution: (1) the person must perceive or observe the behavior, (2) then the person must believe that the behavior was intentionally performed, and (3) then the person must determine if they believe the other person was forced to perform the behavior (in which case the cause is attributed to the situation) or not (in which case the cause is attributed to the other person).
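The three-stage process reads naturally as a decision procedure; a sketch in Python (the function and argument names are illustrative, not Weiner's or Heider's terminology):

```python
# The three stages of an attribution, as described above. Returns where
# the cause is attributed, or None if the process stops at an early stage.

def attribute(observed, believed_intentional, believed_forced):
    # Stage 1: the behavior must be perceived or observed.
    if not observed:
        return None
    # Stage 2: the observer must believe the behavior was intentional.
    if not believed_intentional:
        return None
    # Stage 3: forced behavior is attributed to the situation,
    # unforced behavior to the person.
    return "situation" if believed_forced else "person"

print(attribute(True, True, False))   # → person
print(attribute(True, True, True))    # → situation
```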

Weiner focused his attribution theory on achievement (Weiner, 1974). He identified ability, effort, task difficulty, and luck as the most important factors affecting attributions for achievement. Attributions are classified along three causal dimensions: locus of control, stability, and controllability. The locus of control dimension has two poles: internal versus external locus of control. The stability dimension captures whether causes change over time or not. For instance, ability can be classified as a stable, internal cause, and effort classified as unstable and internal. Controllability contrasts causes one can control, such as skill/efficacy, from causes one cannot control, such as aptitude, mood, others' actions, and luck.

Attribution theory is closely associated with the concept of motivation. It also relates to the work on script theory and inferencing done by Schank.

Term

Cognitive Dissonance

Definition

According to cognitive dissonance theory, there is a tendency for individuals to seek consistency among their cognitions (i.e., beliefs, opinions). When there is an inconsistency between attitudes or behaviors (dissonance), something must change to eliminate the dissonance. In the case of a discrepancy between attitudes and behavior, it is most likely that the attitude will change to accommodate the behavior.

Two factors affect the strength of the dissonance: the number of dissonant beliefs, and the importance attached to each belief. There are three ways to eliminate dissonance: (1) reduce the importance of the dissonant beliefs, (2) add more consonant beliefs that outweigh the dissonant beliefs, or (3) change the dissonant beliefs so that they are no longer inconsistent.
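The two strength factors (the number of dissonant beliefs and the importance attached to each) suggest an importance-weighted ratio. The formalization below is a sketch of one common secondary-source reading, not a formula from the text, and the numbers are invented:

```python
# Sketch: dissonance strength as the importance-weighted share of
# dissonant beliefs among all relevant beliefs (illustrative only).

def dissonance(dissonant, consonant):
    """dissonant/consonant: lists of importance weights, one per belief."""
    d, c = sum(dissonant), sum(consonant)
    return d / (d + c) if d + c else 0.0

before = dissonance([0.8, 0.6], [0.5])        # two weighty dissonant beliefs
# Route (1): reduce a dissonant belief's importance (0.8 -> 0.2).
# Route (2): add a consonant belief (weight 0.7).
after = dissonance([0.2, 0.6], [0.5, 0.7])

print(before > after)   # → True: both routes lower the ratio
```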

Dissonance occurs most often in situations where an individual must choose between two incompatible beliefs or actions. The greatest dissonance is created when the two alternatives are equally attractive. Furthermore, attitude change is more likely in the direction of less incentive since this results in lower dissonance. In this respect, dissonance theory is contradictory to most behavioral theories which would predict greater attitude change with increased incentive (i.e., reinforcement).

Term

Cognitive Flexibility Theory

Definition

Cognitive flexibility theory focuses on the nature of learning in complex and ill-structured domains. Spiro & Jehng (1990, p. 165) state: "By cognitive flexibility, we mean the ability to spontaneously restructure one's knowledge, in many ways, in adaptive response to radically changing situational demands...This is a function of both the way knowledge is represented (e.g., along multiple rather than single conceptual dimensions) and the processes that operate on those mental representations (e.g., processes of schema assembly rather than intact schema retrieval)."

The theory is largely concerned with transfer of knowledge and skills beyond their initial learning situation. For this reason, emphasis is placed upon the presentation of information from multiple perspectives and use of many case studies that present diverse examples. The theory also asserts that effective learning is context-dependent, so instruction needs to be very specific. In addition, the theory stresses the importance of constructed knowledge; learners must be given an opportunity to develop their own representations of information in order to properly learn.

Cognitive flexibility theory builds upon other constructivist theories (e.g., constructivist theory, subsumption, genetic epistemology) and is related to work on symbol systems in terms of media and learning interaction.

Term

Cognitive Load Theory 

Definition

This theory suggests that learning happens best under conditions that are aligned with human cognitive architecture. The structure of human cognitive architecture, while not known precisely, is discernible through the results of experimental research. Recognizing George Miller's information processing research showing that short term memory is limited in the number of elements it can contain simultaneously, Sweller builds a theory that treats schemas, or combinations of elements, as the cognitive structures that make up an individual's knowledge base. (Sweller, 1988)

The contents of long term memory are "sophisticated structures that permit us to perceive, think, and solve problems," rather than a group of rote learned facts. These structures, known as schemas, are what permit us to treat multiple elements as a single element. They are the cognitive structures that make up the knowledge base (Sweller, 1988). Schemas are acquired over a lifetime of learning, and may have other schemas contained within themselves.

The difference between an expert and a novice is that a novice hasn't acquired the schemas of an expert. Learning requires a change in the schematic structures of long term memory and is demonstrated by performance that progresses from clumsy, error-prone, slow and difficult to smooth and effortless. The change in performance occurs because as the learner becomes increasingly familiar with the material, the cognitive characteristics associated with the material are altered so that it can be handled more efficiently by working memory.

From an instructional perspective, information contained in instructional material must first be processed by working memory. For schema acquisition to occur, instruction should be designed to reduce working memory load. Cognitive load theory is concerned with techniques for reducing working memory load in order to facilitate the changes in long term memory associated with schema acquisition.
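The core claim of the theory (limited working memory, with schemas chunking many elements into one) can be sketched with made-up numbers; the capacity value and the chunking rule here are illustrative assumptions, not Sweller's model:

```python
# Working memory holds only a few elements at once; an acquired schema
# lets a whole group of elements be handled as a single element.

WM_CAPACITY = 4   # a small, Miller-style limit (assumed value)

def load(elements, schemas):
    """Count the elements a task imposes after chunking: any group fully
    covered by an acquired schema counts as one element."""
    remaining = set(elements)
    count = 0
    for schema in schemas:
        if schema <= remaining:      # the learner has chunked this group
            remaining -= schema
            count += 1
    return count + len(remaining)

task = ["a", "x", "+", "b", "=", "0"]
novice = load(task, [])                                  # 6 separate elements
expert = load(task, [{"a", "x", "+", "b", "=", "0"}])    # one familiar pattern
print(novice > WM_CAPACITY > expert)   # → True: same task, very different load
```

On this sketch, instruction that reduces extraneous elements keeps the count under capacity while schemas are still being acquired.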

Term

Component Display Theory

Definition

Component Display Theory (CDT) classifies learning along two dimensions: content (facts, concepts, procedures, and principles) and performance (remembering, using, finding). The theory specifies four primary presentation forms: rules (expository presentation of a generality), examples (expository presentation of instances), recall (inquisitory generality) and practice (inquisitory instance). Secondary presentation forms include: prerequisites, objectives, helps, mnemonics, and feedback.
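The four primary forms fall out of a 2x2 classification (presentation mode by content level), which can be written down directly from the text:

```python
# CDT's four primary presentation forms as a 2x2 table:
# mode (expository vs inquisitory) x level (generality vs instance).
primary_forms = {
    ("expository", "generality"): "rule",
    ("expository", "instance"): "example",
    ("inquisitory", "generality"): "recall",
    ("inquisitory", "instance"): "practice",
}

print(primary_forms[("inquisitory", "instance")])   # → practice
```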


The theory specifies that instruction is more effective to the extent that it contains all necessary primary and secondary forms. Thus, a complete lesson would consist of an objective followed by some combination of rules, examples, recall, practice, feedback, helps and mnemonics appropriate to the subject matter and learning task. Indeed, the theory suggests that for a given objective and learner, there is a unique combination of presentation forms that results in the most effective learning experience.

Merrill (1983) explains the assumptions about cognition that underlie CDT. While acknowledging a number of different types of memory, Merrill claims that associative and algorithmic memory structures are directly related to the performance components of Remember and Use/Find respectively. Associative memory is a hierarchical network structure; algorithmic memory consists of schema or rules. The distinction between Use and Find performances in algorithmic memory is the use of existing schema to process input versus creating a new schema through reorganization of existing rules.

A significant aspect of the CDT framework is learner control, i.e., the idea that learners can select their own instructional strategies in terms of content and presentation components. In this sense, instruction designed according to CDT provides a high degree of individualization since students can adapt learning to meet their own preferences and styles.

In recent years, Merrill has presented a new version of CDT called Component Design Theory (Merrill, 1994). This new version has a more macro focus than the original theory with the emphasis on course structures (instead of lessons) and instructional transactions rather than presentation forms. In addition, advisor strategies have taken the place of learner control strategies. Development of the new CDT theory has been closely related to work on expert systems and authoring tools for instructional design (e.g., Li & Merrill, 1991; Merrill, Li, & Jones, 1991).

Term

Conditions of Learning

Definition

This theory stipulates that there are several different types or levels of learning. The significance of these classifications is that each type requires a different kind of instruction. Gagne identifies five major categories of learning: verbal information, intellectual skills, cognitive strategies, motor skills, and attitudes. Different internal and external conditions are necessary for each type of learning. For example, for cognitive strategies to be learned, there must be a chance to practice developing new solutions to problems; to learn attitudes, the learner must be exposed to a credible role model or persuasive arguments.

Gagne suggests that learning tasks for intellectual skills can be organized in a hierarchy according to complexity: stimulus recognition, response generation, procedure following, use of terminology, discriminations, concept formation, rule application, and problem solving. The primary significance of the hierarchy is to identify prerequisites that should be completed to facilitate learning at each level. Prerequisites are identified by doing a task analysis of a learning/training task. Learning hierarchies provide a basis for the sequencing of instruction.

In addition, the theory outlines nine instructional events and corresponding cognitive processes:

  1. Gaining attention (reception)
  2. Informing learners of the objective (expectancy)
  3. Stimulating recall of prior learning (retrieval)
  4. Presenting the stimulus (selective perception)
  5. Providing learning guidance (semantic encoding)
  6. Eliciting performance (responding)
  7. Providing feedback (reinforcement)
  8. Assessing performance (retrieval)
  9. Enhancing retention and transfer (generalization).

These events should satisfy or provide the necessary conditions for learning and serve as the basis for designing instruction and selecting appropriate media (Gagne, Briggs & Wager, 1992).

Term

Connectionism

Definition

The learning theory of Thorndike represents the original S-R framework of behavioral psychology: Learning is the result of associations forming between stimuli and responses. Such associations or "habits" become strengthened or weakened by the nature and frequency of the S-R pairings. The paradigm for S-R theory was trial and error learning in which certain responses come to dominate others due to rewards. The hallmark of connectionism (like all behavioral theory) was that learning could be adequately explained without referring to any unobservable internal states.

Thorndike's theory consists of three primary laws: (1) law of effect - responses to a situation which are followed by a rewarding state of affairs will be strengthened and become habitual responses to that situation, (2) law of readiness - a series of responses can be chained together to satisfy some goal which will result in annoyance if blocked, and (3) law of exercise - connections become strengthened with practice and weakened when practice is discontinued. A corollary of the law of effect was that responses that reduce the likelihood of achieving a rewarding state (i.e., punishments, failures) will decrease in strength.

The theory suggests that transfer of learning depends upon the presence of identical elements in the original and new learning situations; i.e., transfer is always specific, never general. In later versions of the theory, the concept of "belongingness" was introduced; connections are more readily established if the person perceives that stimuli or responses go together (cf. Gestalt principles). Another concept introduced was "polarity" which specifies that connections occur more easily in the direction in which they were originally formed than the opposite. Thorndike also introduced the "spread of effect" idea, i.e., rewards affect not only the connection that produced them but temporally adjacent connections as well.

Term

Constructivist Theory

Definition

A major theme in the theoretical framework of Bruner is that learning is an active process in which learners construct new ideas or concepts based upon their current/past knowledge. The learner selects and transforms information, constructs hypotheses, and makes decisions, relying on a cognitive structure to do so. Cognitive structure (i.e., schema, mental models) provides meaning and organization to experiences and allows the individual to "go beyond the information given".

As far as instruction is concerned, the instructor should try to encourage students to discover principles by themselves. The instructor and student should engage in an active dialog (i.e., Socratic learning). The task of the instructor is to translate information to be learned into a format appropriate to the learner's current state of understanding. Curriculum should be organized in a spiral manner so that the student continually builds upon what they have already learned.

Bruner (1966) states that a theory of instruction should address four major aspects: (1) predisposition towards learning, (2) the ways in which a body of knowledge can be structured so that it can be most readily grasped by the learner, (3) the most effective sequences in which to present material, and (4) the nature and pacing of rewards and punishments. Good methods for structuring knowledge should result in simplifying, generating new propositions, and increasing the manipulation of information.

In his more recent work, Bruner (1986, 1990, 1996) has expanded his theoretical framework to encompass the social and cultural aspects of learning as well as the practice of law.

Term

Contiguity Theory

Definition

Guthrie's contiguity theory specifies that "a combination of stimuli which has accompanied a movement will on its recurrence tend to be followed by that movement". According to Guthrie, all learning was a consequence of association between a particular stimulus and response. Furthermore, Guthrie argued that stimuli and responses affect specific sensory-motor patterns; what is learned are movements, not behaviors.

In contiguity theory, rewards or punishment play no significant role in learning since they occur after the association between stimulus and response has been made. Learning takes place in a single trial (all or none). However, since each stimulus pattern is slightly different, many trials may be necessary to produce a general response. One interesting principle that arises from this position is called "postremity" which specifies that we always learn the last thing we do in response to a specific stimulus situation.
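Single-trial, all-or-none association plus the postremity principle together amount to "the last response made to a stimulus overwrites any earlier one"; a minimal sketch (the stimulus and response names are invented for illustration):

```python
# One-trial, all-or-none association: the response most recently made
# to a stimulus pattern is the one learned ("postremity").

associations = {}

def act(stimulus, response):
    """A single trial fully forms the association, overwriting any prior link."""
    associations[stimulus] = response

act("bell", "salivate")
act("bell", "turn-away")
print(associations["bell"])   # → turn-away: we learn the last thing we did
```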

Contiguity theory suggests that forgetting is due to interference rather than the passage of time; stimuli become associated with new responses. Previous conditioning can also be changed by being associated with inhibiting responses such as fear or fatigue. The role of motivation is to create a state of arousal and activity which produces responses that can be conditioned.

Term

Conversation Theory

Definition

The Conversation Theory developed by G. Pask originated from a cybernetics framework and attempts to explain learning in both living organisms and machines. The fundamental idea of the theory is that learning occurs through conversations about a subject matter which serve to make knowledge explicit. Conversations can be conducted at a number of different levels: natural language (general discussion), object languages (for discussing the subject matter), and metalanguages (for talking about learning/language).

In order to facilitate learning, Pask argued that subject matter should be represented in the form of entailment structures which show what is to be learned. Entailment structures exist in a variety of different levels depending upon the extent of relationships displayed (e.g., super/subordinate concepts, analogies).

The critical method of learning according to conversation theory is "teachback" in which one person teaches another what they have learned. Pask identified two different types of learning strategies: serialists who progress through an entailment structure in a sequential fashion and holists who look for higher order relations.
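If an entailment structure is represented as a small concept graph, the two strategies correspond to different traversal orders; a sketch (the graph, concept names, and traversal choices are invented for illustration, not Pask's formalism):

```python
# A toy entailment structure: each concept lists the concepts it entails.
entailment = {
    "arithmetic": ["addition", "multiplication"],
    "addition": [],
    "multiplication": ["addition"],   # multiplication entails addition
}

def serialist(order):
    """Serialist strategy: work through concepts one at a time in a
    fixed, sequential order."""
    return list(order)

def holist(structure, top):
    """Holist strategy: start from the higher-order concept and work
    breadth-first toward the details."""
    seen, queue = [], [top]
    while queue:
        node = queue.pop(0)
        if node not in seen:
            seen.append(node)
            queue.extend(structure[node])
    return seen

print(serialist(["addition", "multiplication", "arithmetic"]))
print(holist(entailment, "arithmetic"))
# → ['arithmetic', 'addition', 'multiplication']: relations before details
```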

Term

Criterion Referenced Instruction

Definition

The Criterion Referenced Instruction (CRI) framework developed by Robert Mager is a comprehensive set of methods for the design and delivery of training programs. Some of the critical aspects include: (1) goal/task analysis -- to identify what needs to be learned, (2) performance objectives -- exact specification of the outcomes to be accomplished and how they are to be evaluated (the criterion), (3) criterion referenced testing -- evaluation of learning in terms of the knowledge/skills specified in the objectives, and (4) development of learning modules tied to specific objectives.

Training programs developed in CRI format tend to be self-paced courses involving a variety of different media (e.g., workbooks, videotapes, small group discussions, computer-based instruction). Students learn at their own pace and take tests to determine if they have mastered a module. A course manager administers the program and helps students with problems.

CRI is based upon the ideas of mastery learning and performance-oriented instruction. It also incorporates many of the ideas found in Gagne's conditions of learning (e.g., task hierarchies, objectives) and is compatible with most theories of adult learning (e.g., andragogy, experiential learning) because of its emphasis on learner initiative and self-management.

Term

Double Loop Learning

Definition

Argyris (1976) proposes double loop learning theory which pertains to learning to change underlying values and assumptions. The focus of the theory is on solving problems that are complex and ill-structured and which change as problem-solving advances.

Double loop theory is based upon a "theory of action" perspective outlined by Argyris & Schon (1974). This perspective examines reality from the point of view of human beings as actors. Changes in values, behavior, leadership, and helping others, are all part of, and informed by, the actors' theory of action. An important aspect of the theory is the distinction between an individual's espoused theory and their "theory-in-use" (what they actually do); bringing these two into congruence is a primary concern of double loop learning. Typically, interaction with others is necessary to identify the conflict.

There are four basic steps in the action theory learning process: (1) discovery of espoused and theory-in-use, (2) invention of new meanings, (3) production of new actions, and (4) generalization of results. Double loop learning involves applying each of these steps to itself. In double loop learning, assumptions underlying current views are questioned and hypotheses about behavior tested publicly. The end result of double loop learning should be increased effectiveness in decision-making and better acceptance of failures and mistakes.

In recent years, Argyris has focused on a methodology for implementing action theory on a broad scale called "action science" (see Argyris, Putnam & Smith, 1985) and the role of learning at the organizational level (e.g., Argyris, 1993; Schon & Argyris, 1996).

Term

Drive Reduction Theory

Definition

Hull developed a version of behaviorism in which the stimulus (S) affects the organism (O) and the resulting response (R) depends upon characteristics of both O and S. In other words, Hull was interested in studying intervening variables that affected behavior such as initial drive, incentives, inhibitors, and prior training (habit strength). Like other forms of behavior theory, reinforcement is the primary factor that determines learning. However, in Hull's theory, drive reduction or need satisfaction plays a much more important role in behavior than in other frameworks (i.e., connectionism, operant conditioning).

Hull's theoretical framework consisted of many postulates stated in mathematical form. They include: (1) organisms possess a hierarchy of needs which are aroused under conditions of stimulation and drive, (2) habit strength increases with activities that are associated with primary or secondary reinforcement, (3) habit strength aroused by a stimulus other than the one originally conditioned depends upon the closeness of the second stimulus in terms of discrimination thresholds, (4) stimuli associated with the cessation of a response become conditioned inhibitors, (5) the more the effective reaction potential exceeds the reaction threshold, the shorter the latency of response. As these postulates indicate, Hull proposed many types of variables that accounted for generalization, motivation, and variability (oscillation) in learning.

One of the most important concepts in Hull's theory was the habit strength hierarchy: for a given stimulus, an organism can respond in a number of ways. The likelihood of a specific response has a probability which can be changed by reward and is affected by various other variables (e.g. inhibition). In some respects, habit strength hierarchies resemble components of cognitive theories such as schemas and production systems.
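The habit strength hierarchy can be sketched as response strengths normalized into choice probabilities, with reward raising the chosen response's strength. All numbers and response names below are invented; this is an illustration, not Hull's quantitative postulates:

```python
# A toy habit strength hierarchy for one stimulus (illustrative values).
strengths = {"press-lever": 1.0, "groom": 1.0, "explore": 2.0}

def probabilities(s):
    """Each response's choice probability is its share of total strength."""
    total = sum(s.values())
    return {response: v / total for response, v in s.items()}

def reward(s, response, amount=1.0):
    """Reinforcement raises the rewarded response's habit strength,
    reordering the hierarchy."""
    s[response] += amount

print(probabilities(strengths)["press-lever"])  # → 0.25 before reward
reward(strengths, "press-lever", 2.0)
print(probabilities(strengths)["press-lever"])  # → 0.5 after: now dominant
```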

Term

Dual Coding Theory

Definition

The dual coding theory proposed by Paivio attempts to give equal weight to verbal and non-verbal processing. Paivio (1986) states: "Human cognition is unique in that it has become specialized for dealing simultaneously with language and with nonverbal objects and events. Moreover, the language system is peculiar in that it deals directly with linguistic input and output (in the form of speech or writing) while at the same time serving a symbolic function with respect to nonverbal objects, events, and behaviors. Any representational theory must accommodate this dual functionality." (p. 53)


The theory assumes that there are two cognitive subsystems, one specialized for the representation and processing of nonverbal objects/events (i.e., imagery), and the other specialized for dealing with language. Paivio also postulates two different types of representational units: "imagens" for mental images and "logogens" for verbal entities which he describes as being similar to "chunks" as described by Miller. Logogens are organized in terms of associations and hierarchies while imagens are organized in terms of part-whole relationships.

Dual coding theory identifies three types of processing: (1) representational, the direct activation of verbal or non-verbal representations, (2) referential, the activation of the verbal system by the nonverbal system or vice-versa, and (3) associative processing, the activation of representations within the same verbal or nonverbal system. A given task may require any or all of the three kinds of processing.
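The three processing types can be made concrete with a toy data structure. Paivio's theory is not computational, so the representation below (the dictionaries, the unit names, and the functions) is entirely my own illustration of the distinctions:

```python
# Toy rendering of Paivio's two subsystems and three processing types.
# All names and structures here are invented for illustration.

logogens = {"dog", "cat", "bone"}       # verbal units
imagens = {"dog_image"}                 # nonverbal (imagery) units
referential = {"dog": "dog_image",      # cross-system links (naming a
               "dog_image": "dog"}      # picture, imaging a word)
associative = {"dog": ["cat", "bone"]}  # within-system (verbal) links

def representational(stimulus):
    """(1) Direct activation of a verbal or nonverbal representation."""
    return stimulus in logogens or stimulus in imagens

def referential_activation(unit):
    """(2) Activation of the other system from this one, or None."""
    return referential.get(unit)

def associative_activation(unit):
    """(3) Spreading activation within the same system."""
    return associative.get(unit, [])

print(referential_activation("dog_image"))  # dog
```

Naming a picture, for instance, combines (1) activating the imagen and (2) crossing to the logogen, while free association stays within one system via (3).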

Term

Elaboration Theory

Definition

According to elaboration theory, instruction should be organized in increasing order of complexity for optimal learning. For example, when teaching a procedural task, the simplest version of the task is presented first; subsequent lessons present additional versions until the full range of tasks are taught. In each lesson, the learner should be reminded of all versions taught so far (summary/synthesis). A key idea of elaboration theory is that the learner needs to develop a meaningful context into which subsequent ideas and skills can be assimilated.

Elaboration theory proposes seven major strategy components: (1) an elaborative sequence, (2) learning prerequisite sequences, (3) summary, (4) synthesis, (5) analogies, (6) cognitive strategies, and (7) learner control. The first component is the most critical as far as elaboration theory is concerned. The elaborative sequence is defined as a simple to complex sequence in which the first lesson epitomizes (rather than summarizes or abstracts) the ideas and skills that follow. Epitomizing should be done on the basis of a single type of content (concepts, procedures, principles), although two or more types may be elaborated simultaneously, and should involve the learning of just a few fundamental or representative ideas or skills at the application level.

It is claimed that the elaboration approach results in the formation of more stable cognitive structures and therefore better retention and transfer, increased learner motivation through the creation of meaningful learning contexts, and the provision of information about the content that allows informed learner control. Elaboration theory is an extension of the work of Ausubel (advance organizers) and Bruner (spiral curriculum).

Term

Experiential Learning

Definition

Rogers distinguished two types of learning: cognitive (meaningless) and experiential (significant). The former corresponds to academic knowledge such as learning vocabulary or multiplication tables and the latter refers to applied knowledge such as learning about engines in order to repair a car. The key to the distinction is that experiential learning addresses the needs and wants of the learner. Rogers lists these qualities of experiential learning: personal involvement, self-initiated, evaluated by learner, and pervasive effects on learner.

To Rogers, experiential learning is equivalent to personal change and growth. Rogers feels that all human beings have a natural propensity to learn; the role of the teacher is to facilitate such learning. This includes: (1) setting a positive climate for learning, (2) clarifying the purposes of the learner(s), (3) organizing and making available learning resources, (4) balancing intellectual and emotional components of learning, and (5) sharing feelings and thoughts with learners but not dominating.

According to Rogers, learning is facilitated when: (1) the student participates completely in the learning process and has control over its nature and direction, (2) it is primarily based upon direct confrontation with practical, social, personal or research problems, and (3) self-evaluation is the principal method of assessing progress or success. Rogers also emphasizes the importance of learning to learn and an openness to change.

Rogers' theory of learning evolved as part of the humanistic education movement (e.g., Patterson, 1973; Valett, 1977).

Term

Functional Context

Definition

The functional context approach to learning stresses the importance of making learning relevant to the experience of learners and their work context. The learning of new information is facilitated by making it possible for the learner to relate it to knowledge already possessed and transform old knowledge into new knowledge. By using materials that the learner will use after training, transfer of learning from the classroom to the "real world" will be enhanced.

The model of the cognitive system underlying this approach emphasizes the interaction of three components: (1) a knowledge base (i.e., long term memory) of what the individual knows, (2) processing skills including language, problem-solving, and learning strategies, and (3) information displays that present information. The performance of a task requires knowledge about what one is reading or writing, processing skills for comprehension and communication, and displays of information to be processed.

The functional context approach also proposes new assessment methods. Instead of using grade level scores, tests should measure content knowledge gained and distinguish between functional learning and academic learning. For example, an assessment of reading should measure both reading-to-do (e.g., looking up information in a manual) and reading-to-learn (e.g., information needed for future decisions).

Functional context theory shares a similar emphasis with Situated Learning theory which also stresses the importance of context during learning.

Term

Genetic Epistemology

Definition

Over a period of six decades, Jean Piaget conducted a program of naturalistic research that has profoundly affected our understanding of child development. Piaget called his general theoretical framework "genetic epistemology" because he was primarily interested in how knowledge developed in human organisms. Piaget had a background in both biology and philosophy, and concepts from both disciplines influenced his theories and research on child development.

The concept of cognitive structure is central to his theory. Cognitive structures are patterns of physical or mental action that underlie specific acts of intelligence and correspond to stages of child development (see Schemas). There are four primary cognitive structures (i.e., development stages) according to Piaget: sensorimotor, preoperations, concrete operations, and formal operations. In the sensorimotor stage (0-2 years), intelligence takes the form of motor actions. Intelligence in the preoperational period (3-7 years) is intuitive in nature. The cognitive structure during the concrete operational stage (8-11 years) is logical but depends upon concrete referents. In the final stage of formal operations (12-15 years), thinking involves abstractions.

Cognitive structures change through the processes of adaptation: assimilation and accommodation. Assimilation involves the interpretation of events in terms of existing cognitive structure whereas accommodation refers to changing the cognitive structure to make sense of the environment. Cognitive development consists of a constant effort to adapt to the environment in terms of assimilation and accommodation. In this sense, Piaget's theory is similar in nature to other constructivist perspectives of learning (e.g., constructivism, social development theory).

While the stages of cognitive development identified by Piaget are associated with characteristic age spans, they vary for every individual. Furthermore, each stage has many detailed structural forms. For example, the concrete operational period has more than forty distinct structures covering classification and relations, spatial relationships, time, movement, chance, number, conservation and measurement. Similar detailed analysis of intellectual functions is provided by theories of intelligence such as intellect theory, multiple intelligences, and triarchic theory.

Term

Gestalt Theory

Definition

Along with Kohler and Koffka, Max Wertheimer was one of the principal proponents of Gestalt theory which emphasized higher-order cognitive processes in the midst of behaviorism. The focus of Gestalt theory was the idea of "grouping", i.e., characteristics of stimuli cause us to structure or interpret a visual field or problem in a certain way (Wertheimer, 1922). The primary factors that determine grouping were: (1) proximity - elements tend to be grouped together according to their nearness, (2) similarity - items similar in some respect tend to be grouped together, (3) closure - items are grouped together if they tend to complete some entity, and (4) simplicity - items will be organized into simple figures according to symmetry, regularity, and smoothness. These factors were called the laws of organization and were explained in the context of perception and problem-solving.

Wertheimer was especially concerned with problem-solving. Wertheimer (1959) provides a Gestalt interpretation of problem-solving episodes of famous scientists (e.g., Galileo, Einstein) as well as children presented with mathematical problems. The essence of successful problem-solving behavior according to Wertheimer is being able to see the overall structure of the problem: "A certain region in the field becomes crucial, is focused; but it does not become isolated. A new, deeper structural view of the situation develops, involving changes in functional meaning, the grouping, etc. of the items. Directed by what is required by the structure of a situation for a crucial region, one is led to a reasonable prediction, which like the other parts of the structure, calls for verification, direct or indirect. Two directions are involved: getting a whole consistent picture, and seeing what the structure of the whole requires for the parts." (p. 212).

Term

GOMS Model

Definition

GOMS is a theory of the cognitive skills involved in human-computer tasks. It is based upon an information processing framework that assumes a number of different stages or types of memory (e.g., sensory store, working memory, LTM) with separate perceptual, motor, and cognitive processing. All cognitive activities are interpreted in terms of searching a problem space, the fundamental premise of GPS and Newell's Soar theory.

According to the GOMS model, cognitive structure consists of four components: (1) a set of goals, (2) a set of operators, (3) a set of methods for achieving the goals, and (4) a set of selection rules for choosing among competing methods. For a given task, a particular GOMS structure can be constructed and used to predict the time required to complete the task. In addition, the model can be used to identify and predict the effects of errors on task performance. Error recovery is assumed to involve the same four components as correct actions.
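The time-prediction idea can be sketched concretely: once a method is decomposed into primitive operators, the predicted task time is simply the sum of the operator times. The operator durations below are approximate values from the keystroke-level-model literature, and the example method is invented; treat both as illustrative.

```python
# Hedged sketch of GOMS-style (keystroke-level) time prediction: total
# task time is the sum of the durations of the primitive operators in
# the selected method. Durations (in seconds) are approximate.

OPERATOR_TIME = {
    "K": 0.28,  # press a key
    "P": 1.10,  # point with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(method):
    """Predict execution time for a method given as an operator sequence."""
    return sum(OPERATOR_TIME[op] for op in method)

# Hypothetical method for "delete a word with the mouse":
# prepare, point at the word, click, prepare, press Delete.
method = ["M", "P", "K", "M", "K"]
print(round(predict_time(method), 2))  # 4.36
```

Selection rules would enter this sketch as a choice among several such operator sequences, with the model predicting which one an expert user picks.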

Term

General Problem Solver

Definition

The General Problem Solver (GPS) was a theory of human problem solving stated in the form of a simulation program (Ernst & Newell, 1969; Newell & Simon, 1972). This program and the associated theoretical framework had a significant impact on the subsequent direction of cognitive psychology. It also introduced the use of productions as a method for specifying cognitive models.

The theoretical framework was information processing and attempted to explain all behavior as a function of memory operations, control processes and rules. The methodology for testing the theory involved developing a computer simulation and then comparing the results of the simulation with human behavior in a given task. Such comparisons also made use of protocol analysis (Ericsson & Simon, 1984) in which the verbal reports of a person solving a task are used as indicators of cognitive processes (see http://www.rci.rutgers.edu/~cfs/472_html/CogArch/Protocol.html).

GPS was intended to provide a core set of processes that could be used to solve a variety of different types of problems. The critical step in solving a problem with GPS is the definition of the problem space in terms of the goal to be achieved and the transformation rules. Using a means-end-analysis approach, GPS would divide the overall goal into subgoals and attempt to solve each of those. Some of the basic solution rules include: (1) transform one object into another, (2) reduce the difference between two objects, and (3) apply an operator to an object. One of the key elements needed by GPS to solve problems was an operator-difference table that specified what transformations were possible.
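The means-ends loop and the operator-difference table can be sketched in a few lines. The domain, the difference names, and the operators below are invented; only the control structure (find a difference, look up an operator that reduces it, apply it, recurse) follows GPS.

```python
# Minimal sketch of GPS-style means-ends analysis. The toy domain is
# invented; the structure (difference -> operator lookup -> apply ->
# recurse on the remaining differences) is the GPS idea.

# Operator-difference table: which operator reduces which difference.
DIFFERENCE_TABLE = {
    "at_wrong_place": "move",
    "object_not_held": "grasp",
}

# Each operator transforms a state (a set of outstanding differences).
OPERATORS = {
    "move":  lambda state: state - {"at_wrong_place"},
    "grasp": lambda state: state - {"object_not_held"},
}

def solve(state, goal_state, plan=None):
    """Reduce differences between the current state and the goal one at a time."""
    plan = plan or []
    differences = state - goal_state     # what still separates us from the goal
    if not differences:
        return plan                      # goal reached
    d = sorted(differences)[0]           # pick one difference as a subgoal
    op = DIFFERENCE_TABLE[d]             # table lookup: operator reducing it
    return solve(OPERATORS[op](state), goal_state, plan + [op])

print(solve({"at_wrong_place", "object_not_held"}, set()))  # ['move', 'grasp']
```

A real GPS run is more elaborate (operators have preconditions, which spawn further subgoals), but the table-driven subgoaling shown here is the core mechanism.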

Term

Information Pickup Theory

Definition

The theory of information pickup suggests that perception depends entirely upon information in the "stimulus array" rather than sensations that are influenced by cognition. Gibson proposes that the environment consists of affordances (such as terrain, water, vegetation, etc.) which provide the clues necessary for perception. Furthermore, the ambient array includes invariants such as shadows, texture, color, convergence, symmetry and layout that determine what is perceived. According to Gibson, perception is a direct consequence of the properties of the environment and does not involve any form of sensory processing.

Information pickup theory stresses that perception requires an active organism. The act of perception depends upon an interaction between the organism and the environment. All perceptions are made in reference to body position and functions (proprioception). Awareness of the environment derives from how it reacts to our movements.

Information pickup theory opposes most traditional theories of cognition that assume past experience plays a dominant role in perceiving. It is based upon Gestalt theories that emphasize the significance of stimulus organization and relationships.

Term

Information Processing Theory

Definition

George A. Miller has provided two theoretical ideas that are fundamental to cognitive psychology and the information processing framework.


The first concept is "chunking" and the capacity of short term memory. Miller (1956) presented the idea that short-term memory could only hold 5-9 chunks of information (seven plus or minus two) where a chunk is any meaningful unit. A chunk could refer to digits, words, chess positions, or people's faces. The concept of chunking and the limited capacity of short term memory became a basic element of all subsequent theories of memory.
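Chunking is easy to demonstrate: the same material can exceed the span as raw items yet fit comfortably once regrouped. The phone-number grouping below is my own illustrative example, not Miller's.

```python
# Toy illustration of chunking against the 7 +/- 2 capacity limit.
# Ten digits exceed the span as individual items, but grouped into a
# North-American-style phone number they form only three chunks.

digits = "8005551234"
as_digits = list(digits)                           # 10 items: over capacity
as_chunks = [digits[:3], digits[3:6], digits[6:]]  # 3 items: well within span

CAPACITY = 7 + 2   # upper bound of Miller's "magical number"
print(len(as_digits) > CAPACITY, len(as_chunks) <= CAPACITY)  # True True
```

The recoding itself costs nothing in capacity because each chunk is "any meaningful unit"; what counts against the limit is the number of chunks, not their size.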

The second concept is TOTE (Test-Operate-Test-Exit) proposed by Miller, Galanter & Pribram (1960). Miller et al. suggested that TOTE should replace the stimulus-response as the basic unit of behavior. In a TOTE unit, a goal is tested to see if it has been achieved and, if not, an operation is performed to achieve the goal; this cycle of test-operate is repeated until the goal is eventually achieved or abandoned. The TOTE concept provided the basis of many subsequent theories of problem solving (e.g., GPS) and production systems.
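The TOTE unit is essentially a feedback loop, which makes it natural to render as code. The rendering below is schematic (Miller et al. did not write it this way), and the hammering example is the kind of illustration commonly used for TOTE.

```python
# The TOTE unit as a control loop (a schematic rendering, not Miller,
# Galanter & Pribram's own formalism): Test the goal, Operate if the
# test fails, repeat, then Exit.

def tote(test, operate, state, max_cycles=100):
    """Run Test-Operate cycles until the test passes, then Exit."""
    for _ in range(max_cycles):
        if test(state):          # Test: has the goal been achieved?
            return state         # Exit
        state = operate(state)   # Operate: act to reduce the discrepancy
    raise RuntimeError("goal abandoned")  # the "abandoned" branch

# Classic illustration: hammer a nail until it is flush (height 0).
flush = tote(test=lambda h: h == 0, operate=lambda h: h - 1, state=5)
print(flush)  # 0
```

The contrast with stimulus-response is visible in the structure: behavior is organized around a goal test and feedback, not around a triggering stimulus.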

Term

Lateral Thinking

Definition

Edward de Bono has written extensively about the process of lateral thinking -- the generation of novel solutions to problems. The point of lateral thinking is that many problems require a different perspective to solve successfully.

De Bono identifies four critical factors associated with lateral thinking: (1) recognition of dominant ideas that polarize perception of a problem, (2) searching for different ways of looking at things, (3) relaxation of rigid control of thinking, and (4) use of chance to encourage other ideas. This last factor has to do with the fact that lateral thinking involves low-probability ideas which are unlikely to occur in the normal course of events.

Although De Bono does not acknowledge any theoretical antecedents for lateral thinking, it seems closely related to the Gestalt theory of Wertheimer. His work is also highly relevant to the concept of creativity. Visit the De Bono web site for up-to-date information on his work.

Term

Levels of Processing

Definition

The levels of processing framework was presented by Craik & Lockhart (1972) as an alternative to theories of memory that postulated separate stages for sensory, working and long-term memory. According to the levels of processing framework, stimulus information is processed at multiple levels simultaneously depending upon its characteristics. Furthermore, the "deeper" the processing, the more that will be remembered. For example, information that involves strong visual images or many associations with existing knowledge will be processed at a deeper level. Similarly, information that is being attended to receives more processing than other stimuli/events. The theory also supports the finding that we remember things that are meaningful to us because this requires more processing than meaningless stimuli.

Processing of information at different levels is unconscious and automatic unless we attend to that level. For example, we are normally not aware of the sensory properties of stimuli, or what we have in working memory, unless we are asked to specifically identify such information. This suggests that the mechanism of attention is an interruption in processing rather than a cognitive process in its own right.

D'Agostino, O'Neill & Paivio (1977) discuss the relationship between the dual coding theory and the levels of processing framework. Other theories of memory related to levels of processing are Rumelhart & Norman and Soar.

Term

Mathematical learning theory

Definition

Mathematical learning theory is an attempt to describe and explain behavior in quantitative terms. A number of psychologists have attempted to develop such theories (e.g., Hull; Estes; Restle & Greeno, 1970). The work of R. C. Atkinson is particularly interesting because he applied mathematical learning theory to the design of a language arts curriculum.

Atkinson (1972) discusses the problem of optimizing instruction. He outlined four possible strategies: (1) maximize the mean performance of the whole class, (2) minimize the variance in performance for the whole class, (3) maximize the number of students who score at grade level, or (4) maximize the mean performance for each individual. Atkinson shows that while alternative (1) produces the largest gain scores, it also produces the greatest variance since it increases the spread between the most and least successful students. Alternative (4) produces an overall gain but without increased variability. This is accomplished by giving each student variable amounts of time depending upon performance.
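Strategy (4) can be sketched as a simple allocation rule: give each student an amount of instructional time that depends on their current performance. The deficit-proportional rule below is my own illustration of the idea, not Atkinson's actual model, and all numbers are invented.

```python
# Hedged sketch of Atkinson's fourth strategy: allocate variable amounts
# of instructional time per student, giving more time where current
# performance is lower. The allocation rule and data are illustrative.

students = {"A": 0.9, "B": 0.5, "C": 0.3}   # current performance levels (0-1)
TOTAL_TIME = 60.0                            # minutes available to distribute

# Weight each student by their performance deficit (1 - p), then split
# the available time in proportion to those weights.
deficits = {s: 1.0 - p for s, p in students.items()}
total_deficit = sum(deficits.values())
allocation = {s: TOTAL_TIME * d / total_deficit for s, d in deficits.items()}

for s, minutes in sorted(allocation.items()):
    print(s, round(minutes, 1))
```

Under this rule the weakest student (C) gets the most time and the strongest (A) the least, raising every individual's mean performance without widening the spread the way strategy (1) does.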

Term

Mathematical Problem Solving

Definition

Alan Schoenfeld presents the view that understanding and teaching mathematics should be approached as a problem-solving domain. According to Schoenfeld (1985), four categories of knowledge/skills are needed to be successful in mathematics: (1) resources - propositional and procedural knowledge of mathematics, (2) heuristics - strategies and techniques for problem solving such as working backwards, or drawing figures, (3) control - decisions about when and what resources and strategies to use, and (4) beliefs - a mathematical "world view" that determines how someone approaches a problem.

Schoenfeld's theory is supported by extensive protocol analysis of students solving problems. The theoretical framework is based upon much other work in cognitive psychology, particularly the work of Newell & Simon. Schoenfeld (1987) places more emphasis on the importance of metacognition and the cultural components of learning mathematics (i.e., belief systems) than in his original formulation.

Term

Minimalism

Definition

The Minimalist theory of J.M. Carroll is a framework for the design of instruction, especially training materials for computer users. The theory suggests that (1) all learning tasks should be meaningful and self-contained activities, (2) learners should be given realistic projects as quickly as possible, (3) instruction should permit self-directed reasoning and improvising by increasing the number of active learning activities, (4) training materials and activities should provide for error recognition and recovery, and (5) there should be a close linkage between the training and the actual system.

Minimalist theory emphasizes the necessity to build upon the learner's experience (cf. Knowles, Rogers). Carroll (1990) states: "Adult learners are not blank slates; they don't have funnels in their heads; they have little patience for being treated as "don't knows"... New users are always learning computer methods in the context of specific preexisting goals and expectations." (p. 11) Carroll also identifies the roots of minimalism in the constructivism of Bruner and Piaget.

The critical idea of minimalist theory is to minimize the extent to which instructional materials obstruct learning and focus the design on activities that support learner-directed activity and accomplishment. Carroll feels that training developed on the basis of other instructional theories (e.g., Gagne, Merrill) is too passive and fails to exploit the prior knowledge of the learner or use errors as learning opportunities.

Term

Model-Centered Instruction/Design Layers

Definition

Model-Centered Instruction (MCI) is a set of principles to guide instructional designers in selecting and arranging design constructs, so it is appropriately called a design theory. It favors designs that originate with and maintain the priority of models as the central design construct.

Background: A Layered View of Design. MCI is closely tied to a layered view of designs. This view assumes that a designer organizes constructs within several somewhat independent layers characteristic of instructional designs: the model/content layer, the strategy layer, the control layer, the message layer, the representation layer, the media-logic layer, and the management layer. The designer selects and organizes structures within each layer in the process of forming a design. The designer also aligns the structures within layers with those of other layers to create a vertical modularity in the design that improves its manufacturability, maintainability, and the reusability of designed elements. A design layer is typified by: characteristic design goals, building-block constructs, design processes, design expression and construction tools, and principles to guide the arrangement of structures. Over time, a layer becomes associated with specialized skill sets, publications, and a design culture. Instructional theories provide principles to guide design within one or more of these layers, but no theory provides guidelines for all of them, suggesting to designers the wisdom of subscribing to multiple local theories of design rather than a single monolithic theory.

MCI Theory: Model-Centered Instruction, as any design theory, can be described in terms of the prescriptive principles it expresses for each of these layers.

Content: The content of instruction should be perceived in terms of models of three types: (1) models of environments, (2) models of cause-effect systems (natural or manufactured), and (3) models of human performance. Together these constitute the elements necessary for performance and therefore for learning. Content should be expressed relative to the full model structure rather than simply as facts, topics, or lists of tasks.

Strategy: The strategy of instruction should be perceived in terms of problems. A problem is defined as any self-posed or instructor/designer-posed task or set of tasks formed into structures called “work models” (Gibbons, et al., 1995). These are essentially scoped performances within the environment, acting on systems, exhibiting expert performance. Problems may be presented as worked examples or as examples to be worked by the learner. During problem solution instructional augmentations of several kinds may be offered or requested. Dynamic adjustment of work model scope is an important strategic variable.

Control: Control (initiative) assignment should represent a balance between learner and instructor/designer initiatives calculated to maximize learner momentum, engagement, efficient guidance, and learner self-direction and self-evaluation. Instructional controls (manipulative) should allow the learner maximum ability to interact with the model and the instructional strategy’s management.

Message: Contributions to the message arise from multiple sources which may be architecturally modularized: (1) from the workings of the model, (2) from the instructional strategy, (3) from the controls management, (4) from external informational resources, and (5) from tools supplied to support problem solving. The merging of these into a coherent, organized, and synchronized message requires some kind of message or display management function.

Representation: MCI makes no limiting assumptions about the representation of the message. Especially with respect to model representation, it anticipates a broad spectrum of possibilities—from externalized simulation models to verbal “snapshots” and other symbolics that call up and make use of models learners already possess in memory.

Media-Logic: MCI makes no assumptions regarding the use of media. Its goal is to achieve expressions that are transportable across media. The selection of the model and the problem as central design constructs assists in this goal.

Management: MCI makes no assumption about the data recorded and used to drive instructional strategy except to the extent that it must parallel the model’s expression of the content and align also with the chosen units of instructional strategy.

Term

Modes of Learning

Definition

D. Rumelhart & D. Norman (1978) proposed that there are three modes of learning: accretion, structuring and tuning. Accretion is the addition of new knowledge to existing memory. Structuring involves the formation of new conceptual structures or schema. Tuning is the adjustment of knowledge to a specific task usually through practice. Accretion is the most common form of learning; structuring occurs much less frequently and requires considerable effort; tuning is the slowest form of learning and accounts for expert performance.

Restructuring involves some form of reflection or insight (i.e., metacognition) and may correspond to a plateau in performance. On the other hand, tuning often represents automatic behavior that is not available to reflection (e.g., learning procedures).

Rumelhart & Norman (1981) extended their model to include analogical processes: a new schema is created by modeling it on an existing schema and then modifying it based upon further experiences.

Term

Multiple Intelligences

Definition

The theory of multiple intelligences suggests that there are a number of distinct forms of intelligence that each individual possesses in varying degrees. Gardner proposes seven primary forms: linguistic, musical, logical-mathematical, spatial, body-kinesthetic, intrapersonal (e.g., insight, metacognition) and interpersonal (e.g., social skills).

According to Gardner , the implication of the theory is that learning/teaching should focus on the particular intelligences of each person. For example, if an individual has strong spatial or musical intelligences, they should be encouraged to develop these abilities. Gardner points out that the different intelligences represent not only different content domains but also learning modalities. A further implication of the theory is that assessment of abilities should measure all forms of intelligence, not just linguistic and logical-mathematical.

Gardner also emphasizes the cultural context of multiple intelligences. Each culture tends to emphasize particular intelligences. For example, Gardner (1983) discusses the high spatial abilities of the Puluwat people of the Caroline Islands, who use these skills to navigate their canoes in the ocean. Gardner also discusses the balance of personal intelligences required in Japanese society.

The theory of multiple intelligences shares some common ideas with other theories of individual differences such as Cronbach & Snow, Guilford, and Sternberg.

Term

Operant Conditioning

Definition

The theory of B.F. Skinner is based upon the idea that learning is a function of change in overt behavior. Changes in behavior are the result of an individual's response to events (stimuli) that occur in the environment. A response produces a consequence such as defining a word, hitting a ball, or solving a math problem. When a particular Stimulus-Response (S-R) pattern is reinforced (rewarded), the individual is conditioned to respond. The distinctive characteristic of operant conditioning relative to previous forms of behaviorism (e.g., connectionism, drive reduction) is that the organism can emit responses instead of only eliciting responses due to an external stimulus.

Reinforcement is the key element in Skinner's S-R theory. A reinforcer is anything that strengthens the desired response. It could be verbal praise, a good grade or a feeling of increased accomplishment or satisfaction. The theory also covers negative reinforcers -- any stimulus that results in the increased frequency of a response when it is withdrawn (different from aversive stimuli -- punishment -- which result in reduced responses). A great deal of attention was given to schedules of reinforcement (e.g. interval versus ratio) and their effects on establishing and maintaining behavior.

One of the distinctive aspects of Skinner's theory is that it attempted to provide behavioral explanations for a broad range of cognitive phenomena. For example, Skinner explained drive (motivation) in terms of deprivation and reinforcement schedules. Skinner (1957) tried to account for verbal learning and language within the operant conditioning paradigm, although this effort was strongly rejected by linguists and psycholinguists. Skinner (1971) deals with the issue of free will and social control.

Term

Originality

Definition

Irving Maltzman conducted a number of studies that demonstrated that originality could be increased. According to Maltzman, originality refers to behavior that occurs relatively infrequently, is uncommon under given conditions, and is relevant to those conditions. Maltzman distinguished originality from creativity, the latter referring to the consequences of original behavior (including the reaction of society to the behavior).

Maltzman (1960) describes three methods that can increase original responses: (1) present an uncommon stimulus situation for which conventional responses may not be readily available, (2) evoke different responses to the same situation, and (3) evoke uncommon responses as textual responses. Maltzman used the latter approach and mentions Osborn (1957) as an example of the first two.

Maltzman's research is distinctive because he was one of the few behaviorists who attempted to deal with creative behavior. He provided a simple definition and methodology for studying originality. He also examined the relationship between originality and problem solving.

Term

Phenomenography

Definition

This conceptual framework focuses on the experience of learning from the student's perspective and is based upon a phenomenological approach to research. Entwistle explains: "Our task is thus to describe more clearly how learning takes place in higher education and to point out how teaching and assessment affect the quality of learning. From these descriptions teachers should be able to draw their own lessons about how to facilitate their students' learning" (Marton, Hounsell & Entwistle, 1984, p.1).

The most important element of this framework is that data be collected directly from learners themselves through self-reports and interviews. Furthermore, the content and setting should be those actually involved in learning. Research based upon the phenomenographic approach has been conducted by a number of individuals at universities in Sweden and the United Kingdom, of which F. Marton and N. Entwistle are leading proponents.

Phenomenography is related to the work of Pask on learning styles and that of Craik & Lockhart on levels of processing.

Term

Repair Theory

Definition

Repair theory is an attempt to explain how people learn procedural skills with particular attention to how and why they make mistakes (i.e., bugs). The theory suggests that when a procedure cannot be performed, an impasse occurs and the individual applies various strategies to overcome the impasse. These strategies (meta-actions) are called repairs. Some repairs result in correct outcomes whereas others generate incorrect results and hence "buggy" procedures. Repair theory has been implemented in the form of a computer model called Sierra.

Repair theory has been developed from extensive study of children solving arithmetic problems (Brown & VanLehn, 1980). Even with simple subtraction problems, many types of bugs were found, often occurring in combinations. Such systematic errors are not to be confused with "slips" (cf. Norman, 1981) or random mistakes since they reoccur regularly in a particular student's work. On the other hand, bugs are not totally consistent:

"Students' bugs, unlike bugs in computer programs, are unstable. Students shift back and forth among bugs, a phenomenon called bug migration. The theory's explanation for bug migration is that the student has a stable underlying procedure but that the procedure is incomplete in such a way that the student reaches impasses on some problems. Students can apply any repair they can think of. Sometimes they choose one repair and sometimes another. The different repairs manifest themselves as different bugs. So bug migration comes from varying the choice of repairs to a stable, underlying impasse." (VanLehn, 1990, p. 26)

Repair theory assumes that people primarily learn procedural tasks by induction and that bugs occur because of biases that are introduced in the examples provided or the feedback received during practice (as opposed to mistakes in memorizing formulas or instructions). Therefore, the implication of repair theory is that problem sets should be chosen to eliminate the bias likely to cause specific bugs. Another implication is that bugs are often introduced when students try to extend procedures beyond the initial examples provided.
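A classic example from this research is the "smaller-from-larger" bug, in which a student who reaches an impasse on borrowing repairs it by subtracting the smaller digit from the larger in each column. The following minimal sketch is illustrative only, not the Sierra model itself:

```python
def buggy_subtract(top, bottom):
    """Columnwise subtraction with the 'smaller-from-larger' bug: at the
    impasse where the top digit is smaller and the borrowing sub-procedure
    is missing, the repair is to take the absolute difference instead."""
    t = str(top)
    b = str(bottom).rjust(len(t), "0")
    digits = []
    for td, bd in zip(t, b):
        # Impasse: int(td) < int(bd) with no borrowing procedure.
        # Repair: subtract the smaller digit from the larger one.
        digits.append(str(abs(int(td) - int(bd))))
    return int("".join(digits))

print(buggy_subtract(52, 17))  # 45, where the correct answer is 35
```

The bug is systematic rather than random: every problem requiring a borrow produces a wrong answer in the same characteristic way, which is exactly the kind of regularity Brown & VanLehn catalogued.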

Term

Script Theory

Definition

The central focus of Schank's theory has been the structure of knowledge, especially in the context of language understanding. Schank (1975) outlined conceptual dependency theory which deals with the representation of meaning in sentences. Building upon this framework, Schank & Abelson (1977) introduced the concepts of scripts, plans and themes to handle story-level understanding. Later work (e.g., Schank, 1982, 1986) elaborated the theory to encompass other aspects of cognition.

The key element of conceptual dependency theory is the idea that all conceptualizations can be represented in terms of a small number of primitive acts performed by an actor on an object. For example, the concept, "John read a book" could be represented as: John MTRANS (information) to LTM from book, where MTRANS is the primitive act of mental transfer. In Schank's theory, all memory is episodic, i.e., organized around personal experiences rather than semantic categories. Generalized episodes are called scripts -- specific memories are stored as pointers to scripts plus any unique events for a particular episode. Scripts allow individuals to make inferences needed for understanding by filling in missing information (i.e., schema).
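The MTRANS example above can be sketched as a simple data structure; the field names below are illustrative, not Schank's exact notation:

```python
# A toy conceptual-dependency structure for "John read a book".
cd = {
    "act": "MTRANS",       # primitive act: mental transfer of information
    "actor": "John",
    "object": "information",
    "source": "book",
    "destination": "LTM",  # the actor's long-term memory
}

def gloss(cd):
    """Render a CD structure as a rough English paraphrase."""
    return (f"{cd['actor']} {cd['act']} {cd['object']} "
            f"to {cd['destination']} from {cd['source']}")

print(gloss(cd))  # John MTRANS information to LTM from book
```

Because every sentence reduces to the same small inventory of primitive acts, two different surface sentences with the same meaning map to the same structure.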

Schank (1986) uses script theory as the basis for a dynamic model of memory. This model suggests that events are understood in terms of scripts, plans and other knowledge structures as well as relevant previous experiences. An important aspect of dynamic memory is explanatory processes (XPs) that represent stereotyped answers to events that involve anomalies or unusual events. Schank proposes that XPs are a critical mechanism of creativity.

Term

Sign Learning

Definition

Tolman's theorizing has been called purposive behaviorism and is often considered the bridge between behaviorism and cognitive theory. According to Tolman's theory of sign learning, an organism learns by pursuing signs to a goal, i.e., learning is acquired through meaningful behavior. Tolman emphasized the organized aspect of learning: "The stimuli which are allowed in are not connected by just simple one-to-one switches to the outgoing responses. Rather the incoming impulses are usually worked over and elaborated in the central control room into a tentative cognitive-like map of the environment. And it is this tentative map, indicating routes and paths and environmental relationships, which finally determines what responses, if any, the animal will finally make." (Tolman, 1948, p. 192)

Tolman (1932) proposed five types of learning: (1) approach learning, (2) escape learning, (3) avoidance learning, (4) choice-point learning, and (5) latent learning. All forms of learning depend upon means-end readiness, i.e., goal-oriented behavior, mediated by expectations, perceptions, representations, and other internal or environmental variables.

Tolman's version of behaviorism emphasized the relationships between stimuli rather than stimulus-response (Tolman, 1922). According to Tolman, a new stimulus (the sign) becomes associated with already meaningful stimuli (the significate) through a series of pairings; there was no need for reinforcement in order to establish learning. For this reason, Tolman's theory was closer to the connectionist framework of Thorndike than the drive reduction theory of Hull or other behaviorists.

 
Term

Situated Learning

Definition

Lave argues that learning as it normally occurs is a function of the activity, context and culture in which it occurs (i.e., it is situated). This contrasts with most classroom learning activities which involve knowledge which is abstract and out of context. Social interaction is a critical component of situated learning -- learners become involved in a "community of practice" which embodies certain beliefs and behaviors to be acquired. As beginners or newcomers move from the periphery of this community to its center, they become more active and engaged within the culture and eventually assume the role of expert or old-timer. Furthermore, situated learning is usually unintentional rather than deliberate. These ideas are what Lave & Wenger (1991) call the process of "legitimate peripheral participation."

Other researchers have further developed the theory of situated learning. Brown, Collins & Duguid (1989) emphasize the idea of cognitive apprenticeship: "Cognitive apprenticeship supports learning in a domain by enabling students to acquire, develop and use cognitive tools in authentic domain activity. Learning, both outside and inside school, advances through collaborative social interaction and the social construction of knowledge." Brown et al. also emphasize the need for a new epistemology for learning -- one that emphasizes active perception over concepts and representation. Suchman (1988) explores the situated learning framework in the context of artificial intelligence.

Situated learning has antecedents in the work of Gibson (theory of affordances) and Vygotsky (social learning). In addition, the theory of Schoenfeld on mathematical problem solving embodies some of the critical elements of the situated learning framework.

Term

Soar

Definition

Soar is an architecture for human cognition expressed in the form of a production system. It involves the collaboration of a number of researchers, including Allen Newell, John Laird, and Paul Rosenbloom, among others at different institutions. The theory builds upon earlier efforts involving Newell such as GPS (Newell & Simon) and GOMS (Card, Moran & Newell). Like the latter model, Soar is capable of simulating actual responses and response times.

The principal element in Soar is the idea of a problem space: all cognitive acts are some form of search task. Memory is unitary and procedural; there is no distinction between procedural and declarative memory. Chunking is the primary mechanism for learning and represents the conversion of problem-solving acts into long-term memory. The occasion for chunking is an impasse and its resolution in the problem solving process (i.e., satisfying production rules). Newell states that Soar suggests a reconstructive view of memory (cf. Bartlett).

Soar exhibits a variety of different types or levels of learning: operators (e.g., create, call), search control (e.g., operator selection, plans), declarative data (e.g., recognition/recall), and tasks (e.g., identify problem spaces, initial/goal states). Soar is capable of transfer within or across trials or tasks.
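The production-system idea underlying Soar can be illustrated with a toy recognize-act loop. This sketch is generic and greatly simplified, not Soar's actual architecture; the rule contents are made up for illustration:

```python
def run(productions, working_memory, max_cycles=100):
    """Fire the first production whose conditions all hold and whose
    actions are not already in working memory; repeat until quiescence
    (in Soar terms, the point where an impasse would arise)."""
    for _ in range(max_cycles):
        for conditions, actions in productions:
            if conditions <= working_memory and not actions <= working_memory:
                working_memory |= actions  # apply the production's actions
                break
        else:
            break  # no production fired: quiescence
    return working_memory

productions = [
    ({"goal: solve", "operator-proposed"}, {"operator-applied"}),
    ({"operator-applied"}, {"goal: done"}),
]
wm = run(productions, {"goal: solve", "operator-proposed"})
print(sorted(wm))
```

In Soar itself, the trace of how an impasse was resolved would then be compiled into a new production (a chunk), so that the same situation is handled in a single step next time.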

Term

Social Development Theory

Definition

The major theme of Vygotsky's theoretical framework is that social interaction plays a fundamental role in the development of cognition. Vygotsky (1978) states: "Every function in the child's cultural development appears twice: first, on the social level, and later, on the individual level; first, between people (interpsychological) and then inside the child (intrapsychological). This applies equally to voluntary attention, to logical memory, and to the formation of concepts. All the higher functions originate as actual relationships between individuals." (p. 57)

A second aspect of Vygotsky's theory is the idea that the potential for cognitive development depends upon the "zone of proximal development" (ZPD): a level of development attained when children engage in social behavior. Full development of the ZPD depends upon full social interaction. The range of skill that can be developed with adult guidance or peer collaboration exceeds what can be attained alone.

Vygotsky's theory was an attempt to explain consciousness as the end product of socialization. For example, in the learning of language, our first utterances with peers or adults are for the purpose of communication but once mastered they become internalized and allow "inner speech".

Vygotsky's theory is complementary to Bandura's work on social learning and a key component of situated learning theory as well. Because Vygotsky's focus was on cognitive development, it is interesting to compare his views with those of a constructivist (Bruner) and a genetic epistemologist (Piaget).

 

Term

Social Learning Theory

Definition

The social learning theory of Bandura emphasizes the importance of observing and modeling the behaviors, attitudes, and emotional reactions of others. Bandura (1977) states: "Learning would be exceedingly laborious, not to mention hazardous, if people had to rely solely on the effects of their own actions to inform them what to do. Fortunately, most human behavior is learned observationally through modeling: from observing others one forms an idea of how new behaviors are performed, and on later occasions this coded information serves as a guide for action." (p. 22). Social learning theory explains human behavior in terms of continuous reciprocal interaction between cognitive, behavioral, and environmental influences. The component processes underlying observational learning are: (1) Attention, including modeled events (distinctiveness, affective valence, complexity, prevalence, functional value) and observer characteristics (sensory capacities, arousal level, perceptual set, past reinforcement); (2) Retention, including symbolic coding, cognitive organization, symbolic rehearsal, and motor rehearsal; (3) Motor Reproduction, including physical capabilities, self-observation of reproduction, and accuracy of feedback; and (4) Motivation, including external, vicarious, and self-reinforcement.

Because it encompasses attention, memory and motivation, social learning theory spans both cognitive and behavioral frameworks. Bandura's theory improves upon the strictly behavioral interpretation of modeling provided by Miller & Dollard (1941).  Bandura’s work is related to the theories of Vygotsky and Lave which also emphasize the central role of social learning.

Term

Stimulus Sampling Theory

Definition

Stimulus sampling theory (SST), first proposed by Estes in 1950, was an attempt to develop a statistical explanation for learning phenomena. The theory suggested that a particular stimulus-response association is learned on a single trial; however, the overall learning process is a continuous one consisting of the accumulation of discrete S-R pairings. On any given learning trial, a number of different responses can be made but only those that are effective (i.e., rewarded) form associations. Thus, learned responses are a sample of all possible stimulus elements experienced. Variations (random or systematic) in stimulus elements are due to environmental factors or changes in the organism.

A key feature of SST was the probability of a certain stimulus occurring in any trial and of being paired with a given response. SST resulted in many forms of mathematical models, principally linear equations, that predicted learning curves. Indeed, SST was able to account for a wide variety of learning paradigms including: free recall, paired-associates, stimulus generalization, concept identification, preferential choice, and operant conditioning. SST also formed the basis for mathematical models of memory (e.g., Norman, 1970) and instruction (e.g., Atkinson).
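One of the simplest such models is the linear-operator equation associated with this tradition, in which a fixed proportion θ of the unconditioned stimulus elements becomes conditioned on each reinforced trial. A sketch, with illustrative parameter values:

```python
def learning_curve(p1, theta, trials):
    """Linear-operator model: p(n+1) = p(n) + theta * (1 - p(n)).
    The probability of a correct response rises toward 1.0 in a
    negatively accelerated curve."""
    probs = [p1]
    for _ in range(trials - 1):
        probs.append(probs[-1] + theta * (1 - probs[-1]))
    return probs

curve = learning_curve(p1=0.1, theta=0.2, trials=10)
# Agrees with the closed form p(n) = 1 - (1 - p1) * (1 - theta)**(n - 1).
```

The negatively accelerated shape, i.e. rapid early gains that taper off, is the classic learning curve these linear equations were built to predict.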

Term

Structural Learning Theory

Definition

According to structural learning theory, what is learned are rules which consist of a domain, range, and procedure. There may be alternative rule sets for any given class of tasks. Problem solving may be facilitated when higher order rules are used, i.e., rules that generate new rules. Higher order rules account for creative behavior (unanticipated outcomes) as well as the ability to solve complex problems by making it possible to generate (learn) new rules.

Unlike information processing theories which often assume more complex control mechanisms and production rules, structural learning theory postulates a single, goal-switching control mechanism with minimal assumptions about the processor and allows more complex rule structures. Structural learning theory also assumes that "working memory" holds both rules and data (i.e., rules which do not act on other rules); the memory load associated with a task depends upon the rule(s) used for the task at hand.

Structural analysis is a methodology for identifying the rules to be learned for a given topic or class of tasks and breaking them down into their atomic components. The major steps in structural analysis are: (1) select a representative sample of problems, (2) identify a solution rule for each problem, (3) convert each solution rule into a higher order problem whose solution is that rule, (4) identify a higher order solution rule for solving the new problems, (5) eliminate redundant solution rules from the rule set (i.e., those which can be derived from other rules), and (6) notice that steps 3 and 4 are essentially the same as steps 1 and 2, and continue the process iteratively with each newly-identified set of solution rules. The result of repeatedly identifying higher order rules, and eliminating redundant rules, is a succession of rule sets, each consisting of rules which are simpler individually but collectively more powerful than the ones before.

Structural learning prescribes teaching the simplest solution path for a problem and then teaching more complex paths until the entire rule has been mastered. The theory proposes that we should teach as many higher-order rules as possible as replacements for lower order rules. The theory also suggests a strategy for individualizing instruction by analyzing which rules a student has/has not mastered and teaching only the rules, or portions thereof, that have not been mastered.

Term

Structure of Intellect

Definition

In Guilford's Structure of Intellect (SI) theory, intelligence is viewed as comprising operations, contents, and products. There are 5 kinds of operations (cognition, memory, divergent production, convergent production, evaluation), 6 kinds of products (units, classes, relations, systems, transformations, and implications), and 5 kinds of contents (visual, auditory, symbolic, semantic, behavioral). Since each of these dimensions is independent, there are theoretically 150 different components of intelligence.
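Because the three dimensions are independent, the count of hypothesized abilities is a simple Cartesian product, which can be checked directly:

```python
from itertools import product

operations = ["cognition", "memory", "divergent production",
              "convergent production", "evaluation"]
contents = ["visual", "auditory", "symbolic", "semantic", "behavioral"]
products_ = ["units", "classes", "relations", "systems",
             "transformations", "implications"]

# Each (operation, content, product) triple is one distinct
# hypothesized ability in SI theory.
abilities = list(product(operations, contents, products_))
print(len(abilities))  # 5 * 5 * 6 = 150
```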

Guilford researched and developed a wide variety of psychometric tests to measure the specific abilities predicted by SI theory. These tests provide an operational definition of the many abilities proposed by the theory. Furthermore, factor analysis was used to determine which tests appeared to measure the same or different abilities.

Parenthetically, it is interesting to note that a major impetus for Guilford's theory was his interest in creativity (Guilford, 1950). The divergent production operation identifies a number of different types of creative abilities.

Term

Subsumption Theory

Definition

Ausubel's theory is concerned with how individuals learn large amounts of meaningful material from verbal/textual presentations in a school setting (in contrast to theories developed in the context of laboratory experiments). According to Ausubel, learning is based upon the kinds of superordinate, representational, and combinatorial processes that occur during the reception of information. A primary process in learning is subsumption in which new material is related to relevant ideas in the existing cognitive structure on a substantive, non-verbatim basis. Cognitive structures represent the residue of all learning experiences; forgetting occurs because certain details get integrated and lose their individual identity.

A major instructional mechanism proposed by Ausubel is the use of advance organizers:

"These organizers are introduced in advance of learning itself, and are also presented at a higher level of abstraction, generality, and inclusiveness; and since the substantive content of a given organizer or series of organizers is selected on the basis of its suitability for explaining, integrating, and interrelating the material they precede, this strategy simultaneously satisfies the substantive as well as the programming criteria for enhancing the organization strength of cognitive structure." (Ausubel, 1963, p. 81)

Ausubel emphasizes that advance organizers are different from overviews and summaries which simply emphasize key ideas and are presented at the same level of abstraction and generality as the rest of the material. Organizers act as a subsuming bridge between new learning material and existing related ideas.

Ausubel's theory has commonalities with Gestalt theories and those that involve schema (e.g., Bartlett) as a central principle. There are also similarities with Bruner's "spiral learning" model, although Ausubel emphasizes that subsumption involves reorganization of existing cognitive structures, not the development of new structures as constructivist theories suggest. Ausubel was apparently influenced by the work of Piaget on cognitive development.

Term

Symbol Systems

Definition

The symbol systems theory developed by Salomon is intended to explain the effects of media on learning. Salomon (1977) states: "To summarize, the symbol systems of media affect the acquisition of knowledge in a number of ways. First, they highlight different aspects of content. Second, they vary with respect to ease of recoding. Third, specific coding elements can save the learner from difficult mental elaborations by overtly supplanting or short-circuiting specific elaboration. Fourth, symbol systems differ with respect to how much processing they demand or allow. Fifth, symbol systems differ with respect to the kinds of mental processes they call on for recoding and elaboration. Thus, symbol systems partly determine who will acquire how much knowledge from what kinds of messages." (pp. 226-227)

According to Salomon, each medium is capable of conveying content via certain inherent symbol systems. For example, Salomon suggests that television requires less mental processing than reading and that the meanings secured from viewing television tend to be less elaborate than those secured from reading (i.e., different levels of processing are involved). However, the meaning extracted from a given medium depends upon the learner. Thus, a person may acquire information about a familiar subject equally well from different media but be significantly influenced by the medium when the information is novel.

Salomon (1981) focuses on the reciprocal nature of instructional communications, the instructional setting, and the learner. Salomon argues that schema play a major role in determining how messages are perceived -- in terms of creating an anticipatory bias that influences what information is selected and how it is interpreted. Furthermore, media create new schema which affect subsequent cognitive processing.

 

Symbol systems theory is closely related to aptitude-treatment interaction research and Gardner's theory of multiple intelligences.

Term

Triarchic Theory

Definition

The triarchic theory of intelligence consists of three subtheories: (i) the componential subtheory, which outlines the structures and mechanisms that underlie intelligent behavior, categorized as metacognitive, performance, or knowledge acquisition components; (ii) the experiential subtheory, which proposes that intelligent behavior be interpreted along a continuum of experience from novel to highly familiar tasks/situations; and (iii) the contextual subtheory, which specifies that intelligent behavior is defined by the sociocultural context in which it takes place and involves adaptation to the environment, selection of better environments, and shaping of the present environment.


According to Sternberg, a complete explanation of intelligence entails the interaction of these three subtheories. The componential subtheory specifies the potential set of mental processes that underlies behavior (i.e., how the behavior is generated) while the contextual subtheory relates intelligence to the external world in terms of what behaviors are intelligent and where. The experiential subtheory addresses the relationship between the behavior in a given task/situation and the amount of experience of the individual in that task/situation.

The componential subtheory is the most developed aspect of the triarchic theory and is based upon Sternberg (1977), which presents an information processing perspective for abilities. Among the most fundamental components according to Sternberg's research are the metacognitive or "executive" processes that control the strategies and tactics used in intelligent behavior.

Term

Transformative Learning

Definition

The Transformational Learning Theory originally developed by Jack Mezirow is described as being "constructivist, an orientation which holds that the way learners interpret and reinterpret their sense experience is central to making meaning and hence learning" (Mezirow, 1991). The theory has two basic kinds of learning: instrumental and communicative learning. Instrumental learning focuses on learning through task-oriented problem solving and determination of cause and effect relationships. Communicative learning involves how individuals communicate their feelings, needs and desires.

Meaning structures (perspectives and schemes) are a major component of the theory. Meaning perspectives are defined as "broad sets of predispositions resulting from psychocultural assumptions which determine the horizons of our expectations" (Mezirow, 1991). They are divided into 3 sets of codes: sociolinguistic codes, psychological codes, and epistemic codes. A meaning scheme is "the constellation of concept, belief, judgment, and feelings which shapes a particular interpretation" (Mezirow, 1994, p. 223).

Meaning structures are understood and developed through reflection. Mezirow states that "reflection involves a critique of assumptions to determine whether the belief, often acquired through cultural assimilation in childhood, remains functional for us as adults" (Mezirow, 1991). Reflection is similar to problem solving and Mezirow talks about how we "reflect on the content of the problem, the process of problem-solving, or the premise of the problem" (Mezirow, 1991). Through this reflection we are able to understand ourselves more and then understand our learning better. Mezirow also proposed that there are four ways of learning. They are "by refining or elaborating our meaning schemes, learning new meaning schemes, transforming meaning schemes, and transforming meaning perspectives" (Mezirow, 1991).

Mezirow's original theory has been elaborated upon by others, most notably Cranton (1994; 1997) and Boyd (1991). The theory has commonalities with other theories of adult learning such as andragogy (Knowles), experiential learning (Rogers), and the CAL model (Cross).
