https://fltech-my.sharepoint.com/:o:/g/personal/alejeune2021_fit_edu/Elr8JXkVUExAq_-jVKBvln44Bj5HkzaaT26w7EuH8Rd__Ig?e=Z8vJx
Chapter 1
A. Introduction to Science
Definition of Science
“Science is a systematic approach to understanding natural phenomena -- as
evidenced by description, prediction, and control -- that relies on determinism
as its fundamental assumption, empiricism as its prime directive,
experimentation as its basic strategy, replication as its necessary
requirement for believability, parsimony as its conservative value, and
philosophic doubt as its guiding conscience.” (Cooper, Heron, & Heward,
2020, p. 7)
Goals of Science:
- Description
- Prediction
- Control
Scientists' Behavior
- Empiricism: Observe and record the phenomenon of interest
- Experimentation: The basic strategy; manipulate something and see its effect
on the phenomenon of interest
- Replication: Repeat the experiment to ensure the results are believable
Basic Assumptions of Science
- Parsimony: all simple, logical explanations should be ruled out before more
complex or abstract explanations are considered.
- Philosophic doubt: continuously question and doubt the information, rules,
and facts that have been found previously
- Determinism: the universe is a lawful place; phenomena occur as a result of
other events in a systematic way
Introduction to Behavior Analysis
2. Define behavior analysis: the natural science approach to studying
the effects of environmental variables on behaviour (the science of
behaviour)
Two appropriate uses: 1) the scientific study of functional relations between
behavior and environmental events; 2) technological applications
Behavior analysts: 1) researchers study the best ways to teach individuals; 2)
practitioners teach individuals skills to help them lead better lives
Behavior Analysis and Learning
2. Define learning: a relatively permanent change in behavior as a result of
experience
Experience: organism’s interaction with the environment
Behavior, Responses, and Response Classes
Behavior
a. Define: the interaction between an organism and its
environment
b. Identify the critical attributes
1. Biological in nature
2. Involves action
3. Involves interaction between the organism and the
environment (behaviour is not part of the organism)
c. Identify examples and non-examples
1. Ex: thinking
2. Non-ex: a state of being (e.g., being hungry)
d. List two types: private & public behaviour
Behaver
a. Define: the individual who is behaving; the organism whose
behaviour is observed
b. Identify examples and non-examples
Non-ex: company; inanimate objects
Public behavior
a. Define: behavior that can be observed by others, even if it is not
directly being observed in the moment
b. Identify examples and non-examples
Private behavior
a. Define: behaviour that can only be observed by the organism
engaging in the behavior (can’t be observed by others even when
done in public)
b. Identify examples and non-examples
Eg: thinking, imagining, perception of pain, feelings
Response
a. Define: a single instance of behavior
b. Identify examples and non-examples
7. Topography:
a. Define: physical nature of responses; the exact form of the
response
b. Identify examples and non-examples
Function
a. Define: the effect of a response on the environment
b. Identify examples and non-examples
Response class
a. Define: a collection of two or more topographically different
responses that all have the same effect on the environment
Some Dimensions of Behavior
2. duration: the amount of time between the beginning and the end of the
response
3. rate: the number of responses over a period of time
The Environment
Environment
a. Define: events, stimuli, and conditions that can affect behavior
b. Identify examples and non-examples both outside and within the skin
1. Outside the skin: e.g., a traffic light changing color
2. Within the skin: inflamed tooth, full bladder (environmental events
within the skin may be public or private)
Public environmental event
a. Define: environmental events that can be observed by others
b. Identify examples and non-examples
Eg. liquid filling up the bladder
Private environmental event
a. Define: environmental events that can only be observed by the
organism itself
b. Identify examples and non-examples
Eg. painful pressure on the bladder
Stimulus
a. Define: a change in the environment, at a moment in time, that is
sensed by the organism and affects its behaviour
b. Identify examples and non-examples
Chapter 2
Main Variables in Behavior Analysis
· Independent variable: variable that is manipulated
o Stimulus class
· Dependent variable: variable that is studied to see the effect of IV
o Response class
Stimulus Class
· Group of stimuli that share one or more common formal, temporal, or
functional characteristics:
o Formal
● Labeling based on specific critical physical properties
● Can include size, color, and intensity
o Temporal location
● Stimuli and other types of environmental variables occur either
before (antecedents) or after (consequences) a response
● Classifying starts by grouping stimuli with respect to their temporal
location in relation to a response
o Functional characteristics
● Effect on behaviour
● Immediate effects: on the behavior of interest here and now
(antecedents); evoke or abate
● Future effects: on future occurrences of the behavior of
interest (consequences); strengthen or weaken
● Examples of stimulus classes: pens, chairs (common formal
characteristics)
Antecedents
· Environmental events that occur before a response
o External environment or within skin
o Behavior of another person (social) or not (nonsocial)
● Antecedent example (external):
○ Light changes from green to red right before (antecedent)
→ you press the brake (response)
● Antecedent example (internal):
○ Inflamed tooth begins to hurt (antecedent) → you reach
for pain relieving medication (response)
· Two effects of antecedents:
1) Evocative effect- antecedent event increases the momentary frequency
of a member of a response class
2) Abative effect- antecedent event decreases the momentary frequency
of a member of a response class
● Example:
○ Traffic light changing from green to red
■ Red light evokes pressing the brake
■ Red light abates pressing the accelerator
● These effects are in the here and now
● NOTE: a person's own behavior cannot be an antecedent; behavior that
precedes a response is instead called a precursor
Consequences
· Environmental event that follows or occurs after a response
● External environment or within the skin
● Can involve the behavior of another person (social) or not
(non-social)
● The effects of responses are consequences
● Example:
○ I pull on a door handle (response) → door opens
(consequence)
● Consequences outside the skin:
○ Butch opens his umbrella (response) → rain stops falling
on his head (consequence)
● Consequences within the skin:
○ Zach applies pain reliever ointment to his back
(response) → pain subsides (consequence)
· NOTE: Consequences are stimuli; they are neither inherently good nor bad
· May vary across physical properties but have the same effect on behavior
o Same temporal location and function, different forms
Latency
● The amount of time between an antecedent stimulus and a response
Consequence stimulus class:
● I turn in a report → T says, “Great!”
● I turn in a report → T pats my shoulder
● I turn in a report → T smiles
Two effects of consequences on behavior:
· Consequences affect FUTURE occurrences of behavior
o Strengthening effect-
§ Increase future probability of the behavior that the
consequence followed
● Example: Leslie gets a sticker after she finishes her
homework. Leslie finishes her homework more often in
the future.
o Weakening effect:
§ Decrease the future probability of the behavior that the
consequence followed
● Example: Jordi was late to an important meeting and his
boss gave him a written warning. Jordi is no longer late to
any meetings.
Determinants of Behavior
· Environmental (nurture) and biological (nature) factors that
influence the probability of behavior
● In lay terms, the causes of behavior
Determinism:
Scientific assumption that phenomena are lawful and occur as a result
of
other events in a systematic way
Nature vs Nurture
● Nature- biology
● Nurture- environmental experience
Two determinants of behavior
1. Biological factors (Nature) (Phylogenic: without prior learning)
2. Environmental factors (Nurture) (Ontogenic: learned)
a. Current environment (antecedents)
b. Previous experience with the environment (the most
crucial variable: consequences)
· Biological determinants
o Genetics and DNA
§ Species-specific genetic factors shared by all members of a
species
● Ability to learn certain behavior in certain ways
● Examples: imprinting in ducks, eye blink reflex in humans
§ Individual genetic makeup specific to an organism within the
species
§ Genetic mutations (e.g., a person born with Prader-Willi
syndrome)
o Non-inherited organic determinants examples
§ Brain injury, trauma, physical disease, hormonal changes
· Antecedents as Determinants
● Evocative and abative effect of antecedents on behavior are
current determinants of behavior
● Remember that this is only temporary
· Consequences as Determinants
● Most crucial variable in an organism’s experiential and learning
history
Selectionism
Selection: the process in which repeated cycles of variation and interaction
with
the environment result in differential replication as a function of the interaction
Selectionism: scientific assumption that attributes genetic and behavioural
diversity to selection
● Three types of selectionism:
○ Natural = variation in genes
○ Operant = variation in the behavior of an individual
○ Cultural = variation in the behavior of a group
Environmental Contingency
· Contingency: a dependent relationship between events, the likelihood
that
one event is the result of the other
● If A happens then B happens
● Example of a contingency:
○ Thunder never occurs unless lightning occurs first
○ If I press the play button on my remote (R), then the
movie starts to play (S)
· Contingencies always END with a stimulus or other environmental event (S-S, R-S, S-R-S)
3 Types of Contingencies
o S-S contingencies (pairing two stimuli)
o R-S contingencies (a response and a consequence)
o S-R-S contingencies (3-term contingency) (ABC: antecedent,
behavior, consequence).
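As a study aid, here is a minimal sketch (not from the course materials; the class and field names are illustrative) that represents the three-term (S-R-S / A-B-C) contingency as a simple record, using the traffic-light antecedent example from above. The consequence shown ("car slows to a stop") is an assumed detail added only for illustration.

```python
from dataclasses import dataclass

# Hedged sketch: the S-R-S (A-B-C) contingency as a simple record.
# Names are illustrative, not from the course materials.

@dataclass
class ThreeTermContingency:
    antecedent: str   # S: environmental event that occurs before the response
    behavior: str     # R: the response of interest
    consequence: str  # S: environmental event that follows the response

# Example built from the traffic-light antecedent above; the consequence is
# an assumed detail added for illustration.
example = ThreeTermContingency(
    antecedent="light changes from green to red",
    behavior="press the brake",
    consequence="car slows to a stop",
)
print(example)
```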
Functional relation
- A mathematical relation between an independent variable and a dependent
variable
- In behavior analysis, functional relations are between:
- Stimulus classes (IV) and response classes (DV)
Parsimony
- Scientific assumption that the simplest answer should be ruled out before
more complex explanations are considered
- Example: an apple falls from a tree. Abby says it fell due to gravity.
Environmental explanations of behavior
a. Observable and measurable explanations for behaviour that are
based on current environment and past experience with
environmental contingencies
● Example: Petra went to the mall and picked out a new outfit for
school because in the past, she received compliments from
friends when she wore a new outfit.
Explanatory fiction
a. Define: fictitious or hypothetical variables that exist within the
individual, or future events that are credited as the source/cause of
behaviour
B. Two types of explanatory fictions
- Mentalistic explanations: rely on hypothetical variables
- Example: he hits because he is “angry.” How do you
know he is angry? Because he hits
- Teleological explanations: future events are used to explain
current behavior
- Example: Phil goes to the gym for an hour every day so
he can lose 20 pounds by summertime
Chapter 3
A. Brief Review: Determinants of
Behavior
B. Unlearned Respondent Relations - S-R relation
1. Elicit
a. to strongly, consistently, and reliably cause to happen
b. Identify correct usage of the term
- Only used for respondent relations
- Describes the strongest causal link
2. Respondent behavior
a. Responses that are elicited by an antecedent stimulus
b. Identify examples and non-examples
3. Reflex
a. A simple relation between an antecedent stimulus and a reflex
response
- Respondents are reflex behaviours only
b. Identify examples and non-examples
c. Identify the type of relation
- Unconditioned reflex: Unlearned reflexes, involuntary,
unconditioned, phylogenetic
- Conditioned reflex: Learned reflexes, conditioned, ontogenetic
4. Unconditioned reflex
a. Define: a simple relation between
- A specific antecedent stimulus (US), and
- A specific innate, involuntary physiological response (UR)
b. Identify the critical attributes
- Innate: built into the physiology of the organism
- Stereotypic: highly invariant
- Involuntary: not under conscious control (mediated by
autonomic nervous system, not by cortex)
- Exhibited by every member of a specific species
- Highly invariant in terms of when a particular reflex
appears/disappears
c. Identify examples and non-examples
5. Unconditioned stimulus
a. Define: antecedent stimulus that has an innate capacity to elicit a
reflex response
b. Identify its role in an unconditioned reflex
- The stimulus part of a reflex
- Behaviour altering effect: elicits the reflex response without prior
learning
c. Identify examples and non-examples
d. Identify the name referring to its eliciting function
- Unconditioned elicitor
6. Unconditioned response
a. The response that is elicited by an unconditioned stimulus without
prior learning (due to phylogenic provenance)
b. The response part in an unconditioned reflex
c. Identify examples and non-examples
C. Respondent Conditioning
7. Identify the major contributions of Ivan Pavlov
- Introduced quantitative measurement to Psychology
- Conducted the first study of the effects of systematic manipulation of
environmental events on behavior
- Discovered respondent conditioning and many of its principles
8. Respondent conditioning
a. The process of pairing a neutral stimulus with either an
unconditioned or conditioned stimulus that results in a new reflex
relation
- Conditioning must start with an unconditioned reflex
b. Identify examples and non-examples
9. Define S-S pairing in respondent conditioning
- The contingent presentation of two stimuli at (or very nearly at) the
same time
10. Neutral stimulus
a. Define: a stimulus that has no initial eliciting effect on the reflex
response
b. Identify its role in respondent conditioning
- Can be contingently paired with an US (or another CS) to be
conditioned to elicit a reflex response
- Is transformed into a CS after conditioning
11. Conditioned stimulus
a. Define: a stimulus that elicits the reflex response due to prior
conditioning (learning)
b. Identify its role in respondent conditioning
- The stimulus part of a conditioned reflex
- Elicits CR
c. Identify the name referring to its eliciting function
- Conditioned elicitor
12. Conditioned response
a. Define: a response elicited by a CS
- Is a learned behavior
- Has the same topography as the UR, but is now elicited by a new
stimulus
b. Identify its role in respondent conditioning
- The response part of conditioned reflex
13. Conditioned reflex
a. Define: a NEW relation between a once neutral stimulus and a
conditioned involuntary response after contingent pairing of that
neutral stimulus with an unconditioned stimulus
- Function-altering effect:
- The change of a neutral stimulus into a conditioned
stimulus
14. Temporal contiguity
a. Define: the nearness of events in time
- In pairing, contiguity refers to the amount of time between
the first stimulus and the second
15. S-S contingency: a contingency in which one stimulus is dependent upon
another stimulus
a. contingency: dependency between events
- Strongest type: if and only if X, then Y
- Contingency vs contiguity
- contiguiTy think Time
- contingENCY think dependENCY
16. Five respondent conditioning procedures
a. Define
16.1 short delay
- CS occurs briefly prior to the US onset, overlaps with US, and
terminates before the US offset
- Usually very effective
16.2 long delay
- CS occurs prior to the US onset, overlaps with US, and
terminates before the US offset
- Usually effective
16.3 Trace
- CS begins and ends prior to US onset with no overlap between
the two
- Sometimes effective
16.4 simultaneous
- CS and US onset and offset occur at the same time with
complete overlap of the two stimuli
- Usually ineffective
16.5 backward conditioning
- US occurs prior to the CS onset, overlaps with CS, and
terminates before the CS offset
- Almost always ineffective
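To make the five timing arrangements above easier to compare, here is a hedged sketch (not from the course materials) that classifies a single pairing trial from the CS and US onset/offset times; the 0.5-second cutoff separating short from long delay is an assumed value chosen only for illustration.

```python
# Hedged sketch: classifying a pairing trial by CS/US timing, following the
# five definitions above. All times are in seconds from the start of the trial.
# The short-vs-long delay cutoff is an assumed value, not a standard.

def classify_pairing(cs_on, cs_off, us_on, us_off, short_delay_cutoff=0.5):
    if us_on < cs_on:
        return "backward"       # US begins before the CS
    if cs_on == us_on and cs_off == us_off:
        return "simultaneous"   # complete overlap of the two stimuli
    if cs_off <= us_on:
        return "trace"          # CS ends before US onset, no overlap
    # CS begins before the US and overlaps it: a delay procedure
    return "short delay" if (us_on - cs_on) <= short_delay_cutoff else "long delay"

# A CS that starts 0.3 s before the US and overlaps it -> "short delay"
print(classify_pairing(cs_on=0.0, cs_off=2.0, us_on=0.3, us_off=3.0))
```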
17. Respondent extinction
a. Define: the process through which a conditioned reflex is
weakened by no longer pairing the CS with US
b. For this to occur, there must first be learning
18. respondent spontaneous recovery
- Sudden reappearance of a previously extinguished conditioned reflex
- Occurs as a function of the passage of time during which the CS is not
presented, and/or the organism is in a different environment
19. Respondent stimulus generalization
a. Define: the spread of effects of respondent conditioning to stimuli
other than the conditioned stimulus
b. You are most likely to see this if the topography is the same
● Ex: Pavlov's dog salivates to a bell but might also salivate to an
iPhone ringer
20. Define higher-order conditioning
- The process of conditioning in which a neutral stimulus is paired with a
previously conditioned stimulus (CS), rather than with a US
Variables associated with effectiveness of respondent conditioning
1. Number of paired trials
2. CS-US contingency
3. CS-US contiguity
4. Inter-Trial Intervals (at least 20-30s)
5. Specific features
6. Preparedness
7. Past experience: previous experience with NS may inhibit conditioning
D. Other Related Unlearned
Environment-Behavior Relationships
21. Habituation
- A temporary reduction in a reflex response due to repeated
presentations of eliciting stimulus within a short period of time
22. Potentiation
- A temporary increase in some dimension or intensity of a reflex
response due to repeated presentations of eliciting stimulus within a
short period of time
- Most likely produced by unpleasant stimuli
23. Sensitization
- The tendency of a stimulus to elicit a reflex response following the
elicitation of that response by a different stimulus
24. Adaptation
- A reduction in the frequency or magnitude of a response, or a set of
responses, as a result of prolonged exposure to a stimulus or an
environmental context
E. The Importance of Respondent
Relations and Learned Emotional
Responses
25. Identify the main contributions of John Watson
- Father of Behaviorism
- Known for his behaviorist approach, which he later applied to
human behaviour
- His paper “Psychology as the Behaviorist Views It” was a landmark in
the founding of behaviorism
- S-R behaviorism
- Discovered that emotional reactions could be learned
- Proposed ways to counter-condition phobias and other irrational
emotions
Chapter 4
Respondent: A class of responses defined in terms of the stimuli that elicit them
Operant Learning (R-S, S-R-S, etc.)
-always includes a consequence contingent on behavior
S-R-S = ABC
A. A Brief History of the Operant
Paradigm
The Law of Effect (Thorndike)
-organisms learn through the consequences of their actions
the work of Edward Thorndike and his main contributions
- Further quantified animal experimentation
- Discovered the law of effect and the weak law of effect
- Led to Skinner’s work on operant selection
the work of B.F. Skinner and his main contributions
- Developed a scientific method for the study of behaviour (experimental
analysis of behavior)
- Discovered many principles of operant selection
- Radical behaviorism
- Analysis of Verbal Behaviour
- His work led to the development of ABA (Science and Human
Behavior)
B. Operant Behavior and Selection
Operant behavior:
-behavior that has an effect on the environment and is primarily under
the control of its consequences
-usually brought under control by a combination of antecedents
and consequences(never just antecedents)
OPERANT VS RESPONDENT
Operant: See pie > Take a bite | Respondent: See pie > Salivate
Operant: Irritation in throat > Gargle | Respondent: Irritation in throat > Cough
Operant: Crash into tree > Call police | Respondent: Crash into tree > Activation syndrome
Consequence
-an environmental event that follows a response
Selection
-the process in which repeated cycles of ... occur
-variation
-interaction with the environment
-differential replication as a function of the interaction
Operant Selection
-process of repeated cycles of behavioral variability and modification
of behavior by the environment over the course of an individual’s
lifetime
C. Contingencies of Reinforcement and
Punishment
Selection by Consequences
-selection of the behavior of the individual organism by its
consequences
-many consequences occur naturally
-some may be planned
Consequation
-consequation consists of providing a consequence contingent on a
response
Simplest Consequential Operations
-presentation of a stimulus (onset or magnification)
-withdrawal or termination of a stimulus (offset or attenuation)
Two effects of consequences
Strengthening (Reinforcement)
-increases the future probability of occurrence under similar
circumstances (acceleration)
Weakening (Punishment)
-decreases the future probability of occurrence under similar
Circumstances (deceleration)
Reinforcement
-an environment change that follows a response in time, is contingent
on that response and increases the probability of similar responses
under similar circumstances
Examples:
-praise contingent on compliance increases the future
frequency of compliance with requests under similar
circumstances
-termination of nagging contingent on compliance increases the
future frequency of compliance with requests under similar
circumstances
Punishment
-an environmental change that follows a response closely in time, is
contingent on that response, and decreases the probability of similar
responses under similar circumstances
Examples:
-ridicule contingent on compliance with requests decreases the
future frequency of compliance under similar circumstances
-termination of parental attention contingent on compliance
decreases the future frequency of compliance under similar
circumstances
Critical Attributes of Reinforcement and Punishment
-follows a response
-has close temporal contiguity (nearness)- essential factor
-consequence is contingent on the response
-defined by its effect on behavior
-has function-altering effects on antecedent stimulus
Define automaticity
- Effects of reinforcement and punishment occur automatically;
- A person or any other organism does not need to be able to describe or be
aware of the contingency to have their behaviour changed by it
D. Appetitive Versus Aversive Stimuli
Define appetitive stimulus
- A stimulus is labeled as appetitive if it functions (based on its effects on
behaviour):
- As reinforcement when presented following behaviour
- To abate behaviour that has terminated it in the past
- As punishment when withdrawn following behavior
Define aversive stimulus
- A stimulus is labeled as aversive if it functions (based on its effects on
behaviour):
- As punishment when presented following behaviour
- To evoke behaviour that has terminated it in the past
- As reinforcement when withdrawn following behavior
E. Some Variable Attributes of
Reinforcement and Punishment
- Positive VS Negative
- Unconditioned VS Conditioned
- Automatic VS Socially mediated
- Planned VS Unplanned
Positive reinforcement
● An environmental change in which an appetitive stimulus is ADDED
(presented) or magnified following a response, contingent on that response,
that INCREASES in probability of similar responses under similar
circumstances
● Bart is the behaver
Antecedent: Mom busy on computer | Behavior: Bart tries to talk to mom | Consequence: Mom says “Go away. I’m busy.”
Antecedent: Mom still on computer | Behavior: Bart breaks glasses | Consequence: Mom runs over and scolds Bart
○ In the future when Bart tries to talk to mom while she is busy, he breaks
the glasses.
● ASR: Joe is the behaver. Joe tells a joke while Sue is talking to Scott. She
turns around, laughs, and says, “That was funny, Joe!” Joe tells jokes more
often in the future. What is the positive in this example of positive
reinforcement?
○ A. Sue talking to Scott
○ B. Joe liking Sue
○ C. Joe telling jokes more often in the future
○ D. Sue laughing and commenting on Joe’s jokes
○ ANSWER: D
● Non-Example: Begins hailing--open umbrella--hail removed
● Ex: My son wants to play outside, so he asks, “Mommy can you play outside
with me?”
○ Example of positive reinforcement: Mom then goes and plays with him.
Define reinforcer
● A stimulus is “an energy change that affects an organism through its
receptors.”
● Reinforcers may be visual stimuli, olfactory stimuli, auditory stimuli, tactile
stimuli, etc
● Reinforcers are NOT reinforcers independent of their effects on behavior;
“bad” things can be reinforcers too
Negative reinforcement
● An environmental change in which an aversive stimulus is SUBTRACTED
(withdrawn, removed; terminated) or ATTENUATED following a response,
contingent on that response, that INCREASES the probability of similar
responses under similar circumstances.
○ For negative reinforcement to occur there MUST be an irritant or
aversive antecedent (reinforcement by relief)
Antecedent aversive: Shock on | Target response: Lever press | Negative reinforcement: Shock removed (offset)
Antecedent aversive: Mosquito bite | Target response: Scratch | Negative reinforcement: Itch removed (attenuated)
Antecedent aversive: Child screams | Target response: Mom gives cookie | Negative reinforcement: Screaming removed (offset)
● ASR: Elo is the behaver. The RBT presents a match-to-sample task to Elo,
and Elo throws the task materials off the table. The RBT ends the task and
picks up the materials off the floor. Elo continues to throw materials off the
table during tasks in the future. What is the antecedent aversive in this
example of negative reinforcement?
○ A. Elo throwing task materials off the table
○ B. The RBT ending the task to clean up
○ C. The RBT presenting the task
○ ANSWER: C
● Non Example: removing a student from the playground contingent on cursing
decreases cursing during recess
ASK:
1. Who is the behaver?
2. What is the behavior?
3. Did an event follow the bx and what was it?
4. Is the consequence added (positive) or removed (negative)?
5. Was there an increase (reinforcement) or decrease (punishment) in
responding?
Escape behavior
● Behavior that terminates an aversive stimulus; and thus, is maintained by
negative reinforcement
○ Aversive already happening and response terminates it
○ Ex. My son is doing his homework and asks for a break and to go outside. I
let my son go outside, and he asks for a break in the future when doing
homework.
Avoidance
● Behavior that terminates a warning stimulus and prevents an aversive
stimulus; and thus, is maintained by negative reinforcement
○ Ex. HW time is 3:00. At 2:55, my son asks for a break to go outside. I
let my son go outside and he didn't come in until dinner. He asks to go
outside at 2:55 in the future.
● ASR: opening an umbrella when rain is hitting your skin. Escape or
avoidance?
○ Escape-already occurring and terminates
● ASR: Putting on sunglasses before going outside in the bright sun. Escape or
avoidance?
○ Avoidance-terminates the warning stimulus
● ASR: Applying cooling aloe to relieve pain from a sunburn from going to the
beach without sunscreen. Escape or avoidance?
○ Escape
● Ex: Your GPS says there is traffic ahead, so you take another route; that is
avoidance. If you forget to turn your GPS on, hit traffic, and then look for
another route, that is escape.
Positive punishment
● An environmental change in which an aversive stimulus is added (presented)
or magnified following a response, contingent on that response, that
decreases the probability of similar responses under similar circumstances.
● ASR: When Mike shares the stuffed animal he brought in for show and tell
with his classmates, they all begin to laugh and make fun of him. Mike never
shares items for show and tell again. What is the “positive” in this example of
positive punishment?
○ A. Mike’s classmates laughing and making jokes
○ B. Mike wanting to impress his classmates
○ C. Mike showing the class the stuffed animal
○ ANSWER: A-addition of aversive stimulus
Punisher
● Positive punisher: aversive stimulus that, when presented contingent on a
response, decreases the future frequency of that behavior
○ Identified by effect on behavior
Negative punishment (SP-)
● An environmental change in which an appetitive stimulus is subtracted
(withdrawn, removed; terminated) or attenuated following a response,
contingent on that response, that decreases the probability of similar
responses under similar circumstances
● Aka punishment by penalty
Antecedent appetitive: Plate of food | Target response: Steals food | Negative punishment: All food removed
Antecedent appetitive: Video game on | Target response: Kicks little brother | Negative punishment: Video game turned off
Antecedent appetitive: Has 100 points | Target response: Threatens to hit peer | Negative punishment: Loses 50 points (response cost)
● ASR: During computer time in the classroom, Rasha kicks the peer next to
her under the table. The teacher takes away the rest of Rasha’s computer
time. Rasha no longer kicks her peers during computer time. What is the
antecedent appetitive in this example of negative punishment?
○ A. Classroom computer time
○ B. Kicking peers stops during computer time
○ C. The teacher taking away computer time
○ ANSWER: A-the pleasant thing removed
● In order for time-out to function as negative punishment, there must be a
time-in antecedent appetitive condition
Unconditioned reinforcer
● A stimulus that, usually, is reinforcing without any prior learning; that is, its
effect is due to phylogenic provenance (genetics)
● Aka primary reinforcers
● Most common: food, water, air, return to normal temperature, and sexual activity
● Unconditioned negative reinforcement: termination of painful stimulation
● ASR: Select all that are likely to be unconditioned reinforcers:
○ A. Money
○ B. Attention
○ C. Breathable air
○ D. Termination of pain
○ E. Laughter
○ F. Food
○ ANSWER: C, D, F
Conditioned reinforcer
● A stimulus that initially has no innate reinforcing properties, but acquires
reinforcing properties through pairing with unconditioned reinforcers or
powerful conditioned reinforcers
● AKA secondary reinforcers
● due to ontogenic provenance
● ASR: Select all that may be conditioned reinforcers:
○ A. Attenuation of an itch
○ B. Getting a $10 tip from customer
○ C. Getting a cup of juice
○ D. Getting a “like” on social media
○ E. Receiving praise from a teacher
○ F. Reduction of pain
○ G. Receiving chips
○ ANSWER: B, D, E
Generalized conditioned reinforcer
● A conditioned reinforcer that has been paired with a variety of other
reinforcers and is effective for a wide range of behaviors.
● Ex. praise, money, tokens, points
Unconditioned punisher
● A stimulus that is punishing without any prior learning due to phylogenic
provenance (genetics)
● Unconditioned positive punishers:
○ Electric shock
○ fire
○ Stimuli that cause pain naturally
● Unconditioned negative punishment:
○ Termination of access to water
Conditioned punisher
● A stimulus that initially has no innate punishing properties, but acquires
punishing properties through pairing with unconditioned punishers or
powerful conditioned punishers
● Due to ontogenic provenance (previous experience)
● Ex. of cond. Pos. punishers: reprimands, stern looks, frowns from another
person
● Ex. of cond Neg. punishers: point fines, loss of access to keys, favorite show
turned off
Generalized conditioned punisher
● A conditioned punisher that has been paired with a variety of other punishers
and is effective for a wide range of behaviors.
● Ex. The word “No”
Two types of automatic consequences (consequence directly produced by response)
● Automatic Reinforcement: type of reinforcement in which response itself
directly produces the reinforcing consequence
● 2 types:
○ 1. Automatic positive reinforcement ex.
■ Sophie’s playing the violin is maintained by the beautiful music
she produces and hears as she plays.
○ 2. Automatic negative reinforcement ex.
■ Ashley sees a nasty cockroach and sprays it with bug spray. In
the future she is more likely to grab the spray when she sees a
cockroach.
● ASR: Linus continues to pick flowers from his garden as a centerpiece for his
table because when he does this, the pleasant scent of flowers fills the air and
he can see colorful flowers on the table. This is an example of:
○ A. Automatic positive reinforcement
○ B. Automatic negative reinforcement
○ C. Socially mediated positive reinforcement
○ D. Socially mediated negative reinforcement
○ ANSWER: A
● ASR: Jerry covers his eyes with his hands when a scary monster pops out on
screen during a movie, so he doesn’t see the image. Jerry now covers his
eyes every time there is something scary on tv. This is an example of:
○ A. Automatic positive reinforcement
○ B. Automatic negative reinforcement
○ C. Socially mediated positive reinforcement
○ D. Socially mediated negative reinforcement
○ ANSWER: B
Two types of socially mediated consequences
● Reinforcement in which the reinforcing consequence is provided by another
person
● 2 Types:
○ 1. Socially mediated positive consequences (added)
■ Ex. Sophie plays the violin. Her playing is maintained by other
people’s clapping and praise
○ 2. Socially mediated negative consequences (taken away)
■ Ex. Ashley sees a nasty cockroach and screams for Allison to
kill it. Ashley’s screaming behavior is maintained by the bug
dying
● ASR: Nellie’s mom gives her a gold sticker to add to her reading chart when
Nellie finishes a book. Nellie continues reading to fill up her chart. This is an
example of:
○ A. Automatic positive reinforcement
○ B. Automatic negative reinforcement
○ C. Socially mediated positive reinforcement
○ D. Socially mediated negative reinforcement
○ ANSWER: C
● ASR: Spencer reads through the entire deck of sight words in under 2 mins.
The therapist removes the task and lets him have a brief break. Now each
time Spencer gets a deck of sight word cards, he always reads through them
in under 2 mins. This is an example of:
○ A. Automatic positive reinforcement
○ B. Automatic negative reinforcement
○ C. Socially mediated positive reinforcement
○ D. Socially mediated negative reinforcement
○ ANSWER: D
Unplanned contingency
● Not explicitly arranged and occurs naturally
● Ex. of unplanned reinforcement: There is an orange tree in the backyard.
Grab an orange and take a bite. The taste of the orange reinforces your
actions
Planned contingency
● Explicitly arranged by a person
● Ex. of planned reinforcement: Casino staff program slot machine payoffs to
produce high rates of gambling
Unit 5
A. Simple Schedules and Parameters
1. Artificial contingency-Explicitly arranged; reinforcement is implemented by someone
Ex: Showering is rewarded with a dollar
Ex: Praise statements follow cleaning up toys
2. Natural contingency-Not explicitly arranged; reinforcement happens on its own
Ex: Showering results in a clean body
Ex: Sipping water makes your tongue wet
ASR: Jodi answers a question correctly and gets a token added to his token
board. This is an example of an: Artificial contingency
ASR: Putting on a sweatshirt when I am cold results in my body temperature
increasing and feeling warmth. This is an example of: Natural contingency
3. Continuous schedule of reinforcement-EVERY correct response is reinforced,
used to establish new behavior
ASR: True/False: If behavior is ever followed by reinforcement, each and every
response emitted will always be followed by reinforcement. (FALSE)
4. Extinction (EXT) schedule-Behavior is NEVER reinforced
5. Intermittent schedules of reinforcement-Some responses are reinforced,
others are not
● Four types of Intermittent schedules:
○ Fixed Ratio
○ Variable Ratio
○ Fixed Interval
○ Variable Interval
ASR: Billy gets to watch a video on his tablet most of the time when he uses
the bathroom independently, but not every time. This is an example of an:
intermittent reinforcement schedule
6. Variable schedule-Schedule criterion changes around an average value
7. Ratio schedule-Reinforcement follows a certain number of responses
8. Fixed ratio (FR)-schedule criterion remains the same
● Ex: A child receives a cookie after every 3 correct responses (FR-3)
9. Variable ratio (VR)-schedule criterion changes around an average value
● Ex: A child receives a cookie after an average of 3 correct responses (VR-3)
Unlike an FR, the rate of responding in a VR is steady. Because one does not
know when reinforcement will occur, he/she continues responding in hopes of
receiving reinforcement soon. VR tends to be the most resistant to extinction.
ASR: Every time Greta completes 5 household chores, she gets a sticker on
her chore chart. This is an FR schedule of reinforcement. Every time Greta
reads 2 book chapters on average, she gets a sticker on her reading chart.
This is an example of a VR schedule of reinforcement.
10.Interval schedule-Reinforcement follows the first response after a period of
time
11.Fixed interval (FI)- Reinforcement follows the first response after a set period
of time.
● Ex: A child receives a cookie for the first homework question he answers
after 20 minutes have passed (FI-20)
12.Variable interval (VI)-Reinforcement follows the first response after an
average period of time
● Ex: Bobby starts his homework at 4:00pm. Bobby’s mom tells him that
he’s doing a good job with his homework at 4:15, 4:25,and 4:45.
Average length of interval: (15+10+20)/3=15
VI-15
ASR: Reinforcement is delivered for the FIRST response after a certain amount
of time has passed since the last delivery of reinforcement. This is an interval
schedule.
ASR: Jim praises Bo for appropriate play and leaves the area. Jim comes back
5 minutes later and praises Bo the first time he sees him playing. Next time,
Jim comes back in 4 minutes and praises Bo if he sees him playing, and then
again in 6 minutes. What is the schedule? Variable interval
13.Interval schedules with a limited hold (LH)-timespan after an interval when a
response must occur to obtain reinforcement
● Ex: A train comes every 20 minutes. The door remains open for 15 seconds. If
you do not enter the train while the door is open, you will miss the train ride.
(FI-20 Minute/LH-15 Seconds)
ASR: A food pellet is delivered if the rat presses a lever within 30
seconds after each 2-minute period has passed since the last lever
press was reinforced. This is an example of: Interval schedule with
limited hold
ASR: True/False: A limited hold may produce a slight increase in rate,
but no change in response pattern (TRUE)
ASR: Schedule that produces very high and steady responding: VR
Schedule that produces a scalloped pattern: FI
Schedule that produces low-to-moderate but steady responses: VI
Schedule that produces high rates with a pause and burst: FR
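As a study aid, here is a minimal sketch (not from the course materials) of the decision rule behind each of the four intermittent schedules above. `count` is the number of responses since the last reinforcer and `elapsed` is the number of seconds since it; the variable-schedule requirements are assumed to be redrawn around the average after each reinforcer.

```python
# Hedged sketch: the rule each intermittent schedule applies to decide whether
# the current response produces reinforcement.
# count = responses since the last reinforcer; elapsed = seconds since it.

def meets_fr(count, n):
    """FR-n: every nth response is reinforced (e.g., the FR-3 cookie example)."""
    return count >= n

def meets_vr(count, required):
    """VR-n: `required` is redrawn around an average of n after each reinforcer."""
    return count >= required

def meets_fi(count, elapsed, t):
    """FI-t: the FIRST response after t seconds is reinforced (e.g., FI-20 min)."""
    return count >= 1 and elapsed >= t

def meets_vi(count, elapsed, required):
    """VI-t: like FI, but `required` is redrawn around an average of t seconds."""
    return count >= 1 and elapsed >= required

# Example: on an FR-3 schedule, the second response is not reinforced and the
# third response is.
print(meets_fr(count=2, n=3), meets_fr(count=3, n=3))  # False True
```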
14.Time-based schedules-Stimulus delivered or removed based on time,
independent of responding
15.Fixed time (FT)- Stimulus delivered or removed after a set period of time,
independent of responding
Ex: Mom tells Nora that she has 20 minutes to finish her dinner, and clears
the plate 20 minutes later, regardless of whether Nora ate all of her dinner.
16.Variable time (VT)-Stimulus delivered or removed after varying period of time,
independent of responding
Ex: A new mom checks on her newborn baby after an average of 10 minutes,
regardless of whether the baby has made a noise.
ASR: True/False: In time-based schedules, reinforcement delivery is
contingent on the occurrence of a response. (FALSE)
ASR: The RBT gives tickles to Janette about every 5 minutes.
Sometimes she gives tickles after 4 minutes, and sometimes after 6
minutes. Which schedule?
Time-based schedule
ASR: Abigail gets attention from her mom the first time she calls her
name after an average of 10 minutes have passed since the last time she
received attention. Which schedule? Variable interval schedule
17.Four parameters of reinforcement-
● Rate (Schedules) of reinforcement
● Magnitude (Quantity, duration, intensity)
Ex: How many/how big, how long, how loud
● Latency (Immediacy/temporal contiguity)
Ex: Teacher attention follows shouting by one second vs follows
hand-raising by 10 seconds
● Quality (preference)
Ex: One tv show vs another, one brand of chip vs another
ASR: Manipulating parameters of reinforcement can produce shifts in
response allocation
ASR: The schedule of reinforcement describes the: rate
Quantity, duration, and intensity describe: magnitude
Preference of reinforcement describes: quality
Temporal contiguity describes: immediacy/latency
B.Extinction and Recovery from Punishment
18.Recovery from Punishment-Reemergence of previously punished behavior
after punishment is discontinued
Ex: Lever Press------Shock. Later, Lever Press------No shock. Lever
pressing behavior returns
Ex: Nose picking---Verbal reprimand. Later, Nose Picking----No Verbal
Reprimand. Nose picking behavior returns
19.Extinction-a decrease in responding due to withdrawal of reinforcement. Often
used to eliminate unwanted behaviors that have been previously paired with
reinforcers
● Ex: Zoe throws a tantrum to get a toy at the store. To place this
behavior on extinction, Zoe’s mother does not allow Zoe to choose a
toy when she starts to tantrum. (Attention/Access-Based Extinction)
● Ex: Walter is yelling because he does not want to do his homework. His
mother ignores his yelling and redirects him back to his homework.
(Escape Extinction)
● Ex: Matthew likes to flick the lights off and on. His father places the
behavior on extinction by removing the light bulb. When Matthew flicks
the light switch, the lights don’t turn on or off, eliminating the
reinforcement that he seeks. (Sensory Extinction)
ASR: What should happen following the response in an extinction procedure
for behavior maintained by reinforcement? The appetitive stimulus is withheld
ASR: What should happen following the response of an extinction procedure
for behavior maintained by negative reinforcement? The aversive stimulus
continues
ASR: Vicky is the behaver. Every time Vicky pushed her classmates while
playing dodgeball in PE, the teacher sent Vicky to sit on the bleachers until the
game was over. When the teacher stopped sending Vicky to the bleachers after
she pushed, Vicky no longer pushed her classmates during dodgeball games.
This is an example of extinction for behavior maintained by: Negative
reinforcement
ASR: Aggie is the behaver. Each time Aggie responds correctly, the RBT gives
her a token. The RBT ran out of tokens for a few days and stopped giving
Aggie tokens for correct responses. Aggie does not respond correctly
anymore. This is an example of extinction for: positive reinforcement
Possible consequence of extinction: extinction burst
● An increase in the rate of the response early in the extinction process
Ex: Repeatedly pressing the button of the soda machine
Extinction-induced variability: change in the form of a response as a result of
extinction
Ex: Hitting, kicking or shaking the soda machine in place of pressing the
button (EXT)
ASR: The child stomps his feet in the store checkout line and his mom gives
him a candy bar. When the child does not get a candy bar the next time that he
stomps, he starts jumping, kicking, punching, and screaming. This is an
example of: extinction-induced variability
ASR: Every time I flip the light switch, the lights turn on. When the lights do
not turn on after I flip the switch today, I repeatedly flip the light switch over
and over. This is an example of: extinction burst
C. Differential Reinforcement and Shaping
20.Differential reinforcement (DR) -Procedure involving reinforcement plus
extinction. There are 6 types of differential reinforcement:
● Differential Reinforcement of Alternative Behaviors (DRA)
● Differential Reinforcement of Incompatible Behaviors (DRI)
● Differential Reinforcement of Other Behaviors (DRO)
● Differential Reinforcement of High Rates of Behavior (DRH)
● Differential Reinforcement of Low Rates of Behavior (DRL)
● Differential Reinforcement of Diminishing Rates (DRD)
21.Differentiation-results when differential reinforcement is based on different
response topographies. As reinforced topographies increase in rate,
non-reinforced topographies decrease in rate.
● Hint: DIFFERENTiaTion (Differing topography)
A note on differential reinforcement: it does not always include
extinction. Other components could include punishment or less
reinforcement
ASR: When Damien hits his sister, he always gets access to his toy. When
Damien nicely asks his sister for his toy, she never gives it to him. Now,
Damien always hits his sister instead of asking nicely for the toy. True/False:
This is a form of differentiation (TRUE)
22.Shaping-DR used to gradually change the topography or dimension of a
response. When shaping is used, you will systematically reinforce a behavior
that is closer to the target behavior and the previously reinforced behaviors
will no longer produce reinforcement.
Ex: Bambi tries to say “bird”---
● says “buh”---reinforced (later, “buh” is put on extinction)
● says “buh”---no reinforcement---says “buh-ur”---reinforced (later, “buh-ur” is put on extinction)
● says “buh-ur”---no reinforcement---says “bird”---reinforced
● Now ONLY “bird” will be reinforced
ASR: What is reinforced in shaping? Successive approximation to a
desired target response
ASR: Once a learner is reliably performing a response in shaping, then
that response should result in (EXTINCTION) while the next successive
approximation should result in (REINFORCEMENT)
ASR: True/False: Reinforcement should never be delivered for a learner
engaging in a response that is closer in approximation to the desired
target behavior than the response that is currently receiving
reinforcement in the shaping process. (FALSE)
Two types of shaping:
● Across topographies: Novel behavior. Used to teach a response not yet
in a learner’s repertoire. Ex: teaching a child to say “More”
● Within topographies: Changing a dimension. Used to change a
response that is already in the learner’s repertoire. Ex: A rat pressing the
lever more quickly; training for a marathon by starting with running 1 mile
D.Variations on Simple Schedules
23.Differential reinforcement of high rates of responding (DRH)-reinforcement is
provided only if the rate of response is equal to or higher than a specified
criterion. Used to increase response rates
Ex: Fast lever pressing
24.Differential reinforcement of low rates of responding (DRL)-schedule in which
reinforcement is provided only if the rate of response is equal to or lower than
a specified criterion. Used to decrease response rates.
Ex: Requesting fewer breaks
25.Inter-response time (IRT)-the time between two successive responses
Ex: Sneeze---3 minutes elapses---sneeze again (IRT-3 Minutes)
Procedural variations for differential reinforcement:
● DRH/DRL:
○ IRT-reinforced based on length of time between responses
○ Full Session-reinforced based on response rate during entire session
○ Interval-reinforced based on response rate during shorter time periods
ASR: Schedules of differential reinforcement for rates of responding at or
higher than a specific criterion: DRH. Schedule of differential reinforcement for
rates of responding at or lower than a specific criterion: DRL.
ASR: A trainer works with his dog for 5-minute sessions. The dog currently
jumps 10 hoops in 5 minutes. The trainer sets a goal for the dog to jump 20
hoops in 5 minutes to earn a bone. If the dog jumps through all 20 hoops in 5
minutes, the trainer delivers a treat at the end of the session. If the dog jumps
through fewer than 20 hoops in the 5 minute session, no treat is delivered.
This is an example of: DRH (Full Session)
ASR: Otis is in class for an hour. If Otis asks his teacher for help two times or
less during each 15-minute period of the hour, he receives attention from his
teacher after each of the 15 minute blocks. What type of DRL: Interval
26.Differential reinforcement of diminishing rates of responding (DRD)-schedule
in which reinforcement is provided only if the rate of response is equal to or
lower than a specified criterion and this criterion is gradually decreased
across time. Used to decrease response rates across time.
Ex: Allen can finish an entire case of beer in one sitting and wants to
gradually reduce the amount of beer he drinks. A single pack of beer has 30
cans. Allen’s initial goal is to drink 25 or fewer beers in a day. If he succeeds,
he will receive reinforcement. When this criterion is mastered, the next goal
would be an even lower rate; 20 beers or less. This procedure repeats until it
meets a predetermined diminished rate, such as 3 beers/day.
27.Differential reinforcement of paced responding (DRP)-schedule in which a
response is reinforced only if the time between the response and the
preceding response is within a specified range. Reinforcement is based on the
time between responses.
Ex: A ballroom dancer only moving on certain beats of the music
ASR: Which differential reinforcement schedule is used to decrease
rates of responding? DRD
ASR: Remi’s mom gives Remi access to his iPad if there is a 3-to-5 hour
time period in between each time he eats food. This is an example of:
DRP
28.Schedule thinning-gradually changing the contingency of reinforcement by
increasing the ratio or interval required before reinforcement is obtained
Ex: Daisy is matching socks. Because she is new to this task, her
mother offers continuous reinforcement, telling Daisy “Good job!” after every
match she makes. Eventually, as Daisy gets better and faster at matching, her
mother reinforces her after an average of 3 matches (VR 3), then an average of 5
matches (VR 5), and so on.
ASR: Why is it important to thin schedules of reinforcement?
It’s not always practical in a natural
environment to reinforce every response
29.Ratio strain-decrease in responding due to rapid increases in schedule
requirements
Ex: Alex’s dad offers him $5 if he pulls weeds. Alex accepts and earns
his $5. The next day, Alex’s dad offers $5 if Alex takes out the trash, walks the
dog, and washes the dishes. Alex declines the $5.
ASR: Ratio strain is likely to occur when the: requirements of a reinforcement
schedule rapidly increase
Discrimination-results when differential reinforcement is based on the same
response in the presence of different antecedent stimulus conditions. Reinforcement
is based on WHEN you do it (under what conditions)
Hint: DiSCrimination = DIfferent Stimulus Conditions.
ASR: When Zack tells a crude joke in front of his friends, they all laugh.
When Zack tells a crude joke in front of colleagues at a meeting, everyone ignores him and
continues their discussion. Now, Zack only tells crude jokes around his friends.
Is this an example of discrimination? YES
E. Compound Schedules
30.Compound Schedules of Reinforcement-include 2 or more schedules of
reinforcement for a target behavior. Combination of 2 or more basic schedules
and/or differential reinforcement schedules. AKA complex schedules. Basic
components can occur: simultaneously or successively, with or without a
signal, with reinforcement contingent upon each component independently or
in combination.
There are 7 types of compound schedules:
● Alternative schedule
● Conjunctive schedule
● Concurrent schedule
● Mixed schedule
● Multiple schedule
● Chained schedule
● Tandem schedule
ASR: Compound schedules can consist of which types of reinforcement
schedules? Basic simple schedules and differential reinforcement schedules
31.Concurrent schedule-requires 2 or more schedules of reinforcement for two or
more behaviors. Both schedules are run simultaneously but each schedule
has its own contingency.
Ex: When a child eats 5 carrots, he earns a chocolate bar. When he
remains seated for 10 minutes, he earns 5 minutes of TV time. (Conc FR-5
FI-10)
ASR: True/False: In concurrent schedules, each response is placed on
different schedules of reinforcement that operate simultaneously. (TRUE)
ASR: Concurrent schedules are often used to evaluate: Choice
32.Matching law-distribution of responding is equal to relative rate of
reinforcement for each response
Ex: Lily may yell or raise her hand in the classroom. When Lily yells,
the teacher reprimands her about every other time. However, when she raises
her hand, the teacher calls upon her about once every six times. If attention is
the reinforcer, yelling will occur more often than hand-raising.
Generalized matching law-equality in the relative rate of response to relative
rate of reinforcement. Describes the distribution of choice behaviors across
species, responses, reinforcers, and settings.
ASR: The matching law states that an organism’s relative rates of
behavior match relative rates of: Reinforcement
ASR: If Franny screams for a cookie, her mom gives her a cookie every
time. If Franny asks nicely for a cookie, her mom gives her one about
every fifth time she asks. Franny screams for a cookie more often than
asking nicely. Is this an example of a concurrent schedule? YES
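Here is a hedged sketch of the matching-law arithmetic applied to the Lily example above. It treats the stated probabilities (attention for about 1 of every 2 yells and 1 of every 6 hand-raises) as the relative rates of reinforcement for the two responses; the numbers come from the example, but the calculation itself is only illustrative.

```python
# Hedged sketch: matching law applied to the Lily example above.
# Assumption: relative reinforcement rates are proportional to how often each
# response produces teacher attention (yelling ~1/2, hand-raising ~1/6).

reinforcement_rate = {"yelling": 1 / 2, "hand_raising": 1 / 6}
total = sum(reinforcement_rate.values())

# Matching law: relative rate of responding matches relative rate of reinforcement.
predicted_share = {resp: rate / total for resp, rate in reinforcement_rate.items()}

print(predicted_share)  # {'yelling': 0.75, 'hand_raising': 0.25} -> yelling ~3x more
```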
33.Multiple schedule-2 or more basic schedules of reinforcement that alternate,
different stimulus for each
Ex: Max cleans the windows in the house. When his mom is home, she
gives him a soda after he cleans 4 windows. When his grandmother is home,
she gives Max a soda after he cleans 10 windows.
34.Behavioral contrast-a shift in response rate on an unchanged component of a
multiple schedule due to a change in the second component. Behavior
increases on one component and decreases on a second component (or vice
versa)
There are 2 types of Behavioral contrast:
● Positive contrast-Bx increases in the unchanged component
with a decrease in the changed component
● Negative contrast-Bx decreases in the unchanged
component with an increase in the changed component
Ex: Billy eats cookies at the same rate given the presence or absence of his
grandmother. One day, Billy’s grandmother punishes Billy for eating cookies
when she is present. This results in reducing cookie eating when Billy’s
grandmother is in the kitchen, but increases the cookie eating in the alternate
condition of Billy’s grandmother being absent
ASR: In multiple schedules, each schedule of reinforcement is
“signaled” with: a different stimulus
35.Mixed schedule-2 or more basic schedules that alternate without signaling
stimuli. A mixed schedule requires 2 or more schedules of reinforcement
for one behavior and one schedule is presented at a time like a multiple
schedule, but it is NOT controlled by distinct stimuli.
Ex: A child can sometimes listen to his favorite song when he eats 5
spoons of rice. He can sometimes listen to his favorite song when he eats rice
for one minute.
TIP: you can remember mixed vs. multiple by thinking “all mixed up,”
because a mixed schedule has no signaling stimulus
36.Chained schedule-two or more schedules of reinforcement operate
sequentially and in a specific order (not random) Components are signaled,
and the terminal reinforcer comes at the end of the entire sequence
Ex: Lever Press (FR 15)---Green Light---Lever Press (VI 30”)---Yellow
Light---Lever Press (VR 50)---Food
ASR: Cara wants to talk to her best friend on the phone. First, Cara has
to get her phone out of her purse, unlock her phone, and dial her best
friend’s phone number. After the phone rings and her friend picks up,
Cara can finally say “Hello” and enjoy a conversation with her friend.
This is an example of: chained schedule
37.Tandem schedule-two or more schedules of reinforcement and completion of
the first schedule leads to the next schedule (like chained schedule) but it is
NOT controlled by distinct stimuli. When they are all completed in order, the
reinforcer becomes available
Ex: When a child FIRST answers 10 questions AND has a 1 minute
conversation, THEN he will receive candy
38.Alternative schedule-two or more schedules of reinforcement operate
simultaneously, and reinforcement is earned when one of the schedule
components is completed (both components apply to the same response)
Ex: Evan is given an assignment. He will get a piece of candy when he
answers 20 questions OR when he works on his assignment for 10 minutes.
39.Conjunctive schedule-two or more schedules of reinforcement operate
simultaneously, and reinforcement is earned when both schedule components
are completed (both components apply to the same response)
Ex: Lucy receives an A in PE when she jumps rope at least 100 times
and for at least 5 minutes.
ASR: Two schedules occur sequentially and in a specific order,
components are not signaled, and the terminal reinforcer is provided at
the end of the sequence. This is a: Tandem schedule
ASR: Reinforcement is provided when the criterion for only one of the two
schedules is met. This is an: Alternative schedule
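To contrast alternative (OR) and conjunctive (AND) schedules, here is a minimal sketch (not from the course materials) that restates Evan’s and Lucy’s examples above; the function names and thresholds simply mirror those examples.

```python
# Hedged sketch: alternative vs. conjunctive compound schedules as OR vs. AND
# requirements on the same behavior, restating the examples above.

def alternative_met(questions_answered, minutes_worked):
    """Evan earns candy when he answers 20 questions OR works for 10 minutes."""
    return questions_answered >= 20 or minutes_worked >= 10

def conjunctive_met(rope_jumps, minutes_jumping):
    """Lucy earns her A when she jumps rope at least 100 times AND for 5+ minutes."""
    return rope_jumps >= 100 and minutes_jumping >= 5

print(alternative_met(questions_answered=20, minutes_worked=3))  # True (OR)
print(conjunctive_met(rope_jumps=120, minutes_jumping=4))        # False (AND)
```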
Chapter 6
· Consequences as determinants
o Consequences are the most crucial variable in an organism’s
experiential or learning history
· Effects of Antecedents and Consequences on behavior
o Effects of antecedents are typically observed in the moment (here
and now).
o Effects of consequences can only be observed through repeated
measures across time.
· Evocative and Abative effects
o Evoke: Behavior is likely to occur under the current conditions
§ Increase in the momentary: Rate, duration and magnitude
o Abate: Behavior is NOT likely to occur under current conditions
§ Decrease in the momentary: Rate, duration and magnitude.
· Behavior-Altering Effect of Antecedents
o An effect of an antecedent stimulus to either begin, stop, increase,
or decrease a dimension of the behavior (frequency, duration,
magnitude, etc.)
§ Two behavior-altering effects: Evoke and Abate
· Operant Controlling Variables
o Operant behavior is always under the control of its consequences
§ R-S
o Operants are also often under the functional control of antecedents
§ S-R-S or A-B-C
o Operants are never controlled by antecedents alone
· Operant Functional Relations
o Operant functional relations form from contingencies between
specific antecedent event, behavior, and consequences
§ S-R-S, A-B-C, S-R-C (all the same thing)
· Function-Altering Effect of Consequences
o A new functional relation* between antecedent conditions and the
behavior due to a contingent consequence under those antecedent
conditions
§ *Functional relation= A reliable effect of the environment on
behavior
o This is the function-altering effect of the consequence on the
antecedent
· Function-Altering Effect of Reinforcement
o Baseline: An antecedent stimulus (S1) does not evoke or abate a
response
o Step 1: Reinforcement contingently follows R1 in the presence of S1
§ (S1) – R1 -> Sr1
o Step 2: When the antecedent stimulus is present again, it now
evokes the response
§ S1-> R-> SR
§ S1 exhibits a new functional effect due to the history of the
SR1
§ S1 now has an evocative effect
· Respondent example of a function-altering effect
o Respondent conditioning: S1 and S2 are paired, and the pairing gives the previously neutral stimulus a new (eliciting) function
· Function-Altering effects vs. Behavior-Altering effects
o All function-altering effects are stimulus-on-stimulus effects
o All behavior-altering effects are stimulus-on-behavior effects
· Two types of Differential Reinforcement
o Differential reinforcement is either:
§ Reinforcing one behavior and not another, resulting in:
Differentiation
· Results when differential reinforcement is based on
different response topographies (i.e., what to do)
· Based on a two-term contingency R1->SR
§ Reinforcing in one environment and not another, resulting in:
Discrimination
· Results when differential reinforcement is based on
different antecedent environmental conditions (i.e., when
or where to do it)
· This is an S-R-S or A-B-C
· Availability and unavailability
o Availability: In the past, the consequence was likely to follow the
response under the same or similar conditions
o Unavailability: In the past, the consequence was NOT likely to
follow the response under the same or similar conditions
o Discriminative stimuli signal availability
· Function-Altering effects and differential reinforcement
o Stimuli which previously had no effect now have evocative effect on
behavior
o There is an actual change in the functional relation between the
antecedent and the behavior
· Discriminative Control
o The presence or absence of an antecedent stimulus alters a
dimension of a specific behavior
o Effects:
§ “Immediate” = The effect of the antecedent stimulus happens
now
§ “Momentary” = The effect is temporary, in that it only lasts
while that specific antecedent stimulus condition is present.
· Discriminative Control and the Behavior-Altering Effect
o An antecedent stimulus evokes or abates a response as a result of
experiencing a history of differential reinforcement or punishment
· Discriminative Control Related to Reinforcement
o The tendency of behavior to occur more frequently* in the presence
of a specific stimulus as a result of the behavior being contingently
reinforced in the past in the presence of that stimulus
§ * Or increase in behavior across some other dimension
· Discriminative Control related to Punishment
o The tendency of behavior to occur less frequently* in the presence
of a specific stimulus as a result of the behavior being contingently
punished in the past in the presence of that stimulus
§ *Or decrease in behavior across some other dimension
· Discriminative stimulus (general definition)
o An antecedent stimulus that evokes or abates a specific behavior
due to a history of differential reinforcement or punishment in the
presence or absence of that antecedent stimulus.
· Bottom lines
o Bottom line 1
§ Discriminative control is established through the process of
differential reinforcement that leads to discrimination
· A behavior is reinforced or punished in one
environment but not in another environment
o Bottom line 2
§ Various discriminative stimuli exert discriminative control
over behavior due to a history of differential reinforcement or
punishment, depending on their presence or absence.
· Two main categories of discriminative stimuli
o Discriminative stimuli related to reinforcement
§ SD
§ The discriminative stimulus is said to have discriminative
control over the behavior of interest
§ An antecedent stimulus that evokes a specific behavior due to
a history of reinforcement in the presence of that antecedent
stimulus
§ SD evokes a specific behavior
§ In the presence of the SD in the past, this behavior had been
contingently and effectively reinforced
o Discriminative stimuli related to extinction (no reinforcement)
§ S delta (SΔ) = Discriminative stimulus for extinction (no
reinforcement)
§ An antecedent stimulus that abates a specific behavior due to
a history of no reinforcement in the presence of that antecedent
stimulus
§ The S-delta abates a specific behavior
§ In the presence of the S-delta in the past, this behavior has
NOT been reinforced
§ The S-delta abates behavior due to a history of extinction
· Discriminative Stimulus (SDP) Related to Punishment
o SDP= Discriminative stimulus of punishment
o An antecedent stimulus that abates a specific behavior due to a
history of punishment in the presence of that antecedent stimulus
o The SDP always abates a specific behavior
o In the presence of the SDP in the past, this behavior has been
contingently and effectively punished
· S-Delta P (SΔP)
o SΔP = discriminative stimulus for recovery (no punishment)
o An antecedent stimulus that evokes a specific behavior due to a
history of no punishment in the presence of that antecedent stimulus
o The SΔP evokes a specific behavior
o In the presence of the SΔP in the past, this behavior has not been
punished
o The SΔP evokes behavior due to a history of unavailability of
punishment for behavior that had been punished under other (SDP)
conditions.
§ This describes recovery from punishment
Antecedent Stimulus | Due to History of:                  | Effect on Behaviour | Availability of consequence?
SD                  | Reinforcement                       | Evokes              | Available
SΔ                  | No reinforcement (i.e., extinction) | Abates              | Unavailable
SDP                 | Punishment                          | Abates              | Available
SΔP                 | No punishment (i.e., recovery)      | Evokes              | Unavailable
Ex: Kid crying for cookies at store:
a. Dad - SD - history of giving cookies to stop crying; crying evoked
b. Mom - SDP - history of scolding kid; no cookies, crying abated
c. Grandma - SΔ - bad hearing, history of totally ignoring crying; crying abated
d. Mom’s best friend - SΔP - when the friend comes over, there is a history of
Mom ignoring crying because she is engaged in conversation with the friend;
crying evoked but less intense than with Dad
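The table above can also be written as a small lookup, which makes the pattern easy to check against examples like the one just given. This is only an illustrative Python sketch; the dictionary and function names are mine, not part of the lecture.

```python
# Illustrative sketch: the four antecedent stimulus types from the table above.
ANTECEDENT_TYPES = {
    "SD":        {"history": "reinforcement",                 "effect": "evokes", "available": True},
    "S-delta":   {"history": "no reinforcement (extinction)", "effect": "abates", "available": False},
    "SDP":       {"history": "punishment",                    "effect": "abates", "available": True},
    "S-delta-P": {"history": "no punishment (recovery)",      "effect": "evokes", "available": False},
}

def describe(stimulus_type: str) -> str:
    info = ANTECEDENT_TYPES[stimulus_type]
    return (f"{stimulus_type}: {info['effect']} behavior due to a history of "
            f"{info['history']}; consequence available: {info['available']}")

# From the cookie example: Dad functions as an SD for crying.
print(describe("SD"))
```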
Antecedents Have “First Names”
o The term discriminative stimulus can be considered one of the two
types of “first names” for an antecedent stimulus
o Antecedent’s first name tells us whether its functional effect is due
to:
§ A history of differential reinforcement (or punishment) under
specific conditions
§ A motivational variable, such as deprivation, satiation, or pain
· Antecedents have “last names”
o Each type of discriminative stimulus is related to the type of
consequence that caused it to form in the first place
§ The antecedent’s “last name” specifies the type of the related
consequence
· Discriminative stimulus for reinforcement
· Discriminative stimulus for punishment
· Discriminative stimulus for extinction
· Discriminative stimulus for recovery
· Sub-Types of SD
o Discriminative stimulus for positive reinforcement (SDR+)
§ Evokes behavior due to a history of the availability of positive
reinforcement under this antecedent condition
o Discriminative stimulus for negative reinforcement (SDR-)
§ Evokes behavior due to a history of the availability of negative
reinforcement under this antecedent condition
o Discriminative stimulus for extinction of behavior maintained by
positive reinforcement (SDelta R+)
§ Abates behavior due to a history of a lack of positive
reinforcement under this antecedent condition
o Discriminative stimulus for extinction of behavior maintained by
negative reinforcement (SdeltaR-)
§ Abates behavior due to a history of a lack of availability of
negative reinforcement under this antecedent condition
o Discriminative stimulus for positive punishment (SDP+)
§ Abates behavior due to a history of availability of positive
punishment under this antecedent condition
o Discriminative stimulus for unavailability of positive punishment
(SdeltaP+)
§ Evokes behavior due to a history of a lack of availability of
positive punishment for a previously punished behavior under
this antecedent condition
o Discriminative stimulus for negative punishment (SDP-)
§ Abates behavior due to a history of availability of negative
punishment for a previously punished behavior under this
antecedent condition
Conditional Discrimination
o A discrimination in which more than one antecedent condition must
be present for the response to be reinforced
§ A conditional discrimination is “a discrimination in which
reinforcement of responding during a stimulus depends on (is
conditional on) other stimuli.”
§ This is, at minimum, a four-term contingency
· SA-S1-R- SR
· SB-S1-R-EXT
· SA-S2-R-EXT
· SB-S2-R-EXT
§ Only one condition produces reinforcement
● Both SA and S1 must be present for SR
Ex: You need to be in the lobby (S1), at the desk (S2), and a hotel employee must be at the
desk (S3) in order for asking for a new room key (R) to produce a new key (SR); otherwise,
you’re not getting a new key (extinction)
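A minimal, illustrative sketch of the four-term contingency listed above: only the SA + S1 combination produces reinforcement, and every other combination is extinction. The function and stimulus names are illustrative assumptions, not from the lecture.

```python
# Illustrative sketch: conditional discrimination as a four-term contingency.
def programmed_consequence(conditional_stimulus: str, discriminative_stimulus: str) -> str:
    """Return the consequence for a response emitted under the given stimuli.

    Only SA + S1 produces reinforcement (SR); SA+S2, SB+S1, and SB+S2 are extinction (EXT).
    """
    if conditional_stimulus == "SA" and discriminative_stimulus == "S1":
        return "SR"   # reinforcement delivered
    return "EXT"      # no reinforcement

for conditional in ("SA", "SB"):
    for discriminative in ("S1", "S2"):
        print(conditional, discriminative, "->", programmed_consequence(conditional, discriminative))
```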
· Stimulus Generalization
o The spread of effects of training to stimuli not present during training
§ The untrained stimulus conditions have some similar physical
properties or other association with the original SD in training
o Simply stated: The same response is evoked by different stimuli
§ Contrast with “response generalization”, in which different
responses are evoked by the same stimulus.
· Stimulus Generalization vs. Discrimination
o In many ways, stimulus generalization and discrimination are
opposite
● Stimulus generalization occurs when discriminative control is
absent or incomplete
Ex: toilet training:
a. Generalize toilet training to all toilets
b. Discriminate to only toilets
Chapter 7
A. Introduction to Motivating Operations
1. Motivating operation
a. Define: antecedent environmental variable that increases or
decreases the effectiveness of a consequence and thus evokes or
abates a response
b. Identify the two main effects
1. Value-altering effect = Increase or decrease the effectiveness (value) of a
consequence
2. Behavior-altering effect = Increase or decrease a dimension of a behavior • The
immediate effect on behavior
C. Define deprivation: The state of an organism during or following a
period of time where a reinforcer has not been contacted or consumed
D. Define satiation: The state of an organism during or following continued
contact with, or consumption of, a reinforcer
B. Discriminative vs. Motivative Control
4. Discriminate between examples of discriminative stimuli
and motivating operations
C. Two Main Effects of Motivating
Operations
5. Value-altering effect of motivating operations
a. Value-Altering Effect Define: Establishing or abolishing the value
(i.e., effectiveness) of a consequence. Effect of the MO on a stimulus.
b. Identify the two types
1. Establishing effect: Increase the effectiveness of a
consequence
2. Abolishing effect: Decrease the effectiveness of a
consequence
C. identify examples
EO Examples:
1. Food deprivation had a reinforcer- establishing effect on food
pellets. The food’s value/effectiveness as a reinforcer was
increased. Food deprivation = EO that established food as a
positive reinforcer.
AO Examples:
1. Satiation on popcorn had a reinforcer-abolishing effect on
popcorn. The value/effectiveness of popcorn as a reinforcer was
decreased. Popcorn satiation (as an antecedent condition) was
the AO for positive reinforcement
6. Define establishing operation: establish (increase) the effectiveness
(value) of a consequence (SR / SP)
-The EO can either evoke or abate behavior, depending on:
-If it establishes the effectiveness of a reinforcer, it evokes behavior
-If it establishes the effectiveness of a punisher, it abates behavior
7. Define abolishing operation: Abolish (decrease) the effectiveness
(value) of a consequence (SR / SP)
-The AO can either evoke or abate behavior, depending on:
-If it abolishes the effectiveness of a reinforcer, it abates the behavior
-If it abolishes the effectiveness of a punisher, it evokes behavior
8. Behavior-altering effect of motivating operations
The effect of a motivating operation to increase or decrease a
dimension of a behavior
1. Evocative effect: Increase the momentary frequency of a
behavior • Evoke behavior NOW
2. Abative effect: Decrease the momentary frequency of a
behavior • Abate behavior NOW
c. Identify examples and non-examples of each type
D. Some Types of Motivating Operations
9. Establishing operation for reinforcement
10. Abolishing operation for reinforcement
E. Unconditioned Motivating Operations
11. Unconditioned motivating operation (UMO)
a. Motivating operations that have value-altering effects due to
phylogenic provenance (genetics)
-Unconditioned establishing operations (UEOs)
-Unconditioned abolishing operations (UAOs)
b. Identify the nine main human UMOs
1–5: Deprivation and satiation UMOs
Food, water, sleep, activity, and oxygen
6–7: UMOs related to temperature-Too cold, too warm
8: Painful stimulus onset/magnification
9: UMOs related to sex
12. Define conditioned motivating operation (CMO)
Motivating operations that have value-altering effects due to ontogenic
provenance (learning, pairing with UMO)
-Conditioned establishing operations (CEOs)
-Conditioned abolishing operations (CAOs)
Discriminative Stimuli (SD) - signals the availability of a specific consequence
following behavior
● SDs are conditioned (can only function as a SD as a result of past
experiences)
● Evoke or abate
● SD
● SDP
● S△
Keller and Schoenfeld introduced the term establishing operation (EO) to refer to
a motivational variable as an antecedent causal factor in behavior
Michael (1982) defined “establishing operation” (EO) as an antecedent motivational
variable and covered both deprivation and satiation
Two distinct functional effects of motivational variables
● Establishing operation (EO) - increases its value/effectiveness (ex
deprivation)
● Abolishing operation (AO) - decreases its motivational value (ex satiation)
In 2003, Michael coined the term Motivating Operation (MO): an antecedent
environmental variable that increases or decreases the effectiveness of a
consequence and thus evokes or abates a response
Table Template (crossing availability with value):
- Columns: Available (SD) | Not Available (SΔ)
- Rows: Valuable (EO for SR+) | Not Valuable (AO for SR+)
MOs have 2 effects
1. Value-altering effect = increase or decrease of effectiveness (value) of
consequence (refers to effectiveness of consequence not effectiveness
of behavior)
● Establishing (increase momentary value) or abolishing (decrease momentary
value)
● EO (motivation) + SD (availability) → R1 (target response) → SR + or SR- (+/-
reinforcement)
● For most behavior the SD and EO are necessary antecedent conditions for
the specific behavior to be evoked
Two common MOs
1. Deprivation - the state of an organism during or following a period of time
where the reinforcer has not been contacted or consumed (establishing effect)
-Water deprivation does not make “water drinking” more valuable as a
response; it makes water more valuable as a reinforcing stimulus
2. Satiation - the state of an organism during or following continued contact
with, or consumption of, a reinforcer (abolishing effect)
2. Behavior-altering effect - an environmental variable that alters the
reinforcing or punishing effectiveness of some stimulus and alters the current
frequency of all behavior that has been reinforced or punished by that stimulus;
increase (evoke) or decrease (abate) a dimension of a behavior
● Effect of the antecedent on behavior in the moment
● Evocative or abative (happens in the here and now)
● EO = reinforcer effectiveness increased - behavior is evoked
● EO = punisher effectiveness increased - behavior is abated
● AO = abolishes the effectiveness of a reinforcer, it abates behavior
● AO = abolishes the effectiveness of a punisher, it evokes behavior
Reinforcer: EO - Evoke; AO - Abate
Punisher: EO - Abate; AO - Evoke
The first name of a motivating operation is based on the value altering effect.
The last name of a motivating operation is based on the consequence whose value
is being altered.
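A minimal, illustrative sketch of the mapping above: the behavior-altering effect of an MO follows from its “first name” (EO/AO) and its “last name” (the consequence whose value is altered). The function name and strings are my own illustration.

```python
# Illustrative sketch: behavior-altering effect from an MO's first and last name.
def behavior_altering_effect(first_name: str, last_name: str) -> str:
    """first_name: 'EO' or 'AO'; last_name: 'reinforcer' or 'punisher'."""
    if first_name not in ("EO", "AO") or last_name not in ("reinforcer", "punisher"):
        raise ValueError("unknown MO classification")
    if first_name == "EO":
        return "evokes behavior" if last_name == "reinforcer" else "abates behavior"
    return "abates behavior" if last_name == "reinforcer" else "evokes behavior"

# Food deprivation = EO for a reinforcer: behavior is evoked.
print(behavior_altering_effect("EO", "reinforcer"))   # evokes behavior
# An AO for a punisher (e.g., winning the lottery vs. a fine): behavior is evoked.
print(behavior_altering_effect("AO", "punisher"))     # evokes behavior
```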
Two variables for further classifying motivating operations
1. Provenance - whether the MO’s effects are a result of phylogenic provenance
(reflex/unlearned) or ontogenic provenance (learning history)
UEOs - unconditioned establishing operations
UAOs - unconditioned abolishing operations
CEOs - conditioned establishing operations
CAOs - conditioned abolishing operations
Unconditioned SR
Conditioned Sr
Unconditioned SP - (pain)
Conditioned Sp - (traffic ticket)
Nine Main Human UMOS
1-5 food, water, sleep, activity, and oxygen
Deprivation = increases
Satiation = decreases
6-7 Temperature too cold or too warm
8 Painful stimulus onset/magnification
9 Sex
MOs have been known to widen or narrow the stimulus generalization gradient.
EX: drinking water from a puddle; a chef stopping at BK
Discriminate - narrows conditions
Generalization - widens conditions
EOs for reinforcement widen the stimulus generalization gradient.
AOs for reinforcement narrow the stimulus generalization gradient.
MOs can affect the variability of response forms
EX. waving calmly to waving wildly, Speaking loudly to screaming, walking to
running, & knocking to banging (new response forms)
How can you tell an SD from an EO?
(Ask these questions)
- Does it signal the availability of the reinforcer? (SD)
- Does it make the reinforcer more valuable? (EO)
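The two questions above can be turned into a tiny decision helper. This is only an illustrative sketch with assumed argument names; it simply encodes the availability vs. value distinction from the notes.

```python
# Illustrative sketch: the two diagnostic questions for SD vs. EO/MO.
def classify_antecedent(signals_availability: bool, alters_value: bool) -> str:
    if signals_availability and alters_value:
        return "multiple functions: may act as both an SD and an MO"
    if signals_availability:
        return "SD: signals the availability of the consequence"
    if alters_value:
        return "MO: alters the value (effectiveness) of the consequence"
    return "neither discriminative nor motivative function identified"

print(classify_antecedent(signals_availability=False, alters_value=True))
# -> MO: alters the value (effectiveness) of the consequence
```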
Unit 8: Motivating Operations and Multiple Functions of Stimuli (Thomas Freeman)
A. Effects of Motivating Operations Review
● A motivating operation is an antecedent condition, operation, or stimulus that
has two effects: value-altering effect and behavior-altering effect
● Value-altering effect: defined by the effect of the antecedent MO on
the consequence. The effect of an MO to increase or decrease the
effectiveness of a consequence.
● Two types of Motivating Operations:
o Establishing Operations (EO)
o Abolishing Operations (AO)
● Value-altering effects of MOs
○ Establishing Operations-increases effectiveness of
a consequence. (Establishing effect)
○ Abolishing Operations-decreases effectiveness of
consequence (Abolishing effect)
● Three Motivating Operations
o Deprivation
o Satiation
o Aversive Stimulation-relates to negative reinforcement
● The Value of Change:
○ EOsr-: the antecedent “discomfort” condition establishes the
effectiveness of the removal of discomfort as a reinforcer. Change is
valuable.
○ AOsr-: The antecedent “no discomfort” abolishes the effectiveness of
its own removal of discomfort as a reinforcer. Change is not valuable.
● First Names of MOs
o “First names” of MOs are derived from their value-altering effect
● Establishing Operation (EO)
● Abolishing Operation (AO)
● Last Names of MOs
o “Last names” of MOs are the consequences related to their
value-altering effect. Ex:
● EOs for Positive Reinforcement (EOsR+)
● AOs for Positive Reinforcement (AOsR+)
● EOs for Negative Reinforcement (EOsR-)
● AOs for Negative Reinforcement (AOsR-)
ASR: An antecedent that signals the availability of a consequence for the
behavior in the moment is a: SD/discriminative stimulus
● Behavior-altering effect of MOs:
o Evokes behavior
o Abates behavior
● Consequences and the Behavior-Altering Effect of EOs
o EOs ALWAYS increase the effectiveness of a consequence.
However….EOs may evoke OR abate behavior depending on
whether the consequence is a reinforcer or a punisher
o AOs ALWAYS decrease the effectiveness of a consequence
However…AOs may evoke OR abate behavior depending on
whether the consequence is a reinforcer or a punisher.
ASR: The value-altering effect of an EO is referred to as an: Establishing effect
ASR: The value-altering effect of an AO is referred to as an: Abolishing effect
ASR: True/False: EOs always evoke behavior while AOs always abate
behavior. (False)
B. Motivating Operations Related to Punishment
1. Abolishing operation for punishment
● Motivating operations that decrease the effectiveness of other events
as forms of punishment and evoke behavior that has been punished
by those events in the past
o Is an antecedent condition
o Decreases effectiveness of a punisher
o Increases the momentary frequency of behavior
● Two effects of AOs for Punishment
o Value-altering effect
● Punisher-abolishing effect
o Behavior-altering effect
● Evocative effect (for behavior that has been punished by
the consequence in the past)
● Example: AOsp
o Antecedent: Rat is given a dose of benzodiazepine
o Behavior: Lever Pressing
o Consequence: Food pellet and electric shock delivered
● Benzodiazepine in bloodstream abolishes shock as an
effective form of punishment
● Lever pressing is evoked
● Example: AOsp
o Antecedent: Being in a boring, high-demand environment
o Response: Hitting a peer
o Consequence: Put in time-out (excluded from boring environment for
5 minutes)
● Boring environment abolishes time-out as an effective
form of punishment
● Evokes hitting a peer behavior
● Example: AOsp
o Antecedent: Won the lottery
o Response: Speeding on the highway
o Consequence: Getting a speeding ticket with a $250 fine
● Winning the lottery abolishes a speeding ticket fine as an
effective punishment
● Evokes speeding behavior
Hint: When we analyze the effect of any MO, we must clearly identify whether it is
related to REINFORCEMENT (R) or PUNISHMENT (P). That will allow us to
determine its behavior-altering effect, and if it evokes or abates behavior
ASR: A teenager having no plans for the evening decreases the effectiveness
of cleaning the house as a punisher and evokes yelling at mom. What is the
value-altering effect?
● Cleaning the house decreased in effectiveness as a form of punishment
ASR: A teenager having no plans for the evening decreases the effectiveness
of cleaning the house as a punisher and evokes yelling at mom. What is the
behavior-altering effect?
● Yelling is evoked
ASR: Having more tokens than needed to access the best toys in the token
store decreases the effectiveness of losing a token as a punisher and evokes
running around the classroom. What is the value-altering effect?
● Losing a token decreased in effectiveness as a form of punishment
ASR: Having more tokens than needed to access the best toys in the token
store decreases the effectiveness of losing a token as a punisher and evokes
running around the classroom. What is the behavior-altering effect?
● Running around the classroom is evoked
2. Establishing operation for punishment
a. Define
b. Identify the two effects
C. Conditioned Motivating Operations for Reinforcement
Identify three types of conditioned motivating operations
Surrogate conditioned motivating operation (CMO-S)
a. Define
b. Identify examples and non-examples
Reflexive conditioned motivating operation (CMO-R)
a. Define
b. Identify examples and non-examples
Transitive conditioned motivating operation (CMO-T)
c. Define
d. Identify examples and non-examples
D. Multiple Functions of Stimuli and Behavior Chains
Behavior chain
a. Define
b. Identify examples and non-examples
Response class hierarchy
c. Define
d. Identify examples and non-examples
Omnibus terms
e. Define
f. Identify the two common terms in behavior analysis
Unit 8: Motivating Operations and Multiple Functions of Stimuli (Thomas Freeman)
A. Effects of Motivating Operations Review - Two Main Effects
Motivating operations have two general effects. They:
(1) Alter the value of a consequence - increase or decrease the effectiveness of a Cx
(2) Alter certain dimensions of behavior - evoke Bx / abate Bx
2 types of MO
● EO: Increases the effectiveness of a consequence. There are EOs that evoke
behavior (when related to reinforcement) and there are EOs that abate behavior (when
related to punishment). The establishing effect (Slide#42)
● AO: Decreases the effectiveness of a consequence. There are AOs that evoke
behavior (when related to punishment) and AOs that abate behavior (when related to
reinforcement). The abolishing effect (Slide#43)
Slide#9
Three MOs: Deprivation (evokes Bx), Satiation (abates Bx), Aversive Stimulation
Deprivation & Satiation MOs: related to positive Rx (SR+ or Sr+), e.g., water, money
Aversive Stimulation MOs: related to negative Rx (SR- or Sr-)
● EO for SR- :Antecedent Aversive is a discomfort condition so establishes the
effectiveness of its own removal/termination as negative Rx(the removal of
annoyance/pain/irritant)-change is valuable.
● AO for SR-: The antecedent “no discomfort” condition abolishes the effectiveness
of its own removal; there is no need to terminate something that doesn’t bother you.
Change is not valuable: why would I want to remove or get rid of something that does
not affect me?
Slide#15-22
❖ First names of MOs: value-altering effects; the effect the MO has on the consequence
- EO/AO
❖ Last names of MOs: look for the specific type of consequence whose
effectiveness is being altered - Rx (SR) / Px (SP). The last name of a CEO or CAO
determines the effect on behavior. The “last name” of a motivating operation
indicates the type of consequence whose value is altered. Depending on the
consequence, whether it is reinforcement or punishment, behavior could be either
evoked or abated. Therefore, a motivating operation’s ”last name” is needed to know
what consequence is involved and whether behavior is evoked or abated.
EO for SR- establishes the effectiveness of the consequence as a reinforcer.
AO for SR- abolishes the effectiveness of the consequence as a reinforcer.
Slide#23-31
Identify the Types of Rx
· If a stimulus is added to the env., then it is positive Rx.
· If a stimulus is removed from the env., then it is negative Rx.
Differential Availability means: is the consequence likely to follow a response under the
antecedent condition? (Not whether the consequence is effective.)
Differential Availability vs Effectiveness (or value)
Differential Availability (SD / S-delta) results from differential Rx
Effectiveness (MO- EO/AO) results from a motivating operation
B. Motivating Operations Related to Punishment
1. Abolishing operation for punishment/ Establishing operation for punishment
Slide#48-55
a. Define: an antecedent condition/operation/stimulus that-
● increases (EO) or decreases (AO) the effectiveness of a punisher as a Cx
● Evokes or abates the momentary frequency of Bx that has been punished by
the Cx in the past.
Slide#56-91
b. Identify the two effects: value-altering effects and Bx-altering effects
Value-altering Effects of MOs: SAME for both Rx and Px
EOs always increase the effectiveness of both Cx (SR & SP)
AOs always decrease the effectiveness of both Cx (SR & SP)
● The Es and As go together 😊
❖ EO for SR = evokes
❖ AO for SR= abates
Bx-altering Effects of MOs: NOT the same for Rx and Px. Have opposite effect on Bx
● The Es and As do not go together ☹
❖ AO for SP= evokes
❖ EO for SP= abates
Effects of EOs related to punishment:
(a) increase the value of a consequence as a form of punishment. This is
increasing the effectiveness of punishment.
(b) abate behavior that has been followed by that punisher in the past. This
is an abative effect.
Note: It would be incorrect to say EOs alter the value of behavior, since they alter the
value of a consequence and not of behavior.
Slide#62 Remember- All EOs increase the effectiveness of a Cx
(value-altering effect), regardless of the type of Cx
Effects of AOs related to punishment:
(a) decrease the value of a consequence as a form of punishment. This is a
decrease in the effectiveness of punishment.
(b)evoke behavior that has been followed by a form of punishment in the past.
This is an evocative effect.
Note: It would be incorrect to say AOs alter the value of behavior, since they alter the
value of a consequence and not of behavior.
The value effect is the same for both MOs related to reinforcement and punishment.
● Remember, the ”first name” of a motivating operation indicates the effect on a
consequence (i.e., the value-altering effect).
● Therefore, if the motivating operation is an EO, it will increase the value of a
consequence regardless of whether it is reinforcement or punishment.
● If the motivating operation is an AO, it will decrease the value of a consequence
regardless of whether the consequence is reinforcement or punishment.
MO with establishing and evocative effects: Since this EO has an
evocative effect, it must be related to reinforcement (EOSR).
MO with abolishing and abative effects: Since this AO has an abative
effect, it must be related to reinforcement (AOSR)
MO with establishing and abative effects: Since this EO has an abative
effect, it must be related to punishment (EOSP).
MO with abolishing and evocative effects: Since this AO has an evocative
effect, it must be related to punishment (AOSP).
The effect on behavior (not the same) helps us determine whether the consequence is a
form of reinforcement or punishment.
● When reinforcement is more valuable, like is the case with EOs for reinforcement,
behavior tends to occur more frequently.
● When punishment is more valuable, like is the case with EOs for punishment,
behavior tends to occur less frequently.
(Because the punishment is so strong the Bx is less likely to occur)
● When reinforcement is less valuable, like is the case with AOs for reinforcement,
behavior tends to occur less frequently.
● When punishment is less valuable, like is the case with AOs for punishment,
behavior tends to occur more frequently.
(Because the punishment is so weak the Bx is more likely to occur)
Slide#92-125
B. Conditioned Motivating Operations for Reinforcement
Review
Provenance: MO can have value-altering effect due to-
o UMOs-Phylogenic provenance (genetic history)-
Unlearned/Unconditioned-UEOs/UAOs
Example-deprivation/satiation of certain substances/activities-food, water, air
Uncomfortable temperature (too hot/cold) Physical discomfort (pain)
o CMOs-Ontogenic provenance (individual learning history)- Learned/Conditioned-
CEOs/CAOs
Example- A fire alarm established the effectiveness of escape/avoidance
Value-altering Effects of UMOs and CMOs:
o UMOs are unlearned antecedent motivational conditions that alter the
effectiveness of Cx.
o CMOs are learned antecedent motivational conditions that alter the
effectiveness of Cx.
Bx-altering Effects of UMOs and CMOs:
o The Bx-altering effect of both UMOs and CEOs is learned.
o All responses that are evoked or abated by UMOs and CEOs are learned
Bx.
NOTE: Is any Bx unlearned?????
YES!!!!!
Respondent Bx (Reflex responses) are unlearned bx.
Bx influenced by Cx come under the OPERANT relationship, not respondent.
3. Identify three types of conditioned motivating operations
1. Surrogate CMO-CEO-S & CAO-S: acts as stand-in for another MO
2. Reflexive CMO-CEO-R & CAO-R: reflects upon itself
3. Transitive CMO-CEO-T & CAO-T: transit through a task/transfer power to another
tool
Surrogate conditioned motivating operation (CMO-S) Slide#126-167
a. Define : A CMO-S is an antecedent stimulus condition that acquires its value-altering and
behavior-altering effects through pairing with another motivating operation.
A substitute or stand-in: CMO-S starts as a neutral stimulus (Sx). Then it is paired with
UMO and becomes a MO (conditioned). CMO-S evokes same response (Rs) as UMO after
pairing.
CEO-S (Deprivation)
Value-altering Effect: establishing-increases the momentary effectiveness of a
reinforcer
Bx-altering effect: evocative-increases the frequency of Bx that has produced the
reinforcer in the past.
An SD (Discriminative Stimulus) can become CEO-S through pairing!!!! (Slide#148 and
150)
CAO-S (Satiation)
● Value-altering Effect: abolishing-decreases the momentary effectiveness of a
reinforcer
● Bx-altering effect: abative-decreases the frequency of Bx that has produced
the reinforcer in the past.
b. Identify examples and non-examples
You eat breakfast before you get to work at 8am every morning, so by early afternoon you are
food deprived and your stomach starts to growl. You see the clock says 12:00pm and you get
your food out of the refrigerator. On the weekends, you sleep in so you eat later in the
morning. Still, when you see the clock says 12:00pm, you get food out of the refrigerator,
even though you aren’t feeling food deprived. How can we categorize the sight of 12:00pm
on the clock on the weekend in relation to getting food out of the refrigerator?
What type of antecedent?
● The sight of the time on the clock is an MO because it did not change the
availability of the food but altered the value of the food.
● This is a conditioned motivating operation because we’re focused on the
clock at 12:00pm and this requires prior learning to evoke responding.
● It is surrogate because it acquired this value altering ability by being paired
with food deprivation (an unconditioned motivating operation).
● More specifically, the sight of the clock is functioning as a CEO-S for SR+
because it increases the value of access to food as a positive reinforcer.
Slide#168-227
5. Reflexive conditioned motivating operation (CMO-R)-
c. Define : A CMO-R is an antecedent stimulus condition that alters the effectiveness of its
own termination as an effective form of negative reinforcement or punishment.
● Makes its own removal an effective negative Rx.
● Signals worsening/warning – threat
■ (can also signal improving=promise)
● Increases the value of escape/avoidance.
● Any Sx that precedes the onset of pain becomes CMO-R
● Its own removal/offset functions as a Rx.
(Cooper book-pg.386) Aversive Sx- anything unpleasant.
● Its presence is punishment (Px).
● Its removal is reinforcement (Rx).
● Its presence evokes behavior (Bx) that requires its removal(termination) as a MO.
● When a Rs results in avoiding an aversive Sx(pain/unpleasantness) it is said to be
Avoidance Rs (Response). This is a learned Bx.
● All warning stimuli are learned. (slide#211)
● They are Conditioned CEOs for escape (from themselves) and avoidance (of
something else)
● Even though warning stimuli are learned they are NOT called “conditioned Stimuli”
because “conditioned Stimuli” elicit conditioned reflex responses! In respondent Bx.
(slide #212)
● In operant Bx there are no elicited responses.
➢ The warning Sx evokes the avoidance Rs as a CMO. Avoidance because the
onset of pain did not start yet. It delays/prevents the onset of a harsher
aversive condition.
➢ The painful stimulation (shock) evokes the escape Rs as a UMO. Escape
because the onset of pain has already begun… Terminates the aversive once
it has already started.
Threat CAO-R (slide#182)
● Value-altering Effect: abolishing-decreases the effectiveness of its own
termination as a form of negative Rx.
● Bx-altering effect: abative-decreases Bx that has resulted in its own
termination in the past, and/or abates avoidance Bx
In simple language: since escape is not at all valuable/effective, the Bx is abated.
Why engage in Bx if you already have what you need? The Bx isn’t evoked; it’s abated.
Example- no fire alarm, no need to leave the building.
Fuel gauge reads “Full”, I don’t need to get more gas
Threat CEO-R (slide#188)
● Value-altering Effect: establishing-increases the effectiveness of its own
termination as a form of negative Rx.
● Bx-altering effect: evocative-increases Bx that has resulted in its own
termination in the past, and/or evokes avoidance Bx
In simple language: since escape is very valuable/effective, the behavior is evoked.
Therefore, the Bx is evoked and its termination is negative Rx.
Example- fire alarm, need to leave the building.
Fuel gauge reads “Empty”, I need to get more gas.
Discriminative Avoidance (slide#191)
★ Discriminated/Signaled avoidance – avoidance occurs when a warning stimulus
evokes an avoidance response.
★ The warning stimulus is a CEO-R- it establishes its own termination as conditioned
negative Rx.
CMO-R is NOT an SD – Discriminative Stimulus (slide#198 & 202)
★ Antecedent aversive does not signal differential availability of Rx consequence
(Negative Rx)
d. Identify examples and non-examples
● Shawnelle is a new RBT in an autism clinic. The last few supervision sessions have
been rough. The supervisor provided harsh feedback when running 1-on-1 academic
programs at the table. Today, when the supervisor arrives Shawnelle takes the client
to the group room to work on social skills. As a result, he avoids the harsh
feedback. How do we categorize the presence of the supervisor in relation to taking
the client to the group room?
Does it evoke behavior because it signals the consequence is available or unavailable?
The presence of the supervisor does not signal the availability or unavailability of the
consequence, so you know it is not an SD.
Remember that the antecedent that evokes behavior maintained by negative
reinforcement is always a motivating operation, not a discriminative stimulus.
How did it get its ability to alter the effectiveness of the consequence?
The antecedent was correlated with a worsening condition- The presence of the supervisor
has been correlated with a "worsening" condition, the harsh feedback.
What type of antecedent is it?
Since it has been correlated with the harsh feedback (the “worsening” condition), the
presence of the supervisor makes the removal of the supervisor more effective as a
consequence and evokes behavior that results in its own removal. This is a reflexive CMO –
in fact it is a CEO-R for negative reinforcement. A reflexive CEO establishes itself as an
effective form of negative reinforcement because it has been correlated with an aversive
condition that can be avoided by engaging in behavior.
Slide#228-250
6. Transitive conditioned motivating operation (CMO-T)- (Cooper book pg.390)
e. Define: A CMO-T is an antecedent stimulus condition that alters the effectiveness of a
second stimulus as a reinforcer and evokes or abates behavior that has been reinforced by the
second stimulus in the past.
CMO-T makes something else (the second stimulus) more effective as a Rx rather than
altering itself.
CEO-T (slide#236)
➢ Value-altering Effect: establishing-increases the effectiveness of another
stimulus as a form of reinforcement.
➢ Bx-altering effect: evocative- increases a dimension of Bx that has resulted in
accessing that second stimulus in the past.
CMO-T evokes response because of its relation to the value of consequence rather than to
the availability of a consequence.
CAO-T (slide#240)
➢ Value-altering Effect: abolishes-decreases the effectiveness of another
stimulus as a form of reinforcement.
➢ Bx-altering effect: abative- decreases a dimension of Bx that has resulted in
accessing that second stimulus in the past.
f. Identify examples and non-examples
● Santiago skipped breakfast and has been in a training all morning. It is finally
lunchtime, and he realizes he forgot to bring lunch as well. It has now been 18 hours
since he last ate, and the nearest available food is a sandwich shop down the street. He
starts looking for his keys so he can drive to the shop. He finds the keys and is able to
drive to get lunch. How would we categorize 18 hours without food with respect to
looking for his keys?
What type of antecedent? CMO-T
Remember: The transitive CMO increases the value of another stimulus as an effective
form of reinforcement because it makes that other stimulus “necessary” for engaging in more
behavior. In this example, food deprivation begins a whole chain of behaviors, Santiago
needs to get food, but in order to get the food he needs to get to the sandwich shop, and in
order to get to the sandwich shop, he needs to drive his car, and in order to drive his car, he
needs to find his keys. Each link in this chain involves transitive CMOs.
Slide#252-318
D. Multiple Functions of Stimuli and Behavior Chains
7. Behavior chain (Cooper pg.558)
a. Define: A behavior chain is a sequence of responses that must occur in a specific
order and are reinforced on a chained schedule, with the terminal reinforcer following
the last response in the sequence.
Behavior Chain
● Two or more responses
● Specific order
● Reinforced on a chained schedule
● Terminal reinforcer following the last response
● “Links” – stimuli between response with multiple functions
○ Conditioned Rx for previous Responses
○ SD for next responses-next step in the chain.
In a behavior chain, each response produces a stimulus change that functions as a
conditioned reinforcer for that response, and also functions as a discriminative
stimulus (SD) for the next response
What maintains the entire behavior chain? The last response in a behavior
chain produces the terminal reinforcer, which reinforces the entire behavior chain
by resolving the initial EO and maintaining each stimulus change produced by each
response as both conditioned reinforcers and SDs.
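A minimal, illustrative sketch of the dual functions of link stimuli in a behavior chain, using an assumed hand-washing chain (the steps and names are mine, not from the notes). Each stimulus change is printed with both of its functions: conditioned reinforcer for the response that produced it, and SD for the next response.

```python
# Illustrative sketch: dual functions of link stimuli in a behavior chain.
chain = [
    # (response, stimulus change produced by that response)
    ("turn on faucet",        "running water"),
    ("put hands under water", "wet hands"),
    ("apply soap",            "soapy hands"),
    ("rinse hands",           "clean hands"),
]

for i, (response, stimulus_change) in enumerate(chain):
    functions = [f"conditioned reinforcer for '{response}'"]
    if i + 1 < len(chain):
        functions.append(f"SD for '{chain[i + 1][0]}'")
    else:
        functions.append("followed by the terminal reinforcer, which maintains the whole chain")
    print(f"{stimulus_change}: " + "; ".join(functions))
```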
8. Response class hierarchy
c. Define
● A hierarchy of responses within a functional response class based on the relative
probabilities of those responses under particular conditions.
● Same reinforcer, but any one response might be successful; if one doesn’t work, we move up the
hierarchy.
● Sometimes called “precursors”
Response class hierarchies differ from behavior chains in that responses can
occur independently of any other response in the hierarchy while responses in a
behavior chain are dependent on the occurrence of the previous response in the
sequence.
Any response in a hierarchy can produce the terminal reinforcer that is tied to the
initial EO whereas in a behavior chain, only the final response in the chain can
produce the terminal reinforcer.
9. Omnibus terms (slide#292)
e. Define: Omnibus terms are terms used to categorize stimuli that have multiple
functions, such as appetitive and aversive stimuli.
● Appetitive and aversive are omnibus terms because they refer to
stimuli that: have multiple functions
f. Identify the two common terms in behavior analysis
➢ An aversive stimulus (slide#299)
○ can evoke behavior to terminate it, (CEO-R Threat/warning)
○ weaken behavior when it is presented following a response, (aversive
added)-Positive Px-(Px by pain/unpleasantness added into env)
○ strengthen behavior when it is removed following a response.
(termination of aversive, negative Rx)
Aversive Stimuli - whether a stimulus is aversive depends on how it functions for a
particular behaver for that specific behavior. (slide#305)
● certain stimuli often exhibit aversive fx
● Pain
● Certain smells (rotting animal)
● Tastes (extreme spice)
● Sounds (high pitch squeal)
● Electric shock (might not always be aversive though)
➢ An appetitive stimulus (slide#295)
○ can strengthen behavior when it is presented following a response,
○ can weaken behavior when it is removed following a response,
○ can abate behavior when it decreases the value of a conditioned
reinforcer.
● Fx as Rx when presented following a Rs.- water , money- appetitive
added
● Fx as Px when withdrawn following a Rs.- playtime (time-in taken
away functions as Px)- appetitive withdrawn
● Decreases the value of a conditioned
reinforcer-satiation-example-wallet full of money
Chapter 9
A. The Analysis of Verbal Behavior
Verbal behavior
a. Define: communicative behaviour that is reinforced through the mediation of an
individual (a listener) who has been trained by a verbal community to respond in specific ways
b. Identify examples and non-examples
· Non-examples of Verbal Behaviour: vocal utterances that are not words,
screaming, leading (hand leading), self-injury
Examples of Verbal Behavior: speaking, writing, ASL
Define speaker
o The individual emitting the verbal response, regardless of the form of the VB
o Other terms that may be used: singer, writer, verbalizer ( most inclusive term)
o The speaker could serve as their own listener (e.g., when you talk to yourself)
Define listener
● The Individual who provides relevant antecedents and consequences for a speaker’s
behaviour
● The individual with whom the speaker interacts, who provides the consequences for
the speaker’s VB
● Other terms: observer, reader, mediator ( most inclusive term)
Define audience
● Listeners(mediators) in a trained community
Define verbal episode
● The basic speaker-listener unit of verbal behaviour
B. Selectionism in Verbal Behavior
Define cultural selection
o Practices that contribute to the success of the group are passed on from one
generation to the next
o Language is shaped by the verbal community through cultural selection
o Operant selection shapes the individual’s acquisition of language and the
way in which they speak; Cultural selection shapes the language as a
whole
C. Selection and Topography-Based Verbal
Behavior
Selection-based verbal behavior
o Verbal behaviour in which the speaker selects a stimulus and the listener
responds based on the selection
o Each response looks the same or very similar; what differs is the effect on
the listener
o The listener responds based on what the speaker selected
o Selection-based response forms:
§ Touching, pointing
§ Handing selected pictures, symbols, words
§ Typing
§ Clicking items
o An individual making a choice, regardless of the topography, is not
necessarily selection-based (e.g., “Do you want a puppy or a kitten?” “A puppy”)
o Selection-based verbal behavior is based on selecting stimuli (e.g., pointing
to an item, grabbing a picture, etc.). Therefore, the form of the response is
not as important, so the responses can look very similar. But, it is
important for the listener to discriminate what the speaker is selecting so
they can respond. Any response by the speaker indicating a choice or
selection is not necessarily “selection-based” because this could include
saying, or signing, or writing the choice—in which case, the response
would be topography-based.
Topography-based verbal behavior
o Verbal behaviour in which each speaker response form is different and the
listener responds based on that topography
o Each response looks or sounds different
o The listener reacts to the speaker’s form of verbal response
o Examples of response forms: saying words, forming gestures or signs,
writing words, finger spelling
D. The Elementary Verbal Operants
Identify the elementary verbal operants
Mand
· Verbal operant under the antecedent control of an establishing operation and followed
by a specific reinforcer
· A mand is a verbal response evoked by an establishing operation for reinforcement
and maintained by a specific form of social reinforcement. Therefore, the antecedent that
evokes the mand is an EO for SR.
Review differential reinforcement (differentiation vs. discrimination)
· When differential reinforcement is based on dimensions of stimuli, it will lead to
discrimination—responding in the presence of certain stimuli and not others.
· When differential reinforcement is based on dimensions of responses, it will lead to
differentiation—emitting certain responses and not others.
Define verbal stimulus
● A stimulus that is a result/product of verbal behaviour
Tact
· Verbal operant evoked by a non-verbal discriminative stimulus and followed by
generalized conditioned reinforcement
· Refer to as naming or labeling in common language
· Objects, actions/events, property of an object
· Could refer to both private or public events
Point-to-point correspondence
a. Define
· A specific relation between antecedent verbal stimulus and a verbal response
· Evidenced when the beginning, middle and end of the antecedent verbal stimulus
match the same in the verbal response (letters, phonemes, etc. match exactly)
· Not necessarily in the same modality (e.g., spoken and heard, written and read, etc.)
· E.g., the word “dog”: the letters written on a page, the phonemes, and the signs (ASL) all
match
· It is not “idea-to-idea” correspondence; For example “dog” and “perro” mean the
same thing but do not have point-to-point correspondence
b. Identify the verbal operants with point-to-point correspondence
● Duplic, codic
Define formal similarity
● Antecedent and the response share the same modality ( spoken, written, etc.) + same
formal properties
Duplic
● Verbal operant under the control of an antecedent verbal stimulus that shares formal
similarity with the response
● Same response in the same form
● Maintained by generalized form of reinforcement
● Think duplication; Duplication of vocal, signed or written verbal SD
o Echoics ( vocal)
o Mimetic ( sign)
o Copying text (written)
Echoic
· Verbal operant under the control of a vocal verbal antecedent that shares
formal similarity with the response
· Repeating a vocal verbal unit, where the vocal response matches the vocal
antecedent
· Consequence- generalized conditioned reinforcer
· Echoic responses are common in language acquisition
· Generalized conditioned reinforcement may change to automatic
reinforcement
Mimetic
· Verbal operant under the control of a signed verbal antecedent that shares formal
similarity with the response
· Imitating a signed verbal unit
· Response matches the signed verbal antecedent
· Generalized conditioned reinforcement as a consequence
· Mimetic is not the same as motor imitation ( motor imitation is only a mimetic when it
involves imitation of a signed verbal stimulus)
ASRs
· An echoic is a type of duplic involving vocal verbal behavior. A duplic is a verbal
response that is under the antecedent control of a verbal SD, there is formal similarity,
and it is maintained by a generalized form of reinforcement. Therefore, the antecedent
that controls the echoic is a vocal verbal SD.
· An echoic is a type of duplic involving vocal verbal behavior. A duplic is a verbal
response that is under the antecedent control of a verbal SD, there is formal similarity,
and it is maintained by a generalized form of reinforcement. Therefore, the
consequence that maintains the echoic is a generalized form of reinforcement.
· The verbal response is saying, “Hello.” The antecedent is hearing someone say,
“Hello.” There is point-to-point correspondence between the two (each sound of the
antecedent matches each sound of the response). They also share the same sensory
modality (both vocal). Therefore, there is formal similarity, and this would be an
echoic since they are vocal.
Copying text
· Verbal operant under the control of a written antecedent that shares formal similarity
with the response
· Replicating a written verbal unit
· Written response matches the written antecedent
· Followed by generalized conditioned reinforcement
19. Codic
· Verbal Operant under the control of a verbal antecedent with which it shares
point-to-point correspondence but no formal similarity ( different sense modality)
· Also followed by a generalized conditioned reinforcer
· Response Forms and the Codic
o Transcription (spoken → written) - writing every single word you hear is
transcription; taking shorthand notes is not
o Reading aloud (written → spoken)
o Finger spelling words heard (spoken → sign)
o Saying words finger spelled (sign → spoken)
20. Intraverbal
· Verbal operant under the control of a verbal antecedent with which it does not
share point-to-point correspondence
· Verbal SD and verbal response do not match
· Followed by generalized conditioned reinforcement
· E.g., SD: “Mary had a little...”, Response: “lamb”
An intraverbal is a verbal response that is under the antecedent control of a verbal SD, there
is no point-to-point correspondence, and it is maintained by a generalized form of
reinforcement. Saying “Coffee” would be an intraverbal when hearing ”What do you want to
drink?” (verbal SD with no point-to-point correspondence). Saying, ”Coffee” when having
slept for 2 hours (EO for reinforcement) would be a mand. Saying, “Coffee” when seeing a
pot of coffee brewing (non-verbal SD) would be a tact. Saying, “Coffee” when hearing
someone say, “Coffee” (verbal SD with formal similarity) would be an echoic, which is a
type of duplic. Saying, “Coffee” when seeing “Coffee” written on a menu (verbal SD with
point-to-point correspondence but no formal similarity) would be a codic.
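A minimal, illustrative sketch that turns the “Coffee” example above into a decision procedure over the controlling antecedent’s properties. The parameter names are assumptions made for illustration; the classification rules follow the definitions of the elementary verbal operants in this section.

```python
# Illustrative sketch: classifying an elementary verbal operant by its antecedent.
def classify_verbal_operant(antecedent: str,
                            verbal: bool = False,
                            point_to_point: bool = False,
                            formal_similarity: bool = False) -> str:
    """antecedent: 'EO' (establishing operation) or 'SD' (discriminative stimulus)."""
    if antecedent == "EO":
        return "mand (evoked by an EO, followed by a specific reinforcer)"
    if antecedent == "SD" and not verbal:
        return "tact (evoked by a non-verbal SD)"
    if antecedent == "SD" and verbal:
        if not point_to_point:
            return "intraverbal (verbal SD, no point-to-point correspondence)"
        if formal_similarity:
            return "duplic, e.g., echoic (point-to-point correspondence + formal similarity)"
        return "codic (point-to-point correspondence, no formal similarity)"
    return "unclassified"

# Saying "Coffee" after hearing "What do you want to drink?"
print(classify_verbal_operant("SD", verbal=True))                       # intraverbal
# Saying "Coffee" when seeing "Coffee" written on a menu
print(classify_verbal_operant("SD", verbal=True, point_to_point=True))  # codic
```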
E. Multiple Control
Multiple control
· Pure verbal operants- Under the control of a single antecedent; E.g., Saying “hug”
solely as a result of seeing someone hug
· In most cases however, the response is evoked by more than one antecedent
· Multiple control - When a verbal response is controlled by more than one
environmental variable; much more common than pure verbal operants
· Example: I need to use the restroom and I see a sign that reads “Rest Area”
F. Non-Verbal Listener Behavior
Non-verbal listener behavior
a. Define: Behaviour that is under the antecedent control of a mand to
comply ( a.k.a receptive language; compliance behaviour)
o The consequence for this type of behaviour is often social reinforcement
o Example: Speaker says ”sit down” and the listener sits down;
o Non-verbal listener behavior is controlled by a mand to comply. For
example, a student walking back into the classroom after recess when the
teacher says, “Let’s go back inside.” The teacher is asking the student to
go back inside, which is a mand. The student responds to this mand from
the teacher by walking inside, which is non-verbal listener behavior.
G. Contingency-Shaped and Rule-Governed
Behavior
· Two types of Operant behaviour: Contingency shaped and Rule-governed
o Contingency-shaped behaviour: Behaviour that is shaped through
direct contact with environmental contingencies ( a.k.a event-governed
behaviour);
§ Learned through shaping and maintained via immediate
reinforcement contingencies
§ SD/EO → response → reinforcer
Rule-Governed Behaviour
o Behaviour that is controlled by descriptions of contingencies (rules)
o Learning through language without directly experiencing certain
contingencies would be rule-governed and not contingency-shaped
o Can be learned more quickly than contingency-shaped behaviour
o Rules are especially useful when contingencies are complex/unclear
Rule
o A verbal description of a behavioural contingency
o Function-altering effect of rules: a verbal rule alters the function of
antecedent stimuli
o Functional response class of “following rules” is reinforced
Rule-governed behaviour
o Controlled by the verbal statements; no direct experience with the
contingency described
o Example: A sign at a nature park states “Fine for littering up to $300”;
the contingency does not have to be directly experienced for the
behaviour to come under the control of the rule
o Rule-governed behavior is learned through verbal descriptions of
contingencies. As a result, one might learn faster through rules, since it
doesn’t require trial and error. For example, one might hear, “You need to
stop your car when the traffic light is red. If you were to continue driving
when the light is red, you can get into a car accident or get a traffic ticket.”
The person‘s behavior can change upon hearing this description and never
get into a car accident or get a ticket for running a red light. On the other
hand, contingency-shaped behavior is learned by direct experience with
certain environmental events.
H. Dimensions of Applied Behavior Analysis
Behaviourism
● Philosophy of the science of behaviour that posits that behaviour is the subject matter
of interest and is a function of environmental contingencies
● Takes a systematic approach to the study of behaviour
● Behaviorism is the philosophy underlying behavior analysis. There are several types
of behaviorism, like methodological behaviorism, S-R behaviorism, and radical
behaviorism.
Radical Behaviourism
● B.F. Skinner’s philosophy of the science of human behaviour
● Type of behaviourism that includes and analyses all forms of behaviour including
thoughts, feelings and verbal behaviour
● Radical behaviorism is known as “radical” because it analyzes any type of behavior,
whether public or private. This includes thinking, feelings, verbal behavior, etc. Since
states of being, traits, and attitudes are not behavior, these are not part of radical
behaviorism.
Applied Behaviour Analysis
● Natural science approach to the study of socially significant human behaviour
Defining Features of ABA
Applied:
• Dimension of ABA that requires focus on socially significant human behaviour
• Examples: Language acquisition, improving performance in school, independent
living skills, reducing harmful behaviours
Behavioural
• Dimension of ABA that requires a focus on behaviour in its own right as a target for
change
• Behaviour studied should be directly observable and measurable
Analytic
• Dimension of ABA that requires demonstration of functional relations between
behaviour and environmental events through systematic manipulations (e.g., control
behaviour; systematically change environment and measure the behaviour of interest by
collecting data)
• Interviews do not provide valid information about the function of behavior,
because they are subjective (based on opinion). As a result, people might have different
interpretations of why a particular behavior occurs, which is not enough to develop
an intervention or demonstrate environment-behavior functional relations. Therefore,
interviews alone do not demonstrate the "analytic" dimension of ABA.
• A functional analysis (FA) involves systematic manipulations of environmental
events. These manipulations reveal environment-behavior functional relations, which are of
paramount importance in the development of effective treatment. For these reasons, using a
functional analysis in the provision of behavior-analytic services demonstrates the “analytic”
dimension of ABA.
Technological
• The techniques making up a particular behavioral application are completely
identified and described
• Dimension of ABA that requires procedures to be clearly and completely described
such that another individual could reproduce the application
• Providing a detailed description of treatment implementation would demonstrate the
“technological” dimension of ABA. The example in this ASR provides a clear description of
how to implement a token economy. Therefore, we can say it is technological.
Conceptually Systematic
• Dimension of ABA that requires procedures to be linked to and described in terms of
the basic principles of behaviour
• For example, using extinction for cursing and reinforcement for socially appropriate
behaviour; These are procedures that have been validated in the research
• Linking interventions to the behavior analytic principles demonstrates the
“conceptually systematic” dimension of ABA. In this ASR, the intervention of using
preferred stimuli during self-care tasks is linked to the principle of motivating operations (i.e.,
an AO for escape). Subsequently, this description is conceptually systematic.
Effective
• Dimension of ABA that requires behaviour changes to be in the intended direction to
a practical degree
• Behaviour interventions need to be valid and effective; If treatment protocol is not
effective, it needs to be re-evaluated
• An effective intervention is one that will produce significant changes in behavior.
Changes will be considered significant when they have an impact on the life of the person.
An increase from 10% to 20% on tests will have no impact on the person’s success in the
class. Therefore, this intervention would not be considered effective.
Generality
• Dimension of ABA that requires the effects of an intervention to maintain and spread
to other situations and behaviours
o Maintenance: The continued effect of an intervention in the absence of the
intervention
o Two types of generalization: Stimulus and response generalization
• In this example, Jorah only says, "Dog" when shown a picture of the same type of dog
used during teaching. Jorah doesn't say, "Dog" when shown other types of dogs not presented
during teaching. The "generality" dimension applies only when the intervention incorporates
strategies that promote changes in behavior outside of treatment, for example, to new types
of behavior, new settings, and new stimuli. None of this happens in the scenario described.
o Response generalization
The spread of effects of reinforcement to other similar, unreinforced responses
"New behaviour for free"
AKA induction
E.g., saying "cat" is reinforced; the child starts saying "kitty" as well
o Stimulus generalization
The spread of effects of training to stimuli not present during training
Stimulus generalization = same response but different stimuli or stimulus conditions
PRACTICE QUESTIONS
1. The process of fading from continuous reinforcement to intermittent reinforcement
is referred to as:
a. Ratio strain
b. DRH
c. Schedule thinning
d. Extinction
2. The actual shoe, the pictured shoe, the spoken word 'shoe', and the written word 'shoe'
all evoke your behavior of saying 'shoe.' These variations of 'shoe' exemplify a:
a. Topographical response class
b. Functional response class
c. Stimulus class
d. Feature class
3. I describe a meal I ate; you salivate. Your salivating is an example of a(n):
A. Conditioned response
B. Conditioned stimulus
C. Unconditioned response
D. Operant behavior
4. I receive a reinforcer approximately every 5th time I ask for money from my mother.
What schedule?
a. FR
b. VR
c. FI
d. VI
5. After your boss walks by you, you begin to work harder. This is:
a. Conditioned response
b. Conditioned stimulus
c. Unconditioned response
d. Operant behavior
6. Mildred participates in an experiment at school. The researcher plays a tone, and
then blows a puff of air into Mildred's eye: Mildred blinks. After doing this many
times, Mildred now blinks when she hears a tone. Blinking when the tone elicits it is
a(n):
a. Unconditioned reflex
b. Conditioned reflex
c. Conditioned stimulus
d. Conditioned response
7. My dog has learned to ring a bell by the door when he needs to go outside to
relieve himself. The need to urinate functions as ___ in relation to his bell ringing
behavior.
a. discriminative stimulus
b. abolishing operation
c. stimulus generalization
d. establishing operation
8. I target a behavior for decrease in the home and the behavior increases at school,
where nothing has changed. This effect is referred to as:
a. Consequating
b. Behavior contrast
c. Operant conditioning
d. Conditioned stimuli
9. If you eat your veggies you get dessert. If you do not eat your veggies you don't
get dessert, no matter what. This is an example of
A. S-R-S
B. R-S
C. S-R
D. S-S
10. If you complete 100 jumping jacks in 1 minute your behavior is reinforced. If you
complete less than 100 jumping jacks in 1 minute, there is no reinforcement.
a. DRL
b. VR
c. FT
d. DRH
11. The salty popcorn at the theater increases the reinforcer effectiveness of the diet
coke they sell. This is an....
a. establishing effect
b. evocative effect
c. abative effect
d. abolishing effect
12. I ask you for a cream soda and you give me one, YOUR behavior is:
a. tact
b. mand
c. intraverbal
d. receptive behavior
13. Having eaten the salty popcorn I then reach for the soda that I have and take a
drink. Taking a drink of the soda is due to the ___ effect of the salty popcorn
a. establishing effect
b. evocative effect
c. abative effect
d. abolishing effect
14. When you are stopped at the intersection you wait for the police officer to wave
you through. Then you drive through the intersection and as a result safely make it to
the other side. This is an example of?
A. S-R-S
B. R-S
C. S-R
D. S-S
15. As a field, we may concentrate on changing someone’s verbal statements, ability
to balance a checkbook or how to do laundry. This best exemplifies which dimension
of ABA?
a. applied
b. behavioral
c. analytic
d. effective
16. Every 10 minutes timer goes off and I give you some popcorn (reinforcer). What
schedule?
A. FR
B. VR
C. FT
D. VT
17. A strong smell of feces elicits gagging. This is an example of a(n):
A. Function altering relation
B. Ontogenic relation
C. Operant relation
D. Respondent relation
18. When I sign “elephant” and then little Joe signs “elephant” this is an example of?
a. echoic
b. codic
c. mimetic
d. intraverbal
19. Having a headache momentarily increases my behaviors that have resulted in
removal of my headache in the past. This is an...
a. establishing effect
b. evocative effect
c. abative effect
d. abolishing effect
20. Hearing "Dinner is served" evokes salivation. This is an example of a(n):
a. Function altering relation
b. Operant relation
c. Phylogenic relation
d. Respondent relation
21. Talk of feces paired with the smell of feces leads to talk of feces eliciting gagging.
This exemplifies:
a. Behavior altering effect
b. Establishing effect
c. Function altering effect
d. Value altering effect
22. The following two questions are about Jose. Jose hardly ever does the dishes.
One evening he comes home and sees the sink overflowing with dirty dishes. Josh,
his son, usually does the dishes but is very busy completing his science fair project,
so Jose starts to wash the dishes. Josh walks into the kitchen and sees his dad
doing the dishes, and gets him some macadamia nut ice cream, which he (Josh) had
been hiding.
Jose immediately stops washing the dishes and eats the ice-cream. This is an
example of:
A. Behavior-altering effect of the ice-cream
B. Function altering effect of ice-cream
C. Respondent conditioning
D. R-s contingency
23. Jose, after the above incident now will frequently do the dishes when he sees
that the sink is full, but only if Josh is home. Therefore Josh is functioning as:
a. Sd
b. Sdelta
c. EO
d. SDP
24. When working with a client I ensure to provide a detailed behavior plan, giving
the staff training first and then the detailed written information of exactly how they
should implement the procedure for decreasing the inappropriate behavior. This best
exemplifies the dimension of:
a. applied
b. behavioral
c. effective
d. technological
25. When a stimulus reliably occurs before behavior that contacts reinforcement, this
stimulus becomes an SDr. This change is referred to as:
a. Respondent conditioning
b. Conditioned stimuli
c. Function altering effect
d. Behavior altering effect
26. I have the tendency to turn the thermostat up repeatedly during the day. Once it
reaches a certain temperature, this increases the frequency of my husband's bx of
complaining about the heat, going out to the pool, and threatening to leave if I don't
turn down the thermostat. The effect that the increase in temperature has on my
husband's above behaviors is an….
a. establishing effect
b. evocative effect
c. abative effect
d. abolishing effect
27. The turning up of the thermostat functioned as a ____ for my husband’s whining
bx:
a. EO
b. AO
c. SD
d. SDP
28. I receive a bonus of $100.00 for every 10 completed reports. What schedule?
a. DRH
b. FR
c. VR
d. FT
29. I see a commercial for Dairy Queen and immediately ask Jay if we can go for ice
cream. The commercial had a:
a. Functional altering effect
b. Behavior altering effect
c. Consequential effect
d. Signaling effect
30. Which of the following is NOT true regarding duplics:
a. They lack point to point correspondence
b. They require an exact replication of the antecedent
c. They require formal similarity
d. They include echoics and mimetics
31. Gambling is on what schedule of reinforcement:
a. VR
b. FR
c. FI
d. VI
32. Jessie often engages in whining bx towards Jay. This occurs when he is busy at
work or busy watching football and has been ignoring her. If Jay is spending lots of
time with her then she doesn't whine. In relation to Jessie's whining behavior, the
periods of Jay working and watching football function as:
a. EO for positive punishment
b. EO for negative punishment
c. EO for positive reinforcement
d. EO for negative reinforcement
33. On a cumulative record, this schedule of reinforcement produces a scalloped
pattern:
a. VR
b. FR
c. FI
d. VI
34. Mark whispers softly, instead of talking, and people cannot hear him. What
behavior change procedure are we using if we initially reinforce Mark for whispering
softly, then for whispering loudly, then for talking softly, and finally for talking in a
normal voice volume?
a. Discrimination
b. Differentiation/shaping
c. induction
d. response generalization
35. What effect does an Sdelta have on behavior?
a. None
b. Evokes
c. Abates
d. Abolishes
36. On an airplane, the word "occupied" on a bathroom door acts as what?
A. Sd
B. Sdelta
C. EO
D. AO
37. Jay decides that we need to go to therapy to deal with my whining, so he
schedules a session with a therapist. The therapist suggests that each time I begin
to whine, Jay should stop what he is doing and have a conversation with me about how I
should be more respectful of his time, and that we should then spend 2 minutes
hugging. Jay implements the recommendations that the therapist gives, but I just
keep whining. In fact, I whine even more than I did before. In relation to my whining,
the conversation and 2-minute hug worked as:
a. EO
b. SD
c. Reinforcer
d. Punisher
38. Stimulus generalization is the opposite of?
a. Response generalization
b. Induction
c. Discrimination
d. Differentiation
39. The “mind,” the “self,” and the “will” are all considered?
a. behavioral explanations
b. hypothetical constructs
c. deterministic structures
d. functional relations
40. Another term for response generalization is?
a. Differentiation
b. Discrimination
c. Shaping
d. Induction
41. During a discussion I ask my husband when he’ll have the door fixed. He
answers me, “someday”. His behavior is:
a. mand
b. tact
c. intraverbal
d. mimetic
42. I have to have a procedure done that requires me to drink a gallon of water. At
this point I do not want any more water, because there is….
a. An establishing operation
b. An abolishing operation
c. An SD
d. An SDp
43. About every 5 minutes, Joey’s mom stops and gives Joey attention. What
schedule?
a. FR
b. VT
c. FI
d. VR
44. When I see Jane, I totally freak and get out of the room whatever way I can. Jane
is a:
a. SDr
b. SDelta
c. EO
d. AO
45. Another therapist (that Jay goes to alone after all this mess with my whining)
recommends that, before I start whining, he try to give me frequent attention and
affection throughout the day. Jay does this, and my whining behavior has decreased
significantly. In relation to my whining behavior, the frequent attention and
affection worked as:
a. EO
b. AO
c. SD
d. SDP
46. I see the word “crisp” written as part of my spelling list on the board and write it
on my paper. Writing the word as a result of seeing the word on the blackboard is a:
a. codic
b. duplic
c. intraverbal
d. non-verbal behavior
47. When Susie says “ball” I give her a ball. When she says “shoes” I give her shoes.
When she says “Damn” I take away all her toys. My actions are best described as:
a. consequences
b. reinforcers
c. antecedents
d. motivating operations
48. Jane has several ways to gain my attention. She can call my name, wave at me,
scream “HELP” or throw something at me. Her behaviors are an example of?
a. respondent relations
b. consequences
c. stimulus class
d. functional response class
49. Radical behaviorism includes private events and verbal behavior as elements of
behavior that can be studied scientifically.
a. True
B. False
50.-53. I am working with a client, Sam. I measure the time between two instances
of Sam’s hand raising behavior as 33 seconds. I also count the total number of times
he raises his hand in 1 hour: 4. In addition, whenever Sam’s hair pulling behavior
occurs in the classroom, I write down what happens right before, a description of the
behavior, and what happens right after. I also set up a system where the teacher
provides praise to Sam every 10 minutes of class time. I graph and review the data
every week and will continue to make changes as needed based on the changes in
behavior that are seen.
50. The measurement of 4 hand raises in 1 hour is:
a. frequency
b. rate
c. duration
d. latency
51. The dimension of ABA best exemplified by my graphing and review of graphs is:
a. applied
b. behavioral
c. effective
d. technological
52. When I measure the time between two instances of Sam’s hand raising behavior
as 33 seconds, this is an example of:
a. a dimensional quantity of behavior
b. a fundamental property of behavior
c. duration
d. topographical response class
53. The specific dimensional quantity when I measure the time between two instances
of Sam’s hand raising behavior as 33 seconds is:
A. Rate
B. Latency
C. Duration
D. IRT
54. All of the following are dimensional quantities of behavior EXCEPT:
a. duration
b. latency
c. repeatability
d. celeration
55. Respondent behavior that is learned is due to:
a. ontogenic provenance
b. consequating
c. phylogenic provenance
d. polyvore provenance
56. Bob and Sue are playing poker, and betting with cash. The fact that Sue recently
lost her job would likely have what effect on her behavior of betting large sums of
money?
a. evocative
b. abative
c. establishing
d. abolishing
57. A motivating operation has a ______ and _______ effect.
a. function altering and behavior altering
b. behavior altering and establishing
c. function altering and value altering
d. behavior altering and value altering
58. A little rat in a Skinner box has two levers. When he pushes the green one 5
times a food pellet comes out. At the same time, he can push the yellow one, and
the first time he does so after 3 minutes he’ll get a food pellet. What’s the schedule
of reinforcement?
a. VR
b. multiple
c. mixed
d. concurrent
59. The difference between a multiple and a mixed schedule is that the mixed schedule
is associated with an additional stimulus.
a. True
b. False
60. What schedule of reinforcement produces the highest and steadiest rates of
responding?
a. VI
b. chained
c. VR
d. concurrent
61. A relatively permanent change in behavior as a result of your experience is:
a. operants
b. learning
c. cultural selection
d. respondents
62. Jane and Joey are boyfriend and girlfriend. When Jane hasn’t seen Joey for a
long time, she is more likely to drive over to his house. When she calls first, Joey lets
her in when she arrives. If she doesn’t call first, Joey doesn’t let her in. As time passes,
Jane makes sure that she always calls first before driving over to Joey’s house.
Joey is using what process to influence Jane’s behavior?
a. Signaling
b. Motivating operations
c. Differential reinforcement
d. Induction
63. In the previous scenario what type of antecedent condition is present when Jane
has not seen Joey for a long time?
a. UMO
b. CMO
c. SDr
d. SDp
64. Which type of selection is the result of the passage of genetic material from one
generation to the next?
a. Operant Selection
b. Natural Selection
c. Cultural Selection
65. This type of antecedent condition results in an increase of the effectiveness of a
reinforcer and has an evocative effect on behavior.
a. EO for reinforcement
b. EO for punishment
c. AO for reinforcement
d. AO for punishment
66. This schedule of reinforcement requires the organism to complete 3 schedules of
reinforcement prior to receiving a single reinforcer. There is no specific stimulus
associated with each schedule.
a. Multiple
b. Chained
c. Mixed
d. Tandem
67. This type of motivating operation is one that signals that the individual’s
environmental conditions are likely to worsen and increases the likelihood that
escape/avoidance behavior will occur.
a. CMO-T
b. CMO-S
c. CMO-R
d. CMO-P
68. An antecedent condition that indicates that there is no availability of punishment
is a:
a. UEO
b. SDP
c. UAO
d. Sdelta P
69. Which motivating operation will increase the effectiveness of the consequence
and have an abative effect on behavior?
a. EO for reinforcement
b. AO for reinforcement
c. EO for punishment
d. AO for punishment
70. A stimulus which signals the unavailability of reinforcement has what effect on
behavior:
a. establishing
b. evocative
c. abative
d. abolishing
71. A motivating operation indicates both of the following:
a. The value of the consequence and if it’s efficient
b. The value of the consequence and the likelihood of the consequence
occurring
c. The value of the consequence and the likelihood of behavior occurring
d. The value of the behavior and the likelihood of the consequence
72. The difference between public and private behavior is?
a. One is not really behavior
b. There is no difference
c. One involves only the mind
d. One can be viewed by others and one can only be observed by the person
engaging in the behavior.
73. Which of the following does not pass the dead man’s test:
a. Susan responds “yes” when asked if she likes soup
b. Cameron puts his plate in the sink after dinner
c. Dave doesn’t get up after dinner
d. Giulia yells “no way!” when asked to stand up
ANSWERS
1. C
2. C
3. A
4. B
5. D
6. D
7. D
8. B
9. B
10. D
11. A
12. D
13. B
14. A
15. A
16. C
17. D
18. C
19. B
20. D
21. C
22. A
23. A
24. D
25. C
26. B
27. A
28. B
29. B
30. A
31. A
32. C
33. C
34. B
35. C
36. B
37. C
38. C
39. B
40. D
41. C
42. B
43. B
44. C
45. B
46. B
47. A
48. D
49. A
50. B
51. C
52. A
53. D
54. C
55. A
56. B
57. D
58. D
59. B
60. C
61. B
62. C
63. B
64. B
65. A
66. D
67. C
68. D
69. C
70. C
71. C
72. D
73. C