In this short article I would like to address the definition of the behavior science term “stimulus.”
The term is defined in Webster’s Online as:
“… [S]omething that rouses or incites to activity … an agent (as an environmental change) that directly influences the activity of a living organism or one of its parts (as by exciting a sensory organ or evoking muscular contraction or glandular secretion)”
Not the easiest of definitions to understand, so let’s try the “student” definition, also from Webster’s Online:
1. Something that rouses or stirs to action : INCENTIVE
2. Something (as an environmental change) that acts to partly change bodily activity (as by exciting a sensory organ) <heat, light, and sound are common physical stimuli>
Basically, a stimulus is an event or thing in the environment that either elicits behavior, or that follows a behavior and has some effect upon it; in short, something an animal reacts to.
I will limit the discussion in this article to those stimuli that occur immediately after a behavior and serve to increase, maintain, or decrease the strength of that behavior.
A stimulus that follows a behavior and serves to increase or maintain the strength of that behavior is called a reinforcer. These stimuli are things the animal/bird will work to gain.
A stimulus that follows a behavior and serves to reduce its strength is called a punisher. These stimuli are things the animal/bird will work to avoid.
The procedures of Reinforcement and Punishment will be covered in a future article.
Sid.