**Chapters 17 and 18: Schedules (ANSWERS)**

1. Ratio schedules and interval schedules of reinforcement and punishment. a. Compare and contrast.


 * **ANSWER:**
 * **Similarities:** Both are ways in which intermittent reinforcement or punishment may be scheduled.


 * **Crucial Difference:**
 * **Ratio schedules** - the outcome for a response is delivered only after a specific **//number//** of responses have occurred since the last reinforced or punished response, regardless of the amount of time in which those responses occurred.
 * **Interval schedules** - the crucial measure is the amount of **//time//** that has passed since the last response that was reinforced or punished, without regard for the number of responses that have occurred in that time frame.

2. Variable-ratio (VR) schedules in the Skinner box versus the scheduling used by slot machines in Las Vegas. a. Compare and contrast these two schedules.


 * **ANSWER:** //We could simply copy in the table provided by PB here, but that wouldn’t be sufficient. You need to be able to explain what the table means in English. So here we go…//

 * **Partial Similarities:** A typical gambling “schedule of reinforcement” and a VR schedule in the Skinner box may seem the same because the reinforcer is delivered only intermittently, dependent on the number of responses.

//(By the way, don’t be too intimidated by this answer or the answer for number 39. These are just provided as examples of answers to these questions. As long as you can adequately explain each major difference between the two concepts in your own words, that will be sufficient)//
 * **FOUR crucial differences:**
 * The first is that there are a variety of **//other reinforcers//** interspersed between lever pulls on the slot machine that are absent in the Skinner box. (There are the lights, music, and visual stimuli contingent on pulling the lever on the slot machine. Even if you are continually losing your money to one machine, it is still reinforcing to continue to pull the lever. Rudolph doesn’t get those extra reinforcers in his Skinner box.)
 * The second is that the **//amount of reinforcer//** delivered **//varies//** between ratios. (So while you may win $10 the first time you win on the slot machine, you may win $100 the next time you win. In the Skinner box, there is only one value for the reinforcer: ONE drop of water. More or less is not delivered between ratios. Again, those tricky casino operators are trying to keep you gambling with the promise of a big payout, but we don’t do that with Rudolph. He continues to press simply for the single drop of water that we will occasionally give him.)
 * The third difference between these two schedules is that ratios in the Skinner box can be **//MUCH higher//** than those used on the slot machine. (We can get Rudolph to continue to press a lever on a VR 100 schedule for a single drop of water, but when it comes to the slot machine, customers would stop pulling that lever WAY before they ever got their first payout if they had to pull it 100 or more times before winning some money.)
 * The fourth is that there are also those **//emotional reinforcers//** that we have to consider. This is known as the “close, but no cigar!” phenomenon. (You might get those two cherries in a row (!!!), but then a lemon (aww, poor baby). That is quite a powerful reinforcer that keeps many gamblers throwing their money into the machine. In the Skinner box, however, there is no such thing as being //close// to getting a drop of water but not quite getting it; the water is either delivered, or it is not.)

3. Fixed-interval (FI) versus fixed-time schedules. a. Compare and contrast


 * **ANSWER:**
 * **Similarities:** Delivery of the outcome with these two schedules is dependent upon the passage of time.


 * **Crucial Difference:**
 * For an FI schedule, the delivery of an outcome is **dependent on the first response** after a fixed period of time since the last opportunity for reinforcement or punishment.
 * In contrast, the delivery of an outcome with a fixed-time schedule is dependent upon the passage of time since the last delivery of the reinforcer or aversive condition, **//REGARDLESS OF WHETHER A RESPONSE OCCURRED//**. //(That last part is the key. That is why it is big, bold, and italicized. Remember it.)//

4. Limited hold versus a deadline. a. Compare and contrast.


 * **ANSWER:**
 * **Similarities:** Both of these concepts specify the time in which a response will produce a reinforcer.[1]


 * **Crucial difference:**
 * A deadline specifies only the time **before which** a response will be reinforced. (In other words, a deadline states that the response can occur any time from now until a specific time and still be reinforced.)
 * A limited hold specifies the time **during which** a response will be reinforced (or punished). (A limited hold states that a response must occur between some time in the future and another specific time in order to be reinforced.)

//(As added aids, the diagrams below are provided. While they aren’t required for a complete answer on the quiz, they may be helpful as you teach this concept to other students.)//

5. Provide an example in daily life both of a deadline and of a limited hold.
 * **ANSWER:**
 * An example of a deadline would be the due date for a term paper.
 * An example of a limited hold would be running through the park between 6 and 7 pm when you know that another cute boy or girl regularly runs through the park at that time (and presumably you will use that time to chat and get to know them – thus, the reinforcer).

6. Using the examples that you’ve provided, please explain how each fits the definition of a deadline or limited hold using the terminology that you provided in your answer to 4a.


 * **ANSWER:**
 * **Term paper** - reinforcement (which is the prevention of a failing grade, because this is an avoidance contingency) is delivered for a response made at any time between now and the due date.
 * **Running through the park** - reinforcement will only be delivered **//if//** you run between 6 and 7 pm. If you run through the park at 5 pm, then your response of looking for your “limited hold angel” in the park will not be reinforced. (The response has to occur between two specific times in the future; unlike with a deadline, it cannot occur at just any time between now and some time in the future.)

7. Why does intermittent reinforcement increase resistance to extinction (as compared to continuous reinforcement)?

 * **ANSWER:**
 * The reason intermittent reinforcement increases resistance to extinction is that there is greater stimulus generalization between intermittent reinforcement (INT) and extinction than between continuous reinforcement (CRF) and extinction.

(Let’s take this back to the Skinner box. For CRF, Rudolph will receive a drop of water for every lever press, but for INT (and let’s assume it is on an FR 10 schedule), Rudolph will receive a drop of water for every ten responses. When we implement the extinction procedure, Rudolph will receive no drops of water whatsoever. Now when comparing the CRF schedule to extinction, all of a sudden, the reinforcement that was to be delivered contingent upon every lever press has stopped completely, so extinction will occur rapidly. However, when comparing the FR 10 schedule to extinction, those first nine lever presses are entirely the same: there is no water presented contingent upon the lever press. It is only for the **tenth** lever press that INT and extinction differ. Therefore, since there is much more similarity between INT and the extinction procedure, stimulus generalization is more likely to occur, and the frequency of the lever press will diminish more slowly.)

[1] We originally said, “the time in which a response will be reinforced,” but often it will be an analog to reinforcement. Also, on rare occasions, it might be the removal or presentation of an aversive condition.