Tuesday, January 6, 2015

The problem of variable intervals

Back in 1972 I read an article by B.F. Skinner with the intriguing title, "God is a Variable Interval." I think it was in Playboy magazine -- because I only read it for the articles, you know. Actually that is sadly sort of true.

Within the context of operant conditioning (training through the giving and withholding of rewards), a variable interval means that the desired response is rewarded NOT every time, but periodically and on a seemingly random schedule. Behavior trained this way is very hard to extinguish.

Extremely oversimplified and perhaps misleading example: every time the pigeon hits a red button, you give it a pellet of food. The pigeon quickly learns to hit the red button. BUT if you suddenly stop rewarding it after having rewarded every single press for an extended period, the pigeon will try a few times and then stop when the reward is not forthcoming.

In the variable interval scenario, the pigeon is rewarded every time for a while, then is not rewarded once, then is rewarded a few times, then not rewarded once, then rewarded a few times, then not rewarded a couple of times, then rewarded again ... The pigeon will keep hitting the button much longer because it has learned to expect the reward eventually, even if not every time. Instead of stopping after the first couple of failures, it redoubles its effort. And even though the reward comes only occasionally, it keeps at it because it never knows whether this will be the time ... or maybe the next time ....
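
For the programmers in the audience, here is a toy sketch (in Python, and entirely my own illustration, nothing from Skinner's article) of one common explanation for why the variable schedule is so stubborn: the pigeon only gives up once a dry spell lasts longer than any it ever sat through in training and still got paid afterward. The particular numbers (500 training presses, a "patience" of 3) are made up for the demo.

    import random

    def presses_before_giving_up(reward_prob, training_presses=500,
                                 patience=3, seed=1):
        # Toy model: during training, each press pays off with probability
        # reward_prob (1.0 = every single time; lower values give a
        # variable, intermittent schedule).  The pigeon remembers the
        # longest dry spell it ever endured that still ended in a pellet.
        rng = random.Random(seed)
        longest_dry_spell = 0
        current_dry_spell = 0
        for _ in range(training_presses):
            if rng.random() < reward_prob:
                current_dry_spell = 0          # pellet! the streak resets
            else:
                current_dry_spell += 1
                longest_dry_spell = max(longest_dry_spell, current_dry_spell)
        # In extinction every press fails, so the pigeon gives up only once
        # its dry spell runs 'patience' presses past its worst training run.
        return longest_dry_spell + patience

    if __name__ == "__main__":
        print("rewarded every time:    gives up after",
              presses_before_giving_up(reward_prob=1.0), "presses")
        print("rewarded ~1 in 3 times: gives up after",
              presses_before_giving_up(reward_prob=0.3), "presses")

Run it and the pigeon rewarded every time quits almost immediately, because the very first missed pellet is already unlike anything it has seen, while the one on the variable schedule hangs on for around twenty presses, because to that bird a long dry spell is nothing new.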

Sound familiar? Skinner used the example of God promising rewards but not always coming through, yet people keep asking on the chance that this might be the time it works. I don't think the article was really about the God angle at all, but it did make a catchy title. I am sure the title is the only reason I read the article.

Which is a woefully long (and hopefully not too inaccurate) way of telling a tale about the cats. (Trust me, the weather will be brutal tonight and tomorrow and I will get back to that.) The cats get their treats (mentioned in an earlier post) on a somewhat regulated schedule. When I am working in my basement office and realize it is time for a treat, I come upstairs and Sundance comes running. Cassidy may or may not rouse herself to meet me at the top of the stairs, but she does raise her head in expectation.

The problem is that the cats have now come to associate my climbing the stairs with getting a treat. It took me a while to realize I was reinforcing this by sometimes handing out a treat even when one was not due, just because they were following me around and getting in the way when I had come upstairs for some other reason. So now I have trained them to expect something every time I come up. I am trying to extinguish this behavior (train them that it is not going to happen), but because I have been irregular about giving or withholding the reward, it will be hard to break them of the habit. Okay, maybe impossible. Or maybe we could install a bathroom in the basement and I could limit my trips up from the depths to the times the cats are supposed to be treated, the way they clearly expect to be treated all the time.

I know -- pretty high-end problem, right?
