B.F. Skinner
Abstract
B. F. Skinner was undoubtedly one of the most controversial, yet most influential, psychologists of the last century, and he remains one of the most famous American psychologists. This paper begins by looking at who B. F. Skinner was: his educational background and how he became interested in the field of psychology. It then examines what B. F. Skinner is best known for in the world of psychology, operant conditioning and schedules of reinforcement, and explains what operant conditioning (along with how the Skinner box works) and schedules of reinforcement are. The paper also explores some of the other projects, inventions, and publications Skinner produced during his lifetime.
B. F. (Burrhus Frederic) Skinner was born on March 20, 1904, in a small town in Pennsylvania and passed away on August 18, 1990, in Massachusetts. His father was a lawyer, and his mother was a well-educated woman. Skinner had a younger brother who, unfortunately, died at sixteen from a cerebral aneurysm. Skinner was brought up to be old-fashioned and hard-working. As a boy, he loved being outside and enjoyed building things. He was a very active child who even enjoyed going to school.
Skinner obtained a B.A. in English literature from Hamilton College in 1926 and decided he wanted to become a writer. He moved back home; however, he did not do much writing. In a year's time, he wrote only a dozen small articles for the newspaper. Even though times were difficult, Skinner had vowed to himself that he would give himself two years to make it as a writer; if his dream had not come true by the end of those two years, he would then choose another career. During this time, Skinner held several different jobs: making ship models in a workshop in the family garage, working as a landscape gardener, and clerking in a bookstore. All the while, Skinner was dreaming of becoming a writer.
While Skinner was working as a bookstore clerk, he came across some books by Pavlov and Watson. He found the books so interesting and exciting that from that moment on he was eager to learn more about the field of psychology. He was so eager, in fact, that on May 21, 1928, he applied for enrollment in the psychology graduate program at Harvard University, and he was accepted on May 24, 1928. Skinner began classes the following September, earned his master's degree in 1930, and earned his doctorate in 1931. After earning his doctoral degree, he stayed at Harvard doing research until 1936.
Upon leaving Harvard in 1936, Skinner moved to teach at the University of Minnesota, where he met and married Yvonne Blue. In 1945, Skinner became chairman of the psychology department at Indiana University. Then, in 1947, he was invited back to Harvard University to give the William James Lectures, and in 1948 Harvard asked him to join its psychology department. Skinner stayed with Harvard University until his retirement in 1974.
In 1937, B. F. Skinner coined the term operant conditioning in the setting of reflex physiology, to separate what he was interested in, behavior that operates on the environment, from the reflex-based subject matter of Pavlovian conditioning. The term was new; its referent, on the other hand, was not entirely new. Operant behavior, defined by Skinner as behavior "controlled by its consequences," was in practice barely changed from what had earlier been named "instrumental learning" and what most individuals would call habit. Any well-trained "operant" is in effect a habit.
Skinner's research on operant conditioning earned him the title of father of operant conditioning. Skinner found a way to extend, transform, and refine Thorndike's reward-learning theory into operant conditioning. He coined the term operant conditioning, which roughly means changing behavior through the use of reinforcement given after the preferred response. In order to observe operant conditioning behavior, Skinner created the operant conditioning chamber, or Skinner box.
"A Skinner box typically contains one or more levers which an animal can press, one or more stimulus lights and one or more places in which reinforcements like food can be delivered. The animal's presses on the levers can be detected and recorded and a contingency between these presses, the state of the stimulus lights and the delivery of reinforcement can be set up, all automatically. It is also possible to deliver other reinforces such as water or to deliver punishers like electric shock through the floor of the chamber. Other types of response can be measured - nose poking at a moving panel, or hopping on a treadle - both often used when testing birds rather than rats. Of course, all kinds of discriminative stimuli may be used.
In principle, and sometimes in practice, it is possible for a rat to learn to press a bar in a Skinner box by trial and error. If the box is programmed so that a single lever-press causes a pellet to be dispensed, followed by a period for the rat to eat the pellet during which the discriminative-stimulus light is out and the lever inoperative, then the rat may learn to press the lever if left to his own devices for long enough. This can, however, often take a very long time. The methods used in practice illustrate how much the rat has to learn to tackle this simple instrumental learning situation. The first step is to expose the rat, in his home cage when he is hungry, to the food pellets he will later be rewarded with in the Skinner box. He has to learn that these pellets are food and hence are reinforcing when he is hungry. Now he can be introduced to the Skinner box.
Initially there may be a few pellets in the hopper where reinforcers are delivered, plus a few scattered nearby, to allow the rat to discover that the hopper is a likely source of food. Once the rat is happy eating from the hopper, he can be left in the Skinner box and the pellet dispenser operated every now and then, so that the rat becomes accustomed to eating a pellet from the hopper each time the dispenser operates. The rat is probably learning to associate the sound of the dispenser operating with food - a piece of classical conditioning that is incidental to the instrumental learning task. Once the animal has learned that the food pellets are a reinforcer and where they are to be found, it would, however, still probably take some time for the rat to learn that bar pressing when the SD light was on produced food.
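To make the contingency described above concrete, the short Python sketch below simulates the basic arrangement in a very simplified form: a pellet is delivered only when a simulated rat presses the lever while the discriminative-stimulus (SD) light is on, and each reinforced press makes pressing in the presence of the light a little more likely. The probabilities, the learning step, and the variable names are illustrative assumptions of mine, not figures from Skinner's work or from the passage quoted above.

    # Minimal sketch of the lever-press contingency (assumed numbers, not data).
    import random

    random.seed(1)

    press_prob_light_on = 0.05   # initial chance of pressing while the SD light is on
    press_prob_light_off = 0.05  # initial chance of pressing while the light is off
    learning_step = 0.02         # how much one reinforced press raises the "light on" probability

    pellets = 0
    for trial in range(500):
        light_on = random.random() < 0.5          # the SD light is on for about half the trials
        p = press_prob_light_on if light_on else press_prob_light_off
        pressed = random.random() < p

        if pressed and light_on:                  # the contingency: press + light -> pellet
            pellets += 1
            press_prob_light_on = min(1.0, press_prob_light_on + learning_step)

    print(f"pellets earned: {pellets}")
    print(f"press probability with light on:  {press_prob_light_on:.2f}")
    print(f"press probability with light off: {press_prob_light_off:.2f}")

Running the sketch shows the "light on" press probability climbing while the "light off" probability stays flat, which is the kind of discrimination the quoted passage says the rat must slowly acquire.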
The problem is that the rat is extremely unlikely to press the lever often by chance. In order to learn an operant contingency
...
...