Monday, January 9, 2017

Chasing Mirth


Ambiguous images produce a curious phenomenon: when people see a bistable image (like the duck/rabbit figure) flip from one perception to the other, or when they see through a visual illusion, they laugh. (I noticed this phenomenon while watching Al Seckel’s TED Talk, Visual illusions that show us how we (mis)think, posted April 2007. I recommend watching it and listening to the audience, especially during the first third or so of the presentation.)  Now why should they laugh? And what does it tell us about humor?  About mirth?  Most interesting (to us) is how this phenomenon could help us find a physiological index of mirth.  I am wont to think that laughter is an expression of a subjective feeling of mirth.  You might have noticed that this thinking implies that laughter is stimulated by the subjective feeling of mirth, that is, first mirth then laughter.  (It could be the other way around, as the James-Lange theory of emotion would have it: the behavior, laughter, precedes the emotion, mirth.)  Although a bit speculative, perhaps mirth is generated by neurons in the ventral striatal reward area in response to a rewarding cognition, and that signal is passed to the supplementary motor area to generate laughter.  We would like to find evidence to support this causal chain, at least insofar as the subjective feeling precedes the motor response.

A question, then, is what generates mirth?  Is there something in common between jokes and bistable images?  The hypothesis we plan to test is that the common “cognition” is comprehension.  Common to both “getting” jokes and “resolving” ambiguous images is the experience of “I get it,” “I see it,” or “I understand it.”  Comprehension is the “rewarding cognition” that mediates mirth and laughter, or so we will test.

The purpose of this exploration into the physiology of mirth is that something besides jokes is needed to determine whether stroke survivors can experience mirth, that is, feelings of joy.  While it is clear that some stroke survivors experience mirth (they laugh at jokes), it is not clear whether other stroke survivors do.  One difference between stroke survivors who laugh and those who do not laugh at jokes is the location and extent of the brain lesion caused by the stroke.  Jokes are a problem for some stroke survivors because jokes require intact language, but various aspects of language are compromised for some stroke survivors.  Not only could the damage from the stroke have affected language processing, it could also be that the stroke damaged the ventral striatal “reward area,” thereby eliminating mirth experiences (I suspect). And too, if we can find a way to elicit mirth from language-compromised patients, can it be used therapeutically?  We need other avenues to mirth.
  

Methodologically, we will collect some neurophysiological measures.  We are going to place sensors on the face to detect activity in the zygomaticus muscle, which runs from the corner of the mouth up toward the cheekbone, to detect even the faintest tendency to smile.  We will attach another sensor to the forehead just between the eyes to detect activity in the corrugator muscle, which will register a tendency to frown.  In addition to these electromyographic (EMG) measures, we will attach a third set of sensors to the volar surfaces of the phalanges of the index and middle fingers of the non-dominant hand to detect electrodermal activity (EDA, also known as galvanic skin response, GSR). EDA will suggest an emotional response.  If the emotional response is related to mirth, then we expect EDA to be correlated with activity in the zygomaticus but not with activity in the corrugator. Perhaps I should say correlated more strongly with activity in the zygomaticus than with activity in the corrugator, because a frown will also be associated with a subjective emotional response generated by the non-jokes (“unjokes,” as one grad student called them), though hopefully a much weaker one.  In any case, mirth will have to be detected neurophysiologically, and these measures compared across different (potential) sources of mirth.
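
To make the intended comparison concrete, here is a minimal Python sketch of how an EDA trace could be correlated with the two EMG channels.  The signals, sampling rate, and smoothing here are invented placeholders, not our actual recording or analysis pipeline; the point is only the shape of the comparison (EDA vs. zygomaticus, EDA vs. corrugator).

     import numpy as np

     def mirth_correlations(eda, zyg_emg, corr_emg):
         """Correlate EDA with the smile (zygomaticus) and frown (corrugator) channels.

         All three arguments are 1-D arrays of equal length (one value per sample,
         already rectified and smoothed offline).  Returns the two Pearson
         correlations and their difference.
         """
         r_zyg = np.corrcoef(eda, zyg_emg)[0, 1]    # EDA vs. smile muscle
         r_corr = np.corrcoef(eda, corr_emg)[0, 1]  # EDA vs. frown muscle
         return r_zyg, r_corr, r_zyg - r_corr

     # Hypothetical signals: 60 s at 10 Hz, with EDA driven mostly by "smiling."
     rng = np.random.default_rng(0)
     t = np.arange(600)
     zyg = np.clip(np.sin(t / 50.0), 0, None) + 0.1 * rng.standard_normal(600)
     corr = 0.1 * np.abs(rng.standard_normal(600))
     eda = 0.8 * zyg + 0.2 * rng.standard_normal(600)

     r_zyg, r_corr, diff = mirth_correlations(eda, zyg, corr)
     print(f"EDA-zygomaticus r = {r_zyg:.2f}, EDA-corrugator r = {r_corr:.2f}, "
           f"difference = {diff:.2f}")

If mirth is doing the work, the first correlation should be reliably larger than the second across participants and stimulus types.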

Wednesday, March 9, 2011

The stimuli

The sentences we use as stimulus materials were obtained from Seana Coulson and Marta Kutas, distinguished researchers from the University of California, San Diego.  They used their sentences in an event-related potential (ERP) study in which the event was the last word of the sentence.  (They wanted to see if there was a systematic deflection in brain waves at a particular time following the last word.  There was.) The 60 sentences, or "one-liners," were each a sentence frame that, when completed with a last word, created either a joke or a control (non-joke).  The last word of each one-liner is called the disjunctor.  Whether it completes a joke or a control, the disjunctor is matched for length, frequency, and cloze probability.  The cloze probability is the proportion of participants who produced the word when given the one-liner without its last word and asked to say the first word that came to mind.  The disjunctors used in the ERP study all had a cloze probability of 3% to 5%.  That is, only 3% to 5% of the respondents produced that word spontaneously.
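
As an illustration, cloze probability is just a proportion, and it could be computed from norming responses with a few lines of Python.  The responses listed below are invented for the example; they are not Coulson and Kutas's norming data.

     from collections import Counter

     def cloze_probability(completions, target_word):
         """Proportion of norming participants who produced target_word as the
         first word that came to mind for an incomplete sentence frame."""
         counts = Counter(word.strip().lower() for word in completions)
         return counts[target_word.lower()] / len(completions)

     # Invented norming responses for the frame
     # "When I asked my bartender for something cold and full of rum,
     #  he recommended his ___"
     responses = ["daiquiri", "mojito", "daiquiri", "punch", "special",
                  "daiquiri", "mojito", "colada", "cocktail", "daiquiri"]

     print(cloze_probability(responses, "wife"))      # 0.0 -- a low-cloze joke ending
     print(cloze_probability(responses, "daiquiri"))  # 0.4 -- a much more expected ending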

The sentences are further subdivided by constraint type, with 30 sentences of each type crossed with the joke/control distinction. Coulson and Kutas (1998) noted, when originally collecting cloze probabilities, that some sentences were spontaneously finished with a particular last word by many participants, whereas other sentences were completed with a much greater variety of last words.  These were labeled high constraint and low constraint sentences, respectively.  Note that the endings most often generated for the high constraint sentences were not the last words used when the sentences were presented to subjects in the ERP study.  For presentation, the disjunctor used was a word that matched the cloze probability of the words generated for the low constraint sentences.  In this way the disjunctors were matched at a cloze probability of 3% to 5% for both constraint types.  Thus, the 60 one-liners were composed of 30 high constraint and 30 low constraint sentences, and half of each constraint type were jokes and half were controls.  For example, compare sentences (1) and (2), with a small sketch of the resulting design after them:

     (1) When I asked my bartender for something cold and full of rum, he recommended his daiquiri.

     (2) When I asked my bartender for something cold and full of rum, he recommended his wife.
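
To make the two-by-two design (constraint by joke/control) concrete, here is a small Python sketch of how the stimulus set could be represented and sanity-checked.  Only two invented entries are shown; the cloze values and the constraint label attached to the bartender frame are placeholders, and the real set contains 60 items, 15 per cell.

     from dataclasses import dataclass
     from itertools import product

     @dataclass
     class OneLiner:
         frame: str             # sentence without its final word
         disjunctor: str        # final word actually presented
         is_joke: bool          # joke ending vs. matched control ending
         high_constraint: bool  # constraint of the frame
         cloze: float           # cloze probability of the disjunctor (proportion)

     # Two invented entries; the full set has 60 items (30 high, 30 low constraint),
     # half of each completed as jokes and half as controls.
     stimuli = [
         OneLiner("When I asked my bartender for something cold and full of rum, "
                  "he recommended his", "wife", True, False, 0.04),
         OneLiner("When I asked my bartender for something cold and full of rum, "
                  "he recommended his", "daiquiri", False, False, 0.04),
     ]

     # Sanity checks we would want to run on the full set of 60.
     assert all(0.03 <= s.cloze <= 0.05 for s in stimuli), "disjunctor cloze out of range"
     for joke, high in product([True, False], [True, False]):
         n = sum(s.is_joke == joke and s.high_constraint == high for s in stimuli)
         print(f"joke={joke}, high_constraint={high}: {n} item(s) (full set: 15 each)")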

We have now presented the sentences to a group of students (50 to 60 of them) and to 18 stroke survivors.  Soon we will present them to age-matched controls (for the stroke survivors), to be recruited from faculty and staff.  I will tell you about some results and new developments with the project in the next post.

Friday, October 8, 2010

Brain and Humor

My students and I are about to begin collecting data for our study of the brain and humor. The basic research question is this: is there a site in the brain necessary for "getting" a joke? We already know that several brain areas become active when comprehending a written joke, especially in the left hemisphere. And we know that different areas become active when comprehending a cartoon, especially in the right hemisphere. We are less sure about brain activity from hearing a joke, but we would anticipate considerable left hemisphere involvement because of its dominance in language processing. [I will extend this later.]

Our initial approach to this question is to study stroke survivors and compare their performance with an age-matched, neurologically intact control group.  For the control group we are testing faculty and staff from the university. (We can verify their age fairly easily, but beyond that we will just take it for granted.) Our stroke survivors are wonderful people who want to participate in research just in case it helps. We will test a number of stroke survivors whose strokes have left lesions at various brain sites. Then, by grouping stroke survivors by lesion site (e.g., left frontal lobe, right frontal lobe, left parietal lobe, and so on), we can compare their performance on our jokes. [I will tell you more about the jokes later.]
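
For readers who like to see the analysis shape, here is a rough Python sketch of the lesion-site grouping; the participant codes and scores are invented placeholders, not results, and the real scores will come from the joke task (funniness ratings and comprehension accuracy).

     import pandas as pd

     # Invented per-participant summaries for illustration only.
     df = pd.DataFrame({
         "participant":        ["S01", "S02", "S03", "S04", "C01", "C02"],
         "lesion_site":        ["L frontal", "R frontal", "L parietal", "R frontal",
                                None, None],
         "joke_comprehension": [0.85, 0.55, 0.80, 0.60, 0.95, 0.90],
         "mean_funniness":     [3.2, 1.8, 3.0, 2.1, 3.5, 3.4],
     })

     # Compare performance by lesion site, with age-matched controls as the baseline.
     by_site = (df.assign(lesion_site=df["lesion_site"].fillna("intact control"))
                  .groupby("lesion_site")[["joke_comprehension", "mean_funniness"]]
                  .mean())
     print(by_site)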

Jay Sheth, Andrea Nuckolls, and I (and other students who have since graduated and moved away from the university) spent several hours recording our jokes (and non-joke controls) this summer and early fall. We plan to present the jokes (and controls) over headphones. After a joke is presented, the participant will rate it for funniness and then answer a comprehension question about it. While this occurs, we will also videotape the participants' facial expressions for signs of "mirth."
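
As a rough sketch of what one trial looks like, here is a small Python loop: play a recording, collect a funniness rating, then a comprehension answer, and log everything.  The filenames, the 1-to-5 rating scale, and the question wording are invented placeholders, and the playback and response collection are stubbed out since the real experiment will run on lab presentation software.

     import csv
     import random

     def play_audio(path):
         # Stand-in for actual headphone playback.
         print(f"[playing {path} over headphones]")

     def get_rating(prompt, low=1, high=5):
         # Keep asking until the participant enters a whole number in range.
         while True:
             try:
                 value = int(input(f"{prompt} ({low}-{high}): "))
             except ValueError:
                 continue
             if low <= value <= high:
                 return value

     trials = [
         {"file": "joke_01.wav", "question": "What did the bartender recommend?"},
         {"file": "ctrl_01.wav", "question": "What was the customer asking for?"},
     ]
     random.shuffle(trials)  # present jokes and controls in a random order

     with open("responses.csv", "w", newline="") as out:
         writer = csv.writer(out)
         writer.writerow(["file", "funniness", "comprehension_answer"])
         for trial in trials:
             play_audio(trial["file"])
             funniness = get_rating("How funny was that?")
             answer = input(trial["question"] + " ")
             writer.writerow([trial["file"], funniness, answer])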

We will have much data to analyze.