One behavioral study paired four emotions with instrumental classical and jazz melodies without lyrics and analyzed the excerpts by both acoustic and perceptual means. The analysis revealed that music representing happier emotions had faster attack slopes, higher spectral centroids, and quicker tempos, whereas sadder emotions were expressed through legato articulation and slower tempos.
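For readers who want to see what such acoustic measures look like in practice, here is a minimal sketch in Python using the librosa library. The file name is a placeholder, and the snippet illustrates two of the cues named above, spectral centroid and tempo, rather than the study's actual analysis pipeline (attack slope requires a dedicated onset-based measure not shown here).

import numpy as np
import librosa

# Load an excerpt (placeholder file name).
y, sr = librosa.load("excerpt.wav")

# Spectral centroid: the spectrum's "centre of mass" in Hz;
# brighter, happier-sounding excerpts tend to score higher.
centroid = librosa.feature.spectral_centroid(y=y, sr=sr)
print(f"mean spectral centroid: {np.mean(centroid):.1f} Hz")

# Global tempo estimate in beats per minute.
tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
print(f"estimated tempo: {np.atleast_1d(tempo)[0]:.1f} BPM")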
Emotional response
Many studies have attributed the unexpected enjoyment people derive from listening to sad music to different emotional reactions, such as seeking connection, retrieving memories, validating and re-experiencing emotions, and finding solace. Yet none of these explanations sufficiently accounts for why sad music should be pleasurable at all. According to this research study, listeners who enjoy sad music derive pleasure from the re-contextualisation of an emotional response through two independent mechanisms: contagion of negative emotion and aesthetic judgement.
Research has demonstrated that emotional responses to musical excerpts differ depending on whether they include lyrics. One experiment revealed that lyrics increased activation in specific brain areas, including the left transverse, middle, and superior temporal gyri, the right and left inferior frontal gyri, and the insula, suggesting that the presence of lyrics intensifies an individual’s emotional reaction to a musical excerpt.
The presence or absence of lyrics also influenced perceived tempo: songs with lyrics were perceived as slower than instrumental music, and happy music was judged faster than sad music. Brain imaging experiments demonstrated this trend: participants listening to sad music showed greater activation in the default mode network, which is associated with self-reflection and meta-awareness, than when listening to happy music.
These experiments indicate that our preference for sad music, with or without lyrics, is driven primarily by our emotional response to sadness and loss rather than by any specific memory or situation. This finding is in keeping with surveys that have demonstrated strong correlations between trait empathy and enjoyment of nominally sad music; however, none of these surveys distinguished between instrumental and vocal music or considered the potential influence of familiarity and autobiographical associations on enjoyment of these pieces.
Familiarity
Although emotional responses to music may be shaped by factors such as lyrics, memories, and autobiographical experiences, the listener’s connection to the music also plays an essential role. Some listeners may be more sensitive to sadness induced by unfamiliar music because they form a stronger connection with it and the emotions it conveys, relating to it more readily than to other kinds of music.
Research has demonstrated that individuals who respond strongly to sad music may do so because they possess high levels of trait empathy. Empathy is associated with mimicry of others’ emotions, leading to greater feelings of sympathy and compassion, which may explain why those more susceptible to sadness enjoy listening to sad songs even if they do not fully comprehend them.
One possible explanation for why people find comfort in listening to sad music without lyrics may be its familiarity. One study asked participants to rate the acoustic and perceptual properties of four types of excerpts: happy and calm instrumental melodies, classical and jazz melodies paired with lyrics tailored to the music’s emotion, and instrumental music without lyrics. Listeners rated the acoustic and perceptual characteristics of the happy and calm excerpts more positively than those of the instrumental pieces without lyrics, while ratings of the classical and jazz melodies with lyrics were comparable to those of the instrumental tracks.
Other studies have examined how familiarity affects the emotional response to music by measuring brain activity related to emotional processing. One fMRI study found that music with and without lyrics activated different parts of the brain: music with lyrics activated areas related to facial expression processing (the left caudate head and subcortical thalamus), whereas music without lyrics activated areas related to perception and movement control (the right insula and right caudate head); the left thalamus also responded more strongly to music with lyrics than to music without.
Perceived tempo
Studies indicate that an important element of sadness in music is its perceived tempo. Sad songs have been found to elicit melancholic feelings in part because their slow tempo makes them seem to last longer than expected, in contrast to happy music, which induces positive affect.
Acoustic aspects of music are equally significant, with specific musical features conveying emotion in different ways. For example, soft dynamics and legato articulation often convey sadness, while happiness is associated with staccato articulation and louder intensity. Timbre is another key consideration: sad music typically has lower spectral centroids as well as a dominant minor mode.
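To make the notion of mode concrete, the following sketch shows one standard way of estimating major versus minor from audio: correlate the excerpt's average chroma vector with the Krumhansl-Kessler key profiles at all twelve rotations and keep the better-matching family. It is written in Python with librosa; the file name is a placeholder, and the method is a common textbook approach, not one taken from the studies discussed here.

import numpy as np
import librosa

# Krumhansl-Kessler tonal-hierarchy profiles for major and minor keys.
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

y, sr = librosa.load("excerpt.wav")  # placeholder file name
chroma = librosa.feature.chroma_cqt(y=y, sr=sr).mean(axis=1)

def best_correlation(profile):
    # Try every possible tonic and keep the strongest correlation.
    return max(np.corrcoef(chroma, np.roll(profile, k))[0, 1]
               for k in range(12))

mode = "major" if best_correlation(MAJOR) > best_correlation(MINOR) else "minor"
print(f"estimated mode: {mode}")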
One behavioral experiment involved fifteen participants listening to four sad and four happy pieces of their favorite music and then reporting any thoughts they experienced while listening. The results showed that sad songs elicited more negative emotion words than happy songs, that the music’s tempo predicted how long thoughts persisted, and that slow tempos led to less mind-wandering and greater meta-awareness than fast ones.
fMRI scans were then employed to identify brain regions that responded to the different conditions. Contrasting sad music with lyrics against sad music without lyrics revealed greater activation in the parahippocampal gyrus, amygdala, and claustrum; the claustrum, which lies lateral to the putamen and beneath the insula, receives inputs from most brain regions, including the limbic system. It was also found that happy instrumental music produced positive responses, while sad music, with or without lyrics, did not.
These results indicate that the limbic system plays a greater role in the experience of sadness than instrumental cues do, as evidenced by recent research comparing the effects of instrumental and vocal music on mood, which found that singing produced more intense effects in the amygdala and parahippocampal gyrus than instrumental music did.
Mode
Music is an enduring human experience that unites cultures around the globe and invokes emotion across them. Since ancient times, people have used music to convey various emotional states, sadness and happiness being the two most frequently invoked. Studies have demonstrated that the emotional content of a piece of music correlates with the frequency and duration of the brain’s response to it: the more intense the emotion, the longer and more frequent the spikes in brain activity. These patterns have been studied using neuroimaging techniques such as functional magnetic resonance imaging (fMRI).
Multiple studies have examined how mood can affect the perception and enjoyment of music. In one such experiment, participants were scanned with fMRI while listening to sad music and then rated their enjoyment on rating scales. Researchers found that enjoyment was associated with increased activation in auditory areas and the default mode network, particularly the left insula and inferior frontal gyrus, and that this activation also increased with participants’ trait empathy.
Another experiment used a similar methodology to assess the influence of emotion and mode on the perception and enjoyment of music. Participants rated their enjoyment of a song in response to the question “Does this music make you feel happy or sad?” Researchers found that happiness was associated with activation in the insula and precentral gyrus, while sadness was associated with activation in the superior temporal sulcus and Heschl’s gyrus; both states produced significant increases in blood flow to particular brain regions.
The vmPFC, PCC, and pIPL all displayed increased centrality during sad music compared with happy music, consistent with their putative role in self-referential processing. Furthermore, individuals reported mind-wandering about personal goals more often while listening to sad music than to happy songs.