Even today many people think the mystical powers of the full moon induce erratic behaviors, psychiatric hospital admissions, suicides, homicides, emergency room calls, traffic accidents, fights at professional hockey games, dog bites and all manner of strange events. One survey revealed that 45 percent of college students believe moonstruck humans are prone to unusual behaviors, and other surveys suggest that mental health professionals may be still more likely than laypeople to hold this conviction. In 2007 several police departments in the U.K. even added officers on full-moon nights in an effort to cope with presumed higher crime rates.
But there are at least three reasons why this explanation doesn’t “hold water,” pardon the pun. First, the gravitational effects of the moon are far too minuscule to generate any meaningful effects on brain activity, let alone behavior. As the late astronomer George Abell of the University of California, Los Angeles, noted, a mosquito sitting on our arm exerts a more powerful gravitational pull on us than the moon does. Yet to the best of our knowledge, there have been no reports of a “mosquito lunacy effect.” Second, the moon’s gravitational force affects only open bodies of water, such as oceans and lakes, but not contained sources of water, such as the human brain. Third, the gravitational
effect of the moon is just as potent during new moons—when the moon is invisible to us—as it is during full moons.
There is a more serious problem for fervent believers in the lunar lunacy effect: no evidence that it exists. Florida International University psychologist James Rotton, Colorado State University astronomer Roger Culver and University of Saskatchewan psychologist Ivan W. Kelly have searched far and wide for any consistent behavioral effects of the full moon. In all cases, they have come up empty-handed. By combining the results of multiple studies and treating them as though they were one huge study—a statistical procedure called meta-analysis—they have found that full moons are entirely unrelated
to a host of events, including crimes, suicides, psychiatric problems and crisis center calls. In their 1985 review of 37 studies entitled “Much Ado about the Full Moon,” which appeared in one of psychology’s premier journals, Psychological Bulletin, Rotton and Kelly humorously bid adieu to the full-moon effect and concluded that further research on it was unnecessary.
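For readers curious about the mechanics, the pooling step in a meta-analysis can be sketched in a few lines. The sketch below uses invented per-study numbers purely for illustration; they are not the values from the 37 studies Rotton and Kelly reviewed. It combines each study's correlation between full moons and an outcome using the standard Fisher z-transform with inverse-variance weights.

```python
import math

# Hypothetical per-study results: (correlation with the full moon, sample size).
# These numbers are invented for illustration, NOT data from the actual review.
studies = [(0.02, 500), (-0.01, 1200), (0.03, 300), (0.00, 800)]

num = 0.0
den = 0.0
for r, n in studies:
    z = 0.5 * math.log((1 + r) / (1 - r))  # Fisher's z-transform of r
    w = n - 3                              # inverse of Var(z) = 1 / (n - 3)
    num += w * z
    den += w

z_pooled = num / den
r_pooled = math.tanh(z_pooled)  # back-transform pooled z to a correlation

print(f"pooled correlation: {r_pooled:.4f}")  # close to zero: no overall effect
```

Treating the studies "as though they were one huge study" in this way makes a near-zero pooled correlation, like the one above, hard to dismiss as a fluke of any single small sample.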
Persistent critics have disagreed with this conclusion, pointing to a few positive findings that emerge in scattered studies. Still, even the handful of research claims that seem to support full-moon effects have collapsed on closer investigation. In one study published in 1982 an author team reported that traffic accidents were more frequent on full-moon nights than on other nights. Yet a fatal flaw marred these findings: in the period under consideration, full moons were more common on weekends, when more people drive. When the authors reanalyzed their data to eliminate this confounding factor, the lunar effect vanished.
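The weekend confound described above is easy to see with a toy calculation. The counts below are invented to mimic the structure of the problem, not taken from the 1982 study: accident rates depend only on whether the night is a weekend, yet full moons happen to fall disproportionately on weekends.

```python
# Hypothetical tallies of (nights, accidents) by night type; the numbers
# are invented to illustrate the confound, not the 1982 study's data.
data = {
    ("full", "weekend"): (8, 40),
    ("full", "weekday"): (4, 8),
    ("other", "weekend"): (96, 480),
    ("other", "weekday"): (257, 514),
}

def rate(keys):
    nights = sum(data[k][0] for k in keys)
    accidents = sum(data[k][1] for k in keys)
    return accidents / nights

# Naive comparison ignoring day of week: full-moon nights look dangerous.
full_raw = rate([("full", "weekend"), ("full", "weekday")])
other_raw = rate([("other", "weekend"), ("other", "weekday")])
print(full_raw, other_raw)  # full-moon rate is higher, about 4.0 vs ~2.8

# Stratified comparison: within each stratum the rates match exactly,
# so the apparent lunar effect vanishes once weekends are controlled for.
print(rate([("full", "weekend")]), rate([("other", "weekend")]))  # 5.0 vs 5.0
print(rate([("full", "weekday")]), rate([("other", "weekday")]))  # 2.0 vs 2.0
```

Holding the day of week fixed, as in the stratified comparison, is the same move the authors made when they reanalyzed their data and watched the lunar effect disappear.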
So if the lunar lunacy effect is merely an astronomical and psychological urban legend, why is it so widespread? There are several probable reasons. Media coverage almost surely plays a role. Scores of Hollywood horror flicks portray full-moon nights as peak times of spooky occurrences such as stabbings, shootings and psychotic behaviors.
Perhaps more important, research demonstrates that many people fall prey to a phenomenon that University of Wisconsin–Madison psychologists Loren and Jean Chapman termed “illusory correlation”—the perception of an association that does not in fact exist. For example, many people who have joint pain insist that their pain increases during rainy weather, although research disconfirms this assertion. Much like the watery mirages we observe on freeways during hot summer days, illusory correlations can fool us into perceiving phenomena in their absence.
Illusory correlations result in part from our mind’s propensity to attend to—and recall—most events better than nonevents. When there is a full moon and something decidedly odd happens, we usually notice it, tell others about it and remember it. We do so because such co-occurrences fit with our preconceptions. Indeed, one study showed that psychiatric nurses who believed in the lunar effect wrote more notes about patients’ peculiar behavior than did nurses who did not believe in this effect. In contrast, when there is a full moon and nothing odd happens, this nonevent quickly fades from our memory. As a result of our selective recall, we erroneously perceive an association between full moons and myriad bizarre events.
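The selective-recall mechanism can be made concrete with a small 2-by-2 contingency table. In the invented table below, odd events occur on 10 percent of nights whether or not the moon is full, so the true association (measured by the phi coefficient) is exactly zero; but once memorable co-occurrences are recalled far more readily than nonevents, the remembered table shows a positive association. The recall weights are made-up values chosen only to illustrate the bias.

```python
import math

# Hypothetical contingency table of nights:
#                 [odd event, nothing odd]
# full moon        [10,        90]
# no full moon     [30,        270]
# Odd events happen on 10% of nights either way, so there is no real link.
true_table = [[10, 90], [30, 270]]

def phi(t):
    """Phi coefficient of association for a 2x2 table."""
    (a, b), (c, d) = t
    return (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

# Selective recall: full-moon nights with odd events are all remembered,
# while nonevents mostly fade (these weights are invented for illustration).
recall = [[1.0, 0.2], [0.5, 0.2]]
recalled = [[true_table[i][j] * recall[i][j] for j in range(2)]
            for i in range(2)]

print(phi(true_table))  # 0.0: no real association
print(phi(recalled))    # positive: an illusory correlation emerges
```

The point of the sketch is that nothing about the world changed between the two tables; only which cells the mind keeps. That is enough to manufacture a correlation out of thin air.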
Still, the illusory correlation explanation, though probably a crucial piece of the puzzle, does not account for how the full-moon notion got started. One intriguing idea for its origins comes to us courtesy of psychiatrist Charles L. Raison, now at Emory University, and several of his colleagues. According to Raison, the lunar lunacy effect may possess a small kernel of truth in that it may once have been genuine. Raison conjectures that before the advent of outdoor lighting in modern times, the bright light of the full moon deprived people who were living outside—including many who had severe mental disorders—of sleep. Because sleep deprivation often triggers erratic behavior in people with certain psychological conditions, such as bipolar disorder (formerly called manic depression), the full moon may have been linked to a heightened rate of bizarre behaviors in long-bygone eras. So the lunar lunacy effect is, in Raison and his colleagues’ terms, a “cultural fossil.”
We may never know whether this ingenious explanation is correct. But in today’s world at least, the lunar lunacy effect appears to be no better supported than is the idea that the moon is made of green cheese.