The average television viewer in the United States (US) watches as many as nine drug advertisements per day, totaling about 16 hours per year, far exceeding the time the average individual spends with their primary care physician.1 Since 2012, spending on drug commercials has increased by 62%, reaching $5 billion last year.2 Given the ubiquity of these advertisements, the article by Klara et al. in this issue of JGIM offers one more piece of evidence that this medium is not operating as intended, and it forces us to consider alternatives to the status quo.3
First, it is important to consider the history and original purpose of direct-to-consumer (DTC) advertising. In the 1960s, Congress granted the Food and Drug Administration (FDA) the authority to regulate prescription drug labeling and advertising. This authority included ensuring that ads (1) were not false or misleading, (2) presented a “fair balance” of both drug risks and benefits, (3) included facts that are “material” to a drug’s advertised uses, and (4) included a “brief summary” noting every risk described in the drug’s labeling.1 While the first DTC advertisement was a Merck print advertisement for the Pneumovax® vaccine in 1981, DTC advertising exploded in the late 1990s after the FDA relaxed its requirements for risk information, stipulating that ads need include only the “major risks” and direct consumers to resources for full risk information.