Sarah Wendell and Candy Tan have helped romance readers discern the good smut from the bad for years with their blog Smart B*tches, Trashy Books. And now with their book Beyond Heaving Bosoms: The Smart Bitches’ Guide to Romance Novels, they offer a guide through the heady world of the $5.99 novel and defend it with wit and intelligence. First they told us who really reads these things, and today we ask why romance novels deserve respect and what they tell us about society, and Candy and Sarah make the genre sound like the college course we all missed out on!
What do romance novels tell us about women, relationships, and society? Why do they deserve more respect?
Sarah: Romances are a billion-dollar industry of women writers working to produce narrative fiction for an audience composed largely of women. No, nothing feminist or subversive about that at all. The industry and the genre are long overdue for critical attention, especially because the whole of the genre's history parallels major shifts in the social and political status of women in the US. Romance novels represent an anthropological history of women through fictional narrative, exploring professional status, sexual agency, self-empowerment, self-actualization, and achievement of autonomy. I think that's the part that surprises new readers the most: the heroine always wins.
Candy: I think more than any other genre, romance novels are about finding emotional stability at the family level and nurturing hope. It's part of why they're so appealing, and why they've been selling like gangbusters in this economic downturn. But I really hesitate to make any kind of generalization about the overall message of romance, because the authors are every bit as diverse as the readers, and consequently, the books run the cultural gamut, from espousing fundamental Christianity to displaying a deep suspicion of religion, from endorsing authority to urging the readers to question it, from displaying a deep-seated (and oftentimes largely unconscious) homophobia to centering around gay love stories.
Romance novels deserve more respect because they're the whipping boys for no goddamn reason that I can see other than a societal fear of and disdain for squishy emotions. They're no better and no worse than any other genre out there, but there's this incredibly strong stigma attached. In many ways, I think my desire to see romance stop being the Rodney Dangerfield of publishing has more to do with my desire for the deeply-buried societal disdain for femininity to go away and stop wrecking our sh*t already. Romance authors are still, by and large, seen as a "damn mob of scribbling women," though they're a much bigger mob now than poor Hawthorne could've dreamt of in his philosophy.
I don't necessarily agree with Sarah — that romances as a billion-dollar industry written by and for women are feminist and subversive — because a lot of what's out there is conservative. I hesitate to say that a lot of them reinforce or are supportive of the dominant paradigm and of patriarchy, because honestly, that's the cultural medium we inhabit, and it's difficult, if not outright impossible, to step out of it completely while we're still in it. Patriarchy still informs the fabric of our experiences, and that includes the fiction we read and write.
Ultimately, market forces are shaped by society at large, which means that the romance market as a whole tends only to be as feminist or subversive as the society as a whole. That's not to say that there aren't many wonderfully subversive feminist romances, because there are. I'm just not convinced the genre as a whole is.