How does one address the Civil War in a film? This is a question that many filmmakers have asked for years. More difficult to answer, though, is the question: how does one portray the Confederacy in a film?
One would think that the Confederacy would be consistently portrayed as villainous due to its beliefs and the history surrounding its secession from the United States. Surprisingly, however, the Confederacy is often given a sympathetic lens in films. Sometimes the South and the Confederacy are even romanticized in the context of the Civil War.
So why does this happen? Why do filmmakers like to romanticize the Confederacy?
I began to notice this trend recently after, coincidentally, watching some Civil War films and episodes of The Twilight Zone in close succession. The most recent of these (I watched it this morning) was Anthony Minghella's 2003 adaptation of Cold Mountain, a romantic drama about a wounded Confederate soldier trying to make his way back to his wife.
Cold Mountain does not try to hide the fact that the South held slaves; however, it does romanticize the agrarian image of the region, with a muscled Jude Law building houses and women (mainly Nicole Kidman) wearing bonnets and playing piano. It presents the South as a sort of idyllic landscape ravaged by war and the Union army.
The other famous example of this is the 1939 classic Gone With the Wind. Again, the focus is a love story between two Southerners while the Civil War rages in the background.
However, it's not just these classic films that paint the South, and the Confederacy, in this light. For instance, in the 2014 film The Keeping Room the story revolves around three women (one of whom is a slave) fighting off two rogue Union soldiers who are trying to kill them. In Clint Eastwood's 1976 film, The Outlaw Josey Wales, Eastwood plays a Southerner who joins the Confederacy after his family is murdered by Union soldiers. And then, of course, there is the infamous (and incredibly racist) D.W. Griffith film, The Birth of a Nation, which paints the Confederacy and the KKK in a heroic light.
Now, very few of these films (except for Griffith's) romanticize the Confederacy itself. Rather, they seem to romanticize the tragic image of the South. Many of them take place toward the end of the Civil War, when the Confederacy knew its time was up.
It's also important to say that not every Civil War film is about the Confederacy, of course. Films like Glory, The Red Badge of Courage, and Lincoln are all told from the North's point of view.
But there is still a question of why. Why do directors seem so interested in telling stories of Confederate soldiers and Southerners sympathetic to the Confederacy?
Part of the appeal of telling a Southern story may come from the more complex morality at work. How does one tell the story of a good Southerner when the South during that time was synonymous with the Confederacy? Furthermore, telling the story from the Confederate side avoids romanticizing the Union army (who, to be fair, did do some horrible things during the war).
But there is still no escaping the fact that these films, either overtly or subtly, romanticize the Confederate army.
None of these films is ignorant of its history (except for The Birth of a Nation); many have scenes dedicated to showing white Southerners being kind to slaves, or harrowing images of slavery itself. However, their polished version of the South and their idyllic vision of lush countrysides and small towns contradict the horrors we know happened during this time.
Is there anything inherently wrong with telling a Civil War story from the South's perspective? I don't think so, as long as we are not trying to avoid or rewrite history. There is no way to spin the atrocities that the Confederacy committed (and there should never be an attempt to spin them), but there were still people in the South who were not directly associated with the Confederate army. There are plenty of stories that could be told -- especially from the perspectives of women and people of color -- which may be morally complex and may challenge audiences in a new way.
With that said, though, romanticizing the Confederacy feels uncomfortable, for lack of a better phrase. When a film portrays the Union army as evil and the Confederacy as heroic, it skews everything we know about American history. When we see polished cinematography of white farmers and lush countrysides without a slave in sight (which happens often in Cold Mountain), it seems to be pushing aside history for something more palatable and digestible.
I don't have the answer for which stories should be told, or how they should be told (and I don't think I'm in a position to dictate that regardless). However, I think we should take a hard look at these films and really question why they are being told from a Southern perspective. What benefits, if any, does this offer us?
The answer may not be as clear-cut as we think.