My Response to “Learn Your History”

A Confederate flag is displayed at the South Carolina state capitol in Columbia January 9, 2008. Many U.S. presidential campaigns shift their focus to South Carolina today for their first test in the south–the historic flag, which until 2000 flew from the capitol dome, is for some a symbol of the state’s political and racial divisions. REUTERS/Jonathan Ernst (UNITED STATES) – RTX5DUD

As of today, our governor has called for the Confederate flag to come down from the monument on the State House grounds.

For the removal of something so divisive, you’d think there would be nothing but celebration.

But amidst the almost-universal praise, there is a strain of grumbling that goes something like this:
Learn your history! It’s about heritage, not racism!

History is a tricky subject. We can think of it as an objective account of our past, or we can acknowledge that it is a subjective mish-mash of narratives that managed to win out over other narratives. To be sure, a history of Native Americans in the United States would read differently from the Native American perspective than from the perspective of those who drove them from their lands. Curriculum wars have been waged over how certain history is told in school textbooks. If history were objective, we wouldn’t have these issues.

So when I hear “Learn your history,” I ask: Whose history? I’m a white male Southerner who has spent a lot of time over the years studying how the Confederacy came to be, because I wanted to find out which was true: heritage or hate? I love flag design and have owned a few Civil War era flags of South Carolina. I read Calhoun and Davis. I have been heavily involved in my own local history initiatives. I studied other instances of secession in world history. I could even see some of the constitutional rationale for secession, on paper and in a vacuum, so to speak. None of that could erase the fact that, whatever causes we may assign to Southern secession and the lingering Confederacy, there were clearly racist motives amidst the non-racist ones in the founding of the CSA. And the flags represent those motives.

Flags are about active causes. It’s one thing to have a monument to the Confederate dead, but flying a battle flag in their memory does little other than imply we still support that cause today. It’s safe to say those men died for their country–the CSA, at the time–and I would imagine their idea of patriotism today would bind them to the same United States flag we fly as Americans. (That brings up another point: flying those two flags at the same time seems completely illogical to me, but that’s another post.)

When we–white, Southern, reasonably comfortable, and connected to the Civil War only by memory or by distant relatives who fought to keep their economic interests alive–say “learn your history,” we are actually saying “learn the history that makes this flag okay.” The problem is, it’s not that simple. History is checkered. History is subjective. You cannot tell me that a descendant of a slave in South Carolina has the same perspective on history that we do. When we say “learn your history,” we are once again imposing our will on those who don’t have the luxury of a Gone With The Wind recollection of history.

If you can put together an entire nation, or state, or county of people who think that battle flag represents them, then secede again and fly it. But as it stands, state government represents all state residents, and enough of them do not share your history of that symbol. What’s wrong with being a good neighbor and letting it go?

A Quick Word on SC’s Confederate Flag

A Confederate flag is displayed at the South Carolina state capitol in Columbia January 9, 2008. Many U.S. presidential campaigns shift their focus to South Carolina today for their first test in the south–the historic flag, which until 2000 flew from the capitol dome, is for some a symbol of the state’s political and racial divisions. REUTERS/Jonathan Ernst (UNITED STATES) – RTX5DUD

I am a SC native and resident. I remember every gubernatorial election and debate that led up to the removal of the flag from the Statehouse dome in 2000. I also enjoy vexillology and history. Right now, the national conversation about the shootings in Charleston (the deeds of a racist madman) has renewed debate on removing the flag altogether from the Statehouse grounds (in my mind, a wise decision). Looking past the value judgments, let’s examine some frequently asked questions.

Q: How did it get there?

A: In 1962, the all-white legislature voted to place the flag atop the dome in what was widely seen as a gesture of opposition to the Civil Rights Movement. The official reason given was the anniversary of the Civil War, but the centennial of the war’s start would have been 1961, so it’s anyone’s guess. It remained there until 2000.

Q: Why is it flying over a monument now?

A: What flies over the Confederate monument on the Statehouse grounds is a slightly different flag. What was above the dome was the Confederate Naval Jack; what flies over the monument is the Army of Northern Virginia Battle Flag, designed by William Porcher Miles. In 2000, the State Legislature passed the South Carolina Heritage Act, which effectively removed the flag from atop the dome and placed a different flag at the monument.

Q: Why wasn’t that flag placed at half-staff when the others were?

A: Logistics and symbolism. Logistically, the flagpole at the monument doesn’t adjust. So it’s either all the way up or all the way down. Symbolically, that flag is not representative of a sovereign entity. Flags at half-staff are usually either federal, state, or local. You may see private homes that fly their decorative flags at half-staff, but insofar as flag code goes, only sovereign flags are of consequence. The flag at the monument is a memorial flag and represents no sovereign entity, so it doesn’t count.

Q: Okay, so it’s a historical flag. Why isn’t it in a museum?

A: The Heritage Act requires a vote of the Legislature before any action is taken on the monument and flag. As of this writing, there is some support in the Legislature to take it down. We will see how that pans out. (Editorial: Why opponents of removing the flag believe that taking it down will somehow dishonor the memory of the dead is beyond me. No one has asked to tear up the monument. It’s only a question of removing a banner that is causing a lot of trouble and heartache. Seems to me that we would better honor past and present by compromising.)

History, Because We Made It

This day in history, exactly one year ago, a plate of food was put before me at a restaurant, and it was so beautifully presented that I absolutely had to snap a photo and post it to Facebook. #foodporn #omg #goingtomyhips #bejealous #howmanymorehashtagscanIthinkof

Not really. But I expect this sort of notification pops up on a Facebook timeline somewhere in the world every few hours.

We have Timehop and On This Day to thank for commemorating the mundane. What used to be a collective effort–history–has become increasingly selfish, says Sarah Senk on Slate.com. I agree. I could go on in the echo chamber re: selfies and increasing eye-to-screen time, adding more volume to the already overpacked “we are too selfish” chant. However, I suggest that Senk’s take is different enough to talk about. In fact, I may share it on my timeline.

By replacing events of broad cultural significance with mundane “events” of little to no relevance to anyone else, Facebook seems to be transforming our understanding of commemorative practice in two ways: It hastens the process through which events get treated as “historical,” and it lowers the bar regarding which past events get to count as “history.”

Imagine someone wandering down to Dealey Plaza in November 1963, taking a selfie in front of the Depository, and captioning: “Just woke up. Need coffee. Oh, and the Prez is here today.”

By privileging an anniversary regardless of the content, Facebook urges people to go through the motions of retrospection, to have feelings of nostalgia generated more by the automatic action of marking time than by any specific event or experience. In this way, On This Day risks transforming commemoration into a meaningless gesture, in which all one really reflects upon is a potentially empty process of reflection itself. Look at me being pensive and nostalgic and caring about the past, the user gets to feel while contemplating how something happened “one year ago today.”

I remember when my sense of history went from planar to linear. I sat at home with my family watching videos of my childhood that they decided to bring out on my 21st birthday. Until that point, I’d looked back on memories as random things I could pluck from an array of life events, and I vaguely thought of myself as dissociated from that person in the recollection. Watching those films, though, I realized that kid was me, and everything between that point in time and the present added up to the person I am now.

To be sure, it was a genuine action of marking time and realizing how it impacted me. Says Senk: “In place of a shared object, we have a shared process of remembering something, anything.” I sat there and shared the object with my family on that birthday evening. In contrast, Timehopping only requires me to acknowledge that a commemoration is necessary, and that I should share/post/tweet with a witty or nostalgic comment. The plate of food is, and was, all about me; the old home movies were about us.

There are, of course, arguments for histories of the mundane. I have done enough literary research through journals and letters to understand this. I don’t disagree. But the difference is real. What separates an artist’s journals on daily food intake from a random #foodporn Instagram shot? Time. If I become famous and important, perhaps in a hundred years my digital minutiae will be of some consequence to researchers. But there’s the rub: deeming who or what is important enough for such recognition usually takes a collective effort. Even histories of local communities are compiled through a collective effort.

I think the arguments are dual and parallel here: (1) empty commemoration for commemoration’s sake; and (2) importance of self vs importance of collective experience.

Of course, this is only my individual assessment. Time will tell.

Java SE 6 on OS X Yosemite

Quick fix for those of us running Adobe CS on OS X Yosemite. If you get the following error:

[Screenshot of the error dialog asking to install the legacy Java SE 6 runtime]

As of this writing, the download page for the Java SE 6 runtime isn’t working. However, you don’t actually need the runtime: running the following sudo commands in Terminal creates the (empty) directories whose absence triggers the prompt, letting you bypass an unnecessary installation altogether:

sudo mkdir -p /System/Library/Java/JavaVirtualMachines/1.6.0.jdk
sudo mkdir -p /System/Library/Java/Support/Deploy.bundle
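
If you want to sanity-check the workaround or undo it later, something like the following should do it (a rough sketch on my part, assuming the two directories above were just created and are still empty; it isn’t part of the fix itself):

# Confirm the placeholder directories now exist
ls -ld /System/Library/Java/JavaVirtualMachines/1.6.0.jdk /System/Library/Java/Support/Deploy.bundle

# To undo the workaround later, remove the empty placeholders
sudo rmdir /System/Library/Java/JavaVirtualMachines/1.6.0.jdk /System/Library/Java/Support/Deploy.bundle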

Clarifying e.g. and i.e.

These two are easily confused. I had four semesters of Latin in college and I still have to remind myself of correct usage. Although both phrases are originally Latin, they are part of the English language, and thus do not require the italics we so often give foreign words and phrases.

In Latin, i.e. stands for id est (literally “that is”), while e.g. stands for exempli gratia (“for the sake of example”). The former means “in other words,” while the latter means “for example.”

I see the two confused–most often i.e. used erroneously in place of e.g. Only one of the two phrases actually contains the root for “example.” If you’re fishing for the right phrase to introduce an example, e.g. is the way to go (“Civil War era flags, e.g., the Army of Northern Virginia battle flag”); use i.e. when you’re restating or clarifying (“the flag at the monument, i.e., the battle flag, not the Naval Jack”).

Additional references:

http://grammarist.com/usage/ie-eg/

http://www.dailywritingtips.com/the-difference-between-eg-and-ie/

Periods and Quotation Marks

[Image: Dr. Evil making air quotes]

Ah, the quotation mark. It takes many forms…in the air, overused, underused, absent, and just plain wrong. Some use it for “emphasis.” (Hint: don’t.) Some put quotation marks around common phrases on signs or in marketing collateral. The short answer to all of this is, of course, that quotation marks are marks for notating quotations. Not for emphasis. We might use them to set off a phrase someone said, or someone’s misuse of a phrase (such as a liar’s version of the “truth”). But otherwise, they only work for quotes.

But within that, how are they treated? In American English, double quotation marks go around an actual quote, and periods and commas go inside the closing mark: he called it a “quick fix.” It’s different in British English, where that period would sit outside the mark, but that’s beside the point. When in doubt, consult this APA table.

There is an argument for logical punctuation, but as an old English major with two major publication styles chiseled into my head, I admit this sort of thing makes my skin crawl.

http://www.huffingtonpost.com/william-b-bradshaw/commas-periods-and-quotat_b_3625244.html

Mad Men and Colonel Sanders

Mad Men has wrapped up, and with the closing of this beautifully executed AMC series, we say goodbye to the wave of nostalgia that has gripped the cable networks for most of recent memory. However, just as Don left on a commercial note–Coke, in this case–the fast food chains have filled the vacuum with characters from their golden days. Colonel Sanders and the Hamburglar are back at KFC and McDonald’s.

I don’t know the rationale behind it, and Darrell Hammond isn’t exactly a clone of the Colonel, but it’s shtick that should at least turn some heads. I remember Chicken Littles from the Colonel Sanders era of KFC. I remember playing on the McDonald’s playground where the plastic Hamburglar lurked over in the corner.

One company’s dealings with nostalgia did not go as planned. Last year, Hardee’s quietly did away with the Cinnamon Raisin Biscuit. Those had been around ever since I was a child, and I have fond memories of getting ready for school early enough so that my dad would take me to Hardee’s on our way and we would share an order of those. It didn’t happen often, but it was fun. Many have similar memories, and when those biscuits were replaced by an inferior Cinnamon Pull Apart, all hell broke loose. Even the clerks at the register admitted the company made a mistake. At last check, Hardee’s had put things back to normal and acted like the ill-advised move never happened.

The lesson here? Nostalgia is a tricky thing. Mad Men seemed to have run its course, as had the viewing audience’s appetite for a midcentury modern drama. However, where food is concerned, appeals to the past are timeless. Now, go get yourself some home cooking at a restaurant nearby…“just like Mom used to make.”

Edward Said’s Orientalism

Orientalism was itself a product of certain political forces and activities.

Said wrote this in 1978, but fast-forward to 2015 and we can easily substitute Middle East or Islam or Arab for Orient. None of these terms points to an objective Truth; rather, they are subjective signifiers of some “other” culture, whose only common definition among a random sampling of 10 people might be that they “aren’t us.” In the context of the humanities, Oriental art and literature are not imperfect versions of European or American works produced by people who are imperfect versions of “us” and in need of some kind of rescue.

It is an intellectual, rather than political, colonization of regions of the world we aren’t ready to admit we don’t know a whole heck of a lot about. Said Said: “My contention is that Orientalism is fundamentally a political doctrine willed over the Orient because the Orient was weaker than the West, which elided [associated] the Orient’s difference with its weakness” (61).

Consider Nietzsche’s version of truth: ideas “which after long use seem firm, canonical, and obligatory to a people…illusions about which one has forgotten that this is what they are” (60). In this case, it’s not a matter of forgetting, but never knowing. Students in world literature or art classes never knew a time in which Middle East or Islam or Arab didn’t have a mountain of obligatory ideas surrounding the terms.

In essence, it’s a system in which the identities of ours/us are always opposite theirs/them. Foucault contends that there is no normal without an abnormal; if we settle upon Orientalism being a way to comfortably label an “Other” culture, then we place ourselves in the realm of the normal. It is a refusal (whether intentional or not) to identify this culture with our own, reducing a group of human beings to an idea that can be studied and dissected rather than seeing individuals who do share common bonds with us.

Source: Said, Edward. (1995). Extracts from Orientalism. In A. Easthope & K. McGowan (Eds.), A critical and cultural theory reader (pp. 55-61). Toronto, ON: Toronto University Press. (Reprinted from Orientalism, by E. Said, 1978, New York: Random House.)

Book Design in Word or InDesign?

I get a lot of manuscripts in Microsoft Word. Fair enough, as Word’s Track Changes feature is the gold standard for the editorial process. However, once we move from editing to actual typesetting and book layout, Word is horrible. It essentially treats everything (and I mean everything) like a continuous line of text. Graphics placement is a bear, columns and tables are cumbersome, and dependent files are unreliable.

Enter Adobe InDesign (ID). Back in my yearbook days, this was Aldus PageMaker. Adobe bought the company and we now have the ghost of PageMaker present in InDesign. I love the program. It’s much easier to do book layout in an application that treats everything spatially as ID does. No, there aren’t many editorial capabilities for copy in ID, but we’re talking layout here.

Here are some of my favorite comparative reviews from over the years:

Why use InDesign instead of MS Word?

Book Design: MS Word vs. Adobe InDesign

Document Design: Word or InDesign?

Negative Feedback, Effectively

From Harvard Business Review – Your Employees Want the Negative Feedback You Hate to Give

What is clear is the paradox our data reveal, no matter how we slice them. People believe constructive criticism is essential to their career development. They want it from their leaders. But their leaders often don’t feel comfortable offering it up. From this we conclude that the ability to give corrective feedback constructively is one of the critical keys to leadership, an essential skill to boost your team’s performance that could set you apart.