One week from Monday I will mark eight years blogging at Pruning Shears. It’s largely been quiet - my numbers have rarely bumped above the hundreds in visits per week. I like to console myself that this is because I write really smart, well-reasoned and occasionally polemical essays that don’t fit the successful clickbait models (sensational headlines with quick, breathless and highly partisan takes; inspirational drivel; celebrity sideboob). But the more likely explanation is either a) I’m a terrible writer or b) I write about things that few care to read about.
Which is fine, I’ve only run the site as a hobby and (thankfully) never had to make money off it. I write about what I want, when I want. I don’t have an assignment editor making me cover topics I’m not interested in, and over the years I’ve connected with some really interesting people (including the bloggers at Prairie Weather and First Draft). While it would be nice to have massive traffic, I pretty much have the traffic I’ve earned and I can live with that.
I can afford to do that because I don’t need to make a living off it. Professional writers, on the other hand, need to hustle. They have to publicize their work and compose with one eye on capturing the largest possible audience. They don’t have the luxury I do: of writing only about what interests me, and if no one wants to read it then fuck it. Or of writing “fuck” for that matter.
So I understand that pressure, and I also understand the amateur blogger’s desire to get a big server-busting hit. But I have a hard time wrapping my head around how one can make the leap from that to presenting fiction as fact - particularly when doing so involves confessing to crimes. Alice Goffman appeared to do just that in her new book “On the Run.” She quickly backed off when the issue was raised, yet that only prompts more questions. Her statement doesn’t square with her account in the book, so which are we to believe? Will future editions of the book be rewritten to present Goffman’s new, less dramatic account? And how does she reconcile her radically re-worked version of events with the one in the book? Both of them cannot be true.
A story this week had a similar theme. A doctor writing under the pseudonym Hope Amantine wrote about[1] an absolutely extraordinary event that happened during her residency. During a heart procedure, her attending noticed the extra care she was taking:
My attending asked, “Why are you being so dainty with your dissection there?” I answered that I wanted to avoid ripping the cava because they’re so much harder to fix.
I take it he interpreted my comment as fear, and decided upon a teaching moment. He took his scissors and incredibly, before my eyes, and with no warning or preparation of any kind, cut a one-inch hole in the cava.
I was stunned. As I tried to process what I just saw, incredulous that he would actually intentionally make a hole in the cava, and as dark blood poured out of the hole, the tide rising steadily in the abdomen, he remarked, “Well, are you just going to stand there or are you going to fix that?”
Now, whatever else you may think about this, it is not presented as fiction. She doubles down on it as fact in the comments too:
it was a different era. Time will tell if we are better or worse off today… I can tell you that since much has changed in the last twenty years, surgical residents today touch instruments much less often, and many report feeling unprepared for the rigors of attendingship when they have finished their training. Their work hours are restricted, their experience likewise, and I have seen more than a few young attendings that can’t operate their way out of a paper bag. They have been trained in a kinder, gentler environment, and that is great as long as every operation goes as planned.
When there is a computer simulation that adequately prepares surgeons for unexpected anatomy, findings, and intraoperative unplanned “events,” I will be the first one to sing Hallelujah. It hasn’t been invented yet - so until that time, you better pray that you never get a hole in a cava. But if you do, you better hope that the person holding the knife can actually fix it in less than the five minutes it will take for you to bleed to death.
When it became clear that this was not just an appalling breach of ethics or a grimly satisfied reflection on how much better things were back in the day (along with mandatory snark about how soft kids today are), but rather a felony assault, the following got tacked on to the end: “Author’s note 7/8/2015: This is a fictional article. No one was harmed, then or ever, in my care or in my presence. I apologize for any remark that may have been misconstrued.” And the author’s personal blog disappeared too.
What’s frustrating is that in both cases the authors already had compelling material to work with. As Michael Hiltzik wrote about Goffman:
Certainly much of “On the Run” rings very true, and there’s no disputing the vigor of its prose and the percipience of much of Goffman’s observation. Authorities’ exploitation of petty infractions to confine minorities in an endless cycle of fines and court dates and police harassment has been documented in many communities, including Ferguson, Mo. No one can follow news reports of police shootings and beatings of black residents of cities across America and doubt that much of what Goffman described does happen as a matter of course in the neighborhood she dubs “6th Street.”
And Janet Stemwedel on Amantine:
Without a doubt, the central question of the original post is an important one. Trainees perfecting their skills can be cautious in a way that frustrates the more-practiced people training them. That caution is amplified, understandably so, when they are perfecting their skills while working on real patients. It is true that real cases they will eventually face outside the training context may be more serious, more complex, more urgent, and that practitioners will need to deploy their techniques more swiftly and confidently.
Which is why it’s so frustrating when things like this happen. Was neither of them confident enough in the story they were telling to let reality speak for itself? Was there nothing else on the mean streets of Philadelphia that would have made for a dramatic conclusion? Did Amantine have no other tales of god complexes in the operating room that would have seized the reader’s attention? Both authors seem passionate about their subjects. Don’t they realize how severely they cheapen and degrade those subjects by turning in eye-popping reports that get falsified?
I really try not to judge on these things. Sometimes people are under pressures that we can’t know or understand. I’ve certainly done things I’m not proud of, and so far haven’t had any of them held up to the world for comment. I sure don’t know how well I’d handle it if they were. I don’t want to see either writer hounded from public life or drowned in shame. But I don’t want to see them get off without being called to account a little, either. The topics they’re covering are too important for them to get a pass just because their hearts were pure. And in any event, getting it right matters for its own sake.
[1] The story has since been deleted from the site. An archived snapshot is still available here. Twitter user Matt Algren made a PDF of the comments here, which I’ve also uploaded along with a plain text version since one of the comments in the PDF appears truncated. Finally, a snapshot of Hope Amantine’s tweets just prior to her account deletion is here.
Two stories this week, one local and one global, got me thinking about the same question: Is it better to try for everything all at once or to slog away with progress in small chunks? The local(-ish) story concerns the Community Environmental Legal Defense Fund (CELDF) and its approach to anti-fracking activism.
A little background. CELDF got its start in Pennsylvania, where the Marcellus shale represented the first really big fracking play in the east. Communities found they were getting steamrolled by the state, which was paving the way for fracking via a host of industry-friendly laws. Towns that objected were (and are) constrained in their ability to say no to fracking. CELDF’s response was to go for the whole enchilada: instead of, say, challenging laws on the siting of injection wells, they pushed for Community Bill of Rights laws (CBRs).
CBRs assert a Constitutional right of democratic self-determination and argue that communities’ interests in a healthy, sustainable environment trump the oil and gas industry’s preference to treat neighborhoods as resources to be used up and discarded. I am entirely sympathetic to that in theory. In practice though it’s a really tough nut to crack. For one, it’s an all-or-nothing approach - one that basically anticipates eventual adjudication by the Supreme Court.
Unfortunately, cases take a long time to arrive there. While CBRs’ legal status winds its way through the courts, the actual fracking that CELDF finds so objectionable continues unabated. It seems like communities might make a more immediate impact by using the tools they still do have - like weight limits for vehicles, noise ordinances, road use maintenance agreements and so on. Small ball to be sure, but they would slow down - and maybe even end - the activity these communities object to.
That may not be as satisfying, of course. It’s much more spectacular to invoke the country’s founding principles and have a dramatic showdown at the highest court in the land. Anyone want to take bets on how the current Supreme Court would rule in a fight of industry versus local activists?
What’s also troubling about CELDF’s approach is its seemingly cavalier stance towards communities. Usually this attitude is implied, but Valdmanis gets a refreshingly candid admission:
The fund’s rebellious approach has drawn fire from the oil industry, legal experts and established environmental groups. And the criticism is likely to grow as cash-strapped local jurisdictions find themselves on the hook for defending ordinances in court cases they have little chance of winning.
But Linzey says his goal is not to write local laws that are popular, or stand up in court, but rather to trigger a public debate about community rights to local self-government - even if it means a community ultimately falls into financial ruin.
“If enough of these cases get in front of a judge, there is a chance we could start to have an impact within the judiciary,” said Linzey. “And if a town goes bankrupt trying to defend one of our ordinances, well, perhaps that’s exactly what is needed to trigger a national movement.”
Or perhaps a town going bankrupt will go bankrupt in obscurity. Perhaps suggesting that a national movement will be triggered by CBRs is just a way to rationalize a shitty little vanity project that doesn’t give a good goddamn about the communities which will bear the burden of the project’s likely failure. Perhaps communities that end up with gutted public services (and jacked up prices for the ones that remain) will not take the same detached view of things as CELDF. And perhaps those communities would have a palpable sense of gratitude towards, say, fracking companies that threw a few coins towards mitigating the damage wrought by the wild-eyed radicals. If we’re going to wax philosophical on how the CBR approach will shake out, perhaps those are more likely outcomes to consider.
The global story is Greece, and the “go big” item here is Sunday’s referendum on - essentially - Greece leaving the Euro (Grexit). I’ve seen analysis with wildly divergent viewpoints on it. Here’s a more optimistic (an extremely relative term with Grexit) take from David Dayen: “It may initially look terrible – there’s no sunny scenario here – but the status quo offers no hope at all, and over the long term, the country can rebuild.”
Meanwhile, Daniel Howden has a darker prediction:
It has been clear that the choice awaiting Greece was a future that looked like Portugal, a degraded economy of the European south; or Serbia, a proud nation led into the international wilderness by populists’ lies and fantasies of Russian rescue. Neither are good choices, but one is incalculably worse than the other.
Like with CBRs, I’m theoretically massively in favor of Grexit. Greece is now in the fifth year of a Depression and austerity budgeting has created unconscionable human suffering. Dump the bastards, make them eat their losses, crank up the printing presses, shove handful after handful of drachmas into every Greek’s pockets, export the hell out of your products with your newly-cheap currency, and welcome vacationers across the continent for the same reason. Take a short term hit in order to end the vicious cycle of budget cuts and decreased revenues, and re-assert democratic control of the nation. Oxi!
But again, the devil is in the details - and revolutions usually don’t bother too much with anything outside the big picture. It doesn’t seem very wise to simply postulate the existence of engaged and competent leadership, available social infrastructure (like a large and well prepared civil servant class), etc. There are a multitude of factors that have to line up in order to avoid long term, widespread misery. From the outside none of that can be measured, and maybe not from the inside either. It’s like not knowing the width of a chasm until after you’ve jumped.
A viable left government should always demonstrate a clear understanding of the practical demands and strategic necessities of directing and carrying out the activities of the state. They should always foreground the interests of poor and working people and demonstrate an understanding of the impact their work will have on the immediate lived realities of the people. They should also be able to speak to the specific interests of other political actors and be able to propose practical changes to the mechanics of policy and administration. Syriza has consistently failed in these things, and increasingly so in recent weeks.
So, again as with CBRs, my enthusiasm fades.
This isn’t meant to be an endorsement of incrementalism, or maybe it is and I just don’t realize it. Sometimes big, sweeping changes are called for. You can’t get across that chasm in two half jumps after all, and sometimes the best you can do is estimate the risks and jump without knowing exactly how far you have to go. Maybe the kind of leaders who usher in revolutionary changes are clueless about managing the thing afterward, and must necessarily be succeeded by more technocratic types.
All of that may be true, but it would be nice to see what’s called the prefiguration too: What do you do the day after you succeed? Say you get your wish, how then do you move forward? What are the steps for implementing the plan? You can’t just say, let’s win first and we’ll take it from there.
And of course, it would also be nice to let everyone know what to expect should the grand design fall apart. It’s been known to happen.
The debate over the Confederate flag that has erupted since the Charleston massacre has, for the most part, put the flag’s defenders on the spot in a way that doesn’t seem to happen very often. It usually seems like a day or two of mumbling about heritage is enough to stave off those who want the flag taken down, but that hasn’t worked this week. Journalists - particularly at the Post and Courier - have been pressing representatives to publicly state where they stand. There’s palpable frustration that candidates equivocating on the issue are being questioned more closely than those who aren’t.
(As for Tantaros’ complaint that the Democrats were the party of Jim Crow and, for many years, segregation: yes, and it’s a terrible stain on their history. But then they passed federal civil rights legislation about fifty years ago and the Dixiecrats stormed out of the party. And by the way, if tolerating them was terrible many years ago - which it was, and Tantaros is right to imply as much - then how much worse is it to house their ideological descendants in 2015? Or more pointedly, what does it say about a political party that would have those racist crackers?)
Others on the right are trying to stake out a savvier position, namely that the fiendishly clever libs have laid a trap for conservatives by forcefully denouncing the Confederate flag and what it stands for.[1] I tend to put the causation in the opposite direction; liberals were incensed by the Confederate flag’s symbolism so near a racist terror attack, and denounced it. The right just happened to be poorly served by its animating impulse (“today’s conservatism is the opposite of what liberals want today: updated daily”) in this case.
In any event, it’s been surprising how poorly the fight has gone for the neo-Confederates this time around. They usually don’t lose ground like this. The intensity and relative length of the story (a week is a long time to stay in the headlines in the Internet era) has also caused some of the more obscure talking points on the issue to bubble up. I’ve encountered two new - to me anyway - narratives on how slavery’s evil is mitigated. If the puke funnel theory of the news is correct then there’s a decent chance at least one of the following will be a talking point when the issue flares up again.
The first and more plausible theory seems to have its roots in Barbara J. Fields’ paper “Ideology and Race in American History.” In it, Fields traces the history of African slavery in the US. One of her themes is that because the idea of race is nonsensical, any attempt to interpret history through race is a fool’s errand and will inevitably fail. Other factors - most notably class - will be neglected, which will further distort our understanding of what actually happened.
There’s a lot to be said about that approach. The idea that monied interests in the North and South had a common goal of creating large scale industrial agriculture - in opposition to the common interests of small scale farmers both black and white - is fairly compelling to me. We might well have underappreciated class dimensions in our history because race has been so prominent. We might also wonder, as Fields does, what our history might look like had lower class interests shown more solidarity.
But she isn’t naive either. The fact that America’s institution of slavery used only African slaves from the beginning doesn’t escape her notice. She just draws a distinction between “prejudice and xenophobia” as the de-humanization of The Other, and “the enterprise of classification and identification” - the pseudo-scientific gloss - that came to be known as racism. She doesn’t think racism caused slavery because the very idea of race is absurd - but prejudice and xenophobia were very much factors.
It’s a fine distinction, one certainly worth exploring in an academic paper, but one that can easily get lost in public discourse. Prejudice and racism are used pretty much interchangeably in everyday use. So when some clever academic on Twitter states that “Historians discarded this ‘racism caused slavery’ nonsense decades ago” using Fields as a citation, it throws a wrench into the discussion. Using a technical definition of a word in a vernacular dialogue is too clever by half at best - and outright deceptive at worst. But maybe that’s where we are headed.
Then there is the less plausible theory. It’s actually just a part of the much larger politics of grievance on the right, which in this case can be summed up as Whites Are The Real Victims. The most perfect expression of how completely untethered from reality this thinking can get comes from Kennewick Man. It seems that in 1996 an 8,500-year-old skeleton was found, and self-styled anthropologists with no axe to grind at all determined that this was a white man murdered by colored savages. So whites were really here first! And were the original victims of genocide in North America! Haha, we win!!! Or not.
(In perhaps the most hilariously clumsy piece of messaging in human history, black or white, the forensic reconstruction of Kennewick Man’s face is a reproduction of Captain Picard from Star Trek.)
I learned this week that Whites Are The Real Victims of slavery too. At least, unlike with Prehistoric Patrick Stewart, there is an actual generally accepted historical record to fall back on. But as with the discussion of racism it requires a little sleight of hand.
Ever hear of the Barbary slave trade? Pirates from Algeria and elsewhere in northern Africa raided European coasts - even inland villages and towns - and carried residents back to serve as slaves. We don’t have accurate records from the time so good numbers can’t be had outside of extrapolation, with the upper estimate being around 1.25 million. A huge number, though still about one twelfth the number of African slaves.
Barbary’s slave system had some key differences from America’s too, perhaps the largest being a ransom system. Europeans who raised enough money could buy back a slave, making that institution more volatile across generations - but the bottom line is slavery is slavery and it’s all abhorrent. Of course, in the present case that means getting “BUT THERE WERE WHITE SLAVES TOO” tweets as a way to point out that our slavery wasn’t so bad after all. Instead of saying that maybe we should reckon with our past and let Algeria reckon with its past, we should apparently just declare the entire subject off limits - or only describe it in the blandest, most technocratic terms.
So there’s your preview of (possible) coming attractions: it had nothing to do with racism and anyway white people were enslaved too. I’m already nostalgic for the comparatively high-minded dialogue we’re having at the moment.
In the wake of the Charlie Hebdo massacre there’s been an increase in a very particular kind of commentary: dissident Muslims critical of Western liberals for being reluctant to weigh in with forceful condemnations of Islamic intolerance. Ayaan Hirsi Ali raised the issue in April, and last month Eiynah at Nice Mangos made similar points. Since both women come from Muslim cultures I tend to give their commentary more weight than, say, Bill Maher, whose acquaintance with Islam consists of what he sees in the news. As they say, you only hear about the planes that crash.
Ali and Eiynah raise good points though. I think it is true that Western liberals like me tread more carefully when it comes to criticizing Islam. There are several reasons for that, the first being familiarity. As a general rule it’s a good idea to be a little more tentative when dealing with an unfamiliar topic - which Islam is for many of us. Christianity is the dominant religion here. Even those who are atheists or who grow up without practicing it can’t help but learn about its rough contours through osmosis. That puts us on surer footing and makes it easier to weigh in sensibly on the subject. Not doing so with Islam is less about moral relativism or some misguided stab at sensitivity than it is a healthy respect for our own ignorance.
To the extent that we do comment, we often first try to draw parallels with our own experience with organized religion - and again, Christianity is our reference point. And the Christianity we see in the news disproportionately represents the Westboro Baptist churches and Ku Klux Klans of the world. They aren’t the majority of Christians and we know that, but someone observing from the outside could easily get that impression. When we see something similar elsewhere we take that into account.
We also take into account how religion is interpreted. We see fundamentalist Christians take a line of scripture about homosexuality being an abomination - right next to, say, a line that says shellfish is also an abomination - and think: yeah, maybe they’re not getting a real good reading on that. Mainstream Christian denominations are more likely to interpret those passages as artifacts of their time and not the inerrant word of God. It isn’t hard to imagine zealots of another religion having such a literal reading of their sacred texts.
Most importantly, our stance towards Islam can’t be divorced from our recent history. Before the war in Afghanistan was launched we were told that it would liberate women there. It was pretty obviously done to give progressives a reason to support the war, and at a minimum it created some hesitance to oppose it. After it was clear the war was having a disastrous effect on women, some remained deeply conflicted about it. And of course, once second thoughts began to emerge we were treated to graphic depictions of what would happen if we left.
Islam’s status as a religion, and its stance towards women in particular, cannot be divorced from our foreign policy. The issue gets raised here not for the promotion of more enlightened thinking but as the prelude to a new round of freedom bombing. It’s not like we’re saying: there’s a humanitarian crisis happening, let’s set up an extensive network of well-supplied refugee camps to reach out to and aid the victims. It’s: let’s have a military response. Criticism of Islam has slipped too easily into demonization, and greasing the skids for the latest adventure, for many of us to feel comfortable signing on too freely to it.
I understand the frustration people like Ali and Eiynah have with that. They’ve felt (and continue to feel) the effects of religious intolerance firsthand, and would like the vocal support of those who would seem to be natural allies. In the West, though, Islam is not as well understood and is often treated as a monolith. Liberals want to be extra careful that condemnation of its violent lunatics not be conflated with comment on the religion as a whole - a reflection on the planes that don’t crash, as it were.
We know how Christian fundamentalists use the Bible to justify hatred and intolerance, and are anxious not to allow a corresponding criticism of extremist Muslims to be taken as a comment on Islam generally. And since we’ve seen how the Othering of Islam has been bundled with highly problematic responses, we know that any such commentary is not made in a vacuum. There is, in short, an awful lot of context that Ali, Eiynah and other Muslim-raised critics do not account for.
Saturday morning northeast Ohio came to a standstill to watch judge John O’Donnell’s live reading of his verdict in the Michael Brelo trial. (Background here.) I’ve uploaded an OCR copy of the verdict here if you’d like to read it for yourself. I was fairly astonished at this portion (pp. 20-1):
Despite not being convinced of which shot it was, I have found beyond a reasonable doubt that Brelo fired a shot that by itself would have caused Russell’s death. But proof of voluntary manslaughter requires a finding, beyond a reasonable doubt, either that his shot alone actually caused the death or that it was the “straw that broke the camel’s back,” to use Justice Scalia’s locution, combined with other non-lethal wounds. Dr. Wiens opined that each of the four wounds was fatal if suffered alone and she described all of them as antemortem, i.e. pre-death…. In other words, any one of the four caused the death, and not necessarily the first to hit Russell, since the time from injury to cessation of life varied depending on the wound.
Ultimately, Dr. Wiens could not offer an opinion on which antemortem wound caused death first, leaving me as the finder of fact to guess at which of the four undoubtedly deadly bullets caused the “cessation of life.” Guessing and being convinced beyond a reasonable doubt are incompatible. Brelo’s deadly shot would have caused the cessation of life if none of the other three were fired, but they were and that fact precludes finding beyond a reasonable doubt that Russell would have lived “but for” Brelo’s single lethal shot.
Because three unequivocally fatal wounds were caused by one, two or three other people besides Brelo, and because I am unable to find beyond a reasonable doubt which shot caused the cessation of Russell’s life, I find on count one, the voluntary manslaughter of Timothy Russell, that the essential element of causation has not been proved beyond a reasonable doubt.
Not being a legal scholar, I had been unaware of this particular bit of jurisprudence. From a layman’s point of view, I would have thought the law would take the opposite position: that if multiple people fired fatal shots, each one of them would be liable for the death. If they all fired shots just prior to death, and the law considers it manslaughter only if a single person fired the fatal shot, then the sensible response is for everyone to empty their clips. Doing so increases the odds of more than one person firing fatally, in which case everyone gets acquitted.
It also makes it sensible for everyone to make sure they are armed to the teeth, and to fire all of their bullets in response as well. Timothy Russell and Malissa Williams were unarmed. But once you know the law’s stance on fatal shots, it only makes sense to pack as much heat as you can and use every last bit of it in order to give yourself a fighting chance. That’s particularly true if a car backfiring or another officer’s gunfire will be taken as grounds for a reasonable belief that the suspects themselves are armed and firing. While that kind of shooting gallery might be the NRA’s wet dream, I don’t think most people would regard it as very desirable.
Endorsing such a wild state of affairs means, then, that O’Donnell’s approach to the verdict was not a strictly legal one - it was political as well. He chose an arch-conservative jurist to buttress his reasoning. While that is perfectly legitimate - he was citing a Supreme Court justice and not a shouting head from Fox News - it certainly gives an indication where his sympathies lie. He could have instead looked to a centrist or liberal. Given how Scalia’s logic inevitably led to such a curious outcome in this case, he also could have cited it in order to specifically reject it. New case law gets created precisely by challenging precedent. Perhaps Scalia did not anticipate his reasoning leading to the kind of outcome the Brelo trial ended with, and he may have been willing to revisit it.
O’Donnell chose not to do any of that, though. He took the quote and treated it as binding. Which again: that’s legitimate. But it’s also something that one does when one agrees with the reasoning - and that in turn is an expression of one’s political position. O’Donnell didn’t make his ruling in a vacuum, and it wasn’t the result of a detached and objective review. Like all decisions, it reflects his biases and predispositions. It was made in a context, and that is how we should understand it. If, having done so, some observers conclude there’s a thumb on the scales of justice, well, it would be hard to blame them.