Assassination territory

The other day I was at work and got tapped to contribute to a big story we’ve been following: the shooting at a house party in which a teenager, in a jealous rage, killed three other teens and wounded a fourth.

We’d just received transcripts and CDs of audio recordings of 911 calls and police and paramedics’ radio chatter from that night. There were hours of recordings, not very well indexed, so we split them up at random and transcribed them. I got what turned out to be the first paramedics’ conversations as they arrived on the scene and started finding the victims, first the wounded teen, then the first dead one, then the others.

These were audio recordings, so I could only imagine the visual scene before the medic’s eyes, but I could tell that, behind the clipped professionalism in the medic’s voice, there was real emotion being kept in check when he reported, “Updated victim count, three black.”

After several hours of this — most of the rest was pretty mundane staging and traffic instructions and calls for chaplains to help the survivors — I took off the headphones, wrote up my little bit for the next day’s paper, and proceeded to look on the internet in an effort to clear my head before I went to work on the next thing.

And I learned that Donald Trump had pretty much suggested someone should assassinate Hillary Clinton, or perhaps just a federal judge or two she might appoint.

I’ve long since grown tired of Donald Trump and his continuing to ratchet up inflammatory speech and fly even further off the handle the lower his polls dip. After each Monday outburst, followed by two or three days of doubling down, then trying to spin it, then blaming it on Hillary Clinton or President Obama or the media… it gets tiresome and boring. Until the next week. What’s next: “Ya know, sharia law isn’t all that bad, because it’ll keep… well, some women… I have lots of women friends, all the best most beautiful women, right? Believe me, they all love me. I’m great with the women. But there’s a problem, right?”

I exaggerate, but only because he hasn’t called for instituting religious law as the law of the land yet. Ask me in a week or two.

But back to the point: we’ve crossed a line here.

Read that again. I said “we” have crossed a line. Not “he.” But he certainly did, too. That’s not debatable, and the Secret Service agrees.

The reason I say that is because, as appalling as it is that a candidate for president is hinting at the murder of his opponent —

Excuse me. This is for everyone who just held up a finger and said, “But he was joking,” or “He was talking about encouraging gun owners to vote,” or “He was referring to gun owners using their voting power to deter regulations.” Please. Everyone in that room knew exactly what he meant when he said it. Nothing else matters. Now let me continue.

— the murder of his opponent, we have some culpability for mainstreaming this kind of behavior. Yes, Donald Trump is the end result of three decades of Republicans combining their race-based divide-and-conquer electoral strategy with scorched-earth delegitimization of anyone else’s claim to power. But if it weren’t Trump, it would have been someone else emerging from that fetid swamp.

Trump may be unique in his combination of charisma and thorough derangement — history does not look favorably on those kinds of people when they obtain power, and many people suffer horribly because of them — but he is a weed that sprouted from well-fertilized soil.

Our responsibility for this comes from allowing this soil to be fertilized.

I take my job as a journalist seriously, but I am also a harsh critic of the failings of my profession. We’re the only profession specifically called out for protection in the Constitution, but, as they say, there are some implied responsibilities that go along with that, chief among them that we should help our readers (and viewers and listeners) make better sense of the world.

Let’s get a few things straight here, to put this in the proper context, because when people talk about “the media” (or “the mainstream media” or the “media elite”) they’re usually bringing to it a fair amount of political baggage. When I’m using it, I’m referring to those people who practice journalism. And that means a few specific things:

Journalists research and report the facts by going to the best sources available. That means they don’t retweet others and call it a day. They don’t write opinion pieces with nothing to back up their claims without clearly labeling them as such (such as this very piece on my personal blog). It means if someone said or did something newsworthy (or even trivial, if we’re going to report it), they check it out with witnesses, video, public records requests of emails and transcripts and other documentation — anything that gets closer to the truth. They don’t give credence to sources that aren’t in a position to have authority on a subject, they don’t traffic in hearsay, and they don’t use weaselly language to try to get around a less-than-airtight story, such as, “People have been saying…” or “It is widely believed…” If you can’t put a name to it, you haven’t got the goods.

Journalists do their utmost to be accurate. When they make an error, they admit it, they correct it, and they don’t repeat the error again. Some journalists let their egos get in the way of admitting when they screw up. This is an on-the-job failure. The reporter has a job to do and his or her ego isn’t relevant to it.

Journalists must be fair. This is a loaded word, and its meaning often gets confused with that of “balanced” (wonder where that comes from?). But let’s go to Webster. Here’s the most applicable definition of the word:

6a : marked by impartiality and honesty : free from self-interest, prejudice, or favoritism <a very fair person to do business with> b (1) : conforming with the established rules : allowed (2) : consonant with merit or importance : due <a fair share> c : open to legitimate pursuit, attack, or ridicule <fair game>

OK, but what does that mean for journalism? It means you don’t take sides in your reporting, for one. I’ve interviewed plenty of people I’ve personally disagreed with, and I’ve always striven not just to quote them correctly and represent their views accurately, but also to make sure their comments are taken in the appropriate context. Journalists aren’t supposed to have visible political views — and their reporting should absolutely reflect that — but personally? I don’t care who knows that I’m a political liberal. I do care that people believe I have the professional integrity to do my job and that my personal views don’t affect how I do it.

But look at section (b) in the definition above. Fairness also means conforming to established rules and being consonant with merit or importance. That means that we journalists have to recognize that we don’t work in a vacuum. Context matters. We play by the rules (we do not intentionally misquote you) but also we are not going to stay quiet if you violate the rules (you claim we misquoted you when we did not). Being a journalist does not mean we are not allowed to defend the integrity of our work when we know it (by virtue of evidence, sourcing and so forth) to be true.

Section (b) also tells us that we should give each side what it’s worth. This is where people confuse “fair” with “balanced”: One word says you must treat all sides equally; the other says you must make a judgment call as to the worth of each side. Coupled with definition (a), which compels honesty, this leaves us with a conflict: How are we to honestly judge the merits of each argument in order to render a fair report of it?

Here are two hypothetical examples.

  1. Person A, who advocates lower taxes, claims lower taxes for businesses help them grow. Person B, who advocates raising taxes, claims many big businesses do not pay their fair share in taxes.
  2. Person C, who represents a civil rights organization, says a new law unfairly targets minorities for discrimination in voting. Person D, who represents a white supremacist organization, says the new law is just fine, and in fact should be stronger.

In the first example, neither of these statements has any inherent moral advantage over the other, and absent hard, verifiable data that proves or disproves one or the other side, there’s no real reason to give more merit to one rather than the other.

In the second example, one of the positions clearly is untenable. Person D’s position is worth less, not because of the details in the law (absent having the law in front of us to judge the language for ourselves), but because of who Person D is. Context matters. And in this case, the context is that we live in a country where voting is a Constitutional right (15th, 19th, 24th and 26th Amendments, most importantly), that efforts to restrict citizens from voting by race, sex, religion, age (if they’re 18 or over) or financial wherewithal are unconstitutional, and that people with a demonstrated intent of denying others those rights need to be called out on it. We don’t report from a civil rights rally and then say, “And now let’s see what the Klan has to say about this.” There are some things which are outside the bounds of acceptability, and those things are determined by the society we live in, the laws we abide by, and the Constitution that gives us certain rights.

The same goes for the Second Amendment, by the way, but again: context. Read the whole amendment and frame your debate from there. The assault weapons ban that Congress let expire was not found to be unconstitutional, or abridging of anyone’s rights. As with voting, there’s a line to be drawn where someone’s rights can or cannot be abridged, but only within certain boundaries. Race-based voter disenfranchisement is as unconstitutional as saying a person is prohibited from carrying any kind of weapon for self-defense in any scenario. We attach conditions to these: Felons cannot vote, nor can they own or buy guns; part of the punishment felons suffer for having violated our social contract is that certain of their rights are abridged. We could change both restrictions tomorrow without changing the Constitution, so even those laws are not inviolate. But the point is, we currently have a system that allows two seemingly contradictory legal points to stand. Where do you draw the line on the Second Amendment? Maybe that line is between handguns and AR-15s, maybe it’s between semi-automatics and full automatics — already tightly restricted, by the way — or maybe it’s between modern firearms and flintlocks, or between Alaska and the lower 48. Those are all valid debate points well within the confines of the Constitution.

So when I say that fairness is not the same thing as balance, and that often the two are at odds with each other, it’s about considering where the boundaries of acceptable debate are, and weighing positions against each other if they’re within the boundaries, or if they’re outside, weighing them against the boundaries themselves.

That also brings me back to where I started, where a political candidate can wink-wink obliquely call for someone to shoot his rival OR MAYBE NOT WHO KNOWS? and we in the media, and in the rest of society, can just brush this off as yet more he-said-she-said. “Tell me, Mrs. Clinton, what do you think about being targeted for assassination?”

That’s pretty much where we are now. The media has become so inured to despicable behavior this year that calling for someone to be killed is just a little bit more shocking than tossing a baby out of a rally, which is just a little bit more shocking than implying that a Gold Star parent is a terrorist… let’s now see what the other side says, shall we?

Add to this the journalism profession’s hypersensitivity to being seen as having a liberal bias, and quite a few of my fellow reporters are hesitant to call the play as they see it: out of bounds. It’s a pre-emptive way to dodge a political attack that is inconsistent with the values of the profession.

But let’s do this one thing, and restore a bit of fairness to these proceedings.

  • It is not fair that, when a candidate blatantly lies (not “misspeaks” or “dissembles”) about things that have been demonstrated to be true without any doubt, we let the lie stand without pointing out the lie immediately, or as soon as possible, so that the context of the correction is not lost with the passage of time in a hyper-accelerated media environment.

Some journalists are now doing this, and providing real-time fact-checking to compensate for a political candidate to whom the facts are irrelevant. Case in point: “Hillary Clinton wants to eliminate the Second Amendment.” Fact: There is no evidence of this. No video, no speech. She has not said so, nor has she hinted at it. Gun control does not mean the Second Amendment must be overturned (see above), and in fact the two can co-exist quite comfortably. But we in the media have neglected our duty to the people we are supposed to be helping make sense of everything. People are free to form their own opinions no matter what the local paper says, but if we’re not at least saying, “This is demonstrably false, and now I’ll demonstrate why,” we’re not doing our job.

  • It is also not fair that comments that come from outside the bounds of our social contract — overthrowing the democratically elected government to establish a Communist people’s republic, say — should be given as much weight as those that come from inside the bounds: Let’s raise taxes to pay for more social services. Context matters here, too. Back in the days of the Soviet Union, calls for democratic elections were likewise outside the bounds of their social contract, as was the case in many police states. We might have hailed those calls for elections from our standpoint here in the West, but were we taking Pravda to task for not encouraging democratic reforms?

To clarify: journalism in the Soviet Union was not the same as it is or has ever been in the United States. Newspapers were deemed to be the mouthpiece of the government and the Communist Party, full stop. If you bucked the system, you found yourself in jail and the newspaper shut down. This is not to excuse that. A critic might accuse me of imposing a degree of moral relativism here, and they would be correct: Judging another system by the standards of our own (and vice-versa) is necessarily not a fair assessment in either context. But that same moral relativism also allows us to take a broader contextual look. Let’s say we’re not talking about our Constitution-protected free press, but rather human rights, among which is freedom of expression. The UN’s Universal Declaration of Human Rights is one such venue in which, morally, the U.S.’s free press is weighed more heavily than the Soviet Union’s state-controlled press (see Article 19). It’s not the same as the First Amendment, but it provides the moral authority and a wider context in which to make that moral argument. Increase the context widely enough and who knows? Perhaps the United States would be found wanting for not guaranteeing paid vacation for every worker. It’s all relative, as they say, but if you know what context you’re working in, that is fine. Back to the main point.

  • It is also not fair that we do not consider and relay the context in which we are operating when we make moral decisions about what we cover and why we do it. Why are we, as journalists, not pointing out that what Trump is doing is considered outside the bounds of decent behavior and is quite possibly a danger to the republic? Some of us are. Not nearly enough, in my (admittedly, see above) biased opinion.

Several years ago, Stephen Colbert coined the term “truthiness” to mean “facts that just feel right.” Colbert is a comedian and a masterful satirist, and he was clearly sending up then-President George W. Bush, who had a, let’s say, rather uncomplicated view of the world. But Colbert was also sending up the media who played along with Bush, and allowed him to make unsubstantiated claims about (for example) weapons of mass destruction in Iraq, because really: are we going to believe Saddam Hussein, who everyone knew was an evil psychopath, or our own president, a bit dull-witted, but kind of funny about it and goshdarnit, he’s one of us?

It wasn’t just about Bush. We’re seeing, in this late period of the 2016 presidential campaign, the resurrection of 25 years of shit that the Republicans have been flinging at the Clintons from the day Bill took office in order to see what would stick so they could delegitimize this interloper from the sticks of Arkansas who had the gall to win the election. We’ve internalized it, and we drop phrases like “the Whitewater scandal” or “Clinton’s impeachment” without stopping to look at and explain what that actually entailed. And by “we,” I mean we in the media as well.

Is there media bias? Yes. The bias isn’t necessarily liberal (as many individual reporters are) or conservative (as many media executives and owners are). It’s a bias toward uncritical acceptance of aberrant behavior (I remember a time when we didn’t have mass shootings every week, and when presidential candidates didn’t threaten their opponents; this week’s work, by the way, is the second local mass shooting I’ve covered in the past three years, and I’m not even on the crime desk) and an unwillingness to ask uncomfortable questions that might make our jobs harder and expose us to criticism from people who don’t want to understand truth in context, or worse, whose interests lie in undermining it.

So now we’re in assassination territory. Here’s the context: Four of our 44 presidents have been assassinated while in office. Attempts have been made on at least seven sitting presidents (including Lincoln, eight months before his actual assassination), not counting plots foiled in advance (three of the five attempts on Obama’s life), attempts that didn’t put the president’s life in imminent danger (Nixon in 1972 and 1974), attempts on the president’s life while traveling abroad (Hoover in Argentina, Bush in Kuwait), or when the would-be assassin changed his mind before making the attempt (Kennedy in 1960).

Interesting tidbits from this well-sourced Wikipedia article: Obama has been targeted in five separate attempts, and Bill Clinton four times. Lincoln was targeted three times, the third time being successful.

In short: 11 out of 44 is a 25 percent assassination attempt rate against sitting presidents over the entire history of the United States. That is a really high rate of political violence for a developed country. That’s a rate you’d sooner expect in a banana republic that has a coup d’état every decade or so.

The cynical thing to say is, “Hey, so this is normal, right? What Trump said isn’t all that big a deal!”

The fair thing to say, however, is this: It’s way outside the bounds. This isn’t a joke, and we should seriously question the morality and intentions of those who say it is.

Memorial Day, 2016

I spent today at work covering a Memorial Day ceremony and fell into talking with a couple of Air Force vets about their experience in Vietnam. That got me thinking about the vets in my own family: Air Force, Navy, Marines and Army going all the way back to the Revolutionary War (hello, Moses Winters!). Along the way, two of them died in uniform (that I know of; there may have been more in previous centuries). We still have some in the younger generations in the services, but not as many as were in the older generations.

Then there’s this: 1.2 million U.S. service members have died in wartime since the founding of the republic. Their stories stopped then, ended by a bullet or a bombshell or bacteria or just bad luck, but the end result is the same: Here ended a life. Yet many more people survived war than died in it, and those survivors carried their memories forward.

Old soldiers don’t necessarily fade away, they just begin to look like everyone else, perhaps with better posture. But they’ve all got stories, and most of those never get told unless you ask them. They may not want to tell them — that’s understandable. But I’m willing to guess that many never get asked.

There was an essay I read and posted here some time ago about how the “thank-you-for-your-service” line has become almost a reflex, a politeness that may be sincerely felt yet still glosses over the myriad experiences our men and women in uniform have. I’ve heard it delivered one-on-one and also over a P.A. system: “We’ll now board active duty members of our military. Thank you for your service.” OK. That’s a nice gesture, but it kind of misses the point. People don’t join the service to get priority boarding at the airport.

I think the way to avoid that problem is to actively seek out and hear those service members’ stories, understand what exactly it is you’re thanking them for, and what they did for the sake of all the rest of us. I admit I’m a bit prejudiced in this regard: I tell stories for a living and I come from a family with a lot of military experience. But the thing that drives me most in telling stories — of all kinds, whether they’re about the military or schoolkids or animals or anything else — is getting at what is “true” about them: not just getting the facts right (although that’s important too), but knowing how to best understand the experiences of others and render them into words so that others might also understand them in the same manner.

From a writer’s perspective, these are the tools of the trade: setting a scene, developing mood and so on. The end result is about putting your reader or listener into another place and time and helping them understand what others have seen and heard.

I never piloted a helicopter, got shot out of the sky, felt terror in a foxhole or gotten doused with defoliant in the jungle, but I’ve known people who have. And while Memorial Day is about remembering the fallen, and I’m talking about people who are still with us, I think hearing the stories of the living is another way of honoring those who gave their lives in the service of the country. The fallen aren’t here to tell their own stories any more, but others can still carry those stories forward. It’s incumbent on all of us to listen.

Delegate math after New York

I’ve been kind of geeking out a bit on this, during a year when many people (including a couple of candidates) seem to be waking up to the idea that we don’t actually pick our political candidates by direct popular vote.

It’s also the first year in modern times when the actual delegate count is becoming a real issue, for both parties. (If you’re at all interested in this, fivethirtyeight.com has been doing real-time delegate tracking with each primary; it’s worth a look.) It’s rather interesting, and there are lots of little details that get overlooked.

For example: I haven’t seen anyone point out the fact that Ted Cruz will not win a majority of delegates in the primaries. No way, no-how. The die was cast when the fly named Ted Cruz met the windshield named New York Values. If he’d pulled his head out of his ass long enough to consider that he might need some of those New York delegates later on, he might have decided not to take a trip down Dogwhistle Lane. But that implies careful consideration and a deft political touch. He ain’t deft. The human version of a Reddit troll got pwned.

Here’s the math: There are 2,472 total delegates on the Republican side, so a candidate needs the magic number of 1,237 to secure an outright majority in advance of the convention, when those delegates are required to vote the way their state went (“pledged delegates”). Trump has 846 to date. Cruz has 544. Kasich has 149. There are 674 delegates still at large. Trump could get above 1,237, but only on June 7, the last day of the Republican primaries, when California and its 172 delegates go to the polls. Cruz… can’t. 544+674=1,218, or 19 delegates shy of the majority. He could win every single delegate in every race from here on out and it won’t be enough. And given recent trends, he’s unlikely to get anything other than a token minority of delegates.

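If you want to check the elimination arithmetic yourself, here’s a minimal Python sketch using the counts quoted above (the counts are a snapshot and shift with every contest):

```python
# Republican delegate math, using the snapshot counts quoted above.
TOTAL = 2472
MAJORITY = TOTAL // 2 + 1  # 1,237 delegates clinch an outright majority

counts = {"Trump": 846, "Cruz": 544, "Kasich": 149}
remaining = 674  # delegates still at large in the upcoming contests

for name, won in counts.items():
    best_case = won + remaining  # assume he wins every remaining delegate
    verdict = "can still clinch" if best_case >= MAJORITY else "mathematically eliminated"
    print(f"{name}: {won} won, best case {best_case} -> {verdict}")
```
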
Cruz therefore has only one strategy left: stop Trump from getting to 1,237, forcing a contested convention and then trying to wrest delegates away on second or subsequent rounds of voting, when those pledged delegates will be free to vote for whomever they want. And Cruz has played a very good game of working state parties and conventions to make sure that HIS supporters get appointed as delegates to the national convention.

It sounds far-fetched, and maybe it is (considering Trump’s implicit warnings of rioting should he not win). But consider this: Cruz has 544 delegates, Kasich has 149, and Rubio (remember him?) has 172 delegates, who are STILL going to vote for Rubio at the convention, or for whoever Rubio decides they should support. Taken together, the three other Republicans have 865 delegates… 19 more than Trump. So this game isn’t over yet. States to watch, with big delegate counts: Pennsylvania on April 26 (71), Indiana on May 3 (57), and on June 7: California (172) and New Jersey (51), the latter of which is one of those winner-take-all states. There are a few smaller winner-take-all states left too, but I’d guess Nebraska (with 36) is the one Cruz is most likely to take, it being somewhat Bible-Beltish, if not properly within said belt. Indiana might also be fertile territory for Cruz, but Hoosiers award their delegates in a mix of proportional representation and other factors, so it’s unlikely Cruz will win all 57.

On the Democratic side, it is a different kind of race: The magic number to hit to clinch the nomination is 2,026. Clinton has 1,444, Sanders has 1,207, and there are 1,400 delegates still in play in the upcoming primaries. Both candidates have been playing up some rather “non-voting” issues to try to sway the race, Clinton acting as if she’s already won, Sanders as if she’s doing something underhanded to steal it. Neither is true. There’s plenty of room: Clinton theoretically could clinch it on May 17, when Oregon and Kentucky vote, and Sanders could clinch it on June 7, when California, Montana, New Jersey, New Mexico, South Dakota, and North Dakota vote.

However, because the Democratic Party allots delegates proportionally, it’s unlikely either candidate will clinch it until June 7, and maybe little ol’ Washington D.C.’s 20 delegates will actually matter for once on June 14.

What is also true is that Bernie still has an uphill battle: he needs to win 819 delegates, or 58.5 percent of all remaining delegates, while Clinton needs just 582, or 41.6 percent. In other words, Bernie needs to win big from here on out. Each race that isn’t a win by 58-42 raises his must-win threshold on each race afterward. Looked at another way, Clinton could lose every state from here on out, and as long as they’re not blowout losses, she’ll still win it.

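Here’s that same arithmetic as a short sketch, including the ratchet effect (the two contest results at the end are hypothetical, purely to show how the bar rises):

```python
# Democratic must-win shares, using the delegate counts quoted above.
MAGIC = 2026  # majority of the pledged delegates

standings = {"Clinton": 1444, "Sanders": 1207}
remaining = 1400

for name, won in standings.items():
    need = MAGIC - won
    print(f"{name} needs {need} of the {remaining} remaining ({need / remaining:.1%})")

# The ratchet effect: any win below the must-win share raises the bar
# for every contest after it. (Hypothetical results, for illustration.)
need, left = MAGIC - standings["Sanders"], remaining
for delegates, share in ((100, 0.52), (150, 0.55)):
    need -= round(delegates * share)
    left -= delegates
    print(f"after winning a {delegates}-delegate state {share:.0%}-{1 - share:.0%}: "
          f"must-win share is now {need / left:.1%}")
```
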
Looking at the states still in play, I’m guessing Bernie will find the most fertile territory in West Virginia on May 10 (29 delegates at stake), Oregon on May 17 (61 delegates), and on June 7, Montana (21), South Dakota (20) and North Dakota (18). Given the trend so far, it will be difficult for him in many of those to crack 58 percent (or whatever his new threshold will be next week, when a slew of Clinton-friendly states vote). Oregon should be fun.

A word about superdelegates: there are 712 of them, most pledged to Clinton, but they’re free to vote however they like. Most are elected officials already, so they’re likely to remain pledged to the likely winner, no matter who that is, because they WILL be watching the political weathervane. Therefore I don’t think they’ll necessarily play the role of spoilers that the Sanders campaign fears; Clinton (or Sanders) could win the race without them. This is the first time since the party created the superdelegates (as a hedge against populist candidates like Sanders, naturally) that they’ve appeared to be in play. But they’re politicians. They risk blowback if they don’t follow the popular vote, at least nationally if not in their own states. It is possible they could split the vote (along state lines, for example) and push BOTH candidates above the 2,026 threshold. I don’t think the overall race will be close enough that, even if they do split the vote, it would alter the outcome. They’re each individuals, they each can vote however they want, and it’s not a violation of any rule if doing so does the unthinkable and hands the nomination to someone who won the smaller number of delegates in the primaries. It’s a sucky system, in my personal opinion, and Democrats should ditch superdelegates (just as the Republicans wish they had something like that to stop Trump), but it’s the system the party adopted.

One final note: except for a few states which have open primaries, the primaries are a creation of the political parties. There’s nothing in the Constitution that tells parties how to run primaries (there’s nothing that says there should be only two parties either, but that’s another issue), and the two parties have the ability to set their own rules (and both parties have deferred a lot of that to their state committees, creating the hodgepodge system we have now). If those rules are followed, winning is still winning, no matter how odious it is, even if Cruz surreptitiously plants delegates at the convention to swing the vote his way after the first round, or if the Democratic superdelegates band together to turn a Clinton loss into a win. I don’t like it, and this year, more than any in my lifetime, all these tiny little nuances and rules suddenly matter a great deal. But it’s the only deck we have to play with, Sanders’ “political revolution” notwithstanding. So I’m not going to get too worked up over the result of the primaries (either of them). I’m looking at November.

Elections calculus, Trumpism

Color me surprised. Bernie Sanders’ wins in Washington, Alaska and Hawaii yesterday were surprising in their scope (I’d have expected closer victories, similar to what he pulled off in Michigan). They put Bernie in play for the nomination for the first time, really. (It hasn’t really been the case beforehand because Bernie’s path to the nomination has always been uphill running backward, at best. Now at least he’s facing the right direction, so to speak.)

It isn’t over yet, and technically, the nomination is still Hillary Clinton’s to lose. (And she could well do it, too, through some ham-fisted gaffe of her own making that just pisses off a certain segment of the Democratic voting bloc at just the wrong time.) The numbers still favor Hillary, but if you’re tracking delegates, Sanders’ sweep was a big help: According to Fivethirtyeight.com, he needed 81 delegates from Washington, Alaska and Hawaii in order to stay “on track” (as in, not keep falling further behind). Sanders won 98, and his campaign is now trying to persuade Washington’s 17 superdelegates, most of whom have endorsed Clinton, to feel the Bern. In yesterday’s contests, Clinton won just 32 delegates, and needed 61 to stay on track.
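
To make that bookkeeping concrete, here’s a minimal sketch of the on-track comparison, using the targets and results quoted above (how FiveThirtyEight derives its per-contest targets is their model; the targets here are just taken as given):

```python
# "On track" bookkeeping: compare delegates won in a set of contests
# against the target pace needed to stay on course for the nomination.
results = {
    # candidate: (FiveThirtyEight target, delegates actually won) for WA/AK/HI
    "Sanders": (81, 98),
    "Clinton": (61, 32),
}

for name, (target, won) in results.items():
    print(f"{name}: won {won}, target {target}, {won - target:+d} vs. pace")
```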

And yet, these victories are all relative. Hillary is still ahead in the delegate count, and is in fact ahead of the curve; she’s still on a path to lock this up by the June 7 California primary, if not sooner. Bernie can still win, but it probably won’t happen until the very last minute: June 7 at the earliest, and who knows? Maybe tiny Washington, D.C.’s 20 delegates up for grabs one week later will come into play for the first time ever. Because every race on the Democratic side awards its delegates proportionally, the guiding rule in all this is that for Bernie to win, he needs to win big every time. He hasn’t done that so far, and he’s fallen behind. Even narrow victories mean he’s just falling behind a bit slower than before. Saturday’s victories were a bit of a change, but he’s yet to show he can sustain it in states that aren’t largely white in their demographic makeup. With big multiethnic states like New York and California still on tap — and they’re holding primaries, which tend to favor more centrist candidates, not caucuses, which draw out the activist element in both parties — it’s still uphill all the way for Bernie.
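
Here’s a toy illustration of why proportional allocation makes narrow wins nearly worthless to a trailing candidate (the state size and margins below are hypothetical round numbers, not from any particular contest):

```python
# Under proportional allocation, a narrow win barely dents the leader's margin.
def net_delegates(state_total: int, winner_share: float) -> int:
    """Delegates the winner nets over the loser in a two-way proportional split."""
    winner = round(state_total * winner_share)
    return winner - (state_total - winner)

# A hypothetical 100-delegate state at various winning margins:
for share in (0.51, 0.55, 0.60):
    print(f"{share:.0%}-{1 - share:.0%} win: nets {net_delegates(100, share):+d} delegates")
```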

As the campaign gets hotter and nastier, there’s a different calculus at work. I see two possible calculations to consider, and I honestly don’t know which one is correct. Is A>B? Or is B>A? Wherein A = Aghast Republicans Who Remember Eisenhower and Goldwater and Can’t Believe This Is the Best We Can Do and therefore will stay home or secretly vote Democratic come November, and B = Berniacs in a Righteous Snit Who Ragequit the Election Because There’s No Difference Between Shillary and the Rethuglicans and the Country Needs to be Taught a Lesson, and therefore will stay home or secretly vote Republican come November.

That assumes that Clinton ultimately prevails on the Democratic side. If Sanders wins, I see a slightly different equation: B = Bourgeois Moderate and Working Class Democrats Scared of Change who stay home (and maybe switch sides, but that would only be in the case of white male Democrats, mostly).

But on the Republican side, it’s Trump all the way. The party brass may be clutching their pearls and diligently planning out how, mathematically, they will unite around some unidentified moderate candidate to deny Trump the nomination. But while they’re huddled over their calculators and smelling salts, Trump is driving over and past them in an Abrams tank. His momentum slipped a little in the last couple of races as the party got a little more scared, but not enough to fundamentally change the overall vector of the race.

The GOP could do one of two things at this point. One is to stage a coup at the convention, abandoning any kind of lip service to the democratic process to ensure that Trump is not the nominee (and thereby almost surely triggering a third-party Trump spite-run, which would completely splinter the party, possibly for good). The more likely alternative is that the party leaders will make peace with their inner fascists, swallow their vomit and pull the lever for a candidate they all secretly hope dies the moment after he is sworn in.

Given how toxic some of the Sanders politicking has become (and some Berniacs have been mumbling they’d rather see the country flushed down the Trump toilet than hand the White House to a warmongering scandal-prone conservative-in-sheep’s-clothing), I suspect the equation at work is B>A, because (1) Berniacs already are threatening to quit, and a certain subset of them will probably follow through because they truly believe some of the attacks they’re pushing against Hillary (even if they originated in the Republican playbook), and (2) Republicans making peace with their inner fascists isn’t that much of a stretch these days, since their politics over the past 30+ years has been precisely what Trump has been advocating: protecting rich white male privilege and making sure the takers don’t get uppity, keeping (or kicking) out the unwashed Mexicans, Muslims, and other swarthy foreigners coming to America to blow us up, take our white-collar jobs, or both, and so on. It’s the politics of self-entitlement from a party that perennially tries to cast that term on the opposing side.

I say this as someone who likes Bernie Sanders, and even prefers him over Hillary Clinton: I’ve had to mute several Berniacs on social media because their constant my-way-or-blow-it-all-up moralizing is having precisely the opposite of its intended effect on me, and on many other would-be Bernie sympathizers, I’d wager. There’s a bit of angst on the Clinton side as well, and I think the accusations of misogyny aimed at the Bernie camp are way overblown. But the intraparty hate is nowhere near as strident as that originating with the hardcore Berniacs, who are starting to sound like left-wing versions of Ted Cruz. I know, party unity, blah blah blah, but November matters more. We had a real bad run of things in the early 2000s when we had unthinking reactionaries running the show who didn’t care about context or planning. I’m not sure in which universe a president Trump won’t be several orders of magnitude worse than the Bush years, but it isn’t this one.

This seems like the appropriate place to point out that I did not attend Washington state’s Democratic caucuses, for two reasons. One: I’m a journalist, and caucusing is a more persuasive, political exercise than I think I should have a role in — it’s for the party insiders to argue and decide who their representative will be, and who their delegates to the convention will be. And two: Either Democrat is preferable to Trump, by any measure, unless your qualifications for the presidency include anger, spitefulness and capriciousness and notably don’t include things like policy knowledge, a diplomatic temperament, a track record and an understanding of history.

I like the fact that Bernie has been in the race, not least because he’s been holding Hillary’s feet to the fire. We wouldn’t even be having discussions about income inequality, campaign finance reform, free college education and small-s socialism (long a dirty word in both parties) without Bernie in the mix, and his success so far has shown the party establishment that they can’t take progressive voters for granted — the same ones who put Barack Obama in the White House twice. I say this even though I think that Bernie would have a hard time getting any of his policy proposals passed by a Republican obstructionist-controlled Congress, or even a Democrat-controlled Senate (it’s almost impossible that the House will flip before 2022, because most House districts are gerrymandered to be completely safe, and a majority of those are safe-Republican seats). I like Bernie because he truly is a progressive and he’s forced Hillary to pay attention to issues she probably would rather not have had to on the way to her coronation. While Hillary’s perceived hawkishness is worrisome, I think we do need an experienced foreign policy hand to counter the very real belligerence from Russia, China, North Korea and ISIS/ISIL/Da’esh.

And if the hypothetical Berniac is worried about Clintonian hawkishness, you’d think they’d be lining up behind whoever wins the Democratic nomination, because Trump is downright frightening. Josh Marshall over at Talking Points Memo points out that the overall dynamic of Trump’s campaign isn’t the words “Make America Great Again.” Those are just words. Trump’s real message is exerting dominance and taking revenge on everyone who has “laughed at” or humiliated America. He’s talked casually about using nuclear weapons to combat terrorists. “It’s payback time,” he’s said. Revenge does not make for a peaceful foreign policy, and that’s scarier than anything the Democrats could come up with.

Read the transcript of Donald Trump’s interview with the Washington Post editorial board. What comes across more than anything else is someone with no core values other than to promote himself, and who is mentally unstable to boot. He is not the kind of person that any sane individual should want anywhere near the nuclear button.

Hence my worry about the Democratic primary. Sanders decided to run as a Democrat to get access to the fundraising apparatus, true, but I also think he recognized that splitting the liberal and moderate votes in the general election would accomplish nothing except allow the Republicans to take control in November — a Republican party that, even without Trump, has been governed by its most extreme elements for the past 30 years, and shows no signs of moderating. If Hillary wins, I would hope Bernie does the right thing and, during his convention speech (he’s earned that, at least, plus influence over the party platform), not just endorses Hillary but makes a plea for his followers to join forces against the common enemy. He’s a profoundly moral person, and that can go a long way toward smoothing things over. That might not sit very well with those pursuing his political revolution, but Bernie at least is smart enough to realize that there’s a lot more at stake.

(Likewise, if Bernie prevails, I fully expect the Democratic Party apparatus to line up behind him, even if a few blue dog conservative Democrats don’t like it. The alternative is Trump.)

Granted, the Berniacs are not anywhere near as scary as the Trumpshirts, who can’t have a rally without assaulting someone. It’s just a matter of time before they go all Kristallnacht on the local Democratic party offices. The Republican convention in Cleveland should be interesting to watch… from a safe distance.

I hope the Berniacs carry on their political revolution by getting involved in local politics. Third-party challenges to elite power structures seldom succeed from the top down. George Wallace in 1968 split the conservative vote and convinced the racists to work in the background, not out front. H. Ross Perot in 1992 was a flash in the pan. Ralph Nader in 2000 truly was a spoiler, and he didn’t have any coattails for anyone else to ride on (and in 2004, he was a joke that no one remembered the punch line to). Revolutions start at the bottom and rise up from within. This is how the Tea Party took over the Republican Party and remade it into an extremist organization that even now threatens to tear the country apart. It’s how a group of religious zealots took over the school system in Texas, and now dictates the content of history books used across the country. It’s also how a group of moderate-to-conservative southern Democratic governors and congressmen took control of the Democratic Party and steered it to the right in 1992 — to electoral success, granted, but at the cost of a number of policies that have proven inimical to the nation.

The ultimate prize isn’t only the White House. In fact, in this particular election cycle, taking the presidency is just a way of making sure America’s slowly healing wounds aren’t ripped open again. Since the wounds inflicted during the Bush years were pretty traumatic — unnecessary wars and tanking the world economy, for starters — that makes this a really important race.

But in order for the “political revolution” to take root, the Berniacs ought to be looking at the state legislatures, city councils, school boards and local Congressional races. Politics is a numbers game, and it’s not zero-sum except in the White House. The numbers add up over time and space, and a coalition of true progressives in positions of power across the country will have a greater impact than a single person in Washington, D.C. could have. That, more than anything, is the hope of Bernie’s revolution. He’s been calling the faithful along on his crusade, but if they don’t keep moving forward after he’s gone — or even if he succeeds — it won’t have mattered at all.

Scenes from a Quiet Hungarian Border Town

Battonya, Hungary, Sept. 5, 2015

Heavy rains before dawn broke the fourth heat wave of the summer here, in which temperatures stayed above 90, and often 100, degrees for seven days or more at a time.

When the weather is that hot and humid — eastern Hungary, in particular, is a large plain surrounded by mountains, and tends to trap the worst of the dog days of summer — the best one can do is stay indoors with the lights off and the shutters drawn. The worst one can do is be out working in the blistering sun.

That was until this year, when summer, Hungarian-style, took on a new meaning, as thousands of refugees from the Middle East and Africa — mostly Syrians fleeing the bloody civil war — found themselves caught up in a web of east European nationalist politics, camped out in Budapest’s Keleti Railway Station waiting for trains to the west that the government unilaterally decided to cancel.

The news this morning in the national media suggests what may be the end of a crisis that has stretched out for several weeks: buses pulled up to the station at 1 a.m. Sunday, filled up with refugees, and took them to the Austrian border. Cheers, jubilation and cries of “Thank you, Austria” came from the buses as they crossed over and made their way to Germany, where they received a similarly warm welcome.

All the while the masses waited in immigration limbo in Hungary, Hungarian Prime Minister Viktor Orbán was making the rounds in Brussels decrying the Muslim hordes who would overrun Christian Europe, just as the Ottomans did 500 years ago. Because somehow, that’s still relevant in the mind of a modern European nationalist, even if the rest of the world considers it to be unadulterated bullshit.

The government, led by Orbán’s nationalist right-wing party Fidesz, has meanwhile constructed a razor-wire fence along Hungary’s southern border with Serbia, imposed jail time for anyone caught climbing or breaching the fence or otherwise entering the country except at the border crossing, and otherwise tried to make life as hard and miserable for the migrants as possible, as if that would make them think twice about entering the country.

The supreme irony is that for all Orbán’s posturing about keeping the refugees out, he and his party have done everything they could to make sure the refugees, once they were in the country, couldn’t leave.

And then there’s the fact that the migrants had no desire to remain in Hungary, either. Their destination was Germany, where they have such strange things called “jobs,” “civil society,” and “a future.” Indeed, trainloads of migrants arriving in Munich have been welcomed with food, water, smiles, efficient processing of their status by officials and, most important, hope that things will now get better. How could they not, when the migrants have come from a civil war zone that’s been raging for five years, or if not straight from the conflict, then from one of the many refugee camps around the Middle East that are likely to turn into permanent settlements?

The crisis has been long in coming, and while EU asylum rules (the Dublin Regulation, which operates alongside the Schengen agreement governing passport-free travel in Europe) require migrants to be registered in the country they first arrive in — namely Hungary — the fact is that Hungary’s poverty and its indifferent and corrupt bureaucracy left it unprepared for the waves of migrants arriving on its borders this summer.

Not that the crisis couldn’t have been foreseen. Most of the domestic criticism of Orbán’s government has centered on its refusal to prepare for the inevitable as the migrants started showing up in the Balkans earlier in the year. Orbán and Co. would like to blame the effete liberals in Berlin and Paris for enticing all these dangerous foreigners to come into their territory. But finger-pointing at the EU for having convoluted immigration policies pales in comparison to the malign neglect shown by the government in Budapest.

The crisis at least has abated for the moment, but thousands more refugees are getting ready to enter Hungary in hopes of reaching the west, and the government has threatened to detain them again.

Maybe Orbán hopes to continue to milk this humanitarian crisis for political points, to show the nation that he can be just as reactionary and thuggish as the unapologetic neo-fascists in the Jobbik party.


For the past nine days I’ve been in Battonya, a village in southeastern Hungary on the Romanian border where my in-laws live. Here, the drama played out on television. The migration route lies a few kilometers west of here, from Serbia up through Hungary to Austria. What we get is a mix of news from state-owned media, reporting nonstop about the “illegal migrants” causing chaos in Budapest — indeed, they used the word “illegal” no fewer than four times in five minutes during the height of the refugee standoff, to emphasize the point that nothing good will come from their presence in Hungary, even as the government was trying to prevent them from leaving.

Cut away to one of the few independent TV networks left, and there you can see the minority opposition delivering firebrand speeches in Parliament, strong words that mask their lack of any real influence over policy in Orbán’s nascent dictatorship.

The construction of a fence across the Serbian border did cause some folks in the southeast to wonder whether the refugees wouldn’t simply go around it, through Romania, and cross the border here. But the Romanians have thus far kept their border shut to the migrants, and that hasn’t come to pass.

Instead this town has remained its normal sleepy self, the tribulations in Budapest far away and at most a subject for discussion across the breakfast table.

In another irony — there are multiple levels of irony at work in this region, where borders have a history of shifting back and forth, depending on who is in power in Budapest, Moscow or Berlin — the scene of hundreds of Syrians walking from Budapest to the Austrian border called to mind other crossings, most recently in 1989, when the Hungarian government decided to open the border to let hundreds of East Germans cross over into Austria, an action that led directly to the fall of the Berlin Wall and the Warsaw Pact.

Then there was the autumn of 1956, when Hungarians rose up against their Soviet occupiers in an outbreak of violence, hoping that the West would come to their aid. When the revolution failed, thousands of Hungarians fled to the west, across the border into Austria.

Indeed, most Hungarians are more charitable than their government has been, providing water and food for the refugees even while the police refused to let them board the trains to their eventual destination.

If the western border of Hungary has symbolized a gateway to freedom and prosperity, the eastern border has long held a different meaning. For Hungarians, it was the metaphorical edge of civilization (although some living west of the Danube would argue that the river, which neatly bisects the country, marks the real edge of the known world, as it did for the Roman Empire and its successor states). In 1541, the Ottoman Empire rose out of the Balkans and overran the medieval Kingdom of Hungary, starting 150 years of foreign rule. When the Habsburgs expelled the Turks, they took over, replacing Islam with Christianity but nonetheless still calling the shots from a foreign capital. When Hungary finally became independent from Austria in 1920, it was at the expense of two-thirds of its territory, and the eastern border had moved several hundred kilometers west to where it now lies about ten minutes from my in-laws’ front door.

It was the changeable nature of this border, among others that had moved over the years, that gave the Hungarians a sense of self-righteousness, and which turned out to be easily exploited by the forces of xenophobia originating in Nazi Germany. The borders moved again during the war, the spoils of collaboration, and then back again afterward. On Sept. 22, 1944, the Soviet Union crossed into Hungary here, in Battonya, to begin pushing the retreating Germans back west.

There’s still a monument to the “liberation” in town, as there are in many others across Hungary. These monuments still stir conflicting emotions here. They’re a reminder of the bad times of years past, and the worse times that came before: the war, the deportation of the Jews and the German occupation, replaced by Soviet occupation and 700,000 Hungarians sent to the gulags of Siberia.

Hungarians have long internalized and gotten used to this dichotomy in their identity. It gives them a rich sense of irony and dark humor. Hungarians in general long to reclaim their lost history, whether in the form of territory redistributed to neighboring states after World War I, or the influence that came from being joined to one of the last European monarchies, or the “greatness” of the medieval kingdom that stretched from the Adriatic to the Baltic, and which ceased to exist nearly 500 years ago.

But those same Hungarians are all too aware of the law of unintended consequences. Take the current crisis, where in trying to keep the Syrians out, Viktor Orbán succeeded only in keeping them in. Despite the government’s ham-handed attempt to characterize the displaced men, women and children escaping an even worse hell back in the Middle East as Islamist sleeper agents come to undermine western civilization, average Hungarians came forward with water and food for the refugees, doing their best to make a bad situation better. More’s the pity that their own government refuses to rise to the occasion.

Empty thanks

I’ve been thinking about war a lot recently, not just because it looks as if we might be stepping up our military involvement in Iraq for the third time in a decade. And also not just because Russia has been doing its best to destabilize Ukraine, with barely disguised intentions of doing the same to the Baltic states next.

We’re also, here in the U.S., beginning to see an increasing number of veterans returning from the last 11 years of war in the Middle East. Most are re-entering civilian life without problems, although we’re also reading about individual cases of post-traumatic stress disorder, depression, substance abuse, domestic violence … pick your symptom.

The nature of the news media is to tell these stories when they happen, which makes it seem as if we’re dealing with an epidemic of mental health problems. This isn’t to say we aren’t dealing with those problems, but they probably aren’t anywhere near the epidemic proportions we hear about. It’s analogous to how the news reports crime: you only hear about crime when it occurs, not when it doesn’t. You could live in a perfectly quiet neighborhood for years, but if one person gets killed (for whatever reason), the TV is full of people wringing their hands about an epidemic of violence and “You don’t know who to trust any more.” News reports focus on what stands out, not on what happens day after day without change.

But aside from what issues veterans may or may not be dealing with as they re-enter civilian life, there’s been a bit of change in civilian life as well. Call it the “veteranization” of society. It’s most readily apparent at sporting events, where it’s obligatory to parade a few veterans in front of the crowd and publicly thank them for their service. Veterans often get called first for boarding on aircraft, along with first-class passengers, families with young children and people in wheelchairs. Politicians often need to express some sort of gratitude at any event in which someone shows up in uniform.

This is to be expected, and it’s also appropriate to a certain extent. Veterans are here among us; many have been put in horrible situations and asked to do horrible things so the rest of us don’t have to. Meeting someone in uniform, or being made aware that someone was recently in a combat zone, immediately tells us civilians that there’s really nothing we’ve done recently that compares to their experience, either individually or as a measurement of our worth to society. Our most significant contribution to the greater good might have been to land a new sales contract in Ohio, or save our company money in procurement costs, or, at best, help raise money in a 10K walk to cure cancer.

Those aren’t bad things to have done, but they don’t compare with saving your platoon or routing a nest of Taliban fighters that had been slaughtering girls trying to go to school in Afghanistan. So there’s a bit of sheepish guilt involved when the initial reaction is to say “Thank you for your service.” What else can you say?

But it’s also something, as the New York Times pointed out recently, that can really grate against some veterans because it is, in fact, a cop-out, and they know it. We don’t understand what that veteran has gone through recently. Even if we’ve read about that individual’s exploits in the war zone, we weren’t there, faced with the unending and unbearable stress of knowing a single slip-up could mean getting yourself or your friends killed. Couple this knowledge with the fact that not all service members believe in what they’re doing equally. Some were gung-ho, true believers, like Chris Kyle of “American Sniper” fame/infamy, while others like Pat Tillman questioned the rationale for their being there, or the wisdom of those that decided to send them to war in the first place.

There is already a tendency to treat veterans as damaged goods in our society. Part of this stems from the post-Vietnam era, and may be a mixture of our perceptions of some cases of real PTSD and collective guilt for not rallying behind the war or not “supporting the troops” more than was the case.

(As an aside, I think it’s a bit lazy to consider the anti-Vietnam War movement a product of excessive liberalism, or even “liberal” at all. Organized opposition certainly originated from the left, but as for the mass protest movement — well, a lot of those hippies turned into Reagan voters and didn’t utter a peep when we invaded Grenada and Panama and launched a proxy war in Nicaragua. Protesting a war you might get drafted to go fight and die in is self-preservation, and that’s a rather easy cause to support, especially for the narcissistic Baby Boom generation. Why go get killed when you can stay in college, smoke a lot of dope and get a lot of sex under the guise of “free love”? The real liberals were marching in Birmingham and Selma, facing down the fire hoses and the dogs and the National Guard.)

If there’s anything we should have learned from the Vietnam era, it’s that catastrophes can happen if people are reflexively deferential to the powerful, who might have another agenda entirely. By 2003, we seem to have completely unlearned that lesson, and allowed our leaders to drag us into yet another war for dubious purposes: a war that is now entering its 12th year, longer than our time in Vietnam from the Gulf of Tonkin incident to the fall of Saigon. (Although, technically, the war for us started in 1955, and for the Vietnamese it started during World War II as a movement to expel the colonial rule of France and the Japanese military dominance of Southeast Asia.)

In 2015 we’re in a different situation, but there are some similarities. Our military is professional, not conscripted. Every service member chose to be there, knowing they might be called upon to sacrifice their lives, without question, on the orders of someone sitting at a desk thousands of miles away. Just signing up takes a bit of bravery, if you think about it (although not everyone who signs up is fully cognizant of the risks, especially during peacetime, when joining the military may be seen as more of a leg up into the middle class through the GI Bill.)

But there have been times when guilt was deployed as an effective recruiting tool. “Daddy, what did you do in the Great War?” was a recruiting slogan used by the British, and we had Uncle Sam telling us that he wanted us in the Army. That was a time when winning a war required marshaling an entire population into the war effort: into the factories for women, into uniform for men. We haven’t required that level of national sacrifice for a long time, and some of us are quite aware of that when we encounter one of the 1.3 million or so active-duty personnel in our daily lives. They comprise less than one percent of the population, and yet they stand in for the rest of us in foreign wars. Saying “thank you for your service” is the least we can do, it seems, but it’s also the most many of us do. We’re not, for example, “sacrificing” more of our tax money to support programs to end homelessness, or to provide more addiction treatment, job placement and career transition services, and so forth. Some of those services exist, but they’re not everywhere, and they’re definitely not reaching everyone. (And, to get a little political here, it does seem that the one party that publicly aligns itself with aggressive military action is the same one that doesn’t hesitate to cut benefits for those veterans before it would ask the well-off to pony up a bit more to support them.)

More to the point, civilians don’t know what it’s like to be in a war zone, and that’s something active-duty personnel and veterans are acutely aware of. The cost of an all-professional military has been an increased distance from the larger civilian society. The rest of us don’t know what sacrifice during wartime means, and consider higher gas prices to be an unbearable burden.

There is no easy answer to this problem. But it starts with making civilians more aware of what our military does, and making our people in uniform more willing to tell their stories to the rest of us. Empty gestures of gratitude might be appreciated by some, but they don’t help break down the barriers that we’ve erected between the armed services and civilian life. Saying “support the troops” in recent history has taken on a political bent, and one toward quashing dissent rather than providing actual support. The meaning in the broader culture is unambiguous: It’s “Shut up and support the troops,” not “Support the troops with higher taxes to provide social services to returning veterans.”

There are still true believers, in uniform and out, who never questioned the 2003 U.S. invasion of Iraq (I was adamantly opposed, for the record), or the 2001 invasion of Afghanistan (I was reluctantly in favor of it, to the extent that we succeeded in disrupting Al Qaida). There was talk a few years ago about launching military action against Syria (which didn’t have much public support from any quarter), and now we’re talking about escalating the war against the Islamic State, for which there seems to be more support, though it’s less a “hu-ah” chest-thumping than an “eh, somebody’s got to do it and it might as well be us.” (I will say this about George W. Bush: he did an incredible job of ginning up enthusiasm for a war that was absolutely the worst foreign policy mistake in a century, while Barack Obama has been abysmal at building support for a conflict that we have an obligation to see through to its completion. “You broke it, you bought it” doesn’t have the same ring to it as “Smoke ’em out.”)

What needs to happen, regardless of whether or to what extent we go back into Iraq, is that we all need to understand more about what is happening with and to our service members. But it’s probably incumbent on our troops and vets to take the lead in telling their stories. No single story can capture the entirety of a war, and not all soldiers will want to talk about it. But when we start talking about supporting the troops again, we need to be doing it open-eyed, with the understanding that we might not like what we’re seeing. Only then, I think, can we be in a position to say “thank you.” The troops we’re thanking will at least know that this time we really mean it.

UPDATE (May 22, 2015)

After posting this, I went and asked a vet I know, my cousin David, who served in the Air Force in Vietnam. We’re probably political opposites (although I don’t know for sure, because like much of my somewhat diverse family, we tend to shy away from hot-button topics; we’re largely Scottish-Northern European mutts, but we fight like Italians when we get going). But I asked his opinion, since he’s active in the local veterans community where he lives, and he’s often in the position of talking with veterans soon after they arrive back in the U.S. from deployment. No matter how they might feel about the “thank-you-for-your-service” meme (because that’s really what it is: it gets passed around almost without thinking, the same way you might forward a funny cat picture), this is the nut of what David told me. When he meets a vet, or a soldier back from deployment, what he says is this: “Welcome home.” They all appreciate that.

Scotland and Secession

Scotland votes for independence in a couple of days. I don’t have any particular dog in this race, or at least, not a big one. Some of my ancestors were Scottish. They came to the Americas before there was a United States to come to. I’ve never been there, and my travels and the years I spent living in Europe were confined to the continent, specifically in the East.

Yet Scotland poses an interesting conundrum, both in what it means for Scotland/the U.K., and what it means for everyone else.

First, the easy answer: everyone else. Slate recently ran a series of stories looking at various sides of What Scottish Independence Means elsewhere, including for other secessionist movements in Europe, like the Catalonia-Spain rift, and here in the U.S., where we’ve often had populist groups raise their handmade flags for a bit before sheepishly taking them down again once they realized no one was paying much attention.

In the U.S., secession happened once, and it led to the bloodiest conflict in our history. And as a cause for a movement, secession failed utterly. The Civil War did lead to the birth of a nation, but it was a nation in which the ideals of the Declaration and the laws of the Constitution would march in their slow but sure manner toward realization. Before South Carolina raised the secessionist flag, the country was a hodgepodge of local jurisdictions that mostly did their own thing and ignored what everyone else was doing. After the war, the country was a hodgepodge of local jurisdictions that always kept one eye over their shoulder to see whether Washington would object.

The Civil War created the United States as we know it, and the United States it created was the one based on the Union. The South, the old Confederacy, was gone. Jefferson Davis is a name on statues and highways in a few states, and Robert E. Lee is probably known more for his genteel surrender to Ulysses S. Grant than for any of his prewar or wartime deeds. Most other “Founding Fathers” of the Confederacy are known only to those who actively study it. Which is to say: few people alive today.

Nonetheless, a bit of the old North-South schism still lurks under the skin in isolated pockets. Confederate flags were commonplace when I was growing up in Maryland (a border state that stayed in the Union during the war, and now culturally more similar to Massachusetts than to nearby Virginia). The Stars and Bars were commonplace when I went to college in North Carolina in the 1980s. Even in the 21st century someone still trots out that old banner every once in a while, yelling about Southern heritage and how the South will rise again, then screaming about political correctness when it’s pointed out that it’s pretty racist to be doing that.

(I’d ask why it is that people who talk about Southern heritage never talk about anything other than victimization, or race, or Civil War-era grievances, or why it’s only white conservatives who ever bring it up. It’s not like the South never had any black people.)

But for all the ugliness of this barely concealed racism among neo-Confederates, Tea Partiers, and even some pretty mainstream Republicans in the South (especially since the nation elected a black man president of the United States), these are really isolated incidents. Yes, there’s a lot of attempting to organize, and there are plenty of wealthy racists willing to make One Last Stand against the long list of things they believe to be Wrong with America that were not so wrong six years ago. The Republicans may even win the Senate this November. But it won’t be because they play the racist card or promote secession; it will be because people are fed up with the Democrats, too.

For one thing, the demographics keep shifting. As popular as Fox News’ perpetual rage machine is, its core demographic is old. They’re not gone yet, but they’re going. The strategy among southern conservatives seems to be more about gerrymandering the minority vote out of existence before pesky things like demographics put them out to pasture for good.

But for all that — covert and overt racism at the local and regional scale, discriminatory policies, pseudo-threats to secede from the United States and paeans to a lost time when all was sun porches and lemonade and black people kept to the servants’ quarters out back — secession in the U.S. is really a non-starter.

Let’s assume for the moment that we’re not just talking about the South seceding as a whole. Let’s throw in the Vermont or Northwest liberals who want to carve out their own green republics. Let’s throw in the occasional fruitbat conspiracy nut like Cliven Bundy, who thinks he can squat on federal land and declare it his own and outside the law of the federal government that allowed him to use it in the first place.

Let’s talk about “secession.” Can it work here, for any reason?

No. Not really.

Because this nation was rebuilt in a new image after the Civil War, and went on to become an industrial powerhouse, emerging as a global superpower in the aftermath of World War II, we can’t go back. We’ve become very mobile. I myself, born into the middle class in the Mid-Atlantic, have lived in the South, in New England, and now in the Pacific Northwest (the latter decision a whim that’s turned into 15 years and something approximating “putting down roots”). Traveling across the country has never been easier (except they keep shrinking airline seats). Most of us know people from other parts of the country, and most of us, whether we are liberal or conservative, religious or not, find that we have a fair amount in common.

We in the Lower 48 are too American to really be secessionist any more. In the 1850s you could make an argument that there was a bigger difference, not just in the institution of slavery, but in fundamental character (the industrializing, mainline Protestant/Catholic North versus the agrarian, evangelical/fundamentalist South). That’s not the case for most people anymore, especially in southern cities, where the influx of capital, education and immigration has turned former backwaters into major metropolises with diverse populations: Raleigh-Durham-Chapel Hill, Atlanta, Houston.

Economics are really a secondary driver of any secessionist movement. Even gung-ho secessionists in Texas would sing a different tune the minute there was an oil bust. (And while Gov. Rick Perry’s secession comments were taken somewhat out of context, no northern governor would even contemplate that kind of talk; the idea is part of the foundational myth of what makes Texas Texas.)

No, secession is about national identity, and the vast majority of us are American first and something else a distant second when it comes to finding our place in the world.

Scottish actor James McAvoy said it best recently: people may be talking about economics in the Scotland vote, but it has to come from the heart. If you get a new country, it’s yours, for better or for worse, and there’s no turning back.

And you need to assume the worst: A crisis will hit the new country, because they always do. You will get an oil bust in Texas (or Scotland), or Cascadia will hit a wheat shortage, or Vermont’s currency will get debased by Wall Street hedge funds, or maybe a larger power will pick on the nascent Free Republic of Whatever. Will they go running back to the Stars and Stripes? Yes.

But whether there’s an identity to replace that jettisoned Americanness — and I don’t see much evidence of one here in the Lower 48 — an identity that might drive the locals to persevere and see their new country through both good times and bad, is what will determine whether a secessionist movement sticks. Otherwise it’s just politics, and those winds shift every few years even when there isn’t a storm brewing.

The rules are different in Alaska and Hawai’i. Alaska likes to talk up its independence, but for many reasons (not least the Alaska Permanent Fund, which pays people just to live there) it receives more in federal funding than it pays in, so secession is hardly likely there, no matter what the provincial twits in the Palin clan say. Hawai’i, where people of Pacific/Asian descent are in the majority, is another matter, but it too is a big recipient of federal money, and given its remoteness and the substantial tourist trade driving the state’s economy, the islands are likely to stay in the union for some time.

No, the only real secessionist movement of any significance in North America is the one where there is a significant barrier between regional and national governments, reinforced by different cultures and languages: Québec. As one Québécois friend once remarked to me, it’s not a matter of if Québec secedes, but when.

(And while the province’s motto, “Je me souviens”/”I remember,” may seem nationalist on the surface, the author’s original intent might have been to reinforce the notion of Québec as an integral part of a larger Canada: “Je me souviens/ Que né sous le lys/ Je croîs sous la rose.”/”I remember/ That born under the lily/ I grow under the rose.” For those unfamiliar with the symbolism, the lily is associated with France, the rose with England.)

So where does this leave Europe if Scotland goes independent? I don’t know. It’s a tougher question, and I’m comfortable not answering it because I simply don’t have the knowledge of the various little movements all over the continent. Of the ones that might be significant, Catalonia has long wanted to break away from Spain, but Madrid is hardly likely to let it go, lest it give the Basques any ideas. Both Catalans and Basques speak a language other than Spanish, which has helped build their sense of national identity as distinct from that of Spain as a whole. Both groups also suffered under the yoke of the fascist Franco regime from the Spanish Civil War until 1975, so it’s a little harder for Madrid to say “trust us” when that era is such a recent memory. And, like England, Spain also used to be a global superpower that saw its colonies break away one after another. (Really, someone ought to write a paper about countries suffering from National Post-Superpower Complex.)

Belgium may try to split into Flanders and Wallonia (its government hardly functions as it is, divided largely along Flemish-French ethnic lines), but it’s hard to see how that would play out in the heart of the European Union.

The real question mark would be back in the rump United Kingdom of England, Wales and Northern Ireland, the last of which, you may recall, has spent centuries in internecine warfare, with a not-insignificant population who would like nothing more than to break from England once and for all. Because if Scotland can just be let go, why not them as well?

But all these disputes (plus many smaller ones of much less significance) are happening within the confines of the European Union. The Basque and Northern Ireland conflicts have been violent in the past, but have been largely calm lately. The worry in some quarters is that the Scotland vote, if it succeeds, will encourage a return to that era. “Some quarters” may just mean the halls of power in London and Madrid, and those fears may not be borne out on the streets. Or maybe they will.

That, however, shouldn’t be a reason to prevent the Scottish from choosing their own destiny. The campaign has been largely sedate, with U.K. Prime Minister David Cameron’s plea to remain coming off rather weak; comedian John Oliver quipped that Cameron needed to deliver a full-on Love, Actually-style romantic reunion speech. My only hope is that the Scottish thistle can thrive as much as the English rose, whether they’re in the same garden or separate ones.

(Un)pleasant surprises

Well, the missiles are not on their way. For one, I’m glad the U.S. and Russia concocted a plan to avoid the attack, and that the result should eventually be a chemical-weapon-free Syria.

There are plenty of flies in this particular ointment, a lot of big unknowns that will need to become… well, known knowns for everything to work. Namely:

1. Syria could stall, cheat, do everything in its power to thwart the inspections. This sets the regime up for UN action down the road if it’s particularly egregious, although “UN action” is kind of a misnomer, because Bashar al-Assad could disembowel and consume live kittens on Syrian national television and the Russians would continue to pretend nothing happened. Syria is Russia’s last outpost in the Cold War, and they’ll be damned if they’ll give it up.

2. Assad could just ignore the agreement and continue to gas his own people without a care, probably leading to the airstrikes he thinks he’s avoided. Again, Russia will never sign on to that, but Russia never signed on to either of our adventures in Iraq either.

3. Syria could comply, but the war will go on, with the U.S. surreptitiously arming the rebels and Russia the Assad government. Lots of angry speeches will be made.

4. Assad complies, the U.S. keeps to its agreement to not arm the rebels and Russia… still backs the regime. Because Russia.

Still no easy answers. The war’s going to fight itself out, and the only real questions are whether the country still has chemical weapons at the end of it, and whether Assad still sits on his throne.

Once More Unto the Breach

I’ll just come right out and say it. I don’t think the U.S. should attack Syria, and it looks as though we’re going to. The uproar in the West is over the Syrian regime’s use of chemical weapons to kill more than a thousand citizens, most of them civilians, in an Aug. 21 attack on several locations in and around Damascus. This was the “red line” that should not be crossed, according to the Obama Administration (actually, Obama’s position is a bit more nuanced than that; left unstated were the possible reactions to crossing the “red line”). But crossed it has been, and therefore we need to send the regime of Bashar al-Assad a message: cruise missiles.

“War comes at the end of the twentieth century as absolute failure of imagination, scientific and political.” So said poet Adrienne Rich (it appears in her essay collection What Is Found There: Notebooks on Poetry and Politics). The sentiment applies to the Syrian Civil War as well, because if nothing else, our entrance into the two-year-old conflict represents nothing so much as the result of setting a rhetorical and moral trap for Assad and ensnaring ourselves instead. We set the conditions, and Assad called Obama’s bluff.

Now the choices are pretty bleak, so bleak you can’t help but laugh your way through tears of desperation (which makes this situation perfect fodder for The Onion, the funhouse mirror of the early 21st Century that you later realize is perfectly flat). We can send in a few missiles against Assad (the most likely option, which I’ll refer to as the Spanking Option going forward, for all the good it will likely do to prevent Assad from further transgressions against the sensibilities of the West). We can (and may well) increase our shipments of arms to Syrian rebels (which might prompt an increase of arms shipments to the Syrian government from its Russian allies). We can launch a prolonged and expensive bombing campaign to enforce a no-fly zone, which we can’t really afford, and which might also provoke a response from Moscow. We can launch an invasion (see above). Or we can do nothing, and Obama comes away looking weak.

(There’s also the non-option of targeting Assad for assassination, which, in addition to being insanely difficult, simply answers one breach of international conventions with another, and has been illegal under U.S. executive order since 1976. And it might not change the net outcome of the conflict, since someone else within the Baathist regime would step into Assad’s shoes to continue the war.)

(There’s also the added difficulty of trying to remove Assad from power at all. The U.S. doesn’t have a good track record when it comes to creating power vacuums in the Middle East, and among various rebel groups contesting for power in Syria, quite a few are tied to radical Islamic movements that are, shall we say, less than friendly toward U.S. interests.)

The simple, pithy answer was to not get into this in the first place, to not set a “red line” that should not be crossed. In fact, it’s fair to say that the “red line” argument is a direct extension of the Bush Administration’s hyping of weapons of mass destruction in Iraq (and Iran and North Korea) as the damn-the-facts justification to pursue war. (The neocon axis of Cheney-Rumsfeld-Rice-Wolfowitz had other misguided ideas about letting 1,000 flowers of democracy bloom in the Middle East so the oil could keep flowing, but W brought his own personal baggage to what was arguably one of the most ill-conceived examples of imperial adventurism since Vietnam.)

Such was Bush’s drumbeat about Iraq’s chemical, biological and “nucular” weapons that there was nothing to do but invade and depose Saddam Hussein to prevent him from ever using those nonexistent weapons. This unholy triad of WMD—chemical, biological, nuclear—was the casus belli for a war about other things, but like many bad ideas, it has taken root in the debates of power circles as a magical trigger for throwing out all the rulebooks.

This is despite the fact that most WMDs in the world aren’t capable of mass destruction at all. It is very difficult to kill large numbers of people with chemical or biological weapons. Both work best in tightly enclosed spaces packed with a large number of people. A stiff breeze or rain shower can severely diminish the effects of gases or biological agents, the explosives in the delivery vehicles can destroy the agents themselves, and unfavorable environmental conditions can neutralize the compounds. Using them in anything other than ideal conditions is extremely difficult for anyone without advanced military, chemical and pharmaceutical infrastructure—a national government can do it; a terrorist cell hiding in a cave likely cannot.

In general, hitting a crowded residential neighborhood with nerve gas shells is probably one of the more effective uses of a chemical weapon, and that appears to be what happened in Damascus. (So is pumping highly concentrated sleeping gas into a crowded theater.)

But in the grand scheme of things, not that many people were killed (the U.S. estimates 1,429 dead). Approximately 100,000 people have died so far from conventional weapons in the Syrian conflict. A well-placed bomb has the potential to do far greater damage than chemical nerve agents, and guns and bombs are responsible for the vast majority of deaths in any conflict.

If anything, biological weapons are even trickier to deploy effectively, since they depend on contagion vectors and lack of immunity and/or treatment options to do their dirty work. (Nuclear weapons are another matter entirely, but Syria is not a nuclear power, nor was Iraq, nor is any other Middle Eastern nation except Israel, which continues to deny the existence of its arsenal despite reports to the contrary.)

Don’t get me wrong: chemical weapons are nasty. People exposed to them suffer painful deaths, or if they survive, may be stuck with lifelong debilitating injuries. They should be outlawed. But guns and bombs have the same effects, and we’re generally not talking about disarming the world’s armies or attacking other nations that shoot their own people—this happens all the time, and only once in a great while do we get involved the way we did in the Serbia-Kosovo conflict, to shut down a war with a war of our own.

But what is the trigger to fight a so-called “just war”? The argument for attacking Assad is almost entirely made in moral terms: he was bad, and therefore must be punished. Syria isn’t a big oil producer, so any economic pull is probably coming from weapons manufacturers who see a potential business opportunity in an expanded conflict (jet fuel and Tomahawk missiles don’t come cheap). The geopolitical impetus is almost entirely driven by the U.S. (with support from Israeli Prime Minister Benjamin Netanyahu, who has been wanting us to attack Iran or Syria so Israel won’t have to). The U.K., once bitten, has elected not to support military action at this time. France, once ridiculed by the Bushies as “cheese-eating surrender monkeys,” supports an attack (probably influenced by its historical affinity for and ties with Lebanon). Russia, of course, is denying that any chemical attack took place, and will probably continue to do so even after the UN inspection team confirms it. So the U.S. acting without the imprimatur of a United Nations mandate or the approval of the Security Council seems a bit too much like Iraq in 2003.

Be that as it may, it looks like the Spanking Option is the way forward, despite the fact that its purpose has more to do with the U.S. puffing up its chest in the face of a newly resurgent Russia than with solving any on-the-ground problems in Syria. Indeed, the humanitarian crisis in Syria is likely only going to get worse, and Bashar al-Assad is unlikely to change his behavior.

As the writer Robert A. Heinlein once wrote, “Never appeal to a man’s better nature. He might not have one. Invoking his self-interest gives you more leverage.” In this case, Heinlein was prescient. Assad is clearly a monster, but his only interest is remaining in power, and he’s demonstrated he’s willing to do just about anything to do so. Would he be willing to stop his war against his own people if it meant he could stay in power? That could be seen, from Assad’s perspective, as a “mission accomplished” outcome. There is precedent: President George H.W. Bush compared Saddam Hussein to Hitler to build support for Operation Desert Shield/Storm, but after Kuwait was liberated, he left Saddam in power, which contributed to Bush’s election defeat in 1992. Would the West be willing to see a similar outcome in Syria, or have we already decided that Bashar al-Assad is the problem that must be eliminated, rather than the Syrian Civil War itself? Those are two distinct problems, and their solutions might not be identical.

The Next Big Thing

This morning I had the realization that sometime in the mid-1990s I fell unknowingly into a trap of sorts, a psychological condition wherein I would go on a more-or-less lifelong quest for The Next Big Thing. At the time, I was living in Boston, fresh out of graduate school with one of those oh-so-useful MFA degrees that was supposed to be your entry ticket into the literati but was somewhat lacking in the groceries-providing department. I had found temp work, then full-time entry-level work at a company that anticipated the dysfunction on display in “Dilbert” and “The Office.” It was a natural response, once I had secured that job, to want to jettison it for something better, as soon as possible. The Next Big Thing.

That did happen a year or so later, at another company with a slightly smaller level of dysfunction (what’s the quote? “Functional companies are all alike; every dysfunctional company is dysfunctional in its own way”), and before too long I was in a similar situation, looking for the escape hatch. By that time, however, I had already hatched my plan to find The Next Big Thing.

This focus on what was just around the corner, where the grass was presumably greener, wasn’t limited to employment. There was a time when a former girlfriend and I mused about moving to Pittsburgh primarily because it scored high on those best-places-to-live lists that now come out every couple of months, but back then were somewhat of a novelty. (We didn’t, and it didn’t, although I’m sure Pittsburgh is still a perfectly fine place to live now that the steel industry is both cleaner and smaller. Another city that also ranked high on those lists was Seattle, which truly was about to become The Next Big Thing thanks to a company called Microsoft and a band called Nirvana. Naturally, I arrived in the Pacific Northwest when most of those glory days were past.) Back in high school I’d fallen in love with computer programming, and was considering a career in that Next Big Thing. In that case, my instincts were right, but my math skills were, sad to say, not equal to my level of enthusiasm for the subject. Nor were my acting skills equal to my love of theater, my filmmaking skills to my love of movies, and so on (although on that latter note, I maintain I could have been a pretty good filmmaker from the standpoint of being able to construct a coherent narrative, provided I could obtain the right kind of technical education I wasn’t getting, and didn’t need to wade into the shark tank that is Hollywood to try and play that particular game).

The plan I was hatching in the 1990s, however, was to get out of the country. I’d gone on a rather formative writers’ retreat in the Netherlands, and more than anything else (I didn’t get much writing done there), it opened my eyes to the fact that the world was much larger and more interesting than I’d known, and furthermore was pretty easy to get out into and explore. I did my research, deciding where I wanted to go (Eastern Europe was rather vibrant at that time), how I would do it (teaching English), and what vehicle would get me there (the Peace Corps; since I had no money or means, it offered the best “benefits package” of all the potential gigs: a two-year commitment, paid airfare and health care, and three months of intense in-country training. It was ideal, and the relatively high standards of its application process were no deterrent).

It worked in the end. My application took 18 months (covering my employment periods at both dysfunctional companies), having been delayed by a sudden outbreak of plantar warts, but by mid-1995 I was on the plane to Hungary with 51 fellow Americans, mostly young, mostly idealistic, whose own reasons for joining might have been similar to mine. Or not. I don’t know.

I was not teacher material, at least not high school English teacher material. I tried to do a good job, and I think in some cases I did, but I didn’t have the patience and finely honed diplomacy skills necessary to deal with the students who weren’t interested in being there. The next plan was journalism. Always a news junkie, I came to rely, while living in a small Hungarian village, on whatever media I could consume from the outside world: month-old issues of Newsweek from the Peace Corps office (useful for the pictures I’d cut out for lesson planning), copies of the International Herald Tribune, The Economist and a little rag called Budapest Week when I could get up to the capital on the weekend, and a late-night mix of Hungarian TV news (I could mostly understand the weather report) and tabloid-style German TV.

Post-Peace Corps, the goal was to move to Budapest to become an International Journalist. Which kinda-sorta happened at the aforementioned rag, but never to the extent that I’d envisioned because at the time, with the Bosnian Civil War wrapped up a year prior, the Kosovo War a few years off still, and the wider world’s interest turning to places other than a tiny European country with a fragile democracy, the local market for International Journalists was looking rather thin. There were journalists working for the big wire services in Budapest, and I knew all those guys. They weren’t going anywhere soon. So if I wanted to continue in this vein, I figured I’d have to return home. Which, after a couple years of doing freelance pieces for Budapest Week and teaching something called “business English” at a private language school, and after I married my Hungarian girlfriend and helped secure the necessary immigration paperwork, I did.

Seattle was never the goal, only a means to an end. In 1999, there were three daily papers here, plus two alt-weeklies, a whole nest of community papers (at least one for every neighborhood in a very neighborhood-centric city), and the city seemed to have some traction in this new thing called the Internet. Never mind that web pages were primitive, and that only one of the three dailies had a functional website to speak of (and its “news,” I realized upon arriving in Seattle, was not reflective of day-to-day life in the city, but that’s another story…). I got a job at the smallest of the dailies, and then, for the next seven years, had what was probably the best low-paying job working for morons one could ever have.

Which isn’t to say my immediate supervisors were bad. Most were quite good, smart, funny, dedicated people with drive and ambition to comprehensively chronicle the day-to-day life of our coverage area. The morons were higher up the chain, Peter Principles rendered in real life, whose management skills were evidenced by a steadily decreasing circulation, a buzzword-laden afterthought of an online strategy, and an ignominious end with the paper sold off and shut down. But again, another story.

Being a newspaper reporter is probably the best job someone of my disposition could have. A former editor of mine there once remarked that it’s a job that skews toward people who are poor at planning. As someone who came in from the cold, so to speak, with no direct newsgathering experience but a lot of drive, there was a Next Big Thing waiting for me in the office every day. One day I was greeted by a car chase through a local park, one day a convoluted dispute over soccer fields, one day a couple of planes hitting buildings on the other side of the country. That last event still rattles me when I think about the moment I walked into the newsroom after the drive in (listening to an all-music station, having broken my usual morning habit of surfing news websites in favor of writing a song about a dream I’d had the night before) and was told by an editor that we were under attack.

Since those years, I’ve never experienced anything like the rush of adrenaline that seems to materialize from the thin air of a crowded newsroom when suddenly everything turns and heads in a completely different direction, a pack of hounds hot on the hunt for the fox we’d just caught scent of. My newspaper career ended not because of any one thing that happened, but rather because of a series of events and decisions that made it abundantly clear that it was, once again, time to find the escape pods. I moved over into magazine writing about four months before that paper folded, but since then I’ve gone on to do more freelance work, and most recently a contract gig at a large technology enterprise that came to its end a couple of weeks ago.

What’s the Next Big Thing? It’s possible that I’ve already been on it. I’ve been working on a novel for about the last seven years (I may have started it after I left the newspaper gig, maybe before, I no longer remember), and I’m not going to say anything more about it here (maybe later), but being suddenly given a lot more free time (plus what most people refer to as unemployment insurance but what I like to think of as the real National Endowment for the Arts), I at least have this project to fall back on until something comes along that will once again make me seem like a productive member of larger society. I’ve tried novel writing twice before, once in grad school and once while living in a small Hungarian village with a lot of time on my hands, and in both cases the projects were abandoned when it became apparent that they suffered from a lack of structure, character development or even an interesting plot. This time I hope it’s different, and as I’ve already completed one draft and have a plan for the next, I’d like to think so.

In the meantime, while I’ve been out chasing the Next Big Thing, my family, friends and colleagues have been doing theirs. But I think most of them would simply refer to it as Life: pursuing careers (or just gainful employment; it seems we’ve become a society that insists your job be worthy of an autobiography, whereas it should be fine to just work for the paycheck), getting married, having children (a Very Big Thing for most people), growing older, dying.

I picked up on this obsession of mine with Next Big Things this morning, I believe, because some news of national import happened yesterday which, on the face of it, doesn’t affect me in any real sense other than to make me depressed about the future of this country and to trigger an old instinctive response: is it time to head for the lifeboats? I think my answer this time is no. It may be that I’ve matured (yeah, right), or simply that I’ve got enough of a Next Big Thing to keep me busy in the form of this current project. When I was in my twenties it seemed a perfectly reasonable thing, when things in the home country weren’t going so well, to pack up and head off to foreign parts in pursuit of whatever utopia or distraction or Next Big Thing lay in waiting. To that extent, the European/Australian practice of a gap year seems a good way for the young ones to let off some of that steam, and their societies’ tolerance of such youthful perambulation (the modern-day “Grand Tour” of dive bars, beach parties and raves across Europe) may work out to everyone’s longer-term benefit.

But as we like to say, there comes a time to settle down. Or, at least, to approximate the appearance of settling down while quietly pursuing the Next Big Thing.