I Don’t Read That Stuff

What follows is a completely personal, wholly biased view.

On average, I read between four and seven books a month.  That’s cover to cover.  From time to time I have a month wherein I manage ten to twelve, but that’s getting rarer.  Between four and seven is what it comes out to and at that rate I’m reading fifty to eighty books a year.

I put that out there so that what follows may make more sense than the usual kind of argument about taste in reading matter and why I don’t wish to waste time on certain things.

I’m sixty, which means I have maybe fifteen, maybe thirty years left to do the things I want to do, and I’m getting picky about what fills my time.  Too much trivial nonsense fills it anyway, simply because that’s the way life is.  And it’s hard to break habits formed when you were much younger and time felt plentiful.  I’m not being morbid, just practical.  Trivia has a function.  Upon trivia, friendships cement, the culture binds itself together, and the time between important things is bridged with something that at least keeps us engaged.

Anyway, given all this, plus the very important fact that I’ve been reading books (without pictures) since I was eight years old (at least—and I’m not, believe me, being critical of comics, I actively read them till I was 18 or 19 and still consider them worthwhile) and my tastes have…

I was going to say, “my tastes have changed,” and that’s certainly true, but it’s just as, if not more, accurate to say my tastes have evolved.  I still like the same kinds of things I did way back, but not in the same form.  When I was 12, the Lensmen were the ne plus ultra of fiction, the absolute coolest of the cool.  Today?  Not so much.  But I still love space opera as a form.  Only now I want a better example, language pitched to my level or higher, and maybe some subtext and a story that isn’t just about finding the next biggest weapon to defeat the slimy aliens but maybe tells me something interesting about human nature under unique conditions, which means characters that inhabit more than two dimensions.  While I can’t get through Triplanetary anymore, I can still read Delany’s Nova or Iain Banks’ Culture novels with pleasure.  Why?

No single reason, but a constellation of them resulting in what we start to recognize as serious literature.  The fact is, I pick up Embassytown by China Miéville and I have an experience which simply cannot be duplicated by—

Ah, there’s a problem.  We verge now on comparisons.  And that gets tricky, because I don’t wish to denigrate anyone’s work.  I have always tried to address the different pleasures of fiction, top to bottom, without resorting to saying So-and-So is great while Such-and-Such is crap.  For one thing, a lot of people may very much like Such-and-Such and by calling it crap I am by definition saying they have bad taste.  It becomes elitist in a particularly counterproductive way.

For another, this hasn’t much to do with what one likes.  That word covers a lot of territory and means many vague things having to do with pleasure. No one has cause to say anyone’s pleasure is somehow less important because of its position on some presumed scale of relative value.

But that’s not the same as claiming all experiences are of equal merit.  All books are not the same and yes, some are better than others.  “Like” has little to do with those assessments, though.

A well-prepared filet mignon is obviously “better” than a hamburger from a fast food chain.  The fact that McDonald’s has sold billions upon billions is not in any way an assessment that their burgers are better than the steak you’ll be served at a five-star restaurant.  Popularity is not an endorsement of quality.

A rough comparison at best, but I make it to establish the idea that while you may eat more burgers than filet mignons garnished with champignon mushrooms, you know the difference and you also know it’s a question of experience that allows us to recognize the distinction and understand it.  You’re going to have a deeper culinary experience with the latter.

Unless you have no taste at all and can’t tell the difference.  That’s certainly possible and, by the evidence of certain groups, would certainly seem to be the case.

Enough with the culinary analogy, let me get back to choice of reading material.  What I do not read any longer has to do with opting for the deeper experience.  I read slowly, relative to some, and I don’t have time to tear through mediocre books.  I have developed to the point where Doc Smith just doesn’t do it for me any longer.  I mean, the ideas are great, the seeds of later pleasures, but the execution is pitched to the bright 12-year-old and eschews any kind of nuance.  For one thing, you begin to notice eventually, if you read with any kind of acuity, that while we’re all in the far future, everyone acts and talks as if they lived in 1932 Brooklyn.  Even the aliens.  If it were satire, I could understand, but it’s not, it’s meant to be taken seriously.  And, really, it’s the far future and everyone (even the aliens) is so white.

I want something special.  I want my synapses engaged as fully as possible.  If I have to do a little work to understand the full substance of a sentence, great, especially if the work pays off exponentially.

Which has resulted in a long, gradual drift away from the slam-bang of what once represented the bulk of my reading choice toward material that causes me to react in ways I find much more satisfying.

Some books require more from a reader.  The reader has to rise to the level of the book.  Experience teaches us to recognize these books when we find them—and sort them out from those that may be abstruse just for the sake of hiding their lack of anything to say—and experience also gives us the desire to have those experiences.  Which, perhaps unfortunately, leaves us unsatisfied with less nutritious fare, fare which once filled our requirements.

This is akin to growing up.  You just don’t find the things that fulfilled you as a kid to be all that wonderful as an adult.

Assuming we’ve grown up.

Okay, I am here at the point where I either have to give a concrete example or leave this whole thing a vague, kind of hand-wavy bit of stuff with the message so buried as to be useful only to those of an archaeological bent.  Ordinarily, I would not feel I need to do so, but given events and circumstances in my genre of first love—science fiction—maybe I should just bite the bullet and go for it.  After all, names have already been named and assertions made and being polite to the point of swallowing meaning serves no useful purpose.

I will not, however, name names.  If I do, it will be those who are long dead and whose day is past. Unlike some who have dragged the discourse that is science fiction to the level of a political convention floor fight, I will not point at specific works currently in play in order to say “Here be crap” and make the bones of my argument on the unnecessarily scoured sensibilities of people who toil earnestly at their craft.

Earnestness does not inoculate anyone against doing mediocre work.  Nor does it guarantee exemplary work.

What do I mean by mediocre writing?

Writing that exhausts itself by one reading, fails to fulfill the potential of its ideas, and/or rests upon cliché to make the page turn.

Good writing, by contrast, allows for multiple readings from which deeper meaning and new interpretations can be derived.

Sometimes you can see the difference sentence by sentence.  Often scene by scene, chapter by chapter.

When you’re 12 years old, the better writing may leave you baffled, because it is not always straightforward and single-pointed.  So no one should feel bad for not getting Joseph Conrad at that age.

I pick Conrad because he wrote rousing adventures.  But they are so much more than that, and it’s that so much more that has seen his works continue to be published, read, and appreciated by large audiences.  Once you’ve internalized the sea-going thrills and fighting in something like Lord Jim, you find that, upon further or more careful reading, there is so much more.  And that so much more actually calls into question the heroic æsthetic of the surface read and causes—presumably—a deep reflection on the meaning of heroism—and cowardice—and the mythic templates we accept that define them.

Boring?

Well, if you’re bored by such contemplations, then stick with action-adventure.

If beautiful sentences bore you or you are blind to them, stick to simple plot-thickening prose.

If you are not emotionally moved in ways other than by pure adrenalization, then stick to the slam-bang thrills and avoid anything that talks about the soul in multiple ways.

(If you can read Dante’s Inferno and believe that it has anything to do with the afterlife, then stick to work more facile and less steeped in metaphor.  You will do less disservice to Dante and perhaps yourself.)

In short, if a sentence like “It was strange that even sex, the source of so much solace, delight, and joy for so many years, could overnight become an unknown territory where he must tread carefully and know his ignorance; yet it was so*” reads like gibberish, seems pointless, or causes the kind of reaction that refuses to allow for the possibility that more is going on here than simply your inability to decode meaning and apprehend the layers involved, then you may have reached your limit with sentences like “What he really needed was a session with a pleasure unit in order to clear his mind for the ordeal ahead.”  If you can’t understand why the former sentence is a richer text, revelatory of character in ways that the latter sentence not only fails to be but in some ways actively resists being, then—

But I border now on insult.  In light of the current kerfuffle going on in the field, though, it’s hard not to.  The essential nature of science fiction is being challenged, all in the name of what appears to be a petty rejection of message.  As if science fiction has not always been message fiction.

In terms of plot and idea, if calling into question the basic assumptions underlying civilization, culture, and the very lives we lead promises to be an impenetrable drudge, then I have to wonder why you claim to like science fiction at all.  Because that’s what it’s all about, dislodging the reader from cozy assumptions of self-justified rightness.  And no, stories wherein humanity must wage war against an alien race in order to preserve an identity which goes largely unquestioned do not represent the chief benefit of the form.  The physiognomy and bloodlust of the aliens is exciting for only a brief time if there is nothing more to the story.

Yes, I’m indulging a bit of elitism here.  I have nothing against well-done action stories.  I read one recently that offered, or at least promised, a nice twist on the formula, but then failed to deliver and turned into a pat good-guys-cleverly-defeating-alien-menace-with-cool-explosions story.  I enjoyed the ride but will never read that book again.

And that’s okay.  It was even well-written in terms of character, exposition, pacing.  It’s not a question of condemning things just because they aren’t Dostoevsky.

But using the author of Crime and Punishment as an example, to argue that a work deserves an award precisely because it isn’t Dostoevsky is a seriously flawed idea, especially when the award in question is supposedly for the Best of Field.  We give awards to the Dostoevskys in order to set bars and celebrate potential, not to congratulate ourselves for reveling in mediocrity.  To insist that the better work is undeserving because it does things differently from the usual is a statement of adolescent resentment.

However, that doesn’t seem to be the source of the spleen.  It’s not that SF is message fiction, but the message being conveyed that has sparked all this contention—and who is delivering it.

It can be asserted that overall, since 1926, a good deal of SF, especially of the planetary romance and interstellar adventure variety, has been, in subtext if not overtly, imperialist.  Brave Earthmen venturing forth to conquer and pacify an alien and maleficent universe.  By default if nothing else, most of those high-principled adventurers have been white males.  That aspect wasn’t the main point of the choices made, just the default assumption based on current standards of perceived merit.  John W. Campbell, Jr. was an unapologetic champion of this ethic, so much so that he eventually annoyed many of his best writers with his chauvinism.  Nevertheless, the model stuck, because it allowed for the continual generation of really cool stories.

Came a time, though, when we finally became a bit more introspective and realized how parochial much of it was, how chauvinistic, and, yes, how racist much of it was.  So, like any healthy art form, stories began appearing that questioned these assumptions.

And the questioning resulted in a lot of really cool stories.

It may be that some folks like the old ideas and forms so much, and are so weary of continual reassessment and moral reevaluation, yearning nostalgically for days of unquestioned heroic virtue, that the current noise in opposition to what has been derogatorily labeled the work of Social Justice Warriors is just their way of stamping their feet and demanding unequivocal action adventure of the so-called Golden Age variety.  From some of the sales numbers I’ve seen, there is a healthy market for such stories.

Speaking personally, though, please don’t try to tell me work built guilelessly on discredited values and outgrown sentiment is award worthy—and by that I mean exemplary of the best, indicative of the future, and representative of the limits of possibility.

I end this now because I am coming perilously close to venting spleen and getting personal.  I’ve watched this affair over the Hugo Awards with dismay and bewilderment.  There has been too much doubling down on false pretense and too much empty fury and not enough genuine debate over what is actually at issue in terms of the work.  Those who have brought this to fruition have placed many of their colleagues in unfortunate situations for no good purpose, or at least for purposes poorly stated if not seriously misrepresented.  If, by their lights, the “wrong” work wins a rocket this year and they continue to erupt in fury over a perceived injustice in the direction of the field, then in my opinion they seriously misunderstand the nature of the endeavor of which they claim a share.

In any event, I have looked at the work at issue and, from a purely personal vantage, like so much else in what is too little time to spare, I have to say, I just don’t read that kind of stuff anymore.

 

_____________________________________________________________________

*Okay, I’ll name one name.  That sentence is from The Dispossessed by Ursula K. Le Guin, whose works, I suspect, were they being written and published today, would bring down the disdain of those who have made loud and gaseous cause over “social justice” fiction.

Sandblasting History

A call has gone out to eradicate the carvings on the face of Stone Mountain in Georgia.  The work depicts Robert E. Lee, Stonewall Jackson, and Jefferson Davis, presumptive heroes of the Confederacy.  In the wake of movements to remove Confederate iconography from government buildings, parks, and other, especially federal, properties, this would seem to be another symbol of the so-called Lost Cause in need of removal.  Sentiment is running high on both sides of the argument and a quick read of the issues would suggest that, yes, this ought to be removed.  It’s in a public park, supported by tax dollars, and represents three personages one could easily label traitors to the United States.  As far as it goes, I have no quibble with the labels.

The carving is another matter.  On Facebook I recently opined that this is like the Taliban blowing up the Buddhas in Afghanistan.  An extreme comparison, perhaps, but the more I think about it the more I’ll stand by it.  In a few centuries or more, when all this is part of some dusty chapter in history books with little left to stir the blood, it may well appear more like the usual eradication of the loser’s history by the victors.  A history people then might well be annoyed at not having it to hand.  It will by then just be an interesting carving.  The politics will likely have faded into quaintness (we can only hope) and the judgment will be that temper trumped reason and a work of art was destroyed to appease the passions of the moment.

I doubt that argument would have any traction with either side just now—those wanting it effaced, who see it as emblematic of current (and past) injustice, and those wanting it preserved, who feel their heritage is being tossed aside with no regard for their feelings.  My suggestion that it be preserved for a later time, when it has lost all immediate meaning, may seem facile and will probably give offense on both sides—to those who see my position as a negation of their outrage and to those who see me demoting its symbolism to mere novelty over time.

But what about all those other emblems being removed?  What about that?  Well, what about that?  They’re being removed, not destroyed.  Those who appreciate them will not have lost them, but they in fact have no place as part of the representative symbols of our country.  The Confederacy was a rebellion against elected authority; it lost, and is now gone.  Heritage is a personal thing, but it certainly has a public function.  However, public heritage is a matter of democratic symbolism, not the maintenance of symbols of a presumed right subsequently proven nonexistent.  A government building may (and does) have as part of its function to represent a national mythology (and when I use that word I intend no denigration, but rather a definition that what is being represented is a distillation of feeling, commitment, and identity that transcends mere event, indeed which exists usually in spite of event) relevant to us all as a commonwealth.  However earnestly it may be construed, the Confederacy represents nothing we are required to preserve in any positive iconography.  Its existence was a perversion of the core beliefs informing the Union as codified in the Declaration of Independence and the Constitution.  It could well be said that the Civil War represented the final referendum on the principles espoused in the Constitution, and those principles won.  This is the reality with which we live today.

I digress, but with purpose. The symbols of the Confederacy are being removed from government property, finally relegated to places and in the keeping of hands with no official function in the representation of the United States.  Removed.  Not destroyed.

I think that is a very important distinction.

Destroying monuments is, in my opinion, like burning books.  Even something as vile as Mein Kampf I would refrain from destroying.  Destruction like that—the purposeful attempt to eradicate a symbol of history—invites a peculiar kind of martyrdom.  It makes the symbol into something it did not start out to be and gives it new life and meaning.  It becomes a different, though kindred, cause célèbre, and then you have to figure out how to fight that new fire.

Kemal Atatürk wanted his country to be secular, out of the hands of the imams.  He knew better than to destroy the mosques, because then he would have created a monster he could never kill.  Instead he secularized the state, most famously turning the Hagia Sophia into a museum.  Nothing was destroyed, but the old symbols lost their power to fuel rebellion.  When the Soviet Union fell, all the statues and monuments were taken down.  A few may have been destroyed, but officially they were all simply removed and placed in a kind of graveyard where they have become the ghosts of a discredited era.  Not symbols of a lost cause waiting to be rallied around.

It would be best if the Stone Mountain carving could be removed.  Hard to move a mountain, though, so it becomes a thorny logistical problem.  Maybe the state could auction it off to a private owner.  But I would rather it remain to outlive its putative symbolism than be sandblasted and thereby become, Phoenix-like, a symbol for a renewed set of tensions.

When the Taliban dynamited those Buddhas, the world was shocked.  Attempts had been made to dissuade them.  The Buddhas had for most of the world long since ceased being religious icons and were just seen as art.  It was senseless to destroy them, especially out of the anger of a shortsighted ideology that will likely fade into oblivion in time.  By the time the Taliban have become a footnote in a history text for all their other crimes, the destruction of those Buddhas may continue to represent everything about them.  We rightly decry the loss of so much art at the hands of missionaries burning their way through Central and South America.  The brilliance of church art from the Middle Ages has few examples remaining in place because of the temper of the iconoclasts of the Reformation and the Cluniac movement.  All these people thought they were fighting evil and by their lights were right to eradicate these symbols.  They did cause themselves more problems by so acting, sometimes in the short run, often enough in the long run.

To be clear, I have zero sympathy with the romanticism of the Confederacy and the dewy-eyed revisionism of the antebellum South.  It is accurate to say the seceding states committed treason.  I will take my lead from Lincoln, though, who did not and probably would not have gone there.  Hard as it was, he saw them as misguided, strenuously arguing a case that had no merit but needed arguing.  The aspects of Reconstruction that exacerbated the animosities the War created probably would not have been part of his policy had he lived, but treating them, in toto, as traitorous states in need of occupation and “rehabilitation” created the subculture which today struts like a barnyard cock with nothing to do but crow and has become fodder for opportunistic politicians feeding on a poorly understood sense of victimhood based on borrowed wounds.  Rather than give them one more thing to be angry about, it would be better to simply ignore them until they become a forgotten irrelevance.  The pathetic attempt to assert the secession was all about “states’ rights” rather than slavery is so clearly an attempt to rewrite history—history which is right there in all the various declarations of secession, justification number one being the presumed right to keep their slaves—that it would be sad if it weren’t getting people hurt on the streets.

Make Stone Mountain into a teachable moment.  Put up a sign right there that says “These Three Men Acted Stupidly In Support of an Immoral Cause” and talk about it.  And talk about the people who can’t see the truth in that claim, the people who erected a monument to stupidity.  That might serve our purpose much better than just erasing them.  Because we’ll do that and then many of us will assume the argument is over and then later be very surprised to find out that it was only the beginning of a new one based on the same old tired ignorant nonsense.

Finally, if we’re going to get all righteous about Stone Mountain, maybe we might consider that the original owners don’t think too much of Mount Rushmore.

Freedom and Those People Over There

It’s the Fourth of July.  The national birthday party.  On this day in 1776 the Declaration of Independence was adopted, and the Thirteen Colonies broke from Great Britain and began the process of forming a nation.  In the 239 years since, we as a people have engaged in an ongoing and often contentious, sometimes violent conversation about the one thing we like to say distinguishes us from every other people or nation or country on the globe:  Freedom.

Contentious because everyone means something different when they use that word.  We do not agree on a common definition.  This isn’t a deep, difficult-to-understand reality; we simply don’t.  Put any group of people together from different parts of the country and have them talk about what they mean by Freedom, and while certain common ideas bubble up, once you get into the details you find divisions, sometimes deep ones.

Clearly for most of the first century, as a nation, we had a pretty limited notion of what it meant.  It meant freedom for a certain few to do what they wanted at the expense of others.

So Native Americans didn’t have it, nor did slaves, nor, for the most part, did women.  Even a white skin on a male body didn’t guarantee one equal consideration, because money and property were important, and, to a lesser extent, natural born versus immigrant, language, and religion.  We, like any bunch of people anywhere, fell into groups and competed with each other over privilege, and those who came out on top extolled the virtues of freedom while doing what they could, consciously or not, to limit it for others who might impose limits on their success.

This is not controversial.  This is history.  We’re human, we can be jerks like anyone else.  What makes it awkward for us is this widely-held belief that we are unfettered supporters of Freedom.

In the simplest terms, we claim to be free when we feel no constraints on preferred action.  So if you’re going along doing what you like to do and no one tells you that you can’t, you feel free.  If, to complicate things a bit, someone passes a law that says Those People Over There may not do something you have no interest in, well, you don’t feel any less free and may wonder why they’re complaining about being oppressed.  After all, you’re free, you don’t have any complaints, and that makes this a free country, so stop bitching.

Naturally, if someone passes a law that says you can’t do something you either want to do or makes claims on your resources in order to support such rules, now you feel a bit less free, imposed upon, and maybe complain yourself.  Of course, Those Other Folks Over There are quite happy about the new law and themselves feel freer as a result, so they look at you now as the sore thumb sticking up.

But it still involves questions of constraint, which is what the law is about, and we agree in principle that we need laws.

If we need laws to restrain us—to tell us what we can and cannot do—doesn’t that immediately raise the question of what it means to be free?  I mean, the libertarian line would be that I’m a grown-assed adult and I can control my own life, thank you very much, you can keep your laws.

What if your desire for unconstrained action puts a burden on other people?

What if, to make a big but logical leap, your sense of freedom requires that others have less than you or, to put it back at the beginning, that some people be ownable? You know: slaves.

That the Founders built it into the framework that slavery could not only exist within the borders of this new “land of the free” but that it was illegal to discuss the issue in Congress for twenty years might cause us to ponder just what they meant by Freedom.

And it did take over a century before the laws began to change concerning women and property.  Was a time a wife was legally owned by her husband—her person, her body, and all her associated belongings—and she could be thrown out with nothing but the clothes on her back if the marriage went sour.  That doesn’t even take into account that it wasn’t till 1920 that women could legally vote nationwide.

How does this fit with our self-congratulatory view as the freest nation on Earth?

Well, we say, that was then.  This is today and we’re not like that.

Aren’t we?  Then why are we still arguing—loudly—over questions of equality, and in several areas of concern?

I put these out there to leaven the uncritical jubilation over what really is a worthy aspect of this country.

What the Founders implicitly recognized was the multifaceted and often conflicting perceptions people will inevitably bring to this question.  They may well have held some overarching, abstract view as to what Freedom meant, but they knew such could not secure the kind of stability necessary for a viable nation.  Absolute freedom would destroy us just as surely as absolute tyranny.  So they set up a framework in which we as a people would continually argue about it, and by extension demonstrated that it was this freedom to hash it out that they saw as the most relevant, the most viable, and in the end the only practicable way of securing individual liberty over time.  They built into it the nearly sacred idea that we can say and think what we please, and set up fora wherein we could express ourselves without authoritarian retribution.

That was the idea, at least.  Like everything else they put in place, it hasn’t always played out that way.  McCarthy wasn’t the first one to send a chill through the republic to make people afraid of ideas.

We are, however, free to argue.  Sometimes we have to bring ridiculous force to the table to make an argument, but at the individual level we can go to our various barbecues this weekend and have it out on any topic without fear that some censorious official will show up at our door next week to take us to a room to be questioned about our beliefs.  There have been times when even this was not a guaranteed freedom, but overall this is what the Founders decided on as the most efficacious form of freedom to protect.  They arranged things so the suppression of the freedom to have an opinion could end up fueling a political movement and take the argument into the public arena where it can be further debated.

But this also means we have to learn to privilege the freedom of expression and thought over any other.

And it’s hard. It is damn hard.

Follow the comment threads of any heated or controversial post anywhere—the equivalent today to Letters to the Editor in other periods—and you can see that many people just don’t get that.  It frightens them.  Why?  Because it’s fluid.  Because it means things change.  Because it calls into question what they thought were absolutes.  Because they grew up thinking their country was one thing, unchanging, ordained by divine testimony, and their sense of freedom is based on holding to those absolutes and defending them from those who would see things differently.  Flux, change, revolution.

They came to believe that all the work was already done and everything would be fine except for Those People Over There, those…those…malcontents.

Forgetting, of course, that the whole thing came from the minds and labor of malcontents.

We come away from our youthful education about 1776 with the belief that the war was the revolution, but this is not the case.  It was the war for the revolution, which is what came after.  The revolution was the process of setting up a new form of government and establishing a framework distinct from what had gone before.  1787 was the year of revolution.  The Constitution was approved by the delegates to the convention on September 17, 1787.  It then had to go before the individual states for ratification, which was not finished till May 1790, when the last state, Rhode Island, voted to accept it by a two-vote margin.  Those two and a half years were the actual revolution, because revolution brings us the new.  In a way, 1776 was little more than a decree to stop sending the rent to England and a statement that we were willing to fight over the right to have a revolution.  The war was not the revolution; it only allowed the revolution to happen.

And what was that sea change in the affairs of people?  That the people would choose their leaders?  Not an especially new idea—kings had been elected before (in fact, the Thirty Years War began over just such an election)—but here it would be the way we would always choose our leaders.  The mechanism by which we made that choice, now, that was based on the revolution, which was folded into this rather imprecise notion of Self Determination. But it rests ultimately on the sacred right of each one of us to disagree.

It is by disagreement—loudly and publicly, but beginning privately and from conscience—that we move toward that other nebulous concept, “a more perfect union.”  Which itself is a strange phrase.  More perfect.  Perfection, by definition, does not come in degrees.  It either is or isn’t.  Usually.  Unless they, the Founders, were recognizing the fact that change is inevitable, especially if we’re going to sacralize the freedom to disagree.  In practical terms, your perfection, however conceived, is unlikely to be mine.  If so, then the formula is there to move us from one state of perfection to another equal but different state of perfection.

Which is unlikely and sloppy logic.  Most likely, they knew, as they should have, being good students of the Enlightenment, that perfection is unachievable but the idea of it serves as a spur to do better.  Perfectibility is the ongoing process of seeking perfection.  In the seeking we have to define it and in the definition comes the debating.  In the debating we find a method for—often convulsively—blocking the hegemony of factions, or at least tearing them down when they become onerous.

So in order to “form that more perfect union” we accept that it is always just over the next hill and we have to have a consensus about what it looks like and how to get there.  Which sets us to arguing, which is the best guarantor of liberty of conscience.

But we have to work at it.  Which means the revolution is not finished.  What they set in motion was something that would never be finished if we tended to it seriously and with reason and commitment.  So if anything, July 4 is the day we should celebrate as the point when we took steps to create the conditions for the revolution. The revolution followed the surrender of the British and the commencement of the work to create a nation.  That was—and is—the revolution.

As long as we can meet and differ and find accommodation despite our differences and allow for those differences to be manifest to the benefit of society, the revolution continues.  That it continues is the sure sign that we have freedom (and tells us the nature of that freedom).  Even when we don’t always use it or recognize it or allow it to define us.  Oh, we have work yet to do!  But we can do it if we choose.

Just some ruminations from a citizen.  Have a safe Fourth of July.

Work History, Wages, and Doing The Things

The other day I was talking with friends about that pesky subject, wages. Minimum wage is in the news, a big argument, and the politics are necessarily touchy.  Comparisons were made and my own situation caused a bit of raised eyebrows and “What’s up with that?” detours through personal histories.

According to some, among people who have known me a long time, I have always been seriously underpaid throughout my working life.

Before we get into that, though, I would like to reference this article, written by my boss, Jarek Steele, about the current anxiety-laden question of raising the minimum wage.  Go read this, then come back here.

First off, I would like to say that I work at a wonderful place.  Left Bank Books is now family.  As you can tell from the essay, they are thoughtful, concerned people with no small amount of brainpower, a good bead on life as it is, and a solid moral sense.  I’m lucky to work there.  I’ll come back to that later.

Now. Most of my adult life I have been relatively unconcerned about my wages.  I don’t know where I got this from, but I’ve always felt they were secondary to several more important factors.  Some of this is naïveté, but some of it is a result of early on making a choice between security and fulfillment. For many people, money serves as fulfillment, and for some it genuinely is.  They work to have.  I offer no judgment here, everyone is different, and it’s all a question of degree anyway, because we fall along a spectrum.

For myself, I’ve always worked to Be.

Perhaps a small difference to some, but a huge difference over time. I came out of the box, as it were, with intentions to be a certain kind of person, to do certain things, to make a crater in the world that looks a certain way, and if the pursuit of money got in the way of that, then I ignored the money.  Not consciously, because I always just assumed that somewhere along the way I would have it, mainly as a consequence of having done all the stuff that fulfilled my requirements of Being.

Now, if this all sounds a bit zen and possibly foolish, so be it. I’d be willing to bet many if not most of us have career-type dreams at some point that focus mainly on what we’re doing and not how much money we’re going to make doing it.  But this is America and identity is conflated with owning things, so it becomes very difficult to tease apart the doing from the reward.

Which brings me to my rather jagged career path, which saw me graduate high school intent on a career in photography, which I pursued as an art first and foremost and, in the end, only.  I never figured out how to make it pay.

So I worked for a major photofinishing chain, then a period as an in-house commercial photographer for a marginal advertising company, then as a delivery driver for a custom lab, and finally as the darkroom jockey of one of the best camera stores/black & white labs in town.  That last one lasted 20 years.

I never became the photographer I thought I’d be, at least not commercially.  I did all the things.  Portraits, landscape, art and abstract, architectural.  Occasionally I did them for clients, but mainly I did them because they were cool to do and they produced images I wanted to see.  I was Doing Photography and that was the important thing. I was fulfilled.

All the while I drew my wage from my job, which supported the art and all the other stuff.

Then I picked up the writing again.  Time passed, I learned my craft, started selling stories, and then that 20 year stint of a job ended with the close of the business. Two years later I applied to and got another lab job, at which I worked for 11 years, most of them rather unhappily.

(And here the concerns over money enter in the most annoying way, because money would have been the means by which I would have been able to just write instead of having to work at something I no longer loved in order to eat.)

The story sales never added up to enough for me to quit that job.

But I was getting published.  I was fulfilled, at least in the desire to Do The Thing.

Age does force one to confront certain realities.  Looking back, I realized that I had never pushed for more money.  I never once, in all the years of “working for a living,” asked for a raise.  Somewhere in the back of my head there floated the assumption that good work brought remuneration, so if the people I worked for chose not to give a raise, then it was due to my lack of good work.  I could maintain this attitude largely because, with one exception (that first job right out of high school), I have never worked for a large corporation.  Never.  I have spent my employed life working for small local businesses, the health of which I could see, right in front of me.  They all struggled.  I was part of that struggle, so adding a burden to them was not in my nature.  I never asked for a raise.

Instead, I lived a life that fit with my earnings.  One could do that at one time.  And I did get raises, so it’s not like I’m talking about trying to scrape by on minimum wage.  (Which was, btw, right around two dollars an hour when I graduated high school, and I worked for Fox Photo over a year before they granted me a ten cent an hour raise.)  But I never asked.  I was always grateful when they came, but I never asked.  The people for whom I worked were usually close enough to the ground to show appreciation when they could.  For a while I made a decent living.

Donna and I, however, had no children.  That one fact explains a great deal about how we could opt to work for who we chose (often) and live as we pleased without overly worrying about income.  We were careful.  When we bought a house, we paid it off early.  We carry no balances on our credit cards.  We owe no bank anything.

And we realize how unusual this makes us.

But it also points up the major disconnect many people suffer in their lives in terms of employment and compensation.  I never asked for raises because, by and large, I never had to.  Had we lived a more traditional lifestyle, money would have been the single greatest driver of all our choices.

However, my comment above about being underpaid…

Several years ago an opportunity opened for me to possibly take a job as an editor at a local magazine.  I wasn’t familiar with the task, but I’ve always been a quick learner, so I had no doubts about my ability to come up to speed, and I could offer myself for a bit less than others might.  I went over the requirements of the position with a friend who had been in this end of the industry.  She remarked at one point that the salary would probably be X, which was low, but in a couple of years I could probably come up to standard.  I laughed and told her I’d never made that much in a year in my life.

She was flabbergasted.  How, she wondered, could someone with my abilities have been so undercompensated?

Because it had never occurred to me for a long, long time that I had been.  I’d been Doing The Things, and wasn’t that what mattered?

No.  At least it’s not the only thing.  Money is the means by which we live the kind of lives we wish to.  I want “success”—monetary success—as a writer so that I can do that and nothing else.  But I’m not good at that kind of success. I’ve never been adept at parlaying skills and artistic ability into money.  Whatever it is that allows some people to be skilled at getting compensated, I’ve never been good at it.

And the owners of corporate America know that most people are like that.  They depend on it.  That is the main reason unions were so important: most people need someone who is good at understanding that game to struggle on their behalf.  But the fact remains, most people take what they can get and then worry about the shortfall.

Because we have consistently misunderstood the relationship between, in the classic terms, labor and management.  As the economy has changed, that misunderstanding is becoming critical, because we are collectively faced with the consequences of our failure to address it.

Business knows average people are neither interested in nor especially adept at Doing Business.  That alone gives business—and I’m talking business at the disembodied corporate level here—an advantage, and they take it.  They can shortchange employees because they know how, and their employees don’t know that they have any power or how to find the means to engage management to the workers’ advantage.  Had we kept abreast of the changes to labor’s benefit these past 30 years, when we shifted predominantly from a manufacturing economy to a service economy, then the present strained issue of raising minimum wages would not be so traumatic.  The problem of catching up is putting a strain on small to mid-level businesses that they should not have had to bear.  Because we’ve been underwriting cheap products and services for decades by a disproportionate-to-reality compensation formula that treats people like parts.  Read Jarek Steele’s breakdown above.  Numbers, folks, and realities.

Drastic measures become necessary only because of indolence in the system.  As long as the numbers of people receiving poor compensation for work that has become increasingly primary were low, the problem could be ignored.  It’s not even so much that so many are trying to make full livings on minimum wage but that all wages are commensurately constrained by the growing imbalance in consumer ability to pay for what we need and want.

Then there are people like me, who frankly have never known how to care about the money.  Or at least never felt the freedom to demand it, because we keep getting sidetracked by Doing The Things.

Because Taking Care of Business consumes the one thing that art demands—time.  I loved doing photography.  I hated running a business.  I love writing.  Paying attention to marketing and sales is frankly loathsome.  I wish sometimes (lately more than ever) that it were otherwise, that I had that ability to engage promotions and negotiations, but I am who I am and do it only because if I don’t then some day I won’t be able to do the art anymore.

Which, by completely unconscious intent, has caused me to work locally, for people I see everyday and can talk to as friends more than as employers.  I think this is a good business model, but because it is not primary in this country, because people who think very much differently set the parameters of what constitutes “business practice” for so much of the country, this is not the business model that trumps treating people like parts.

We’ve been arguing about this since the founding of the Republic, since the idea of the yeoman farmer and the independent artisan was turned into a romantic myth by the privileging of corporate giants.  The independents saw a massive culling early on, when it became harder and harder for the independent owner to function in the face of cheaper prices and savage competition that stripped people of their own labor by turning them into wage-slaves.  The argument went on and on, the battle raging for over a century and a half, until finally the Second World War and the Cold War combined to usher in the era of corporate hegemony that, while not eradicating the small business, managed to place the entire economy in thrall to the requirements of giants.*

Hyperbole?  Consider what happens when a large corporation closes a plant or leaves a market and dozens of smaller, local businesses—those that survived the initial arrival of that corporation, at least (mainly by learning to service it)—find their customers drying up because so many of them are unemployed.  Taxes dry up as well, so relief doesn’t stretch as far, and we no longer have an economy that will support a regrowth in a timely manner.  Towns have been abandoned due to this cycle.

Doom and gloom?  No, I think there’s enough latent ability and power in local, small business to still have a good chance not only of holding its own but of succeeding and altering the standard model.  Because there is still value in prizing Doing the Things over Making the Buck, and compensation can flow in those directions.  We’re looking at a crucial time where those kinds of choices are more important than they have been in a long time.

Which leaves me back at where I started, admitting to a kind of aphasia when it comes to this money thing and by and large, as inconvenient as it is, still not much interested in changing who I am in order to meet some mogul’s notion of success.  I work where I work and do what I do because I can decide that “career” is not a synonym for sheer acquisitiveness.

I am lucky, as I say, and do not in any way offer my life as an example of how to do this.  I might well have ended up in much worse places.  But it’s the people around me who have made the difference.  They all ought to be better off, but we’re all Doing The Things and making the world, at least around us, better off.  Meantime, I am grateful.  I can still Do The Things.

It would be good if more of us remembered or realized that that is why we work so hard.

____________________________________________________________________________

* Consider further the completely bass ackwards relationship between large corporations and local communities wherein the community is required by circumstance to bribe the corporation to set up shop—a bribe paid with tax money, which means the community starts off impoverishing itself for the “privilege” of hosting an entity that will then extract profits from that community to distribute among people who do not live there.  And when the latent wealth of that community has fallen sufficiently that the profits to the corporation are less than deemed desirable, they then close up shop and leave, the community having grown dependent to such a degree that, scaffolding removed, the local economy collapses, partially or completely.  What should be the case is that the corporation ought to pay the community for the privilege, and the relationship should be one where the community as host is a primary shareholder and gets compensated first.  Unworkable, someone in the back says?  Not so.  Alaska did this with the oil companies decades ago and every Alaskan since gets a stipend from Big Oil.  Or did till recently.

Passing of Giants

I cannot adequately tell you how I feel right now.  My insides are being roiled by a gigantic spoon.

Chris Squire, bass player, co-founder of in my estimate one of the greatest musical groups to ever grace a stage, has died.

A brief report of the particulars can be read here.

I have been listening to, following, collecting, and appreciating YES since I first heard them late one night on my first stereo, a track being played as representative of an “underappreciated” band.  That status did not last long.  A year or two later, they were a major force in what has been called Progressive Rock, a label with some degree of oxymoronicalness in that, not a decade before their advent, all rock was progressive.

Rather, it was transgressive and altered the landscape of popular music.  By the time YES came along, divisions, subdivisions, and turf wars of various arcane dimensions had become part and parcel of the scene, and there were those who found YES and others like them a transgression against some presumed “purity” of rock music that seemed to require simplistic chord progressions, banal lyrics, and subpar instrumental prowess.  As Tom Petty once said, “Well, it was never supposed to be good.”

Well, I and many of my friends and millions of others, across generations, thought that was bullshit, and embraced their deep musicality, classical influences, and superb craftsmanship. They were a revelation of what could be done with four instruments and a superior compositional approach and as far as I’m concerned, Punk, which began as an intentional repudiation of actual musical ability, was a desecration of the possibilities in the form.

Chris Squire and Jon Anderson met and created a group that has since become an institution, with many alumni, that challenged the tendency of rock to feed a lowest-common-denominator machine.  Nothing they did was common, expected, or dull.  Some of it failed, most of it elevated the form, and all of it filled my life with magic.

The ache felt by many at the loss of George Harrison is the ache I now feel at the loss of Chris Squire.  He was brilliant.

There may be more later, but for now, here is an old piece I wrote about YES.

Much To My Pleasant Surprise…

The Supreme Court, in a (predictably) five-to-four vote, has declared that people can get married.

Barriers to marriage based on the criteria that the involved participants must fit a predetermined template having to do with gender are no longer viable or, more importantly, legal.

No, I didn’t expect this.  I am delighted to be wrong.

This also means that we can perhaps start moving forward on a slew of other reforms that are long overdue.  I know there are people who are doubtless going apoplectic about this, predicting the end of all things, the demise of civilization, the collapse of our republic, yada yada yada.

As if any of that could be determined by what two people do to make a home together.

Well, I suppose it could, but letting more people participate in an already-established system which has been held up as the foundation of that very civilization?  It never made sense, but bigotry rarely does make sense.  This has always been about social control, stigmatizing certain groups for the purposes of preserving privilege and power, and dictating codes of conduct which, as we have learned, the loudest proponents don’t obey anyway.  At a minimum this takes away the ability of certain people to misrepresent themselves at other people’s expense.

So, two wins in one week.  The ACA still passes constitutional muster, much to the dismay of those who thought any attempt to provide publicly-subsidized health care would also bring about the End Times, and now gender is no longer a legal consideration in who gets to marry whom.

(And for all those who for some reason feel marriage is strictly about procreation—yes, you Mr. Santorum—well, no, that has never been either the sole purpose or even the primary reason, and maybe now we can start having a more rational dialogue about that issue.)

So, all in all, this would seem to be a pretty positive week for a whole lot of folks.

A Few Words About Unpleasant Realities

First off, I would like to say that I work with some amazing people.  I will address just how amazing they are in a different post.  The reason I mention it here is that this morning I attended a meeting wherein we all discussed an extremely delicate, profoundly important issue in order to establish a protocol for a specific event and it was one of the most trenchant and moving experiences in which I’ve been involved.

In mid-July, Harper Lee’s novel, Go Set A Watchman, will be released.  That I am working at a bookstore when this is happening is incredible.  That I am working at a bookstore with the commitment to social justice and awareness that Left Bank Books brings to the table is doubly so, and one of the reasons I feel privileged is the discussion we engaged this morning.

It concerned a particular word and its use, both in Harper Lee’s novel To Kill A Mockingbird and in the larger community of which we are all a part. Necessarily, it was about racism.

I’ve written about my experiences with racism previously. One of the startling and dismaying aspects of the present is the resurgence of arguments which some may believe were engaged decades ago and settled but which we can now see have simply gone subterranean.  At least for many people.  For others, obviously, nothing has gone underground; their daily lives are exercises in rehashing the same old debates over and over again.  Lately it has been all over the news, and it feels like Freedom Summer all over again, when the images of what actually went on in so many communities, events that had gone on out of sight until television news crews went to Alabama and Mississippi and Georgia, ended up in everyone’s living rooms often enough to prick the conscience of the majority culture and cause Something To Be Done.

What was done was tremendous.  That an old Southerner like Lyndon Johnson would be the one to sign the 1964 Civil Rights Act into law is one of the mind-bending facts of our history that denies any attempt to reduce that history to simple, sound-bite-size capsules and forces reconsideration, assessment, and studied understanding that reality is never homogeneous, simplistic, or, more importantly, finished.

It became unacceptable for the culture to overtly treat minorities as inferior and allocate special conditions for their continued existence among us.

Those who objected to reform almost immediately began a counternarrative that the legal and social reforms were themselves the “special conditions” which were supposed to be done away with, conveniently forgetting that the level playing field such objections implied had never existed and that the “special conditions” that should have been done away with were the apartheid-style separations and isolations these new laws were intended to end and redress.  Pretending that you have not stepped on someone for so long that they no longer know how to walk and then claiming that they are getting unwarranted special treatment when you provide a wheelchair is about as disingenuous and self-serving as one can get, even before the active attempt to deny access to the very things that will allow that person to walk again.

Some of this was ignorance. Documentary films of southern high school students angry that blacks would be coming into their schools when they had schools “just as good as ours” can only be seen as ignorance.  Spoon fed and willingly swallowed, certainly, but the cultural reinforcements were powerful.  The idea that a white teenager and his or her friends might have gone to black neighborhoods to see for themselves whether or not things were “just as good” would have been virtually unthinkable back then.  Not just peer pressure and adult censure would have come into play; the civic machinery might, had their intentions been discovered, have actively prevented the expedition.

But it is ignorance that is required to reinforce stereotypes and assert privilege where it ought not exist.

Bringing us to the present day, where one may quite honestly say that things have improved.  That African-Americans are better off than they could have been in 1964.  That for many so much has changed in two generations that it is possible for both sides to look at certain things and say, “hey, this is way better!”

Which prompts some to say—and believe—that the fight is over.

And the fact that it is not and that the arguments continue prompts some to believe it is a war and that the purpose of at least one side is hegemony over the other.

Which leads to events like that in Charleston and Dylann Roof’s savage attack.  He’s fighting a war.

The fact that so many people have leapt to excuse his behavior demonstrates that the struggle is ongoing.  I say excuse rather than defend, because with a few fringe exceptions I don’t see anybody hastening to defend his actions.  What I see, though, are people taking pains to explain his actions in contexts that mitigate the simple hatred in evidence.  For once, though, that has proven impossible because of Roof’s own words.  He was very clear as to why he was doing what he did.

He is terrified of black people.

Irrational? Certainly. Does that mean he is mentally ill?  Not in any legal sense.  He has strong beliefs.  Unless we’re willing to say strong beliefs per se are indicative of mental illness, that’s insufficient.  That he is operating out of a model of reality not supported by the larger reality…?

Now we get into dicey areas.  Because now we’re talking about what is or is not intrinsic to our culture.

Without re-examining a host of examples and arguments that go to one side or the other of this proposition, let me just bring up one aspect of this that came out of our morning staff meeting and the discussions around a particular word.

After the Sixties, it became unacceptable in the majority culture to use racial epithets, especially what we now refer to as The N Word.  We’ve enforced social restrictions sufficient to make most of us uncomfortable in its use.  In what one might term Polite Society it is not heard and we take steps to avoid it and render it unspoken most of the time.

To what extent, however, have we failed to point out that this does not mean you or I are not racists?  Just because we never and would never use that word, does that mean we’ve conquered that beast in ourselves or in our culture?

Because we can point to everything from incarceration rates all the way up to how President Obama is treated to show the opposite.  But because “race” is never the main cause, we claim these things have nothing to do with it.  We have arranged things, or allowed them to be so arranged, that we can conduct discriminatory behavior on several other bases without ever conceding to racism, and yet have much the same effect.

Because in populist media we have focused so heavily on That Word and its immediate social improprieties, we have allowed many people to assume, perhaps because they’ve signed on to that program, that they have graduated out of their own racism and by extension have created a non-racist community.

That’s one problem, the blindness of a convenient excuse.  Put a label on something then agree that label represents everything bad about the subject, then agree to stop using the label, and presto change-o, the problem is gone.  Like sympathetic magic.  Except, deep down, we know it’s not so.

The deeper problem, I think, comes out of the commitment, made decades ago, to try to achieve a so-called “colorblind society.”  I know what was meant, it was the desire to exclude race as a factor in what ought to be merit-based judgments.  No such consideration should be present in education, jobs, where to live, where to shop.  We are all Americans and essentially the same amalgamated shade of red, white, and blue.  (Or, a bit crasser, what Jesse Jackson once said, that no one in America is black or white, we’re all Green, i.e. all classifications are based on money. He was wrong.)

While there is a certain naïve appeal to the idea, it was a wrongheaded approach for a number of reasons, chief of which is that it tended to negate lived experience.  Because on the street, in homes, people live their heritage, their family, their history, and if those things are based, positively or negatively, on color, then to say that as a society we should pretend color does not exist is to erase a substantial part of identity.

But worse than that, it offers another dodge, a way for people who have no intention (or ability) of getting over their bigotry to construct matters in such a way that all the barriers can still be put in place but based on factors which avoid race and hence appear “neutral.”

Demographics, income level, residence, occupation, education…all these can be used to excuse discriminatory behaviors as judgments based on presumably objective standards.

This has allowed for the problem to remain, for many people, unaddressed, and to fester.  It’s the drug war, not the race war.  It’s a problem with the educational system, not a cultural divide.  Crime stats have nothing to do with color.  Given a good rhetorician, we can talk around this for hours, days, years and avoid ever discussing the issue which Mr. Roof just dumped into our living rooms in the one color we all share without any possibility of quibbling—red.

We’ve had a century or more of practice dissembling over a related issue which is also now getting an airing that is long overdue: the Confederate flag.  And likewise there are those trying to excuse it.  That there never was a single flag for the entire Confederacy is in no way the issue, because generations of Lost Cause romantics thought there was and acted as if that were the case, using Lee’s battle flag to represent their conception of the South and the whole Gone With The Wind æsthetic.  We’ve been wrestling with that issue in our history since it happened, with even people who thought the North was right bowing to the sophistry that the Civil War was not about slavery.

Lincoln steadfastly refused to accept a retributive agenda because he knew, must have known, that punishment would only entrench the very thing the country had to be done with. He did not live to see his convictions survive the reality of Reconstruction.

So we entered this discussion about the use of a word and its power to hurt and its place in art.  My own personal belief is that art, to be worthwhile at all, must be the place where the unsayable can be said, the unthinkable broached, the unpalatable examined, and the unseeable shown.  People who strive for the word under consideration to be expunged from a book, like, say, Huckleberry Finn, misunderstand this essential function of art.

For the word to lose valence in society, in public, in interactions both personal and political, it is not enough to simply ban it from use.  The reasons it has what potency it does must be worked through and our own selves examined for the nerves so jangled by its utterance.  That requires something many of us seem either unwilling or unable to do—reassess our inner selves, continually.  Examine what makes us respond to the world.  Socrates’ charge to live a life worth living is not a mere academic exercise but a radical act of self-reconstruction, sometimes on a daily basis.

Which requires that we pay attention and stop making excuses for the things we just don’t want to deal with.

 

Spoiling the Punch

This is almost too painful.  The volume of wordage created over this Sad Puppies* thing is heading toward the Tolstoyan.  Reasonableness will not avail.  It’s past that simply because reasonableness is not suited to what has amounted to a schoolyard snit, instigated by a group feeling it’s “their turn” at dodge ball and annoyed that no one will pass them the ball.

Questions of “who owns the Hugo?” are largely beside the point, because until this it was never part of the gestalt of the Hugo.  It was a silly, technical question that had little to do with the aura around the award. (As a question of legalism, the Hugo is “owned” by the World Science Fiction Society, which runs the world SF conventions.  But that’s not what the question really means.)

Previously, I’ve noted that any such contest that purports to select The Best of anything is automatically suspect because so much of it involves personal taste.  Even more, in this instance, involves print run and sales. One more layer has to do with those willing to put down coin to support or attend a given worldcon.  So many factors having nothing to do with a specific work are at play that we end up with a Brownian flux of often competing factors which pretty much makes the charge that any given group has the power to predetermine winners absurd.

That is, until now.

Proving that anything not already overly organized can be gamed, one group has managed to create the very thing they have been claiming already existed. The outrage now being expressed at the results might seem to echo back their own anger at their claimed exclusion, but in this case the evidence is strong that some kind of fix has been made.  Six slots taken by one author published by one house, with a few other slots from that same house, a house owned by someone who has been very vocal about his intentions to do just this? Ample proof that such a thing can be done, but evidence that it had been done before? No, not really.

Here’s where we all find ourselves in unpleasant waters. If the past charges are to be believed, then the evidence offered was in the stories and novels nominated.  That has been the repeated claim, that “certain” kinds of work are blocked while certain “other” kinds of work get preferential treatment, on ideological grounds. What grounds? Why, the liberal/left/socialist agenda opposed to conservatism, with works of a conservative bent by outspoken or clearly conservative authors banished from consideration in favor of work with a social justice flavor. Obviously this is an exclusion based solely on ideology and has nothing to do with the quality of the work in question. In order to refute this, now, one finds oneself in the uncomfortable position of having to pass judgment on quality and name names.

Yes, this more or less is the result of any awards competition anyway.  The winners are presumed to possess more quality than the others. But in the context of a contest, no one has to come out and state the reason “X” by so-and-so didn’t win (because it, perhaps, lacked the quality being rewarded). We can—rightly—presume others to be more or less as good, the actual winners rising above as a consequence of individual taste, and we can presume many more occupy positions on a spectrum. We don’t have to single anyone out for denigration because the contest isn’t about The Worst but The Best.

But claiming The Best has been so named based on other criteria than quality (and popularity) demands comparisons and then it gets personal in a different, unfortunate, way.

This is what critics are supposed to do—not fans.

In order to back their claims of exclusion, exactly this was offered—certain stories were held up as examples of “what’s wrong with SF” and ridiculed. Names were named, work was denigrated. “If this is the kind of work that’s winning Hugos, then obviously the awards are fixed.”  As if such works could not possibly be held in esteem for any other reason than that they meet some ideological litmus test.

Which means, one could infer, that works meeting a different ideological litmus test are being ignored because of ideology. It couldn’t possibly be due to any other factor.

And here’s where the ugly comes in, because in order to demonstrate that other factors have kept certain works from consideration you have to start examining those works by criteria which, done thoroughly, can only be hurtful.  Unnecessarily if such works have an audience and meet a demand.

For the past few years organized efforts to make this argument have churned the punchbowl, just below the surface. This year it erupted into clear action. The defense has been that all that was intended was for the pool of voters to be widened, be “more inclusive.” There is no doubt this is a good thing, but if you already know what kind of inclusiveness you want—and by extension what kind of inclusiveness you don’t want, either because you believe there is already excess representation of certain factions or because you believe that certain factions may be toxic to your goal—then your efforts will end up narrowing the channel by which new voices are brought in and possibly creating a single-minded advocacy group that will vote an ideological line. In any case, their reason for being there will be to prevent Them from keeping You from some self-perceived due. Initially this is all but inevitable, because the impetus for such action is to change the paradigm.  Over time, this increased pool will diversify just because of the dynamics within the pool, but in these early days the goal is not to increase diversity but to effect a change in taste.  What success will look like is predetermined, implicitly at least, and the nature of the campaign is aimed at that.

It’s not that quality isn’t a consideration but it is no longer explicitly the chief consideration. It can’t be, because the nature of the change is based on type not expression.

Now there is another problem, because someone has pissed in the punchbowl. It’s one of the dangers of starting down such a path to change paradigms through organized activism, that at some point someone will come along and use the channels you’ve set up for purposes other than you intended.  It’s unfortunate and once it happens you have a mess nearly impossible to fix, because now no one wants to drink out of that bowl, on either side.

Well, that’s not entirely true.  There will be those who belly up to the stand and dip readily into it and drink.  These are people who thrive on toxicity and think as long as they get to drink from the bowl it doesn’t matter who else does or wants to. In fact, the fewer who do the better, because that means the punch is ideally suited to just them. It’s not about what’s in the bowl but the act of drinking. Perhaps they assume it’s supposed to taste that way but more likely they believe the punch has already been contaminated by a different flavor of piss, so it was never going to be “just” punch. They will fail to understand that those not drinking are refraining not because they don’t like punch but because someone pissed in the bowl.

As to the nature of the works held up as examples of what has been “wrong” with SF…

Science fiction is by its nature a progressive form. It cannot be otherwise unless its fundamental telos is denied. Which means it has always been in dialogue with the world as it is. The idea that social messaging is somehow an unnatural or unwanted element in SF is absurd on its face.  This is why for decades the works extolled as the best, as the most representative of science fiction as an art form have been aggressively antagonistic toward status quo defenses and defiantly optimistic that we can do better, both scientifically and culturally.  The best stories have been by definition social message stories. Not preachments, certainly, but that’s where the art comes in.  Because a writer—any writer—has an artistic obligation, a commitment to truth, and you don’t achieve that through strident or overt didacticism. That said, not liking the specific message in any story is irrelevant because SF has also been one of the most discursive and self-critical genres, constantly in dialogue with itself and with the world. We have improved the stories by writing antiphonally.  If you don’t like the message in a given story, write one that argues with it. Don’t try to win points by complaining that the message is somehow wrong and readers don’t realize it because they keep giving such stories awards.

Above all, though, if you don’t win any awards, be gracious about it, at least in public. Even if people agree with you that you maybe deserved one, that sympathy erodes in the bitter wind of performance whining.

 

______________________________________________________________________________________

*I will not go into the quite lengthy minutiae of this group, but let me post a link here to a piece by Eric Flint that covers much of this and goes into a first-class analysis of the current situation.  I pick Eric because he is a Baen author—a paradoxical one, to hear some people talk—and because of his involvement in the field as an editor as well as a writer.

And So It Begins

Campaign season seems to begin earlier and earlier every time it comes around, but this time it’s starting up almost two years before?  Well, in many ways it began in 2008 and has continued almost nonstop since.

Ted Cruz has announced his candidacy.

I have two reactions to this.  The first is, perhaps predictably, “You have to be kidding.”  But the other is an unpleasant chill running through my entire nervous system.  I have come finally to embrace the maxim “Never underestimate the power of human stupidity.”  There are and will be fervent supporters for this demagogue and over the last couple of decades it has disturbed me how such thoughtless, anti-intellectual, entrenched ideologues seem to creep ever closer to the White House.  On the one hand, Romney lost because he really did not understand the mood of the nation.  On the other, those who mourn his loss have, at least in part, put enough of their kind in congress to effectively cripple national government.

I feel this would all be solved by the simple expedient of a 95% voter turnout.

No, I do not support any suggestion of mandatory voting.  Freedom does not thrive where choice is limited, and choosing not to vote is as viable a freedom as choosing to vote.

It would be less troubling if I believed that this was the case, that people were choosing not to vote.  I think for many people it’s just too much trouble, low down on the list of priorities behind shopping and yard work.  For many others, whether we wish to accept it or not, obstructions effectively dissuade voting.  And for still more, a deep pessimism that voting does no good keeps them from even knowing who the candidates are or what the issues may be.  Throw in a thick broth of lazy and there you have it.

So Ted Cruz may get and keep support from people who will find it easier to vote slogans than to actually find out something about their candidates.  He mouths the appropriate small-minded palaver about government overreach and too much regulation and the loss of American prestige.  Some people nod knowingly, as if they actually understand what he’s talking about.  If they did, they would know him for the political half-wit he seems to be.  He’s going to know how to get out the vote among those who think, when they do, in terms of feelings and disapprovals rather than by issues, so he may run a solid campaign by such metrics, but he would not know how to be a president if he won.

To wit, there may well be government overreach, but it’s not a single thing liable to a simple solution.  There is no cabal to which you can just say No and stop the problem.  And frankly, as with most things in America, one person’s overreach is another’s necessary program.  Likewise with regulation. Sure, there may well be—and assuredly are—too many inappropriate regulations imposed upon us by government.  Just as surely, my list will be different from your list, so exactly how do we come to some agreement about which should go and which should stay?  And, just to make matters worse, which government?  Municipal, county, state, or federal?  Not all regulations are from the same source.  This is why democracy, whether we like it or not, is an ongoing process, a conversation, requiring engagement by the citizenry.  It doesn’t run on its own.  We can’t just elect someone and then ignore everything afterward.

As for American prestige, that’s one of those noble-sounding but useless phrases that can mean anything.  The decline of American prestige?  In what way and for whom?  It’s not quantifiable, for one thing.  For another, it’s as personal as the other two points.  For some, having the world afraid of us is evidence of “ascendancy” and “prestige.”  Like we’re all of us school kids in the playground, throwing our machismo around to count coup.  For others, respect is what we want, and that’s something you earn by cooperation.  Working with other nations, more to help them with their problems than ours, but getting in return some help with ours, and then knowing when their problems are caused by us and being willing to do something about it.  Not sexy, but in the long run more effective.

I recall seeing one of the last big conferences Bush attended before he left office, and all these prominent leaders of other countries mounting the stage, many of them putatively allies, and it was obvious that none of them respected Bush.  He was all but snubbed.  They saw him as a rube.  A clueless tool of his handlers.  Whether that assessment was correct is immaterial, that was the perception, and let’s be honest, in politics perception is more than half the game.

That is not the case with Obama.  Again, whether you like it or not.

Or perhaps people just don’t recognize respect when they see it.  Respect is a voluntary thing, not something you can demand, and certainly not something frightened people give.

Cruz is a demagogue.  He also doesn’t seem to give a damn about anything other than his career.  His people are perhaps aware of his deficits.  He made his announcement to run for office in a packed auditorium—filled with students who were required to be there.  Many of them may well have shown up for him anyway, but not all, and it was little more than some opportunistic stagecraft.

What he represents, if in fact he represents anything other than himself, is a laundry list of regressive ideas that are everything we’ve come to expect from reactionary coalitions of malcontents who don’t like the idea that America has to be shared with people they don’t like.  That he is one of the poster boys for a Tea Party that still won’t let go of the idea that Obama is not a citizen is profoundly ironic.

To be clear, the charges that Cruz is ineligible to run for the presidency are as groundless as they were for Obama.  His mother was born in Wilmington, Delaware.  End of argument.  He’s a “natural born” American.

Still, that some people are throwing the charge at him already carries a small schadenfreude about it.

As far as I know, no one in recent memory who began their active campaign this early has made it through the primaries.  I could be wrong about that, but I think it’s so.  Which means he’s being poorly advised OR this is part of a larger Party strategy to set him up to take all the flak while another candidate, more moderate, more “electable” is positioned for a later announcement closer to time.  If so, I have to wonder if Cruz knows.

It’s going to be an interesting season.

We Prospered: Leonard Nimoy, 1931 to 2015

He was, ultimately, the heart and soul of the whole thing.  The core and moral conscience of the congeries that was Star Trek.  Mr. Spock was what the entire thing was about.  That’s why they could never leave him alone, set him aside, get beyond him. Even when he wasn’t on screen and really could be nowhere near the given story, there was something of him.  They kept trying to duplicate him—Data, Seven-of-Nine, Dax, others—but the best they could do was borrow from the character.

I Am Not Spock came out in 1975.  It was an attempt to explain the differences between the character and the actor portraying him.  It engendered another memoir later entitled I Am Spock which addressed some of the misconceptions created by the first.  The point, really, was that the character Spock was a creation of many, but the fact is that character would not exist without the one ingredient more important than the rest—Leonard Nimoy.


I was 12 when Star Trek appeared on the air.  It is very difficult now to convey to people who have subsequently only seen the show in syndication what it meant to someone like me.  I was a proto-SF geek.  I loved the stuff, read what I could, but not in any rigorous way, and my material was opportunistic at best.  I was pretty much alone in my fascination.  My parents worried over my “obsessions” with it and doubtless expected the worst.  I really had no one with whom to share it.  I got teased at school about it, no one else read it, even my comics of choice ran counter to the main.  All there was on television were movie re-runs and sophomoric kids’ shows.  Yes, I watched Lost In Space, but probably like so many others I did so out of desperation, because there wasn’t anything else on!  Oh, we had The Twilight Zone and then The Outer Limits, but, in spite of the excellence of individual episodes, they just weren’t quite sufficient.  Too much of it was set in the mundane world, the world you could step out your front door and see for yourself.  Rarely did it Go Boldly Where No One Had Gone Before in the way that Star Trek did.

Presentation can be everything.  It had little to do with the internal logic of the show or the plots or the science, even.  It had to do with the serious treatment given to the idea of it.  The adult treatment.  Attitude.  Star Trek possessed and exuded attitude consistent with the wishes of the people who watched it and became devoted to it.  We rarely saw “The Federation”; it was just a label for something which that attitude convinced us was real, for the duration of the show.  The expanding hegemony of human colonies, the expanse of alien cultures—the rather threadbare appearance of some of the artifacts of these things on their own would have been insufficient to carry the conviction that these things were really there.  It was the approach, the aesthetic tone, the underlying investment of the actors in what they were portraying that did that.  No, it didn’t hurt that they boasted some of the best special effects on television at that time, but even those couldn’t have done what the life-force of the people making it managed.

And Spock was the one consistent, ongoing, absolutely essential aspect that weekly brought the reality of all that unseen background to the fore and made it real.  There’s a reason Leonard Nimoy started getting more fan mail than Shatner.  Spock was the one element that carried the fictional truth of everything Star Trek was trying to do.

And Spock would have been nothing without the talent, the humanity, the skill, the insight, and the sympathy Leonard Nimoy brought to the character.  It was, in the end, and more by accident than design, a perfect bit of casting and an excellent deployment of the possibilities of the symbol Spock came to represent.

Of all the characters from the original series, Spock has reappeared more than any other.  There’s a good reason for that.

Spock was the character that got to represent the ideals being touted by the show.  Spock was finally able to be the moral center of the entire thing simply by being simultaneously on the outside—he was not human—and deeply in the middle of it all—science officer, Starfleet officer, with his own often troublesome human aspect.  But before all that, he was alien and he was treated respectfully and given the opportunity to be Other and show that this was something vital to our own humanity.

Take one thing, the IDIC.  Infinite Diversity in Infinite Combination.  It came up only a couple of times in the series, yet what a concept.  Spock embodied the implications even in his trademark comment “Fascinating.”  He was almost always at first fascinated.  He wanted before anything else to understand. He never reacted out of blind terror.  Sometimes he was on the other side of everyone else in defense of something no one seemed interested in understanding, only killing.

I’m going on about Spock because I know him.  I didn’t know Mr. Nimoy, despite how much he gave of himself.  I knew his work, which was always exemplary, and I can assume certain things about him by his continued affiliation with a character which, had he no sympathy for it, he would long since have left behind to be portrayed by others.  Instead, he kept reprising the role, and it was remarkably consistent.  Spock was, throughout, a positive conscience.

On the side of science.  I can think of no other character who so thoroughly exemplified rational morality.  Spock had no gods, only ideals.  He lived by no commandments, only morality.  His ongoing championing of logic as the highest goal is telling.  Logic was the common agon between Spock and McCoy, and sometimes between Spock and Kirk.  I suspect most people made the same mistake, that logic needs must be shorn of emotion.  Logic, however, is about “sound reasoning and the rules which govern it.” (Oxford Companion to Philosophy)  This is one reason it is so tied to mathematics.  But consider the character and then consider the philosophy.  Spock is the one who seeks to understand first.  Logic dictates this.  Emotion is reactive and can muddy the ability to reason.  Logic does not preclude emotion—obviously, since Spock has deep and committed friendships—it only sets it aside for reason to have a chance at comprehension before action.  How often did Spock’s insistence on understanding prove essential to solving some problem in the show?

I suspect Leonard Nimoy himself would have been the first to argue that Spock’s devotion to logic was simply a very human ideal in the struggle to understand.

Leonard Nimoy informed the last four decades of the 20th Century through a science fictional representation that transcended the form.  It is, I believe, a testament to his talent and intellect that the character grew, became a centerpiece for identifying the aesthetic aspects of what SF means for the culture, and by so doing became a signal element of the culture of the 21st Century.

Others can talk about his career.  He worked consistently and brought the same credibility to many other roles.  (I always found it interesting that one of his next roles after Star Trek was on Mission: Impossible, taking the place of Martin Landau as the IM team’s master of disguise.  As if to suggest that no one would pin him down into a single thing.)  I watched him in many different shows and TV movies, and have caught up on some of his work prior to Star Trek (he did a Man From U.N.C.L.E. episode in which he played opposite William Shatner) and in my opinion he was a fine actor.  He seems to have chosen his parts carefully, especially after he gained success and the control over his own career that came with it.  But, as I say, others can talk about that.  For me, it is Spock.

I feel a light has gone out of the world.  Perhaps a bit hyperbolic, but…still, some people bring something into the world while they’re here that has the power to change us and make us better.  Leonard Nimoy had an opportunity to do that and he did not squander it.  He made a difference.  We have prospered by his gifts.

I will miss him.