Bruce Fleming's Blog

Traffic

Jul 07 2011 | 11 comments

This morning I stopped at the end of my street, gauging traffic to find out when it was safe to pull into the main road. I looked right, left, seeing the configurations of cars. Was the hole on the left big enough to let me enter? No. Then the right became possible and I looked left again. That had filled up. Finally, after gauging things in both directions several more times, I saw a situation being created that would let me enter, with the car on the left far enough away, and nothing on the right. And I pulled in.  At this point it was safe for me to pull in. No one else was interested in this process of aiming at, and hitting, an unseen target, because no one else was in my situation. If it works, nobody cares; if there’s an accident, by contrast, we might have to articulate this and go over it many times. The accident is comparable to the question in philosophy that gets asked, and answered, the belief that is articulated, the tone of voice that is challenged. Most of life merely happens, without such radical stops and doublings-back: it is the error of those who consider to think that the times when we consider are primary.

We have the means to express this knowledge in more public terms—though most of the time we would never think of doing so. We could write out a report beginning, The incident took place at such and such a time, such and such a place. If we got the other drivers to add their recollections, or (unlikely but possible) got a video from a surveillance helicopter, we would be able to make this purely personal, transitory knowledge public, caught in the cross-hairs of the time-space continuum, something firm and incontrovertible.

Most of the world passes without this kind of freezing in terms of the public. The probability of a surveillance helicopter having been there is small indeed. And why would we try to nail down something this ephemeral to begin with? Usually we don’t, which means that most of life passes by in the way it passes by, as part of the realm of personal and situational knowledge. The realm of the personal is the default position of knowledge, before it is transformed into something else, and when it isn’t.

It's not by chance that the scenario above, where we try to wrest public information out of a private moment, evokes a police investigation, or a trial, circumstances under which we attempt to render private events public. It is related to the way we boil down events to create the abstractions of science. Such objectivizing is only rarely asked for, and indeed couldn’t happen more than occasionally, even if we were willing to try. During the time period when we are articulating and justifying, we are doing a slew of things that themselves will never be subjected to the same degree of scrutiny, precisely because we are spending our time and energy scrutinizing one incident in the increasingly distant past. Objectifications of private moments are themselves facts of life, and disrupt people’s lives. This means they can’t always be achieved, and even when they are, it is at a cost. We didn’t have to wait for Heisenberg’s Uncertainty Principle to know that the act of getting information alters the thing about which we now know more. Accusing a friend of the disloyalty we suspect him of may elicit information and help us decide whether or not he is guilty of it, but this will certainly have repercussions.

Consonance

Jul 07 2011 | 2 comments

 

The world is buoyed up by the fact that it is what is: the default of existence is consonance, not dissonance. We can justify or prove or discover any particular thing, but we do all this on an as-needed basis, which means a situation of lack arises. For every one thing we demand an explanation for, there are an infinity of things we don’t. So we can’t explain everything, and while we are explaining one thing, we fail to explain everything else.

Though the default of our existence is consonance, its fabric is one of dissonance. Every act of naming, explaining, discovering, thinking about or altering the world presupposes a rift in the lute, as Tennyson had it, a crack or a moment of friction. Perfect consonance would be like floating in a hot tub of brine that buoys us up: it’s warm, it’s womb-like, and we merely are. But this doesn’t happen often, and it doesn’t last too long. We do not glide forward in life, we negotiate forward with fits and starts, sometimes through a thorny way. But we solve specific problems, not all problems, give specific answers, not all answers.

We discover we were wrong, and we correct the error; we can question any particular belief, but doing so takes time, and precludes us from questioning others. The real is all there is, but we can alter that real by taking detours into the unreal: the not-yet, or never-will-be-real realm that doesn’t exist but that somehow allows us to alter what does. The daily mysterious is the fact that the world is what it is: the red car passing as I write isn’t a blue one, and it is driven along a specific course. We can paint that car blue and drive a different course, but the world then contains the before, the after, and the in between. All this is what is.

Conceptualization is the basis of thought: it allows us to break the bond with the world of sensations in which we find ourselves at any given moment and to visualize alternatives. And this in turn allows us to change the world. Identifying a brown blur as a leaf allows us to figure out it’s stuck to our glasses, perhaps in a gale, and makes us realize we have to reach up and pull it off. Calling a leaf a “leaf” allows us to see it as one of a type of object that is related to other leaves, and that, like other leaves, has certain qualities in common with them. (It doesn’t matter if these common qualities are the result of “family resemblance” or equal connection to a Platonic Idea.)

Conceptualization, such as calling this thing a “leaf,” or water “water” (the latter the “miracle” of William Gibson’s play about Annie Sullivan and Helen Keller, The Miracle Worker) is the basis of explanation, which is at least two levels of abstraction beyond where we are: explanation is a parallel set of facts more general than the individual thing we have identified. We conceptualize to call the sensation we have a “headache” (of course “sensation” is already a conceptualization) and abstract beyond that to suggest or repeat that our headache is caused by X in our brain. We can’t conceptualize this whole process: it’s something we do, except that this suggests that “we” and “do” are logically prior to anything else. What’s the motion of motion? Stillness.

We only ask for explanations for the things for which we do so, which means things we want explained. Our default is lack of explanation. Asking for explanation occurs as the result of aberrations, when something is not what we expect. We don’t ask for an explanation of why my bed is still under me in the morning, since that’s what’s supposed to happen and we’ve very likely never even conceived of anything else happening, nor why tuna fish cans contain tuna, since that’s just what they are and what they do. More likely, we would ask why this can that says it’s got tuna doesn’t, or why I wake up with the bed gone. If a child asks for a treat we don’t ask why; more likely is to ask what’s wrong if he refuses one. Explanation is the furthest layer at any given point, which doesn’t mean it will always be so. We may not be able to say what causes the cause, and typically don’t ask: explanation goes a layer at a time, and is offered on an as-needed basis, with lack of need being the default.

If we ask why we have headaches, we have to look and experiment: we can’t make something up. Let’s say we discover that it’s because the brain releases X: when X is released we get headaches, when it isn’t we don’t. We do studies to make sure this is causality rather than correlation (though coming up with these usually requires ingenuity). But finally we have it. At this point we stop, at least for now, because this is the question we wanted to answer. We don’t ask “why?” in a monotonous string, like a child. What’s the explanation for the release of X? At this point it’s not even clear what this question means, as it seems an example of the infinity of questions we don’t want to answer at any given point, such as: what is the meaning of my sandwich? Huh? The questions we ask at any given time are a tiny fraction of those we could; perhaps at some time even this last question would make sense (as we’d say) enough to really try and answer it. Now we brush it aside. And the history of ideas or of philosophy is the trail of the relatively few questions we have asked and had arguments about: we know how the arguments go, as we do not know how the argument over the meaning of my sandwich goes.

But at some point, if we had a reason, we could ask why the brain does things in this way. The sort of answer we could eventually give can vary: a valid explanation is one we accept, it’s not a certain sort of content. And we can explain anything we do, even this process—only we cannot, until we have done so, explain what allows us to explain. And while we are doing this, we are failing to explain an infinity of other things. Life is bigger than anything within it.

Es stimmt

Jul 07 2011 | 2 comments

In German we say that something is the case by saying “es stimmt”—it’s the same locution as to “tune” a piano, ein Klavier stimmen. It’s on pitch. This is in fact our sensation when we search for a word or action, or go back and correct something that seems false to us. We are bringing our actions into unison with an unseen postulate. People acting or interacting with each other are like boats where somebody has a hand on the tiller. The boat goes where it goes; we can show after the fact where this was, but we can’t predict more than a certain amount. No outsider can look at the patterns of the boat’s wake and conclude that this was the only possible pattern it could have taken. In any conversation, I could have said a dozen things that would have kept the boat moving in almost the same direction—which is to say, where the variations would have been uninteresting (we needn’t have talked about exactly the things we talked about). And yet at each moment, I could have pushed the boat in another direction, or the other person, who wasn’t part of my program, could have done so, and I would have had to react. If the boat had gone in another direction, that would have opened up a dozen more things for that moment, and a dozen for the moment beyond that, and more beyond that. After the fact the pattern it takes seems fixed, but in fact it wasn’t while it was unfolding.

That most boats keep a straight course most of the time rather than flitting all over is merely something we realize, not something that has to be. Or rather, what is, is. We learn how people are in “most boats” or “most of the time.” We come to learn what the bounds of predictability will be: a child can appear before us suddenly missing a tooth, and we say merely, “I see you lost your tooth.” We have learned that that’s not unusual. Sometimes even we don’t know what to expect. Why is John happy one moment and sad the next? If we know him to be bipolar (manic-depressive, as we once said) we’re not surprised—but outsiders might be. Those unfamiliar with autism might find the demeanor of my autistic daughter strange. I do not.

Sometimes we’re aware of our hand on the tiller, but usually not. And by definition most other people aren’t, most of the time. To the extent they are, it’s because they know the world, have experiences that tell them someone’s story isn’t holding together, someone else is upset, someone has other motives than he says: they’ve paid attention to surfaces. Patterns will form, or not: they help us process what we see. Being alive to surfaces is its own end.

Consider the things in any given moment that are shot at an invisible target that we can nonetheless determine has been hit or not; we can further determine whether we need to try it again. None of this is subjected to analysis, nor is it rendered in the terms of an inflexible notation system such as that of science. Typing this sentence, for example: there is no visible target of “what I mean to say,” but I can control the words as I write them, and correct as they come out, or after. And I know when I’ve achieved what I “meant to say”—which doesn’t exist until I pull it from the ether. We can ask for what I “meant to say” in any given instance, the way we can find scientific principles to graph any situation (though it takes a lot of looking), but during this time we are refusing to do the same with countless other things, including the terms we are considering.

Or consider: suddenly, one day, I have the nagging itch that I understand something about the points I am making here that I had not understood before. This sense moves me downstairs to the computer where I open my file and find the place I need to be, or, as here, just begin typing in a blank spot in fear of losing the idea. This idea leads me to another, it may be, and this to yet another—or perhaps it sputters out after one and needs several days to be kicked into life again. In any case, how do I know what each of these feels like, whether to go on, whether there is more to be mined at that particular time, or not? Much the same way as I know how long a handshake is appropriate with whom: that’s what it feels like. Others can’t share our sense of achieving or not achieving the goal, and we lack a step-function-like scientific diagram that applies to this.

How can I aim  at just the right tone of irony in my voice to respond to something my wife has said to me? How do I know if I’ve achieved it? How can I modulate my tone if I sense it’s coming out too strong? What is “irony” in tone of voice? Perhaps we could come up with a measurement expressed in the blank givens of a mechanical description system (something about sound waves and tilt of the head)—but how can we vary this to explain why more or less irony is required because of the fact that we have already had this discussion three times in the last two days? What if it had been only two? How do we program a machine to take account of that? Perhaps we can, if we realize it’s a factor. But we can never program everything: perhaps we’ve both seen the same movie the day before in which an ironic tone of voice was used: I’m quoting some of that tone or the words, say. Fine: we can play catchup ball and put some variability into the system for “saw same movie yesterday.” How about “saw same movie, but one partner was tired and wasn’t paying attention”? We can never foresee all the things that affect interactions and put them on a grid.

Think of all the blades of grass in any lawn. Most people want only the sensation of “lawn,” of a green carpet. But if we look closer, we see that there are worlds between every two blades, and three blades of grass in a row will make a symphony of near- but not absolute symmetries, little bending uprights that echo each other without being completely congruent, some leaning over others, a symphony of verdant scimitars. Paying attention to only one square foot of any normal lawn could take us a lifetime. Sane people, however, can’t afford to spend their lives contemplating the blades of grass under their feet: we simply ignore the fact that the world is full of particulars, refusing to let it bother us.

Still, this unrecognized murmur of unperceived particulars is tapped over and over. It’s the reservoir from which things we do notice surge.

For example, in the sudden April snow on the woods behind my house, which I see in the breaking daylight as I jump rope on the back deck. All the branches, not yet fleshed out with proper leaves but disfigured in a green haze by buds, are outlined in a thin set of white lines, the snow not enough to accumulate but enough to accent. I see all the trees, the curvature of the ground, the trees that have fallen, with their roots up in the air like women in hoop skirts, roots meant to anchor the trees vertically to the ground but rendered non-functional by horizontality: the sun glimmers behind other houses, whitening the white.

Yet it’s only chance that had me up at that hour, chance that had me, or probably anyone, notice this. I can take the attitude that it’s a good thing I was up. At least someone noticed it; somehow, it seems, this wasn’t in vain on the part of the world. It almost seems as if I’ve saved the day, or at least the morning: I’ve been conscious of this virtuosity on the part of the world. I’m conscious of the “save”: I got to see this after all. By the same token, the narrowness of the save—the fact that I’m not usually up at this hour at all—reminds me that most of the world goes unperceived. The fact of so much of what seems waste can remind me that little of it is “saved” in this sense, which ought to make me question whether the saving has a point, given that it’s so rare. If I need to see the world to save it from non-being, that doesn’t bode well for most of the world: after all, I’m not usually around. Indeed, nobody is, and somehow the world goes on producing these things, which may seem therefore wasted, like meals lovingly prepared that no one eats, that simply spoil and are thrown away.

The Russian Formalist theoretician Victor Shklovsky thought that a lot of the world spoiled in this sense. He was horrified by a passage in Tolstoy’s diary noting that when he, Tolstoy, went to dust the table, he couldn’t remember if he had or hadn’t. Tolstoy is shaken with the existential feeling that the unnoticed is the unoccurred: we alone cause the world to have been, a later echo of Bishop Berkeley: esse est percipi, to be is to be perceived. Shklovsky echoes his feeling.

The solution to this horrible situation, Shklovsky suggested, was to notice the world. He believed that it was only artists who made people notice the world. Hence his famous conclusion that “Art makes the stone stony.” Unnoticed, the world simply isn’t. His conclusion is that art and artists are necessary for any of the world to be at all, to be saved from oblivion.

But Shklovsky was wrong about the middle term of his reasoning, the assertion that noticing only takes place in art. It can also take place in what I’m calling the aesthetic sense of life. I noticed the dusting of snow on my trees as the sun rose, and need never have tried to make art from this. Whether or not I try and transmit this perception to others is a subsequent decision that has nothing to do with the noticing, but we speak of art only if I do decide to make this attempt.

It’s also a mistake to think that the world is producing finished meals that somehow we have to show up for; if we don’t, they’re thrown away. In fact, it’s only when we show up that there’s a meal. What causes the noticing to happen is an effect not of the world itself but of what we, the perceivers, are familiar with. The woods in the snow made an impression because they looked so different than they usually do. That’s what we notice: a situation where we can establish commonality (same woods, same place) but also are aware of differences: how the trees looked in the snow vs. how they look without it. For the same reason we think the world transfigured in the spring when, as in Washington as I write, the world is turned to frothing pink, with the ornamental cherry trees all over the center of the city in gushing bloom. But if this were the norm, the way green leaves are, we’d presumably not notice them even if we saw them—or only the way we do the green leaves, occasionally, the sky dark, the air sweet-smelling, or in their first, pale green phase that itself looks so different from the norm. Green is no less startling a color than pink, only we’re used to it.

Interest is produced by variations from the norm: I have the background of woods without snow, in comparison with which the woods with snow look interesting. Or the light of full day, in comparison with which the faint glimmer of dawn is interesting.

It’s true that all the things we liked, we noticed. However, we tend to draw a false converse: if we could notice them all, we’d like them all. In fact, we’d simply be overwhelmed, which is why we fail to notice most things to begin with. It might be interesting, as an artwork, to take photographs of the same three blades of grass in my lawn over a period of time. But if we did this with the next three, and the next three, and the next three, people would turn away. Interest isn’t “fair”: it doesn’t mean the world is this interesting, in fact the opposite—that one thing being interesting presupposes many things that aren’t.

Let’s say we could get all the six billion people on the Earth busy noticing for every minute of their waking time. Or create another six billion whose job was merely to notice. Why would that be better? They’re all busy noticing; who notices that they notice? How would that valorize the world? Would we be sure that even then we’d scratched the surface of things to notice in the world? And for that matter, the most fundamental question of all is this: in what way, beyond my own pleasure, have I “saved” the world by noticing it? Perhaps the world doesn’t care to be noticed.

A letter to the household hints columnist “Heloise” provides a paradigm for the way our knowledge of the world alters over time. A reader writes to say that of course everyone knows that putting some lemon juice or citrus on fresh-cut fruit keeps it from turning brown and enhances the taste. For those who may find themselves momentarily without a lemon or lime, however, she recommends using powdered lemonade mix, which has the same effect.

This is a good example of lateral thinking: if a lemon isn’t available, try using lemonade mix. The problem is, someone trying this out might discover it doesn’t work. I’m not convinced, as it is, that powdered “lemonade” mix has much to do with lemons. It seems likely to me it’s nothing but chemicals. In fact, one of them may be an acid that produces the tartness of chemical lemonade, but I’d guess that it’s not the same acid that’s in lemons—assuming that’s what produces the brown discoloration of cut fruit.

Putting ourselves in the position of the person in the kitchen who’s been told that lemons prevent cut fruit from turning brown, we can see the logic of moving from lemons to lemonade mix. It might or might not work, as someone wise to the ways of the mass-market food preparation world would be aware, since many things which bear the names of natural products have nothing to do with those products. If all we need to know is that lemons produce the effect we want, that’s what we know: lemons produce this effect. It’s only if we have to substitute something for lemons, lemons (say) being unavailable, or if something goes wrong, that we look for an “explanation” of what causes lemons to have this effect. We don’t ask for a more precise “explanation” than the explanation we already have unless the one we have fails to work or must in some way be generalized.

Because the properties of citrus fruit are not virgin territory, it seems likely that the active ingredient is an acid of some sort, this being the most remarkable quality of lemons. It probably suffices to ask someone with a degree in nutrition to get the answer. An educated guess would suggest that there is some acid in powdered lemonade that produces the same result, though not necessarily the same one as what we call citric acid in lemons. Something produces the “tang” in powdered lemonade. If we isolate this, we can ask what other sources contain it. Perhaps it’s not powdered lemonade at all, but other household items or prepared foods. We can suggest other things that seem to belong to this same family—though in fact for our purposes they may not. Perhaps vinegar has the same result—though we would understand that taste issues would be the reason why this is not generally suggested as an alternative. But perhaps a splash, with lots of sugar, in an emergency…

In any questioning situation, there is a layer of what we have and the layer of “explanation” we are looking for in order to bring the world into focus. The explanation is an explanation with respect to the layer we have; it itself might require its own explanation some day. For someone with an unlimited supply of lemons, knowing that “lemons prevent fruit from turning brown” may itself be the explanation, the response to a child that asks, “Grandma, why are you squeezing lemons on the peaches?” “Lemons” in general and “fruit” in general are already explanations with respect to these lemons, these peaches.

In simple cases like this, it’s likely we already collectively have the answers to questions like, what is it in lemons that keeps peaches, or any fruit, from turning brown (not all fruit turns brown, so this too is something we might want explained: which fruit turns brown and why?). We have only to ask.

It’s not absolutely necessary that we find ourselves without lemons before we ask this question, though this is likely to be the impetus. It’s perfectly possible that someone idly wondering might ask, Is there anything else that can have this effect? This is the mind-set of what we call “pure science”: people who set about looking for explanations for things we don’t have an immediate need for. But later on, suddenly, people might well need this explanation, say if the lemon supply dries up.

The explanations we come up with are always the generic ones of the order of “lemons cause fruit to stay fresh.” They are generic by contrast with subsequent explanations. The explanation we come up with always seems more specific, seems to fit exactly the hole we want to fill. Only it’s always possible that this itself will have to be refined; what seemed a perfect fit no longer is a perfect fit.

Now we say, it’s not lemons that cause this effect, it’s the acid in the lemons (i.e. not the rind, not the color, not the shape, not even the fact of its being a fruit). We’ve refined our statement. Science is the process of helping us get to the more refined statement, not the statement itself. When we had lemons, saying that “lemons” cause this effect was satisfactory: to us it seemed as if this explanation filled the hole we had. There’s no guarantee that circumstances will continue to leave us satisfied with a statement, however. We can be satisfied by saying that “acid” or “acid X” causes this effect. But perhaps the day will come when the fruit fails to be kept from browning when we apply what we take to be another source of this acid, or a pure form from the laboratory. At that point we may realize it wasn’t the acid itself, it was some property of the acid, or this acid in these circumstances.

What is historical? My wife laments that she decided not to brave the crowds and traffic jams to see President Obama sworn in: that would have been to make history. To stay home in the comfort of one’s kitchen and watch it on the television—that’s not historical. Why not? Because everybody could do it. You saw him better than you would have if you’d been half a mile away. Yes, she says, but I wasn’t there. Only “there” isn’t as close as the television cameras get, but a place where you shared the air with him. But don’t we share the air with him even now, thirty miles away in Annapolis? Or thousands? In what way is the cup of tea and the fact of the tablecloth, what went on in this house on this day, less historical than being part of a crowd at this one event? All happened.

The question of what constitutes history was re-visited by the Annales school of history in the 1950s: history is the little people, not a story of Great Men. Tolstoy spent scores of pages at the end of War and Peace defending the same idea. But even here the notion is that the sum total of many Little Men is greater than Great Men: no Little Man in him- or herself is supposed to be greater than the Great Man, or more historical, only, in a Marxist sense, collectively. My issue is quite different. How is me sitting with a cup of coffee at the kitchen table and a book not historical? It happened. How is taking part in the Battle of Gettysburg more historical than being gunned down on the streets of an inner-city war zone? Or an obscure battle of a sideline skirmish with no name? That precisely, people would say, is the difference: the history books can’t talk about everything, so what they talk about is historical.

Last fall we took our boys to see the battlefield at Gettysburg, one of the most “historic” places in America. It’s one of the fabled places of American, and military, history. It was the military turning point of the Civil War, the visitor’s center film tells the viewer, echoing the verdict of history, the “high water mark of the Confederacy”—the image is that of a flood, a river that has overflowed its natural low-lying banks and so becomes lethal, waters that are meant to be much lower than they have become and in any case must, having risen, fall. This image is repeated over and over—one of the stops on the battlefield is called the High-Water Mark, where the Union troops, on the heights of Cemetery Ridge, mowed down and broke the back of (metaphors abound when conceptualizing battles) Pickett’s Charge—though Pickett was the general and so did not himself charge. Had it succeeded, the historians suggest, the South would probably have won the battle, and Washington, D.C. would have been open to them: the South might well have won its independence.

So this is the central battle of a central war—not to mention an extremely bloody one, with more than 7,000 dead. The photographs of the bodies in the Wheatfield—the only wheat field, the animated guide voice on the CD my wife bought in the visitor center shop (the voice that interpreted what we saw as we drove from stop one to stop two and on to fourteen on the fall-struck battlefield) tells us, that’s spelled with a capital letter; it’s not merely a wheat field but The Wheatfield—are known to many schoolchildren, the most famous being O’Sullivan’s “The Harvest of Death”—though the battle, which raged from 1-3 July 1863, was too early for a harvest of any other kind.

And what was central about this war? Americans understand it “saved the Union,” once again a conceptualization from the point of view of the victor, as all history is said to be told. More astute historians, as the well-done wall texts in the new visitor center museum summarize, see the Civil War as having been written into the foundation documents of the nation, a half-slave-based, half-slave-free country divided into the units of states which had retained a good deal of their individual power. The compromises necessary to founding the country, they suggest, couldn’t last forever. The spark was the expansion into the West, with the question central to the election of 1860, won by Abraham Lincoln, being whether the new states would be brought in as slave or free states; the implication is that without this planned expansion westwards the uneasy compromise based on the power of the individual states to set their own laws on such matters would have held—until when? On that no one speculates. Still, it came to a head in the secession movement and the foundation of the Confederate States of America, and so to war.

So a war that determined the course of the country, written into the earliest foundation documents—could people only have read between the lines, or done anything about it if they had been able to; a battle that had marked the turning point of the war, itself one of the bloodiest battles of a bloody war (Antietam saw the single most lethal day in American battle history, and so is much visited as well)—all of it makes Gettysburg central, and explains the fact that so many people visit.

The most central battle of the central event of the century, fundamental to the existence of the country and implied in its very founding, the turning point, eulogized by one of the two greatest US Presidents: the claim of Gettysburg to being a must-see destination is gilt-edged. This, clearly, is history.

What’s not history? Apparently all the rest of life. No one would travel from Maine to Pennsylvania to see a wheat field, rather than the Wheatfield. No one would make a day trip, except out of desperation, to look at a place where a battle had happened if everything around were scenes of battles—as is to a much greater extent the case in, say, France, where the Hundred Years War raged far and wide. Even a battlefield is old hat: it has to be a particular battle that decided the fate of nations, the bloodier the better—just as people go to Washington to look at one of the Largest Blue Diamonds In the World, the Hope Diamond, which is hardly as large as a walnut: most people probably don’t know that blue is good because rare, as opposed to defective, so this blue diamond is smaller than many white ones. A showcase near the Hope Diamond shows the rainbow effect of a CD as a way of illustrating the light refraction through jewels: it’s arguably as pretty as any king’s ransom set of jewels. Only we all have these.

Gettysburg is the Hope Diamond of battlefields, made accessible because it was so recent in absolute terms, and so something that has been preserved uninterruptedly. Marathon is an empty plain; Normandy is real beaches (though with some degree of battlefield cachet); most battlefields of the World Wars in Europe were needed for subsequent development, and are in any case too numerous. But in America there have not been many battles, and there is space. The result is a historical spot.

The Hope Diamond is historical; my wife’s engagement ring is not: everyone in a certain class has one of those. A clump of peach trees down the road is not historical; The Peach Orchard (assuming it is written in capital letters) is. An obscure battle of a meaningless war is not historical, Gettysburg is.

Yet the individual could have been the same in both. How do we know whether what we are doing is historical or not? The CD suggested that the men who fought that day knew the importance of what they did. Or is this only meaningful in retrospect? Surely many people have thought that what they were doing was historical, only to be outvoted by history? What actor in a movie can say for sure that the movie is destined for greatness or the dustbin? Surely Oswald Mosley thought of himself as historical, certainly Adolf Hitler did: the first now seems ridiculous, and Hitler a great Fehlgeburt (miscarriage) of history, not its legitimate inheritor.

But is there no pattern to what people say is valuable, or of historic interest? Can we, without waiting for the verdict of history or the view of other people, say that something is important (say, the Battle of Gettysburg) or valuable (say, the Hope diamond)? Surely we can have intimations, as the soldiers at Gettysburg are said to have realized how important what they were doing was. If we had to wait until we had the verdict of others, we’d never have art appraisers, who can say that something is a good example of X (worth a million monetary units) rather than a bad one (worth 100). We’d never have the sense of soldiers that they were taking part in an Important Battle.

But who’s to say that the soldiers in what turned out to be a minor skirmish—who equally fought and died—didn’t feel the same? Their opinions simply don’t valorize the verdict of history, and so are rejected and forgotten, and never quoted. Certainly everybody with Grandpa’s old violin up in the attic thinks he has a Stradivarius.

After Modernism ran out of steam with the Second World War, one strange offspring of Modernism got a lot of press for a time, an ultra-elitist movement that was baptized Post-Modernism (sometimes written as one word, postmodernism). This gave up on the Modernist attempt to capture the taste of life as lived and instead lived in the world of artworks—no longer was it offering the fragments of life; instead it developed the imposed structure. It solved the split of Modernism, in short, by voting for the structure half. But that meant it had nothing to say except to talk endlessly about itself and its own cleverness.

Post-Modernism is ironic, fragmentary, self-referential, sophisticated, and intensely boring, because it ceases to breathe, have sex, or sweat. It’s prim and self-satisfied, rather like the smug and feminized esthete played to perfection by the usually so masculine actor Daniel Day-Lewis in the Merchant/Ivory film of E.M. Forster’s “A Room With a View”: the earthy heroine, Lucy Honeychurch (great name, great actress: Helena Bonham Carter) comes into her own when she rejects this ball-less museum piece for his so-physical rival; the last scene shows her sexually contented on her honeymoon, back in Florence where she had gone earlier in the movie. It was Forster who wrote, “oh dear yes, the novel tells a story.” He thought plot inevitable, but boring. Post-Modernism wore the mask of not taking anything seriously, saw itself as perennially playing games, as having an audience that had seen it all and done it all. It was a revival of the precious aestheticism of the end of the nineteenth century, which had Oscar Wilde hoping he could “live up to” his blue china.

Post-Modernism was a movement of the classroom and the art gallery, at its height in the 1980s with writers like the deft but too precious Argentine fabulist Borges, the American New Yorker “wasn’t that erudite?” fragment writer Barthelme, and the academic “let’s talk about how the novel is structured and call it a novel” Baltimore-based Barth: the “three Bs” of American post-Modernist literature, as people usually said in the 80s. (We talk about the “three Bs” of classical music: Bach, Beethoven, and Brahms, so this phrase about the much lesser artists, the post-Modernists, was highly self-congratulatory.) Post-Modernist artworks bored visitors to art galleries, who didn’t see the four previous layers of artworks they were referring to: post-Modernist art had to be learned, it couldn’t just be experienced. And an art based on other artworks rather than life is a dead art, a precious art, an art that sooner or later will wither, as post-Modernism did. Arguably, the suicide of David Foster Wallace was its last gasp.

Hence the necessity for Neo-Modernism. It goes back to Modernism and takes it in a different direction than post-Modernism did. What was good about Modernism as a movement was that it told the truth. Life is experienced as formless; we strive towards form. Where it went wrong was thinking form could be imposed—Joyce thinking of his hapless Leopold as a modern-day Ulysses, for example. Modernism also went wrong in thinking of itself as a potentially popular art, being insulted and upset that vast quantities of people were not interested in its sensibility. They never will be: most people are happy for the momentary escape of living with prettier, richer, more athletic people. Large realistic epic novels about multiple generations will always dominate the best-seller lists, and be sought after by agents and editors. People want to tell themselves that things are other than they are. That’s just the way it is: it was the inability of the Modernists to accept that what they did was a literature of the few that curdled their sensibility, drove them to the snide insiderdom of post-Modernism.

Neo-Modernism will be a niche literature, by definition. But it doesn’t wall itself into its ivory tower like Post-Modernism: it does what it does in a corner of the public square. If people want to watch, fine. It’s not seeking to define itself as virtuous because inaccessible.

And it isn’t inaccessible. You just have to be prepared for literature that’s not escapist. Neither is it push-your-face-in-the-seamy-side-of-things, like Upton Sinclair or social realist works from the 1930s. It’s about the fragmentary nature of life, and the fact that we can construct larger things out of these fragments. Twilley, my first novel, was written at age 19 in 1974, partly while I was a student at Haverford College. It has a vestigial plot: a young man, abandoned by his wife, walks in a semi-stunned state through a department store to use the bathroom (half the book), takes a bus ride, and then sits in the house where he grew up before masturbating in an empty field, the culmination of his loneliness. What makes up the bulk of the book is the imaginary worlds, some his and some just there for the reader, that emerge from between ordinary things. Between any two steps is a world of sensations—some small and some merely unnoticed. For the fact is that we do fail to notice much of life: we’re intent on where we’re going, and we can’t always be in a state of maximal perception. For the hero of Twilley, the visit to the department store was in search of a bathroom, which took only a few minutes. The pace of reading is much slower, and isn’t really from his point of view at all.

A Structure Opera takes the Modernist realization that all art is structure, whether of actions and feelings or other things (music is a structure of tones, painting of shapes and colors), and develops it: it’s given up plot entirely for another sort of structure, one derived from music. And Fragments in the Form of a Calendar says “fine, you want structure? Let’s take one with no weight, such as the list of days and months: that’s your structure.” They’re related to Modernism because they share Modernism’s great realization, that life as lived is not the same as life as portrayed (that’s also at the heart of Sartre’s Being and Nothingness, and of all phenomenology). They’re not post-Modernist because they’re about life, not themselves. They’re Neo-Modernist because they avoid the problems of Modernism: their structures make no pretense to weight, as Joyce would have us look for profundity in Homeric parallels, and they cease trying to replace or compete with escapist literature—which will always be with us.

There’s nothing wrong with escape, every now and again, just as dessert is fine as dessert. It’s just that you can’t live in that world, and can’t use escapist literature to understand life, only to run from it. What happens when you come back to reality? Flaubert thought you committed suicide. I don’t think you have to do that: I think there’s another alternative, that you understand the nature of life. This is the process Neo-Modernism sets out to exemplify.


BF



