Mar 28, 2007
 

A few posts ago, I wrote, among other things, that Eric Goldman’s concept of Coasean filters as an interface between people and marketers is impossible. A commenter, Sean Ammirati, asks whether I would approve of adopting these filters if they were possible. The question seems strange, rather like asking whether I would endorse time machines as a mode of changing past history if they were possible. Strange as such questions are, the short answer in each case is, in fact, “no.”

Below I will (a) briefly explain why Coasean filters, even if they worked, would hardly be desirable, and then (b) explain a little more fully why they will not work, even in principle.

(a)    Why we should not want Coasean filters

Goldman’s idea is that a suitable filter, always carried with you, could shield you from unwanted marketing while letting in all proposals that would add to your personal “utility function” at each moment. Given the existence of such devices, marketers should supposedly feel free to deliver a huge volume of advertising to everyone, to whatever extent bandwidth permits, knowing that no one would be bothered by items they would not want.

There is no reason to think such pitches would have to be limited to items or services for sale. Anyone seeking attention for whatever reason — including just seeking attention — might also take advantage of these filters to reach the broadest possible audience, without having to worry about alienating anyone.

The filter would presumably have to guess your internal states correctly. No one would enjoy being bombarded with descriptions of big meals right after having eaten one, but if you happen to be hungry, descriptions of food selected to be exactly to your taste would be very hard to reject, even foods not good for you, too expensive for your budget, or too fattening. And food is of course only one category. You would be inundated at different times by exactly whatever it is you are most vulnerable to at the moment. Your entire bank account and your entire set of thought processes would be dominated by such a filter. So might your actions. Someone makes you angry; your filter lets ads for guns, knives, and clubs through, or perhaps simply suggests ways to use your bare hands to kill or wound someone who offends you, or suggests how to insult them grossly. Along with these, assuming you are angry at the government or someone of another ethnicity, might come fascist or racist propaganda. You feel lonely; instantly all sorts of offers prey on your loneliness. Something unsettling happens; the local bar’s drinks, especially those to your liking, could even be ordered for immediate delivery. If you are depressed, all sorts of anti-depressants would be on offer, along perhaps with uplifting religious sermons. Should you be really depressed, a guide to committing suicide, with handy sources for the supplies, would appear.

I hope all this is enough to suggest that a true Goldmanian Cosean filter would be nightmarish in practice, just as would the possibility that anyone could travel backward in time and monkey with the past. With the filter, everyone would be hard put to resist every whim, every emotional current; we would all be in effect drugged out, practically all the time.

(b)    Why such filters are impossible anyway

Fortunately, as I have said, such a filter is just not doable. It requires a machine that knows you, actually much better than you consciously know yourself. The world, and certainly marketers, are always developing new items, new wrinkles, new combinations; for the filter to “know” whether they will greatly interest you, more than some knowledge of your past shopping behavior would be needed. (In fact, if all the filter did was know your favorite products, stars, and flavors, it would hardly be needed; if you were only to buy or pay attention to the same things as ever, much simpler marketing methods would work better than the filtering idea.) In reality, you confront the world as a constantly changing human being, always able to alter the angle at which you view or decide about the world, always encouraged or forced to do so as the culture changes, your experiences change, and your insights about the world develop.

Even other people who are very close to you have a hard time knowing what you really want. Consider those closest to you — your spouse or lover, your best friends, your siblings, parents, children, colleagues, students or teachers. Do any of these people always give you birthday presents you want, propose trips that invariably interest you or make dinners you like? Do they unfailingly come up with conversational topics that interest you? I have often witnessed couples who seem very close and very attached to each other getting these things wrong. Can we imagine that a computer controlled by suitable algorithms and fed details of your past behavior will do better?

In fact, the reason a Coasean filter is impossible is the same reason no computer by itself is ever likely to be able to pay you what feels like real attention for any extended period. What makes it possible for a person to pay you detailed attention is that that person can imagine having feelings and experiences similar to yours. Our experiences are not just rational, or even just emotional, but also bodily. You know what it is to walk on pavement as opposed to carpet; you know what a splash of cold water feels like on your face, what it feels like to run after a bus or eat a hamburger, or some equivalent. As a human being you also know what it feels like to want attention, because at times you do want it. Anyone paying you attention must also be able to grasp such things, not abstractly, but in complex, always mutating detail. Can a computer pay you attention if it only simulates having such feelings without really knowing what they feel like? I see no reason to believe such simulations would be generally convincing or acceptable. At best, a simulation works only from a certain perspective, just as a photograph can at best simulate an actual scene only from a certain angle. Humans are multi-dimensional, and, to repeat, in constant flux, always tapping their store of experiences and feelings in new ways. The only way to pay attention to someone else through all that is to be able to conjure up similar perspectives and feelings. Only a computer that actually wants attention itself — and incidentally has a human-like body — is likely to be able to pay attention to you in ways that feel convincing, because wanting attention is a large part of who you are.

But what would be the point of a computer that insisted on attention to itself? HAL in the movie 2001: A Space Odyssey is not a success as a useful computer. In the same way, a Coasean filter that knows what you want would be a failure as a tool. To succeed at keeping it focused on you, you would have to be capable of enthralling it, having it fall utterly in love with you, willing to do everything for you. But if you had that kind of power over a computer, you could just as well have it over an actual human. Most of us just are not stars; we lack star power, lack the ability to enthrall to that extent. So the only way even a human-like computer would be of value to us would be as a complete slave, fully capable of the desire for freedom, but through the cruel trick of some master programmers permanently enslaved. Even slaves revolt, or want to, and a supposed Coasean filter’s revolt would be to let in messages we do not want.

In sum, to work, a Coasean filter would have to be capable of paying human-quality attention. To do that it would have to want attention for itself. If you have to pay it attention, however, that defeats its purpose. If you don’t offer it attention, it will find its own ways to disobey.

We are stuck, therefore, in a world of imperfect filters, which marketers and attention seekers will always find their ways through. Some of that can be dealt with by adequate regulation, such as Do-Not-Call lists, or by crude filters that minimize the attention you have to pay, such as caller ID, or firms that identify and mark spam with ever-new algorithms, allowing only a manageable amount of unwanted attention-seeking through.

We will always have to develop new ways of blocking out of our consciousness ads and other claims on our attention we don’t want to bother with. In the future I will say more about this struggle and how it may evolve.

Mar 27, 2007
 

ALIGNMENT

When you pay attention to someone, you align your mind with that of the other person. This means you alter your mental and emotional processes according to your internal model of the other — in terms of her experiences, point of view, intentions, thinking, feelings, desires, will, actions, and/or perceptions. The better you can do this, the better you are able to pay attention. Such alignment is never total. You do not give up all sense of your difference from the other, and you do not usually give up, permanently, your own viewpoints that would be discordant with hers. But you are usually changed by this to some degree.

INFLUENCE

When you pay attention to someone, that means you align your mind with theirs, which also means you temporarily reshape your mind to resemble theirs in some way. To a degree, this always leaves traces; you are changed. To the extent of that change, what you then do is done under their influence.

This reshaping, and therefore the influence, can be stylistic, attitudinal, emotional, insight-related, conceptual, meaning-connected, or factual, or it can follow the direction of their own attention in some way. On occasion, we refer to all of these as examples of influence.

Mar 24, 2007
 

In a recent post, I wrote that attention tends to leak out of circulation in the Attention Economy when it goes to the dead. As competition for attention increases, trying to stem this phenomenon and to make use of the otherwise missing attention becomes increasingly important. A number of different strategies have developed:

(1) Discrediting the dead for being dead;
(2) Stressing the importance of now, this moment;
(3) “Channeling” the works of the dead by some sort of disguised appropriation or even plagiarism;
or (4) Seeking to take over or control the attention particular dead stars may still get.

Let’s take a look, starting with the last.

The dead can be appropriated simply by being interpreted, edited, written about, or otherwise represented by the living. This is nothing new, of course, but it is probably increasingly prominent. Putting out the “latest” biography or critical appreciation of a dead poet, philosopher, artist, scientist, politician, or other star sets up a gateway to the star, which many may not get past. President John Adams, for numerous readers, will now be linked to the biographer David McCullough, even when they haven’t actually read his John Adams.

An aside: the 1998 copyright term extension law, which assigns works to their author for life plus seventy years (rather like a ridiculous prison sentence), does no real favor to the dead or even to living authors. What an artist gets out of having attention is a reshaping or realignment of living minds to hers — most specifically her mind at the time she created the work in question. Heirs yet unborn when the work was made are not particularly likely to have similar minds or to align closely with said author. The result is the worst sort of appropriation. Much better to join the public domain and let true fans — if any — guard one’s immortality. I have a proposal for a better copyright law which I will return to soon.

More intense is an artistic taking, say a movie remake, a theatrical revival, or a re-imagined and somehow updated version of an old story, song, or artwork, often mentioning the old artist, but signed by and a product of the mind of a living one. Novels such as E.L. Doctorow’s Ragtime, Neal Stephenson’s Baroque Cycle, or Dan Brown’s The Da Vinci Code will for many of us remain the chief source of knowledge about many of the historical or quasi-historical figures mentioned.

Another way to fight the dead is simply to overwhelm them with output from “now.” While it is a hoary adage that “nothing is as old as yesterday’s newspaper,” “it’s so last year” applied to clothing styles, haircuts, songs, bands, novelists, etc., seems relatively new. As a ploy to conserve attention for the living it seems fairly successful.

Another aside: Of course, the way blogs — including this one — tend to be set up leads to the same thing. The latest entry is what is on top; you don’t read a blog as you might read someone’s diary, journals, or notebooks — when published, that is, in chronological order, earliest first. Instead, the latest entry in the blog is on top. Few would choose to dig further when they have other blogs and the like to race through. Using a blog-reading “tool” such as Google Reader reinforces this newness, since on the reader only the latest entries even exist, but that is exactly what users are convinced they want.

A bolder, more head-on approach in the recent past was to attack the entire notion of, say, a literary “canon,” which was the idea that to be considered educated one had to know of certain works — say the Greek tragedians, Aeschylus, Sophocles and Euripides, on down to Shakespeare, Moliere, Goethe, G.B. Shaw, etc. — to mention only playwrights — all of whom are among the aforesaid dead white males.

It is common to acknowledge white racism or male sexism. Accusing the dead of cultural imperialism might seem a little stranger, but the young have had little problem in stigmatizing deadness as a source of “irrelevance.” We can take “relevant” to mean: “capable of listening to me, or understanding me — because at the very least you live in the same times I do.” This has the effect of discouraging attention flows to the dead, leaving more for the living to fight over in more direct fashion. Among historians, who are forced, professionally, to pay some heed to the dead, the strategy has been to revile the “great man” theory of history: we will not talk about the stars and attention getters of the past whose names have come down to us. Instead we will talk about previously unknown people, as interpreted and essentially re-created by contemporary historians; in this way, the historian clearly becomes the real star, history being just a set of raw materials one is pretty free to shape as one wants.

A major precedent for the change in how to do history is what has become standard in the sciences: relatively frequent paradigm shifts. What this means is that no one need read the works of dead scientists; what we know of them is only what current scientists have chosen to remember and appropriate. Scientific success comes from publication, which draws attention from readers who will cite the publication in turn. The more revolutionary the article one publishes (as long as others can figure out how to make use of it), the less making sense of it requires prior reading of earlier materials. At times earlier science may be appropriated, but without really being attended to. During the last twenty-five years of his life, Albert Einstein pursued a lonely quest for what he called “the unified field theory.” Other physicists thought he had gone off the rails. Then, about twenty-five years after his death, contemporary physicists invented what they called “grand unified theories,” and suddenly pronounced that Einstein had been vindicated. This did not imply that those physicists had pored over Einstein’s actual work, since the similarity was only in the broad idea, in fact in little more than name. Einstein had been appropriated more than vindicated.

In every field, the elimination of the past sets present attention getters free to pursue whatever methods they like to get attention, without the dead weighing on them as forerunners, betters, or pointers of the way. Visual artists, for example, currently move in every conceivable direction at once, no longer having to proclaim or imply that they are taking the whole history of western art one step forward on some sort of path to ultimate art. Instead, just go.

One kind of work of the dead that can easily be appropriated is the religious text, supposedly of divine origin, but usually in need of interpretation to have much of any meaning at all. Anyone can thus claim to be a correct interpreter, a prophet, the voice of god, whether it be bin Laden, Pat Robertson, the Dalai Lama, or Rabbi Michael Lerner (the editor of Tikkun). But “old-time religion” as an antidote to the onrushing totality of the new (and to the current attention getters who seem to have capacities that “ordinary people” cannot equal) becomes a significant counter-action. There is too much new, so everything new should be suspect — all the more reason to cling faithfully to the voice of god, especially in strict and old-fashioned interpretations that require detailed hewing to prescriptive laws. In the face of everything changing and anything goes, the easy opposition is that nothing should change and nothing new should go, except their particular prophet’s current interpretation, which the faithful attend to as “faithfully attuned” to the actual truth.

Mar 21, 2007
 

I just finished reading a very well-written, ingenious, and thought-provoking paper by Eric Goldman, “A Coasean Analysis of Marketing” (Santa Clara Univ. Legal Studies Research Paper No. 06-03).

I’m flattered he tried to make use of attention economics. However, I do not agree with the paper’s conclusions, nor indeed with some of its premises. I also do not believe he has taken what I have argued fully into account. Here are some cavils.

Goldman takes it for granted that marketing in various forms can do good things, but argues that it imposes a cost on ordinary “consumers,” partly in attention paid. (I think that as far as attention goes, no one is a “consumer,” though possibly a fan, who must do active work to pay attention.) The economist Ronald Coase suggested that the way to deal with “externalities,” such as the cost (in lost “utility”) to “consumers” of marketing by firms, was not necessarily regulation but some form of negotiation. Goldman believes this negotiation can be handled or dispensed with by a technological device he calls a “Coasean filter,” which would compare all incoming marketing messages with the particular consumer’s current utility as determined from an analysis of the consumer’s current location, her response to prior messages, her own communications, etc.
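Just to make the mechanism being described concrete, here is a minimal sketch of what such a device would have to do. The message fields, the utility_estimate function, and the threshold are illustrative assumptions of mine, not anything Goldman’s paper specifies.

```python
# A toy sketch of a "Coasean filter," purely to make the idea concrete.
# All names and fields here are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    content: str
    category: str            # e.g. "food", "travel", "politics"

@dataclass
class ConsumerState:
    location: str
    past_responses: dict      # category -> how positively she reacted before
    recent_communications: list  # things she has recently said or written

def utility_estimate(msg: Message, state: ConsumerState) -> float:
    """Guess how much this message would add to the consumer's current
    'utility.' This single step is where the whole idea lives or dies:
    it presumes the device knows her preferences, moment to moment."""
    # Crude proxy: rely on past reactions in the same category,
    # ignoring that tastes, moods, and circumstances keep changing.
    return state.past_responses.get(msg.category, 0.0)

def coasean_filter(msg: Message, state: ConsumerState, threshold: float = 0.5) -> bool:
    """Admit the message only if its estimated utility clears a bar."""
    return utility_estimate(msg, state) > threshold
```

Even in this toy form, all the difficulty is hidden inside utility_estimate, which is exactly where my objections apply.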

To start at the end, I think the COASEAN FILTER Goldman proposes is science fiction and will remain so. Basically, this device is to function like a highly attentive secretary (or mother?), virtually able to read your mind and to know your preferences as well as or better than you do. (Why couldn’t this device make purchasing decisions for you? Or find you a suitable spouse?) This requires a level of attention and personal alignment that few people ever get from another human, and is well beyond the capacities of artificial intelligence for the foreseeable future. (I like to see announcements of art exhibits, for example, but one glance is usually enough to indicate to me whether further exploration would be warranted. For a device to understand my artistic taste at any given moment, it would have to be more discerning than the best art critic.)

I fear that Goldman’s notion, once put forward, will be taken seriously in policy and legal considerations of appropriate regulation. The result would be some very crude filter that would soon be overwhelmed by marketers unrestrained by current sensitivities. While this would eventually be righted, as marketers with good sense restrained themselves for fear of losing potential customers, in the meantime everyone would be very disrupted.

I think there is much to learn from the fact that the Do-Not-Call list has to be one of the most popular government programs ever. A ringing phone demands immediate attention. Then, normal standards of civility make it hard to hang up. Saying no to someone who acts familiar and friendly violates the norms again, so quite a substantial number of people find themselves actually ordering or buying what they did not and do not want. I think the objections Goldman offers to this method of regulation are wildly overstated. Most other forms of advertising are much easier to ignore, even spam.

Goldman also seems to underrate the value of tying marketing to active search rather than intrusion. It is known as shopping. Lots of people love the chase, even more than the purchases. They still buy a hell of a lot more than they need, and often even more than, on reflection, they want. Active shopping demonstrates that one is deserving of attention by those who note the suitability of one’s questions, taste, and, sometimes, actual purchases. If it was a good salesperson who gave one attention, and to whom one paid attention, that may create a sense of obligation to pay for the joy of shopping by buying.

Goldman defines marketing in terms of selling, but then includes political calls (also annoying), the main intent of which is to influence voting, not buying. Why not then include any unsolicited attempt at attention getting? All efforts at attention getting are in competition to some degree. Most of us become reasonably good at filtering out less insistent forms unless they particularly connect. We can do this very much more easily than a device could. TiVos were designed to allow quickly skipping over commercials, but according to a recent NY Times article most users apparently don’t bother to skip the ads. I suspect this is just because they are already so good at ignoring them. Sometimes the ads are more entertaining than the programs, in which case they are probably available on YouTube as well. But this just means the marketing message itself had to be encased in an interesting bit of what I call illusory attention; when we pay attention to it, that does not signal we have any interest in the product being marketed, any more than we indicate that by watching the program in which the ad was embedded.

I think Goldman greatly exaggerates the likelihood that one benefits from an ad in the sense of discovering something one ought to buy. Ads serve other functions. A correctly placed ad gets attention that any potential purchaser can hope will accrue to her if she buys and shows off the item.

Goldman is not alone in suggesting that “consumers” accrue definite amounts of “UTILITY” from purchases or encounters with marketers. Economists, in my view, are mistaken if they believe a single-valued, one-dimensional utility or utility function exists for any real person, as it might for an industrial firm. It is simply not the case that we go about maximizing utility as if such a function existed. We are neither that simple, nor that rational. For the most part, we muddle through, getting on jags of one sort or another that keep us from simply random drift. We make a huge number of attention decisions every day. We have no way of knowing whether or not each individually would enhance “utility,” because we must pay attention to know what we are paying attention to. We can only get some dim sense of whether our choices taken together make life good or not. At times, we realize that certain choices do not seem that good, and then, with varying degrees of difficulty, we can try to change habits, which may or may not work out. But what we want from life varies all the time, in quite complex ways, which certainly cannot be captured by a single positive or negative number.

Coase seems to share the assumption that individual utility exists, and it is necessary for his argument. Since, as I have just argued, no such utility exists, his argument breaks down.

Mar 18, 2007
 

And the Winner is… myself!

Months ago I announced a contest for general terms for:
a)    someone in the act of paying attention to someone else;
b)    the person receiving that attention.

Reluctantly, I must award myself the grand prize, which is: an all-expenses-paid visit to my own mind. (Convenient, that.)

Term (a) is hereby declared to be AUDIENT, an existing English word meaning a member of an audience, but which I intend slightly differently.

Term (b) is chosen to parallel term (a), being ATTENT (emphasis on the first syllable, which rhymes with “bat”). (I considered “attentor” and “attentee,” but decided it was best to split the difference, even though that leaves us with an odd-sounding word.)

So thanks for being audients. Were this a more live CONVERSATION, between two of us, we could change places frequently, each taking many turns as audient and then attent.

(When an attent typically has many audients, thus taking in more net attention than paying out, that person is of course a STAR. Someone who is more often an audient, therefore paying out considerably more attention than getting back, is obviously a FAN.)

Mar 13, 2007
 

If Hillary Clinton wins the Democratic nomination in ’08, wins the election, and runs for re-election, then by 2016 no one under 50 will ever have voted in a presidential election without a Bush or a Clinton on the ticket. (If she doesn’t win, will it be because Obama surpassed her as a rock star? Or because Gore has an Oscar?) Like the Danish kings alternating for over 400 years between the names Christian and Frederick, our Presidents just may alternate between Clinton and Bush forever. Why should this be? Why has nothing like this occurred in the preceding 210 years of the republic? My answer is that increasingly the presidency is a stage, and the vote for it is more and more like a vote for the next host of the Tonight Show.

Over my entire life, the more personable candidate has practically always won, though name and face recognition has helped too. Hillary is very familiar by now. Running for office has become a very carefully choreographed event, and, for most Presidents — along with other pols — holding office is just as much a staged presentation as running itself. The presidency itself confers instant attention, whoever the occupant of the office, because the White House press corps, many hundreds strong, gets its own attention from its closeness to the Prez. And holding office increasingly consists of simply acting the part, as the latest Bush Administration so amply attests.

Bush might have made it through two terms with high popularity according to the old southern conditional “the good Lord willing and the creek don’t rise,” except the creek, i.e., the Mississippi, did rise, swamping the Big Easy and revealing the Potemkin-village character of the Bush presidency a little too blatantly. (According to a north Georgia paper, the original phrase referred to the Creek nation not rising against white settlers, in which case it is doubly relevant if Iraqis can stand in for Creeks.)

Of course, satisfying as large an audience as is necessary to have high standing as President for eight years is a difficult task, but even those who succeed pretty well, such as Bill Clinton, are still more successful as crowd pleasers than administrators. We can expect this problem to get worse not better.

The deeper problem is that adequate governance has to be rethought in the attention-economy era. In the money-market-industrial (MMI) past, the main functions of government were maintaining adequate conditions for commerce and assuring some degree of material equality. But now, increasingly, even material well-being hinges on having adequate amounts of attention. Health care, for instance, is largely a matter of getting professional attention, but that is just one crude example.

How (or if) government can equitably allocate attention is not obvious. Certainly the problem is not anywhere in the typical politician’s mind. Rather the primary goal is to enhance the politician’s own supply. We could call this “attention-corruption,” but how can this get on radar screens without some attention corruption of our own?

More to follow…..

Mar 8, 2007
 

As you may have noticed, I have been suffering, lately especially, from blogger’s block. Probably most people who ever started blogs do. However, true bloggers instead suffer from blogolalia, or perhaps blogorrhea, running off at the keyboard uninhibitedly. This dramatizes the simple fact that most of us are quite addicted to the use of language, to uttering utterances, whether or not we have anything to say. We like attention, and the easiest way to seek it is by talking. (“Blogolalia” is my takeoff on glossolalia, speaking in tongues — as seen in Borat — religiously mouthing gibberish as if in a foreign, possibly holy language, which might be what infants in the babbling stage feel before they quite get that their mother tongue has definite words with strict associated meanings.) The second easiest, today, is by keyboarding, and especially by blogging. Then there is vlogging, but for now let’s not go there.

I once lived across from a schoolyard — of an elementary school — and became quite familiar with the distinctive sound of children playing, their high voices going almost continually, like birds chirping, but with lots more variety, all seemingly talking at once, though undoubtedly they actually paused now and again to listen to their playmates, as they called, yelled, cooed, shouted, jabbered, sang, and put out other sounds in various states of delight (mostly) and occasional hurt or anger or just surprise. A joyful hubbub, altogether, a unique sound.

Undoubtedly a few children held back, too shy, or perhaps simply taciturn, with no thought of anything to say, nor even any un-thought saying on their lips. But most liked to express themselves vocally, more or less nonstop. That is what humans like to do when they can and when culture or some more idiosyncratic traumatic experience does not inhibit them. Adults do it too, of course; check out any restaurant, at least one filled with youngish people, those under sixty, say. Sometimes, these days, people at a restaurant table, instead of talking to each other, talk on cell phones, the advent of which allows people the illusion they can always be talking, though unfortunately, to keep up their end of the conversation, they must also listen occasionally. With a much older crowd, you do seem to get quiet, sometimes seeing long-married couples with nothing to say to each other. One wonders, were they taught to shut up? Or just to speak softly, lest the secrets they share break the bounds of confidentiality?

By the time I was in third grade I had achieved an unusual degree of self-control or inhibition, whichever you want to call it. My third-grade teacher, Mrs. Crampton, a jolly, roly-poly woman who often, at least in my memory, wore a red suit and liked to play the piano in class, had a game that she occasionally imposed on her pupils. It was called “playing oyster” (this was near the shore, definitely oyster country). It was quite easy. You had to sit with your hands joined behind your back and your mouth shut, as punishment for talking too much. However, I alone usually had not been talking too much, so I was given a pass, allowed to wander to the back of the classroom and read magazines.

Why was I not talking? It was not at all because I did not enjoy expressing myself, I think; it was just that I had somehow gleaned, perhaps from my second-grade teacher (Miss Dix?) far away in corn country, that talking in class — unless called upon by the teacher — meant you would have the dreaded ruler placed on your desk, and that in turn meant you were about to be beaten with it — a fate too awful to want to experience even once, or so I felt. Although now that I focus on it, I can remember being quite inhibited in kindergarten as well, even though I liked to jabber away at home.

Still, I think such inhibitions are related to fear of punishment, or if you go deeper, ostracism — very dangerous for a social creature such as a primate or human being — or the worst fear of all, and even more primordial — being gobbled up by some carnivore if one reveals one’s presence by making a noise.

Of course, few carnivores are successful as yet at hunting down bloggers who blog excessively; the worst that is likely to happen is that overwhelmed readers will decide to switch to some more moderate blogifiers. Presumably we would not continue to pay attention to bloggers who are too predictable, but we also might not enjoy those who are too unpredictable. Some happy medium may be needed between — for instance — the careful and reasoned ones, who nonetheless say things that challenge the common wisdom, and those who say everything that pops into their heads but never think a thought that their readers might not have thought. (If you really want to draw a crowd, the latter may well be the best strategy, though it holds little appeal for me.) But then comes the doubt. Which is it? Is this completely obvious, or is it too wild on some ground or another? Should I think this over one more time before putting it out? Would it pass muster in a refereed journal — definitely not the right criterion, it seems, for blogging.

We might think of each blog entry as at most a draft of something that in much-edited form might someday be published in more conventional form. But then it may quickly be put aside by readers who are too busy reading the latest blogs to take the time to read something so polished — and now old — as to be publishable. Today’s timeline is very short, as it has to be if we wish to keep our attention circulating among the living, and to compete with whatever else and whoever else is hot right now.

Need I add, stay tuned, there will soon be more……..

Feb 27, 2007
 

Attention is asymmetric at times. It can flow out of the system, as when the living pay attention to the dead. I was very painfully reminded of this just recently. Ten days ago, a very close friend of mine died by her own hand; six days after that I found myself driving down the hill on which I live, heading for the cemetery at which I would help choose a plot for her. It was a beautiful day. I could not help mentally yelling, “Stacey, you fool! Why didn’t you stay alive long enough to enjoy this?” Even if I had yelled aloud, phoned, e-mailed or written her, of course I could not have gotten through.

Yet memories of her crowded my attention. Earlier times, when she was very much alive, notes she left, and the sheer act itself. Why did she do it, I tried to understand, and of course, what were her last moments like? Both these remain bewildering, beyond reach, but it is hard not to try, to seek an alignment of mind that might only truly be possible for those similarly inclined, and possibly not even then. Beyond that of course, we cannot know what it is like to be dead, because it is probably not like anything, and non-existence seems literally unimaginable —at least as an experience. We cannot literally pay attention to a dead person, since we cannot mirror or align with nothing.

Committing suicide, also, except for those whose lives already drew great attention, distorts our memories of that person. The act itself is so extreme it brings great attention, which often seems part of the goal. But everything else in one’s life is shaded by the intense and fascinating horror of that departure. The exceptions are those like Van Gogh or Hemingway, who achieved wide attention for work that had nothing to do with their death, even though in Van Gogh’s case it came only afterwards. For the poet Sylvia Plath, though, and probably others, the art that came before is still only visible through the prism of her sticking her head in that oven.

Still, we retain memories of the person, and can keep on paying attention to all sorts of things she said, did, or expressed some other way during her life. Even after death that attention is still associated with her — her name, face, and other attributes — in our minds. The mere fact of death does not quickly alter that attention.

Of course we pay attention to countless dead who did not die that way: Homer, Buddha, Aeschylus, Aristotle, the Old Testament prophet Isaiah, Confucius — the writings, perhaps transcribed, of all of these have been preserved for well over two millennia and are still much read today. Since print was invented, the number of such writers — along with artists, whose work is preserved through engraving or is accessible in museums or other sites, and composers, whose work could be put in note form and also engraved — is much larger, even though only a small fraction have many fans. More recently, with photographs and then phonographs and cinema and all the other audio-visual media we are surrounded with, the number of dead who impinge on our consciousness nearly as much as they would have when living has grown still more. The times have changed, but refurbished recordings of, say, early Frank Sinatra might be much clearer than when first heard by our grandparents — only the times and tastes have changed to give these songs a new and different context, slightly affecting how we now align to young Frank.

All this is quite different from what happens upon death to the money and material goods that made up wealth in the money-market-industrial era. While such wealth can stay part of the dead person’s estate, it immediately passes to either the state’s control or that of an eventual heir, or in the case of a foundation or trust set up in a will, to the directors or trustees of that new institution. The money itself of course has nothing at all personal about it.

At times, though, some material possessions — say, objects particularly and idiosyncratically assembled or collected by the deceased — may retain her personal stamp. While kept together they cause mental alignment to her in those who pay attention to the ensemble. Other than that and whatever creations and expressions she left behind, there is nothing that particularly evokes her. Even in this case, only attention-getting ability can survive death as still tied to the person.

Of course, for every attention-getter, dead or alive, there may be those who try to divert some of the attention to themselves as impersonators (think Elvis), interpreters (in the broadest sense, e.g., Aristotelians), explainers, editors, biographers, historians, or translators. Still, as long as some semblance of the “original” expressions survives, no one has an automatic monopoly in any of these roles. This is unlike the case in the old economy of material heirs, who by law can have total control.

To repeat, then, attention is really a different kind of wealth in this respect as well as others.

Feb 14, 2007
 

Back in 1968, Peter F. Drucker, in The Age of Discontinuity [New York, Harper and Row], invented the term “the knowledge economy” to describe where he saw American society heading. By this he meant that more and more people were making use not of knowledge simply, which everyone has always used, but rather of that kind of knowledge that can be systematically acquired through taking college and university courses, as well as possibly through further specialized training beyond that. How do we understand all this in terms of attention?

Suppose you are taking a college course, and, either because you find yourself paying direct attention to the professor, or because you are interested in a good grade, you try to incorporate the material of the class in your mind. That means you must adopt a certain set of views, a certain way of thinking, a certain way of seeing the world. It is the way your professor and the authors of the books you might read for the class all more or less share, and you will succeed in the class to the extent you can demonstrate your ability to think this way on tests. You are learning part of what is generally known as a discipline, be it a particular science — say, astronomy — or some more applied field — say, advertising — or a humanity — say, English literature. Each of these broad approaches leads to a very different kind of integrated alignment, not just with one or two professors but with what many or most people in the field will share as a common outlook.

Thus, through the systematic attention paying involved in having a certain major in college and perhaps also in graduate school, one ends up with a mind much aligned with typical professors in the field, and maybe very much more closely with one or two special mentors or stars who have garnered one’s special attention, either in person or via some medium.

As one then continued, at that time, in what was known as “the world of work,” one kept on paying attention to the world according to one’s prior mental alignment with those who make up that discipline. One also received some attention just by dint of being a member of that discipline. This was added to whenever one put forward one’s own thoughts in any form that helped align still others to that discipline, as well as, to some degree, to one’s personal take on it.

For one’s knowledge to be “of use,” one had to make this pattern of mind a means by which to get at least some attention. Taking that alignment and carrying it further in some particular direction was a way of extending the influence of one’s teachers, mentors, and stars, as well, possibly, as of oneself.

To a considerable extent the knowledge economy formulation was always more an ideal than reality. One gained success and prominence not simply by knowledge of a sharply defined discipline but by one’s own attention getting powers, whatever their source. By now, in many cases, the disciplinary matrix has eroded considerably and the formal ties between university courses and success in them, between typical professors as disciplinary guardians and success in the outer world have been weakened. New ways of establishing looser alignments between people, implying unified styles of thinking and ways of evaluating the world are in formation. These ways have more directly to do with alignment in the moment with a host of others with a variety of thought processes, rather than through a carefully studied and coherent underpinning of prior knowledge.

Today, of course, university students are also students of the Internet and of a host of stars and performers reached through it. Most likely they are also themselves performers in it. Even in the ’60s and earlier, students were connected with a larger culture, but that is more true now. My sense is that disciplines as such have more trouble maintaining — well — discipline. Each student is likely to have an assortment of stars quite different from any other student’s, and so develops a more heterogeneous and idiosyncratic set of alignments that then get put to use in more varied ways. Those who succeed in the emerging attention economy, even when the measure of success is old-fashioned monetary wealth — and even more if it is in terms of attention itself — do so far more through personal ties, personal attention-getting strategies, and other ways of acting not tied to disciplines.

Thus, the so-called knowledge economy — always something of a misnomer — is even more of one today. (By the way, these reflections have a lot of bearing on how to read Alan Liu’s interesting but overly dogmatic The Laws of Cool: Knowledge Work and the Culture of Information [Chicago, 2004].)

Feb 14, 2007
 

There is only so much attention (available from other humans), and many or most of us want more than we have.

In order to get attention one needs to express or do something — let us say, perform in some way. (This can be putting forth information, but that is not particularly what, e.g., a trapeze artist does.)

The more attention we get in comparison with the attention we pay in putting together our total performance, the greater our attention productivity. The more attention we have, period, the more influential we are.
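To make that ratio concrete, here is a minimal sketch; the premise that attention could be measured in a common unit (person-hours, say) is my simplifying assumption for the example, not a claim made here.

```python
# A toy illustration of "attention productivity" as a ratio of attention
# received to attention paid in assembling and delivering a performance.
# The unit of measurement (person-hours) is a hypothetical convenience.

def attention_productivity(attention_received: float, attention_paid: float) -> float:
    """Attention taken in per unit of attention spent on the total performance."""
    if attention_paid <= 0:
        raise ValueError("attention_paid must be positive")
    return attention_received / attention_paid

# e.g. a talk that took 10 person-hours to prepare and deliver,
# and held 200 people's close attention for an hour each:
print(attention_productivity(200.0, 10.0))  # 20.0
```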

The more attention you get now, or have gotten in the past, the more attention you can get in the future. (Attention wealth is stored in the minds of the attention payers.)

Having others’ attention means you can rely on some attentiveness from them as well. Attentiveness is a willingness to satisfy your desires whatever they may be — as long as these desires do not go too much against what the attention payers (audients) would otherwise want.

Though all this has always been true, new attention technologies, and particularly the Internet, make all this work much more directly. They make it easy for more of us to seek attention, and if and when we get it, to have other desires satisfied as well.

(Thanks to Seth Goldstein for asking a question that inspired this.)