28 January, 2007

2 great articles - ALTRUIST & Future of USA

MUST HEAR AUDIO -- download the Indymedia Radio interviews with these two authors:

David Graeber or mp3 direct

Chalmers Johnson or mp3 direct

Army of Altruists

On the alienated right to do good

You know, education, if you make the most of it, you study hard, you
do your homework and you make an effort to be smart, you can do well. If
you don’t, you get stuck in Iraq.

–Sen. John Kerry (D., Mass.)

Kerry owes an apology to the many thousands of Americans serving in
Iraq, who answered their country’s call because they are patriots and not
because of any deficiencies in their education.

–Sen. John McCain (R., Ariz.)

In the lead-up to the midterm elections, the Republicans’ single fleeting
ray of hope was a botched joke by Senator John Kerry. The joke was
obviously aimed at George W. Bush, but they took it to suggest that Kerry
thought only those who flunked out of school end up in the military. It
was all very disingenuous. Most knew perfectly well that Kerry’s real
point was to suggest that the president wasn’t very bright. But the right
smelled blood. The problem with “aristo-slackers” like Kerry, wrote one
blogger on the website of National Review, is that they assume “the troops
are in Iraq not because they are deeply committed to the mission (they
need to deny that) but rather because of a system that takes advantage of
their lack of social and economic opportunities…. We should clobber them
with that ruthlessly until the day of the election–just like we did in ‘04–
because it is the most basic reason they deserve to lose.”

In the end, it didn’t make a lot of difference, because most Americans
decided they were not deeply committed to the mission either–insofar as
they were even sure what the mission was. But it seems to me the question
we should really be asking is: why did it take a military catastrophe (not
to mention a strategy of trying to avoid any association with the sort of
north-eastern elites Kerry typifies for so many Americans) to allow the
Democrats to finally emerge from the political wilderness? Or, in other
words: why has this Republican line proved so effective?

It strikes me that to get at the answer, one has to probe far more deeply
into the nature of American society than most commentators are willing to
go. We’re used to reducing all such issues to an either/or: patriotism
versus opportunity, “values” versus bread-and-butter issues like jobs and
education. But I would argue that to frame things this way plays into the
hands of the right. Certainly, many people do join the army because they
are deprived of opportunities. But the real question to be asking is:
opportunities to do what?

Let me offer an anthropological perspective on the question. It first came
home to me a year or two ago when I was attending a lecture by Catherine
Lutz, a fellow anthropologist from Brown University who has been studying
U.S. military bases overseas. Many of these bases organize outreach
programs, in which soldiers venture out to repair schoolrooms or to
perform free dental checkups for the locals. These programs were created
to improve local relations, but they were apparently at least as effective
in their psychological impact on the soldiers, many of whom would wax
euphoric when describing them: e.g., “This is why I joined the army,”
“This is what military service is really all about–not just defending your
country, but helping people.” The military’s own statistics point in the
same direction: although the surveys do not list “helping people” among
the motives for enlistment, the most high-minded option available–“to do
something to be proud of”–is the favorite.

Is it possible that America is actually a nation of frustrated altruists?
Certainly this is not the way that we normally think about ourselves. Our
normal habits of thought, actually, tend toward a rough and ready
cynicism. The world is a giant marketplace; everyone is in it for a buck;
if you want to understand why something happened, first ask who stands to
gain by it. The same attitudes expressed in the back rooms of bars are
echoed in the highest reaches of social science. America’s great
contribution to the world in the latter respect has been the development
of “rational choice” theories, which proceed from the assumption that all
human behavior can be understood as a matter of economic calculation, of
rational actors trying to get as much as possible out of any given
situation with the least cost to themselves. As a result, in most fields,
the very existence of altruistic behavior is considered a kind of puzzle,
and everyone from economists to evolutionary biologists has become famous
through attempts to “solve” it–that is, to explain the mystery of why bees
sacrifice themselves for hives or human beings hold open doors and give
correct street directions to total strangers. At the same time, the case
of the military bases suggests the possibility that in fact Americans,
particularly the less affluent ones, are haunted by frustrated desires to
do good in the world.

It would not be difficult to assemble evidence that this is the case.
Studies of charitable giving, for example, have shown the poor to be the
most generous: the lower one’s income, the higher the proportion of it
that one is likely to give away to strangers. The same pattern holds true,
incidentally, when comparing the middle classes and the rich: one study of
tax returns in 2003 concluded that if the most affluent families had given
away as much of their assets as even the average middle-class family,
overall charitable donations that year would have increased by $25
billion. (All this despite the fact that the wealthy have far more time
and opportunity.) Moreover, charity represents only a tiny part of the
picture. If one were to break down what typical American wage earners do
with their disposable income, one would find that they give much of it
away, either through spending in one way or another on their children or
through sharing with others: presents, trips, parties, the six-pack of
beer for the local softball game. One might object that such sharing is
more a reflection of the real nature of pleasure than anything else (who
would want to eat a delicious meal at an expensive restaurant all by
himself?), but this is actually half the point. Even our self-indulgences
tend to be dominated by the logic of the gift. Similarly, some might
object that shelling out a small fortune to send one’s children to an
exclusive kindergarten is more about status than altruism. Perhaps: but if
you look at what happens over the course of people’s actual lives, it soon
becomes apparent that this kind of behavior fulfills an identical
psychological need. How many youthful idealists throughout history have
managed to finally come to terms with a world based on selfishness and
greed the moment they start a family? If one were to assume altruism were
the primary human motivation, this would make perfect sense: The only way
they can convince themselves to abandon their desire to do right by the
world as a whole is to substitute an even more powerful desire to do right
by their children.

What all this suggests to me is that American society might well work
completely differently than we tend to assume. Imagine, for a moment, that
the United States as it exists today were the creation of some ingenious
social engineer. What assumptions about human nature could we say this
engineer must have been working with? Certainly nothing like rational
choice theory. For clearly our social engineer understands that the only
way to convince human beings to enter into the world of work and the
marketplace (that is, of mind-numbing labor and cutthroat competition) is
to dangle the prospect of thereby being able to lavish money on one’s
children, buy drinks for one’s friends, and, if one hits the jackpot,
spend the rest of one’s life endowing museums and providing AIDS
medications to impoverished countries in Africa. Our theorists are
constantly trying to strip away the veil of appearances and show how all
such apparently selfless gestures really mask some kind of self-interested
strategy, but in reality American society is better conceived as a battle
over access to the right to behave altruistically. Selflessness–or, at
least, the right to engage in high-minded activity–is not the strategy. It
is the prize.

If nothing else, I think this helps us understand why the right has been
so much better, in recent years, at playing to populist sentiments than
the left. Essentially, they do it by accusing liberals of cutting ordinary
Americans off from the right to do good in the world. Let me explain what
I mean here by throwing out a series of propositions.

PROPOSITION I: NEITHER EGOISM NOR ALTRUISM IS A NATURAL URGE; THEY IN FACT
ARISE IN RELATION TO EACH OTHER AND NEITHER WOULD BE CONCEIVABLE WITHOUT
THE MARKET

First of all, I should make clear that I do not believe that either egoism
or altruism is somehow inherent in human nature. Human motives are rarely
that simple. Rather, egoism and altruism are ideas we have about human
nature. Historically, one has tended to arise in response to the other. In
the ancient world, for example, it is generally in the times and places
that one sees the emergence of money and markets that one also sees the
rise of world religions–Buddhism, Christianity, and Islam. If one sets
aside a space and says, “Here you shall think only about acquiring
material things for yourself,” then it is hardly surprising that before
long someone else will set aside a countervailing space and declare, in
effect: “Yes, but here we must contemplate the fact that the self, and
material things, are ultimately unimportant.” It was these latter
institutions, of course, that first developed our modern notions of
charity.

Even today, when we operate outside the domain of the market or of
religion, very few of our actions could be said to be motivated by
anything so simple as untrammeled greed or utterly selfless generosity.
When we are dealing not with strangers but with friends, relatives, or
enemies, a much more complicated set of motivations will generally come
into play: envy, solidarity, pride, self-destructive grief, loyalty,
romantic obsession, resentment, spite, shame, conviviality, the
anticipation of shared enjoyment, the desire to show up a rival, and so
on. These are the motivations impelling the major dramas of our lives that
great novelists like Tolstoy and Dostoevsky immortalize but that social
theorists, for some reason, tend to ignore. If one travels to parts of the
world where money and markets do not exist–say, to certain parts of New
Guinea or Amazonia–such complicated webs of motivation are precisely what
one still finds. In societies based around small communities, where almost
everyone is either a friend, a relative, or an enemy of everyone else, the
languages spoken tend even to lack words that correspond to
“self-interest” or “altruism” but include very subtle vocabularies for
describing envy, solidarity, pride, and the like. Their economic dealings
with one another likewise tend to be based on much more subtle principles.
Anthropologists have created a vast literature to try to fathom the
dynamics of these apparently exotic “gift economies,” but if it seems odd
to us to see, for instance, important men conniving with their cousins to
finagle vast wealth, which they then present as gifts to bitter enemies in
order to publicly humiliate them, it is because we are so used to
operating inside impersonal markets that it never occurs to us to think
how we would act if we had an economic system in which we treated people
based on how we actually felt about them.

Nowadays, the work of destroying such ways of life is still often done by
missionaries–representatives of those very world religions that originally
sprang up in reaction to the market long ago. Missionaries, of course, are
out to save souls; but they rarely interpret this to mean their role is
simply to teach people to accept God and be more altruistic. Almost
invariably, they end up trying to convince people to be more selfish and
more altruistic at the same time. On the one hand, they set out to teach
the “natives” proper work discipline, and try to get them involved with
buying and selling products on the market, so as to better their material
lot. At the same time, they explain to them that ultimately, material
things are unimportant, and lecture on the value of the higher things,
such as selfless devotion to others.

PROPOSITION II: THE POLITICAL RIGHT HAS ALWAYS TRIED TO ENHANCE THIS
DIVISION AND THUS CLAIMS TO BE THE CHAMPION OF BOTH EGOISM AND ALTRUISM
SIMULTANEOUSLY. THE LEFT HAS TRIED TO EFFACE IT

Might this not help to explain why the United States, the most
market-driven, industrialized society on earth, is also among the most
religious? Or, even more strikingly, why the country that produced Tolstoy
and Dostoevsky spent much of the twentieth century trying to eradicate
both the market and religion entirely?

Whereas the political left has always tried to efface this distinction–
whether by trying to create economic systems that are not driven by the
profit motive or by replacing private charity with one or another form of
community support–the political right has always thrived on it. In the
United States, for example, the Republican Party is dominated by two
ideological wings: the libertarians and the “Christian right.” At one
extreme, Republicans are free-market fundamentalists and advocates of
individual liberties (even if they see those liberties largely as a matter
of consumer choice); on the other, they are fundamentalists of a more
literal variety, suspicious of most individual liberties but enthusiastic
about biblical injunctions, “family values,” and charitable good works. At
first glance it might seem remarkable that such an alliance manages to
hold together at all (and certainly they have ongoing tensions, most
famously over abortion). But, in fact, right-wing coalitions almost always
take some variation of this form. One might say that the right’s approach
is to release the dogs of the market, throwing all traditional verities
into disarray; and then, in this tumult of insecurity, offer themselves
up as the last bastion of order and hierarchy, the stalwart defenders of
the authority of churches and fathers against the barbarians they have
themselves unleashed. A scam it may be, but it is a remarkably effective
one; and one result is that the right ends up seeming to have a monopoly
on value. It manages, we might say, to occupy both positions, on either
side of the divide: extreme egoism and extreme altruism.

Consider, for a moment, the word “value.” When economists talk about value
they are really talking about money–or, more precisely, about whatever it
is that money is measuring; also, whatever it is that economic actors are
assumed to be pursuing. When we are working for a living, or buying and
selling things, we are rewarded with money. But whenever we are not
working or buying or selling, when we are motivated by pretty much
anything other than the desire to get money, we suddenly find ourselves in
the domain of “values.” The most commonly invoked of these are, of course,
“family values” (which is unsurprising, since by far the most common form
of unpaid labor in most industrial societies is child-rearing and
housework), but we also talk about religious values, political values, the
values that attach themselves to art or patriotism–one could even,
perhaps, count loyalty to one’s favorite basketball team. All are seen as
commitments that are, or ought to be, uncorrupted by the market. At the
same time, they are also seen as utterly unique; whereas money makes all
things comparable, “values” such as beauty, devotion, or integrity cannot,
by definition, be compared. There is no mathematical formula that could
possibly allow one to calculate just how much personal integrity it is
right to sacrifice in the pursuit of art or how to balance
responsibilities to your family with responsibilities to your God.
(Obviously, people do make these kinds of compromises all the time. But
they cannot be calculated.) One might put it this way: if value is simply
what one considers important, then money allows importance to take a
liquid form, by enabling us to compare precise quantities of importance
and trade one off for the other. If someone does accumulate a very large
amount of money, the first thing he or she is likely to do is to try to
convert it into something unique, whether it be Monet’s water lilies, a
prizewinning racehorse, or an endowed chair at a university.

What is really at stake here in any market economy is precisely the
ability to make these trades, to convert “value” into “values.” All of us
are striving to put ourselves in a position in which we can dedicate
ourselves to something larger than ourselves. When liberals do well in
America, it’s because they can embody that possibility: the Kennedys, for
example, are the ultimate Democratic icons not just because they started
as poor Irish immigrants who made enormous amounts of money but because
they are seen as having managed, ultimately, to turn all that money into
nobility.

PROPOSITION III: THE REAL PROBLEM OF THE AMERICAN LEFT IS THAT ALTHOUGH IT
DOES TRY IN CERTAIN WAYS TO EFFACE THE DIVISION BETWEEN EGOISM AND
ALTRUISM, VALUE AND VALUES, IT LARGELY DOES SO FOR ITS OWN CHILDREN. THIS
HAS ALLOWED THE RIGHT, PARADOXICALLY, TO REPRESENT ITSELF AS THE CHAMPION
OF THE WORKING CLASS

This proposition might help explain why the left in America is in such a
mess. Far from promoting new visions of effacing the difference between
egoism and altruism, value and values, or providing a model for passing
from one to the other, progressives cannot even seem to understand the
problem. After the last presidential election, the big debate in
progressive circles was the relative importance of economic issues versus
what was called “the culture wars.” Did the Democrats lose because they
were not able to spell out any plausible economic alternatives, or did the
Republicans win because they successfully mobilized evangelical Christians
around the issue of gay marriage? The very fact that progressives frame
the question this way not only shows they are trapped in the right’s terms
of analysis; it demonstrates that they do not understand how America
really works.

Let me illustrate what I mean by considering the strange popular appeal,
at least until recently, of George W. Bush. In 2004 most of the American
liberal intelligentsia did not seem to be able to get their minds around
it. After the election, what left so many of them reeling was their
suspicion that the things they most hated about Bush were exactly what so
many Bush voters liked about him. Consider the debates, for example. If
statistics are to be believed, millions of Americans watched George Bush
and John Kerry lock horns, concluded that Kerry won, and then went off and
voted for Bush anyway. It was hard to escape the suspicion that, in the
end, Kerry’s articulate presentation, his skill with words and arguments,
had actually counted against him.

This sent liberals into spirals of despair. They could not understand why
decisive leadership was equated with acting like an idiot. Neither could
they understand how a man who comes from one of the most elite families in
the country, who attended Andover, Yale, and Harvard, and whose signature
facial expression is a self-satisfied smirk, ever convinced anyone he was
a “man of the people.” I must admit I have struggled with this as well. As
a child of working-class parents who won a scholarship to Andover in the
1970s and, eventually, a job at Yale, I have spent much of my life in the
presence of men like Bush, every inch of them oozing self-satisfied
privilege. But, in fact, stories like mine–stories of dramatic class
mobility through academic accomplishment–are increasingly unusual in
America.

America, of course, continues to see itself as a land of opportunity, and
certainly from the perspective of an immigrant from Haiti or Bangladesh it
is. But America has always been a country built on the promise of
unlimited upward mobility. The working-class condition has been
traditionally seen as a way station, as something one’s family passes
through on the road to something else. Abraham Lincoln used to stress that
what made American democracy possible was the absence of a class of
permanent wage laborers. In Lincoln’s day, the ideal was that wage
laborers would eventually save up enough money to build a better life: if
nothing else, to buy some land and become a homesteader on the frontier.

The point is not how accurate this ideal was; the point is that most
Americans have found the image plausible. Every time the road is perceived
to be clogged, profound unrest ensues. The closing of the frontier led to
bitter labor struggles, and over the course of the twentieth century, the
steady and rapid expansion of the American university system could be seen
as a kind of substitute. Particularly after World War II, huge resources
were poured into expanding the higher education system, which grew
extremely rapidly, and all this growth was promoted quite explicitly as a
means of social mobility. This served during the Cold War as almost an
implied social contract, not just offering a comfortable life to the
working classes but holding out the chance that their children would not
be working class themselves. The problem, of course, is that a higher
education system cannot be expanded forever. At a certain point one ends
up with a significant portion of the population unable to find work even
remotely in line with their qualifications, who have every reason to be
angry about their situation, and who also have access to the entire
history of radical thought. By the late Sixties and early Seventies, the
very point when the expansion of the university system hit a dead end,
campuses were, predictably, exploding.

What followed could be seen as a kind of settlement. Campus radicals were
reabsorbed into the university but set to work largely at training
children of the elite. As the cost of education has skyrocketed, financial
aid has been cut back, and the prospect of social mobility through
education–above all liberal arts education–has been rapidly diminished.
The number of working-class students in major universities, which steadily
grew until the Seventies, has now been declining for decades. The matter
was further complicated by the fact that this overall decline of
accessibility happened at almost exactly the same time that many who had
previously been excluded (the G.I. Bill of Rights, after all, had applied
basically to white males) were finally being welcomed. These were the
identities celebrated in the campus “identity politics” of the Eighties
and Nineties–an inclusiveness that notably did not extend to, say,
Baptists or “rednecks.” Unsurprisingly, many focused their rage not on
government or on university administrations but on minorities, queers,
and feminists.

Why do working-class Bush voters tend to resent intellectuals more than
they do the rich? It seems to me that the answer is simple. They can
imagine a scenario in which they might become rich but cannot possibly
imagine one in which they, or any of their children, would become members
of the intelligentsia. If you think about it, this is not an unreasonable
assessment. A mechanic from Nebraska knows it is highly unlikely that his
son or daughter will ever become an Enron executive. But it is possible.
There is virtually no chance, however, that his child, no matter how
talented, will ever become an international human-rights lawyer or a drama
critic for the New York Times. Here we need to remember not just the
changes in higher education but also the role of unpaid, or effectively
unpaid, internships. It has become a fact of life in the United States
that if one chooses a career for any reason other than the salary, for the
first year or two one will not be paid. This is certainly true if one
wishes to be involved in altruistic pursuits: say, to join the world of
charities, or NGOs, or to become a political activist. But it is equally
true if one wants to pursue values like Beauty or Truth: to become part of
the world of books, or the art world, or an investigative reporter. The
custom effectively seals off such a career for any poor student who
actually does attain a liberal arts education. Such structures of
exclusion had always existed, of course, especially at the top, but in
recent decades fences have become fortresses.

If that mechanic’s daughter wishes to pursue something higher, more noble,
for a career, what options does she really have? Likely just two: She can
seek employment at her local church, which is hard to get. Or she can join
the army.

This is, of course, the secret of nobility. To be noble is to be generous,
high-minded, altruistic, to pursue higher forms of value. But it is also
to be able to do so because one does not really have to think too much
about money. This is precisely what our soldiers are doing when they give
free dental examinations to villagers: they are being paid (modestly, but
adequately) to do good in the world. Seen in this light, it is also easier
to see what really happened at universities in the wake of the 1960s–the
“settlement” I mentioned above. Campus radicals set out to create a new
society that destroyed the distinction between egoism and altruism, value
and values. It did not work out, but they were, effectively, offered a
kind of compensation: the privilege to use the university system to create
lives that did so, in their own little way–to be supported in one’s
material needs while pursuing virtue, truth, and beauty, and, above all,
to pass that privilege on to their own children. One cannot blame them for
accepting the offer. But neither can one blame the rest of the country for
hating them for it. Not because they reject the project: as I say, this is
what America is all about. As I always tell activists engaged in the peace
movement and counter-recruitment campaigns: why do working-class kids join
the army anyway? Because, like any teenager, they want to escape the world
of tedious work and meaningless consumerism, to live a life of adventure
and camaraderie in which they believe they are doing something genuinely
noble. They join the army because they want to be like you.

~~~~~~~~

By David Graeber

David Graeber is an anthropologist and activist currently living in New
York City. An associate professor at Yale, he is the author of Toward an
Anthropological Theory of Value and Fragments of an Anarchist Anthropology.

pittsburgh gal Says:
January 14th, 2007 at 10:39 pm

this article is awesome, so insightful! we should have a department of
peace that pays to send kids from all backgrounds overseas to do good
stuff- like peacecorps or americorps, but actually fund it like the
military is.

sirila Says:
January 18th, 2007 at 4:16 pm

Thank you Mr Graeber and whatever good-citizen typed this all out. I agree
it is a tour de force. It’s not often I read something *new*, but this is
very thought-provoking. I appreciate the opportunity to email it to
friends.

a-train Says:
January 19th, 2007 at 9:58 am

I agree completely with Sirila re: your article. I’d add that the
implications for political and social action (particularly on the left)
are profound and far-reaching.

Additionally, I’m in law school right now and I have seen your thesis play
out over and over again (though I was not careful enough of an observer to
note the phenomena until I read your article). The legal jobs in the
“altruistic” practice areas (civil liberties, legal aid, etc) are among
the most competitive, and in most cases only the most privileged can
afford to do what it takes to land those jobs (unpaid internships, travel,
etc.).

http://www.sleepykid.org/blog/2007/01/13/army-of-altruists/

===============================

A National Intelligence Estimate on the United States {1}

by Chalmers Johnson {2}

Harper's Magazine (January 2007)

KEY JUDGMENTS

The United States remains, for the moment, the most
powerful nation in history, but it faces a violent
contradiction between its long republican tradition and
its more recent imperial ambitions.

The fate of previous democratic empires suggests that
such a conflict is unsustainable and will be resolved
in one of two ways. Rome attempted to keep its empire
and lost its democracy. Britain chose to remain
democratic and in the process let go its empire.
Intentionally or not, the people of the United States
already are well embarked upon the course of
non-democratic empire.

Several factors, however, indicate that this course
will be a brief one, which most likely will end in
economic and political collapse.

Military Keynesianism: The imperial project is
expensive. The flow of the nation's wealth - from
taxpayers and (increasingly) foreign lenders through
the government to military contractors and
(decreasingly) back to the taxpayers - has created a
form of "military Keynesianism", in which the domestic
economy requires sustained military ambition in order
to avoid recession or collapse.

The Unitary Presidency: Sustained military ambition is
inherently anti-republican, in that it tends to
concentrate power in the executive branch. In the United
States, President George W Bush subscribes to an
esoteric interpretation of the Constitution called the
theory of the unitary executive, which holds, in effect,
that the president has the authority to ignore the
separation of powers written into the Constitution,
creating a feedback loop in which permanent war and
the unitary presidency are mutually reinforcing.

Failed Checks on Executive Ambition: The US
legislature and judiciary appear to be incapable of
restraining the president and therefore restraining
imperial ambition. Direct opposition from the people,
in the form of democratic action or violent uprising, is
unlikely because the television and print media have
by and large found it unprofitable to inform the public
about the actions of the country's leaders. Nor is it
likely that the military will attempt to take over the
executive branch by way of a coup.

Bankruptcy and Collapse: Confronted by the limits of
its own vast but nonetheless finite financial resources
and lacking the political check on spending provided by
a functioning democracy, the United States will within
a very short time face financial or even political
collapse at home and a significantly diminished ability
to project force abroad.

DISCUSSION

Military Keynesianism

The ongoing US militarization of its foreign affairs
has spiked precipitously in recent years, with
increasingly expensive commitments in Afghanistan and
Iraq. These commitments grew from many specific
political factors, including the ideological
predilections of the current regime, the growing need
for material access to the oil-rich regions of the
Middle East, and a long-term bipartisan emphasis on
hegemony as a basis for national security. The
domestic economic basis for these commitments, however,
is consistently overlooked. Indeed, America's hegemonic
policy is in many ways most accurately understood as
the inevitable result of its decades-long policy of
military Keynesianism.

During the Depression that preceded World War II, the
English economist John Maynard Keynes, a liberal
capitalist, proposed a form of governance that would
mitigate the boom-and-bust cycles inherent in
capitalist economies. To prevent the economy from
contracting, a development typically accompanied by
social unrest, Keynes thought the government should
take on debt in order to put people back to work. Some
of these deficit-financed government jobs might be
socially useful, but Keynes was not averse to creating
make-work tasks if necessary. During periods of
prosperity, the government would cut spending and
rebuild the treasury. Such countercyclical planning was
called "pump-priming". Upon taking office in 1933, US
President Franklin Roosevelt, with the assistance of
Congress, put several Keynesian measures into effect,
including socialized retirement plans, minimum wages
for all workers, and government-financed jobs on
massive projects, including the Triborough Bridge in
New York City, the Grand Coulee Dam in Washington, and
the Tennessee Valley Authority, a flood-control and
electric-power generation complex covering seven
states. Conservative capitalists feared that this
degree of government intervention would delegitimate
capitalism - which they understood as an economic
system of quasi-natural laws - and shift the balance of
power from the capitalist class to the working class and
its unions. For these reasons, establishment figures
tried to hold back countercyclical spending.

The onset of World War II, however, made possible a
significantly modified form of state socialism. The
exiled Polish economist Michal Kalecki attributed
Germany's success in overcoming the global Depression
to a phenomenon that has come to be known as "military
Keynesianism". Government spending on arms increased
manufacturing and also had a multiplier effect on
general consumer spending by raising worker incomes.
Both of these points are in accordance with general
Keynesian doctrine. In addition, the enlargement of
standing armies absorbed many workers, often young
males with few skills and less education.
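
(In standard textbook form - a gloss supplied here, not
Kalecki's own notation - the multiplier mechanism just
described is

    \Delta Y = \frac{\Delta G}{1 - c}

where \Delta G is the increase in government - in this
case, arms - spending, c is the marginal propensity to
consume, and \Delta Y is the resulting rise in total
output. With c = 0.75, for example, each dollar of arms
spending ultimately supports about four dollars of
total demand.)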

The military thus becomes an employer of last resort,
like Roosevelt's Civilian Conservation Corps, but on a
much larger scale. Rather than make bridges and dams, however, workers
would make bullets, tanks, and fighter planes. This made
all the difference. Although Adolf Hitler did not
undertake rearmament for purely economic reasons, the
fact that he advocated governmental support for arms
production made him acceptable not only to the German
industrialists, who might otherwise have opposed his
destabilizing expansionist policies, but also to many
around the world who celebrated his achievement of a
"German economic miracle".

In the United States, Keynesian policies continued to
benefit workers, but, as in Germany, they also
increasingly benefited wealthy manufacturers and other
capitalists. By the end of the war, the United States
had seen a massive shift. Dwight Eisenhower, who helped
win that war and later became president, described this
shift in his 1961 presidential farewell address:

Our military organization today bears little relation
to that known by any of my predecessors in peacetime, or
indeed by the fighting men of World War II or Korea.

Until the latest of our world conflicts, the United
States had no armaments industry. American makers of
plowshares could, with time and as required, make swords
as well. But we can no longer risk emergency
improvisation of national defense; we have been
compelled to create a permanent armaments industry of
vast proportions. Added to this, three and a half
million men and women are directly engaged in the
defense establishment. We annually spend on military
security alone more than the net income of all United
States corporations.

This conjunction of an immense military establishment
and a large arms industry is new in the American
experience. The total influence - economic, political,
and even spiritual - is felt in every city, every
statehouse, every office of the federal government. We
recognize the imperative need for this development. Yet
we must not fail to comprehend its grave implications.
Our toil, resources and livelihood are all involved; so
is the very structure of our society.

Eisenhower went on to suggest that such an
arrangement, which he called the "military-industrial
complex", could be perilous to American ideals. The
short-term economic benefits were clear, but the very
nature of those benefits - which were all too carefully
distributed among workers and owners in "every city,
every statehouse, every office of the federal
government" - tended to short-circuit Keynes's
insistence that government spending be cut back in good
times. The prosperity of the United States came
increasingly to depend upon the construction and
continual maintenance of a vast war machine, and so
military supremacy and economic security became
increasingly intertwined in the minds of voters. No one
wanted to turn off the pump.

Between 1940 and 1996, for instance, the United States
spent nearly $4.5 trillion on the development, testing,
and construction of nuclear weapons alone. By 1967, the
peak year of its nuclear stockpile, the United States
possessed some 32,000 deliverable bombs. None of them
was ever used, which illustrates perfectly Keynes's
observation that, in order to create jobs, the
government might as well decide to bury money in old
mines and "leave them to private enterprise on the
well-tried principles of laissez faire to dig them up
again". Nuclear bombs were not just America's secret
weapon; they were also a secret economic weapon.

Such spending helped create economic growth that
lasted until the 1973 oil crisis. In the 1980s,
President Ronald Reagan once again brought the tools of
military Keynesianism to bear, with a policy of
significant tax cuts and massive deficit spending on
military projects, allegedly to combat a new threat
from Communism. Reagan's military expenditures
accounted for 5.9 percent of the gross domestic product
in 1984, which in turn fueled a seven percent growth
rate for the economy as a whole and helped reelect
Reagan by a landslide.

During the Clinton years military spending fell to
about three percent of GDP, but the economy rallied
strongly in Clinton's second term due to the boom in
information technologies, weakness in the previously
competitive Japanese economy, and - paradoxically -
serious efforts to reduce the national debt. {3} With
the coming to power of George W Bush, however,
military Keynesianism returned once again. Indeed,
after he began his war with Iraq, the once-erratic
relationship between defense spending and economic
growth became nearly parallel. A spike in defense
spending in one quarter would see a spike in GDP, and
a drop in defense spending would likewise see a drop in
GDP.

To understand the real weight of military Keynesianism
in the American economy today, however, one must
approach official defense statistics with great care.
The "defense" budget of the United States - that is,
the reported budget of the Department of Defense - does
not include: the Department of Energy's spending on
nuclear weapons ($16.4 billion slated for fiscal
2006), the Department of Homeland Security's outlays
for the actual "defense" of the United States ($41
billion), or the Department of Veterans Affairs'
responsibilities for the lifetime care of the seriously
wounded ($68 billion). Nor does it include the billions
of dollars the Department of State spends each year to
finance foreign arms sales and militarily related
development or the Treasury Department's payment of
pensions to military retirees and widows and their
families (an amount not fully disclosed by official
statistics). Still to be added are interest payments by
the Treasury to cover past debt-financed defense
outlays. The economist Robert Higgs estimates that in
2002 such interest payments amounted to $138.7 billion.
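
(A rough tally of only the itemized figures above -
$16.4 billion + $41 billion + $68 billion + $138.7
billion - adds at least $264.1 billion to the reported
Department of Defense budget, and that is a lower
bound, since the State and Treasury outlays are not
fully disclosed.)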

Even when all these things are included, Enron-style
accounting makes it hard to obtain an accurate
understanding of US dependency on military spending.
In 2005, the Government Accountability Office reported to
Congress that "neither DOD nor Congress can reliably
know how much the war is costing" or "details on how
the appropriated funds are being spent". Indeed, the
GAO found that, lacking a reliable method of tracking
military costs, the Army had taken to simply inserting
into its accounts figures that matched the available
budget. Such actions seem absurd in terms of military
logic. But they are perfectly logical responses to the
requirements of military Keynesianism, which places
its emphasis not on the demand for defense but rather
on the available supply of money.

The Unitary Presidency

Military Keynesianism may be economic development by
other means, but it does very often lead to real war,
or, if not real war, then a significantly warlike
political environment. This creates a feedback loop:
American presidents know that military Keynesianism
tends to concentrate power in the executive branch, and
so presidents who seek greater power have a natural
inducement to encourage further growth of the
military-industrial complex. As the phenomena feed on
each other, the usual outcome is a real war, based not
on the needs of national defense but rather on the
domestic political logic of military Keynesianism.

As US Senator Robert La Follette Sr observed, "In times
of peace, the war party insists on making preparation
for war. As soon as prepared for war, it insists on
making war." George W Bush has taken this natural
political phenomenon to an extreme never before
experienced by the American electorate. Every
president has sought greater authority, but Bush - whose
father lost his position as forty-first president in a
fair and open election - appears to believe that
increasing presidential authority is both a birthright
and a central component of his historical legacy. He is
supported in this belief by his vice president and chief
adviser, Dick Cheney.

In pursuit of more power, Bush and Cheney have
unilaterally authorized preventive war against nations
they designate as needing "regime change", directed
American soldiers to torture persons they have seized
and imprisoned in various countries, ordered the
National Security Agency to carry out illegal "data
mining" surveillance of the American people, and done
everything they could to prevent Congress from
outlawing "cruel, inhumane, or degrading" treatment of
people detained by the United States. Each of these
actions has been undertaken for specific ideological,
tactical, or practical reasons, but also as part of a
general campaign of power concentration.

Cheney complained in 2002 that, since he had served as
Gerald Ford's chief of staff, he had seen a significant
erosion in executive power as post-Watergate presidents
were forced to "cough up and compromise on important
principles".

He was referring to such reforms as the War Powers Act
of 1973, which requires that the president obtain
congressional approval within ninety days of ordering
troops into combat; the Budget and Impoundment Control
Act of 1974, which was designed to stop Nixon from
impounding funds for programs he did not like; the
Freedom of Information Act of 1966, which Congress
strengthened in 1974; President Ford's Executive Order
11905 of 1976, which outlawed political assassination;
and the Intelligence Oversight Act of 1980, which gave
more power to the House and Senate select committees on
intelligence. Cheney said that these reforms were
"unwise" because they "weaken the presidency and the
vice presidency", and added that he and the president
felt an obligation "to pass on our offices in better
shape than we found them".

No president, however, has ever acknowledged the
legitimacy of the War Powers Act, and most of these
so-called limitations on presidential power had been
gutted, ignored, or violated long before Cheney became
vice president. Republican Senator John Sununu of New
Hampshire said, "The vice president may be the only
person I know of that believes the executive has
somehow lost power over the last thirty years". Bush and
Cheney have made it a primary goal of their terms in
office, nonetheless, to carve executive power into the
law, and the war has been the primary vehicle for such
actions. John Yoo, Bush's deputy assistant attorney
general from 2001 to 2003, writes in his book War By
Other Means (Atlantic Monthly Press, 2006), "We are
used to a peacetime system in which Congress enacts
laws, the President enforces them, and the courts
interpret them. In wartime, the gravity shifts to the
executive branch." Bush has claimed that he is "the
commander" and "the decider" and that therefore he does
not "owe anybody an explanation" for anything. {4}
Similarly, in a September 2006 press conference, White
House spokesman Tony Snow engaged in this dialogue:

Q: Isn't it the Supreme Court that's supposed to
decide whether laws are unconstitutional or not?

A: No, as a matter of fact the president has an
obligation to preserve, protect, and defend the
Constitution of the United States. That is an
obligation that presidents have enacted through signing
statements going back to Jefferson. So, while the
Supreme Court can be an arbiter of the Constitution,
the fact is the president is the one, the only person
who, by the Constitution, is given the responsibility
to preserve, protect, and defend that document, so it
is perfectly consistent with presidential authority
under the Constitution itself.

Snow was referring to the president's habit of signing
bills into law accompanied by "statements" that,
according to the American Bar Association, "assert
President Bush's authority to disregard or decline to
enforce laws adopted by Congress". All forty-two
previous US presidents combined have signed statements
exempting themselves from the provisions of 568 new
laws, whereas Bush has, to date, exempted himself from
more than 1,000.

Failed Checks on Executive Ambition

The current administration's perspective on political
power is far from unique. Few, if any, presidents have
refused the increased executive authority that is the
natural byproduct of military Keynesianism. Moreover,
the division of power between the president, the
Congress, and the judiciary - often described as the
bedrock of American democracy - has eroded
significantly in recent years. The people, the press,
and the military, too, seem anxious to cede power to a
"wartime" president, leaving Bush, or those who follow
him, almost entirely unobstructed in pursuing the
imperial project.

Congress: Corrupt and indifferent, Congress, which the
Founders believed would be the leading branch of
government, has already entirely forfeited the power to
declare war. More recently, it gave the president the
legal right to detain anyone, even American citizens,
without warrant, and to detain non-citizens without
recourse to habeas corpus, as well as to use a variety
of interrogation methods that he could define, at his
sole discretion, to be or not be torture.

The Courts: The judicial branch is hardly more
effective in restraining presidential ambition. The
Supreme Court was active in the installation of the
current president, and the lower courts increasingly
are packed with judges who believe they should defer to
his wishes. In 2006, for instance, US District Judge
David Trager dismissed a suit by a
thirty-five-year-old Canadian citizen, Maher Arar, who
in 2002 was seized by US government agents at John F
Kennedy Airport and delivered to Syria, where he was
tortured for ten months before being released. No
charges were filed against Arar, and his torturers
eventually admitted he had no links to any crime. In
explaining his dismissal, Trager noted with approval an
earlier Supreme Court finding that such judgment would
"threaten 'our customary policy of deference to the
President in matters of foreign affairs'".

The Military: It is possible that the US military
could take over the government and declare a
dictatorship. {5} That is how the Roman republic
ended. For the military voluntarily to move toward
direct rule, however, its leaders would have to ignore
their ties to civilian society, where the symbolic
importance of constitutional legitimacy remains potent.
Rebellious officers may well worry about how the
American people would react to such a move. Moreover,
prosecutions of low-level military torturers from Abu
Ghraib prison and killers of civilians in Iraq have
demonstrated to enlisted ranks that obedience to
illegal orders can result in their being punished,
whereas officers go free. No one knows whether ordinary
American soldiers would obey clearly illegal orders to
oust an elected government or whether the officer corps
has sufficient confidence to issue such orders. In
addition, the present system already offers the
military high command so much - in funds, prestige, and
future employment via the military-industrial revolving
door - that a perilous transition to anything
resembling direct military rule would make little sense
under reasonably normal conditions.

The People: Could the people themselves restore
constitutional government? A grassroots movement to
break the hold of the military-industrial complex and
establish public financing of elections is
conceivable. But, given the conglomerate control of the
mass media and the difficulties of mobilizing the United
States' large and diffuse population, it is unlikely.
Moreover, the people themselves have enjoyed the
Keynesian benefits of the US imperial project and - in
all but a few cases - have not yet suffered any of its
consequences. {6}

Bankruptcy and Collapse

The more likely check on presidential power, and on US
military ambition, will be the economic failure that
is the inevitable consequence of military Keynesianism.
Traditional Keynesianism is a stable two-part system
composed of deficit spending in bad times and debt
payment in good times. Military Keynesianism is an
unstable one-part system. With no political check,
debt accrues until it reaches a crisis point.

In the fiscal 2006 budget, the Congressional Research
Service estimates that Pentagon spending on Operation
Enduring Freedom and Operation Iraqi Freedom will be
about $10 billion per month, or an extra $120.3 billion
for the year.

As of mid-2006, the overall cost of the wars in Iraq
and Afghanistan since their inception stood at more
than $400 billion. Joseph Stiglitz, the Nobel
Prize-winning economist, and his colleague, Linda
Bilmes, have tried to put together an estimate of the
real costs of the Iraq war. They calculate that it
will cost about $2 trillion by 2015. The conservative
American Enterprise Institute suggests a figure at the
opposite end of the spectrum - $1 trillion. Both figures
are an order of magnitude larger than what the Bush
Administration publicly acknowledges.

At the same time, the US trade deficit, the largest
component of the current account deficit, soared to an
all-time high in 2005 of $782.7 billion, the fourth
consecutive year that America's trade debts set
records. The trade deficit with China alone rose to
$201.5 billion, the highest imbalance ever recorded
with any country. Meanwhile, since mid-2000, the
country has lost nearly three million manufacturing
jobs. To try to cope with these imbalances, on March 16,
2006, Congress raised the national debt limit from
$8.2 trillion to $9 trillion. This was the fourth time
since George W Bush took office that the limit had to be
raised. Had Congress not raised it, the US government
would not have been able to borrow more money and would
have had to default on its massive debts.

Among the creditors that finance this unprecedented
sum, two of the largest are the central banks of China
($854 billion in reserves of dollars and other foreign
currencies) and Japan ($850 billion), both of which
are the managers of the huge trade surpluses these countries enjoy with
the United States. This helps explain why the United
States' debt burden has not yet triggered what
standard economic theory would predict, which is a
steep decline in the value of the US dollar followed by
a severe contraction of the American economy - the
Chinese and Japanese governments continue to be
willing to be paid in dollars in order to sustain
American demand for their exports. For the sake of
domestic employment, both countries lend huge amounts
to the American treasury, but there is no guarantee how
long they will want or be able to do so.

CONFIDENCE IN KEY JUDGMENTS

It is difficult to predict the course of a democracy,
and perhaps even more so when that democracy is as
corrupt as that of the United States. With a new
opposition party in the majority in the House, the
country could begin a difficult withdrawal from
military Keynesianism. Like the British after World War
II, the United States could choose to keep its
democracy by giving up its empire. The British did not
do a particularly brilliant job of liquidating their
empire, and there were several clear cases in which
British imperialists defied their nation's commitment to
democracy in order to keep their foreign privileges -
Kenya in the 1950s is a particularly savage example -
but the people of the British Isles did choose democracy
over imperialism, and that nation continues to thrive as
a nation, if not as an empire.

It appears for the moment, however, that the people of
the United States prefer the Roman approach and so will
abet their government in maintaining a facade of
constitutional democracy until the nation drifts into
bankruptcy.

Of course, bankruptcy will not mean the literal end of
the United States any more than it did for Germany in
1923, China in 1948, or Argentina in 2001. It might,
in fact, open the way for an unexpected restoration of
the American system, or for military rule, revolution,
or simply some new development we cannot yet imagine.
Certainly, such a bankruptcy would mean a drastic
lowering of the current American standard of living, a
loss of control over international affairs, a process of
adjusting to the rise of other powers, including China
and India, and a further discrediting of the notion
that the United States is somehow exceptional compared
with other nations. The American people will be forced
to learn what it means to be a far poorer nation and
the attitudes and manners that go with it. {7}

NOTES

1 The CIA's website defines a National Intelligence
Estimate as "the most authoritative written judgment
concerning a national security issue prepared by the
Director of Central Intelligence." These forecasts of
"future developments" and "their implications for the
United States" seldom are made public, but there are
exceptions. One was the NIE of September 2002,
"Iraq's Continuing Programs for Weapons of Mass
Destruction", which became notorious because virtually
every word in it was false. Another, an April 2006 NIE
entitled "Trends in Global Terrorism: Implications for
the United States", was partly declassified by
President Bush because its main conclusion - that
"activists identifying themselves as jihads" are
"increasing in both number and geographic dispersion" -
had already been leaked to the press.

2 The CIA is prohibited from writing an NIE on the
United States, and so I have here attempted to do so
myself, using the standard format for such estimates. I
have some personal knowledge of NIEs because from 1967
to 1973 I served as an outside consultant to the CIA's
Office of National Estimates. I was one of about a dozen
so-called experts invited to read draft NIEs in order
to provide quality control and prevent bureaucratic
logrolling.

3 Military Keynesianism, it turns out, is not the only
way to boost an economy.

4 In a January 2006 debate, Yoo was asked if any law
could stop the president, if he "deems that he's got to
torture somebody", from, say, "crushing the testicles
of the person's child". Yoo's response: "I think it
depends on why the president thinks he needs to do that".

5 Though they undoubtedly would find a more
user-friendly name for it.

6 In 2003, when the Iraq war began, the citizens of
the United States could at least claim that it was the
work of an administration that had lost the popular
vote. But in 2004, Bush won that vote by more than
three million ballots, making his war ours.

7 National Intelligence Estimates seldom contain
startling new data. To me they always read like magazine
articles or well-researched and footnoted graduate
seminar papers. When my wife once asked me what was so
secret about them, I answered that perhaps it was the
fact that this was the best we could do.

_____

Chalmers Johnson is the author of Blowback
(Metropolitan, 2000), The Sorrows of Empire
(Metropolitan, 2004), and, most recently, Nemesis: The
Last Days of the American Republic, which will be
published in February by Metropolitan Books. His last
article for Harper's Magazine, "The War Business:
Squeezing a Profit from the Wreckage in Iraq", appeared
in the November 2003 issue.

http://www.mail-archive.com/marxism-thaxis@lists.econ.utah.edu/msg03570.html

posted by u2r2h at Sunday, January 28, 2007