Saturday 27 February 2010

Douglas Wilson's Letter From America

Then the Climate Data Went Blooey

Written by Douglas Wilson
Wednesday, February 24, 2010 7:24 am

So there's weather and there's climate. When the global warming shills were back in their prime, they would regale us with anecdote after anecdote about the weather, claiming that this told us something about the climate. The actual climate claims were based on something else, but they sought to persuade the boobs by pointing to the weather. Katrina was the result of global warming, etc.

Then something funny happened. The weather shifted in some very odd ways, we started hearing about climate change instead of global warming, and the Al Gore effect became apparent in weather patterns. Everywhere that poor man went, a snow storm followed. Those who lived by the weather anecdote died by the weather anecdote.

They probably could have managed this, because enormous amounts of money were involved, not to mention a power grab of Orwellian proportions. Men like this were not going to be slowed down by snow flurries in Georgia.

But then the climate data went blooey. Turns out the books were cooked, rigged, made up, massaged, bought and paid for, and then lost. There went the climate, and the phrase "the science is settled" took on a much more ominous meaning.

And then God, for His mercies endure forever, sent the world the winter of winters. The one enthroned in Heaven laughs; He holds them in derision.

Cocoon Societies

Thinking Positively About Slums

New Zealand is a cocoon society. It is focused upon the cotton-wool swaddling of its citizens, with the government and its agencies responsible for preventing harm or danger. The culture and prevailing political ideology are fixated on stopping harm from occurring, rather than on mitigating downstream damage. This ideology is legislatively prescribed in the Occupational Safety and Health regime, the ACC regime, the Resource Management Act, and literally thousands upon thousands of standards, rules, and regulations, written by bureaucratic process and enforced by state functionaries, all designed to prevent unintended harmful consequences from ever occurring.

It is no surprise, therefore, that a cocoon society struggles to increase its standard of living. Risk is so thoroughly mitigated that human ambition, drive, and creativity are stopped in their tracks. Both labour and capital are made less productive, less creative, and less dynamic.

A recent article in the British current-affairs magazine Prospect argued that slums have enormous unintended beneficial effects for humanity. The point is that when human beings are faced with great difficulties in a society that lets them be, they prove remarkably adaptive, creative, and effective at solving their own problems in ways that benefit the whole, as an unintended consequence. Slums, the article argued, are good for the environment, if you take a developmental view rather than a static snapshot approach.

The article, by Stewart Brand, was entitled "How Slums Can Save the Planet". It argues that whilst 60 million people in the developing world are leaving the countryside and drifting to cities every year, in the longer run this is a very good thing.

Now, in New Zealand, we would prefer not to have slums in our cities. We want to be cocooned from such developments. Our noses wrinkle at the mere imagination of the olfactory offence. But times are a-changing--at least elsewhere in the world.
The reversal of opinion about fast-growing cities, previously considered bad news, began with The Challenge of Slums, a 2003 UN-Habitat report. The book’s optimism derived from its groundbreaking fieldwork: 37 case studies in slums worldwide. Instead of just compiling numbers and filtering them through theory, researchers hung out in the slums and talked to people. They came back with an unexpected observation: “Cities are so much more successful in promoting new forms of income generation, and it is so much cheaper to provide services in urban areas, that some experts have actually suggested that the only realistic poverty reduction strategy is to get as many people as possible to move to the city.”
Slums, apparently, are very successful in raising the living standards of their occupants. They do so efficiently and cost-effectively. And there is not a risk-prevention agency in sight! The reason? The remarkable creativity, energy, and adaptability of human beings--at least when they are not being enervated by thousands of rules, and sapped by finger-wagging no-no's.
The magic of squatter cities is that they are improved steadily and gradually by their residents. To a planner’s eye, these cities look chaotic. I trained as a biologist and to my eye, they look organic. Squatter cities are also unexpectedly green. They have maximum density—1m people per square mile in some areas of Mumbai—and have minimum energy and material use. People get around by foot, bicycle, rickshaw, or the universal shared taxi.
What makes slums so effective at gradual and steady improvement? In the first place, they are remarkably efficient in utilizing resources and in recycling and re-using anything and everything. Out of this comes division of labour, expertise, trade, commerce, and rising standards of living.
Not everything is efficient in the slums, though. In the Brazilian favelas where electricity is stolen and therefore free, people leave their lights on all day. But in most slums recycling is literally a way of life. The Dharavi slum in Mumbai has 400 recycling units and 30,000 ragpickers. Six thousand tons of rubbish are sorted every day. In 2007, the Economist reported that in Vietnam and Mozambique, “Waves of gleaners sift the sweepings of Hanoi’s streets, just as Mozambiquan children pick over the rubbish of Maputo’s main tip. Every city in Asia and Latin America has an industry based on gathering up old cardboard boxes.”

Secondly, slums develop into concentrated urban centres which use resources most cost-effectively.
In his 1985 article, Calthorpe made a statement that still jars with most people: “The city is the most environmentally benign form of human settlement. Each city dweller consumes less land, less energy, less water, and produces less pollution than his counterpart in settlements of lower densities.” “Green Manhattan” was the inflammatory title of a 2004 New Yorker article by David Owen. “By the most significant measures,” he wrote, “New York is the greenest community in the United States, and one of the greenest cities in the world…The key to New York’s relative environmental benignity is its extreme compactness. Manhattan’s population density is more than 800 times that of the nation as a whole. Placing one and a half million people on a twenty-three-square-mile island sharply reduces their opportunities to be wasteful.” He went on to note that this very compactness forces people to live in the world’s most energy-efficient apartment buildings. . . .

Urban density allows half of humanity to live on 2.8 per cent of the land. Demographers expect developing countries to stabilise at 80 per cent urban, as nearly all developed countries have. On that basis, 80 per cent of humanity may live on 3 per cent of the land by 2050. Consider just the infrastructure efficiencies. According to a 2004 UN report: “The concentration of population and enterprises in urban areas greatly reduces the unit cost of piped water, sewers, drains, roads, electricity, garbage collection, transport, health care, and schools.” In the developed world, cities are green because they cut energy use; in the developing world, their greenness lies in how they take the pressure off rural waste.
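As an aside, the density arithmetic in the Owen quotation above is easy to check. Here is a minimal back-of-the-envelope sketch in Python: Manhattan's population and area are Owen's figures, while the national population and land-area numbers are our own rough assumptions for the period, so the result is approximate.

# Rough check of the quoted density claim.
manhattan_pop = 1_500_000        # Owen: "one and a half million people"
manhattan_area_sq_mi = 23        # Owen: "a twenty-three-square-mile island"
us_pop = 300_000_000             # assumed: approximate US population, mid-2000s
us_area_sq_mi = 3_800_000        # assumed: approximate US land area

manhattan_density = manhattan_pop / manhattan_area_sq_mi   # ~65,000 per sq mi
us_density = us_pop / us_area_sq_mi                        # ~79 per sq mi
print(round(manhattan_density / us_density))               # ~825

The ratio comes out at roughly 825, consistent with Owen's "more than 800 times".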
A standard objection is to point to the enormous problems faced by rapidly growing cities. And the problems are real. But human ingenuity, creativity, and adaptability enable cities to solve those problems and to develop above and beyond them. In New Zealand we have been sucked into an ideology which says that preventing problems in the first place is better than having to solve them once they arise. When it comes to economic development this is most often not the case. Societies which try to move from subsistence agriculture to developed industrial economies without going through the pressures of urban population drift and the struggles of Dickensian England have succumbed to wishful thinking. The chaotic stench of slum life is a necessary stage in most developing economies. And people in those circumstances provide their own best longer-term solutions.

Whilst it has been fashionable, in an elitist kind of way, to tut-tut over the harshness of urban living conditions in England as it went through the Industrial Revolution, it is conveniently forgotten that although one could reportedly smell London from over ten miles away, thousands upon thousands of people drifted to London and other urban centres because the rural conditions they had recently left were substantially worse. People will courageously and willingly accept difficult living conditions if two factors exist: firstly, that urban slum dwelling is marginally better than the rural impoverishment whence they recently came; secondly, that the slum provides the prospect of improvement in the future.

Finally, New Zealand's attempt to prevent environmental despoliation at all costs and to "trade" on an image/value of being clean and green is tailor-made for the prevailing ideology of cocoonism. However, it locks in economic stagnation. Paradoxically, it is also likely to lead to greater environmental damage in the longer term.

Friday 26 February 2010

Letter From America

Too Many Apologies

Aimless apologies are just one of the incidental symptoms of an increasing loss of a sense of personal responsibility.

Thomas Sowell.
(First published in nationalreview ONLINE.)

Tiger Woods doesn’t owe me an apology. Nothing that he has ever done has cost me a dime nor an hour of sleep.

This is not a plea to be “non-judgmental.” I am very judgmental about all sorts of things, including Tiger Woods’s bad behavior. But that is very different from saying that he somehow owes me an apology.

For all I know, my neighbors may be judgmental when I drive out of my driveway in a 15-year-old car. But they have never said anything to me about it, and I have never offered them an apology. This is not equating driving a 15-year-old car with what Tiger Woods did. But the point is that any apology he might make should be made to his family, who were hurt, not to the public, who might be disappointed in him, but not really hurt.

Public apologies to people who are not owed any apologies have become one of the many signs of the mushy thinking of our times. So are apologies for things that other people did.

Among the most absurd apologies have been apologies for slavery by politicians. For one thing, slavery is not something you can apologize for, any more than you can apologize for murder. If someone says to you that he murdered someone near and dear to you, what are you supposed to say? “No problem, we all make mistakes”? Not bloody likely!

Slavery is too serious for an apology, and somebody else’s being a slaveowner is not something for you to apologize for. When somebody who has never owned a slave apologizes for slavery to somebody who has never been a slave, then what began as mushy thinking has degenerated into theatrical absurdity — or, worse yet, politics.

Slavery has existed all over the planet for thousands of years, with black, white, yellow, and other races being both slaves and enslavers. Does that mean that everybody ought to apologize to everybody else for what their ancestors did? Or are the only people who are supposed to feel guilty the ones who have money that others want to talk them out of?

This craze for aimless apologies is part of a general loss of a sense of personal responsibility in our time. We are supposed to feel guilty for what other people did, but there are a thousand cop-outs for what we ourselves did.

Back in the 1960s, when so many foolish ideas flourished simply because they were new, a New York Times columnist tried to make the case that we were all somehow responsible for the assassination of John F. Kennedy. That was considered to be Deep Stuff. It made you one of the special folks when you believed that, instead of one of the rest of us poor dumb slobs who believed that the man who shot him was responsible.

For more than a century, the intelligentsia have been trying to get us to focus on the “root causes” of crime —which was supposedly created by “society” — instead of locking up thieves or executing murderers. If some people don’t have the money or the achievements of others, that too is society’s fault, in the eyes of those for whom personal responsibility is an outmoded idea.

Personal responsibility is a real problem for those who want to collectivize society and take away our power to make our own decisions, transferring that power to third parties like themselves, who imagine themselves to be so much wiser and nobler than the rest of us.

Aimless apologies are just one of the incidental symptoms of an increasing loss of a sense of personal responsibility — without which a whole society is in jeopardy.

The police cannot possibly maintain law and order by themselves. Millions of people can monitor their own behavior better than any third parties can. Cops can cope with that segment of society that has no sense of personal responsibility, but not if that segment becomes a large part of the whole population.

Yet increasing numbers of educators and the intelligentsia seem to have devoted themselves to undermining or destroying a sense of personal responsibility and making “society” responsible instead. Aimless apologies are just one small symptom of this larger and more dangerous attitude.

The Best Science is Always a Judgment Call

In Celebration of Rigorous Guesswork

The ongoing furore over global warming has thrown up a much broader issue. The very nature of science itself is under attack. It is an issue worth debating. If one outcome of the global warming farce is that we come to have a far more realistic perspective upon science it will have been all to the good.

We are not holding our breath, however. In the modern world, "science" has been set up as the great exposer of religion, the source of infallible knowledge which "proves" that faith in the Living God is without foundation: science has settled the question of God, and the infallible conclusion is that He cannot possibly exist. Because science has now spoken on infinite, eternal, and unchangeable realities, it must itself be regarded as infallible and certain. Consequently the truth claims of science have become more and more strident over the past two hundred years. Society at large has not only accepted this--it has actively demanded it.

But "coal face" scientists know in their bones that the entire edifice of absolute, infallible, and certain objectivity is itself a myth, a house of cards built upon water.
As John Christy, once an IPCC lead author and a credible scientist in his own right, said specifically of climate science in a recent article in Nature:
There is still much messy, contentious, snail-paced and now, hopefully, transparent, work to do.
Whenever you come across a scientist speaking thus, his credibility as a scientist rises substantially.

Uncertainty (or scepticism) lies at the heart of all true science, as it does in every other area of human knowledge, for man is a creature, fallible and finite. He cannot know anything with absolute infallible certainty, except that which the infallible, infinite, eternal and unchangeable God reveals to him.

Michael Polanyi, one of the great physical chemists of the last century and one of the best philosophers of science in the modern world, understood the implicit subjectivity of the scientific enterprise. He argued that much of science is no more nor less than guesswork. This, of course, makes him a troublesome prophet in our modern age, which, having replaced God with science, requires of the latter the attributes of deity, including infallibility. He writes:
We may conclude that just as there is no proof of a proposition in natural science which cannot conceivably turn out to be incomplete, so also there is no refutation which cannot conceivably turn out to be unfounded. There is a residue of personal judgement required in deciding--as the scientist eventually must--what weight to attach to any particular set of evidence in regard to the validity of a particular proposition.

The propositions of science thus appear to be in the nature of guesses. They are founded on the assumptions of science governing the structure of the universe and on the evidence of observations collected by methods of science; they are subject to a process of verification in the light of further observations according to the rules of science; but their conjectural character remains inherent in them. (Michael Polanyi, Science, Faith and Society [Chicago: University of Chicago Press, 1964], p. 31f. Emphasis ours.)
In the light of this, recent claims that the science of global warming is "settled" merely serve to show how profoundly unscientific the whole enterprise must be.

Not only are the premises and propositions of science conjectural guesswork, subject to constant, ongoing modification, but the very facts or data with which science works are subject to uncertainty and require judgement or guesswork.
The scientist in pursuit of research has incessantly to make decisions whether to take a new instrument reading or some other new sense impression as signifying a new fact, or to regard it merely as a new indication of an old fact--or else to reject it as having no significance at all. These decisions are guided by the premisses of science and more particularly by the current surmises of the time, but ultimately there always enters an element of personal judgement. (Polanyi, op. cit., p. 90.)
But, to make matters more dense and complex, even in the most rigorous experimental verification there remains the problem of "strange coincidence". Polanyi cites several examples from the history of science, including one from his own laboratory. Once, while studying tin crystals, he and his colleagues noticed a particular feature. "Hundreds of such specimens were produced and some of them photographed and their pictures published. Identical photographs were published by C. Burger who had independently made the same discovery." (Polanyi, op. cit., p. 95.) The research continued for several years, until suddenly the particular feature disappeared; it has not been seen in tin crystals since. No one has been able to explain this. "One is reminded--to take for once an example from the field of biology--of the mysterious loss of smell of the musk plant which seems to have occurred a few years ago suddenly all over the planet." (Ibid., p. 95.)

This is science at its rigorous best. When it is most sceptical and careful and non-dogmatic it is a truly helpful servant. When science, however, is separated from its creaturely limitations and is elevated into the realm of infallible dogmatic certainty, at that point it ceases to be rigorous and genuinely scientific, and becomes a stupid, error-ridden, and superstitious idolatry.

Thursday 25 February 2010

Douglas Wilson's Letter From America

Way More Leafy Greens

Written by Douglas Wilson
First published in Blog and Mablog.
Wednesday, February 24, 2010

[Douglas Wilson has been posting a series of blogs on a growing fastidiousness over food in some Christian circles. It has stirred up some controversy. In this post he responds to some of his critics and reflects on what being stirred up might mean for the one so afflicted. Ed.]

D. L. Moody once said that if you throw a rock into a pack of stray dogs, the one that yelps is the one that got hit. In a similar vein, the wicked flee when no man pursues (Prov. 28:1).

When I attack the trends that are conspiring to introduce food laws into the Church, and I point out that these food laws have a dubious ancestry, coming as they do from a wicked and perverse generation, do I mean to say that any Christians who don't eat exactly the way I do are wicked and perverse themselves? Of course not. There are Christians who eat way more leafy greens than I do who are much finer Christians than I am.

Do I mean to introduce my own kind of inverse food laws into the Church, so that anyone who just wants to eat "just a little healthier" comes under suspicion? Of course not. What other people eat (if it has no higher authority than that they want to) is none of my concern or business. When other believers say grace over foods I think odd, I think it is wonderful. When they thank their Creator for their food, they are not talking to me.

Eating a little healthier is great, just so long as you are eating a little healthier than you were, and it is not cast to yourself as eating a little healthier than he is.

Do I think (or did I ever say?) that anyone who cares about stewardship of agricultural resources is a Marxist hippie who struggles with sexual perversion? No, actually. The cultural mandate back in Genesis requires careful stewardship of what God has entrusted to us -- my central complaint about the stewardship schtick is that most of those urging it today are statists, and the state is the enemy of even the possibility of real stewardship. Stewardship is a basic Christian duty, which is why so many people want to pretend they are doing it. It is much easier to put a green decal on your car, or widen your phylacteries some other way, than it is to actually conserve something for real.

Anyone who doesn't see Tetzel all over again in the practice of selling carbon offsets (which you can now do at a stock exchange level), doesn't know the first thing about biblical worldview analysis. And if you don't know the first thing about how to see what a culture is actually doing, then you have no business teaching Christians what to do. If you don't understand the times, and you don't know what Israel should do, then you are not from the tribe of Issachar.

So, if I am not trying to do all these things, what am I doing? Who am I talking about? What's the point? The answer can be divided up into two categories. In the first place, I am critiquing one of the great spirits of the age, the teaching that is contained in books, articles and documentaries that is being insinuated into the Christian community. The false doctrine contained there can have really destructive effects in the lives of Christians, and as a pastoral counselor, I have more than once had a front row seat. Secular counselors have noticed some of the same pathologies -- one doctor has even coined the phrase orthorexia nervosa -- "an excessive focus on eating healthy foods."

The second category would be that of the Christian reader who wonders if I am talking about "his position." There is no telling from this distance, but if my qualifications are missed, if my point is inverted, if I am made to affirm what I have denied and versa vica, and if it is thought that I must hate farmers, then I probably am.

The Twilight Years, Part VI

Dark Have Been My Dreams of Late

In this series of posts on the Twilight Years, as presented by historian Richard Overy, we have seen how first historians, then economists, then biologists in Britain during the Inter-War years were harbingers of doom, all warning of the coming collapse of civilization. In almost every case, the solutions and preventatives offered involved a vast expansion of state powers in an attempt to do something to arrest the decline.

At the turn of the century the universal consensus was one of unbridled optimism for the future. Within twenty years it had swung to dark ideas of the breakdown and collapse of Britain. We have argued that this bi-polarity of manic optimism followed by dark pessimism is intrinsic to secular humanism, the dominant religion of the age.

But there is another bi-polarity present within secular humanism. It is the tendency to lurch between rationalism and irrationalism. It was rationalism which insisted upon endlessly repeated historical patterns of ebb and flow, upon the inherent contradictions within an economic system built upon private ownership of property and free exchange of goods, and upon the threats of miscegenation. It was rationalism which urged a vastly expanded role for government planning to counteract these threats. (What else is “planning” but an attempt to extend "rational" patterns over the chaos of an unregulated economy?) But when rationalism was at its height, irrationalism suddenly surged up and became very popular at the same time, arguing that all such attempts were ultimately useless and futile.

The particular manifestation that irrationalism took in the Inter-War years was the rise of Freudian psychoanalysis. In the 1920's, psychoanalysis was a new branch of psychological science. It immediately captured widespread interest. Suddenly, it was the irrational which was the dominant reality, not the rational. Overy writes:
There is an unavoidable impression that psychoanalysis was precisely attuned to the age in which it emerged. Barbara Low, author in 1920 of the first popular introduction to psychoanalysis in Britain, recalled Freud's own observation that, “increasingly manifest in modern civilised life are the Neurotic and the Hysteric”. Low reflected that the pressure of civilization had been “too extreme, too rapid in its action” for many people to adapt to its demands. Psychoanalysis was the therapeutic instrument for dealing with the dysfunctional nature of modern society by liberating mankind from the paralysing fear of the primitive instincts that lay concealed in the unconscious portion of the mind. (Richard Overy, The Twilight Years: The Paradox of Britain Between the Wars [New York: Viking/Penguin, 2009], pp. 137-8)
The implicit promise in psychoanalysis was that if the irrational and suppressed neuroses asserted to lie within each individual were teased forth by psychoanalysis, so that they could be faced and come to terms with, much of the dysfunction of modern society would evaporate. Freudianism argued that what was on the surface of man and society—which could be rationally analysed, evaluated, argued over and examined—was not what was real. The Real was what was sublimated unconsciously within the human psyche. The sublimated neuroses were essentially irrational. The rational man was therefore captive to the dark forces of irrationalism. Psychoanalysis held out the hope that these dark motivations could be exposed and corrected (that is, treated). This would free man to behave far more reasonably and constructively.

When psychoanalysis was taken out of the medical academic cloisters and popularised, it was immediately seized upon by the leading intellectuals of the day. Kingsley Martin, later to become editor of the New Statesman, professed himself shaken to the core by the approach.
The fear that reason could no longer be relied upon to sort out the problems of the modern world was, Martin continued, “the most devastating of all”. (Overy, p. 145. Emphasis ours.)
Freudianism—in its narrower, stricter, medical manifestations—was unable to win sustained support. It could too easily be reduced to absurdity. The obsession with sex was seen as ludicrous, if not distasteful. But in the Inter-War years the notion that an unconscious, irrational mind lay beneath the rational, superficial exterior, pulling puppet strings to control rational processes, proved very attractive and even compelling to the wider public. It offered an explanation as to why things were "breaking down". Quite rapidly the theory morphed to its “mature” stage. The unconscious came to be seen not as something which afflicted only those with neuroses of various kinds: rather, the unconscious was one of the core realities of being human.

At work in every individual and in society and nations corporately were hidden irrational passions of love, hate, aggression, sadism, masochism and narcissism which would come forth unconsciously, at any point, to overthrow reason. Irrationality ultimately trumped rationality.
The effect of mass-circulation popular psychology was not to provide effective therapy but to familiarize growing circles of the population with the idea that every individual is prey of inner demons which could manipulate at will the outer person. (Overy, p. 169)
It is difficult to conceive of a darker, more irrational world view.

By the 1930's most psychoanalysts had abandoned the consulting couch and had become social commentators, applying guilt, repression, eros, and the death drive to the wider society, to the social order, political behaviour, international crises and war (Overy, p. 163). Civilization, it was thought, was about to be swamped by irrational dark forces of the human soul.
By 1939, as one sociologist put it, there had emerged a wide popular expectation among political and social scientists that only psychoanalysts could properly explain to man the causes of our “Modern Discontents”. Even socialists, argued the journal of the National Labour Colleges, should recognize “the melancholy truth” that man “is not at bottom a rational animal” and adapt their tactics accordingly. (Overy, p. 173)

Under this variant of secular humanism, the irrational pole, civilization was the rational, the structured, the ordered, the controlled and restrained. But it was paper thin. Underneath were irrational, chaotic, primitive, infantile and arbitrary forces, and a
seething mass of instincts and drive that . . . was capable of a terrible aggression and an urge to morbid self-destruction. . . . (T)his paradigm . . . only served to illuminate what many people already suspected, that beneath the thin veneer of civilization there lurked a monstrous other self whose release would spell the end of civilized life and the triumph of barbarism. (Overy, p. 173).
In the end the Freudians, despite the wide public appeal and interest, could offer no hope. In their case, no amount of government planning, spending, or programming would ever be great enough to win. But the irrationalism of psychoanalysis added fresh impetus to the pessimism of the rationalists: not only was doom coming; it was inevitable and unavoidable, for man could not escape his unconscious, repressed, evil and brutal self. It served only to pour petrol on the fires of despair, which were already burning brightly.

In our modern day, of course, humanist psychology has swung back to its more rationalist pole. Evil, violent, masochistic, or destructive human passions are now generally believed to be the result of conditioning by society itself. This offers the prospect that such dark forces can be ameliorated by social planning, government programmes, and above all, government education. Once again, we labour under the fallacy of false cause, and therefore of false cures. When the remedy proves a failure, the appeal of darker, more pessimistic alternatives is likely to rise once again.

Wednesday 24 February 2010

Will Marjah Be a Success?

Yes, and No

The new offensive by Nato in Helmand province is grinding on. By all accounts it is slower than expected, but eventual success is almost assured. But what about this time next year? Herein lies the problem.

Paul Rogers, writing in Open Democracy, attempts to give a realistic assessment in an article entitled: Afghanistan: What It's Like.

Firstly, a summary of the present difficulties facing the operation:
The high-profile military campaign by International Security Assistance Force (Isaf) forces against Taliban militias in Afghanistan’s southern province of Helmand involves the deployment of some 15,000 heavily armed troops, who are supported by strike-aircraft, helicopter-gunships, artillery and armed drones - all ranged against perhaps 1,000 lightly armed insurgents. Despite this imbalance of power, Operation Moshtarak is already facing unanticipated difficulties.

The seizure of the main urban target, Marjah, has proved harder than expected for two quite different reasons. First, the incoming forces are discovering far greater numbers of deadly improvised-explosive devices (IEDs) than has been customary in past campaigns; in one case a contingent of United States marines took eight hours to move less than two kilometres because of the minute-by-minute need to locate and defuse IEDs. Second, in some areas where they feel more secure Taliban militias have offered strong resistance to US and Afghan army forces, often in house-to-house fighting. They have also made particularly effective use of sniper-fire, with militants often operating at long range (see CJ Chivers, “New Taliban weapon: snipers”, New York Times, 17 February 2010).

The second factor is notable in that many analysts had, drawing on earlier experience of such campaigns, expected the Taliban groups to retreat in the face of the massively superior firepower that United States forces could deploy. But there is abundant evidence of Taliban commanders being very quick to learn from changes in Isaf operating methods, and adapting their own tactics accordingly. In this respect three incidents on 13-15 February 2010 where Afghan civilians were killed in air-raids or shot in combat-zones - again part of a long-standing pattern - underline the limited use and often counterproductive effects of US air-assaults. Since the marines’ firepower advantage has begun to prove less reliable than expected, it looks very much as though the Taliban is more willing to offer direct opposition to foreign troops in and around Marjah.

None of this means that Operation Moshtarak will fail. There is every likelihood that the Taliban militias will soon withdraw from open conflict in the area, as they have already done further north. After all, Nato/Isaf’s great military superiority means that territorial gains are - at least in the short term - inevitable.
In Vietnam the inability of US forces to occupy conquered territory indefinitely proved to be a major weakness. A victorious army must occupy and maintain command and control over territory for the long term. The US plan in Afghanistan is that the Afghan army will take over this task, with large-scale aid efforts backing it up--all designed to shift the loyalties of tribesmen away from the Taliban and towards the Afghan government. How feasible or likely is this? The terrain and local realities mean that small (one-hundred-personnel) forward operating bases will need to be maintained to try to keep the Taliban in check and protect the reconstruction efforts.

It turns out that forward operating bases have proved largely ineffective at controlling the Taliban once they have been "driven out" of an area. Only a small proportion of a base's personnel can be spared for patrols. The Taliban thus have virtually unlimited freedom to re-group and wage an effective guerrilla war.
If the base seeks to maintain round-the-clock surveillance and cover a significant part of its (100 sq km) zone, then the very most that can be maintained is twelve troops on patrol at any one time. This means that each patrol-group is operating for the equivalent of eight hours a day (or night), seven days a week - fifty-six hours of arduous and dangerous active duty that even to the fittest of soldiers is hugely debilitating and even exhausting. It is a routine possible to sustain for long only via the rotation of fresh troops from the main base.

In practice, any given base-commander may choose to operate only day-time patrols over a large part of the area, with two groups out rather than one (perhaps further split into smaller units). That might cover a large area, but it also leaves the night free for insurgents to operate. It is true that bases will be aided by the use of reconnaissance drones and aircraft, airborne Sigint and Elint and satellite-based systems; but these have little effect without the work of the patrols.
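Rogers's patrol arithmetic is worth verifying, since his argument about forward operating bases turns on it. The sketch below (Python) uses his figures; the assumption that round-the-clock coverage requires three groups rotating through eight-hour shifts is our inference from his numbers, not his statement.

# Rough check of the quoted patrol arithmetic.
troops_on_patrol = 12        # Rogers: at most twelve troops out at any one time
shift_hours = 8              # Rogers: each group patrols eight hours a day (or night)
days_per_week = 7

groups_needed = 24 // shift_hours                 # 3 groups for 24-hour coverage
roster_size = groups_needed * troops_on_patrol    # 36 of the ~100 base personnel
hours_per_week = shift_hours * days_per_week      # 56 hours of active duty

print(groups_needed, roster_size, hours_per_week)  # 3 36 56

The fifty-six hours per soldier per week matches Rogers's figure, and it shows why he says the routine can be sustained only by rotating in fresh troops from the main base.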
There is every indication that the Taliban fighting in Marjah will just melt away, re-group, and wait to fight another day. The local population knows that. They regard the national Afghan government as semi-foreign. Loyalties will likely remain intensely local.
The very size of Operation Moshtarak seems at first sight to make it more than enough to drive the Taliban out of central Helmand. The emerging reality is that the militants are adapting to the assault by melting into the surrounding communities, with a few engaging in direct combat, and that they are able to survive most of what is thrown at them. . . .

In any event, what happens after the peak of the assault matters more than its immediate, local details - and that will become apparent only over many months and even years. The fighting around Marjah is being intensively reported in the western media . . . . There will be much less attention on the aftermath of Operation Moshtarak - yet, . . . it is precisely when the media caravan has moved on that the deeper realities of the Afghanistan war are revealed.


Arrogance of the "Educated" Class

Labouring Under a Regimen of Fools

The Scriptures make a distinction between knowledge and wisdom. To know something is not the same as being wise with respect to it. For example, one may be highly educated in the intricacies of how a hunting rifle is constructed, its technical specifications, the peculiarities and functional applications of its calibre, the varieties of loads and which is superior in which conditions, its design provenance, the laws of physics operating when it is fired, and where that particular rifle fits into the panoply of hunting rifles, past and present.

Knowing about all this does not ensure that the rifle will be used properly, safely, and effectively, without harm or damage to oneself or others. The Scripture warns of the dangers of knowledge when it says, "knowledge makes arrogant, but love builds up." (I Corinthians 8:1). In the Bible wisdom has to do not only with accurate and correct information, but the proper use and application of that knowledge to life. Wisdom is the happy condition of acting in a manner condign with the Creator as His sub-creator in His created world.

For many in the modern world, knowledge is the equivalent of wisdom. In our rationalistic age the wise person is the one who knows more, the one who is educated.
Thus, it has become increasingly common in political and social discourse in the United States (somewhat less so in New Zealand) to speak of the "educated classes" and the "uneducated classes": the former are superior, smarter, wiser, and better fitted to rule than the latter. Probably the most offensive thing about Sarah Palin to the liberal left in that country is that she is deemed uneducated, not part of the educated elite. She is therefore ignorant or stupid or both. She is consequently deemed completely unfit to be a political leader in the Republic.

In Biblical terms, however, it is possible that Palin may be deemed wise, and her political opponents, although Ivy League educated, fools. In Scriptural parlance, having read more books does not make one wise; it may merely mean one is a more loquacious polysyllabic fool. (In this vein, it is salutary to remember that when Ronald Reagan was campaigning for the Presidency, his political opponents and the media constantly portrayed him as simplistic and semi-idiotic. Without putting too fine a point on it, the narrative about Reagan was that he was dumb. He was not part of the educated class. Yet there are few now who would question that he was a highly effective President. In a similar vein, George Bush Jr was constantly caricatured as ignorant and stupid, a bumbling fool. Now with him there was a slight hiccup in the framing because he was a scion of the Ivy League: but he mangled his sentences (due to dyslexia) and that proved he was ignorant. Because he could not use words properly, he was obviously a fool, and was lampooned accordingly.)

In New Zealand, Anne Tolley, the Minister of Education, has regularly been smeared and attacked by her left-wing political opponents and by the teacher unions as ignorant, stupid, obtuse, and dumb. It is the same syndrome. Generally, those on the left see themselves as fundamentally smart because they are clever enough to promote statist solutions to every perceived problem. Their opponents are believed to be ignorant because either they don't see the problem in the first place or, if they do, they don't understand the only fundamental solution, which is always a state-run, central-government one.

Using the Bible's authoritative frame of reference we may acknowledge that a particular political leader may be highly educated, clever, smart, and intellectually sophisticated. But that does not make him or her wise: it may only serve to make them more dangerous. If, in their self-referencing cleverness, they pursue and enact laws and policies which are not condign with the Creator and His world, great damage will be done. Fools they will have proved to be.

Humility always attends the truly wise, for they recognize that they have no calling, authority, or right to attempt to remake the world in their own image. The truly wise consider themselves to be stewards, responsible to God, appointed and restricted to sub-creation, not original creation. The wise know they must work within God's frame; fools attempt to make a new world after their own image.

In our day, unfortunately, education often is inseparable from foolishness. To be under the aegis of the educated classes is not a pleasant prospect. For in our day, the "educated" are taught to be self-referential. They have been instructed that to be educated means that you have the knowledge, skills, insight and wisdom to be able to make a new world as it seems good to you. Because one is so endowed one deserves the people's respect and trust. One supposedly is therefore fit to lead and govern.

The Proverb says that one of the most difficult things for the earth to bear is when a slave becomes king. We can add to that. It is very difficult for a people to bear leaders who equate education with wisdom, who think that their education justifies their self-referential stupidity and foolishness.

Tuesday 23 February 2010

Sacrificing to the Body Politic

Beware Croesus Cloaked in a Noble Cause

In the US, John Edwards has been exposed as an ambitious charlatan, a populist who carefully crafted a public image that was far from the truth. The public focus has recently been upon his personal life. However, as David Bahnsen argues, the damage he has done in his public life and professional career has been far greater.
Edwards managed to do the impossible in his legal career, and by impossible, I mean that he made most plaintiff’s trial lawyers look respectable by comparison.

No case was too frivolous for the Carolinian wonder boy, and no case was unwinnable either. He beat down doctors to a point of humiliating submission, and singlehandedly changed malpractice premium costs for an entire region of the country. Medical malpractice cases are a complex thing, I suppose, and I am not sure it is possible to prove my case, but I would propose to you that John Edwards' legal war against doctors has to have cost lives over the years, and almost certainly has cost tremendous amounts of suffering and pain (for patients, or as John calls them, “the little guy”).

How could this be, you wonder? Wasn’t he just a knight in shining armor taking on an evil medical industry that refused to look out for the little guy? No. Not in the slightest bit. What he did was sue the American Red Cross repeatedly because some blood transfusions went wrong (a tragic freak accident, yes, but a tort, hardly).

What he did was put the fear of God in every doctor’s mind in his part of the world that if they slightly over-prescribed (or under-prescribed) a treatment, he would ruin their lives. Treatments that should have been recommended were inevitably avoided because of the threat of John the Dirtbag coming after them.

He carved out an impressive niche in tragic cerebral palsy cases, forcing obstetricians and their insurance carriers to settle cases (for huge dollar amounts) to avoid highly dubious accusations that an action (or inaction) on their part led to cerebral palsy in the cases of various infants. How any member of the medical profession ever voted for this wretch of a human being is beyond me.

At some point in John’s career he realized that ruining doctors' lives to enrich himself had hit a ceiling, and he now had to turn his guns on a different classification of victim – one in which torts could really hit high numbers: product liability cases.

So he put companies out of business over freak home accidents and pocketed multi-million dollar fees every step of the way. He was notorious for the most contemptible behavior imaginable in the courtroom including claiming that deceased family members of his clients were inside his body talking through him. Like I said, a real piece of work.

As an aside, but not unrelated--virtually to a man, tort lawyers in the US support the Democratic party, and pony up significant donations to prove it. Consequently, the Democratic political machine assiduously protects the interests of tort lawyers, refusing to reform the corrupt and immoral US tort system.

It is noteworthy that in the 2,000-page Democratic House health bill and the 3,000-page Democratic Senate health bill, any attempt at medical tort reform is conspicuous by its absence. We know why.

A Terrorist By Any Other Name . . .

Politicians and the Truth

In recent years we have seen various governments (all left wing, it should be noted) that have focused an awful lot of energy upon constructing and manipulating public perception. Labour under Blair in the UK, Labour under Clark in New Zealand, Labour under Rudd in Australia, and Democrats under Obama in the US have all had an unhealthy preoccupation with "spin", trying to massage a message, telling the electorate how they should think, not by constructing a coherent or compelling argument so much as by revising the vocabulary and the categories applied to anything and everything.

It is the politician as salesman. It is politics in a Wittgensteinian post-modern world. Now, to be sure, all politicians engage in this to a certain extent. It is part of the territory, sadly. But some administrations seem to substitute spin for effective government, as if they were one and the same thing. To govern is to spin--at least that's how it appears. "You can't handle the truth," seems to be the consensus view about voters and the public.

Of course, in the long run, it all wears a bit thin.
The administration that spins eventually becomes itself "spun" as weak, insubstantial, and above all, untrustworthy. Smart oppositions press the point constantly. So Tony Abbott's characterisation of Kevin Rudd can gain traction because the mud finds places upon which to stick: "(h)e has called Rudd dishonest, deceptive and a serial promise-breaker, a toxic bore, a prime minister who hides behind a "wall of incomprehensible words and an army of spin doctors"." (Sydney Morning Herald)

The Obama administration believes it is vulnerable to criticism over terrorist attacks. Its polling has told it that the public thinks it is "weak" on combating terrorism. It is currently very busy extolling every advance and downplaying every reverse. In particular, every lawless terrorist act is rapidly reframed as a criminal deed rather than an act of terror. Why? Criminals are ordinary, a dime a dozen, part of everyday life. Consequently, the word "terrorist" has now lost a good deal of its meaning.

Major Nidal Hasan, who shot his colleagues at Fort Hood, was definitely not a terrorist, we were told, because he was acting alone. In the lexicon of the White House, a terrorist has to be part of a conspiracy; you have to be planning with other people to commit an act of terror. President Obama insisted in his first public announcement that the murderous beast who tried to blow up the Detroit-bound airliner on Christmas Day was acting alone. (He had to retract this later, when it became evident that the man was in fact part of a conspiracy.) But why was it important to make the point? Because in the White House lexicon it would mean that he was not a terrorist. He would be simply a criminal or a madman. He could not be used as evidence that the Obama administration is "soft" on terrorism. Category revision is the classic move of a spin doctor.

In the most recent case, that of the murderous man who flew his aircraft into a building in Austin, Texas, the first thing the White House spokesman wanted to communicate, once again, was that it was not an act of terror. Why? He was acting alone.

In truth all of these acts are terrorist acts. It is incumbent on leaders worthy of the name that they should tell the truth--especially on such life-and-death matters. Terrorism is nothing more nor less than the attempt to advance a political goal by killing or harming others, with the intent to scare people and governments politically, so they will do what the terrorist wants. There is always an intended public message in a true terrorist act. There is always murderous intent to kill innocents. There is always an intent to make others afraid so they will be cowed into compliance. It has nothing to do with whether one is acting alone or in concert.

Thus, Nidal Hasan's murders were a terrorist action. The suicide plane hitting a building in which the IRS was working was also a terrorist act. Most mass shootings are terrorist acts.

The more politicians attempt to spin it another way, the less credible they become. In the end people want the truth. They can handle it. They respect leaders who insist upon it--in fact, the credibility and authority of leaders who prove themselves honest rises substantially.

May God in His mercy grant us honest leaders.


Monday 22 February 2010

Doug Wilson's Letter From America

Sarah Palin and Personal Loyalties
Politics
Written by Douglas Wilson
Saturday, February 20, 2010

This will be a hard one to explain, but let me just put it out there anyway.

Those who have followed the political threads on this blog will know that I like and respect Sarah Palin. This is not to say that I believe she is necessarily equipped to be president, but my reservation at this point is not to be taken as a criticism of her compared to all the others jockeying for that position. In my book, the last person who was qualified to be president was probably James Madison.

That said, allow me to point to a couple of recent things that make me like her even more, and which indicate to me that whatever she is doing, it isn't the ordinary political game.

Here is the first indication. Harry Truman said that if you want a friend in Washington, you should get a dog. But contrasted with this, Sarah Palin understands that while political convictions are important, personal loyalties are crucial. I write this hoping that John McCain loses to his primary challenger, and I believe that it is important for the budding tax revolt that he lose. I will rejoice if he loses, and at the same time I think it is admirable in the extreme that Sarah Palin is campaigning for him. If Sarah Palin has endorsed Ron Paul's son in the Kentucky race, as she has, and she endorsed Hoffman in the NY race, as she did, then she certainly knows what the issues are, and knows the difference between party loyalties and personal loyalties. There would be no way for her to turn on John McCain, after what he did for her (for whatever reason), without selling her soul to the twin devils of political expediency and personal ambition. She isn't doing that, probably because she is more interested in going to Heaven than in going to Washington. Let's hope she keeps it that way. So I hope McCain loses, and I hope that Sarah continues to give him her full support.

Second, while it is fun to see rowdy conservatives looking forward to the future, as they have been doing at the CPAC convention, I was extremely grateful that Sarah Palin gave that convention a miss. "Politics as usual" includes the methods, and not just the content. Though there are stout souls at such events, there are also opportunists aplenty. It is extremely difficult to show up at a convention like that without looking like Absalom glad-handing at the city gates.

I have my differences, certainly, but Sarah Palin continues to rise in my esteem.

Last Updated in Blog and Mablog on Saturday, February 20, 2010

Why We Love the Jews

Riches for the World

From the standpoint of the gospel they (the Jews) are enemies for your sake, but from the standpoint of God's choice they are beloved for the sake of the fathers; for the gifts and the calling of God are irrevocable.
Romans 11: 28-29
In Romans 11 we have the course of redemptive history, which is to say human history, presented in a nutshell. Redemptive history is how God, over time and through the ages, is working to save the world. Since the world is ruled by His Messiah, Jesus Christ, and since all enemies of Christ are being progressively, gradually, and ineluctably placed under His feet, redemptive history is actual human history. There is no other true meta-narrative.

Within this glorious narrative the most important dynamic is the Jew-Gentile dichotomy.
From the very beginning of God calling Israel and entering into a covenant with the fathers He made it clear that Israel was being called so as to redeem the entire world. All the families of the earth were going to be blessed through His calling of the Jewish people (Genesis 12:3).

In this first phase of redemptive history God separated Israel from the nations to be a holy nation unto Him. In the second phase of redemptive history--in which we now live--the Jews, stubbornly disobedient and rebellious (the natural branches of the redemptive tree, Romans 11:16), have been cut off, and God has turned to the Gentiles--all the rest of the families of the earth--to graft them in.

Now, over the course of this phase to date, millions of Jews have been re-grafted back in as they have come to embrace Jesus Christ as their long-awaited and promised Messiah of God, and have repented and believed. But the majority have not. A time will come, however--the third phase--when the Jewish people as a whole, as a majority, will be grafted back in again. This will result in turbo-charged divine blessings being poured out upon the whole world. (Romans 11:12)

Paul reveals this when he says that the partial spiritual hardening will continue to hold the Jews in check until "the fulness of the Gentiles" has come in; but then God will turn once again to the Jewish people, and all Israel (then living) will be saved (Romans 11:25-27).

This is why Christians must maintain a deep humility towards the Jewish people--a profound love and longing for them. Of course, this does not mean that we excuse or rationalise away Jewish idolatry or disobedience or rejection of Christ. We must continue to speak the truth in as winsome a way as possible. But we ever remember that for the sake of the fathers they remain beloved of God, and that one day, when the fulness of the Gentiles has come in, He will turn again to His ancient people, their eyes will be opened, and they will see and embrace their Messiah. There will be dancing in the streets all over the world.

But the fullness of the Gentiles must come in first--including the conversion to Christ of Islamic people all over the globe. Therefore, because we love the Jews and long for their salvation, every year we double down on our labours to make all the nations of the world true disciples of Jesus Christ. Then true blessings will fall upon Israel, and as a consequence the entire world will behold and experience the fullness and glories of our Saviour God.

Saturday 20 February 2010

Update on Australian Bush Fire Prevention

Irresponsible Environmentalism

We have posted previously on how the loss of life in the Victorian bushfires was indirectly caused by the failure to conduct regular, periodic controlled burn-offs. Opposition from greenie groups and from townies-cum-lately bush-dwellers had stymied such bushfire mitigation strategies.

Miranda Devine updates us on progress being made by the appointed Royal Commission, and the political backlash that is already underway.
Fire prevention a burning issue

MIRANDA DEVINE
February 20, 2010

A year after the Black Saturday inferno, which killed 173 people, Victoria's bushfire royal commission is at last reaching the pointy end of its inquiry - the politically charged topic of prescribed burning and the effect of massive, unmanaged fuel loads on the fire's ferocity.

So yesterday, instead of relaxing in retirement, 79-year-old Athol Hodgson, Victoria's former chief fire officer, strapped on his armour to give evidence on the most important, and bitterly contentious, issue before the commission.

He is fortunate to have a sharp mind and keen memory, and the long experience of fire that makes him possibly the most knowledgeable person in the state when it comes to Victoria's prescribed burning program and its uniquely flammable bush.

He also has seared into his mind the personal nature of the threat. As a nine-year-old on his parents' dairy farm in north-east Victoria during the 1939 Black Friday fires, which killed 79 people and incinerated 2 million hectares, Hodgson woke in fright on the back veranda, where he slept with his big dog, to see flames advancing 500 metres away through the paddocks surrounding the house.

He remembers his father and eight older sisters fighting all night to beat out the fire with wet hessian bags. And the next morning, when they came home to milk the cows, he was sent to watch the embers for any signs of fire.

But, he told the commission yesterday, the lessons of 1939 have been forgotten and there has been a ''failure to provide a safe environment'' for Victorians for three decades. . . .

''And unless the government acknowledges present targets of 130,000 hectares [a year] are not based on science, but based probably on some Treasury restraint, and realises something needs to be done, we'll get more tragedies.

''It's the most important of all the issues facing the royal commission. If they don't do something about the cause of these fires, which is fuel, large numbers of people are going to die.''

He says the build-up of fuel before last February's fires was ''the worst in the history of white settlement''.

He urged the royal commission to recommend a trebling of prescribed burning to 365,000 hectares per year.

His strategy of setting a mandated target means the government will be held accountable for any future bushfire calamity due to excessive fuel build-up.

Contrary to mischievous claims by green groups, no one says prescribed burning prevents bushfires. But the evidence is incontrovertible that reducing ground fuel by controlled burning in the cooler months does reduce the speed and intensity of inevitable wildfires, and actually makes it possible to control them.

As the chief fire control officer of the Department of Conservation, Forests and Lands, Hodgson experienced first-hand the political interference in the mid '80s which eventually hobbled the prescribed burning program that had kept Victoria's flammable bush safe since the Stretton Royal Commission into the 1939 fires.

He praises Joan Kirner as the best minister he worked for, but she was as susceptible as any politician of the time to greenie and NIMBY complaints. She ordered a prescribed burn in south-east Victoria be stopped because smoke was affecting the autumn festival in the town of Bright, which, ironically, later almost was burnt out.

Another time, a prescribed burn was stopped because of fears for the long-footed potoroo. ''After that, the program collapsed - every prescribed burn that caused someone's eyes to water and a few that got out of hand became media news.''

As he told the commission: ''The pressure … placed on individuals required to ignite and manage fires was enormous. Prescribed burning is not an exact science and … always involves some risk … In the political climate prevailing through the 1990s, any outcome that made news was unacceptable, regardless of whether or not the objective of burning was met. Staff were exposed to criticism … Many doubted the support they would receive if 'their burn' became newsworthy.''

People ''doing their best on the ground said, 'Bugger this, it's not worth it'. The program collapsed [to the point that] one year only 40,000 hectares was burned for the whole state.''

The Hodgson recommendations, if adopted, will embarrass the Brumby government, which has tried to offload blame onto global warming and now fears the commission is out of control.

In an indication of how high the stakes are in an election year, Brumby's lawyer launched an extraordinary attack on the commission this month, claiming it was ''irresponsible and sensationalist, adversarial, pointless and damaging'' to suggest anything other than that authorities were ''simply overwhelmed by an extraordinary, unprecedented fire''.

Unmoved, the royal commission is taking seriously submissions on the need for more prescribed burning. This week counsel assisting, Jack Rush, QC, pointed out that no prescribed burns had been conducted around Kinglake, where 42 people died, since 1981.

Ever on guard against logic, the Wilderness Society, Victorian National Parks Association and Australian Conservation Foundation, which have a lawyer representing their interests at the commission, have mounted their usual campaign of disinformation and devious obstruction.

Their submission claims prescribed burning does little or nothing to slow fires during extreme conditions.

They fool no one except fools. And the royal commission doesn't appear to be manned by fools.




Bureaucrats Pure as the Driven Snow

A Great Guffaw Moment

The West believes passionately in the sinlessness and perfection of man as an article of faith. At least in an abstract sense. Mocking and sneering at the Bible's declaration of universal human depravity, the West has turned from the worship of God and replaced it with a spurious reverence for man.

This deep and profound religious attachment to man, which is now the established religion of the West, is the ultimate and final expression of idolatry. When Adam and Eve rebelled against God and sinned, the Serpent acutely revealed the essence of the matter: in the day they ate of the Tree of the Knowledge of Good and Evil they would become like God, knowing good and evil for themselves (Genesis 3:5).

"Primitive" cultures and civilisations were steeped in idolatry. It was an intrinsic part of the world-view. Men speculated over and created deities; they made representations of them; they bowed down to them and sacrificed to them. The truth of the Serpent's assertion lay hidden beneath the surface. Man bowing down to dumb idols does not much seem like man being as God. Yet it was. It was man who was determining for himself what gods existed, how they acted, what they were like, and how they were to be mollified and won over.

In the West, in the post-Enlightenment age, the veneer disguising the Serpent's proposition has been stripped off and tossed aside. Man has come forth asserting boldly and without shame his prerogatives to determine good and evil for himself. No longer does the modern man feel the need to genuflect to a higher power or entity in pseudo-humility. No longer does he fear retribution if he fails to do obeisance. In the West, man has grown up. There is no truth, right, wrong, or reality higher than the ratiocinations of the Western human rationalist mind.

It has finally emerged--the truth of Satan's observation about man as god. It has taken millennia, but eventually, in the West, the rebellion of man against God has reached its apogee. Man no longer feels the need to "hide" or disguise his assertion of supremacy and autonomy: he is confident enough to assert it outright. There is no entity or being higher than man. There is no god to be worshipped. There is only man. Idolatry has now reached its most consistent expression: it can go no further in terms of its overtness or the consistency of its outworking.

Another way of expressing this is to say that in the West idolatry has taken its most extreme form; the rebellion of man against God has matured to the point of succeeding in building a civilisation that is more or less consistently grounded upon the proposition that man is god. And this is a first in human history.

But just a few pesky problems remain. Whilst in the West man is now definitely in charge (at least in his own mind), Western civilization continues to be beset with a daunting array of problems ranging from disease to crime; poverty to illiteracy; credit crunches to obscene bonuses paid out to, in President Obama's words, "Wall Street fat cats". We could go on with an extensive litany of "sins" that remain part of modern experience. Man may be in charge, but the old saw about lunatics running the asylum bites a bit too close to the bone.

Thus, human perfection and sinlessness must be seen as relative concepts. Some are more sinless and perfect than others, it would seem. And here we come to one of the great guffaw moments of our age. In this most extreme form of idolatry, the West is forced to cast about for someone--some representation of humanity more godlike than the rest--who can be trusted to lead the race out of its remaining problems to its final perfect state.

In order for our extreme idolatry to maintain a veneer of credibility there must always be at least one class of human beings which has achieved a greater slice of divinity, which has risen above the petty and the petulant, which has achieved one of the key attributes of deity. This attribute is essential to lead other men in the West to their final perfected state, to an existence where manifold problems are solved once and for all. It is the attribute of absolute disinterest, of pure objectivity, which would lay aside self-interest, and act only in the interests of others, or for the good of man as a whole. Only such men and women are worthy of being trusted and believed in so as to lead all in the West to final perfection. When man is god, it will always turn out that some men are more divine than others. It is this deifying, not just of men in general, but of a specific type of man, or class of man, or individual as being more divine than others, and therefore worthy of our trust to bring man to perfection, which is the guffaw moment. It is the great joke of the age.

This uber-deifying of a certain man or class of men is so common and so intrinsic to the West that it no longer seems remarkable. It has achieved the status of being beyond debate, a truth to all intents and purposes self-evident.

Theodore Dalrymple, writing in City Journal, gives us a classic rendition of this psychosis and spiritual blindness. He is writing about John Kenneth Galbraith, about whom he says many interesting things which we hope to discuss in another post. Galbraith, of course, is probably the most lauded American economist of the twentieth century, and he has some very definite views about which section of Western society has achieved absolute disinterest and is godlike enough to lead the rest of us.

Dalrymple points out that Galbraith had a deep disdain for private business corporations--a disdain which is coming back into vogue. He held that the bigger and more successful a private business became, the more its management developed a bureaucratic mindset and began to look after its own interests. Clearly business managers, as a class, have not yet achieved the levels of perfection that are required of those whom we can trust. Where, then, does Galbraith place his faith?

Galbraith places his faith firmly in government bureaucrats. Dalrymple takes up the narrative:
There remains, however, an astonishingly gaping absence in Galbraith’s worldview. While he is perfectly able to see the defects of businessmen—their inclination to megalomania, greed, hypocrisy, and special pleading—he is quite unable to see the same traits in government bureaucrats. It is as if he has read, and taken to heart, the work of Sinclair Lewis, but never even skimmed the work of Kafka.

For example, the chapter entitled “The Bureaucratic Syndrome” in his book The Culture of Contentment refers only to bureaucracy in corporations (and in the one government department he despises, the military). Galbraith appears to believe in the absurd idea that bureaucrats administer tax revenues to produce socially desirable ends without friction, waste, or mistake. It is clearly beyond the range of his thought that government action can, even with the best intentions, produce harmful effects.
And later:
Galbraith explains resistance to higher taxation thus: “It is the nature of privileged position that it develops its own political justification and often the economic and social doctrine that serves it best.” In other words, men—except for Marx and Galbraith—believe what it is in their interest to believe. It is hardly surprising that Galbraith always writes as if what he says is revealed truth and counterarguments are the desperate, last-ditch efforts of the self-interested and corrupt.

Galbraith never solved, or even appeared to notice, the mystery of how he himself could see through self-interest and arrive at disinterested truth. In general, his self-knowledge was severely limited.
But in believing as he did, Galbraith reflects the most widely held "vote" in the West for the class of men who have achieved higher states of deity and who can be trusted to lead us lesser gods into nirvana. It is the gummint. It is the state bureaucrat who is the ultimate disinterested, self-denying, other-centric human being. All these are attributes of deity. It is this breathtaking folly which makes us split our sides with laughter.

But we are careful not to make our mirth too public. To question the higher-being status of government functionaries comes perilously close to blasphemy in our age. Is it not self-evident that some are more human (and thereby more divine) than others? Is it not self-evident that whatever problems remain, government functionaries will deliver us from them? Is it not self-evident that the more problems arise, the more functionaries we need? These things are believed by all, and are beyond dispute.

In the West we have made ourselves to be gods, knowing good and evil for ourselves. Government bureaucrats got there first. Human civilization's greatest tragic farce.

Friday 19 February 2010

Faux Outrage

Bethunism: A Neologism for Gross Idiocy

There can be few things more self-servingly maudlin than the display of faux outrage we have had to endure over a New Zealander currently being held on a Japanese whaling ship.

The facts are these--and they are not in dispute by anyone, apparently. One Peter Bethune, idiot extraordinaire, international man of misery, went down to the Southern Ocean to protest against Japanese whalers. His fizzy speed boat collided with one of the whaling vessels. He and his fellow conspirators called for the New Zealand government to come to their rescue and defend them against the hostile acts of the Japanese.

When that histrionic move failed, the indefatigable Bethune upped the ante. He boarded the Shonan Maru 2 illegally and sought to make a citizen's arrest of the captain. He was instead arrested, and is (we hope) being held in irons on the Japanese ship. His co-conspirators have tried to pressure the New Zealand government to intercede with the Japanese government to negotiate this enviro-warrior's release. They are angry that their calls have fallen on deaf ears. One of their number, Mr Paul Watson, is reportedly concerned that he has not heard from Bethune (it is probably difficult to make contact when you are languishing in irons) and is worried because apparently Mr Bethune has a cut hand. Diddums. Things like that tend to happen when you break the law and engage in hostile acts of piracy.

Here is our advice to the government. Foreign Minister McCully needs to contact his Japanese counterpart and inform him that the position of the New Zealand government is that any NZ citizen who commits a crime (as defined by New Zealand criminal law) outside this country must face the full weight of the law of the local jurisdiction. Therefore, the government will not be interceding for Bethune in any way, and supports his being taken to Japan in irons and delivered up to the Japanese authorities when the Shonan Maru 2 returns to that country. The government expects the Japanese to treat this matter according to the rule of their own law and international maritime law. This position of the NZ government is naturally without prejudice to its position on international whaling. That's it. Enough said and done.

We cannot leave this nonsense without commenting on the Labour politician Chris Carter, who has bizarrely urged the government to "get involved" so that relations with Japan don't sour any further. OK. So the government taking up Bethune's case to shield him from the legal consequences of his criminality is going to help our relationship with the Japanese government how, exactly? Mr Carter is a joke.

But perversely he does expose an opportunity for improving relations with Japan--not that they are really under any threat. Supporting the Japanese government's right to hold Bethune to account under the law will nurture our relationship with that country just nicely.

Thursday 18 February 2010

What's Going On Here?

Fishbait or Big Catch?

US media have been agog over the capture of a "big fish", Mullah Abdul Ghani Baradar, reportedly Number Two in the Afghanistan Taliban. He was captured in a "joint operation" between Pakistan and the CIA. The White House has trumpeted the "capture".

But now things are emerging which, if true, cast a different light on the event. We are reminded that in war the first casualty is the truth. The NY Times, after first announcing the capture and congratulating the White House, has now come out and raised all sorts of questions.

Here is a take on the conflicting stories, as carried today in National Review Online:
The Strange Case of Mullah Baradar [Dana M. Perino and Bill Burck]

Yesterday, the New York Times broke the story that one of the Taliban’s top military commanders, Mullah Abdul Ghani Baradar, had been captured in Karachi during a joint raid by Pakistan’s Inter-Services Intelligence (ISI) and the CIA. Baradar is reportedly second only to Mullah Omar in the Taliban’s loose hierarchy. The Times reported this as a huge victory, and the Obama administration took a few well-deserved bows.

At first blush, and based solely on the Times’s reporting and the administration’s reaction, this did indeed appear to be a major achievement. We noticed something odd with the triumphant tone, however. The article published yesterday noted that Baradar had been one of the Taliban’s “most approachable leaders” and one of the few Taliban commanders willing to negotiate with President Karzai's government.

This struck us as discordant with the dramatic raid, capture, and interrogation of Baradar initially described by the Times. Baradar was not captured in a spider hole, like Saddam Hussein was, or hiding out in Rawalpindi, Pakistan, like Khalid Sheikh Mohammed was. Instead, it appeared that his location was not much of a secret at all.

At first we wrote this off as more evidence of the on-again/off-again “cooperation” we receive from the Pakistani intelligence service, and Baradar’s capture was a good sign it was on again. But, according to new reporting by the Times today, the reality is far more complicated.

Today, The Times is reporting that the real story behind Baradar’s capture is that Pakistan wanted to gain a place at the table in negotiations between the U.S. and Karzai and the Taliban.

Specifically, Baradar, it turns out, was one of Karzai’s main contacts with the Taliban for years, and he was at the center of efforts to negotiate a peace with the Taliban. Pakistan was frustrated at being excluded from the talks, so it snatched up Baradar to gain an advantage.

The Times quotes an unnamed American intelligence official: “I know that our people had been in touch with people around [Baradar] and were negotiating with him. So it doesn’t make sense why we bite the hand that is feeding us. And now the Taliban will have no reason to negotiate with us; they will not believe anything we will offer or say.” If this is true, then the capture of Baradar is not exactly what it first appeared. And if Baradar was as central to Karzai’s and America’s efforts to negotiate with the Taliban as the article suggests, then there appears to be significant costs to the capture. Perhaps it was even unhelpful to Karzai and the U.S.

Does capturing Baradar really further U.S. strategy? (Perhaps the administration did not view him as a valuable contact and thought he would be more useful in custody and subject to interrogation.) Or does it actually harm U.S. strategy? Was it forced on the Obama administration by the Pakistanis? If so, does the administration’s triumphant tone reflect its true feelings about the importance of capturing Baradar, or is it a smokescreen?

The fact that the New York Times, not known for its strength of objectivity in covering the Obama administration, is reporting this suggests to us that there’s a better-than-even chance that the administration is trying to turn a lemon into lemonade.

Its public messaging is that the capture of Baradar is a huge win in the ongoing war with the Taliban. But is the administration concealing the downsides of the capture? We hope not. And we certainly hope the administration is not crowing about capturing Baradar in Pakistan in order to distract from the difficulties it has had on the home front with the KSM trial and Mirandizing the Christmas bomber. But if the Times story is accurate, the evidence is beginning to tilt in the wrong direction.
Curiouser and curiouser.

The Twilight Years, Part V

We're Eugenicists and We're Here to Help

The fear for the future that gripped Britain in the two decades after World War One rested not just on economic misdiagnoses. There was also a prevailing concern over miscegenation and genetic deformation. So argues Richard Overy in his book, The Twilight Years.

Leading professional historians were telling the public that societies inevitably decline, that Britain had reached its high point, and that decline was now certain. Things would go from bad to worse from this point, the public were told, and they believed it. Economists agreed, arguing that capitalism was so internally contradictory that it would grow incompetent and collapse. Moreover, it was fundamentally immoral. No civilisation could be sustained upon such unethical and immoral foundations. But things might be assuaged if the worst excesses and contradictions could be offset by bureaucratic planning. The Soviet Union was held up as the way forward.

But a further problem was “sickness” in the racial body of the nation. Consider Julian Huxley's diagnosis of the threat, written in 1930:
What are we going to do? Every defective man, woman and child is a burden. Every defective is an extra body for the nation to feed and clothe, but produces little or nothing in return. Every defective needs care, and immobilises a certain quantum of energy and goodwill which could otherwise be put to constructive ends. Every defective is an emotional burden—a sorrow to someone, and in himself, a creature doomed, when unassisted, to live an incomplete and sub-human existence. Not only that, but if their numbers continue to increase, the burden . . . will gradually drag us down. Cited in Richard Overy, The Twilight Years: The Paradox of Britain Between the Wars (New York: Viking/Penguin, 2009), p.93.
The "sickness" with which society was afflicted consisted of sub-standard people having too many (defective) children, which were a burden upon society, bringing about its inevitable collapse. This “world-view” of course was Malthusian: population will always run ahead of food supply, until war, death, disease or famine kills off sufficient people to bring things into equilibrium. This was the diagnosis. The solution: control the reproduction of defectives through “birth control”.

One of the earliest advocates in Britain of birth control was Marie Stopes. Overy takes up the narrative:
In May 1921 Marie Stopes organized a public meeting on constructive birth control at the Queen's Hall in London . . . . She had been advised that she might find the hall almost empty, but on the night, according to a sceptical eye-witness, there was no “trickle of ill-dressed fanatics” but a packed crowd of “quite normal-looking people”. After a lengthy organ recital, Marie Stopes, resplendent in a shining white dress, took the stage to berate the audience about the perils and expense of allowing “wastrels” to breed. The record of the meeting indicates applause at every opportunity. The only people who should become parents, she insisted, were those who could “add individuals of value to the race”. In her final remarks of the evening she told the audience that if race selection were successful they would look at their grandchildren and “think almost that the gods had descended to walk upon the earth”. . . . (Overy p. 96)
The notion of the improvement of the race was the foundation upon which the birth-control movement was built.

In our day, this kind of rhetoric grates horribly--at least to many. But in the Inter-War years in Britain it did not. Why the difference? Firstly, the superiority of the Englishman was a commonly held view at the time. Britain was an imperial race, and therefore superior: it could maintain its Empire only by maintaining its racial purity. Breeding superior progeny was seen as essential to maintaining the glory of the Empire: without it, Britain would inevitably decline.
The problem was famously encapsulated by David Lloyd George, the first post-war prime minister, when he warned an audience that it was not possible to run an A1 empire with a C3 population. These alphabetic categories were used by the army to label the physical qualities of recruits. . . . Sir James Barr, onetime president of the British Medical Association, testily observed that “while the virility of the nation was carrying on the war, the derelicts were carrying on the race”. (Overy, pp. 97-98)
Secondly, the increasingly popular social Darwinism of the age naturally led to an assessment of the human race which broke society up into categories of superiors and inferiors, not by virtue of bearing authority to rule or to be obeyed, but by virtue of genetics. It was an easy step, once you had accepted the Darwinistic world-view, as almost all intellectuals and public protagonists had, to suggest with the utmost sweet reasonableness that across the human race lay a spectrum from inferior to superior genetic models. The survival of the race depended upon breeding out the inferiors, and increasing the fecundity of those with superior genetic makeup. Stopes openly argued (without recrimination or reaction) that “race suicide” would result from “excessive breeding of inferior stock”.

Such sentiments are blasphemous anathema to modern pagans. Why? Largely, we suspect, because of the loathing in which the eugenics movement in Nazi Germany is now held (at least officially). Ideas do have consequences: the consequences as made evident in Nazi Germany are too terrible and horrible now to contemplate. It is now politely ignored and swept under the carpet that eugenics was every bit as alive and fashionable in Britain as it was in pre-War Germany. This loathing has, in the post-World War II West, forced eugenics into operating beneath the radar screen. Eugenics is still widely practised in the West, but not necessarily as a government programme or promotion. It has been reframed as a "personal choice". Tests are now routinely done on unborn children to ascertain whether they have a disease or defect. When tests prove positive, parents are invited/encouraged to kill the unborn child.

Throughout the twentieth century, eugenics was practised in the United States as well, as traced by Edwin Black in War Against the Weak: Eugenics and America's Campaign to Create a Master Race (New York: Four Walls Eight Windows, 2003; Thunder's Mouth Press, 2004). As Michael Gerson points out in The Washington Post, Black recounts
efforts by distinguished scientists, academics, industrialists, health officials and jurists through much of the 20th century to “direct human evolution” by waging war against people with developmental and physical disabilities.

Black points out that early last century, the American Breeders Association -- supported by generous grants from Andrew Carnegie -- created a committee to study “the best practical means for cutting off the defective germ-plasm of the American population.” The panel included doctors, economists and attorneys from Harvard, Yale, Princeton and the University of Chicago.

Black continues: “During a number of subsequent conferences, they carefully debated the ‘problem of cutting off the supply of defectives,’ and systemically plotted a bold campaign of ‘purging the blood of the American people of the handicapping and deteriorating influences of these anti-social classes.’ Ten groups were eventually identified as ‘socially unfit’ and targeted for ‘elimination.’” Among those groups, according to Black, were the “feebleminded,” epileptics, the “insane,” the “deformed” and the “deaf.”

Eugenic sterilizations did not end in the United States until the 1970s, endorsed by a decision of the Supreme Court. Citizens with Down syndrome and other genetic challenges are increasingly rare in America, because of prenatal testing and abortion. And as such genetic perfection is pursued, those who lack it are subjected to increased prejudice.
Accordingly, Social Darwinism has been "reworked" and "re-morphed" in the modern generation. Now it is fashionable, once again, to believe that there are superiors and inferiors in the human race. But those of superior genetic stock are now those who have reached a stage of enlightenment where they paternalistically and condescendingly believe in equal human rights for all; lesser, un-enlightened mortals are invited to develop and evolve to higher states of being, which are essentially ideological and cerebral. Abortion is framed as a "woman's right to choose" or a "woman's right over her own body": the propaganda presents abortion as the act of a truly enlightened superior being. Inferior humans are to be persuaded that they need not, in fact, remain inferior, but that they too can become truly enlightened. Naturalistic Darwinism morphed into social Darwinism, which has once again morphed into ideological Darwinism. But throughout, the Darwinistic world-view, inconsistency and hypocrisy notwithstanding, remains firmly predominant.

But we digress. Returning to Britain in the Inter-War years: the idea of inferior breeding and poor genetic stock fitted seamlessly and neatly into the pessimism of the twenties and thirties. The warning of a potential biological crisis was credible, given the general pessimism. Moreover, it buttressed the prevailing prejudices by appearing to give them a rational, scientific foundation. As soon as appeals to "science" could be made, the credibility and believability of the pessimistic outlook went up by several degrees. And it went both ways: the prevailing pessimistic outlook in turn made the claims of the eugenics movement appear well reasoned and well founded. For example, in the 1930's Leonard Darwin, fourth son of Charles, warned that
without biological correction Western civilization was destined to suffer the same slow decay that had been the lot 'of every great civilization.' . . . the problem lay in the inherited quality of the race, which, Darwin argued, had a natural tendency to decline as long as the 'less efficient strata' reproduced faster than the biologically efficient. (Overy, p. 101)
Eugenics was the new advance in biological science which would prevent the inevitable decline occurring.

Eugenics became widely popular in academic and intellectual circles in the Inter-War period. The inevitable tendency was for “science” to overstate the influence of Nature (as contrasted with Nurture) so that it became seriously entertained that almost all social problems could be put down to defective Nature: alcoholism, syphilis, feeble-mindedness, crime, prostitution, delinquency were all understood to be due to an inherited predisposition and reflective of sub-standard genetics. (About the only vestige of this view which has survived to linger on in the modern world is the idea that homosexuality is a predisposed genetic condition. But, of course, this has also been parsed through the filters of ideological Darwinism, so that homosexuality, although a genetic defect, has been declared to be a human right.)

As eugenicists debated appropriate policies which would apply their science to the betterment and purification of the race, the inevitable question became where to draw the line. Unsurprisingly, it was suggested that about half the population was below average! But within this deficient half, there was believed to be a smaller sub-set which, if allowed to continue unchecked, threatened the entire race with genetic degeneration. And, it was observed repeatedly, the least genetically worthy were always breeding faster, having larger and larger families than, well, superior folk.

In the 1930's, eugenicists left no doubt about what needed to be done to preserve the race.
The language routinely used to describe biological intervention was uncompromising--'elimination' of the unfit, 'festering sores' to be cut out, a 'diseased constitution' to be medically repaired. . . . Cleansing the race left few options that did not involve severe levels of medical or social intervention. [It reduced to] two possibilities. The first was the 'lethal chamber', the second sterilization. (Overy, p. 115)
The lethal chamber was rejected as lacking sufficient public support. But compulsory sterilization was another matter.
There were few, if any, eugenicists who did not accept that sterilization, whether compulsory or voluntary, was the one remaining panacea capable of addressing the seriousness of their case for racial decline, and they worked throughout the inter-war years to persuade the government to set in place firm procedures for a national programme of sterilization targeted at the biologically and socially undesirable. (Overy, p. 117)
Now, it is important to remember that this was not some fringe maniacal group; eugenicists were mainstream, leading figures, intellectuals, and opinion leaders.
The ranks of self-confessed eugenists were swollen in the 1920's with a panoply of distinguished public figures in every field: the economist J.M. Keynes, who helped set up the Cambridge Eugenics Society before 1914 and remained a life-long supporter; the sexologist Havelock Ellis, who wrote pioneering books on sex before 1914; the zoologist Julian Huxley, grandson of Darwin's chief disciple Thomas Huxley and an early science celebrity; the psychologist Cyril Burt, pioneer of intelligence testing of schoolchildren and, as a result, a convinced hereditarian; the Irish playwright George Bernard Shaw, whose Man and Superman played on eugenics themes; William Inge, Dean of St Paul's, almost certainly the best-known churchman of his generation, who wanted an ideal British population of only 20 million, all with 'certificates of bodily and mental fitness'; and so on. Eugenic concern in the inter-war years was no longer the province of people the public might have regarded as enthusiastic cranks. (Overy, p. 106)
So, civilization was declining. Eugenicists argued it was substantially due to inferior, defective classes breeding far too promiscuously. The bad were multiplying; the superior were being overrun. Society would inevitably decline. Their analysis had the imprimatur of "science", and therefore it was seen as objective, credible, certain, and infallible. And, as we have seen recently, never get between a scientist and the limelight--unless one's life insurance is up to date. Overy's concluding paragraphs are reminiscent of our own recent experiences, this time to do with climate:
A great many biologists wanted to believe that they could explain crisis in convincing ways, and their science seemed self-evidently appropriate to the morbid contemplation of decay and regeneration. Biological explanations had about them the unmistakable stamp of progress, rooted as they were in programmes of scientific research and statistical assessment that were demonstrably at the cutting edge of their subjects. Identifying crisis and cure gave scientists a sense of social purpose and a high public profile even if it meant presenting complex and uncertain elements of their science in vulgar form in order to be understood--and we may add, in order to make elements of the science appear more definite and certain. (Overy, p. 134)
We have seen how academic experts framed beliefs and expectations of impending long-term decline in Great Britain during the Inter-War years. Their consensus view was taken up and reflected back in the media and in most sections of society during that period. We have argued that this polarity of unbridled optimism followed by deep and abiding pessimism is an inevitable pathology of the religion of secular humanism as it becomes ascendant in a culture.

Initially the notion that man is the centre of the universe and lord of all he surveys proves wonderfully liberating and uplifting. However, man is far too puny to sustain the weight of deity. Soon his failures and phobias and imperfections intrude, to the dashing of hopes and the defenestration of optimism. A deep gloom settles over the culture. The bi-polar mood swings are manifest most clearly within the Academy, which had earlier championed the secular idolatry of rationalistic humanism.

This bi-polar pattern of extreme optimism followed by much longer periods of apocalyptic alarmism and general despair continues unabated in our day. So does the eugenics movement. It has now morphed into the strident promotion and militant practice of abortion to preserve the superiority of "enlightened" people--that is, those who have evolved to the point where they understand that an individual's rights and personal convenience trump anyone sufficiently powerless to defend or protect or assert or speak up for themselves. Death to them. Thus, the fittest survive. We have to do this to save ourselves, to become authentic, higher, self-actualised beings.