Tradition is not the worship of ashes, but the preservation of fire.—Gustav Mahler
Tuesday, January 31, 2006
Sauce Alito
You would be doing yourself a terrible injustice if you did not avail yourself of this preview of my article in The American Spectator exulting over the Alito victory.
No teaser quote: go and enjoy.
More on Alito
I'd like to add that Alito would quite possibly not have been confirmed were this not an election year. The Democrats have hopes of re-capturing part or all of the Congress because of perceived weakness in the performance of the Bush presidency. They know that Americans are essentially conservative on the question of judges and do not want to chance making the court a midterm election issue.
Thus, we have the compromise. Hold a dog and pony filibuster show that allows the senators from safely liberal states and those who wish to audition for the primaries in 2008 to prove their love to the Dean/Moore/Move-On faction. Then, hold the real vote, let a few vulnerable Dems vote for Alito while the rest of the caucus votes no, and go back to "Bush lied, people died."
There's the script. Get ready.
The Steady Balkanization of America
As I understand it, amid the Alito struggle, budget wars, the creeping advance of Big Brother, and all the rest, there proceeds in Congress an effort supported by El Presidente W to enshrine a "Jewish Heritage Month" across the land. Now, I take a backseat to no one in announcing from the rooftops the supreme virtues of Torah, good deeds, charity, and chopped liver, and not necessarily in that order. But this is a terrible idea. There are insufficient days, or for that matter, hours, in a year for "heritage" genuflection to all of the myriad ethnic groups, skin tones, and hair colors that the traditional melting pot is supposed to comprise. And why not dog breeds? The plain reality is that the growing focus on group identity has yielded the incredible destructiveness of group identity politics, with all of the political correctitude, speech codes, litigation, mutual distrust, and general dishonesty attendant upon it. And this is no accident. That the various Jewish groups will support this is obvious, as are the long-term adverse effects for all minorities, the Jews foremost among them. This Balkanization of America is supremely dangerous, and Jewish support for it is appalling.
Score One for Baker's Political Prognostication Powers
I said Alito would be confirmed by a comfortable margin with no filibuster.
Correct.
I also said it is the next vacant spot on the court, not O'Connor's seat, that will provoke the battle royale. With O'Connor's retirement and replacement you get four strong conservative votes, not five. Our side lost one when White was replaced by Ginsburg. A pretty serious swing, but the GOP wasn't complaining, now were they?
The next seat will make the Bork battle look like a party, provided a Republican is doing the nominating. If not, the GOP will sit politely by while the Democrats appoint pretty much whoever they wish, AS USUAL.
The Great Zucchini
This is an amazing article about a guy who does children's parties in the DC area. Read it. Really. (HT to the great Mr. Lileks)
One of the odd things about having young children is watching people exert themselves in all sorts of ridiculous ways for their children's birthday parties. Why people are willing to spend hundreds of dollars on a party for three-year-olds is beyond me.
Monday, January 30, 2006
Slamming the Door on Another Counterargument
One of our commenters has repeatedly charged that the followers of Jesus chose to keep soldiering on after the crucifixion because of the desire for some material gain. In other words, they somehow cynically endured persecution in hopes of getting the big score, and I don't mean heaven.
I decided to end the dime-store atheist crap by looking a little deeper. Reading Gary Habermas, who is intensively engaged in this issue and famously debated the former atheist (now plain theist) Antony Flew, I found the following statement, which would seem to end this particular line of hazing:
It is the substantially unanimous verdict of contemporary critical scholars that Jesus' early disciples at least thought that they had seen the risen Jesus.
Since we now have the opinion of people who actually study the matter, rather than that of those who line their parrot cages with the latest issue of Skeptic magazine, we can put the "cynical religious charlatans" argument to rest.
Christianity, Saul, Paul, and Mithraism
A comment on Hunter Baker's post "Justice v. The Resurrection" has proposed a counterargument based on two easily refuted propositions. The first, a hypothesized scenario in which Paul conspires with a variety of people to fake the resurrection, is absurd because Saul was in fact one of the most prominent opponents and persecutors of Christianity after the Crucifixion. It was not until Saul met the resurrected Jesus while on the road to Damascus that Saul, now known as Paul, converted to Christianity. No one at the time ever claimed that Paul supported Christ until well after the latter's death. Hence he could not have been any part of a conspiracy to deify Jesus Christ.
The second proposition, that the story of Christ was based on Mithraism, is equally wrong. The Catholic Encyclopedia makes the following points, which are easily confirmed by even the most superficial research into Mithraism:
"A similarity between Mithra and Christ struck even early observers, such as Justin, Tertullian, and other Fathers, and in recent times has been urged to prove that Christianity is but an adaptation of Mithraism, or at most the outcome of the same religious ideas and aspirations (e.g. Robertson, "Pagan Christs", 1903). Against this erroneous and unscientific procedure, which is not endorsed by the greatest living authority on Mithraism, the following considerations must be brought forward. (1) Our knowledge regarding Mithraism is very imperfect; some 600 brief inscriptions, mostly dedicatory, some 300 often fragmentary, exiguous, almost identical monuments, a few casual references in the Fathers or Acts of the Martyrs, and a brief polemic against Mithraism which the Armenian Eznig about 450 probably copied from Theodore of Mopsuestia (d. 428) who lived when Mithraism was almost a thing of the past -- these are our only sources, unless we include the Avesta in which Mithra is indeed mentioned, but which cannot be an authority for Roman Mithraism with which Christianity is compared. Our knowledge is mostly ingenious guess-work; of the real inner working of Mithraism and the sense in which it was understood by those who professed it at the advent of Christianity, we know nothing. (2) Some apparent similarities exist; but in a number of details it is quite probable that Mithraism was the borrower from Christianity. Tertullian about 200 could say: "hesterni sumus et omnia vestra implevimus" ("we are but of yesterday, yet your whole world is full of us"). It is not unnatural to suppose that a religion which filled the whole world, should have been copied at least in some details by another religion which was quite popular during the third century. Moreover the resemblances pointed out are superficial and external. Similarity in words and names is nothing; it is the sense that matters. 
During these centuries Christianity was coining its own technical terms, and naturally took names, terms, and expressions current in that day; and so did Mithraism. But under identical terms each system thought its own thoughts. Mithra is called a mediator; and so is Christ; but Mithra originally only in a cosmogonic or astronomical sense; Christ, being God and man, is by nature the Mediator between God and man. And so in similar instances. Mithraism had a Eucharist, but the idea of a sacred banquet is as old as the human race and existed at all ages and amongst all peoples. Mithra saved the world by sacrificing a bull; Christ by sacrificing Himself. It is hardly possible to conceive a more radical difference than that between Mithra taurochtonos and Christ crucified. Christ was born of a Virgin; there is nothing to prove that the same was believed of Mithra born from the rock. Christ was born in a cave; and Mithraists worshipped in a cave, but Mithra was born under a tree near a river. Much has been made of the presence of adoring shepherds; but their existence on sculptures has not been proven, and considering that man had not yet appeared, it is an anachronism to suppose their presence. (3) Christ was an historical personage, recently born in a well known town of Judea, and crucified under a Roman governor, whose name figured in the ordinary official lists. Mithra was an abstraction, a personification not even of the sun but of the diffused daylight; his incarnation, if such it may be called, was supposed to have happened before the creation of the human race, before all history. The small Mithraic congregations were like masonic lodges for a few and for men only and even those mostly of one class, the military; a religion that excludes the half of the human race bears no comparison to the religion of Christ. 
Mithraism was all comprehensive and tolerant of every other cult, the Pater Patrum himself was an adept in a number of other religions; Christianity was essential[ly] exclusive, condemning every other religion in the world, alone and unique in its majesty."
The New York Times misplaced $6.7 Trillion
“Corporate Wealth Share Rises for Top-Income Americans” is the January 29 headline of yet another uninformed New York Times piece by David Cay Johnston:
“New government data indicate that the concentration of corporate wealth among the highest-income Americans grew significantly in 2003, as a trend that began in 1991 accelerated in the first year that President Bush and Congress cut taxes on capital. In 2003 the top 1 percent of households owned 57.5 percent of corporate wealth, up from 53.4 percent the year before, according to a Congressional Budget Office analysis of the latest income tax data. The top group's share of corporate wealth has grown by half since 1991, when it was 38.7 percent. The analysis did not measure wealth directly. It looked at taxes on capital gains, dividends, interest and rents. Income from securities owned by retirement plans and endowments was excluded . . .”
All that matters in that egalitarian gibberish is that capital gains, dividends, and interest earned inside tax-deferred savings accounts have been simply “excluded.” The story is only about taxable investments, which are mainly held by those with high incomes because tax-deferred saving plans have income limits and contribution limits that greatly restrict their use among the affluent.
The CBO fabricates income distribution data from individual income tax returns. Yet the bulk of most people’s capital gains, dividends, and interest income has become increasingly invisible on tax returns – stashed away inside IRA, Keogh, and 401(k) plans, and 529 college plans. Only the income from taxable investments shows up in tax-based income studies by the CBO, or by Thomas Piketty and Emmanuel Saez.
If you exclude nearly all the assets of middle America from the count, then those at the top must indeed appear to have a big share of whatever assets are still left for tax collectors to tax.
We aren’t talking about small change. The excluded assets of IRA, Keogh, and 401(k) plans grew from $1.9 trillion in 1990 to $6.2 trillion by 2004, when both figures are measured in constant 2000 dollars.
The 2004 figure was $6.7 trillion when measured in 2004 dollars. With a middling 7 percent return, that $6.7 trillion would generate $469 billion of capital gains, dividends, and interest income in the first year alone – income that does not appear in the CBO’s tax-based studies of who earns what or in Mr. Johnston’s derived estimate of who owns what. A half trillion here, a few trillion there, and pretty soon it adds up to real money.
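That back-of-the-envelope figure is easy to verify; here is a one-line sketch using only the post's own numbers (the 7 percent return is, as stated, an assumed middling rate, not data):

```python
# The post's own figures: $6.7 trillion of excluded IRA/Keogh/401(k)
# assets (2004 dollars), at an assumed 7 percent annual return.
excluded_assets = 6.7e12   # dollars, 2004
assumed_return = 0.07      # hypothetical "middling" rate of return

first_year_income = excluded_assets * assumed_return
print(f"${first_year_income / 1e9:.0f} billion")  # prints "$469 billion"
```

Every dollar of that invisible income inflates the measured share of those whose investment income still shows up on tax returns.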
I am trying hard to finish writing a text on income and wealth for Greenwood Press. Not a moment too soon, apparently.
Justice v. The Resurrection
Michael Simpson posted about the relevance of religion to the academy and I commented that religion is indeed relevant because I have more evidence for the resurrection of Christ than I do for the existence of justice.
After that intentionally provocative comment, I received an email from one Tom Van Dyke encouraging me to be a bit more forthcoming. I was originally hesitant to do so because I haven't read the latest and the greatest on the subject of the resurrection, which is the treatment of the subject by N.T. Wright. Wright's work is at least partially responsible for the conversion of the famed horror writer Anne Rice. However, I remembered that William Lane Craig is very strong on the subject and I could probably get a condensed essay from him. I was right.
Here's a bit of whetting:
So complete has been the turn-about during the second half of this century concerning the resurrection of Jesus that it is no exaggeration to speak of a reversal of scholarship on this issue, such that those who deny the historicity of Jesus' resurrection now seem to be the ones on the defensive. Perhaps one of the most significant theological developments in this connection is the theological system of Wolfhart Pannenberg, who bases his entire Christology on the historical evidence for Jesus' ministry and especially the resurrection. This is a development undreamed of in German theology prior to 1950. Equally startling is the declaration of one of the world's leading Jewish theologians Pinchas Lapid, that he is convinced on the basis of the evidence that Jesus of Nazareth rose from the dead. Lapid twits New Testament critics like Bultmann and Marxsen for their unjustified skepticism and concludes that he believes on the basis of the evidence that the God of Israel raised Jesus from the dead.
I read through the essay and found it quite thorough and informative. If this blog were my sole property, I would paste the whole thing in and monopolize the real estate. Instead I will content myself with providing you with this very large LINK. (Don't get down on Craig for any typos in the essay; I think some noble person actually typed in the essay from dead tree to get it online.)
Read the essay and see whether I was exaggerating when I made my provocative statement. It's easy to be correct because the evidence for the existence of justice is weaker than expected, while the evidence for the resurrection is stronger.
Because we are an interfaith blog, I hasten to explain to my Jewish friends that I am not seeking to kick up some kind of battle over Christian history between Jews and Christians. Rather, I am trying to further the point that religion is relevant and not merely because of some psychological reason.
Cures for the Ailing U.S. Health Care System
Columbia University economics professor Glenn Hubbard has contributed an excellent analysis of the nation's ailing health-care system in today's issue of National Review Online. Hubbard makes the important point that the central problem with the U.S. health care system is the lack of consumer control over costs and treatment, which is caused by government impediments to private markets:
We found that many of the problems in our health-care system stem not from what happens in the doctor's office or hospital, but what happens in our tax code. If, on the one hand, an employer pays for an employee's health coverage, it is a tax-free cost for both the company and the individual, therefore allowing for generous health-care coverage in large companies — especially those with union-negotiated contracts. If, on the other hand an individual must pay for health-care costs out of pocket and these costs cannot be written-off, the medical expenses are more keenly felt and are, at times, hard to afford. This difference often results in the person avoiding to seek medical care until it is absolutely necessary — if at all.
Hubbard points out that the government's bias toward third-party payment systems is the crux of the problem:
Many policymakers are starting to see the problem. Last fall, the bipartisan President's Advisory Panel on Federal Tax Reform suggested capping the tax deductibility of health-insurance premiums so that employers could extend only so much coverage to their workers. And, if we could do so, removing all tax subsidies for health care would be the best answer. That outcome is most unlikely, and the key is to stop the tax bias against low-cost individually purchased health insurance. In our book, we propose making all health-care spending deductible. The difference in those policy suggestions is significant, but the effects would be similar. For once, all Americans would begin to manage the cost of their health care directly, instead of letting others worry about it.
A further problem, Hubbard notes, is the plethora of different state mandates regarding insurance, and the fact that insurance companies cannot compete across state lines. This lack of competition increases prices further, lowers access to insurance, suppresses productive investment, decreases the overall quality of services, and holds back innovation in service delivery:
A greater focus on consumer-driven health care requires further policy improvements: open and national health insurance markets, so that consumers have more choices in the kind of hospitals, doctors, and insurers they use; greater investments into health-information technology to identify and prevent errors before they occur; and reforms of medical-liability rules, so that good doctors and nurses can practice quality health care without being harassed by nuisance lawsuits.
Again, some conservative critics mistakenly think that federalization of our health-care insurance and regulatory markets would inherently be bad for health care. If one's default position is to fight national markets governed by national standards at all turns, I suppose there's no sense arguing the point. But let there be no doubt that national markets would work. Consider the state-by-state standards for gasoline admixtures — a mish-mash of formulas meant to satisfy environmentalists in California and the northeast, to the detriment of national gasoline supplies and refinery capacity. Think that's bad? Look at state-by-state health-insurance regulations. They are far more complex, and in effect create high insurance costs for captive consumers and benefits for some large insurers who alone can either lobby themselves out of trouble or maintain the product lines that each state requires. A few decades ago, banking was run this way, a situation remedied by national banking reform. Instead of the gargantuan national influence on banking that some feared, we have a true national market for a vital financial service — more choices, more products, and more usage. There are other ways to accomplish insurance-market reform, too, but the key is to promote the availability of low-cost insurance to individuals currently subject to costly state mandates.
This is just a sampling of what the article presents, and Hubbard's other writings on the subject provide further support for his approach. Enthusiastically recommended.
We found that many of the problems in our health-care system stem not from what happens in the doctor's office or hospital, but what happens in our tax code. If, on the one hand, an employer pays for an employee's health coverage, it is a tax-free cost for both the company and the individual, therefore allowing for generous health-care coverage in large companies — especially those with union-negotiated contracts. If, on the other hand an individual must pay for health-care costs out of pocket and these costs cannot be written-off, the medical expenses are more keenly felt and are, at times, hard to afford. This difference often results in the person avoiding to seek medical care until it is absolutely necessary — if at all.
Hubbard points out that the government's bias toward third-party payment systems is the crux of the problem:
Many policymakers are starting to see the problem. Last fall, the bipartisan President's Advisory Panel on Federal Tax Reform suggested capping the tax deductibility of health-insurance premiums so that employers could extend only so much coverage to their workers. And, if we could do so, removing all tax subsidies for health care would be the best answer. That outcome is most unlikely, and the key is to stop the tax bias against low-cost individually purchased health insurance. In our book, we propose making all health-care spending deductible. The difference in those policy suggestions is significant, but the effects would be similar. For once, all Americans would begin to manage the cost of their health care directly, instead of letting others worry about it.
A further problem, Hubbard notes, is the plethora of different state mandates regarding insurance, and the fact that insurance companies cannot compete across state lines. This lack of competition increases prices further, lowers access to insurance, suppresses productive investment, decreases the overall quality of services, and holds back innovation in service delivery:
A greater focus on consumer-driven health care requires further policy improvements: open and national health insurance markets, so that consumers have more choices in the kind of hospitals, doctors, and insurers they use; greater investments into health-information technology to identify and prevent errors before they occur; and reforms of medical-liability rules, so that good doctors and nurses can practice quality health care without being harassed by nuisance lawsuits.
Saturday, January 28, 2006
Faith on the Quad
This is interesting: some academic group is calling for universities to engage religion more, both as a part of the curriculum and as part of students' lives. Who knows what this will mean in practice, but as an emphasis, it seems like a positive move. On one particular note, I think it's especially promising. In my teaching, I deal a lot with religion and the thorny moral/political issues it often touches on. What I've found is that students are very reluctant to engage on those issues, largely because they think that you just can't argue about religion - and, by extension, the sorts of moral issues it touches. I think they're making a mistake in equating the truth that we won't be resolving our moral and religious differences anytime soon with the (erroneous) claim that there's nothing to be discussed. But that's how they think. You might suppose that this sort of "method of avoidance" is productive of social comity - but that, too, would be a mistake. Since religion still, perhaps inevitably, shows up in discussions, and since people have no experience in discussing religion-related things, they have no idea how to do it reasonably and with some civility. I'm not sure that most universities will do that well in fostering civil dialogue, but it seems worth a shot.
Friday, January 27, 2006
2008: The Future Comes Not Too Late, But All Too Soon
Not being snarky---between Howard Dean, Kos, and then Ted Kennedy and Chuck Schumer clumsily trying to make respected jurist Sam Alito out as a racist, it's genuinely confusing who the Democratic Party is these days. I dunno if they know, either.
Our friend, gadfly, SalonPremiumMember (and featured letter-writer) James Elliott posits:
There are a lot of Democrats out there who don't mirror "the Loud Left." Hillary Clinton. Russ Feingold. John Kerry.
OK, baby. Lock and load.
When all else fails, try principle. Actually, that's just what the GOP was forced into after Nixon and all those years of Democrat control of Congress. Petty politics, technique, and mealy-mouthing only get you so far.
The GOP made its historic gains on the backs of two visionaries---Ronald Reagan and Newt Gingrich. (Their successors admittedly, and almost by definition, pale in comparison.)
Shooting spitballs is not a political philosophy. Nominate Russ Feingold instead of guys like the last two weasels, and let America vote up or down. Run on your beliefs instead of from them. In three years, nobody's going to remember who the hell Jack Abramoff is, or was.
Russ Feingold represents the Democratic Party as I best understand it (ADA lifetime average rating of 96, if we can believe the Wiki).
I disagree with Feingold on virtually everything, but I still trust his character. He conducts himself like a human being, like a statesman. We could do worse, and almost did with Gore and Kerry, who are wack. (Gore, wack. Kerry, trying to lead an unprecedented constitutional revolution via phone from Switzerland, wack.)
(Let's save Hillary for another day. Too much fun to use up here. Hehe. [sound of knife being sharpened])
Feingold vs. Gingrich in 2008. Now that would be fun. No middle ground there...
Freedom for the Indians
John J. Miller has written a very interesting piece in today's Opinion Journal, arguing that the way to eliminate poverty on Indian reservations is to eliminate Indian reservations—by returning to the Indians the same private property rights the rest of us hold, which were taken away entirely in 1934. Miller argues:
The main problem with Indian reservations isn't, as some argue, that they were established on worthless tracts of grassland. Consider the case of Buffalo County, S.D., which Census data reveal to be America's poorest county. Some 2,000 people live there. More than 30% of the homes are headed by women without husbands. The median household income is less than $13,000. The unemployment rate is sky high.
Just to the east of Buffalo County lies Jerauld County, which is similar in size and population. Yet only 6% of its homes are headed by women without husbands, the median household income is more than $30,000, and the unemployment rate hovers around 3%. The fundamental difference between these two counties is that the Crow Creek Indian Reservation occupies much of Buffalo County. The place is a pocket of poverty in a land of plenty.
Maybe we should give land back to the rez-dwellers, so that they may own private property the way other Americans do. Currently, the inability to put up land as collateral for personal mortgages and loans is a major obstacle to economic development. This problem is complicated by the fact that not all reservations have adopted uniform commercial codes or created court systems that are independent branches of tribal government--the sorts of devices and institutions that give confidence to investors who might have the means to fund the small businesses that are the engines of rural economies.
The economic argument is certainly well-founded, but what about the cultural-protection idea, the notion that Native American cultures have to be protected? Miller answers this in three ways. One is by pointing out that the great majority of Indians do not see cultural protection as dispositive:
Intermarriage between Indians and non-Indians is pervasive, especially off the rez. More than half of all Indians already marry outside their race, according to the Census. For racial purists who believe that the men and women of today's tribes should be preserved like frozen displays in natural-history museums, this is a tragedy akin to ethnic cleansing (albeit one based on love rather than hate).
The next point, and a highly compelling one, is that the cultural-protection argument is not truly sympathetic toward the real people who must live out that sustained culture:
Yet the real tragedy is that reservations, as collectivist enclaves within a capitalist society, have beaten down their inhabitants with brute force rather than lifting them up with opportunity. As their economies have withered, other social pathologies have taken root: Indians are distressingly prone to crime, alcoholism and suicide. Families have suffered enormously. About 60% of Indian children are born out of wedlock. Although accurate statistics are hard to come by because so many arrangements are informal, Indian kids are perhaps five times as likely as white ones to live in some form of foster care. Their schools are depressingly bad.
Finally, Miller points out that the image of Native American societies which is being upheld by the reservation system is in fact an incomplete and distorted one, for Indians were as commercially inclined as anyone else before the U.S. government forced them to become a separate, isolated enclave within the continent they had come to share with a multitude of people of other ethnic backgrounds:
What's more, this modern-day entrepreneurship [the proliferation of Indian casinos] is part of a long tradition: Meriwether Lewis (of Lewis & Clark fame) described the Chinooks as "great hagglers in trade." I once visited Poverty Point, a 3,000-year-old set of earthen mounds in Louisiana; the museum there displayed ancient artifacts found at the site, including copper from the Great Lakes and obsidian from the Rockies. These prehistoric Americans were budding globalizers, and there's no reason why their descendants should remain walled off from the world economy.
As a classical liberal, I always seek policies that afford the greatest liberty with the greatest amount of social order. America's Indians today live under an oppressive order that ruins lives. Like Miller, I have long held the position that they should be freed to make their way like the rest of us. Given their proud history and fine background, I have no doubt that they would thrive if given true freedom. Allowing them to buy and sell their property would be a suitable first step. Once that was in place, other efforts to ameliorate social pathologies among the Indians would begin to have a chance of working.
It is high time that it were done.
Small Frey
And yet another from the man who just won't shaddap....
This time I offer my take on the Oprah turnaround (well, it is Harpo Productions) on the James Frey "autobio" and the J.T. Leroy story as exposed by New York.
Here is a small slice:
Remember the first axiom of journalism. "Dog Bites Man" is not a headline; "Man Bites Dog" is the ideal. What this means is that every time you read an article about the man biting the dog, you should really be cheered by the invisible headline which reads: "99 Percent of Men Don't Bite Dogs." Pessimists have a tendency to extrapolate the wrong message, thinking that men must be biting dogs everywhere and the order of existence has broken down. It is the optimist who is the smart reader, who grasps the true import of the story.
Brassy Tax
Just when you thought that you would have to wait until next week for another of my articles, I slipped in a quickie before the weekend. My newest essay over at The American Spectator ruminates over the moral foundation of the state's right to tax.
Here are a few lines, in case you have a mirror handy:
In brief, the brief for all taxation is the notion that the state provides something that facilitates the transaction. If a person earns income in a certain place, he does so by relying on the protection of his person and his property -- and often the enforcement of the contracts -- afforded by the local governing authority. If he buys a product or a piece of real estate, he can be taxed on the same basis. The state in effect takes a commission.
Thursday, January 26, 2006
A New Wrinkle on Kelo: Bank Just Says No
From today's Washington Times:
BB&T Corp., the second-biggest bank in the Washington area, said yesterday that it will not lend money to developers who plan to build commercial projects on land taken from private citizens through the power of eminent domain.
"The idea that a citizen's property can be taken by the government solely for private use is extremely misguided; in fact, it's just plain wrong," said John Allison, the bank's chairman and chief executive officer.
I'm embarrassed that it never occurred to me to propose this as a pro-market approach to fighting Kelo. But now that the bee's in my bonnet: Hell yeah! I'm going to forward this to the public affairs folks at Citigroup, and will be happy to report back what, if any, response I receive.
Wednesday, January 25, 2006
Dear Beloved Infidels: The Bell Tolls for Thee
Islam's theological/philosophical enemy isn't the Calvary Baptist Church.
It's the brothels of Amsterdam. You see, the Qur'an has a soft spot for "People of the Book" (i.e., the Bible). But the modernist, secular West is the infidel, the absolute enemy.
Tag, you're it. The Prophet Muhammad started out and made his rep by fighting and conquering the pagans of Mecca, not the Christians. Or the Jews, either. You could look it up.
Which is why anti-religionist Christopher Hitchens, if I may use a dirty word, argues that it's secularists who should most be alarmed by the Islamist threat. Remember that many of the 9-11ers were educated in the West, and saw (what they viewed as) its depravity first-hand. They became convinced that such an empty society a) lacked the will to defend itself and b) deserves to fall. They have met the enemy, and they've decided it's you.
Sure, Islamism is also political: Usama bin Laden's 1996 fatwa/recruiting ad was based on the Iraq sanctions that killed innocent women and children, and also on the crusader (read US/UK) military presence in Saudi Arabia, which was there solely to keep an eye on Saddam.
Well, our crusader-in-chief and his neo-con puppeteers took care of both those bones of contention, n'est-ce pas? Sanctions mooted, troops out of the Land of Two Holy Places. A military presence in Iraq with one foot out the door hardly qualifies as a casus belli now, let alone cause for a whole damn worldwide jihad. Islamism has historically been far more patient at such passing indignities. It sees history in terms of eras, not election cycles.
American narcissism tends to place us, for better or ill (and mostly the latter these days), at the center of humanity's universe: surely we are the only big stick behind the West, with a little help from the UK. And surely our mastery of mass media (Hollywood, CNN) makes us appear to be the biggest of dogs in the current age. But the sins of colonialism and of contemporary moral squalor are largely European: the US is a country geographically far far away from Islamism---more an image than a reality to them. We are not (as of yet) morally fallen, and we were only silent partners or minor participants in the divvying up of the Third World in the colonial period of 1850-1950.
Have I mentioned that I don't like Europe? Not for back then, not now. I understand completely why nobody likes "white people." I don't like 'em much myself. Not only the sins of our European fathers but their children's today are visited upon us, their distant cousins in the United States.
To our beloved infidels, who are indistinguishable from the modern philosophical Left who dominate the Old Country: word up. You can wash your hands and stick the blame for the world situation on Queen Isabella, Napoleon, Admiral Nelson, Lord Balfour, Roosevelt, Churchill, de Gaulle, Nixon, Reagan, or a Bush or two.
But the bell tolls for thee, not me. I can pay the dhimmi tax, and they'll leave me alone, as a good-hearted albeit confused person of The Book. But you're toast. Doomed.
So we shall hang together or hang separately, it seems. I will die, if I must, for Cindy Sheehan, or even for Hugh Hefner. Will you die for Pat Robertson? For me? The fate of the West depends on your answer.
Joel Stein: Traitor Or Mere Ignoramus?
So: Many of my friends out there in right-leaning Blogovia are as mad as hell and aren't going to take it anymore. What is it this time? Well, an intellectual midget by the name of Joel Stein---who has a weekly column on the LA Times op-ed page, which speaks volumes about both the paper and Stein himself---argued a couple of days ago that he does not support the troops, and that liberals/leftists who oppose the war but "support the troops" are wusses. So it's back to the "cancel the subscription" mass email, etc. etc.
Forgive me, but what precisely is the problem here? Nancy Pelosi and the other leftist pols who oppose the war, its initial rationale, its conduct, ad infinitum, and who gave not a fig about the suffering of Iraqis under Saddam, but who simultaneously "support the troops" in fact are hypocrites, liars, and, well, wusses, in that they simply cannot bring themselves to take a position that would engender harsh political criticism, however honestly reflective of their actual views. Stein, on the other hand, has told us what he really believes, however disgusting it is. And he admits freely that he knows nothing about the military, about war, about terrorism, about the wounded and dead, about the motivations that induce self-sacrifice and heroism, about actual conditions in Iraq under Baathism and after, and so on. He simply believes that those who volunteer for dishonorable missions ought not be honored. Or something utterly incoherent. He is merely a youngish yuppie, self-satisfied, self-absorbed, full of self-esteem, and entirely earnest in his belief that mainstream journalists are intellectuals. Better yet: There may be a book deal in the offing, and perhaps even a movie. He is perfect for the LA Times op-ed page. (Full disclosure: That page over the years has run about 50 of my op-eds. Oh, shut up.)
Stein is a regular at Time, another fact that speaks volumes. That he has told us what he really believes is admirable. That our honorable troops in the field could not care less about a fleck of dust like Stein is obvious.
Ask an Economist
Robert Samuelson has a very balanced look at the healthcare problem.
You'll just be smarter and better informed for reading it.
Tuesday, January 24, 2006
Torture(d) Logic
Harry Reid has got to be the most disingenuous member of the Senate. Check him out in this AP story:
Democrats are working to get a large opposition vote to make their points against President Bush.
"I think it sends a message to the American people that this guy is not King George, he's President George," said Senate Democratic leader Harry Reid of Nevada.
Bush should have picked a woman, said Reid, who urged the president last year to pick White House counsel Harriet Miers. "They couldn't go for her because she was an independent woman," Reid said of Miers, whose nomination was withdrawn under conservative criticism.
You've got to be kidding me, Dusty Harry. Had Bush stuck with Harriet Miers, who was underqualified and tied to Bush like his ranch kerchief, then we might have been able to sustain the King George charge.
Actually, if he resembles any King George, it is the King George who suffered the revolution of his American subjects, because it was a revolt that brought Alito in. Quietude was the road to Harriet.
The Mozart Model
Dr. Francis Schaeffer wisely looked to the culture and intellectual traditions of Western civilization for answers to the important question he repeatedly asked in his excellent writings: How should we then live? I suspect that quite a few of us look to the arts, and particularly the popular arts, with exactly that in mind, whether consciously or otherwise. A fine essay on Wolfgang Amadeus Mozart by Fred Baumann, in the current Weekly Standard, examines the great composer in just that spirit:
. . . [L]istening to Mozart calls to mind (and in some ways turns you into) a certain kind of person, a more complicated sort than we mostly go in for today. Not a redemptive Wagnerian hero or cynical slacker, not a high-minded virtuoso of compassion and/or righteous indignation, not a "realist" or an "idealist," but someone who both acknowledges, lives in, accepts the viewpoint of, and participates in, all human feelings--even the ugly ones, as we see in the marvelous revenge arias given to the Count, Dr. Bartolo, and Figaro--but who also, in the end, maintains as sovereign the viewpoint of rationality and order. . . .
In invoking, and to some degree creating, such a person, Mozart implicitly makes a kind of moral case, a case for how we should live. It is not "aesthetic" in the sense of replacing the moral with formal beauty; it is much closer to what we find in Shakespeare's Tempest or Measure for Measure; i.e., models of a kind of control of the passions that gives them their due. Yet it is presented aesthetically, not through argument or exhortation. . . .
In the end, the romantic hero and the homo economicus turn out to be not basically different, but two sides of the same forged coin. The Mozartean hero, whom we approach, admire, and even learn to resemble, if only slightly, puts them to shame.
It is a figure that we don't meet much otherwise. On sale for generations now have been simpler models of heroism, at their best the superficially cynical but deeply moral idealist (say, Humphrey Bogart) but, more typically, various chest-pounding moralists and romantics.
For that reason--that we tend to operate, as though instinctively, on romantic and post-romantic antitheses about passion and reason--it is, in fact, harder to hear Mozart well today than it used to be. Insofar as his music transcends our categories, we either consign him to the realm of the pretty-pretty or turn him, as some 20th-century criticism did, into a grotesque quasi-existential Angst-ling. And of course, Nietzsche was right that the language of aristocratic, pre-Romantic taste is no longer available to us.
The article makes one want to listen to some Mozart and contemplate how we should then live. It will transform your understanding of the music and of the preternaturally wise and kindhearted man who made it.
Promoted Back to the Top: The Point of Politics
This post has been attracting a lot of comments, so I thought I'd bring it back up top for convenience's sake. --Hunter B.
Ross Douthat, newly returned from filling in for Andrew Sullivan, points to an essay on the ol' question of why those red-staters are voting red. (follow the links)
Now, I think the question is a bit hackneyed, not least because the fact that some state tends conservative or liberal is a long way from being able to say anything about the effects of social conditions on voting behavior. Having 55% of a state's voters (not citizens, mind you) who vote conservative or liberal and then making snarky comments (a la the NYT's Frank Rich) about how funny it is that those states have higher divorce rates, watch Desperate Housewives, etc. doesn't get you very far.
In any case, it seems to me that the whole question is based on a misunderstanding, namely, that politics is primarily about economics and only then about "cultural" issues. That's just nonsense, mostly dreamed up by people who *want* politics to be all about economics. Politics is, rather, primarily about culture; it is a vehicle for people to decide "who" they are. Economic decisions, the allocation of resources or opportunities, are a part of that "who-ness," but they do not contain it. Economics does, of course, shape culture, but I think it's a mistake to think it's primary.
Monday, January 23, 2006
Supreme Court Spectator
Here's a preview (by a few hours) of my article at The American Spectator, available to the broader public at midnight Eastern Time.
It discusses the Supreme Court's agenda after Alito and suggests taking aim at the Kelo v. City of New London ruling, which is taking takings to a level that most folks can't take.
Here, sample a smidgen:
...the piquant tale of Mr. Logan Darrow Clements. This man with the three cognomina may become more than a nominal cog in the historical battle to set the Supreme Court aright. In his low-key way he has taken aim at Kelo vs. City of New London. That disastrous decision of recent vintage allows municipalities to initiate takings of private property for the public advantage of enhancing the local tax base. This means that if The Donald convinces the city elders that he could build a revenue-generating casino right where your patio used to be, that putative benefit trumps your ownership. Your good deed will not go unpunished.
Mr. Clements has chosen a novel means of protest, one he compares to the Boston Tea Party. He has proposed to the sleepy New Hampshire burg of Weare that its most illustrious citizen, Justice David Souter, be evicted from his home to allow for construction of a hotel, the Lost Liberty Inn. On what grounds would it be built? On Souter's grounds. That is, the grounds of his vote with the majority in Kelo. Clements has already assembled the 25 signatures required to place his petition on the ballot in March: nine out of ten locals approached signed on the dotted line! Perhaps his idea is less dotty than it seemed.
Now I've Gone and Done It
I have resisted the mass compulsion for four years. Friends and colleagues could rave about it, the media could cover it ad nauseam, but no. I would not succumb. It was only out of curiosity that I sampled it last weekend, and now I am hooked, pathetically waiting for the next fix.
I am addicted to 24.
This is bad in a number of ways, mostly connected to the fact that I'm one of the half dozen people left in North America who don't have a TiVo box. I missed four minutes of episode three when my husband called me from the office. (And brother, he won't be doing that again. I nearly ripped his head off.) It took my daughter two commercial breaks to explain what had happened. Oh, and everything the drug czar says about addiction destroying entire families is spot on. My daughters have the Jack Bauer jones just as bad as I do.
So I'm left with one question: if there's a 12-step program for 24, does it only get you halfway clean and sober?
Law Suitable
Top Ten reasons why Judge Alito should not be confirmed:
10) His mob ties clash with the robe.
9) His CAP dues are overdue.
8) He believes that a wife must notify her husband of a sex-change operation.
7) He prefers eating roe caviar to watching Dwyane Wade play basketball: a real Republican.
6) He is named after the prophet Samuel, a clear breach of the separation between church and state. (Now if we could apply that to Ruth Bader Ginsburg and David Souter...)
5) He's from Philadelphia. W.C. Fields would turn over in his grave (or his ghost would return wielding an assegai...).
4) Judge Al Ito? After the way he botched the Simpson trial?
3) He's a lawyer. Yecch.
2) His wife is a crybaby.
1) He takes Ted Kennedy seriously.
Oh, oh, there's a big guy named Vinnie knocking on the door. Be back in a sec, I think...
MBA's as a Force for the Suck?
Occasional commenter ChETHB raised another interesting point in the discussion about the trouble with GM:
I have been thinking more about the decline of the American automobile industry and have come up with one more tidbit to throw out - the rise of MBA's in American industry. This started in the mid-60's and it has been my personal observation that decision-making in technically oriented businesses has suffered as managers/executives with technical degrees have been replaced by executives holding MBA degrees. Could something this simple have started the downfall of GM?
I found this statement provocative. My own corporate experience suggested that the really valuable people are those who know how to do things. Meanwhile, there were a lot of MBA's (and in my case, an MPA) running around not adding a lot of value. If I had been in charge, I would have fired me, a bunch of MBA types, and all of the Andersen "change" consultants.
What thinkest thou, fair readers and fellow contributors? Is the rise of the MBA a good thing?
Sunday, January 22, 2006
Green = $Green
Anybody remember John Brunner's prescience in creating the Puritan Foods corporation in his early-70s sci-fi masterpiece The Sheep Look Up? Pure food for a polluted world.
Now, here in California, we have the Whole Foods supermarket chain. Morally admirable organic chow. Transgendered rutabagas, economic-equity grown coffee, tofu for the masses. Socially conscious consumerism. It helps if you have a few extra pennies to rub together to assuage your guilt at the need to eat and drink in order to survive. (Banana-carrot-cauliflower smoothies, $4.95. Yum.)
Avowed lefty ("I'm not a lefty!") Bill Moyers left the PBS show he created, Now, but it's still in production, and they paid homage to this neo-capitalist capitalist enterprise the other day. They interviewed one of the top brass, and it turns out that Whole Foods has a salary cap ($400K or so)---execs can make only 14 times what a cashier makes. The exec agreed with the Now reporter that such near-egalitarianism sure helps company (cashier) morale. A cashier captured on video was seen smiling. Brightly.
Man, isn't that cool? Pure foods for the body, sterling business ethics for the soul. If it weren't already so overpriced and tasteless, I'd be willing to pay double for their grub.
Oh yeah, in one of those quick disclosure tags at the end, like in every Erectile Dysfunction drug ad, Now did mention that the executive they interviewed snarfed up $1.8 million in stock options last year. Since Whole Foods is a non-union shop, they made a mint during the great supermarket labor strike here in California.
Cashiers get stock options at Whole Foods, too. Perhaps that's why that cashier was smiling, and damned brightly. ¡Viva la Reagan Revolution! Con tofu.
Saturday, January 21, 2006
Comment Promotion: GM Cars
It's the weekend, so I thought we'd continue the Car Talk (apologies NPR). Here are the last two comments on the GM quality thread. ChETHB had several interesting things to say about his GM driving experience and Kathy follows up with some thoroughly delightful prose on her own history with the brand. Beware "the Golden Bitch." And no, that doesn't refer to Kathy. You'll see.
ChETHB said...
Just had to weigh in on this one since I became an auto enthusiast in the mid-50's -- IMHO, the quality of GM vehicles was without equal during that time period.
I worked in a gas station summer of 1959 and had the opportunity to look very closely at many different cars - GM vehicles were the best.
I believe this trend continued until, perhaps, the late 60's, based on my experience. For example, my new 1968 Chevelle SS396 rattled like a bucket of bolts as I took delivery and drove it away from the dealer. Sadly, I traded in a 1963 Impala SS that had zero rattles at 95,000 miles and got in excess of 18 mpg on the highway. With a high performance engine, I still never got below 13 mpg in that 1963 Impala and that was with some impressive hotrodding. The 1968 Chevelle (hindered no doubt by EPA regulations) never exceeded 11 mpg and generally got 6-8 mpg in the city.
I was still firmly in GM's corner (although shaken) until the mid-70's when they began to introduce small cars that were shoddy junk. It was pretty much the same with the other members of the Big 3. After the oil crisis in 1973, Americans were clamoring for nice, smaller, fuel efficient vehicles. Detroit provided cheaply-made, small, junky vehicles - shoddy interiors, very few options, no luxury appointments, and so forth.
The Japanese, on the other hand, after having been soundly beaten down with their initial introductions to the US, went home, did their homework, and came back with small, fuel-efficient cars that had the luxury appointments that Americans wanted and expected. The rest is history. Detroit, and especially GM, continued producing the kinds of cars that Americans didn't want and in addition, allowed their quality to sag lower and lower.
In summary, I think GM could have maintained their superior position had they simply responded to the market. Instead, they continued their view that they knew best what the customer wanted and consequently, their market share has continued to slide as the customer finds what he wants in the Japanese and European vehicles. The unfortunate part is that hundreds of thousands of Americans are directly or indirectly affected by the poor state of GM's business acumen.
I think all the different car names and models are an attempt to stuff one bad apple under the rug and replace it with something new, all the while hoping that the customer doesn't realize that it's the same thing with a different name. I generally agree that the newer American cars have been significantly improved over their predecessors. I drive rental cars occasionally and have noticed that the current American offerings have seen considerable improvement. These are basically new cars so I have no impression about the reliability and maintenance requirements.
I can offer a final comment, however. When I get home from a business trip, it is always refreshing to crawl behind the wheel of my Acura and drive away. Unfortunately, I don't believe that GM currently builds a comparable vehicle.
Kathy Hutchins said...
"I believe this trend continued until, perhaps, the late 60's, based on my experience. For example, my new 1968 Chevelle SS396 rattled like a bucket of bolts as I took delivery and drove it away from the dealer."
This comports with my experience as well. My very first car was a 1967 Chevy Camaro, straight 6 230, purchased in 1974 (for $795.00 cash. Would that such things were possible today, eh?). It had none of the features American consumers would demand today -- no a/c, power steering, power brakes. It did have a radio. I drove it hard, and often stupidly, for six years, put something like 120K miles on it, turned around and sold it to a kid in Texas for $1200.
It was a great car, but unfortunately in 1967 you could no longer count on a GM product's quality from specimen to specimen. My younger sister's first car was also a 1967 Camaro, purchased in 1976, one of the souped up Super Sport models with a 350 V8, power everything. It was a piece of junk from start to finish -- electrics, hydraulics, finish work, seals -- the car was such a constant headache we called it "The Golden Bitch." I spent so much time giving that pile of manure jump starts I should have applied for a tow truck license.
Friday, January 20, 2006
Who Said It?
"Our opponents are our fellow citizens, not our enemies. Honorable people can have honest political differences. And we should strive for civility and intellectual integrity in our debates."
Hillary Clinton
Adolf Hitler
James Elliott
Karl Rove
The Great Narcissist Brando
Fellow Reform Clubber Tom Van Dyke kindly pointed me toward an excellent analysis of the career of Marlon Brando, by Nicholas Stix, at Mens News Daily. Stix correctly points out that (1) Brando was an immensely talented actor who accomplished several great performances and a lot of utterly atrocious ones, (2) the style of acting Brando pioneered was going to happen anyway, and (3) Brando's real stock in trade was not sensitivity but narcissism, and this played out in his personal life as well as in his performances.
Considering the basic impulse behind Brando's characterizations, Stix writes,
All the talk about Brando’s “sensitivity” is so much rot. The sycophantic “experts” who say that he played “sensitive” brutes are confusing emotional neediness with sensitivity. In other words, they can’t tell a narcissist from a saint.
Stix's observation that Brando's stock in trade was narcissism is a key point. Regarding Brando's influence on acting styles, Stix writes,
Since Brando’s death, we have been told that he somehow gave actors “permission” to be emotionally authentic. We have also heard, from Brando-apologist Richard Schickel, that it was the movies that let Brando down, beginning in the 1960s, rather than the other way around. Baloney!
Considering the basic impulse behind Brando's characterizations, Stix writes,
All the talk about Brando’s “sensitivity” is so much rot. The sycophantic “experts” who say that he played “sensitive” brutes are confusing emotional neediness with sensitivity. In other words, they can’t tell a narcissist from a saint.
Stix's observation that Brando's stock in trade was narcissism is a key point. Regarding Brando's influence on acting styles, Stix writes,
Since Brando’s death, we have been told that he somehow gave actors “permission” to be emotionally authentic. We have also heard, from Brando-apologist Richard Schickel, that it was the movies that let Brando down, beginning in the 1960s, rather than the other way around. Baloney!
A more intense acting style was coming into fashion after World War II, before Brando’s arrival on the Hollywood scene. Witness Kirk Douglas’ driven performances as boxer “Midge Kelly” in Champion (1949), as “Det. Jim McLeod” in Detective Story (1951), and as Vincent Van Gogh in Lust for Life (1956). And already in 1946, in It’s a Wonderful Life, note the embittered, emotionally raw quality of so much of Jimmy Stewart’s performance as “George Bailey,” a quality that characterized much of Stewart’s best 1950s’ work with directors Anthony Mann and Alfred Hitchcock. Something was in the air.
I have written elsewhere that what was "in the air" was the rise of anti-authoritarianism and personal narcissism throughout American society, and I think that Brando's ascendance, as Stix observes, was a powerful manifestation of that (and in fact I gave Brando as one of many examples of the post-World War II cultural change that led to what I call the Omniculture).
Stix also correctly accords credit to writer-director Elia Kazan for the rise of this acting style. I think that Brando's innovation in film acting was a mixed blessing at best, but would have been inevitable with the onset of TV anyway and, more importantly, the general rise in narcissism in the society. In Stix's discussion of the films of the 1960s and '70s, for example, change the word "antihero" to "narcissist" and you'll see it fits perfectly and in fact makes more sense (in that, for example, one's personal behavior can seldom be described as anti-heroic, as that is a dramatic/literary term):
Johnny Strabler was one of the early versions of what became the ultimate 1960s Hollywood cliché: The “anti-hero.” During the mid-1950s, in his brief career, James Dean would specialize in this type, in East of Eden, Rebel Without a Cause (the ultimate anti-hero movie title), and Giant, before dying in an automobile accident in 1955. Another then-famous anti-hero role was Paul Newman’s performance as Billy the Kid, in Arthur Penn’s The Left Handed Gun, in 1958. (Though I admire much of Arthur Penn’s work, when I saw the movie on The Late Show about thirty years ago, I found it so dreadful, that I shut it off after a few minutes.)
In the 1960s, the anti-hero became the dominant shtick in Hollywood, as Steve McQueen, Clint Eastwood, Lee Marvin, Newman and Redford, (and a few years later) Charles Bronson, and countless other actors would earn millions of dollars portraying anti-hero crooks and cops alike. (On TV, for a producer to sell a cop series, it had to be about an “unorthodox” cop.)
However, the anti-hero shtick did not help Marlon Brando. Brando’s problem was that, rather than seeing the playing of anti-heroes as a calculated career move, he adopted the anti-hero as his personal shtick. But if you really act like an anti-hero (i.e., a juvenile delinquent) in your personal and professional life, you become a source of grief to all who depend on you.
I would suggest, contrary to what Stix argues, that Brando didn't change at heart after his twenties. Rather, Brando simply did what he could get away with at all times, and after his great run of early-'50s performances he could get away with much more. His career is indeed a tale of great talent often wasted, and it is a cautionary tale about the consequences of narcissism both for others and for oneself.
Thursday, January 19, 2006
Homnick Does Elder
Reform Club contributor, American Spectator regular, and Jewish World Review columnist Jay Homnick was the featured guest today on the nationally syndicated Larry Elder Show. (Elder happens to be my favorite talker and his flagship station is here in Los Angeles, so it was a great kick to hear Brother Jay as I was driving home.)
The topic was Jay's recent AmSpec piece, where he reveals his eyewitness testimony about how Senator Chuck Schumer got his start in the politics biz with his plan to drive blacks out of a section of Brooklyn. After a 30-year silence, Jay says he decided to speak out only after Sen. Schumer's recent attempt to connect Supreme Court nominee Sam Alito with racist sentiments.
If Alito's casual membership in 1972 in a group whose magazine said some untoward things calls into question his fitness for the court, what does that say about Chuck Schumer, the mastermind of a political plot against blacks in 1974, and his fitness for the U.S. Senate?
We may never know. Elder found it remarkable (but par for the course) that so far, there has been zero interest in Jay's testimony (aside from a feeler from a cable opinion show and Elder's people themselves) from the American press. That is perhaps the most interesting angle: when Bob Livingston went after Clinton on adultery, it wasn't his own dirty laundry, but really his hypocrisy that cost him not only the Speakership of the House, but his entire congressional career.
Now, that was fair, I think. Whither Chuck Schumer? Surely in this day and age, active racism is more egregious than merely diddling the help or doing blow. Where is Katie Couric?
Jay was great, of course, and got in a line that broke the host up (and a subsequent caller)---that now that she's so tough on Iran, the other senator from New York, one Hillary Clinton, shall henceforth be known as The Battleaxis of Evil.
Elder States Man
This is a heads-up for Clubbers. I'll be on the Larry Elder Show on KABC in Los Angeles at 8:05 Eastern time.
It streams over the Internet at www.kabc.com
Kids' Stuff
It seems that the most logical and commonsensical movies these days are those directed at children. Increasingly, moreover, kids' films are also among the most insightful into social realities. The Incredibles, for example, comically places litigiousness and a concern for individual responsibility at the center of its story. Sky High observes how the American education system suppresses children's natural creativity and ambition. The two Shrek films are full of satirical jabs at modern society.
It should be little surprise, then, that the new film Hoodwinked, based on the fairy tale of Little Red Riding Hood, actually deals with issues such as intellectual property and piracy. In this cheeky version of the story, Granny has a snack-food empire that is threatened by an unknown intellectual-property thief who has been stealing recipes from businesses all around the forest. Beginning with an incident at Granny's house—where Red is menaced by the Wolf, disguised as Granny, until the lumberjack bursts in and all are carted off to the police station so that the authorities can set things straight—the film moves on to a Rashomon-style investigation in which each of the various characters involved in the central events gives their version of the story.
Comical allusions abound in the subsequent flashbacks that look at the central events and place them in context, as is appropriate for a film dealing with intellectual property theft. We see references to Star Wars movies, Indiana Jones, James Bond, Bruce Lee, Jet Li, Jackie Chan, Mission Impossible, The Matrix (all too inevitably, alas), and much more.
The use of a Rashomon-style narrative form, however, does not induce any doubts about the human search for truth, as it does in Kurosawa's film. The makers of Hoodwinked treat the central story as a puzzle-style mystery, with the investigation being led by a suave detective, a long-legged frog named Nick Flippers, based on Nick Charles of the Thin Man novel and movies. As a result, the effect of the film is exactly the opposite of Rashomon's, for in Hoodwinked everything has a cause and it is indeed possible for humans to know the truth.
Naturally, everything turns out well at the end. The thief is identified and taken into custody, Granny is revealed as a swinging elderly babe, Red is given a chance to throw off the chains of her all-work-and-no-play lifestyle, and the forest's economy is able to get back to normal. On the whole, an interesting and surprisingly mature treatment of the issues.
Would that we could say the same about movies aimed at adults these days. For those who are sick of watching sensitive men moon over distant, emotionally disturbed women, or hikers tortured and killed by strangers in the wilderness, or young adults out on benders and venery hunts, or modern-day cowboys whose love dare not speak its name, or tendentious dramas about the evils of corporate America, or repressed individuals who throw off the shackles of conventionality and learn to follow their impulses—or much of the rest of the wonderfully mature and sophisticated movie fare of our time—today's movies aimed toward children may be just the thing.
Judging by its output, today's Hollywood appears to believe that true maturity, intelligence, and decency are kids' stuff. Apparently its filmmakers have studied their Jean-Jacques Rousseau well but not wisely.
Superb Analysis of GM's Troubles
Over at my much beloved American Spectator, automotive columnist Eric Peters has an excellent analysis of what is troubling GM. He suggests that the company makes too many models in too great a variety, particularly given the company's market share.
I think he's right, at least in part. There are other reasons. I've become a Honda man all the way. So is everybody in my family. We are the type of people who would typically buy American, but the quality issue drove us over to Honda.
I still remember my first car, a 1980 Ford Mustang Ghia (everybody asks what Ghia means -- I don't know, like GT, I guess). That car looked good, had decent power, but just felt kind of loose and lazy in an undefinable way. The best way to describe it is to say that when I got my next car, a 1986 Honda Accord, I could immediately feel miles of difference in the quality, responsiveness, tightness, solidity, etc. of the car. It was just better. I moved on to my grandfather's Caprice Classic (can't recall the year, but still boxy). It drove like a sofa on wheels. Comfy, but didn't feel as good as the Honda.
The conviction settled in my mind, deservedly or not, that the Japanese imports really were better cars.
It is my suspicion that millions of Americans had the same experience in the '80s and early '90s and made the same long-term call.
When in the market for a car a few years ago, I test drove a Ford Ranger. I was shocked by how solid and tight it felt. It felt like quality. It felt like a Japanese import. I didn't buy it because I still didn't trust the car to last like a Honda. Reading Eric Peters' article, I think it is possible that the American cars are much better made today.
The bottom line is that I suspect this general queasy feeling about American cars is just as much to blame for GM's troubles as the excessive diversity of models that Eric Peters identifies.
Wednesday, January 18, 2006
Thread for Comments on Munich Review
I haven't been able to post a comment to S.T. Karnick's Munich review below, so I figure others might be having the same problem. Consider this a thread for posting comments on the film and/or Karnick's review.
For my part, I find myself encouraged by the review. I have been avoiding the film because of exactly the conservative critiques Karnick mentions. I'm glad to hear there is no such obvious agenda at work. When I queried my parents about the film, they likewise disavowed the presence of any moral equivocating between the Israelis and the terrorists in the story.
As usual, the quality of the review is excellent. They don't call him the world's greatest living . . . or perhaps I should say, I don't call him the world's greatest living film critic in the English language™ for nothing.
Bare Bones Reporting
What is the process, one is led to wonder, that editors employ to determine which stories run with pictures and which run without photographic accompaniment?
A mystery, one suspects, better left for the ages. (The underage, perhaps.)
Spielberg's Munich Mistakes
Conservatives have written very critically about Steven Spielberg's Munich, saying that the film essentially posits moral equivalency between the terrorists who arranged the kidnapping and eventual killing of innocent Israeli athletes at the Munich Olympic Games and the agents whom the Israeli government set on the trail to kill the organizers of the atrocity. Spielberg's public statements support the notion that he sees a connection between the events of the film and U.S. involvement in Iraq, and does not approve of the latter.
Upon viewing the film, however, I think that these critics are wrong and that the film is not an allegory for the Iraq War, Spielberg's public statements notwithstanding. Furthermore, I do not believe that Spielberg intended any moral equivalency between the two sides, but instead that he was simply exploring the questions and letting the viewers come to their own conclusions. As a spectator, I had no doubt that the Israelis were right in what they were doing. Others will undoubtedly draw other conclusions, but I think everyone would judge the situation based on the beliefs about justice, retribution, etc., that they held upon entering the theater. I think it entirely absurd to believe that Spielberg's film would change a person's fundamental thoughts on such matters.
However, I do think that Spielberg was wrong to do the film this way—for dramatic and aesthetic reasons. Throughout the film, we repeatedly see the Israeli agents agonizing over the morality of their task, and discussing it in anguished terms. This is silly. Those who took the job must have had some qualms about it, to be sure, but they must also have known that what they were doing was essential. Nothing is accomplished, in dramatic terms, in their discussing it further, especially on the childlike level that the screenwriters handle it in the present case. Once the agents set out on their path, the only real moral drama is in their attempt to get the job done without endangering innocents. Spielberg includes some of that, but it is overwhelmed by the overall moral-rightness question.
This is particularly damaging to the film's effect because Spielberg and his writers, in what can be seen as deference to the overwhelming importance of these issues, fail to create interesting characters. (Papa, played by Michel Lonsdale, is the only character in the film who is capable of surprising us, which is the most important indicator of whether a character is real or just a cardboard cutout.) A filmmaker captivated by issues cannot exercise the artistic freedom necessary to create real characters and real drama. In addition, the central story—the hunt for the terrorists and schemes to kill them—is the sort of heist-film material that absolutely requires quirky characters because the story elements are so predictable.
One can imagine, then, how compelling this film would have been if it had been directed by a more intelligent, sophisticated filmmaker such as Robert Aldrich (The Dirty Dozen) or Howard Hawks (To Have and Have Not, The Big Sleep, Rio Bravo, etc.). Both of these directors were masters of the art of making stock characters into full-bodied, complex, interesting people, as the films mentioned here exemplify. One could easily see, for example, the Israeli bombmaker in Munich as being much more interesting if his relationship with the group leader, Avner, were more like that of John Chance (John Wayne) and Stumpy (Walter Brennan) in Rio Bravo, with the bombmaker complaining, "First you say I didn't use enough explosives, now you say I use too much! Nobody can ever please you. That's it—I'm going back to Israel!" Likewise with the intellectual guy, the easygoing blond chap, and the other stereotypical characters at the center of the film. And that is especially true of Avner, whose home life, pregnant wife, and descent into paranoia do nothing to distinguish him in our minds as a unique individual.
A contemporary example of the approach I am suggesting is in the television program NCIS, in which characters dealing with disastrous situations—such as the potential hijacking of a train full of spent nuclear fuel rods that could be turned into a giant "dirty" bomb—have quirky personalities that affect how they act, and which make us like them and feel even more intensely the desire for them to succeed.
If Spielberg did not want his audience to pull for his characters to succeed, then he should have chosen another subject, because this kind of film—a "characters on a task" story—absolutely requires audience sympathy for the central characters. We need not approve of everything the central characters do, or even approve of the task they have set out to accomplish (as in heist movies, in which the characters are setting out to steal other people's property), but we must at least have some reason to identify with them and sympathize with them. Spielberg gives us very little of that in Munich.
More interesting characterizations would not have made Spielberg's film less serious; they would have made it more compelling for us, as we could more easily see the characters as real people, identify with them, and care about what happens to them. The failure to fulfill his aesthetic obligations, not political ones, is Spielberg's real mistake in Munich.
Tuesday, January 17, 2006
Lost but Well-founded
Somewhat overlooked in the sad tale of the dozen miners was the truly heroic - some would say saintly - character of their dying moments, as reflected in the notes they penned. Their thoughts were only to assuage the fears and pain of their families. The scene was redolent of Balaam's pronouncement: "May my spirit die the death of the righteous..." (Numbers 23:10)
Over at The American Spectator, I composed a brief paean to their lives and deaths.
Herewith the merest foretaste:
"Let not Ambition mock their useful toil,
Their homely joys, and destiny obscure;
Nor Grandeur hear with a disdainful smile
The short and simple annals of the poor."
This was a very solid group of men; we need to mourn them and learn to appreciate more those that remain. They work hard and are not wont to complain. Nor do they come home and spew a gospel of resentment. Instead, they live a friendly small-town existence with strong religious affiliation: no atheists in that foxhole. Look at the beautiful letters that they left their families when they sensed that death was near. No bitterness, no complaint, just love and reassurance to parents, spouses and children. What does it tell you about the character of a person when his primary concern in his dying moments is to mollify his loved ones with the image of him passing painlessly?
Rest in peace.
Big Time Student Athletes
For years the NCAA (National Collegiate Athletic Association) has been making excuses for the appalling graduation rate of Division I athletes. According to the U.S. Department of Education (DOE), only 62 percent of athletes earn a degree. The NCAA recently disputed this figure, slightly. Whose figure is correct? Who cares? Both are awful.
The truth of the matter is that Division I athletes are generally engaged in gut courses and fail to meet even modest academic standards. Weight lifting, basket weaving and “communications” majors are hardly the basis of a liberal education. As B. David Ridpath, assistant professor of sports administration at Mississippi State University, bluntly says, “It’s too easy for colleges to water down their curriculums and let athletes take easy majors.”
Basketball programs had the worst graduation rate of any sport, with just 58 percent of players earning degrees within six years. At some colleges, only a tiny fraction of enrolled basketball players graduate, no matter how puny the academic requirements. Many of these athletes should not be in college at all. Far too many are there only to play basketball. In fact, student-athlete is an oxymoron. College means little more to many than the minor league from which they hope to land a pro offer. Yet only a very few “student-athletes” end up with one.
Graduation rates for Division I football players do not fare much better. Of the 56 Division I-A teams competing in bowl games this year, eleven had graduation rates below 50 percent. The University of Texas, whose football team went to the Rose Bowl and won the national collegiate championship, had a graduation rate of 31 percent according to DOE--40 percent according to the NCAA.
R. Gerald Turner, president of Southern Methodist University and vice chairman of the Knight Commission, admits that "Far too many schools are reaping financial rewards for post-season play, while they're failing to graduate the athletes who have enabled their success on the field." What he's really saying is that administrators tolerate the educational travesty because of the money successful basketball and football programs bring.
There is some hopeful news: Eight of the 17 men's sports had graduation rates of over 80 percent. Lacrosse led the way with 89 percent of its players graduating. But no one would confuse lacrosse with big-time football or March Madness.
The two sports that generate the greatest revenue and alumni zeal, football and basketball, are in a class by themselves. Coaches earning seven-figure salaries are naturally far more interested in whether a kid can hit a three-point shot or run the "40" in 4.3 seconds than in whether he can do calculus. In Tempe, Arizona during the recent Fiesta Bowl, I was amazed at how many Notre Dame and Ohio State alumni traveled long distances to see their teams play. At least 100,000 fans jammed into Sun Devil Stadium. There were parties all over town; the restaurants and bars were filled to capacity. The money and alcohol flowed.
The kids on the field were filled with emotion. But when the curtain comes down on college athletics, how many of them will end up in the pros? How many will be prepared for the next chapter in their lives? How many will have the skills of even the most rudimentary college education?
Alumni fans might think a little about this, the next time they pump their fists for the home team.
Monday, January 16, 2006
Bush Hit List
Well, say it three times fast.
We at The Reform Club occasionally promote worthy riffs from our commenters to the main board, and in this case, I'd like to promote our own Jay D. Homnick's complaints about our current White House Occupant to give 'em their own air:
1) Blew the relationship with Senator Jeffords and cost the Republicans a Senate majority for two years.
2) Left the same stupid wet-foot Cuba policy where refugees are repatriated if they don't make it to shore.
3) Has completely ignored the immigration problem; in fact, he has actually made the border patrols weaker. This is bad government and bad politics, not to mention dangerous.
4) Has continued a completely hypocritical policy of saying that the U.S. must never negotiate with terrorists while insisting that Israel must kowtow to terrorists and accede to their demands.
5) Has shown an almost comical level of disengagement from, if not downright ignorance of, the political situation in South and Central America, which is becoming more dangerous to the United States with each passing day.
6) Has not really made a move (not that Clinton did either) to limit our dependency on oil or to improve the terms under which we acquire it.
7) Has not had the courage to fight environmentalists over their stranglehold on the building of new oil refineries.
8) Has not figured out approaches to getting the middle-of-the-road person in America to see him as a "uniter, not a divider".
I'll add not vetoing anything, like the heinous McCain-Feingold, and after drilling in ANWR was scotched, not figuring out how to make fuel out of caribou. Additions encouraged. (For maximum effect, keep 'em Homnick-short.)
A Very Gory Opportunism
Ex-everything (senator, vice president, sane person) Al Gore seized the occasion of Dr. Martin Luther King Day to excoriate the Bush administration by comparing its wiretapping of terrorist phone calls to the government's spying on MLK's personal life in the '60s.
Mr. Gore forgot to mention it was not a power-mad fascist Republican, but modern lefty saint Bobby Kennedy who authorized it.
Must have been an oversight.
Source Material Addendum:
"At the outset, let me emphasize two very important points. First, the Department of Justice believes, and the case law supports, that the President has inherent authority to conduct warrantless physical searches for foreign intelligence purposes and that the President may, as has been done, delegate this authority to the Attorney General."---Deputy Attorney General Jamie S. Gorelick, July 14, 1994
BEFORE THE PERMANENT SELECT COMMITTEE ON INTELLIGENCE
U.S. HOUSE OF REPRESENTATIVES
CONCERNING WARRANTLESS PHYSICAL SEARCHES CONDUCTED IN THE U. S. FOR FOREIGN INTELLIGENCE
(Mr. Gore must have been out sick that day.)
Pre-History
This Niall Ferguson essay - essentially arguing that a failure to pre-empt Iran's nuclear ambitions will set the stage for a nuclear war in the near future - is both well-done and frighteningly plausible.
But it's worth remembering that there has never been a real war between nuclear powers. The closest we've come is the occasional shelling and raiding between Pakistan and India. (Hmmm... maybe China and the USSR, but I'm not sure China had nukes then, or at least not more than a few.) In any case, here's what seems to me a much more likely scenario:
The US draws down its forces in Iraq, beginning in 2006 and substantially completed by 2008. (Either we will be successful and will be able to draw down, or the continuing instability will be exploited by the Kos wing of the Democratic Party to gain electoral success and force the withdrawal.) If Iran's nukes are not pre-empted (and is there anyone who doesn't think the Iranians are trying to develop nuclear weapons?), then the Iranians will have achieved a strategic standoff with Israel. But I think they're still unlikely to initiate a nuclear exchange with Israel, simply because the Israelis have enough nukes to obliterate Iran (and, most importantly, its leadership). Rather, Iran will use the nukes as a way of making itself invulnerable to American and Israeli pressure and will then seek to establish itself as *the* power in the Middle East. This means, first of all, exporting its Islamism to Iraq and Afghanistan, undermining their relatively pro-American regimes. Second, it means undermining the secular regimes in Syria, Lebanon, Pakistan, and Saudi Arabia and attempting to establish a pan-Islamic confederation that both controls a significant portion of the world's oil supplies and, with Iranian and Pakistani nukes, remains relatively invulnerable to international pressure. (The Europeans can't impose sanctions because they are too dependent on the oil, and the US will be unable to move against the Iranians because the Europeans - and perhaps the Israelis - will not want to risk the obliteration of one of their cities.)
What the nuclear arming of Iran threatens is not a hot war a la WWII, but another Cold War in which a radical ideology backed up by the gun takes over a strategically crucial part of the world. Israel might end up as a new West Berlin, hemmed in by its enemies. Not a happy scenario.
Sunday, January 15, 2006
NFL Playoff Observations
I'll do like Rush and mix in a little NFL commentary with the politics.
(However, I'll avoid making a controversy out of the actual non-controversy that is the black quarterback. For the record, I think Donovan McNabb is outrageously good. On the other hand, I'm quite annoyed with Daunte Culpepper, my first-round fantasy pick who sank me completely this year.)
Here we go:
1. Denver running backs are less and less likely to get big free agent dollars to go elsewhere. Shanahan knows how to make RB's look good. He is better at coaching the run than anyone else in the league. That team simply does not need high draft pick RB talent to succeed.
2. Rex Grossman of the Chicago Bears has the palest skin I've ever seen on any player in the National Football League. He is even more pale than "Whitey" Sven Ivory, the former albino third string safety for those great Vikings teams of the 70's. The man borders on being gray. He may actually have the proverbial ice water in his veins which would explain the pallor.
3. Indianapolis deserved to lose their game. This was not the same squad we've watched dominate virtually without effort.
I think this is a case of a team that needed more adversity on the field and less off the field. There is no way Tony Dungy (clearly an NFL supercoach) could have continued in the same vein of stupendous success after his son's death. When Peyton Manning waved off Dungy's punt squad late in the third quarter, you could see a legend just ready to be born as the QB took over for his beleaguered skipper. Unfortunately for the Colts, the transformation was too late in coming. Had Manning taken the reins a bit earlier, his team might have had a chance.
By the way, for the record, Troy Polamalu DID intercept that Manning pass late in the fourth quarter. It will be a permanent mystery as to how an experienced NFL referee could botch a call so badly. Luckily for the NFL and everyone involved, the Steelers won anyway which left the mistake moot.
4. The Carolina Panthers are absolutely legit. Any team that can score that many points and drive the ball so effectively against an unreal Bears defense is destined for the Super Bowl. I'm going out on a limb to predict the Panthers beat the Seahawks in a close one to go to Detroit.
5. The Steelers are going to beat the Broncos. Both teams play a similar style, but the Steelers are cresting at just the right time. The pieces are all in place. Roethlisberger gets to be Tom Brady this time. The Steelers defense will pick Jake Plummer off and score points in the victory.
6. I've learned to dislike Tom Brady. He always struck me as a winner, but this year the ugly side of the overcompetitive player came out in the QB. He complained too much about being written off at mid-season and spent too much time whining about not getting calls during the Denver game on Saturday. Hopefully, a spell of not being the champion will be good for him and restore Brady to class-act status.
7. Michael Vick is overrated. He is overrated. He is overrated. The man is the most elusive open field runner in the history of the game this side of Barry Sanders, but he is not a good enough passer. As a Falcons fan, I don't want to see him shoved into a pocket passer mold, but it would at least be nice to see Atlanta become a little more hospitable to free agent wide receivers. Right now the Peach City is the place where WR's go to watch their dreams die.
8. My crystal ball is cracked on Brett Favre. I could see him coming back for a couple of great years to quiet the critics, but I fell in love with his gutsy play years ago and am incapable of being objective.
Things that Don't Mix: Horror Flicks and Kiddies
I've been kind of keeping this to myself, but DP of Rock, Paper, Dynamite and Thomas Hibbs of NRO have rekindled the flicker of a particular thought in my brain.
As he discusses the horror film Hostel, currently a low budget hit eclipsing older releases Narnia and King Kong, Hibbs noted a distressing phenomenon:
Yet, the most depressing and horrifying thing about these sorts of films is, alas, not the explicit gore. It is the fact that at nearly every screening of a gruesome horror film I attend (from Massachusetts to Texas), I see parents in the audience with young children. That strikes me as a serious form of child abuse and a more convincing sign of the impending apocalypse than anything depicted on the screen.
I had the same thought a few years back when I went to see Blade 2 with Wesley Snipes. I was shocked to see several small children in the theatre who had been brought by their "parents," who were engaging in their own mysterious version of "parenting." It wasn't quite Kill Bill, but the film had graphic portrayals of bodily mutilation that took tattooing several steps up the cruelty scale, and mass murder with blood hosing everywhere.
I don't need to see a study to know that the children exposed to this kind of film will become insensitive to violence, killing, etc. To use a more biblical expression, I'd say it hardens hearts. My own experience bears this out. As a teenager, my friends and I took advantage of the combination of video rental privileges and driver's licenses to rent every horrible thing we could get our hands on. The more a film pushed the border of tastelessness, violence, and sexual explicitness, the more likely we were to give it a viewing. I particularly recall a film that portrayed the graphic serial rape of a woman caught in the wilderness Deliverance-style by a group of bad men. The first time I saw it I was shocked and shaken. The fourth time I was laughing.
After years of exercising more personal vigilance in my viewing choices, I've managed to recover my sense of shock at the depiction of outrageous behavior onscreen. I can only imagine how warped an individual's sensibilities can become after dulling the edge of the conscience on reels and reels of blood-, sex-, and violence-drenched celluloid (or digital media), particularly when the process begins with non-parenting parents initiating their toddlers into onscreen bloodsport.
This damage to the mind's facility for perceiving moral distinctions is the basic problem with the total liberation of entertainment from social constraints. All the barrier-busting and fun-poking at stuffy taboo protectors leads to an arena with no holds barred. What demons will wrestle in the virtual stadiums of the future? I'm not at all sure we want to know.