When I turn 35 I will have my first mammogram.
In the United States, mammography is recommended for breast cancer screening every one to two years beginning at age 40. The best available evidence suggests that mammography screening among women aged 40 to 74 reduces breast cancer mortality.
But due to a few minor risk factors, three doctors have suggested I undergo a baseline mammogram at 35. I’m not thrilled with the idea of having a technician I’ve never met manipulate my breasts into squishing position, but being felt up and flattened out sounds a lot better than being dead, so I’ll take my chances.
Of women who receive annual screening mammography beginning at age 40, six out of 10,000 over a decade will have their lives saved. Breast cancer will be detected and cured in many more, but regular mammograms will only make a life or death difference for six of every 10,000 women in that group. Mammograms are of extremely high value to those women and their families, but don’t offer much bang for the buck when it comes to the other 9,994 women.
And wringing more bang from every health care buck is reason enough for Canadian and British recommendations that women wait until age 50 to begin receiving screening mammograms. In these countries, where cost-effectiveness studies influence health policy and medical practice, six saved lives aren’t worth the substantial costs associated with all those extra mammograms and the false positives they sometimes produce.
Canadian women are offered routine mammograms every two years, but only from ages 50 to 69, because “evidence is not conclusive” that routine mammograms benefit younger and older women. Doctors have some leeway with regard to high-risk patients.
In the United Kingdom, mammograms are recommended every three years beginning sometime between ages 50 and 53. Based on guidelines developed by the Orwellian-named NICE (National Institute for Clinical Excellence), the National Health Service insists that for women under 40, “mammograms should only be used as part of clinical trials into screening and that they shouldn’t be used under age 30 at all.” According to NICE, “Healthcare professionals should respond to women who present with concerns but should not, in most instances, actively seek to identify women with a family history of breast cancer.”
It is hardly shocking that breast cancer mortality is 9 percent higher in Canada and 88 percent higher in the United Kingdom. Nine of 10 middle-aged American women (89 percent) have had a mammogram, compared to less than three-fourths of Canadians (72 percent). And British and Canadian patients wait for care about twice as long as Americans.
There are indeed valid criticisms of American health care, but one area in which we excel is that we don’t base guidelines for care on cost-utility analysis. That’s why the U.S. ranks first in providing the “right care” for a given condition and has the best survival rate for breast cancer.
Obamacare may force Americans to give up those bragging rights.
The “right care” may soon be defined in part by how much that care costs. Health care reformers acknowledge the impossibility of implementing universal health care without introducing cost containment measures, and Democrats are enamored with a method used by the British called “comparative effectiveness research” (CER).
AARP CEO and CER proponent Bill Novelli describes comparative effectiveness research as “a wonky term that just means giving doctors and patients the ability to compare different kinds of treatments to find out which one works best for which patient.” And at its best, that’s just what CER does. CER is not inherently bad. For example, it can help doctors cut through seductive pharmaceutical advertising to identify older, less commonly prescribed drugs that are just as effective as newer, more expensive ones.
But with CER, the devil is in the details.
CER can lead to one-size-fits-all medicine and encourages a purely analytical approach to care that is not always beneficial to the patient. The mythical average patient overshadows the individual patient, leaving most of us with about as many options as a public school cafeteria at lunchtime.
And in the UK, NICE includes cost as a determining factor in the comparative effectiveness studies that inform clinical guidelines. Determinations about whether citizens will have access to drugs, tests, and procedures are based on cost per quality-adjusted life year (QALY).
The QALY score is a fairly crude metric that takes into account both the number and quality of years a medical intervention is expected to add to a patient’s life. Here’s the upshot of using QALYs to determine cost effectiveness:
On the QALY scale, 0 means you’re dead, 1 means you’re in perfect health, and varying levels of debility fall in between. Imagine two groups of people, one with a QALY of 1 and the other with a score of 0.5. An expensive technology brings a year of life to both groups. But in the second, that technology would be counted as having provided only six months, and thus be twice as expensive. It may be deemed too costly for that patient group.
The older you are, the sicker you are, the more disabled you are, the less cost-effective it is to treat you. And if the cost per QALY of a medical intervention you need exceeds £20,000–30,000 (around $32,000–$48,000), you’re out of luck. Drugs, particularly end-of-life treatments, are routinely rejected for use due to poor cost-effectiveness. And screening tests, like the mammograms American women take for granted, are severely restricted to ensure expenditures remain under the cost per QALY threshold.
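To make the arithmetic concrete, here is a minimal sketch of the cost-per-QALY calculation described above. The treatment cost and quality weights are hypothetical numbers chosen purely for illustration; only the £20,000–30,000 threshold range reflects the commonly cited NICE figures.

```python
def cost_per_qaly(cost, years_gained, quality_weight):
    """Cost divided by quality-adjusted life years gained.

    quality_weight: 1.0 = perfect health, 0.0 = dead.
    """
    qalys = years_gained * quality_weight
    return cost / qalys

# Upper end of the commonly cited NICE threshold range, in pounds.
THRESHOLD_GBP = 30_000

# A hypothetical treatment costing £15,000 that adds one year of life:
healthy = cost_per_qaly(15_000, 1, 1.0)   # patient in perfect health -> 15000.0
impaired = cost_per_qaly(15_000, 1, 0.5)  # quality weight 0.5 -> 30000.0

# The same treatment, the same extra year of life, but the sicker
# patient's year counts as half a QALY, doubling the cost per QALY.
print(healthy, impaired)                          # 15000.0 30000.0
print(healthy <= THRESHOLD_GBP)                   # True
print(impaired <= THRESHOLD_GBP)                  # True, but only at the top of the range
```

As the example shows, the identical intervention can fall on either side of a funding threshold depending solely on the patient group's quality-of-life score.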
Liberal proponents of health care reform accuse conservatives of paranoia and fear mongering about health care rationing. Critics of CER are demonized as extremist spewers of far right talking points who don’t care about improving clinical effectiveness. Surely a uniquely American flavor of a CER board would never become as proscriptive as NICE.
But it seems conservative anxiety (and perhaps a bit of healthy paranoia) is more than warranted by Washington Democrats singing the praises of cost-cutting comparative effectiveness studies. Bear with me while I review some of the health care rationing talk in CER clothing coming from inside the Beltway.
The stage for CER to become a significant component of health care reform was set when President Obama’s stimulus bill passed with a $1.1 billion appropriation for CER. In April, Senate Minority Whip Jon Kyl (R-AZ) introduced a budget amendment to ensure that CER would be used appropriately:
Statement of Purpose:
To protect all patients by prohibiting the use of data obtained from comparative effectiveness research to deny coverage of items or services under Federal health care programs and to ensure that comparative effectiveness research accounts for advancements in genomics and personalized medicine, the unique needs of health disparity populations, and differences in the treatment response and the treatment preferences of patients.
The amendment was defeated 54-44.
Last week, members of the New Democrat Coalition proposed HR 2505, a bill to establish a new government bureaucracy called the Health Care Comparative Effectiveness Research Institute. The Institute would prioritize research based on both clinical and economic factors, including “the effect or potential for an effect on health expenditures associated with a health condition or the use of a particular medical treatment, service, or item.” This would not be a problem if there were safeguards to ensure that best practices are not interpreted to mean the least expensive practices.
Officials at the National Institutes of Health (NIH) recently announced a stimulus-funded initiative to integrate cost-effectiveness into clinical research. “Cost-effectiveness research will provide accurate and objective information to guide future policies that support the allocation of health resources for the treatment of acute and chronic diseases across the lifespan,” according to the call for proposals.
Back at the White House, President Obama has been paying lip service to the clinical benefits of CER. At the same time, he recently lamented that “the chronically ill and those toward the end of their lives are accounting for potentially 80 percent of the total health care bill … there is going to have to be a conversation that is guided by doctors, scientists, ethicists. And then there is going to have to be a very difficult democratic conversation that takes place.” That, he explained, was part of the need for “some independent group that can give you guidance” on the ethical dilemmas involved with rationing end-of-life care.
During her Senate confirmation process, Secretary of HHS Kathleen Sebelius declined to voice her support for prohibiting the use of comparative effectiveness data to withhold care from patients. Her ideas echo those of Tom Daschle, the tax-dodging health policy wonk who wrote in his book that the U.S. “won’t be able to make a significant dent in health-care spending without getting into the nitty-gritty of which treatments are the most clinically valuable and cost effective.”
Then there’s Peter Orszag, Obama’s director of the Office of Management and Budget and a major player in crafting health care reform. For the most part, Orszag’s commentary on CER has been limited to lauding its ability to improve patient care while reducing waste. But when asked a few months ago if the Obama administration has a position on empowering the CER board to make reimbursement decisions, Orszag said, “Not at this point.”
But perhaps of greatest concern is a January House report that included the following statement on CER funding:
By knowing what works best and presenting this information more broadly to patients and healthcare professionals, those items, procedures, and interventions that are most effective to prevent, control, and treat health conditions will be utilized, while those that are found to be less effective and in some cases, more expensive, will no longer be prescribed.
Sound familiar? Cough, NICE, cough, ahem.
And as Jim DeMint explains, “CER is only one step in the Obama administration’s insidious plan to take over American health care … for our own good.”
But would CER really lead to health care rationing in the United States? Of course. That’s pretty much the point. The debate is not about whether or not CER would be used for rationing, but rather, whether rationing is ethical and useful, and how far we’re willing to go to save a buck and level the economic playing field.
If health care reform shapes up as many Democrats anticipate, CER Institute guidelines will initially apply to the public insurance option expected to be the centerpiece of the Democrats’ proposal. But eventually they would slide down the slippery slope into the private sector. A public insurance option would also ride roughshod over the already anemic competition among overregulated private sector insurers, making the survival of private insurance unlikely. As in the United Kingdom, recommendations will become rules and suggestions will become mandates in order to contain the costs of universal coverage.
To what extent will this result in government control of the doctor-patient relationship? Ultimately, a bureaucratic board will determine when, how, and whether or not you and your family receive care.
Comparative effectiveness research will no longer be just a political hot potato; it will be the basis for the next great American culture war. Instead of clashing over God, guns, and gays, we’ll battle over the monetary value of human life, the sanctity of doctor-patient relationships, the right to medical self-determination, and my favorite hot button issue, the duty to die.
Would cases like Terri Schiavo’s be decided based on financial considerations?
Where will fetuses fall on the QALY scale? How about the elderly or people with Down syndrome? Will they automatically receive limited treatment due to limited resources?
Will smokers be eligible for chemotherapy? Will overweight people have restrictions placed on cardiac care? Will we feel differently about those decisions when we’re footing the bill for everyone?
And you thought the abortion debate was contentious.
Obviously these questions address the most extreme examples of what could happen if we continue on our current path toward universal health care. But government efforts at cost containment through CER may push us toward debating these issues sooner than we think. Hopefully we’ll never see the day when questions like these go beyond an academic exercise.
Meanwhile, I’ll be saving up for a date with a mammography machine in one of those thriving medical tourism meccas. I hear Costa Rica is a breathtaking location for a 35th birthday celebration.
Like a lot of kids raised in liberal New York City, I was taught that anyone who wants a gun is probably the last person who should be allowed to own one. I learned to consider the Second Amendment a quaint throwback to less civilized times and had it drilled into my head that only psychos, criminals, and men with small penises carry guns. Most gun violence could be blamed on economic inequalities created by Reaganomics, according to the elementary school teacher who made sure a Mondale/Ferraro sticker was affixed to each student’s binder.
Then I grew up, read the Bill of Rights, and married a gun nut.
Across the country in Phoenix, Meghan McCain was brought up with a more informed view on the right to bear arms. Her brothers were avid hunters and she developed a deep respect for the Second Amendment. Today she’s an NRA member with a lifetime of positive gun experiences under her belt.
I confess I have a soft spot for Meghan McCain. I don’t agree with all of what she writes and I wish she’d add something new to the national political conversation instead of recycling a mishmash of talking points. But I admire her practical decision to milk her campaign fame for all it’s worth, and I think she’s wise to go the contrarian Republican route. Controversy sells, as evidenced by her six figure book deal.
McCain and I agree on the Second Amendment issue. But while her devotion to gun rights confirms her bitter clinger bona fides, she appears to have absorbed a different kind of liberal humbuggery on the issue of gun violence.
The real solution to preventing gun violence is not taking away the tools, but tackling its causes: poverty, inadequate health care, mental illness, joblessness, inadequate housing, and poor education. Desperate people will make anything a weapon. We need to eliminate desperation, not guns.
Translation: guns don’t kill people, people with less money and education than Meghan McCain kill people. (And sometimes the mentally ill do it too.)
Way to scapegoat the impoverished!
I was under the impression that identifying poverty as the root cause of violent crime was no longer in vogue – after all, that would let guns off the hook – but apparently President Obama feels otherwise. Eight days after the 9/11 attacks, Barack Obama attributed the tragedy to the terrorists’ lack of empathy stemming from a “climate of poverty and ignorance, helplessness and despair.” And in a 2007 speech, Obama called poverty “a disease that infects an entire community in the form of unemployment and violence.” Obama’s first pick for Commerce Secretary, Bill Richardson, shared similar thoughts during the 2007 NAACP Presidential Primary Forum when he said, “the key in eliminating gun violence is eliminating poverty, eliminating hate.”
Perhaps Meghan McCain is simply repeating liberal talking points, but it seems to me that even among the political left, violent crime is usually approached as a complex phenomenon caused by a multitude of sociological and psychological factors. Many recognize that it reeks of classism to suggest that poverty creates desperation-fueled violence. It’s also unsupported by evidence. While a correlation exists between certain crimes and poverty, research has not proven a cause and effect relationship. There are simply too many variables.
Even Marxist criminologists don’t attribute crime to poverty itself, but rather to relative deprivation such as income inequality. Both are silly assumptions: if everyone who was poverty-stricken or who found life unfair engaged in violent criminal activity, the world would be in chaos. Yet clearly most of the world’s have-nots eke out their years without erupting into violence.
Instead, couldn’t it be that violent crime perpetuates poverty? We see this on an individual level among both victims and convicted criminals. It is also evident on the community level. Neighborhoods decimated by gun violence fail to attract entrepreneurs who might help the areas prosper. Crime also keeps property values low and drives up insurance premiums.
It may well be that poverty has little to do with being deprived, and everything to do with being depraved. And it isn’t economic poverty, but moral poverty that is to blame for gun violence.
Between covering breaking news of the First Pup’s television debut and yukking it up on prime time with juvenile teabag jokes, media personalities have been absolutely swamped this month. Undoubtedly the talking heads would have found an angle that combined the two stories had Bo Obama not already been neutered.
But somehow amidst these concerns of grave national interest, and before swine flu began dominating the news cycle, little coverage was given to the implications of a significant tax day announcement. President Obama revealed that a rewritten federal tax code will soon “make it easier, quicker and less expensive for you to file a return, so that April 15th is not a date that is approached with dread every year.”
A simplified tax code is something most Americans can get behind. The federal tax code now stands at a whopping 70,000 pages. Eighty-five percent of American adults say the federal tax code is complex, and 82 percent say the tax system needs to be completely overhauled.
So what can we expect from an Obama-approved tax code revision? The first phase of the administration’s plan, conceived during the presidential campaign by economic adviser Austan Goolsbee, aims to eliminate tax returns for 17 million Americans.
Under the “Simple Return” plan, the Internal Revenue Service would complete tax returns for taxpayers whose sole income comes from one employer and whose interest income comes from one bank. The IRS would then send a copy of the return to the taxpayer. If the first wave of the program worked well, it could be expanded to other taxpayers.
The second and third waves of the Simple Return plan could bring the total to 52 million participating taxpayers.
The good news? Next year you may not have to file a federal tax return.
The bad news? The IRS will prepare your return for you.
Forgive me for plucking off a bit of low-hanging fruit, but it’s hard to overlook that the IRS is overseen by Treasury Secretary Timothy Geithner, a man who failed to pay several years’ worth of self-employment taxes until his finances went under the Senate microscope during confirmation hearings. His excuse was something along the lines of “TurboTax made me do it.” Who better to oversee the team of bureaucrats charged with calculating your tax bill?
But easy targets aside, this “simplification” is a political cop-out. Many of the complexities in our federal tax code are there because some politician, special interest group, or narrow constituency has lobbied for their existence. Some serve legitimate government interests, but many don’t. President Obama, like those who preceded him, will not risk alienating powerful voting blocs by ordering a careful review of the tax breaks and incentives that contribute to the corpulence of the code.
Instead, he pretends that the solution to a bloated tax code is a bloated IRS. The IRS has been serving up spoiled broth, and Barack Obama wants to hire more cooks. That’s certainly a different way of doing things, but simpler? I don’t think so.
Proponents of the Simple Return plan boast that it could save taxpayers 225 million hours currently spent on tax compliance, and $2 billion in tax return preparation costs. But how many additional Internal Revenue Service agents will it take to complete returns on behalf of 52 million taxpayers? How many millions of taxpayer dollars will it take to implement this program?
And even if Austan Goolsbee’s plan wouldn’t cost taxpayers a dime, should the government encourage people to shift partial responsibility for their finances to the IRS? What federal interest does it serve when citizens relinquish personal responsibility to become more dependent on the government for tasks they can accomplish themselves? Shouldn’t we be wary when the federal government offers to permanently shoulder our burdens?
Keep in mind, the vast majority of the tax returns we’re talking about are simple 1040EZ forms. Even the IRS estimates a 1040EZ takes just a few hours to complete, but for anyone with a pulse and a modicum of aptitude with a calculator, the compliance time is more likely to be minutes, not hours. The cost for individual taxpayers filing the 1040EZ is virtually nil – perhaps the nuisance of spending an evening in – except when they choose to hire a tax preparer. Unless the filer is illiterate or disabled, tax preparation fees are completely optional.
Why should compliance costs that are voluntarily incurred by individuals become the collective responsibility of American taxpayers?
For the most conscientious among us, compliance costs will, of course, not change at all. Even if our tax returns are completed by IRS bureaucrats, we will spend the same amount of time and money checking calculations as we do now. Only the people willing to put absolute faith in IRS number crunchers will “benefit” from this new type of government dependence. I’m betting those are the same people who place absolute faith in President Obama.
Grab your pitch fork! Light your torch! There’s a battle to be waged in the name of egalitarianism. There are wrongs to be righted on behalf of the aggrieved proletariat.
No weapon is off limits to this populist mob of angry legislators, outraged officials, indignant journalists and seething private citizens. Punitive taxation, public shaming, intimidation, and even threats of physical violence are all fair play if the greedy rich at AIG are to get their just deserts.
“Get the bonus, we will get your children.”
“I would be very careful when I went out side. This is just a warning. If I were ya’ll I would be real afraid.”
“Publish the list of those yankee scumbags so some good old southern boys can take care of them.”
“We will hunt you down. Every last penny. We will hunt your children and we will hunt your conscience. We will do whatever we can to get those people getting the bonuses. Give back the money or kill yourselves.”
“The Revolution is coming. The family members of your executives are not safe. Your blood will run through the streets in the coming months.”
New York Attorney General Andrew Cuomo attempted to satisfy an increasingly bloodthirsty public by threatening to disclose the names of AIG bonus recipients if they did not return the payments. And back in Washington, Rep. Barney Frank demanded the names of recipients and refused to keep them confidential in response to safety concerns.
To further address public cries for the heads of AIG executives, the House easily passed a bill to impose a 90 percent tax on executive bonuses at bailed out companies. The legislation garnered support from most House Democrats and nearly half of Republicans, though it appears to be dead in the Senate.
Even President Obama wondered how AIG executives could “justify this outrage to the taxpayers” and with utter disregard for the sanctity of private contracts, asked Treasury Secretary Timothy Geithner to “pursue every legal avenue to block these bonuses and make the American taxpayers whole.”
One problem: block the bonuses and you lose the talent.
Why should you care if AIG suffers a blow to its executive workforce?
Forget your outrage that taxpayers are underwriting these bonuses and think for a minute. Panicky legislators tossed barrels of cash at AIG many months ago and our only hope of getting those billions back is to ensure the company is skillfully dismantled by knowledgeable executives. If AIG assets aren’t sold off in an orderly, uninterrupted manner, your government’s investment will become your tax liability.
I know it hurts to say it, but keeping the remaining AIG executives at the company is in your best interest.
Many AIG executives worked for $1 salaries last year with the expectation that they would be compensated with bonuses if they remained at the beleaguered company. This manner of structuring compensation helped AIG retain qualified employees to dismantle the company. What we’re calling retention bonuses are essentially deferred salary payments postponed to ensure talent sticks around.
Even if you had the specialized knowledge, would you work for just a dollar a year? Would you pass up a stable, high paying job at a solvent company out of sheer loyalty to AIG? And where else should AIG management have looked to find expertise on dissolving these complex financial instruments and assets? Could we spare the time for training and learning curves?
Unfortunately, the threats have worked and the strong-arming has paid off. Jake DeSantis, an executive vice president at AIG Financial Products, published his letter of resignation in the New York Times this week:
After 12 months of hard work dismantling the company — during which A.I.G. reassured us many times we would be rewarded in March 2009 — we in the financial products unit have been betrayed by A.I.G. and are being unfairly persecuted by elected officials. In response to this, I will now leave the company and donate my entire post-tax retention payment to those suffering from the global economic downturn. My intent is to keep none of the money myself.
I take this action after 11 years of dedicated, honorable service to A.I.G. I can no longer effectively perform my duties in this dysfunctional environment, nor am I being paid to do so. Like you, I was asked to work for an annual salary of $1, and I agreed out of a sense of duty to the company and to the public officials who have come to its aid. Having now been let down by both, I can no longer justify spending 10, 12, 14 hours a day away from my family for the benefit of those who have let me down.
Like most current AIG executives, DeSantis was not responsible for the company’s massive credit default swap losses, but that hasn’t insulated him from the witch hunt conducted by Barney Frank, Andrew Cuomo, and others. Rather than remain at AIG out of fear, he has elected to leave on his own terms.
News of two more AIG resignations was announced Thursday. Mauro Gabriel, president and CEO of Banque AIG, and Jim Shephard, deputy CEO, are leaving due to the hostile business environment at AIG. There is some concern that a failure to find replacements could result in hundreds of billions of dollars in derivative contract defaults.
If that happens, good luck attracting qualified talent to help wrap up this AIG mess.
Will the rabble-rousing have been worth it then? Will that pound of executive flesh fill the coffers at Treasury? No, but that won’t stop the public hunger for class warfare from continuing to eclipse law, ethics, and even self-interest.
When Barack Obama convened a White House forum on health care reform last week, there was one ground rule: check fresh ideas at the door. Of course, you’d never know that from Obama’s opening remarks rife with the usual bipartisan Mad Libbery:
In this effort, every voice has to be heard. Every idea must be considered. Every option must be on the table. There should be no sacred cows. Each of us must accept that none of us will get everything that we want, and that no proposal for reform will be perfect. If that’s the measure, we will never get anything done. But when it comes to addressing our health care challenge, we can no longer let the perfect be the enemy of the essential. And I don’t think anybody would argue that we are on a sustainable path when it comes to health care.
Despite the inclusive rhetoric, invitees were carefully selected to ensure no contraband proposals made it past security checkpoints. Among the attendees were the usual suspects:
The vast majority of the groups represented at the summit strongly support a federal health insurance plan, and some are even advocates of a single-payer system. The list of summit participants included no fewer than nine unions: SEIU, UFCW, USW, Teamsters, UAW, CWA, Change to Win, AFSCME, and AFL-CIO.
The attendance list also included Physicians for a National Health Program (“Our Mission: Single-Payer National Health Insurance“) and other liberal advocacy groups such as the Center for American Progress, Campaign for America’s Future, AARP, Planned Parenthood, Families USA, and Health Care for America Now.
Advocates of free market health care models were conspicuously absent. Michael Cannon notes that the guest list excluded representatives from some of the top health policy think tanks in the world, including:
- American Enterprise Institute (the #5 think tank in the world for health policy)
- Cato Institute (ranked #7)
- National Center for Policy Analysis (ranked #10)
- Manhattan Institute
- Pacific Research Institute
- Galen Institute
- The Heritage Foundation
What could analysts from these policy centers bring to the table? Here’s just one example of an innovative approach to health care outlined by John Cochrane in a paper published by the Cato Institute. He proposes a systemic reform that would separate health coverage into two products: medical insurance and what he calls health-status insurance. “Medical insurance covers your medical expenses in the current year, minus deductibles and copayments. Health-status insurance covers the risk that your medical premiums will rise,” he explains.
John Cochrane’s free market solution would provide portability, preserve choice, and increase affordability. But as Reason Magazine’s Ronald Bailey points out in his excellent summary of the plan, Dr. Cochrane did not receive an invitation to the White House summit.
John Cochrane and other creative thinkers have been locked out of the debate, but the Teamsters and UAW have the president’s ear as he prepares to make a $634 billion down payment on health care reform. What happened to “every voice has to be heard”?
A recent op-ed in the New York Times gently chided President Obama for his tendency to shun the objective pronoun case.
Since his election, the president has been roundly criticized by bloggers for using “I” instead of “me” in phrases like “a very personal decision for Michelle and I” or “the main disagreement with John and I” or “graciously invited Michelle and I.”
The rule here, according to conventional wisdom, is that we use “I” as a subject and “me” as an object, whether the pronoun appears by itself or in a twosome. Thus every “I” in those quotes ought to be a “me.”
Proper pronoun selection is a reasonable expectation of a former Harvard Law Review editor with two Ivy League degrees under his belt. However, writers Patricia T. O’Conner and Stewart Kellerman suggest it is not our infallible leader, but the overly rigid rules of modern English that require correction.
So should the president go stand in a corner of the Oval Office (if he can find one) and contemplate the error of his ways? Not so fast.
For centuries, it was perfectly acceptable to use either “I” or “me” as the object of a verb or preposition, especially after “and.” Literature is full of examples. Here’s Shakespeare, in “The Merchant of Venice”: “All debts are cleared between you and I.” And here’s Lord Byron, complaining to his half-sister about the English town of Southwell, “which, between you and I, I wish was swallowed up by an earthquake, provided my eloquent mother was not in it.”
It wasn’t until the mid-1800s that language mavens began kvetching about “I” and “me.”
Most of us have inexplicable gaps in our education that lead to embarrassing linguistic mistakes. I had almost completed high school before someone told me the “l” in “wolf” isn’t silent. I never made that mistake again.
Now, however, we live in a time when failure is becoming obsolete. Corporations are too big to fail, high school students are graded on a curve, and irresponsible borrowers receive taxpayer-funded subsidies. Why shouldn’t we relax the rules of grammar to accommodate a public official’s ignorance?
For eight years, ridiculing George W. Bush’s semantic errors and grammatical gaffes was a national pastime. Examples of “Bushisms” were painstakingly chronicled in dozens of books and calendars. His most glaring verbal missteps were emblazoned on all manner of merchandise, from tote bags to thongs, and every instance of linguistic incompetence was presented as irrefutable proof of Bush’s general incompetence as President and Commander-in-Chief.
When President Obama screws up, we review Shakespeare for precedent. Apparently he’s too big to fail.