With less than four months to go until Election Day, Democrats increasingly have no confidence in the Obama White House’s political instincts. As a result, more and more Democratic candidates are avoiding the president when he comes to their neighborhood. Senator Mark Udall famously avoided showing up with Obama at a fundraiser in the senator’s honor in Colorado last week. John Foust, the Democratic congressional candidate in a suburban Virginia district just outside Washington, D.C., snubbed the president this week by failing to show up for a presidential event in his area.
Representative Henry Cuellar of Texas was flabbergasted by Obama’s petulant refusal to visit the Texas border last week, calling him “aloof” and “detached” and his decision “bizarre.”
The Virginia Progress PAC, a Democratic committee supporting Senator Mark Warner, issued a list of talking points for potential donors that laid out the challenge the Obama albatross represents for Democrats this fall: “The 2014 midterm elections are shaping up to be similar to the wave elections of 1994 and 2010, particularly with an unpopular President and an unpopular piece of major legislation that will serve as a referendum on the sitting President. . . . A difficult political climate coupled with the rising unpopularity of President Obama could affect the Democratic brand as a whole and hurt Senator Warner.”
Bob Beckel, a former Democratic campaign consultant, said on Fox News this week that he spoke with a Democrat “intimately involved in [Obama’s] campaigns, both of them.” The message was sobering: “He said you have to know what it’s like to get through [presidential counselor] Valerie Jarrett and Michelle Obama, and I think that’s a tough deal for anybody on a staff to do. . . . [Obama] lives in a zone that nobody else goes to.”
Indeed, Democrats are becoming increasingly vocal about their concern that their president is isolated and not connecting with the political reality around him. “The Democratic party is like a wedding party with the common goal of getting to the ceremony on time,” a former Democratic congressman told me. “There is a caravan of cars, but the lead car is driven by a guy who is weaving in and out of traffic and is dangerous to the other cars behind him. Do you follow the guy you agreed to follow, or do you make your own way to the wedding? More and more people are leaving the caravan.”
All of Washington is talking about our detached president — one who would go to two fundraisers in New York last night after a plane carrying 23 Americans was shot down over Ukraine. In 2012, Obama famously flew off to fundraisers in Las Vegas the day after the Benghazi attack killed our ambassador to Libya and three other Americans.
“Obama does not appear to relish being chief executive,” writes liberal journalist Edward Luce in the Financial Times. Luce notes that Obama has headlined 393 fundraisers since he took office, double the number that George W. Bush had attended at this point in his presidency. Veteran journalist Patrick Smith writes, “I can think of two names for this. One is ‘outmoded arrogance.’ The other is ‘asleep at the wheel.’ Whatever the moniker, some measure of incompetence lies behind it.”
Democrats are happy for the president to raise money, which he can still do by appealing to the fat cats in the party’s environmental, gay, and feminist bases. But they increasingly don’t want to appear with him in front of ordinary voters or follow his lead on policy. For example, more and more Democrats in swing districts or states are looking for a way to separate themselves from the Obama White House’s chaotic border policy. Much of the grumbling is private for now, but it is increasingly seeping into public discourse.
And the grumbling goes beyond politics. A disengaged, petulant president who gives the impression that someone else is minding the White House store isn’t good for the country.
One presidential historian says that if the president’s bizarre behavior deepens, people will start making jokes comparing Obama to President Woodrow Wilson, who was debilitated by illness during his last two years in office, with decisions increasingly made by his aides and his wife, Edith. “The comparisons of course wouldn’t be fair, but they don’t have to be to have elements of truth to them.”
Barack Obama’s disdain for the slow, grinding mechanisms of government has become unmistakable of late. So it is little surprise that, frustrated by congressional inaction on his proposal for “comprehensive immigration reform,” the president last month declared that he would “fix as much of our immigration system as I can on my own.” The result, intimated by White House senior adviser Dan Pfeiffer last week, is a “very significant” executive action to be unveiled by the end of the summer. If reports of the contents of the order are credible, not only will the action fail to “fix” America’s immigration system, it will further undo the constitutionally prescribed separation of powers that this administration has already done so much to weaken.
The White House is reportedly weighing two options for executive action similar in kind to the Deferred Action for Childhood Arrivals (DACA) program that was implemented — also by executive fiat, via memorandum — in 2012. One option would grant temporary legal status to illegal-immigrant parents of U.S. citizens, authorizing them to remain in the country and to work here. The second option would do the same for illegal-immigrant parents of DACA recipients. These actions could affect anywhere from 3 to 6 million people.
Although the specifics are unknown, any unilateral action of this magnitude and type would be unprecedented. Permission to work would secure for millions of illegal immigrants the benefits of lawful status despite the absence of a green card or a pathway to citizenship. Already, illegal immigrants, taken in toto, represent a net drag on the American economy of $55 billion a year, according to the Heritage Foundation, since they and their families make use of direct benefits (such as Social Security and Medicare), means-tested welfare benefits, public education, and other government-funded resources. The tacit moral sanction granted by a new DACA-type program would ensure that program participants are eventually guaranteed these services.
It is not unlikely that a new program would, like DACA, be pitched as a temporary measure. DACA deferrals, for instance, are given in two-year increments, after which recipients must renew their grant. But these “temporary” programs are no such thing. Consider Temporary Protected Status, established in 1990 to provide for illegal immigrants who, for reasons of war or natural disaster, cannot return to their home countries at the moment, but who do not qualify as refugees. Not one TPS beneficiary has been deported because his status expired. TPS still shields Hondurans who fled Hurricane Mitch, which struck in 1998. By this precedent, there is no reason to believe the Obama administration will aggressively enforce any new, supposedly temporary program.
In addition, any new DACA-style program will have the tendency to encompass persons beyond its target demographic. As U.S. Citizenship and Immigration Services (USCIS) officers report in the wake of DACA, anyone who appears to be under the maximum deferral age — that is, any illegal immigrant who appears younger than 33 years old — is presumed to be eligible for DACA. As of March 31, some 550,000 “DREAMers” have received permits under the order, but the program has functionally shielded probably 2 to 3 million illegal immigrants from investigatory and enforcement actions. No doubt a similar presumption would obtain under a new program, protecting millions who are technically ineligible.
DACA also belies the claim that unilateral executive actions are simply large-scale enactments of prosecutorial discretion, pragmatic measures necessitated by the federal government’s lack of resources. DACA has proven to involve a massive expenditure of both time and money, requiring USCIS officers to table entrance applications from legal immigrants to accommodate the deluge of applications from illegal immigrants. A de facto amnesty of 5 million illegal immigrants would overwhelm an already inundated system.
The problem, though, is finally one of constitutional order. Is Congress — and, through it, the electorate — responsible for the laws governing America’s borders? Or does one man get to decide who may enter and work in the United States? The assumption by the president of the ability to unilaterally welcome or reject migrants is a rank violation of the separation of powers. The president would no longer be enforcing existing law; he would be writing it anew at will on a scale heretofore unimagined.
Earlier this month Texas Republican senator Ted Cruz introduced a bill (S. 2666) that would cut off federal funds for the continued implementation of DACA and would prohibit any “agency or instrumentality of the Federal Government” from using federal resources “to authorize any alien to work in the United States” who was not lawfully admitted under the Immigration and Nationality Act. Although the bill is unlikely to pass the Democrat-controlled Senate, it will put pressure on red-state Democrats to defend their decision to countenance this executive-branch power grab.
Perhaps that can assist in the Republican campaign to retake the Senate. The campaign to restore immigration laws, and the rightful place of Congress in our constitutional order, will take much longer.
The Obama Justice Department thinks this is racist. It was created by a veteran to protest the unnecessary deaths caused by Veterans Administration policies and the ensuing cover-up.
By Charles C. W. Cooke
Nineteen terrifying words from the Omaha World-Herald: “The U.S. Department of Justice has joined the discussions over a controversial float in the Norfolk Independence Day parade.”
Thus did the federal government dispatch an emissary to investigate a minor instance of Midwestern dissent.
A quick recap for the happily uninitiated: The “controversial float” in question was one of many included in this year’s Independence Day parade in Norfolk, Neb. The entry, which featured a zombie standing on an outhouse marked “Obama Presidential Library,” was created by a veteran named Dale Remmich, and was designed, Remmich claims, to express the “political disgust” that he feels at the Obama administration’s mismanagement of the Department of Veterans Affairs. As is the habit now, pictures of the float were quickly pushed around the Internet, attracting the attention and disapprobation of such august institutions as the Washington Post, CBS, ABC, and the Huffington Post — and, it seems, the interest of the United States Department of Justice. This week, the World-Herald reports, the DOJ “sent a member of its Community Relations Service team, which gets involved in discrimination disputes, to a Thursday meeting about the issue.” Present at the summit were the NAACP, the mayor of the Nebraska town in which the float was displayed, and the Independent Order of Odd Fellows, which sponsored the event.
Now for the obvious question: Why? What, exactly, was the problem here? Nobody was killed. Nobody was maimed. Nobody had their material or spiritual interests injured, nor were they stripped of their livelihoods. No federal or state laws were broken. Indeed, not even private rules were broken. More to the point, there was no “discrimination dispute” of the sort with which the DOJ likes to concern itself. Instead, a few free people were vexed because a politician that they like was depicted in an unflattering light. One might well ask, “So what?” Once, Americans tackled the Oregon Trail. Do they now need their political “discussions” arbitrated by glorified social workers sent by Uncle Sam?
In a typically risible statement, Nebraska’s state Democratic party described the incident as one of the “worst shows of racism and disrespect for the office of the presidency that Nebraska has ever seen.” That this is almost certainly true demonstrates just how much progress the United States has made in the last 50 years — and, in consequence, how extraordinarily difficult the professionally aggrieved are finding it to fill their quotas. If a fairly standard old saw is among the worst things to have happened to the Cornhusker State in recent memory, the country is in rather good shape, n’est-ce pas?
Exactly what it was about the float that rendered it “racist” was, of course, never explained. Instead, the assertion was merely thrown into the ether, ready to be accepted uncritically by the legions of righteously indignant keyboard warriors that lurk around social media like piranhas around a fresh carcass. But, for future reference at least, it would be nice to have the details of the offense unpacked. Are outhouses racist now? Are zombies? Or was it perhaps the overalls in which the zombie was dressed? Moreover, if any of these are now redolent of something sinister, at what point was this association held to be operative? A popular cartoon from 2006 depicted a latrine standing in the middle of the desert, on its outer wall the words “Bush Presidential Library.” Was this “racist,” or is this one of those timeless truths that were only discovered in 2009?
The float’s maker has insisted that the zombie represented himself and not the president. “I’ve got my bibs on, my walker, I’m covering my ears and I’m turning a bit green; I intended it to look like a zombie who has had enough,” he explained. Unsurprisingly, the NAACP didn’t buy it. “Looking at the float, that message absolutely did not come through,” the president of the outfit’s Iowa and Nebraska chapters griped. Fair enough. Arguendo, let’s presume that some of the spectators misunderstood the piece and believed that the president of the United States was being compared to a toilet-dwelling zombie. Again: Who cares? Are we now so hopelessly epicene that we expect federally funded conflict-resolution teams to swoop in on the hinterlands if the locals mutter too loudly about the government? I rather hope that we are not.
Frankly, as superficially appealing as they might sound, appeals to “the dignity of the office” are invariably prissy, serving more often than not as a means by which humorless partisans might grumble about their team’s being dinged without appearing hypersensitive. Indeed, far from damaging the national fabric, astringent mockery of the powerful is a healthy and necessary thing — a source of valuable catharsis that serves also as a canary in the proverbial coal mine. When I see the most powerful man in the country being not only mocked, but hanged and burned in effigy too, my first thought is less “gosh, how awful” than “wow, is this a free country or what?” A historical rule of thumb: If a ragtag group of political dissenters can simulate the violent execution of the head of the executive branch and not be so much as scratched as a result, the country is a free one. Who cares if a few of our more delicate sorts reach for the smelling salts?
It is always tempting to believe one’s own time to be particularly interesting or fractious, but there is little in politics that is genuinely new. Sharp and violent denunciations of the executive branch have been a feature of American life since the republic’s first days. Before the Revolution, the colonists routinely hanged likenesses of King George III and of unpopular royal officials, including Andrew Oliver, the Massachusetts Distributor of Stamps, and the loyalist Supreme Court justice Thomas Hutchinson. Afterward, having dispensed with the old guard, Americans took to lambasting the new, among them George Washington, who had effected the king’s defeat; Thomas Jefferson, who had authored the charter of separation; and James Madison, who had drafted the lion’s share of the new Constitution. Chief Justice John Jay’s 1795 treaty with the British was so wildly unpopular among the Jeffersonians that Jay reported being able to travel from Boston to Philadelphia by the light of his burning effigies. Later, during the Civil War, Abraham Lincoln was subjected to the treatment. In one form or another, most presidents have been.
The modern era has served as no exception to the rule. During his two terms, George W. Bush was the object of considerable opprobrium, his likeness being frequently hanged, knifed in the forehead, and even assassinated on prime-time television. At the height of the Left’s umbrage, progressive heroes Ben Cohen and Jerry Greenfield went so far as to take a twelve-foot effigy of Bush on a national tour, setting fire to it at each stop to the audience’s hearty cheers. Ben and Jerry make ice cream, not apple pie. But their barnstorming road trip could not have been more American. There are few things more indicative of human liberty than the ability to castigate power with impunity — up to and including the moment of offense. “To learn who rules over you,” Voltaire suggested, “simply find out who you are not allowed to criticize.” Is Barack Obama to be a ruler?

Charles C. W. Cooke is a staff writer at National Review.
Often, crazy things seem normal for a time because logical catastrophes do not immediately follow.
A deeply suspicious Richard Nixon systematically and without pushback for years undermined and politicized almost every institution of the federal government, from the CIA and the FBI to the IRS and the attorney general’s office. Nixon seemed to get away with it — until his second term. Once the public woke up, however, the eventual accounting proved devastating: resignation of a sitting president, prison sentences for his top aides, collapse of the Republican party, government stasis, a ruined economy, the destruction of the Vietnam peace accords that had led to a viable South Vietnam, the end of Henry Kissinger’s diplomatic breakthroughs, and a generation of abject cynicism about government. Did Nixon ever grasp that such destruction was the natural wage of his own paranoia?
In the post-Watergate climate of reform, for nearly three years a naïve Jimmy Carter gave utopian speeches about how American forbearance would end the Cold War and create a new world order based on human rights — until America’s abdication started to erode the preexisting global order. Scary things followed, such as the fall of the shah of Iran, the rise of Iranian theocracy, the taking of American hostages in Tehran, revolutions and insurrection throughout Central America, the Soviet invasion of Afghanistan, radical Islamists taking over Mecca, more gas lines, continued stagflation, and China invading Vietnam. Did the puritanical Carter ever understand what might be the consequences of his own self-righteousness in an imperfect world?
Barack Obama likewise has done some crazy things that seemed for years to have no ramifications. Unfortunately, typical of the ways of Nemesis (a bitter goddess who waits until the opportune moment to demand payment for past hubris), suddenly the bills for Obama’s six years of folly are coming due for the American people.
When a president occasionally fails to tell the truth, you get a scandal like the monitoring of the Associated Press reporters. When a president serially fails to tell the truth, you get that plus the scandals involving the IRS, the NSA, the VA, Benghazi, and too many others to mention.
The same is true abroad. The American public hardly noticed when Obama recklessly withdrew every peacekeeper from Iraq. Did he not boast of “ending the Iraq War”? Nor did the public mind when the U.S. posted dates for withdrawal from Afghanistan. Trashing all the Bush–Cheney anti-terrorism protocols, from Guantanamo to renditions, did not make much sense, when such policies had worked and, in fact, were of use to Obama himself. But again, most Americans took no note. Apparently the terrorists did, however, and they regrouped even as the president declared them “on the run.”
Lecturing Israel while praising Islamist Turkey was likewise ignored. America snoozed as its president insidiously redefined its role in the Middle East as secondary to the supposed pivot to Asia. Each new correction in and of itself was comparatively minor; but in aggregate they began to unravel the U.S.-inspired postwar global order.
At first, who cared whether Iran serially violated every Obama deadline on halting nuclear enrichment? Did we worry that Libya, where Obama was proud of having led from behind, was descending into Somalia? Few Americans were all that bothered over Obama’s empty order to Syrian president Bashar Assad to step down, or over Obama’s later vacuous red-line threats that bombs would follow any use by Assad of chemical weapons.
Few noted that Obama lied to the nation that a video had caused the deaths of four Americans in Benghazi, that Obama had known who the real terrorist perpetrators were but had ordered no immediate action to kill or capture them, and that Americans had been engaged in mysterious and still unexplained covert activities in Benghazi. After all that, we still shrugged when the president traded five top terrorist leaders for an alleged American deserter.
Trashing George W. Bush’s policy toward Vladimir Putin while promising a new reset approach (illustrated with a plastic red button) to an aggressive dictator raised few eyebrows at the time. Nor did many Americans worry that our Pacific allies were upset over Chinese and North Korean aggression that seemed to ignore traditional U.S. deterrence.
We were told that only Obama-haters at home had catalogued the president’s apologies abroad, his weird multicultural bowing to authoritarians, his ahistorical speeches about mythical Islamic achievements, his surreal euphemisms for radical Islam, terrorism, and jihadism, his shrill insistence about civilian trials for terrorists and closing Guantanamo, or the radical cutbacks at the Pentagon, coupled with the vast increase in entitlement spending.
But after six years of all that, our allies have got the message that they are on their own, our enemies that there are few consequences to aggression, and neutrals that joining with America does not mean ending up on the winning side. The result is that the Middle East we have known since the end of World War II has now vanished.
Supposedly crackpot fantasies about a worldwide “caliphate” are becoming reified. What were once dismissed as conspiracy theories about an “Iranian arc” — from a nuclear Tehran through Syria to Hezbollah in Lebanon to the borders of Israel to the Shiite minorities in the Gulf kingdoms — do not seem so crazy.
The idea of visiting the Egyptian pyramids or hoping to reengage with a reforming Libya is absurd. The best of the Middle East — Israel, Jordan, Kurdistan — no longer count on us. The worst — ISIS, Iran, Syria — count on us to remain irrelevant or worse. Old allies in the Gulf would probably trust Israel or Russia more than the Obama administration. In the next two years, if Obama continues on his present course, we are going to see things that we could not have imagined six years ago in the Middle East, as it reverts to premodern Islamic tribalism.
The same trajectory has been followed on the home front. Americans at first were amused that the great conciliator — and greatest political recipient on record of Wall Street cash — went after the rich with an array of hokey epithets and slurs (fat cats, corporate-jet owners, Vegas junketeers, limb-lopping and tonsil-pulling doctors, business owners who should not profit, or should know when they have made enough money, or should admit they didn’t build their own businesses). Few connected the dots when the polarizing attorney general — the John Mitchell of our time — referred to African-Americans as “my people” and all the rest of the nation as “cowards.” Did we worry that the craziest things seem to come out of the president’s own mouth — the Trayvon-like son he never had, the stereotyping police, the absence of a “smidgen” of corruption in the Lois Lerner IRS scandal, or the mean Republicans who “messed” with him?
The president before the 2012 elections lamented to Latino groups that he did not have dictatorial powers to grant amnesty but urged them in the meantime to “punish our enemies” — a sort of follow-up to his 2008 “typical white person” incitement. Who was bothered that with “a pen and a phone” Obama for the first time in American history emasculated the U.S. Border Patrol, as part of a larger agenda of picking and choosing which federal laws the executive branch would enforce?
Those choices seemed to be predicated on two extralegal criteria: Did a law contribute to Obama’s concept of social justice, and did it further the progressive political cause? If the answer was no to either, the statute was largely unenforced. No president since World War II has done more to harm the U.S. Constitution — by ordering the executive branch not to enforce particular laws, by creating by fiat laws never enacted by Congress, by monitoring the communications of journalists and average Americans, by making appointments contrary to law — to the apparent yawns of the people.
Too few also seemed to care that almost everything the president had promised about Obamacare — keep your health plan, retain your doctor, save money on your premiums, sign up easily online, while we were lowering the annual deficit and reducing medical expenditures — was an abject lie. In such a climate, Obama felt no need to issue accurate data about how many Americans had lost their health plans, how many had simply transferred to Obamacare from Medicaid, how many had actually paid their premiums, or how many were still uninsured. The media ignored the serial $1 trillion deficits, the chronic high unemployment and low growth, the nonexistence of the long-promised “summer of recovery,” and the nonappearance of “millions of shovel-ready and green jobs.” The fact that electrical-power rates, gasoline prices, and food costs have soared under Obama as wages have stagnated has never really been noticed. Nor have the record numbers of Americans on food stamps and disability insurance.
Meanwhile, as Obama has refused to enforce immigration law, the result is chaos. Tens of thousands of children are flooding across our border illegally, on the scent of Obama’s executive-order amnesties. Advocates of open borders, such as progressive grandees Mark Zuckerberg and Nancy Pelosi, assume that these impoverished Third World children will not enroll in the private academies attended by their children or grandchildren, or need housing in one of their vacation estates, or crowd their specialists’ waiting rooms. They do not worry about the effects of illegal immigration on the wages of low-income Americans. Dealing first-hand with the ramifications of open borders is for unenlightened, illiberal little people.
Obama’s economic legacy is rarely appreciated. He has institutionalized the idea that unemployment between 6 and 7 percent is normal, that annual deficits over $500 billion reflect frugality, that soaring power, food, and fuel costs are not proof of inflation, that zero interest rates are the reward for thrift, that higher taxes are always a beginning, never an end, and that there is no contradiction when elite progressives — the Obamas, the Clintons, the Warrens — trash the 1-percenters, while doing everything in their power to live just like them.
We are the roost and, to paraphrase the president’s former spiritual adviser, Obama’s chickens are now coming home to us.
You may recall that when the IRS political-persecution scandal first started to become public, the agency’s story was that the trouble was the result of the misguided, overly enthusiastic actions of a few obscure yokels in Cincinnati. That turned out to be a lie, as we all know. But the IRS made a similar case successfully in the matter of its criminal disclosure of the confidential tax records of the National Organization for Marriage, whose donor lists were leaked to left-wing activists in order to use them against the Romney campaign. The IRS admitted that an employee leaked the information, but said it was an accident, that it involved only a single employee making a single error, etc., and the court agreed that NOM could not show that the leak was the result of malice or gross negligence.
Truly, the IRS must be the unluckiest agency in the history of the federal government. Oops! It’s leaking confidential taxpayer information to political activists. D’oh! It’s improperly and illegally targeting conservative organizations for harassment and investigation and misleading Congress, investigators, and the public about the scope and scale of that wrongdoing. Dang! It cannot produce the emails that investigators have demanded as part of the inquiry into its actions. Rats! Its employees are openly campaigning for Barack Obama’s reelection while on the IRS’s clock, using IRS resources, and holding taxpayers hostage. And, who could have seen it coming? The IRS violated the Federal Records Act by refusing to archive relevant documents. With a string of bad luck like that, sure, accidentally releasing NOM’s confidential taxpayer information to left-wing activists seems right at home.
That these events represent an unconnected string of unfortunate events — all of which just so happen to benefit the Left and its IRS allies while hurting conservatives and IRS critics — beggars belief. Add to that mix the willful dishonesty, the staged press rollout, complete with planted questions, intended to preempt questions about the internal investigation and its results, the naked lie that the wrongdoing was limited to a few nobodies in Cincinnati — the only way to believe that story is to desire very deeply to believe it.
The alternative and much more likely — undeniable, to my mind — explanation is that the Internal Revenue Service is engaged in an active and ongoing criminal conspiracy to misappropriate federal resources for political purposes, to use its investigatory powers, including the threat of criminal prosecution, for purposes of political repression, and to actively mislead Congress and the public about the issue; that the Justice Department is turning a blind eye to these very serious crimes for political purposes and is therefore complicit in the cover-up; that these crimes were encouraged if not outright suborned by Senate Democrats; and that the White House is at the very least passively complicit, refusing to lift so much as a presidential pinkie as the IRS runs amok.
And, apparently, there’s nobody in Washington with the power and the inclination to do anything about it.
Mr. Williamson, who now writes for National Review, was once the editor of the Ardmore, Pa.-based Main Line Times.
Almost everything the administration has alleged about Benghazi has proven false. Yet also, in Machiavellian fashion, the Obama group successfully peddled useful fictions, effectively deluded the country, adroitly ensured President Obama’s reelection, and cast aspersions on those who sought the truth.
In that sense, so far, the lies about Benghazi have won; the truth has failed.
So what really happened?
The Obama administration felt that it was behind the curve concerning the 2011 unrest in Libya. The so-called Arab Spring revolutions had toppled other governments in North Africa, and it seemed that protesters would do the same in Syria and Libya.
Hillary Clinton, Samantha Power, and Susan Rice did not want to be “on the wrong side of history,” especially given that it looked as if Moammar Qaddafi was likely to fall soon and needed only a little nudge. Given that the British and French were out in front, “lead from behind” seemed a safe, cheap way for the U.S. to intervene and yet not quite intervene — a sort of larger version of a drone strike.
But after Qaddafi’s fall, almost everything that followed proved the U.S. intervention to be a failure. The Americans had ceded leadership to France and Britain and seemed to boast about that fact. They had distorted the U.N. resolutions by going way beyond establishing no-fly zones and sending humanitarian aid. Obama had shown no interest in sending in postbellum peacekeeping troops or in organizing a U.N. force to prevent a Mogadishu on the Mediterranean. The result was a mess for most of 2011–12, as post-Qaddafi Libya settled into something like Somalia or the Sudan.
Al-Qaeda franchises emerged just as the parent organization had been declared to be on the run. Rumors spread that jihadists were arming themselves from the unprotected Qaddafi arsenal in the fashion of an unsettled Iraq around May 2003. Syria’s Assad had no intention of stepping down as ordered by President Obama. And so a full-scale civil war began in Syria, and the Arab Spring descended into tribal violence.
The U.S. decided to round up the most dangerous weapons of Qaddafi’s arsenal and to stealthily monitor the growing though supposedly nonexistent al-Qaeda presence in the detritus of Libya. A large CIA contingent was dispatched to Benghazi; nearby, a “consulate” opened. Ambassador Chris Stevens did his best to coordinate U.S. stealth efforts with what passed for a Libyan government. Rumors, never confirmed, spread that the CIA was shipping some of the Qaddafi arsenal to anti-Assad forces in Syria, hopefully the more secular insurgents. Other talk mentioned al-Qaeda prisoners held for interrogation by the CIA — another no-go topic in the 2012 campaign narrative of a defunct al-Qaeda, a secular Muslim Brotherhood, and an Obama who sees and hears no interrogations.
Stevens and others privately warned that the U.S. presence lacked sufficient security; they feared that the U.S. was doing enough to incite a terrorist response, but not enough to ensure the protection of its own forces if one was launched. But it was a reelection year. A Black Hawk Down firefight might in untimely fashion remind the public of the entire Libyan debacle. Security was not beefed up, and for a time the violence seemed to taper off.
As the anniversary of the 9/11 attacks approached, there were warnings of planned terrorist attacks on overseas U.S. facilities, especially in Libya, perhaps because the CIA presence was large and visible but not invincible. In an era of lead-from-behind diplomacy, terrorists were not convinced of any dangers from another U.S. armed intervention.
Some rumors later floated around that the consulate hit was in response to the drone assassination of Yahya al-Libi, others that it was prompted by stories of CIA arms transfers, yet others that it was linked to efforts to free captured terrorists. Who knows? But few seemed to care. In any case, the State Department had two general goals: to keep Libya from unraveling and to do so without another U.S. intervention. That translated into a de facto refusal to beef up security just two months before the election, and at a time when most other nations with a presence in Libya were packing up and getting out.
When a coordinated jihadist attack did target the consulate and CIA facility in Benghazi, Washington was entirely taken by surprise. It is not clear to what degree military authorities believed that they could have sent military help to those under attack in Benghazi with good chances of success, or whether they wished to do so but were refused permission.
Clearly, the president did not consider the attack on U.S. facilities a developing national turning point on a level with his decision to take out bin Laden. There were to be no photo-ops of the Benghazi situation room.
On the evening of September 11, by the time Obama was apprised of the strike, there was no chance the U.S. was poised to achieve a great victory, as it had in the bin Laden mission. The president had a busy campaign-fundraising schedule the next day, and so he retired early in the expectation that the secretary of defense and the chairman of the Joint Chiefs of Staff could manage the lose/lose crisis.
Disaster followed, as the jihadists overwhelmed meager U.S. security and killed, over a period of several hours, U.S. Ambassador J. Christopher Stevens; Sean Smith, the U.S. Foreign Service information-management officer; and two CIA contractors, Tyrone S. Woods and Glen Doherty. Outrage spread immediately as Americans learned that a U.S. ambassador was easily reached by terrorists and just as easily killed.
There were local claims in various places in the Middle East, many of them dubious, that an obnoxious video by a Coptic Egyptian resident in the U.S. had helped intensify the 9/11-anniversary violence elsewhere. Almost immediately the administration latched onto this narrative and massaged it to meet its own political needs.
That the unexpected and unforeseen disaster was due entirely to a reactionary Coptic, anti-Muslim provocateur, ensconced on U.S. soil, who had sown bigotry and religious hatred in a video released months earlier, proved a T-ball home run for Barack Obama.
Mr. Nakoula was in a sense the perfect fall guy. The video was amateurish, the producer a small-time con artist and cheat. Obama went into action in his accustomed teleprompted cadences, denouncing the forces of intolerance and chest-beating his own anguish at such illiberality on U.S. soil.
More importantly, the video as a casus violentiae was particularly resonant with an administration that had labored to remove the idea of Islamic extremism as a font of terrorism and instead had set up various smokescreens (e.g., jihad as a personal journey, terrorism as workplace violence, the Muslim Brotherhood as largely secular — not to mention overseas contingency operations, man-caused disasters, NASA’s Muslim-outreach mission, etc.). The more Susan Rice, Hillary Clinton, and Barack Obama hammered the theme of Mr. Nakoula as the guilty party, the more they could showcase their own multicultural bona fides and perhaps thereby explain away the violence (e.g., Obama’s iconic status still resonated in the Middle East; Libya was not a den of jihadists; al-Qaeda was still on the run; extremist right-wing Western provocateurs were still part of the problem).
Someone in the administration quickly discovered that Nakoula had technically violated the terms of his parole, and he was summarily jailed. Nakoula’s incarceration spoke volumes: The Middle East could appreciate that the real culprit was now behind bars. The U.S. had hunted down its own right-wing extremists, and Muslims now had no more reason to explode in spontaneous anger at such bigotry. Finding the real culprits, as the president had once promised, had now been accomplished.
The Nakoula construct, however, posed immediate problems. There were initial intelligence reports (confirmed by the Libyan president himself) that the deaths were caused by al-Qaeda terrorists. There was evidence that U.S. officials had had warnings about the premeditated attacks beforehand but largely discounted them. There was some evidence that the U.S. military might have been able to disrupt the terrorist forces, given that they were not spontaneous crowds who came out of nowhere and could melt away just as easily.
By and large the administration quite brilliantly finessed Benghazi. It turned the tables on the skeptics in the Romney campaign by suggesting that they were using the deaths of brave Americans to score political points. The president and his team cited the fog of war for the initial confusion. They promised in the light of day to go after the perpetrators — a pledge of action that they most surely did not pursue wholeheartedly as the election neared. Western hatred and intolerance, not radical Islam, had caused the deaths, the story went, with all the obvious red–blue domestic political implications.
In some senses, the administration’s photo-ops and ball-spiking after the bin Laden raid (“GM is alive, bin Laden is dead”) paled in comparison with the talking points and party line that immediately created the spontaneous-riot/evil-videomaker theme. Skeptics were deemed to be the politicizers, though the real politicizers were the ones who had distorted the truth.
Finally, time would cure all. The only real worry in the fall of 2012 was reelection. Benghazi fizzled in the second debate, when moderator Candy Crowley insisted that a presidential reference to generic terror was synonymous with an admission of a deliberate act of political and religious terrorism (as if a road-rage driver who leaves terror in his wake on the highway were a political terrorist), and the deaths at Benghazi then entered the black hole of House investigations. The concerned administration officials rightly assumed that, with time, a sort of “What difference — at this point, what difference does it make?” or “Dude, this was like two years ago” attitude would eventually make Benghazi a sort of bad memory. Deputy National Security Adviser Ben Rhodes and his associates were largely right in this regard, as the media snapped to attention and reduced inquirers to the status of conspiracy theorists.
What then are we left with?
Were there political reasons why requests for additional security were ignored, suggesting that American lives were not as critical as President Obama’s reelection? At what time on the night of the attack did the president go to bed, and who made decisions not to order military assistance? What was the CIA doing in Benghazi, and what effect did its activities have on our security status? Were reports that the hit was retaliation for a U.S. drone attack accurate? What exactly did top-ranking officials of the CIA initially testify about the attacks, and were their original statements contradicted by later assertions? Who in the administration massaged intelligence synopses and sent out memos to head off accusations of failed leadership? Did the administration pressure (as if pressure were needed) media outlets to downplay the story? Why did our U.N. ambassador assert falsehoods, and why was she selected to be such a spokesman? Who ordered Mr. Nakoula jailed and kept him behind bars? Why were the real perpetrators never seriously pursued as promised? Did the personal problems of CIA director David Petraeus, the administration’s initial reaction to them, his various testimonies, and his sudden post-election resignation have any interconnections? Have all those who participated in the defense of the Benghazi facilities been fully heard from? And have those in the chain of command who were responsible for holding back succor on the night of the attack been heard from as well? What information was redacted in documents requested by Congress or under the Freedom of Information Act, and by whom?
Until these questions are answered, we are left with the strong possibility that the lethal attacks might have been deterred with adequate security, or even neutralized in mediis rebus: that high administration officials subsequently and deliberately misled the public, the U.N., our allies, and the relatives of the dead; that the president of the United States did not consider the attacks a crisis, or considered them at most a crisis that could offer political opportunities, and subsequently and knowingly lied about the causes of the attack; that the U.S. government deliberately jailed a U.S. legal resident for reasons other than those alleged; that a U.S. election was influenced by administration deception; that the U.S. government was engaged in covert actions that might have been connected to the violence or were themselves ill conceived; that top intelligence officials did not tell the truth; and that almost immediately top administration handlers chose to construct a fantasy in lieu of reporting the facts about the death of four Americans.
Sixty years ago, the Supreme Court handed down its epoch-making decision in Brown v. Board of Education. The aftermath of Brown changed a great deal, from the role of the Court in our constitutional and political order to the national attitude toward civil rights and the very foundations of our political discourse.
It didn’t much change education.
There is much to say about Brown, and much that will be said. On the constitutional question, many conservatives at the time — and many conservatives now — shared the views of Barry Goldwater, who was himself an advocate of desegregation. “It so happens that I am in agreement with the objectives of the Supreme Court as stated in the Brown decision,” he wrote in The Conscience of a Conservative. “I believe that it is both wise and just for Negro children to attend the same schools as whites, and that to deny this opportunity carries with it strong implications of inferiority.” Senator Goldwater’s complaint was constitutional:
To my knowledge it has never been seriously argued — the argument certainly was not made by the Supreme Court — that the authors of the Fourteenth Amendment intended to alter the Constitutional scheme with regard to education. Indeed, in the famous school integration decision, Brown v. Board of Education (1954), the Supreme Court justices expressly acknowledged that they were not being guided by the intentions of the amendment’s authors. “In approaching this problem,” Chief Justice Warren said “we cannot turn the clock back to 1868 when the amendment was adopted. . . . We must consider public education in the light of its full development and in its present place in American life throughout the nation.” In effect, the Court said that what matters is not the ideas of the men who wrote the Constitution, but the Court’s ideas. It was only by engrafting its own views onto the established law of the land that the Court was able to reach the decision it did.
That was the view of most of the editors of National Review at the time, although the remarkable discovery I made — remarkable to me, at least — in my recent course of reading this magazine from its first issue through the middle 1960s is how relatively little we had to say about those questions. Brown is remarked upon, and so is the Civil Rights Act of 1964, but compared with issues such as Communism and the Vietnam War, they occupy very little space, and they are considered mainly, though not exclusively, in legal terms. Those terms are of course important, and conservatives who are instinctively inclined to agree with Senator Goldwater would do well to consider the contrary opinion of Robert Bork, whose views on such matters are not to be discounted lightly:
The Court’s realistic choice, therefore, was either to abandon the quest for equality by allowing segregation or to forbid segregation in order to achieve equality. There was no third choice. Either choice would violate one aspect of the original understanding, but there was no possibility of avoiding that. Since equality and segregation were mutually inconsistent, though the ratifiers did not understand that, both could not be honored. When that is seen, it is obvious the Court must choose equality and prohibit state-imposed segregation. The purpose that brought the fourteenth amendment into being was equality before the law, and equality, not separation, was written into the law.
Justice Clarence Thomas, noting that Brown was roundly criticized for its reliance upon sociological and psychological theory, comes to a similarly straightforward conclusion. The Court, he writes, “did not need to rely upon any psychological or social-science research in order to announce the simple, yet fundamental truth that the Government cannot discriminate among its citizens on the basis of race.”
Conservatives, at the time, were torn between their desire that government should make no distinctions between the races and their antagonism toward judicial imperialism. Conservatives, then as now, also were deeply influenced by their belief that the law could only do so much to remake social realities. The Republican party has a remarkably consistent belief, from the Lincoln era through the present day, that the main drivers of salubrious social change must be free enterprise and economic self-improvement. It is for that reason that Senator Robert Taft of Ohio — “Mr. Republican,” the Senate’s leading conservative — floated a largely forgotten proposal in 1946 that would have been the most sweeping civil-rights reform since the Reconstruction amendments, focusing mainly on the problem of employment discrimination. David Freeman Engstrom revisited that episode in a 2006 article and documented that the Taft bill, unlike many similar earlier offerings, contained very strong enforcement mechanisms, giving it real teeth, up to and including the implementation of hiring quotas. The Taft measure won the support of the noted black labor leader A. Philip Randolph, but was rejected by the NAACP and the AFL, the latter in part very probably because, as Mr. Engstrom notes, the Taft plan would have “exposed union locals to regulation.”
The post-Reconstruction Republican party believed at its core that the South was backward because it was poor, rather than poor because it was backward, and this line of thinking was implicitly and sometimes explicitly extended to African Americans throughout the country, as it is today. The theory was that incremental social change, driven largely by improvements in economic conditions, would accomplish what mere de jure equality could not. James Burnham, writing on the tenth anniversary of Brown in the June 2, 1964, edition of National Review, sharply criticized Brown and the Court, partly for the attempt to superimpose the justices’ idealism over state and local law, as well as over what he called “natural” processes, but also because the post-Brown regime failed to deliver: “The verdict pronounced by the facts leaps to the eye, and is implicit even in the many tenth-anniversary recapitulations published in the journals that rate Brown alongside the Ten Commandments and the Declaration of Independence. Brown is an abysmal failure, strictly on its own terms.” Mr. Burnham’s next paragraph could be published today with only a slight revision of the numbers:
The rate of school integration — the specific problem dealt with in Brown — has been no more rapid in this decade since 1954 than in the decade before 1954, when, without benefit of the Court, it was progressing slowly but continuously under the influence of economic change, social pressures, shifts in community sentiment, and the state of local law. Today, after a decade of Brown, 91 per cent of Negro students in the Southern and Border states still attend segregated schools. . . . In the Northern cities, the widespread de facto school segregation, resulting from residential patterns, has not been significantly changed.
Mr. Burnham’s observations in 1964 are not radically different from those of Eleanor Barkhorn writing in The Atlantic just last year. She notes that in 1969, after the Department of Education had begun robust enforcement of Brown , 77 percent of black and 55 percent of Hispanic students attended schools that were predominantly minority, whereas in 2010 the numbers had hardly budged for blacks (74 percent) and moved in the direction of more segregation for Hispanics (80 percent). And in 2010, she reports, more than 40 percent of minority students attended schools that were almost exclusively (90–100 percent) nonwhite.
It is for that reason that the constitutional debate, important as it may be, seems to me sterile.
Those Taft Republicans were in many, perhaps most, ways correct about the relationship between economic progress and broader social progress. Until after World War II, the South was desperately poor compared with the rest of the country, with incomes on average one-third those in the Northeast. The post-war economic boom was the main factor in changing that — no law, no public policy, no federal program was nearly as significant. For all the talk about economic inequality, sustained, robust growth and economic innovation can incrementally but radically change our quality of life. A middle-income American in the Northeast 100 years ago was much better off than a middle-income Southerner, but both were very poor by our standards. The difference between them was nowhere near so great in real terms as the difference between them and us.
African Americans have been as well served by economic innovation and growth as anybody — probably more so. There is no performance gap in, say, the car that a black family living in a largely black neighborhood in a largely black city can buy compared with what a white family in a white neighborhood in a white city can buy. A black family with $25,000 to spend has the same choices as a white family with $25,000 to spend. A black family that can afford a Mercedes has the same choices as a white family that can afford a Mercedes. The same is true for most products.
It is not true for education, the most important product that is still delivered on a Soviet central-planning model rather than through markets. A middle-class black family living in a largely black neighborhood is likely to be served by relatively inferior public schools. Across income groups, blacks are less well-served by the monopoly education system than whites are. That is not so much a product of the fact that African Americans are relatively poor, though they are, as of the fact that they reside in relatively poor communities. There are many young people in families of very modest means who benefit from going to schools in communities populated by people who are much better off than they. (I was one of those.) But that benefit is, statistically speaking, less available to black families. And as a practical matter, it is almost certain to remain so as long as K–12 education is dominated by a model in which ZIP code is destiny.
Brown was and is important as a statement of principle, but law has a limited ability to change the facts on the ground. Free markets, on the other hand, remake the physical world anew with revolutionary speed. One of the footnotes to the Brown decision considers possible remedies, one of which — “Negro children should forthwith be admitted to schools of their choice” — suggests what is still an excellent policy option, though one that should be applied universally rather than restricted to black students. The relative lack of black educational progress in the post-Brown era not only highlights the deficiencies of the politically dominated model of economic production — and education is an economic good — but also draws attention to the critical distinction between government funding of services and government provision of services. Food stamps have not interfered with innovation in the growing, distribution, preservation, or retailing of food, because government does not attempt to operate farms, food-distribution networks, or grocery stores. It does operate schools, with consequences that have been disastrous generally but especially for African Americans. Even accounting for the income disparity between blacks and whites, the groceries, clothes, housing, electronics, automobiles, and other normal market goods available to black Americans are radically better today than they were 30 years ago, to say nothing of 60 years ago. The same cannot be said of schools.
We may celebrate the sentiment behind Brown, but it would be far better to take meaningful steps to make the aspirations of 1954 into a reality sometime before 2054. When the politicians make their sentimental speeches about how far we’ve come since then, ask them where they stand on school choice, consumer-driven education, and other reforms. And then ask yourself which party is still living in 1954.
After Barack Obama gave a thousand campaign speeches on Iraq, Guantanamo Bay, and the economy, one of his first actions upon taking office as president was to begin gutting a tiny school-choice scholarship program in Washington, D.C. And now newly inaugurated New York mayor Bill de Blasio has, as one of his first agenda items, begun the gutting of the city’s charter schools, which are public schools that operate with some limited measure of independence from the usual education bureaucracies. Like President Obama, Mayor de Blasio is here engaged in plain, naked payback, rewarding the teachers’ unions that funded and manned his campaign by taking hundreds of millions of dollars away from projects they despise. If a private city contractor had bankrolled the mayor’s campaign and been repaid by having him hobble its competition, we’d call it simple corruption. And it is simple corruption, legal though it may be.
Mayor de Blasio intends to redirect money from the city’s charter schools to help pay for expanded pre-kindergarten education, which is to say for a full-employment program for his union supporters. Expanding pre-kindergarten education is a questionable investment: The premier federal pre-kindergarten program, Head Start, has been shown time and time again to provide no lasting benefits for its supposed beneficiaries. Robust support for early-childhood education sounds like the sort of thing that should work, but the empirical evidence is that it does not deliver on its promises.
New York City’s charter schools are consistently flooded with applications from parents desperate to rescue their children from the city’s dysfunctional standard-issue public schools. There are many metrics by which the success of an educational institution can be measured, but if we are guided in some part by the revealed preferences of New York City’s parents, then the evidence is overwhelming that charter schools are a much more attractive choice when the alternative is the product Mayor de Blasio’s union bosses are offering up. Charter-school operators, pointedly seeking to remind the administration that they are, still, operating city public schools, have asked only that their capital and operating funds be proportional to the populations they serve: “A kid is a kid is a kid,” as charter-school executive Eva Moskowitz put it. “We are public charter schools. The operating revenue should be the same. The capital revenue should be the same.”
New York’s charter schools serve a largely minority and low-income population, in a city where the traditional schools barely manage to carry half of the young black men who enter the ninth grade through to graduation four years later. Educating the children of New York City entails some serious challenges, and the charter schools have not achieved what anybody would call dramatic success. They simply provide a superior alternative to traditional schools for many families. Results need not be spectacular to be meaningful.
As a report from the Brookings Institution put it:
Two recent rigorous evaluations have found that NYC charter schools are, on average, doing a substantially better job for students than the regular public schools with which they directly compete. For example, student gains in math in charter schools compared to traditional public schools are equivalent to roughly five additional months of schooling in a single school year. Likewise, students attending the small high schools of choice opened by the Bloomberg administration have high school graduation rates that are about 10 percentage points greater than students who wanted to attend these same schools but lost a lottery for admission.
Judging by the application rates, New York City parents love charter schools. The evidence suggests they do a meaningfully if not radically better job than their traditional counterparts. They are seeking only the same resources to which they would be entitled if they were not charter schools, meaning they place no special burden on taxpayers. The only faction opposed to them is the teachers’ unions, which seek to legally eliminate all competition and all alternatives.
Charter schools are a tiny crack in the Berlin Wall of the government-school monopoly, far short of the liberalized approach to education we would prefer. But they are a significant improvement that comes at very little cost, and Mayor de Blasio’s attack on them elevates the interests of his political cronies over those of the city’s children. It is low and it is shameful, and the Panel for Education Policy, which has the opportunity to stop this abuse in March, should see to it that the mayor’s proposal does not stand.
The Environmental Protection Agency’s recently announced decision to, in effect, ban the construction of traditional coal-fired power plants in the United States is a non-solution to a hypothetical problem, enacted upon a legal basis that is shaky and an economic basis that is nonexistent. The cost-benefit analysis is almost entirely one-sided: The costs will be very high, and the benefits the EPA hopes to secure will remain out of reach.
The EPA is demanding that new U.S. plants that will use coal to generate electricity must meet standards that today are met by no commercial coal-fired plant operating anywhere in the world. There are, however, two plants coming on line — one in Saskatchewan, one in Mississippi — that incorporate new technology designed to capture enough carbon dioxide to satisfy the EPA demands. Whether that new technology will be effective in practice remains to be seen; whether it will be both effective and cost-effective is a much more important and complex question, one that the EPA has no genuine interest in contemplating.
That is a problem, inasmuch as the Clean Air Act requires that the EPA perform a cost-benefit analysis of new rules. EPA administrator Gina McCarthy not only says that the agency has conducted such an analysis but goes on to characterize it as “wonderful,” and we are indeed filled with a sense of wonder at her proclamation, though perhaps not in the way she intended.
The costs remain a mystery. The industry expects them to be high, but how high is anybody’s guess: The CO2-capture technology that the EPA expects to become standard as a result of its new mandate is, as noted, not currently in commercial use. There is no demand in the market for it, and its costs can therefore be estimated on a wild-guess basis at best.
It is easier to estimate the benefits: They will be nonexistent. Even if we assume that the general thrust of the case for anthropogenic global warming is accurate (an assumption that requires setting aside the recent failure of climate-change models and the less confident scientific consensus as to the meaning of recent data), the fact remains that global warming is, if it is anything at all, global. Local controls on U.S. power plants, even if they are draconian, will have little impact on the overall atmospheric composition of the planet and its effect on global temperatures.
Carbon dioxide is only one greenhouse gas among many, and the United States is not the world’s largest producer of it. The United States, in fact, produces about 15 percent of the world’s carbon-dioxide emissions, and U.S. power plants are responsible for only about a third of that 15 percent, or roughly 5 percent of the global total. And the new rule applies only to newly constructed plants, though the EPA has signaled that it intends to demand the retrofitting of existing plants in the future.
What all this means is that even if the EPA were wildly successful in its implementation of the new standards, it still would not achieve any substantial reduction in global greenhouse-gas emissions. It is equally likely, if not more likely, that it will achieve an increase instead: Being a fungible commodity, the coal not consumed by U.S. generators will find its way to China, India, and the rest of the developing world, where it will be consumed in high-pollution plants that make those in the United States look as pure as vestal virgins by comparison.
So: Costs unknown, benefits negligible. “Wonderful,” indeed.
No doubt surviving members of the 88th Congress, which passed the Clean Air Act, are filled with a similar sense of wonder that their law is being used to police carbon dioxide emissions, an outcome the legislators did not intend. The legal basis for declaring carbon dioxide a “pollutant” under the act is questionable at best, as is the EPA’s rationale for picking and choosing what sorts of emitters will be subject to its new rules. If you would like a preview of what medicine is going to look like under Obamacare, consider the high-handed, letter-of-the-law-be-damned approach of the EPA and the courts that have enabled it.
The new rule may prove wonderful for the manufacturers of the capture technology that will effectively be mandated. As in the case of Solyndra et al., this maneuver is not about producing environmental benefits but about creating markets for politically favored firms and industries. But even those cronies may fare less well than they expect to.
The Obama administration, despite its obvious desire, has not yet been successful in strangling the natural-gas renaissance that is changing the face of the American energy industry. Though coal remains the largest single source of electricity, it already has been falling out of favor with those building new generating capacity, because natural gas is cheaper and plentiful. It is also less damaging to the environment, contra the ill-informed hysteria about the gas-extraction technique known as fracking. But the United States has a complex economy, and there is no single “right” source for fuel. Left to its own devices, the industry probably will move toward natural gas and away from coal, but coal will remain an important part of the picture for the foreseeable future.
In 2012, Barack Obama became the first major-party presidential candidate since West Virginia’s statehood to fail to carry a single county in the state. He lost the statewide vote by a substantial margin, with roughly two out of three voters against him. The people of West Virginia rightly appreciated that their best-known commodity is the target of a regulatory jihad by the White House that has no environmental or economic justification.
The real motive here is the administration’s messianic pretensions, its belief that its bureaucrats and managers are more humane and more intelligent than the producers and consumers over whom they reign, and that they have been chosen to lead the United States into a future that is relatively free of such relics of the Industrial Revolution as coal-fired power plants and petroleum products. Unhappily for them, there is a wide gulf between social engineering and real engineering, and the most impressive products the green-energy revolution has delivered so far are a couple of nifty electric motorcycles, which are recharged by a power grid that gets 40 percent of its juice from coal.
A functioning modern society requires reliable electricity. A modern industrial economy requires affordable electricity. To impose incalculable costs on electricity generation in exchange for ideological satisfaction with no real-world environmental benefit is the sign of an agency that has put its own political agenda ahead of the national interest, playing fast and loose with the law in the process. The EPA is a menace, and Congress should put it on a leash.