An American Tragedy: Awarding a War Criminal on Veterans Day

By Maj. Danny Sjursen

The whole charade plays best as farce. Absurdity incarnate. The sight of former President George W. Bush receiving a medal from Democrat Joe Biden—once an ardent opponent of Bush’s war policy—in Philadelphia for “his work with veterans,” on Veterans Day no less, induced nothing short of a gag from this veteran of two Bush wars—Iraq and Afghanistan.

George W. Bush, after all, led the U.S. military—to which I’ve dedicated my adult life—into two ill-advised perpetual wars, one of which was objectively illegal and immoral. In that war, in Iraq, some 4,500 American troops—including three of mine—were killed fighting in an unwinnable quagmire. Furthermore, though it slipped the attention of an American citizenry best known for its provincial inwardness, at least 250,000 Iraqis—mostly civilians—were killed. In a just world this would be labeled what it is—a war crime—but in this era of American hegemony, the populace simply sighs with apathy.

Now, we are told, it is time to congratulate Mr. Bush on his post-presidency work with the very veterans he created. Somehow, his choice to spend his retirement painting the faces of the misemployed, and often damaged, veterans he brought into being absolves him from the crimes of what this author is certain will be remembered as one of the worst presidencies in history. This fanfare is post-factual and illogical, but it certainly reflects our times.

Only in the era of Donald Trump could such a flawed and ignoble figure as George W. Bush appear gallant. Then again, we should have seen it coming. When a bipartisan consensus of war hawks turned out to share candy and venerate the militaristic legacy of Sen. John McCain—complete with both Barack Obama and George W. Bush as eulogists—it was only a matter of time until Bush and his murderous administration were rehabilitated.

And it makes sense that it was Biden who bestowed the medal. “Smiling Joe” might have turned against the failing Iraq War by 2006, but let us remember that Biden—along with Democrats Hillary Clinton, Chuck Schumer and Harry Reid—voted to authorize that very war in October 2002. The whole moment was shameful, a farcical—if accurate—parody of the entire U.S. bipartisan warfare state. Biden’s boss, Barack Obama, after all, was the very man who chose not to investigate or indict the veritable war criminals in his predecessor’s administration. Obama claimed that he did so in the name of national unity; yet given his own militaristic record, it seems he did so only to perpetuate an American warfare state in the Middle East.

Shame on Joe Biden; shame on Barack Obama; shame on all the elite officials who play politics but refuse to forswear the tactics and policies of U.S. government militarism that have been in business since 9/11/2001. One group, at least, refused to bow to the little-reported Bush award ceremony. About Face: Veterans Against the War, an organization I’m proud to be a part of, bravely chose to protest the farce in Philadelphia. As they blocked the entrances to the $1,000-plus-a-plate gala, these veterans of Iraq and Afghanistan chanted against the expansion of the warfare state and the shameful award bestowed upon Bush. As is so often the case in apathetic America, hardly anyone noticed, and the mainstream media quickly moved on.

This veteran, for one, did not avert his eyes. I teared up as I watched fellow veterans—victims of the Bush wars of choice—protest the absurd award ceremony. I thought of Sgt. Alexander J. Fuller, my favorite soldier, who died at the hands of Iraqi Shia militias that should have been the natural allies of the United States in the “war on terror.” Only Bush—who famously didn’t know the difference between Sunni and Shia Arabs—found a way to lead the U.S. Army into an illogical conflict with them, too.

I’ll never forgive George W. Bush, no matter how many portraits he paints. Perhaps if I were as “good” a Christian as he is, I would. But I’m not that guy. Bush’s ill-advised wars stole my friends, my youth, my mental health and my trust in the American state. That can’t be replaced. In my younger years, an emotional 23-year-old version of myself probably wished ill on him and his. I no longer feel that way. I wish only the best for Mr. Bush, his wife and his family. But neither I nor any serious veteran or scholar of the Iraq War will ever countenance his rehabilitation or commemoration.

——
Danny Sjursen is a U.S. Army officer and a regular contributor to The American Conservative. He served combat tours with reconnaissance units in Iraq and Afghanistan and later taught history at his alma mater, West Point. He is the author of a memoir and critical analysis of the Iraq War, “Ghost Riders of Baghdad: Soldiers, Civilians, and the Myth of the Surge.” Follow him on Twitter at @SkepticalVet.

Note: The views expressed in this article are those of the author, expressed in an unofficial capacity, and do not reflect the official policy or position of the Department of the Army, Department of Defense, or the U.S. government.


Should We Do Away With Veterans Day Altogether?

By Maj. Danny Sjursen

This piece originally appeared on anti-war.com.

Veterans Day – maybe we ought to drop the whole charade. Don’t get me wrong, there will be celebrations aplenty: the NFL will roll out the ubiquitous stadium-sized flags and march uniformed service members in front of the cameras; cities across the nation will hold parades; and millions of Americans will take a moment to go through the motions and “thank” the nation’s soldiers. Sure, the gestures are sometimes genuine and certainly preferable to the alternative. Still, all this martial spectacle misses the salient point hidden just below the surface: the American people are absolutely not engaged with U.S. foreign policy. Most could hardly name the seven countries their military is actively bombing, let alone find them on a map.

Worse still, hardly anyone even talks about America’s wars these days – not the mainstream media, not the president of the United States, not the Congress. Veterans Day just happens to fall about a week after the midterm elections – which both President Trump and Barack Obama told their supporters were “the most consequential of our lifetimes” – but the truth is that foreign policy was hardly on the agenda this past Tuesday. Americans argued about healthcare, taxes, immigration, and Mr. Trump’s personality, but ignored our supposedly adulated soldiers.

See, that’s the point. Election Day, not Veterans Day, was the genuine opportunity to honor the nation’s veterans – many of whom are still deployed in the perpetual combat zones of the Greater Middle East. Only, as expected, the American people let their veterans down. In the 24/7 media and political conversation surrounding the midterm elections, no one, and I mean no one, took the time to ask the questions that really matter to veterans: What are they being asked to accomplish in the world? Are they achieving their missions? Are those missions even achievable? What is the end state, or, you know, will these eternal wars ever end? No, Americans didn’t demand answers to these questions, and their elected representatives were just as happy not to engage with such complex issues.

Apathy is the name of the American game when it comes to foreign policy, war, and “peace” – whatever that means anymore. The public – utterly detached from an all-volunteer military consisting of fewer than 1% of Americans – and the Congress like it that way. It’s far easier – and strangely comforting, it seems – to throw a yellow ribbon on a car, pick up a soldier’s check, or loudly belt out the Star-Spangled Banner at some sporting event than to actually follow US foreign policy and demand accountability. Attending an antiwar march or calling one’s representatives to discuss America’s wars, well, that’s hard! So much easier is it to “like” a sentimental meme on Facebook or briefly thank a stranger in the airport.

This author recognizes how cynical this all sounds. Truth is, though, it’s not meant to be offensive. Many of those thanking vets genuinely mean well. It’s just that all of that adulation isn’t helping to extract America’s troopers from the longest – and probably least decisive – wars in the nation’s history. We, the veterans and active duty soldiers who’ve served tour after ambiguous tour, deserve an engaged populace. We deserve a citizenry that demands answers to the real question before us: what, precisely, is the US accomplishing in the Greater Middle East? The short answer, for the few of us who seriously study this, is almost nothing.

The results of America’s ongoing wars are visible and publicly available. US military interventionism has cost 7,000 soldiers’ lives, upwards of half a million local deaths, $5.6 trillion (and counting), 10 million refugees, and an entire region in worse shape than we found it. Veterans are regularly touted as having “defended” the people and the country; only, there’s little evidence to bolster this vacuous argument. US State Department data has long demonstrated that worldwide terror attacks have increased dramatically since 2001 (even if there was a slight dip from 2016 to 2017). Furthermore, even counterinsurgency gurus in the military and civilian policy machines recognize that US bombing, drone strikes, raids, and military occupations tend only to further inflame anti-American sentiment and often create new crops of “terrorists.” The whole enterprise is nothing if not counterproductive.

Which brings us back to the Veterans Day masquerade, and this veteran’s plea: please get engaged in US foreign policy. Even if you or your family members do not directly serve in the Armed Forces, commit yourselves, on this holiday weekend, to education and interest in the ongoing American wars. Remember, the executive branch, whether led by Trump “conservatives” or Obama “liberals,” counts on your apathy in order to wage unilateral global combat without congressional oversight or citizen protest. Refuse to acquiesce to this absurdity – it is killing what’s left of the republic.

So this year, on this day, try something truly brave, and different: be a citizen. Thank a veteran in your local community, sure, but then head home and open a credible periodical. Turn to the foreign policy section and start reading. Think; ask tough questions. Only then will you have genuinely honored millions of American vets.


American History for Truthdiggers: Tragic Dawn of Overseas Imperialism

By Maj. Danny Sjursen

Editor’s note: The past is prologue. The stories we tell about ourselves and our forebears inform the sort of country we think we are and help determine public policy. As our current president promises to “make America great again,” this moment is an appropriate time to reconsider our past, look back at various eras of United States history and re-evaluate America’s origins. When, exactly, were we “great”?

Below is the 21st installment of the “American History for Truthdiggers” series, a pull-no-punches appraisal of our shared, if flawed, past. The author of the series, Danny Sjursen, an active-duty major in the U.S. Army, served military tours in Iraq and Afghanistan and taught the nation’s checkered, often inspiring past when he was an assistant professor of history at West Point. His war experiences, his scholarship, his skill as a writer and his patriotism illuminate these Truthdig posts.

Part 21 of “American History for Truthdiggers.”

See: Part 1; Part 2; Part 3; Part 4; Part 5; Part 6; Part 7; Part 8; Part 9; Part 10; Part 11; Part 12; Part 13; Part 14; Part 15; Part 16; Part 17; Part 18; Part 19; Part 20.

* * *

Empire. It is a word that most Americans loathe. After all, the United States was born through its rebellion against the great (British) empire of the day. American politicians, policymakers and the public alike have long preferred to imagine the U.S. instead as a beacon of freedom in the world, bringing light to those in the darkness of despotism. Europeans, not Americans, it is thought, had empires. Some version of this myth has pervaded the republic from its earliest colonial origins, and nothing could be further from the truth.

According to the old historical narrative, the U.S. has always been a democratic republic and only briefly dabbled (from 1898 to 1904) with outright imperialism. And, indeed, even in that era—in which the U.S. seized Puerto Rico, Guam, Hawaii and the Philippines—the U.S. saw itself as “liberating” the locals from Spanish despotism. This wasn’t real imperialism but rather, to use a term from the day, “benevolent assimilation.” Oh, what a gloriously American euphemism!

The truth, of course, is far more discomfiting. The U.S. was an empire before it had even gained its own independence. From the moment that Englishmen landed at Jamestown and Plymouth Rock, theirs was an imperial experiment. Native tribes were conquered and displaced westward, year in and year out, until there were no sovereign Indians left to fight. In 1848, the U.S. Army conquered northern Mexico and rechristened it the American Southwest. Yes, the U.S. was always an empire, what Thomas Jefferson self-consciously called an “Empire of Liberty.” Only the American Empire looked different from the British and Western European variety. Until 1898, the U.S. lacked the overseas possessions and expansive naval power that have come to define our contemporary image of empire. That was the British, French and Spanish model. No, the U.S. was a great land empire most similar (ironically) to that of Russia, but an empire nonetheless.

Still, there is something profound about 1898 and the years that followed. For it was in this era that the American people—and their leaders—became sick with the disease of overseas imperialism. With no Indians left to fight and no Mexican lands worth conquering, Americans looked abroad for new monsters to destroy and new lands to occupy. Britain and France were far too powerful and were not to be trifled with; but Spain, the deteriorating Spanish Empire in the Caribbean and Pacific, proved a tempting target. And so it was, through a brief—“splendid,” as it was described—little war with Spain, that the United States would annex foreign territories and join the European race for colonies.

1898 is central to our understanding of the United States’ contemporary role in the world, for it was at that moment that the peculiar, exceptionalist millenarianism of American idealism merged with the Western mission of “civilization.” The result was a more overt, distant and expansive version of American Empire. And, though the U.S. no longer officially “annexes” foreign territories, its neo-imperial foreign policy is alive and well, with U.S. military forces ensconced in some 800 bases in more than 80 countries—numbers that far exceed those of any other nation. Furthermore, the remnants of America’s first overseas conquests are with us today, as the people of Puerto Rico, Guam and American Samoa are still only partial Americans—citizens or nationals, yes, but without congressional representation or a vote in presidential elections. How ironic, indeed, that a nation founded in opposition to “taxation without representation” should, for more than 100 years now, hold so many of its people in a situation remarkably similar to that of the American colonists before the Revolutionary War.

In retrospect, then, 1898 represents both continuity with America’s imperial past and a bridge to its contemporary neo-imperial future. This era is key because it stands as a moment of no return: a pivot point at which the United States became a global empire. One can hardly understand contemporary interventions in Iraq and Afghanistan without a clear account of 1898 and what followed. The Spanish-American War and the occupation of the Philippines are two of America’s fundamental sins, and their consequences resonate in our ever uncertain present.

The Closing of the Frontier (1890)

In 1893, the distinguished American historian Frederick Jackson Turner, drawing on the 1890 U.S. census, declared in a widely read address that the American “frontier” was officially “closed.” He meant, of course, that there were no longer any uncharted Western lands to explore or Indian tribes to fight. The West was conquered and “civilized,” once and for all. According to Turner, westward expansion had defined American history and American values. “Civilizing” the West, through hardy individualism and strife, had forged the American soul. In his telling, which was very influential in its day, the “loss” of the frontier wasn’t necessarily a good thing; in fact, it had the potential to “soften” Americans and rot the foundation of the republic.

It was believed that without new lands to conquer, new space in which to expand, Americans would become a sedentary people riven with the same class divisions (and social conflict) infecting Europe. Furthermore, without new markets, how would American farmers and manufacturers maintain and improve their economic situation? The West was an idea, mostly, but it spoke to an inherently American trait: expansionism. Ours was a society of more: more land, more profits, more freedom, more growth. In a view widely held—then and now—the U.S. would die if it ever stopped expanding. From “sea to shining sea” wasn’t enough; no two oceans should hem in American markets, the American people or American ideals. This was, and is, the messianic nature of the American experiment, for better or worse.

Many citizens were riddled with anxiety about the “loss” of the West. This helps explain the widely popular phenomenon of Buffalo Bill Cody’s traveling “Wild West” shows, in which he paraded Indians around the cities of the American East and, eventually, around the world. Americans were transfixed at the sight of “savage” natives and “noble” cowboys and cavalrymen. For Americans of the 1890s, the West—and all it entailed—represented both freedom and virile masculinity. As more and more Americans moved to big cities and became factory laborers, many wondered whether American manhood itself was not in crisis. Those with the means (and the inherent insecurity), men like Theodore Roosevelt, the scion of a wealthy patrician New York family, made pilgrimages to Western ranches as though they represented the New Jerusalem. It is only thus that we have the image of this future American president, a city boy, adorned in Western attire. Such was the inherent unease of the times.

How to Sell an Unnecessary War: William Randolph Hearst and the Media-Militarist Conspiracy

This 1896 political cartoon from a Spanish newspaper shows a rapacious Uncle Sam reaching toward Cuba and other Spanish colonies in the Caribbean.

By 1898, the United States was bursting with energy, self-righteousness and anxiety. The only question was where all that expansionist energy would direct itself. It was then that a coalition of newspapermen and imperialist politicians provided a ready target: Cuba. Spain had, for many years, been engaged in a counterinsurgency campaign against Cuban rebels seeking independence. This would provide the opening that America’s burgeoning imperialists longed for. At the same time, none of this interest in Cuban affairs was new. Before the American Civil War, Southerners had repeatedly called for the annexation of Cuba as a new slave state.

Now, however, a conglomeration of powerful interests pushed for U.S. intervention on behalf of the Cubans. If that campaign resulted in the seizure of Cuba, well, then, all the better. Historians have long debated which factors or impulses were most responsible for America’s overseas expansion and intervention in Cuba. The reality, though, is that it was a confluence of interests that pushed the U.S. toward war with Spain. Corporate capitalists sought new markets for their goods; missionaries dreamed of Christianizing and “civilizing” foreign peoples; naval strategists coveted bases and coaling stations to project power across the seas; expansionist politicians—prominent among them Theodore Roosevelt and Sen. Henry Cabot Lodge—believed the U.S. had a mission to expand in order to salvage the virility of the republic; and the sensationalist newspapermen of the “yellow press,” led by William Randolph Hearst, desired nothing more than to sell papers and turn a profit—and the best way to do that was to report, and exaggerate, Spanish atrocities and drum up a new, popular war. War sells, after all.

The key triumvirate, however, was the alliance between Assistant Secretary of the Navy Roosevelt, Massachusetts Sen. Lodge and newspaper magnate Hearst. Lodge, for one, genuinely hoped for some crisis to precipitate war with Spain. In 1898, he wrote to a friend, “There may be an explosion any day in Cuba which would settle a great many things.” How right he was! First, an intercepted letter from the Spanish minister in Washington was found to contain unflattering references to President William McKinley. Hearst’s papers exaggerated the story, with his New York Journal running the headline, “WORST INSULT TO THE UNITED STATES IN ITS HISTORY.” This came on top of several years of stories in which the Journal writers whipped up chauvinist support for war with Spain.

Then, fatefully, on Feb. 15, 1898, an American naval vessel, the USS Maine, exploded in Havana harbor, killing 258 sailors. Without the slightest pause for an investigation, a Hearst headline proclaimed “DESTRUCTION OF THE WARSHIP MAINE WAS THE WORK OF AN ENEMY.” It wasn’t, and experts confirmed later that the explosion was accidental. Even at the time, several policymakers and experts suspected the Maine had fallen victim to a fluke tragedy. The secretary of the Navy wrote that the explosion was “probably the result of an accident”; furthermore, the country’s principal expert on maritime explosions—a professor at the Naval Academy—concluded that “no torpedo such as is known in modern warfare can of itself cause an explosion as powerful as that which destroyed the Maine.” It hardly mattered. The explosion of the Maine provided the casus belli for a nation ready for war.

Crowds gathered to protest at the Spanish Embassy; effigies of Spaniards were burned. Hearst, the newspaperman who had long sought war, cabled to one of his correspondents that “Maine is a great thing.” President McKinley—who had seen the horror of war at the Battle of Antietam—was initially hesitant to rush into action, but he quickly bowed to the pressure of a militaristic public and Congress. He, without international legal sanction, insisted that Spain give up possession of its “ever-faithful isle.” The president must have known, of course, that Spain could never bow to such a demand and still maintain its global prestige. Then, on April 11, McKinley delivered a message to Congress arguing that the U.S. must intervene in Cuba not simply as a result of the Maine explosion, but as a humanitarian intervention on behalf of the embattled Cubans. As historian Stephen Kinzer has written, McKinley thus “became the first American president to threaten war against another country because it was mistreating its own subjects.” He would not be the last.

Spain declared war on the U.S. on April 24, and Washington issued a declaration the next day. The military conflict was to last less than four months, ending in a decisive American victory over an empire long past its prime. Secretary of State John Hay called it a “splendid little war,” and, indeed, it was by some measures the most popular war in American history. War fever infected the American people. The French ambassador observed that a “sort of bellicose fever has seized the American nation”; the London Times called it “the delirium of war”; a German newspaper described it as a “lust for conquest.”

Seeking martial glory, Roosevelt resigned his position as assistant Navy secretary and raised a regiment of volunteer cavalry, “the Rough Riders.” He would take it to Cuba as part of the hastily formed American expeditionary force seeking to “liberate” the island. Roosevelt found the combat he so desired when his regiment bravely charged to victory in the Battle of San Juan Hill (which was actually fought on nearby Kettle Hill and involved the often-forgotten help of the professional black 9th and 10th Cavalry regiments). Old Teddy was as giddy as a schoolboy, shouting at the height of the battle: “Holy Godfrey, what fun!” He would later call the battle “the great day of my life.” After the battle, Roosevelt annoyed his professional military peers by shamelessly (and uncouthly) lobbying for a Medal of Honor for himself (President Bill Clinton would finally bestow the award in 2001, more than 80 years after Roosevelt’s death).

The war was far from glorious. The Spanish were dislodged from Guam, the Philippines, Puerto Rico and Cuba, but deaths from disease outnumbered U.S. battle deaths by some eight to one. Few Americans cared about this fact, so caught up were they in the martial fever of the day.

In early 1899, the U.S. Senate would, by a narrow margin, ratify a treaty in which Spain ceded Guam, Puerto Rico and the Philippines to America. This moment was, indeed, a point of no return—the instant that the U.S. became an overseas empire. Cuba technically received independence but, under Congress’ Platt Amendment, became essentially a U.S. protectorate; Washington retained the right to intervene at will in Cuban affairs.

And what of the Cubans themselves, on whose behalf the war was supposedly fought? U.S. military and political personnel were, upon arriving on the island, surprised to learn that a significant portion of the population and the rebels were black. After all, the last thing the U.S. of 1898 wanted was an independent black republic on its southern shores. Furthermore, when it turned out the Cuban revolutionaries had expansive social reformist aims beyond independence, Washington was even less apt to grant full independence. Gen. Leonard Wood (a U.S. Army fort is named for him in Missouri), the military governor of Cuba, argued that the U.S. should maintain an indefinite occupation of the island “while saying as little as possible about the whole thing.” Wood was eventually pleased by the text of the Platt Amendment, stating, “There is, of course, little or no independence left Cuba under [the amendment].” This all cohered with Wood’s worldview. He considered the Cubans “as ignorant as children,” and sought to choose their first president.

The Spanish-American War also served another purpose for Americans. The conflict, it was said, would heal the divisions of the Civil War and unite the nation behind a “noble” cause. Newspapers bristled with stories of former Union and Confederate veterans serving together in the American Army in Cuba and the Philippines. In one famous anecdote, the former Confederate Gen. Joseph “Fighting Joe” Wheeler—now an old man—led a charge and seemingly forgot whom exactly he was fighting, rallying his men with the cry “Let’s go, boys! We’ve got the damn Yankees on the run again!” It seemed the Spanish-American War was all things for all people, except, of course, the Spaniards and the natives of the former colonies.

After the victory, the Americans’ goals became ever more expansive. A war waged for Cuba turned into a war of conquest as the U.S. seized the Spanish colonies of Guam, Puerto Rico and the Philippines, as well as—for good measure—the independent island nation of Hawaii (which American sugar interests had long coveted for the American market). In reference to Hawaii, McKinley declared, “We need Hawaii just as much and a good deal more than we did California. It is manifest destiny.” And so it was.

Fighting for American Manhood

Modern historians continue to grapple with the puzzle of America’s leap into the colonial land grab in 1898. What prompted the sudden bellicosity of American military might? What drove the spirit of the populace to cheer on the war? As usual, there is no simple answer. This much, however, seems certain: The answers to these questions are as much cultural as political. Indeed, one factor that seemingly drove the rush to war was a prevailing American insecurity about the citizens’ collective manhood and masculinity. The historian Jackson Lears, in fact, has persuasively argued that “imperialists deployed a mystical language of evolutionary progress … celebrating the renewal of masculine will and equating it with personal regeneration.”

Why all this gender insecurity? Well, the nation had, with the exception of several small Indian wars fought by the regular Army, been at peace since 1865. The younger generation looked up to the martial exploits of their Civil War veteran fathers. The elders feared that the nation’s youths, for lack of military service and without a Western frontier to conquer, were growing soft. Fewer and fewer Americans of the late 19th century did backbreaking farm work in the fields or ranches of the West as the population shifted toward unskilled “soft” labor in the cities of the East and Midwest.

In this climate of insecurity and toxic masculinity, many Americans and their public leaders began to believe the U.S. needed a war to rejuvenate the population and retrieve America’s collective masculinity. As early as 1895, Theodore Roosevelt—the poster boy for masculine self-consciousness—declared that he “[s]hould welcome almost any war, for I think this country needs one.” Because many women, such as the famed social activist Jane Addams, were or would soon be dissenting anti-imperialists, the expansionists depicted their opponents as lacking what Roosevelt declared “the essential manliness of the American character.” Furthermore, pro-imperialist political cartoons often depicted their opponents wearing women’s clothing.

This image from the U.S. Military Academy yearbook of 1924 suggests the self-conscious sexuality and homoeroticism inherent in American warfare, especially in the imperialist adventures of the previous generation.

In perhaps his most famous speech, “The Strenuous Life,” Roosevelt referred to America’s mission in pacifying the now rebellious Filipinos as “man’s work.” The speech was littered with sociosexual language, such as his consistent exhortations that Americans must not “shrink” from their duties, and it argued that anti-imperialists harbored an “unwillingness to play the part of men.” In another speech, in Boston, Roosevelt stated, “We have got to put down the [Philippine] insurrection! If we are men, we can’t do otherwise.” Of course, gender roles and masculine insecurity alone cannot explain the drive for colonies and military expansion; neither, though, can we discount their role in propelling the nation forward into war and conquest.

White Man’s Burden: Race and Empire

Take up the White Man’s burden,
The savage wars of peace—
Fill full the mouth of Famine
And bid the sickness cease. …

Take up the White Man’s burden,
Ye dare not stoop to less. …
By all ye cry or whisper,
By all ye leave or do,
The silent, sullen peoples
Shall weigh your gods and you.

Take up the White Man’s burden,
Have done with childish days. …
Comes now, to search your manhood. …

—An excerpt from the Englishman Rudyard Kipling’s poem “The White Man’s Burden,” an inducement for the United States to occupy the Philippine Islands and join the other imperialist nations of Europe.

Racism is the original sin of the American experiment. White supremacy was part of the cultural baggage American troops carried abroad. The scourge of race did not stop at our shores. Moreover, it was a global phenomenon; this was the era of social Darwinism, the notion that “survival of the fittest” applied to man as well as beast, that certain races were scientifically superior to others. It was all snake oil, of course, but it was a predominant ideology—especially since, well, the “higher-level” white race wrote the books and carried the most advanced weapons. It was thus that racism, along with masculinity, would drive American expansionist imperialism at the turn of the 20th century.

The war with Spain and the much longer conflict with the Filipino rebels occurred at the height of racial violence in the American South. Lynching of blacks reached horrific proportions, producing what the author (and later anti-imperialist) Mark Twain described as “an epidemic of bloody insanities.” By one estimate, in the period surrounding the start of the 20th century someone in the South was hanged or burned alive on average once every four days. Racism infected the populace and policymakers on both sides of the Mason-Dixon Line. And that disease would frame America’s new wars, which, by no accident, were waged against brown folks. The language of this imperial era, and the racialized ideology so prevalent in American society, pervaded and justified America’s wars, suppressions and annexations.

Before the wars even began, men like Roosevelt argued that, indeed, the U.S. had a racial obligation to get into the imperial game. He wrote, in 1897, that he felt “a good deal disheartened at the queer lack of imperial instinct our people show … [it would seem] we have lost, or wholly lack, the masterful impulse which alone can make a race great.” Later, as governor of New York, Roosevelt—who dedicated a peculiar amount of his attention to international rather than state affairs—declared that the U.S. had a “mighty mission” and that it needed a “knowledge of [our] new duties.” Where the American flag once flew [in Cuba and the Philippines] “there must and shall be no return to tyranny or savagery.”

After the U.S. seized the Philippines from Spain, a long legislative debate ensued over just what to do with the islands: Should they be granted independence or held as a colony? On the floor of the Senate, the influential Indiana Republican Albert Beveridge summarized the majority opinion. The Filipinos, because of their race, couldn’t possibly govern themselves. “How could they?” he exclaimed, “They are not a self-governing race. They are Orientals.” Later, back in Indiana, Beveridge questioned how anyone could oppose the “mission” of American imperialism. After all, he argued, “The rule of liberty … applies only to those who are capable of self-government. We govern Indians without their consent. … We govern children without their consent.” Coarse though his language was, at least Beveridge was articulating a consistent truth: Americans did have a long history of selectively applying civil rights, regularly denying them to blacks and natives. Why not, then, deny such freedoms to “Orientals”?

Other interest groups agreed with the racialized framing of America’s role in the world. Missionaries, for example, flocked to the Philippines to “Christianize” the natives—apparently, and ironically, unaware that most Filipinos were already Christian (Roman Catholic). American soldiers also used racist language to address the tough counterinsurgencies they found themselves engrossed in, and to label and dehumanize their enemies. Just before open warfare broke out between American troops and Filipino rebels in the capital of Manila, one U.S. trooper wrote, “Where these sassy niggers used to greet us daily with a pleasant smile … they now pass by with menacing looks.” It was, indeed, remarkable how quickly the pejoratives long applied to African-Americans were retooled for America’s new Asian subjects.

When fighting did break out in the Philippines, the soldier who fired the first shots ran back to his lines and yelled, “Line up, fellows, the niggers are in here, all through!” Years later, another American soldier wrote home from the Philippines that “I am growing hardhearted, for I am in my glory when I can sight my gun on some dark skin and pull the trigger.” American soldiers and officers—often veterans of the Native American wars of the last century—also took to mixing metaphors when describing their Filipino opponents. Gen. Elwell Otis urged Filipinos in his district to “be good Indians.” Gen. Frederick Funston (for whom a military camp is named in Kansas) considered Filipinos “a semi-savage people.” Theodore Roosevelt took to calling Filipino insurgents “Apache or Comanche,” or otherwise “Chinese half-breeds” or “Malay bandits.”

In another twist of irony, many of the Army regiments engaged in combat in the Philippines consisted of black enlisted men. Often more sympathetic to the locals, these African-American troopers recognized how racism alienated and inflamed the Filipino population. One black soldier, B.D. Flower, wrote home in 1902, “Almost without exception, soldiers and also many officers refer to natives in their presence as ‘Niggers’ … and we are daily making permanent enemies. …” Analogous situations exist in America’s contemporary occupations in Iraq and Afghanistan. Arabs are often called “camel jockeys,” “rag heads” or “sand niggers.” The temptation to lump the enemy together as an inhuman and often racialized “other” is a comfortable mental heuristic, but it all too often only empowers and spreads rebellion. It is a lesson that this author lamentably learned in Baghdad and Kandahar, and that U.S. Army soldiers learned in Manila more than a century ago.

Nor was it just missionaries and soldiers who employed racial rhetoric to justify the annexation of new colonies and subjugation of the Filipino rebel movement. An editorial in the Philadelphia Ledger opined, “It is not civilized warfare, but we are not dealing with a civilized people. The only thing that they know is fear and force, violence and brutality, and we are giving it to them. …” Senior politicians also used racist and pejorative language. President McKinley referred to “misguided Filipinos” who simply couldn’t recognize that the U.S. acted “under the providence of God and in the name of human progress and civilization.” In sum, the United States had a racial, religious and civilizational duty to “benevolently assimilate” those the civilian governor (and future U.S. president) of the Philippines, William Howard Taft, patronizingly called “our little brown brothers.”

From the poetry of the day to the crass language of the common soldier to the rhetoric of the missionary to the proclamations of senior politicians, race infected the words and ideas of American imperialists. Clad in the armor of white supremacy, American fighting men and policymakers would, in the conflict that followed in the Philippines, wage war with a savagery they would never have applied to a white European enemy.

Quagmire and Atrocity: The Philippine-American War

“No imperial designs lurk in the American mind. They are alien to American sentiment. … Our priceless principles undergo no change under a tropical sun.” —President William McKinley, speaking of the Philippines in 1899

It has long been labeled, inaccurately, the “Philippine Insurrection” rather than the Philippine-American War, and it has been almost lost to history. Few Americans today even recall what is actually best described as a long-running Filipino rebellion waged in quest of independence. In a cruel irony, it was to be the United States—forged in opposition to empire and occupation—that would now play King George as the Filipinos struggled for independence.

There was nothing inevitable about the war in the Philippines. Sure, the island chain was a Spanish possession, but given that the war of 1898 was waged allegedly over Cuba, nothing stipulated that the U.S. had to invade and occupy the Philippines. Here again, Roosevelt was front and center. Without consulting his boss or the president, Assistant Secretary of the Navy Roosevelt issued pre-emptive orders to Commodore George Dewey’s Asiatic Squadron to sail to Manila and sink the Spanish ships there in the event of an outbreak of war. War began and Dewey followed orders. The result was a massacre. The better-equipped American warships outranged the Spanish vessels and inflicted 381 casualties while suffering only six wounded. Even then, with the Spanish fleet at the bottom of the harbor, nothing preordained the American ground occupation of the islands, but a sort of militaristic inertia ensured that McKinley would indeed sail an army to Manila to take control of the archipelago.

McKinley, true to his honest nature, later admitted that when he heard of Dewey’s victory at Manila he “could not have told you where those darned islands were within a thousand miles.” Presidential ignorance aside, before a significant land force could reinforce Dewey, the naval commander sought all the help he could get in defeating the Spanish garrison. Dewey went so far as to sail the Filipino rebel leader Emilio Aguinaldo—the Filipinos had been in the midst of an independence struggle with the Spanish when the Americans arrived—from Hong Kong to Manila, hoping Aguinaldo’s rebels would reinforce American efforts on the islands. Aguinaldo believed he and Dewey had a deal: that once the combined American-Filipino force liberated the islands, the U.S. would recognize Philippine independence. It was not to be.

In the end, when the Spanish garrison surrendered Manila, Aguinaldo was not even invited to the ceremony. It was then, under pressure from expansionists in McKinley’s own party, that the U.S. president had what he described as a “divine intervention” instructing him to annex the Philippine Islands. Struck by a sudden urge as he walked the corridors of the White House on the night of Oct. 24, 1898, he fell to his knees “and prayed Almighty God for light and guidance,” according to McKinley. Spoiler alert: God told him to seize the Philippines. Later he would declare that “there was nothing left for us to do but to take them all, and to educate the Filipinos, and uplift and civilize and Christianize them by God’s grace.” (As previously noted, most of these pagans who required Christianization were already Roman Catholics!) Interestingly, this was not the only militaristic divine intervention in U.S. presidential history. Before the 2003 invasion of Iraq, then-President George W. Bush reportedly declared that God had told him to “end the tyranny in Iraq.” In both cases God seems to have saddled Americans with dirty, difficult tasks. (Well, he is known to work in mysterious ways. …)

At the start of 1899, McKinley imposed official military rule over the Philippines. Aguinaldo, who led his own army, one that was then staring across the lines at the American Army, could never accept this arrangement. He declared, “My nation cannot remain indifferent in view of such a violation and aggressive seizure of its territory by a nation [the U.S.] which has arrogated to itself the title, ‘champion of oppressed races.’ … My government is disposed to open hostilities.” Before the fighting kicked off, however, the Filipinos, following in the footsteps of the American colonists, elected a congress and wrote a constitution that drew from the examples of Belgium, France, Mexico and Brazil. Washington ignored this impressively democratic turn of events.

The war began when sentries from the two opposing armies fired upon each other on Feb. 4, 1899. The day ended badly for the Filipinos. The better-armed and better-trained American Army implemented a prepared plan of attack as soon as the first shots were fired, and by day’s end 3,000 Filipinos lay dead, in contrast with 60 American fatalities. Within weeks, thousands more Filipino troops and civilians were killed. The anti-imperialist American Sen. Eugene Hale then declared in Washington, accurately, “More Filipinos have been killed by the guns of our army and navy than were patriots killed in any six battles of the Revolutionary War.”

U.S. soldiers torture a Filipino in 1901 with the “water cure,” a form of what is now called waterboarding.

After Aguinaldo’s conventional army was mostly defeated, the archipelago settled into years of guerrilla warfare between the U.S. Army and assorted local rebels (or freedom fighters, depending on one’s point of view). As the war turned into an insurgency, the brutality of both sides—but especially of the Americans—intensified. U.S. soldiers, seeking to gather tactical information from captured insurgents, took to administering the “water cure,” a crude form of waterboarding that dates back to the Spanish Inquisition in the 16th century. A victim was held to the ground and force-fed water; then his tormentors would stomp on his stomach and repeat the process. Most victims died. A form of this torture would later be employed by the U.S. at Guantanamo Bay and various secret prisons during the so-called “war on terror.”

A private wrote in a letter published in a newspaper that after an American soldier was found mutilated, Gen. Loyd Wheaton ordered his forces “to burn the town and kill every native in sight, which was done.” By 1901, Secretary of War Elihu Root had formalized the brutality of the war, telling reporters that from then on the U.S. Army would follow a “more rigid policy” in the Philippines. One reporter from a New York magazine, The Outlook, went to see this rigid policy for himself. He wrote back a horrifying description of American counterinsurgency. “In some of our dealings with the Filipinos we seem to be following more or less … the example of Spain. We have established a penal colony; we have burned native villages … we resort to torture as a means of obtaining information.” One general, James Franklin Bell, told a reporter that after two years of war “one-sixth of [the main island] of Luzon’s population had either been killed or died of disease”—which would have amounted to more than half a million people. Bell was awarded the Medal of Honor for his efforts.

A reporter from the Philadelphia Ledger observed, “Our men have been relentless, have killed to exterminate men, women, children, prisoners and captives … lads of ten and up, the idea prevailing that the Filipino, as such, was little better than a dog.”

Reports of high numbers of prisoner executions appear credible. By the summer of 1901, casualty figures showed that five times as many Filipinos were being killed as wounded—the opposite of what is normally seen in wars. Gen. Arthur MacArthur, senior commander in the Philippines and father of the future Gen. Douglas MacArthur, admitted that his men were indeed under orders to use “very drastic tactics.” That seems an understatement. Nor was American military violence the only threat to the Filipinos. Around the same time, a cholera epidemic killed over 100,000 people. America’s brand of “freedom” came at a high price for the Filipino population.

By late 1901, with the insurgency all but defeated, many Americans had begun to lose interest in the war. Then, on Sept. 28, Filipino rebels on the distant Philippine island of Samar surprised and killed a high percentage of a U.S. Army company, mostly with machetes. Roughly 50 Americans were slain outright or mortally wounded. Labeled by the press as the “Balangiga Massacre,” it was immediately compared (inaccurately) to Custer’s Last Stand and The Alamo. The real controversy, however, erupted after Brig. Gen. Jacob “Hell-Roaring Jake” Smith, a 62-year-old vet of the Indian Wars, was sent to pacify Samar.

Reports of extreme abuses and alleged war crimes immediately arrived back home. This time the Congress had little choice but to conduct a pro forma investigation. During congressional hearings, a U.S. Army major testified that Gen. Smith had told him: “I want no prisoners. I wish you to kill and burn. The more you kill and burn, the better you will please me. I want all persons killed who are capable of bearing arms.” When the major asked for an age guideline, Smith allegedly replied “10 years.” Smith, called to the hearings, eventually admitted to all this. He was court-martialed but served not a day in prison. His punishment was a reprimand from the secretary of war, with the leniency being justified on the grounds that Smith was driven to crime by “cruel and barbarous savages.” For another American general, Frederick Funston, even the reprimand of Smith was too harsh. Funston freely admitted in a speech that he “personally strung up 35 Filipinos without trial, so what’s all the fuss over [Smith] dispatching a few treacherous savages?” Asked how he felt about the growing anti-imperialist movement in America, Funston declared that those harboring such sentiments “should be dragged out of their homes and lynched.” Reading of this interview, the avowed anti-imperialist Mark Twain volunteered to be the first man lynched.

The final major campaign occurred on southern Luzon in 1902. Gen. James Franklin Bell removed natives from villages and placed them in concentration camps; crops were burned and livestock was killed; a random Filipino was selected for execution each time an American soldier was killed in combat (a certain war crime even by the standards of the day); and an American decree made it “a crime for any Filipino to advocate independence.” In three months, 50,000 locals were killed. The war was effectively over, though short spurts of violence and rebellion would occur occasionally for another decade. Untold hundreds of thousands of Filipinos were dead. The water buffalo, the key to rural life in the region, had been made nearly extinct, its numbers diminished by some 90 percent. Indeed, as historian Stephen Kinzer disturbingly noted, “Far more Filipinos were killed or died as a result of mistreatment [over four years] than in three and a half centuries of Spanish rule.” This, it appears, was the price of American “liberty”—and the islands would not receive genuine independence until after World War II!

For the Soul of America: The (Mostly) Noble Anti-Imperialist Movement

In this 1899 political cartoon from The American Sentinel titled “The New Temptation on the Mount,” the devil of despotism and imperialism tempts Lady Liberty with the spoils of overseas conquest.

For all the villains in this story, there were Americans willing to dissent against overseas conquest and imperialism. Indeed, they were a large, diverse and sometimes peculiar lot. They are, too, the heroes of the era. For the most part, that is. From the very start of the Philippine occupation, many prominent citizens publicly opposed the war. This coalition of intellectuals, politicians, artists and businessmen may have acceded to the conquest of native and Mexican lands but saw imperial expansion overseas as un-American and unconstitutional. Throughout the era they made their voices heard and fought for the soul of the nation.

Early critics of the war pointed out the hypocrisy of fighting for Cuban rights when African-Americans at home were still regularly lynched and disenfranchised. A dozen prominent New Yorkers raised the alarm in a public letter before the war with Spain, proclaiming, “The cruelty exhibited in Cuba is no peculiarity of the Spanish race; within the last few weeks instances of cruelty to Negroes occurred in this country which equal, if not surpass, anything which has occurred in Cuba. … Our crusade in this matter should begin at home.” The most prominent black leader of the era, Booker T. Washington, raised a similar concern in a speech after the Spanish surrender. After praising the heroic efforts of the troops, he called for America to heal racial wounds on the domestic front. He argued, “Until we conquer ourselves, I make no empty statement when I say we shall have, especially in the southern part of our country, a cancer gnawing at the heart of the republic.”

It was, however, the annexation of the Philippines that truly kicked off a dissenting movement in the United States. Skeptics across the spectrum of public life would form the Anti-Imperialist League, which, at its height, had hundreds of thousands of members—one of the largest anti-war movements in American history and an impressive achievement in a period of such intense martial fervor. The leaders of the movement included Democratic Party stalwart William Jennings Bryan, the magnate Andrew Carnegie (who offered to buy the Philippines from the U.S. government in order to set the islands free!), the social activist Jane Addams, the labor organizer Samuel Gompers, the civil rights leader Booker T. Washington, former President Grover Cleveland, former President Benjamin Harrison and the famed author Mark Twain. What the members of this diverse group had in common was a profound sense that imperialism was antithetical to the idea of America.

Bryan, one of the great orators of the day, summarized this notion when he proclaimed that “the imperialistic idea is directly antagonistic to the idea and ideals which have been cherished by the American people since the signing of the Declaration of Independence.” The politician and Civil War veteran Carl Schurz compared the Filipino rebels favorably with the colonial patriots and asked what Americans would do if the natives refused to submit—“Let soldiers marching under the Stars and Stripes shoot them down? Shoot them down because they stand up for their independence?” Of course, that is exactly what the U.S. Army would do, under orders from the president himself.

The Anti-Imperialist League won many moral but few practical victories. Part of the reason for this was the U.S. government’s overt suppression of civil liberties. Famously, in what became known as the “mail war,” the postmaster general ordered anti-imperialist literature mailed to soldiers in the Philippines to be confiscated. Critics of American foreign policy called it the “rape of the mail.” Practically thwarted, artists and cultural critics took the anti-imperial fight to the public. The most prominent and outspoken was Mark Twain, and this, more than his famous books, marked the man’s finest hour. He announced his stand in late 1900, stating, “I have seen that we do not intend to free, but to subjugate the people of the Philippines. We have gone there to conquer, not to redeem. … And so I am an anti-imperialist.” Twain only lashed out harder as the war went on. By 1901, he declared that “we have debauched America’s honor and blackened her face” and recommended the Stars and Stripes be changed: “We can just have our usual flag, with the white stripes painted black and the stars replaced by the skull and cross-bones.” Some called it treason, others patriotism.

Though the anti-imperialists might appear to be saints, there was a dark element in the movement. Many dissenters’ opposition to annexation of foreign lands came not from a moral code but from fear of the racial amalgamation that might result. Some of these men were anti-imperialist senators from the South. One, Sen. Ben Tillman of South Carolina, summarized this viewpoint, concluding, “You are undertaking to annex and make a component part of this Government islands inhabited by tens of millions of the colored race … barbarians of the lowest type.” Furthermore, he stated, “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.” This was far from the language of liberty, but it remained embarrassingly common in the movement.

This offensive component aside, eventually, and remarkably, genuine anti-imperialist sentiments made it into the official platform of the Democrats, one of the two mainstream political parties. Imagine a major party platform, even today, declaring: “We oppose militarism. It means conquest abroad and intimidation and oppression at home. It means the strong arm which has been ever fatal to free institutions.” It was a noble platform, indeed. But, ultimately, these sentiments and this party lost. Theodore Roosevelt, the national cheerleader of imperialism, easily retained the presidency in the election of 1904 (he had risen from vice president to the presidency when McKinley was assassinated in 1901). In a sense, this marked the death knell of an era of anti-imperialism. There had been, in the election, a referendum on the nature of the national soul, and, sadly, the American people chose war, conquest and annexation.

* * *

This era remains with us; it is alive in our debates and politics. Consider this: Even now, citizens of Puerto Rico, Guam and Samoa have no representation in Congress or a vote in presidential elections. The status of these territories and their populations is peculiar for a nation that so strongly professes democracy. The situation is a direct result of decisions made in 1898-1904. In 1901, the Supreme Court, by a vote of 5 to 4, ruled in Downes v. Bidwell that “the Constitution does not apply” to the territories because the islands were “inhabited by alien races.” This verdict, one among what are called the “insular cases,” remains essentially intact to this day.

Another legacy of the era was the rapid expansion of executive, presidential power. McKinley became the first president to, according to historian Stephen Kinzer, “send a large force to a country with which the United States was not at war,” when, in 1900, he dispatched 5,000 troops from the Philippines to help suppress the nationalist Boxer Rebellion in China. One could plausibly argue that this was the birth of what is still known as “presidential war power.” It is because of this precedent that American soldiers fight one undeclared war after another across the Middle East. Between 1898 and 1904, the American people—living in a somewhat democratic country (for white men, at least)—made a series of choices about what, exactly, the United States was to be. Mark Twain begged the populace to choose liberty; Roosevelt urged expansion and power. The citizenry made its fateful choice, for better or worse.

We live still in the shadow of 1898. The choice between republic and empire still lies before us.

* * *

To learn more about this topic, consider the following scholarly works:
• Stephen Kinzer, “The True Flag: Theodore Roosevelt, Mark Twain, and the Birth of American Empire” (2017).
• Jackson Lears, “Rebirth of a Nation: The Making of Modern America, 1877-1920” (2009).
• Jill Lepore, “These Truths: A History of the United States” (2018).

Maj. Danny Sjursen, a regular contributor to Truthdig, is a U.S. Army officer and former history instructor at West Point. He served tours with reconnaissance units in Iraq and Afghanistan. He has written a memoir and critical analysis of the Iraq War, “Ghost Riders of Baghdad: Soldiers, Civilians, and the Myth of the Surge.” He lives with his wife and four sons in Lawrence, Kan. Follow him on Twitter at @SkepticalVet and check out his new podcast, “Fortress on a Hill,” co-hosted with fellow vet Chris “Henri” Henrikson.

The views expressed in this article are those of the author, expressed in an unofficial capacity, and do not reflect the official policy or position of the Department of the Army, Department of Defense, or the U.S. government.


Has Iraq Become Another ‘Lesson Lost’ Like Vietnam?


This piece originally appeared on The American Conservative.

According to reports, the Army has delayed the publication of a 1,300-page internal Iraq war study commissioned by General Ray Odierno in 2013. The volume, which few in the public were even aware of, was an admirable project. After all, the U.S. military famously ignored and jettisoned any lessons after its defeat in Vietnam. Most of us would agree that simply can’t happen again.

So why the delay? Some fear the Army might be hesitant to publish a study that takes its leadership to task for decisions critical to the execution, and perhaps outcome, of the war. (Basically, while the Army says it wants to learn its lessons, it doesn’t necessarily want to see them in black and white.) One chief Army historian claimed it would “air” too much institutional “dirty laundry.”

Indeed, retired Colonel Frank Sobchack, a study team director, expressed concern about the delay in the report’s release, asserting “that the Army was paralyzed with apprehension for the past two years over publishing it leaves me disappointed with the institution to which I dedicated my adult life.”

Of course, there has been skepticism about the report itself, given the commissioner of the project and the composition of the study team. Some fear the conclusions will skew towards a one-sided lionization of the 2007 “surge”—which General Odierno and his closest subordinates oversaw. In fact, reporting in The Wall Street Journal suggests that the report credits Odierno and General Petraeus, who commanded all U.S. and coalition forces in Iraq at the time, for turning the war around by shifting to a theater-wide counterinsurgency strategy (COIN).

These are all legitimate concerns. Indeed, this author has long sought to debunk the flawed notion that Petraeus’s famed “surge” achieved anything more than a temporary pause in violence and political instability. Still, the real problem with this report is that it completely ignores the utterly flawed grand strategy that brought the U.S. military into Iraq in the first place in favor of focusing on comparatively minor tactical mistakes.

Let us review, then, some of the reported conclusions in the study and how they’re disconnected from the larger strategic failures inherent in the entire American military adventure in Iraq.

The report admits to the following shortfalls:

  • More troops were needed to occupy the country and fight an expansive insurgency. This was undoubtedly the case, but it fails to consider whether America even had such troops available in its volunteer force, and whether an all-hands-on-deck effort to pacify Iraq was the best strategic use of its limited military machine.
  • The failure to deter Iran and Syria, which gave sanctuary and support to Shiite and Sunni militants, respectively. True enough, but how exactly—short of an expanded regional war—could the U.S. have hoped to stop this? Iraq is in Syria’s and Iran’s neighborhood, just as the Caribbean is in ours. How could Washington not expect Syria and Iran to meddle so close to home?
  • Coalition warfare wasn’t successful: the deployment of allied troops had political value but was “largely unsuccessful” because the allies didn’t send enough troops. This ignores the reasons why so few countries sent substantial troops to our aid in Iraq. It was because they considered such an invasion ill-advised, illegal, and, in many cases, immoral. Perhaps the U.S. should have listened carefully to its long-standing friends.
  • The failure to develop self-reliant Iraqi forces. Well, of course. But if eight years (2003 to 2011) of training and funding a new Iraqi army wasn’t enough to make them self-sufficient, and if 17 years hasn’t been enough to do the same in Afghanistan, might not the entire theory of America’s ongoing “advise & assist” missions need to be rethought?
  • An ineffective detainee policy: the U.S. decided at the outset not to treat captured insurgents and militia fighters as prisoners of war, and many Sunni insurgents were returned to the battlefield. This ignores Guantanamo, and, most likely, the national-level global detainee policy of the U.S. Perhaps indefinite detention of suspected insurgents or “terrorists,” without any recourse to due process, created more enemies than it imprisoned. Think Abu Ghraib.
  • Democracy doesn’t necessarily bring stability: U.S. commanders believed the 2005 Iraqi elections would have a “calming effect,” but instead they exacerbated ethnic and sectarian tensions. A more holistic analysis would question the very capacity of a foreign military occupation force to impose democracy in an ancient locale at war with itself.

There’s also a personal connection here. The leader of the study team was Colonel Joel Rayburn. At first glance, no one is more qualified. Rayburn is brilliant and someone I hold in high esteem. He taught me British History at West Point, performed quite well on a popular TV trivia show, and wrote an interesting book on Iraq. Then again, Colonel Rayburn was deeply involved in the planning and execution of the famed “surge,” and is rather likely to glorify a military campaign (one this author fought in) that ultimately failed in its purpose—to stifle violence long enough to stabilize and form an inclusive Iraqi government.

Sure, violence drastically, if temporarily, decreased, but that was mainly due to a short-term alliance with former Sunni insurgents (many of whom had American blood on their hands). In the end, all that the roughly 1,300 U.S. troops killed during the surge achieved was the long-term entrenchment of a Shiite chauvinist prime minister, Nouri al-Maliki. His corrupt government in Baghdad alienated the very Sunnis once on America’s payroll and caused a new outbreak of sectarian violence. Many of those Sunnis later allied with or joined ISIS, seeing the group as their best protection. Looking back, that’s far from an encouraging outcome for a “surge” in which so many American servicemen were killed.

A truly expansive history (or study) of the Iraq war (perhaps best commissioned by a government agency besides the military) would admit to the much broader failures of the U.S. adventure in Iraq. It would certainly discuss the tactical and operational failures included in the current report, but it would also focus on the paucity of American grand strategy. Such a study would question whether external military intervention is even capable of reordering and stabilizing ancient societies. It would include an honest cost-benefit analysis and ask whether the proverbial juice was worth the squeeze in Iraq. Did that war make America safer? Unlikely. Was it necessary? Undoubtedly not. These are the true takeaways from what will someday be remembered as one of America’s great foreign policy debacles.

The safe bet is that little to none of this will be in the report, when—or if—it ever sees the light of day. We shall see, of course, but this much is certain: an entire generation of American troops dedicated the greater part of the last 15 years of their lives to the war in Iraq. I left four soldiers and the remnants of my emotional health in Baghdad. Others suffered far worse—notably average Iraqis. We deserve a comprehensive, honest, and critical analysis of that debacle.

Whether we ever get one remains to be seen.


America’s Wars Are a Non-Factor in the Midterm Elections


This piece originally appeared on antiwar.org.

The United States military is actively fighting in seven Muslim-majority countries, and no one cares. As Americans go to the polls today in a ritual pretense of democracy, they will vote for one of the two major political parties on issues ranging from healthcare to immigration to the basic personality of President Donald Trump. The mainstream cable news networks – from “liberal” MSNBC to “conservative” Fox News – have reported on little else for the last several months. The whole charade is little more than politics-as-entertainment, like some popular sporting event in which the opposing sides wave the flag for the blue team or the red team.

For weeks now, my television, and yours, has been saturated with political commercials for and against local legislative candidates. Some are attack ads focused on corruption and the supposed left- or right-wing extremism of the opposing candidate. Others center on taxes, healthcare, and the ostensible “hordes” of immigrants approaching the U.S. in a troublesome caravan. But none, I repeat, none, say a thing about American foreign policy, the nation’s ongoing wars, or the exploding, record defense budget. You see, in 2018, despite being mired in the longest war in US history, the citizenry – on both Main Street and Wall Street – displays nothing but apathy on the subject of America’s clearly faltering foreign policy.

The reasons are fairly simple: while the populace reflexively (over) adulates our “heroes” in uniform, it has been programmed to ignore the actual travails of our troopers. So long as there is no conscription of Americans’ sons and daughters, and so long as taxes don’t rise (we simply put our wars on the national credit card), the people are quite content to allow less than 1 percent of the population to fight the nation’s failing wars – with no questions asked. The mainstream wings of both the Republican and Democratic parties like it that way. They practice the politics of distraction and go on tacitly supporting one indecisive intervention after another, all the while basking in the embarrassment of riches bestowed upon them by the corporate military-industrial complex. Everyone wins, except, that is, the soldiers doing multiple tours of combat duty, and – dare I say – the people of the Greater Middle East, who live in an utterly destabilized nightmare of a region.

Why should we be surprised? The de facto “leaders” of both parties – the Chuck Schumers, Joe Bidens, Hillary Clintons and Mitch McConnells of the world – all voted for the 2002 Iraq War resolution, one of the worst foreign policy adventures in American history. Sure, on domestic issues – taxes, healthcare, immigration – there may be some distinction between Republican and Democratic policies; but on the profound issues of war and peace, there is precious little daylight between the two parties. That, right there, is a formula for perpetual war.

To find the few brave voices willing to dissent against the foreign policy consensus, one must look to the political margins of the libertarian right (e.g., Rand Paul) and the democratic socialist left (e.g., Bernie Sanders). This is a sad state of affairs on an election day that both Donald Trump and Barack Obama have assured us is the “most consequential” of our lifetimes. You see, on this point I actually agree with these two polar political opposites. This is a vital election, only not for the reasons we’re told. This November 6th is profound because it demonstrates, once and for all, the utter vacuousness of American politics.

So where does the U.S. stand on foreign policy today? Well, it is actively bombing seven countries, has up to 800 military bases in 80 countries, has combat troops, special forces, drones and/or advisors on the ground in (or in the skies above) Syria, Iraq, Afghanistan, Somalia, Yemen, West Africa, Libya and Pakistan, among others. Occasionally, American service members are still dying across the Middle East – often in treacherous insider attacks, in which the very people we “advise and assist” turn their weapons on our troops.

Furthermore, it is unclear that the US is either “winning” – whatever that means anymore – or accomplishing anything of note in any of these locales. For example, in the longest conflict of the lot, Afghanistan, all the key metrics indicate that the US is losing, both politically and militarily. As for the other ongoing wars in the region, no one – not the generals or the civilian policymakers – seems capable of articulating an exit strategy. Maybe there just isn’t any.

Still, none of that will be on the ballot today, when Americans queue up to vote for their favorite teams. They’ll be casting ballots based on the illusion of differentiation between two highly corporate political entities that are squarely in the pocket of the weapons industry and its Wall Street financiers. And, tonight, when the media outlets dazzle their viewers with holograms, charts, and other neat toys depicting the day’s winners and losers – not one station will even utter that naughty word: Afghanistan.

What all this illustrates, in sum, is that the citizenry doesn’t really care about the troops, and neither do their elected leaders. Soldiers are political props and little else – meant to be “thanked,” paraded at sporting events, and then effectively ignored – the new American way.

The republic, or, more accurately, the empire, is in real trouble when – in the midst of its longest conflicts ever – war is not even on the agenda at the polls today. Pity the nation…


The U.S. Military’s Empire of Secrecy


“Democracy dies in darkness.” That’s an old saying that The Washington Post recycled as its motto at the dawn of the Trump era. Truth is, the journalists at the Post don’t know the half of it; nor do they bother to report on the genuine secrecy and increasing lack of transparency in the Department of Defense. Nothing against the Post—neither do any of the other mainstream media outlets.

But it’s true: Right under most Americans’ noses, the military has become more opaque over the last several years. Now, few outlets cover foreign policy with any particular gusto—after all, there’s so much to say about Stormy Daniels or the Brett Kavanaugh drama. But this trend should concern all citizens.

Thing is, what the U.S. military is up to on any given day is done in your name. If civilians are killed, locals alienated or civil liberties restricted, then the global populace, including concerned U.S. citizens, isn’t going to fix blame solely on the armed forces … it’s going to blame you! If for no other reason than this, citizens of an—ostensible—democracy ought to be paying attention. The military is a fierce, potentially brutal instrument, and anyone who cares about liberty ought to watch it closely.

Only that’s getting harder and harder to do in today’s political climate. On one issue after another the U.S. military has recently intensified its secrecy, classified previously open information and suppressed any remaining sense of transparency. Don’t just take my word for it: This week a relatively mainstream congressional Democrat, Adam Smith—the ranking member of the House Armed Services Committee—wrote at length on this very topic.

Make no mistake, these trends are long-standing and gradual. So, what follows is not some vacuous liberal attack on President Trump, who remains, for legal purposes, and so long as I remain in uniform, my commander in chief. Still, the time is long past when someone needs to scream from the proverbial mountaintop about America’s expanding empire of secrecy.

Though there are plenty of examples to review, there’s something else to keep in mind: The military isn’t some monolithic monster. It’s far more discreet than that, and so are these trends, so watch closely. Evidence abounds. Soon after the inauguration, the military—which had long recognized and planned for the existential threat of climate change—received guidance to all but purge the term from its reports. It was to be replaced with more nebulous (and inaccurate) phrases, such as “extreme weather.”

Then there’s the minor matter of the War in Afghanistan and its progress—after, you know, 17-plus years. One of the key benchmarks or metrics for progress has been the success or failure of the Afghan National Security Forces (ANSF). Well, for years the DOD released annual casualty figures for the ANSF, and the trends were alarming. Afghan Security Force casualties are frankly unsustainable—the Taliban are killing more than the government can recruit. The death rates are staggering, numbering 5,500 fatalities in 2015, 6,700 in 2016, and an estimate of “about 10,000” in 2017. The reason we’re not sure about the exact count last year is that the data—admittedly at the request of the Afghan government—has been newly classified. This seems absurd. How can the legislature or the public determine the viability or prognosis of America’s longest war without such key statistics? The short answer is, they can’t. And so, the war drags on. …

What’s more, the military’s historically uneasy relationship with the press has chilled further. As Rep. Smith reported and complained, the DOD had issued edicts to curtail or discourage officers from providing candid assessments on readiness challenges, the control of nuclear weapons and other key appraisals. Only after a prolonged public outcry were these once-common press interactions partially reinstated. Nevertheless, this all points to an alarming trend of apparent furtiveness.

There are other examples to add into the disturbing mix. The Navy has stopped publicly posting accident reports. Also, at a time of exploding, record defense budgets, once routine public reports on the cost, schedule and performance of expensive weapons systems have, since 2017, been labeled as “For Official Use Only”—which keeps the data from the public through an ever-expanding regime of “over-classification.” Without such public releases, the populace and their elected representatives cannot effectively scrutinize what President (and five-star General) Eisenhower aptly labeled the “dangerous” military-industrial complex. Is that the point? Let’s hope not.

Then there is the internal censorship within the military’s computer networks. Recently, credible, left-leaning websites such as Tom Dispatch and The Intercept have reportedly been blocked on many government computers. The reason provided in the firewall warning message is the existence of “hate and racism” on the two sites. Now, many readers, and even more American citizens, may not like the content of these publications—which is fine—but anyone who has even briefly read anything on these sites can vouch for one salient truth: There is absolutely nothing hateful or racist at Tom Dispatch or The Intercept. These publications are professionally edited and reviewed, and, indeed, are unique in that they focus on long-form analytical essays.

It appears that the only crime of these sites is that they are, indeed, left-leaning. Need proof? Well, guess which genuinely racist, conspiracy-theory-peddling websites are not blocked? You guessed it: Breitbart and InfoWars. Heck, even Facebook and Twitter have taken steps to ban Alex Jones’ InfoWars from their social media sites. So, there’s only one major conclusion to draw: Genuinely shocking and offensive right-leaning publications are just fine; meanwhile, even credible, respected left-leaning sites are apparently a threat. This sort of rank partisanship is disturbing from a purportedly apolitical organization like the DOD.

Now, there are no doubt times when tactical necessity requires secrecy in military operations. I’ve lived at the sharp end of that spear, and do not discount its occasional inevitability. That said, much of the move away from transparency has little to do with combat, so to speak, and more to do with politics. We, the citizenry, trust our military with immense responsibility, but as a supposed democracy, that same military ought to be accountable to Congress and to the public. These days, that seems ever more like a distant fantasy.

This all matters. America has a choice. It can be an empire—or it can be a genuine republic. It may not be both.


Danny Sjursen is a U.S. Army officer and a regular contributor to Truthdig. He served combat tours with reconnaissance units in Iraq and Afghanistan and later taught history at his alma mater, West Point. He is the author of a memoir and critical analysis of the Iraq War, “Ghost Riders of Baghdad: Soldiers, Civilians, and the Myth of the Surge.” Follow him on Twitter at @SkepticalVet.

[Note: The views expressed in this article are those of the author, expressed in an unofficial capacity, and do not reflect the official policy or position of the Department of the Army, Department of Defense, or the U.S. government.]

Copyright 2018 Danny Sjursen


The Insidious Myth of the Magical American Soldier


This piece originally appeared on antiwar.com.

We aren’t miracle workers. We’re just soldiers after all – kids barely out of their teens and officers in their mid-20s do most of the fighting. Still, policymakers in Washington and citizens on Main Street alike seem convinced that the mere presence of a few hundred or thousand American troops can alter societies, vanquish the wicked, and remake the world.

A colleague of mine refers to this as the myth of the magic soldier: sprinkle US troops in some horrific mess of a country and voilà – problem solved!

It sounds great, but this sort of delusional thinking has led the United States into one failed quagmire after another, killing some 7,000 US troops and close to one million locals. After 17 years of fruitless, indecisive war, it’s quite incredible that a bipartisan coalition of mainstream Republicans (neocons, mostly) and Democrats (neo-liberal relics) still clings to the idea that American soldiers wield magic powers. It’s long past time to review the record of our over-adulated troopers and reframe the actual – limited – capabilities of military force.

The standard Washington-media-military narrative goes something like this: take any unstable Muslim country that has any presence of Islamists at all; drop in a few thousand US Army advisors, trainers, or combat troops; stay indefinitely – and loudly proclaim that if ever those soldiers should leave said Muslim country it will undoubtedly collapse and the US of A will be directly threatened.

Some version of that exact formula has been tried in, sequentially, Afghanistan (2001-present), Iraq (2003-present), and Syria (2011-present), along with numerous smaller regional locales: Libya, Niger, Somalia, Yemen, etc. Sometimes the troop levels topped out at nearly 150,000 (Iraq); other times the ground forces and special operator teams have been smaller (Yemen, Somalia). But the basic blueprint is the same – US airpower, plus commando raids, plus trainers and advisers can somehow stabilize the unstable, secure the insecure, and – ultimately – we hope, craft a “Little America” in the Muslim world. There are just a couple of problems with this veritable religion of US militarism: 1) we rarely consult with the locals before beginning each “crusade”; and 2) It. Has. Yet. To. Work.

Let us enter, then, the world of the absurd – US interventions since 9/11. In Afghanistan, the ultra hawks told (and tell) us, repeatedly, that more soldiers were needed to back up the government in Kabul. Without those magic troops, we’re warned, Al Qaeda will be back and the US Homeland in grave danger. Of course, the fact is there are relatively few such fighters in Afghanistan, and the Taliban – our primary opponent – has neither the capacity nor the intent to threaten the US. These folks want to conquer Kandahar, not Kalamazoo…

Then there was the Iraq invasion, euphemistically titled Operation Iraqi Freedom, which began as a fantastical attempt to craft a liberal democracy between the Tigris and Euphrates – all at the point of a bayonet. By 2006, that adventure had all but fallen apart as the country tumbled into outright civil war. Only then, according to the popular, prevailing military and political myth, did a new general – David Petraeus – and some 30,000 more “magic” U.S. troopers turn the tide. In hindsight that was never the case. The US military bought off former enemies with American blood on their hands and temporarily lessened violence. Washington never achieved a more vital political settlement in Baghdad, and within three years of America’s departure Iraq was back in chaos. And back to Mesopotamia flew our soldier miracle workers.

This is when a second mainstream – and utterly bunk – myth developed: that if only Obama had left 10,000 “magic” soldiers in country, Iraq would have been just fine and ISIS would never have formed. Such an assertion denies agency to the Iraqis (who ultimately determine their own destiny), overestimates the capabilities of American troops, and ignores the fact that it was the Iraqi government that refused to sign a treaty to keep a US military presence on the ground. In the soldiers-as-miracles narrative, of course, all that is omitted or ignored.

The same goes for the smaller US presence in Syria, Africa, Libya, Yemen, Somalia, and on and on. We’re assured that just a bit more airpower, a handful more commando raids, and a few more military advisors will turn the tide, stabilize the unstable, and ensure American security. The problem is this: in each case, no one seems able to articulate an exit strategy. That’s because there is none! And there’s the rub – so long as Americans are convinced of the preternatural capabilities of US troops, Washington will be forced to keep them forever deployed. Should they leave (any of these various locales), we’re told that chaos and transnational terror will explode in the region and in American cities. If that’s not a formula for perpetual war, then I don’t know what is!

The various interventions of the “War on Terror” have, at best, a checkered record. Most were, and are, complete strategic failures. They demonstrate the inherent limits of US military power and the need for tough cost/benefit analyses before taking the fateful step of deploying American men and women in harm’s way.

Yet on the wars churn, with no end in sight. And why not? Presidents (from both parties) wield force almost unilaterally; Congress is derelict in its duty to oversee the wars; the politicized Supreme Court demonstrates no intent to rule on the constitutionality of presidential war powers; and the citizenry, well, they couldn’t care less. With no conscription, innumerable technological distractions, and a steady diet of information from a media focused more on minutiae than substance, how could we expect the American people to take much interest at all?

The truth is the war for the Greater Middle East is over. America already lost – it just hasn’t accepted it yet. The tragedy – and farce – of it all is that some number of US troops and innumerable local civilians are sure to die before Washington comes out of denial and accepts strategic defeat.

I can’t say when that will be; but odds are my own young children will be of military age by then…and so will yours.


American History for Truthdiggers: Wealth, Squalor in the Progressive Era


Editor’s note: The past is prologue. The stories we tell about ourselves and our forebears inform the sort of country we think we are and help determine public policy. As our current president promises to “make America great again,” this moment is an appropriate time to reconsider our past, look back at various eras of United States history and re-evaluate America’s origins. When, exactly, were we “great”?

Below is the 20th installment of the “American History for Truthdiggers” series, a pull-no-punches appraisal of our shared, if flawed, past. The author of the series, Danny Sjursen, an active-duty major in the U.S. Army, served military tours in Iraq and Afghanistan and taught the nation’s checkered, often inspiring past when he was an assistant professor of history at West Point. His war experiences, his scholarship, his skill as a writer and his patriotism illuminate these Truthdig posts.

Part 20 of “American History for Truthdiggers.”

See: Part 1; Part 2; Part 3; Part 4; Part 5; Part 6; Part 7; Part 8; Part 9; Part 10; Part 11; Part 12; Part 13; Part 14; Part 15; Part 16; Part 17; Part 18; Part 19.

* * *

The Gilded Age. The American Industrial Revolution. The Progressive Era. Call it what you will, but one salient (Dickensian) fact about this period endures: It was the best of times, it was the worst of times—depending on one’s point of view. Industrialization brought immense wealth for some and crippling poverty for others. Mass production might result in savings for the consumer, but working wages remained low. The boom-and-bust cycle of laissez-faire capitalism was in full swing, resulting in national banking panics and, from 1893 to 1897, the worst financial depression up to that point in the country’s history.

The key story of this era revolves around various attempts—by rural farmers and urban workers, by women and blacks, Republicans and Democrats, Populists, Progressives and even socialists—to mitigate the excesses of industrialized American capitalism. It would not prove to be an easy task, and, one could cogently argue, it is a task Americans still grapple with. The two-party system nearly fell apart in this period because neither major political brand seemed to have a viable answer to the key question of the day: how to maintain peace and the basic standard of living during a time of massive industrial growth and rising economic inequality. It is a question familiar to supporters of Donald Trump and supporters of Bernie Sanders.

Were the corporate leaders of the Gilded Age corrupt “robber barons” or “rags-to-riches” heroes? Was factory work a long-term good, driving down prices and growing the American economy, or was it soul-sucking wage slavery? Maybe both. What’s certain is that the nature of labor changed forever. Systems of efficiency like “Taylorism” and the assembly line specialized labor and brought much monotony to the workplace. Early American factory life was a nightmare, not unlike contemporary conditions in much of the developing world. The tyranny of the clock (a relatively new addition to the factory floor) dominated life as the average laborer worked six days a week, 10 hours a day. By 1900, there were 1.7 million children toiling in the labor force.

Worse still, in the late 19th century neither political party supported unions or any sort of modern social welfare system or safety net. The results were barbaric. Unorganized workers lacked health care, safety regulations and unemployment insurance. From 1880 to 1900, there were 35,000 deaths on the job, annually—the equivalent of a Korean War every year for two decades. Beyond the fatalities, an average of 536,000 men and women were injured at work in each of those years.

The coldhearted ideology of the day—in both major political parties and among the wealthy—tended to blame poverty on the workers themselves. Except among a tiny (but growing) core of socialists, few Americans who were not directly affected by the plight of workers demonstrated any enthusiasm for federal intervention or poverty mitigation. Indeed, the Democratic President Grover Cleveland—a fiscal conservative—declared in 1893, “While the people patriotically and cheerfully support their Government, its functions do not include the support of the people.” The Republicans were often even less sympathetic.

Most politicians simply reflected the prevailing mores. American elites (and many hoodwinked workers and farmers) clung tightly to belief in the American Dream—that with enough hard work and grit anyone, by pulling on their own bootstraps, could become rich. The empirical statistics, even then, debunked this ideology as little more than anecdotal, but it endured and endures. An academic of that era, William Graham Sumner, summarized this viewpoint: “Let every man be sober, industrious, prudent, and wise and poverty will be abolished in a few generations.” Nor did many popular preachers, such as Henry Ward Beecher, show much sympathy for the plight of the poor, with Beecher famously announcing that “[n]o man in this land suffers from poverty unless it be more than his fault—unless it be his sin.”

Nonetheless, when the economy finally collapsed in 1893—due in large part to the corruption and excesses of various corporate monopolies—setting off the worst depression in U.S. history to that point, views on charity, social welfare and the supposed character defects of the poor began to change. Perhaps the federal and local governments did have a role in citizens’ welfare. Of this much, many were sure: Something had to change.

Populism and Agrarian Revolt: The Good, the Bad, the Ugly

William Jennings Bryan, the Democratic presidential candidate and Populist firebrand.

After 1877, when the Republican Party abandoned Southern blacks along with any remnants of its old abolitionist sentiment, the GOP became increasingly identified as the party of “business,” of corporations and the capitalist class. The Democrats, now largely a regional (Southern) party, also proved initially conservative on economic issues and stuck with pure free-market capitalism. Neither party, it seemed, appealed to the best interests of small rural farmers or urban wage workers. One result of industrialization was the accumulation of massive wealth in the hands of the very few (mostly Northern and Eastern corporatists). In 1890, the richest 1 percent of the population owned 51 percent of the national wealth, while the poorest 44 percent owned less than 2 percent of the wealth. The result of this imbalance was instability, strikes, work stoppages, federal intervention and, often, bloodshed. Unions formed, shattered and rose again. Still, at a national level, it was the rural farmers who first revolted.

Farmers felt themselves the perennial victims of a rigged system. They lived by the whims of market prices, of supply and demand. They hated the national tariff—which “protected” urban manufacturing but caused rising costs in the consumer goods necessary to live on the prairie. Furthermore, the post-Civil War move away from paper currency (or “greenbacks”) to hard specie, meaning gold, devastated all but the wealthiest farmers. Seeing themselves as the ideal Americans of the Jeffersonian vision—the salt of the earth who tilled the land—they demanded that silver (which was more plentiful) as well as gold be used to back their paper currency. Small farmers simply didn’t have much in the way of gold reserves, and the “hard money” policies of the Republicans and urban elites devalued what little cash they had.

Disgusted with two-party politics, and feeling abandoned by both mainstream Republicans and Democrats, a new organization, the People’s Party, formed and met with early electoral successes in its Western and (sometimes) Southern heartlands. The Populists, as they were called, entered American politics and, in one guise or another, have been with us ever since. Theirs was the policy and ideology of “us” versus “them,” rich versus poor, West versus East, rural versus urban and—lamentably—white versus black. The Populists for the most part distrusted the state; then again, they did support federal intervention when it suited them.

Riding a tidal wave of rural and agricultural support, by 1896 William Jennings Bryan of Nebraska managed a veritable takeover of the Democratic Party, fusing it with the Populists, and ran for president. Bryan was one of the great orators in American history. His speeches summoned the tone of evangelical church rallies. At the Democratic National Convention in Chicago, Bryan mesmerized the crowd as he placed an imaginary crown of thorns on his head and pronounced, “We will answer their [the Republicans’] demand for a gold standard by saying to them: You shall not press down upon the brow of labor this crown of thorns.” Then, stretching out his arms as if on a cross, he hollered, “You shall not crucify mankind upon a cross of gold.” Bryan, however popular and however enthralling on a podium, would go on to lose in 1896 (and then twice more). He was defeated by the Republican William McKinley, who ran on a platform of nationalism and status quo “prosperity” and who labeled Bryan a radical tainted by his party’s association with the old Confederate South.

And there was something else: money in politics. McKinley and his corporate Republicans raised $7 million (the equivalent of $3 billion today), the Democrats just $300,000. Bryan ran an energetic campaign, riding the rails and giving 600 speeches in 27 states; McKinley rarely left his home and rested on his financial advantages. In the end, money won. McKinley would be president. Lest we become too sentimental and consider Bryan’s and the Populists’ failed campaign as some sort of moral victory, it is necessary to illuminate the “dark side” of Populism.

Many Populists demonstrated strong strains of nativism and racism. They railed against “Jew” bankers, “Slavic” immigrants up north and, in the South,  the “Negro menace.” This was not mere rhetoric. As Populism rose in the West and South, blacks were being utterly disenfranchised. Southern states—now back in the hands of many former Confederate leaders—struck almost every eligible black voter from the rolls. Between 1898 and 1910, the number of black registered voters in Louisiana dropped from 130,000 to 730! Populism, in other words, may have been the party of the “people,” but it was most certainly only thus for white people. Consider the contrast. The very year Bryan ran his crusading campaign (1896), the Supreme Court would hold that segregation was legal when it ruled in the case of Plessy v. Ferguson. Few Populists made an effort to craft an interracial alliance of poor people, and thus it was ultimately American blacks who were left to writhe on Bryan’s proverbial cross of gold.

The Progressive Moment: Social Freedom or Social Control

The Panic and Depression of 1893 was so severe—and the government so unprepared and unwilling to intervene—that millions of families were brought to the brink of starvation, and the “ranks of a tramp army” (of unemployed men) swelled. Though the Populists never managed to convince or co-opt Northern factory workers to join their crusade, many of the sentiments and proposed policies of the People’s Party began to infuse a new movement of (mostly middle class) “Progressives,” as they styled themselves. Progressives weren’t exactly radical in the traditional sense—though their wealthy opponents depicted them as such—and they belonged to both major political parties. What they most had in common was an abiding criticism of the excesses of American “boom-bust” capitalism, and a sense that regulation of markets and the intervention of government could mitigate the worst aspects of this and future depressions.

Throughout their heyday, 1896-1920 or so, Progressives called for, and often achieved, many of the government programs and policies that exist to this day. They pushed for antitrust and anti-monopoly regulations, the eight-hour workday, an end to child labor, and unemployment and workers’ compensation insurance, to name but a few. One problem for the Progressives was their inability to forge lasting alliances with rural Populists, whom they saw as backward country bumpkins. Nor did the rural poor trust the machinations of these urban (seemingly arrogant) reformist Progressives. The two groups had such divergent cultural values and traditions—as well as different views on immigration and government intervention—that a true union of urban/rural workers and reformists never manifested itself.

Historians long have argued about the ultimate nature of the Progressive movement. Some view the Progressives as genuine reformers with the best interest and freedoms of the working classes at their root. Others sense an overriding aura of social control in the Progressive agenda. Though the remarkable achievements of the Progressives should never be ignored, their paternalist and controlling side bears some analysis. As true believers in the government’s ability to reform, regulate and solve the nation’s economic and social problems, Progressives sometimes displaced the social justice rhetoric of the Populists “with slogans of efficiency.” Indeed, Progressives seemed to “know what was best” for poor farmers and urban immigrants alike—sobriety and moderation. This explains why so many Progressives were also in favor of temperance. Their motto was “trust us,” meaning the experts, your social and educational betters.

While Populism pitted the “people” against the state, Progressives believed in the utility of using (through new theories of social science) the state to intervene in the economy and reform society. Indeed, the paternalistic impulses of some Progressives were such that they saw the masses—urban or rural—as a threat to democracy, a populace itself in need of regulation. Therein lies part of the “dark side” of the Progressive movement. Sure, Progressives made great, if gradualist, progress on improving working conditions, government regulation and the right of (white) women to vote. This ought to be rightfully celebrated. Still, many Progressives, both academics and policymakers, believed in the social Darwinist notion of human beings’ “survival of the fittest.” In that vein, a powerful wing of the Progressives backed eugenics programs and forced-sterilization laws. Those deemed physically or mentally unfit were to be sterilized for the good of the American “whole.” Beginning in 1907, two-thirds of U.S. states would eventually pass forced-sterilization laws. Indeed, even the Supreme Court ruled, in Buck v. Bell (1927), that compulsory sterilization was fully legal and constitutional. In a disturbing irony, Adolf Hitler and other Nazis would later cite America as a positive example and model for their early racial-purity programs.

Progressives had a deep blind spot related to race in general and African-Americans in particular. The inconvenient fact is that the “Progressive Era” coincided with the Jim Crow era and the height of racial terrorism in the South. When Progressives talked about easing inequality they meant white inequality. The same went for most Populists. Indeed, as The New York Times reported on the 1924 Democratic National Convention: “An effort to incorporate in the Democratic Platform a plank condemning the Ku Klux Klan … was lost early this morning by a single vote. … There [was oratory against the proposal] by William Jennings Bryan, who spoke with his old-time fire and enthusiasm.”

“Progressive” Democrats couldn’t even agree to condemn the Klan! Perhaps we shouldn’t be surprised. The Progressives and Populists had one thing in common: They wanted to win national elections. As a result, they showed a willingness to play the race card and ignore the Southern regime of terror that was then at its height.

As most Progressives and Populists remained silent, lynching reached its zenith in this era, and neither party took national action. It didn’t take very much for a black man to be lynched in the “Progressive Era” South. In 1889, Keith Bowen was killed for simply entering a room where three white women were sitting. In 1904, a white mob lynched a black man for knocking on a white woman’s door. In 1912, Thomas Miles was killed for writing a note to a white woman, inviting her for a cold drink. Multiply this by a thousand and one gets a sense of the scale of lynching in this period. And what did a self-described “Progressive” president, Theodore Roosevelt, have to say about all of this? The New York-bred Brahmin lowered himself to the baseness of a Southern apologist. He stated that “the greatest existing cause of lynching is the perpetration, especially by black men, of the hideous crime of rape.” How’s that for victim blaming?

The Wild Election of 1912: Who Was the Real Progressive?

Nevertheless, by 1912 the old notion of a governmental hands-off policy in economics and society was out of style. In that presidential election, three men ran on a platform of “Progressivism”—Republican incumbent William Howard Taft, Democratic challenger Woodrow Wilson and the old stalwart Theodore Roosevelt, who had formed his own third party, the Progressive (or “Bull Moose”) Party. In this election, the three main candidates all sought to “out-progressive” the others. In truth, though, the men were, in their policies and platforms, remarkably similar. And they admitted it! Wilson, the Democrat, stated, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.” Roosevelt, ever more succinct and blunt, declared that “Wilson is merely a less virile me.”

So who was most traditionally Progressive? It’s a tough question. Roosevelt was most associated with Progressivism, in theory. He touted his achievements as a “trust-buster,” believed that big government could balance big business, and had a strong environmental record (including an expanded program of national parks). Still, he was socially conservative, feared “anarchy” and supported overt imperialism. His racism, and that of his followers, was also a problem—some called the Roosevelt wing of the Republican Party the “Lily-whites.” Taft, though portrayed by his opponents as a “business” Republican, actually busted twice as many trusts as Teddy, mandated an eight-hour workday for federal employees and pushed for a progressive income tax. Wilson—born to a slaveholding family in Virginia in 1856—had racist instincts but supported some trust-busting and believed the Constitution to be a “living document” that must change with the times.

In the end, Roosevelt (though the most successful third-party candidate in history) split the Republican vote and propelled Wilson into the White House. Wilson followed through on many of his campaign promises, and the 63rd Congress, during his first term as president, was one of the more productive in U.S. history. Under Wilson’s watch, Congress lowered the tariff, abolished child labor and passed a new antitrust act, the eight-hour workday and federal aid to farmers. Wilson’s first term seemed a Progressive dream. But the man also had Southern roots and was a product of his era and its hateful culture. The favorite movie of the Progressive Wilson was “The Birth of a Nation,” which depicted the KKK as heroic. Furthermore, he excluded black soldiers from the 50th-anniversary observance of the Battle of Gettysburg and ushered in segregation of federal buildings and the federal civil service. On the issue of race, under Wilson, “Progressive America” took a step backward.

Why No Socialism in America?: Urban, Rural and Racial Division

The hopes and fears of the period are captured in “The Strike,” an 1886 painting by Robert Koehler. In it, a crowd of workers approaches the home of a well-dressed factory owner. Amid the tension, a man picks up a stone.

It is a question asked time and again by historians on both sides of the Atlantic: Why did a significant socialist movement not rise in the United States as it did in Europe? Indeed, one could argue that this is one of the rare things that actually is “exceptional” about America. Yet while many liberal Progressives in the U.S. admired the social programs long flourishing in industrialized Europe, they never managed to implement most of these policies in Washington. One reason they had so much trouble was the power of the courts in America. Through the peculiar U.S. system of lifetime appointments of judges, the justices—especially on the Supreme Court—were a generation behind contemporary policymakers and tended to strike down social welfare provisions as unconstitutional. It would take decades to modernize the court. European countries rarely had such problems with experimentation and improvisation.

Thus, the United States fell far behind Europe on social welfare (and remains so today). One dream of American Progressives was universal health insurance, which they proposed back in 1912. More than a century later, the U.S. holds the distinction of being perhaps the last major industrialized country not to have such a program. Germany had shown the way in 1883, and the United Kingdom passed the National Insurance Act in 1911.

This is not to say that American socialists did not exist. In fact, 1912 was probably the high tide of socialist sentiment in this country. Many socialists, including their party leader, Eugene Debs, a former union man, believed that neither of the two major parties had an answer to the ills and excesses of capitalism. As one union member summed it up, “People got mighty sick of voting for Republicans and Democrats when it was a ‘heads I win, tails you lose’ proposition.” Though it is rarely remembered or spoken of now, in 1911 card-carrying Socialists were elected mayors of 18 cities, and more than 300 held office in 30 states. In the 1912 presidential election, Debs received a remarkable nearly 1 million votes—about 6 percent of the total, the best share ever won by a socialist candidate. Clearly, some proportion of voters agreed with Debs that “[t]he Republican and Democratic parties, or, to be more exact, the Republican-Democratic party, represent the capitalist class in the struggle. They are the political wings of the capitalist system.”

Still, though Debs’ accomplishments were real, it must be said that no serious socialist or social democratic movement ever gathered much steam on this side of the Atlantic. The reasons were as numerous as they were lamentable. The working class in the United States was utterly divided against itself—something the owners of this country exploited and perpetuated. Rural Populists wanted lower tariffs, cultural homogeneity, an end to new immigration, and a return to “traditional values.” They were never able to make common cause with the urban (often immigrant) working class, despite their obvious common interests. Just as today, the American working and middle classes were waging a culture war over race, citizenship, immigration and social “values.”

But the main factor was race, specifically the “Negro question.” Both urban white immigrants and rural white farmers defined progress and reform along the narrowest of racial contours. They believed in white Progressivism and white Populism. In the interest of winning elections and not alienating their parties’ Southern wings—and because of their own downright bigotry—they left African-Americans to suffer economic peonage and physical torment. Progressives and Populists of the period failed to recognize one salient truth: Some cannot be free so long as all are not free. When it came to fulfilling the idealistic dreams of populism and progressivism, race, as in so many American matters, would prove to be the nation’s Achilles’ heel. It would take two generations to even begin to right that wrong.

* * *

To learn more about this topic, consider the following scholarly works:

• Jackson Lears, “Rebirth of a Nation: The Making of Modern America, 1877-1920” (2009).

• Jill Lepore, “These Truths: A History of the United States” (2018).

• Richard White, “The Republic for Which It Stands: The United States During Reconstruction and the Gilded Age, 1865-1896” (2017).



The Tragic Anniversary in Afghanistan We Dare Not Acknowledge


This piece originally appeared on antiwar.com.

The absurd hopelessness was the worst part. No, it wasn’t the Improvised Explosive Devices (IEDs) blowing limbs off my boys, or the well-aimed gunshot wounds suffered by others; it wasn’t even the horror of ordering the deaths of other (“enemy”) human beings.

No, for a captain commanding 100-odd troopers in Southwest Kandahar province at the height of the Obama “surge” of 2011, what most struck me was the feeling of futility; the sense that the mission was fruitless operationally, and, of course, all but ignored at home. After a full year of saturating the district with American soldiers, the truth is we really controlled only the few square feet we each stood on. The Taliban controlled the night, the farmlands, the villages. And, back in 2011, well, the U.S. had about 100,000 servicemen and women in country. There are fewer than 15,000 on the ground now.

It’s an uncomfortable, almost un-American, truth – there is nothing more that the US military can do for the foundering government of Afghanistan. And, as the war reached a lamentable 17th anniversary last week, now is the time to once again raise the alarm. Fact: this next year, teenagers born after 9/11 will begin to join the military and, eventually, fight in Afghanistan.

As if that’s not disturbing enough for the ostensible republic, consider this: the Afghan War is failing, failing worse than ever before. Along each line of effort – security, politics, and economics – the metrics point downward despite all the blood and treasure already sunk into America’s longest war.

Let’s begin with security, arguably the paramount measure of success in any war. For nearly two decades, one US commanding general after another has assured the American public that – with just a few extra troops and a little more time – he could achieve “victory” in Afghanistan. The US has tried many approaches: a “light footprint” counter-terror force (2001-08), a massive “surge,” or infusion of 100,000 troops (2009-13), and a shift to smaller advise and assist elements training the Afghans (2014-present). Nonetheless, after all that time and effort, the security situation is worse than ever.

The Taliban controls or contests more districts – some 44% – than at any time since the 2001 invasion. Total combatant and civilian casualties are forecast to top 20,000 this year – another dreadful broken record. What’s more, Afghan Security Force casualties are frankly unsustainable – the Taliban are killing more than the government can recruit. The death rates are staggering, numbering 5,500 fatalities in 2015, 6,700 in 2016, and an estimate (the number is newly classified) of “about 10,000” in 2017.

The question at hand is this: What can (or should) the U.S. military do that it hasn’t already tried? Despite all of its sustained commitment and sacrifice (to the tune of 2,416 dead as of early September 2018), the U.S. military and its Afghan partners have not meaningfully stemmed the tide of Taliban gains. So what can some 15,000 U.S. troops accomplish in 2018 that 100,000 could not achieve in 2010-11?

Politically, there are serious questions about the Afghan government’s legitimacy and effectiveness. As a recent U.S. congressional report concluded, “Afghanistan’s…political outlook remains uncertain, if not negative, in light of ongoing hostilities.” Recent trends indicate that the U.S.-backed central government is fragmenting along ethnic and ideological lines. This should come as little surprise. The last two presidential elections – in 2009 and 2014 – were racked by allegations of fraud, and the parliamentary elections (originally scheduled for October 2016) have been delayed until at least late 2018. Corruption, fraud, waste and abuse have also been rampant in Kabul. Without a legitimate, stable political partner, no external military force of any size can meaningfully “win.”

Finally, there are the strict economic limits of the entire enterprise. Simply put, the Afghan economy does not generate enough income to fund the government’s annual expenditures or even pay its military. For 17 years now, the U.S. has picked up the tab, to the tune of $762 billion and counting. The economic bottom line is as simple as it is stark: The Afghan state, heavily dependent on foreign aid, cannot even fund its own security sector, which runs at about $5 billion annually against roughly $2 billion in annual domestic revenue. That is a formula for perpetual, and unsustainable, U.S. involvement in the conflict. It just doesn’t add up!

Make no mistake, the departure of U.S. troops from Afghanistan will be ugly, and what comes next is difficult to predict. That said, Afghanistan has been at war with itself and others for 39 years – the U.S. intervention is but one part of a war without any discernible end. There is no military solution to the Afghan War. An Afghan settlement to the conflict will be messy, but it is an inevitable and irreversible reality the U.S. must accept and mitigate, rather than prolong with a costly, futile and indefinite intervention.

When announcing his “new” strategy in August 2017, President Trump candidly admitted that his “original instinct” was to pull out of Afghanistan. He was correct – and should consider following those sound instincts. Nonetheless, the U.S. military remains in place and has even conducted a mini-“surge” of advisers this year.

It is time to end this intervention, extract the U.S. military from an unwinnable war, and refocus those assets of blood and treasure on training for great-power conflict and genuine homeland defense.

So it is, and so the violence churns on. Last month, an Army sergeant major was shot to death by the very Afghan police officers he was there to train. This was a so-called insider attack – an occurrence more common than we’d like to admit. He was the seventh American soldier killed this year.

And mark my words – there will be another. I’d hate to be the officer assigned to explain to a widow or a mother just what, exactly, her soldier died for.


What Keeps Washington Indefinitely in Bed With Riyadh

Read more of this story here from Truthdig RSS by Maj. Danny Sjursen.

It’s time to ask an uncomfortable question: What exactly is the U.S. getting out of its partnership with Saudi Arabia? The answer: nothing but headaches, human rights abuses and national embarrassment. In the cynical past, the U.S. could at least argue that it needed Saudi oil, but that’s no longer the case, thanks to the shale-oil boom (though that fact is not necessarily good for an ever-warming planet).

Recently, the crimes of the Saudi government managed to pierce the Trump-all-the-time-Kanye-West-sometimes media-entertainment complex due to Riyadh’s likely murder of dissident journalist Jamal Khashoggi. That the U.S.-Saudi relationship is, however briefly, coming under the proverbial microscope is a good thing. Still, it is astonishing that this incident—rather than dozens of other crimes—finally garnered attention. Even so, President Trump appears reluctant to cancel the record $110 billion arms deal he negotiated with the kingdom.

For me, it’s personal. Saudi Arabia’s fingerprints—those of its government and of private-citizen donors alike—have been all over America’s various opponents during these past 17 years of war. I patrolled the streets and suburbs of Baghdad from 2006 to 2007. Sunni Islamist insurgents, who were funded by the Saudis, shot a few of my soldiers and paralyzed one permanently. We regularly found Saudi Wahhabi Islamist literature in the homes and caches of our insurgent enemies.

Years later, from 2011 to 2012, I led a cavalry reconnaissance company in Kandahar, Afghanistan. We chased the Taliban—really a collection of disgruntled farm boys—around the fields and valleys of the Zhari district. Guess where those Taliban fighters—who killed three of my men and wounded 30 others—went to school? In Saudi-financed madrassas across the border in Pakistan.

All told, I—like hundreds of other officers—sacrificed young American soldiers fighting an “enemy” too often armed and funded by the kingdom of Saudi Arabia. In Iraq, my platoon lost three lives and the use of a few legs, and suffered several gunshot wounds. In Afghanistan, my troop gave up 10 limbs and three lives, and endured more than a dozen gunshot wounds. That the Saudis—America’s purported “partners” in the Middle East—have even some of that blood on their hands should be seen as a national tragedy. That it is not reflects poorly on the health and future of this republic.

Saudi Arabia is a fundamentalist theocracy and one of the world’s last absolute monarchies. The kingdom and its private-citizen donors have regularly supported Islamist jihadis across the Middle East. Heck, 15 of the 19 9/11 hijackers were Saudis. Many of these groups later attacked the U.S. homeland or American troops overseas. Most recently, Riyadh backed the Nusra Front—the al-Qaida affiliate ensconced in the ongoing Syrian civil war.

All too often, Saudi Arabia backs groups that are anti-American, and overall, Riyadh’s regional policy is utterly counterproductive to U.S. interests. Furthermore, the Saudis’ irrational hatred of Iran has kicked off a veritable cold war and arms race in the Persian Gulf—and that’s where those $110 billion in weapons will be funneled. The last thing the overstretched U.S. military needs is to be pulled by our Saudi “partners” into a new war in the region—this time with Iran.

Then there’s the matter of human rights and U.S. “values.” Here, the Saudi record is atrocious. The kingdom beheads dissidents and executes women for adultery, “witchcraft” and “sorcery.” Only in a place like the Arabian Peninsula could it be considered an accomplishment for women to finally gain the right to drive—in 2017. Finally, and most shockingly in terms of U.S. complicity, since 2015 Riyadh has unleashed terror bombing and a starvation blockade on Yemen—the poorest country in the Arab world. Tens of thousands of civilians have died, tens of millions are in danger of famine and the worst cholera epidemic in recorded history has broken out.

So what is it that keeps Washington so closely—and inextricably—tied to Riyadh? It’s increasingly clear that the profits of the military-industrial complex might provide the best explanation. The United States no longer produces much of value. Deindustrialization crippled our Rust Belt, reoriented America toward a service economy and widened the gap between rich and poor. These days, guns and bombs—the U.S. is by far the world’s largest arms dealer—are the one thing Uncle Sam still produces.

Seen this way, we must look again at the $110 billion deal Trump negotiated with Saudi Arabia. It may just be Lockheed Martin, Boeing, Honeywell and other such corporations that keep Washington indefinitely in bed with the Saudi king and princes. What’s more, guess who serves on the boards of many of those companies, thanks to a controversial revolving door? Retired generals and admirals. The embarrassing, counterproductive U.S.-Saudi relationship thus appears to reflect a structural flaw embedded in the U.S. economy: its co-option by the ever-stronger military-industrial complex.

Maybe the recent uproar over the Saudis’ alleged murder of Khashoggi will achieve what tens of thousands of dead Yemenis and the loss of thousands more U.S. troops could not—a reboot of U.S. policy toward the kingdom.

As a historian, I wouldn’t count on it.


Danny Sjursen is a U.S. Army officer and a regular contributor to Truthdig. He served combat tours with reconnaissance units in Iraq and Afghanistan and later taught history at his alma mater, West Point. He is the author of a memoir and critical analysis of the Iraq War, “Ghost Riders of Baghdad: Soldiers, Civilians, and the Myth of the Surge.” Follow him on Twitter at @SkepticalVet.

The views expressed in this article are those of the author, expressed in an unofficial capacity, and do not reflect the official policy or position of the Department of the Army, Department of Defense or the U.S. government.

Copyright 2018 Danny Sjursen
