The story of the United States and its relationship to the Second World War is often presented as a tale of epic triumph: the one monumental conflict in which we truly stood as the defenders of good against the forces of evil, one nation united in the fight against fascism. And yet the mural we paint of the “Greatest Generation” often overlooks our own relationship with the virulent antisemitism, racism, and eugenics that did, in fact, inspire Hitler’s own monstrous regime. While the public at large may have rejected fascism, prominent celebrities, politicians, and businessmen harbored sympathies for, and in some cases ties to, the Third Reich. This piece is one in an ongoing series reviewing the connections between fascism and America before, during, and after the Second World War.

I feel it is worth revisiting this facet of American history through the story of Prescott Bush, father and grandfather to future presidents George H.W. and George W. respectively, and the family patriarch whose fortune would help build the foundation for his successors’ political futures. The argument here is not an attempt to throw the legacy of the Bush family into shame through ad hominem attacks and conspiracy theories; quite the opposite, their numerous policy decisions and blunders have already cemented their place in history. As a trove of declassified evidence from the National Archives has demonstrated over the last 20 years, Prescott Bush’s unsavory ties to Hitler’s Germany are much more grounded in reality than previously understood. Using a combination of primary sources, via documents available to the public in the National Archives, and secondary sources providing the historical framework that explains how and why America’s moguls collaborated with the Reich, we begin to form a picture of greed, virulent anti-communism, and at times genuine Nazi sympathies. The “Good War,” it seems, was far from it.

An Air of Resentment

With an economy in ruins and a society in political upheaval, the newly formed Weimar Republic found itself indebted to various Western financial institutions, primarily out of the United States and Great Britain. The Bank for International Settlements (BIS) in Switzerland was established after the First World War to channel war reparations between the US and Germany. Over time, banks like the BIS came to do more than process debt payments, facilitating the gradual transfer of US treasury bonds between the two nations throughout the 1920s and ’30s. As the Weimar Republic fell and the Third Reich rose, the bank continued to offer financial services in support of the regime. Germany would receive not only an unabated flow of cash but an atmosphere of tolerance toward the Nazi party’s programs of racial purity and discrimination.

The conditions for Nazi sympathies were ripe in post-WWI America. Antisemitism had festered, with some associating Jews with communism and radical ideologies, while others believed they exerted too much influence in entertainment, business, and academia. Harvard responded by cutting Jewish freshman admissions from 28% in 1925 to 12% in 1933. Beyond crude racism was the country’s growing acceptance of eugenics, evidenced by then-Governor of New Jersey Woodrow Wilson’s adoption of a law that sterilized convicts, epileptics, and the ‘feeble-minded’. Some 60,000 Americans would be sterilized by similar programs across the US. These programs, in turn, shaped the ideas of Adolf Hitler, who admired the country’s stance toward biological and racial purity but saw such policies as only half-measures. Hatred of the Soviet Union also extended to perceptions of politicians at home as “socialistic,” with most American businessmen denouncing Roosevelt’s New Deal as an authoritarian seizure of the economy. Corporate antisemites smeared Roosevelt as a crypto-Communist and an agent of Jewish interests; they referred to him as “Rosenfeld” and vilified the New Deal as the “Jew Deal.” In his novel The Flivver King, Upton Sinclair describes the notorious antisemite Henry Ford as dreaming of:

An American fascist movement pledged to put down the Reds and preserve the property interests of the country; to oust the Bolshevik [Roosevelt] from the White House and all his pink professors from the government services… [and] to make it a shooting offense to talk communism or to call a strike.

Motivated by hatred of communism, fascist sympathies, or simply greed, the upper echelons of American society would protect their wealth and status at any cost, even if it meant collaborating with the enemy.

The Fascist Connection

Prescott Bush’s foray into the banking sector began shortly after World War I, when he married the daughter of wealthy business magnate George Herbert Walker in 1921. Walker’s background in investment banking led him into a partnership with Averell and Roland Harriman, sons of railroad tycoon E. H. Harriman, in New York. Bringing on Bush as a founding member, the group established the Union Banking Corporation (UBC) in 1924. By 1931 Bush and the Harrimans would start the Brown Brothers Harriman (BBH) firm to oversee the UBC and their holdings in other German industries. The bank’s foremost client, from its inception and continuing well into the war years, would be Fritz Thyssen, heir to the most prominent industrial fortune in Germany. His father August had been a major contributor to the country’s first war effort, and by the 1920s his sons Fritz and Heinrich had devised an interweaving web of offshore companies and banks so that their assets could be moved overseas if they were ever threatened. Fritz had inherited his father’s empire by 1926, at a time when the German economy was in the midst of collapse.

Thyssen’s influence on the success of Adolf Hitler cannot be overstated. On the advice of former General Erich Ludendorff, he attended a speech given by Hitler and his fledgling Nazi Party in 1923. Impressed, Thyssen became one of the early financial backers of the Nazis, claiming to have donated 1 million Reichsmarks to various right-wing parties. His contributions would bail out the struggling movement in its early years, helping to finance its future headquarters in Briennerstrasse, Munich; he even co-authored a letter with Nazi Finance Minister Hjalmar Schacht urging President Paul von Hindenburg to appoint Hitler as Chancellor. Thyssen’s monied contributions and political maneuvering would prove crucial, and for his assistance, he was elected a Nazi member of the Reichstag and appointed to the Council of State of Prussia. It could be argued that the UBC was unaware of Thyssen’s actions, as his aid to the Nazis was not revealed until he fled the country in 1939, shortly after the invasion of Poland. But through a combination of Thyssen’s own testimony in his autobiography, I Paid Hitler, and a trove of documentation uncovered over the last 60 years, it appears that knowledge of the fascist alliance may have been a key component in concealing the funds.

While there was nothing initially illegal in conducting business with the Thyssens, and indeed many American companies had German subsidiaries throughout the 1930s, the United States’ neutrality ended in 1941 after the attack on Pearl Harbor. In July of 1942, the New York Herald-Tribune ran a piece titled “Hitler’s Angel has $3m in US Bank,” alluding to the UBC’s massive transfers of gold, fuel, steel, and coal to and from Germany. The newly minted Alien Property Custodian (APC) would seize those assets that fall under the Trading with the Enemy Act. In an investigation undertaken by Erwin May, an investigative officer for the APC, it was revealed that Harriman, Bush, and the other directors did not own shares of UBC but merely held them on behalf of the Dutch Bank voor Handel. Equally curious was that the UBC’s Dutch president, Cornelis Lievense, repeatedly denied to U.S. investigators any knowledge of the ownership of either the Netherlands bank or the true owners of its American shell company. By the end of 1942 May had found a connection between Director Roland Harriman and H.J. Kouwenhoven, who had consulted with the former in 1924 to set up the Union Banking Corporation. Kouwenhoven also held several other posts: he was the managing director of voor Handel, a director of the August Thyssen-owned bank in Berlin, and a director of Fritz Thyssen’s Union Steel Works, the heart of the family’s industrial possessions. There was not only clear knowledge of this partnership among the founders of UBC but proof that the bank was, in reality, a clearinghouse that had partly ensured Germany would be financed for its war effort. A memo from Homer Jones, chief of the APC investigation, outlines this in recommending the seizure of UBC assets:

Said stock is held by the above named individuals, however, solely as nominees for the Bank voor Handel, Rotterdam, Holland, which is owned by one or more of the Thyssen family, nationals of Germany and Hungary. The 4,000 shares hereinbefore set out are therefore beneficially owned and held for the interests of enemy nationals, and are vestible by the APC (September 1942)

Despite Jones’ recommendation, UBC remained intact and the assets were returned to American shareholders by the end of the war. No further action was ever taken, nor did the investigation continue, a shocking decision given that the investigation had established a link between an American bank acting as an intermediary for the Thyssen family’s enterprises, eight months after America had entered the war. Further suspicions have been raised by documents unveiled at the National Archives on the connections between Prescott Bush, Thyssen, and the Consolidated Silesian Steel Company (CSSC), which played an integral role in the use of slave labor at Auschwitz. The Archives show not only that the Brown Brothers Harriman firm owned shares in the CSSC during the 1930s, but that fellow BBH partner Knight Woolley had recommended consulting the prestigious law firm Sullivan and Cromwell to “deal with the issue of Polish laborers” in their offshore business. In his letter, he says he consulted John Foster Dulles, a partner at S&C and future Secretary of State under Dwight Eisenhower. Beyond historians’ and journalists’ own speculation, no legal action would ever address the role American financial and legal institutions played in Nazi Germany, or their efforts to cover it up.

Closing Thoughts

The story of George W. Bush’s grandfather is far from atypical; it fits a larger pattern of US enterprises that had no qualms about doing business after the Third Reich emerged. Ford, GM, Alcoa, ITT, Standard Oil, General Electric, Dupont, Kodak, Pratt and Whitney, Douglas, Westinghouse, United Fruit Company, International Harvester, and Singer all traded with Germany up until 1941. The “Arsenal of Democracy” that Roosevelt touted mysteriously found its way into the hands of the Wehrmacht, in no small part due to continued contributions from the German subsidiaries of Ford and GM. Other financial institutions, most notably the Swiss Bank for International Settlements, Morgan Bank, and Chase Bank, would all cloak their dealings with Germany during this time, allowing continued access to wealth and assets that would normally have been held in blocked accounts.

Collaboration with the Nazis is considered a taboo subject, often reserved for discussions on the fringes of historical discourse due to its unsettling implications, and embraced by conspiracy theorists who seek to discredit existing facts. The most recent stories regarding action against Germany’s financiers re-emerged in the late 1990s and early 2000s, when Holocaust survivors sued Swiss and American corporations for their complicity and the theft of Jewish wealth. A prominent case was renewed in 2001 by two Holocaust survivors, seeking to use the evidence of material benefit that American businesses gained from the Auschwitz labor camps in a class-action lawsuit. Curiously, the case was thrown out by a judge on the grounds that the United States government could not be sued on the principle of “state sovereignty”; a concept that seems to benefit the US when it is held to account, but vanishes when it interferes in another nation’s affairs. The discussion of collaboration with fascist governments is treated largely as a relic of the past, with few recognizing the present-day parallels in the neo-fascist governments that have taken hold, from Brazil to Turkey to the Philippines. History’s value as a tool of hindsight has been attested by more accredited scholars, so we must ask why our own memory of Nazi collaboration has been obfuscated. Despite the liberal values that America professes, when our industrial class is confronted with a horrid regime that promises union-busting, privatization, and the unfettered flow of capital: Money will bleed for Fascism.


PS: Expect an audio version of this article up soon!

September 11th has become a day synonymous with collective anguish and shattered morale. It brought a country to its knees, the blow delivered from the phantom knife of a foreign power. The fatal wound was the last, but not the first strike against this nation. When a popularly elected leader began to assert the sovereignty of his people, their right to create a society that fulfilled the Jeffersonian Trinity, forces in the shadows began to conspire against them. Economic bludgeoning, assassinations, and ultimately a coup violently ended 41 years of peaceful democratic rule. As the dictator Augusto Pinochet strode amidst the rubble to rule with an iron fist, he welcomed with open arms the dutiful advice of those foreign whisperers, the architects of political and economic chaos: the United States. On September 11, 1973, Chile would forever change, becoming another pawn in America’s ever-expanding network of regime change.

By Any Means Necessary 

This visceral anecdote is a painful demonstration of the cost that the United States has exacted from the many nations who refuse to acknowledge its preeminence in global affairs. Never mind hostile adversaries to the country; the words of George W. Bush, on the eve of the invasion of Afghanistan, make the US’s position abundantly clear: “You are either with us, or you are with the terrorists.” No matter the enemy, America’s exceptionalism demands loyalty. These words echo the foundation of regime change, the tactic of reshaping a foreign government to match the preferences of one’s own, which crept out of the shroud of the Cold War. The bipolar world that Harry Truman’s administration had helped shape was further mired by a noxious cloud of anti-Soviet paranoia.

There was no doubt that the Soviet Union was, in and of itself, a repressive regime; not only to its own people but to its many proxies and satellites, whose governments were swiftly toppled and replaced with loyal leadership should they fall out of line. But there is a fundamental misunderstanding that the United States entered the Cold War because of the widespread aggression of its communist adversary. Initially, the reverse was true: the Soviets made clear their willingness to accept governments friendly to Western powers, until the West began to use its resources to jeopardize the security of these countries. The burden of responsibility falls on the sole superpower with not only the economic capacity to bend nations to its will, but the unholy might to turn civilizations into nuclear wastelands. The terms of the Cold War, it is safe to say, were written by the United States, and consecrated into scripture by the Truman Doctrine. According to these texts, a country did not even need to be aligned with the Soviet Union to be considered a threat to US national security. The nationalization of private industries, the empowerment of the disenfranchised, or a commitment to nonalignment were acts of defiance the United States could not tolerate.

In Theory and In Practice

While the desire for hegemonic dominance or the projection of military might motivates the US, there are other strategic reasons states pursue regime change. Lindsey O’Rourke’s insights from her book Covert Regime Change: America’s Secret Cold War offer a perspective on the considerations made when wishing to mold foreign governments in one’s favor. O’Rourke identifies that 1) the dispute between two countries must be based on irreconcilable, chronic divergences on matters of national security, and 2) the intervening state must have an alternative regime in mind when attempting to overthrow a government. Furthermore, the modes of intervention often diverge between overt and covert action. Covert operations often proved less costly, and should a mission go awry, the intervening government could claim plausible deniability. The US was hardly successful in hiding its role in foreign interventions but nevertheless preferred this strategy when pursuing regime change. Beyond O’Rourke’s theory is the implication that the United States resorted to overthrowing democratically elected governments whose interests did not align with the American status quo. As time would demonstrate, this hardly unsettled the dual nature of US foreign policy: espousing freedom while delivering despotism.

This tone is set in the early years of the Cold War, where the phrase “democratically elected, militarily overthrown” becomes the anthem of U.S. intervention. To take a few examples:

Iran, 1953: Democratically elected Prime Minister Mohammed Mosaddegh attempts to rein in British control of Iran’s oil reserves. The British enlisted the help of America’s Central Intelligence Agency, toppling Mosaddegh and installing the authoritarian Shah of Iran. He ruled until 1979, when the Iranian Revolution saw the overthrow of the Shah and the rise of a regime hostile to American interests.

Guatemala, 1954: Democratically elected President Jacobo Árbenz engages in a land reform program to expropriate (with compensation) unused acres held by the American United Fruit Company, providing new economic opportunities for the country’s impoverished. The CIA removes Árbenz and props up authoritarian leader Carlos Castillo Armas, inciting a generation of repression and violence under Guatemala’s military junta.

Iraq, 1959: Prime Minister Abd al-Karim Qasim, having overthrown the monarchy, institutes agrarian land reform, pursues socially liberal policies, and lifts the ban on the Iraqi Communist Party. At the behest of the Egyptian government and Iraq’s Ba’ath party, the CIA sanctions an assassination attempt on Qasim. The attempt fails, but a young Ba’athist named Saddam Hussein garners widespread exposure for it, and he uses his infamy to channel a strongman persona years later as President of Iraq.

Some seventy-two coups were undertaken in all. Whether for the overt purpose of crushing a popularly elected leftist government, or for the more covert interests of American business, all became equal under the pretext of foreign intervention. The cruelty of the men who enter that power vacuum need not concern the United States, its leaders assure themselves. The School of the Americas was in fact established to engender this cruelty, training thousands of Latin American military officers in the arts of torture and assassination. The School would become synonymous with paramilitary death squads in the years that followed; with every murder of a Jesuit priest or a political dissident, they would wash their hands of blood and responsibility.


A cursory history of the School of the Americas and its involvement in Latin America. Enhanced image here. Credit: School of the Americas Watch

Collective Amnesia 

The decades of animosity simmering across the Global South; the sustained blowback that has spawned new adversaries in place of old ones: surely this litany of consequences would throw cold water on US decision-makers. Does the cost merit the risks they seem so eager to wager? The only reply is the tireless shuffle of faceless suits marching in and out of the revolving doors of the military-industrial complex; the seamless transition from bureaucrat to business leader to contractor and back again. There must be a vital recognition that this is not a position reserved for merely one side of the political aisle. To be sure, the administrations of both parties, often with the bipartisan benediction of Congress, prove how regime change is an orthodoxy widely accepted in Washington. Republicans stir up jingoistic and nationalist rhetoric, while Democrats decry the human rights abuses of strongmen with oil reserves. Perhaps the collective apathy of not only our leaders but the broader public is a consequence of a culture that feels ever less connection to overseas adventures.

Those who would beat the drums of war the loudest often have little to lose should the mission go awry; the careers of men like Dick Cheney or Donald Rumsfeld display an almost callous comfort with sending the sons and daughters of other Americans into harm’s way, but never their own. Beyond elite circles, however, the apathy towards foreign policy could be a consequence of 1974. Former West Point Major Danny Sjursen opined in an interview that with the dissolution of compulsory military service, Americans’ capacity to care about the consequences of disastrous foreign interventions declined. Save for stalwart voices in the halls of Congress, or sporadic conclaves of anti-war dissent, the public’s apathy is counted on by the architects of regime change, as the mechanisms of policy and military operations continue to fall behind a veil of secrecy. So long as transparency and accountability remain synonymous with weakness and vulnerability, the engines of national security will stay obfuscated.

Closing Thoughts

When any nation goes to an extreme degree to protect itself, it is inevitable that that protection will never be seen, psychologically, to be enough. It is also often true that the image of the enemy will grow in proportion to the size of the defense, resulting in an overreaction and an over-expenditure of energies to liquidate a fear that never seems to erode. Fear, whether fear of weakness or even fear of death, is a cultural nerve too raw to touch. Reinforced with weapons of war abroad and barbed wire fences at home, it is a reflection of our inability to engage with the finality of our lives, perhaps to escape the recognition of our fragility: that we are a nation that extols the virtues of freedom, liberty, and equality, yet cannot bear the thought of another nation declaring its own sovereignty. That a nation may challenge, let alone question, America’s unique and indispensable status in the world might shatter our image at home and abroad, the architects of war would say. Foreign intervention, and the tools used to enact it, can only be dismantled from within. We owe it to ourselves and to the victims we have left in our wake to make intervention not only a sin of our past but the point from which we start a new chapter. This discussion sparks a wider critique of the formation of foreign policy that merits its own inquiry. Such an endeavor demands a whole-of-society approach: nothing less than a fundamental restructuring of how we educate the public, how elites consult public opinion in the decision-making process, and ultimately how we hold elites accountable when decisions prove disastrous. Then and only then will the United States come to terms with the facsimile we have presented to the world: the mask of freedom and democracy, worn only to shield ourselves from the horror we have left in our wake.


Editor’s note: For mobile users, the full infographic on the School of the Americas may not show up. Click here for a high-res version of it.

As I renew my attempt to start consistently making use of the blog again, I wanted to begin the new year with a broad overview of sorts. Initially I debated the idea of centering this piece on whether “2019 will be 1939”, alluding to the tumultuous geopolitical climate and some of the parallels to today. An idea worth exploring, to be sure, but one I think I’ll address in a more nuanced way down the road. For now, I’d like to look back on the last year in domestic U.S. politics, what electoral and strategic victories were scored (namely for the Left), and what challenges are in store for this year if the Left is going to make these victories count.

2018 Midterm Elections

November 6th, to the fanfare of pundits on both the (center) left and right, was predicted to bring “The Blue Wave.” While not the landslide some were anticipating, it brought the House, some governorships, and a handful of state legislatures firmly under the control of the Democratic Party. The victories of many genuine progressives and, dare I say, democratic socialists should not be underestimated. Congresswomen Alexandria Ocasio-Cortez (NY-14), Ilhan Omar (MN-5), Ayanna Pressley (MA-7), and Rashida Tlaib (MI-13), among many others, proved to be huge gains for representation in a variety of ways. While there weren’t many progressive newcomers welcomed into the Senate, we can still expect much in the way of policy from Senators Bernie Sanders, Elizabeth Warren, Sherrod Brown, Chris Murphy, and Jeff Merkley, to name a few. Lastly, it cannot be overemphasized how refreshing it is simply to have Republicans vacate key seats, regardless of who their challenger was: Kris Kobach in Kansas, Scott Walker in Wisconsin, Bruce Rauner in Illinois, and so on.

A line should be drawn, however, between recognizing the notable gains of the Democratic Party and what they mean for the overall voter base. Matt Karp wrote in the aftermath of the elections that “Democrats’ key victories were owed to (…) white college graduates in the suburbs,” highlighting that “The geographic diversity of these victories should not disguise their economic homogeneity.” Despite the short-term gains made by pivoting towards the suburbs, the strategy’s long-term feasibility does not bode well for those on the Left. There is still much to be gained by running more progressive campaigns in the more economically distressed areas of the country, as the party struggles to define its policies on deep-seated inequalities.

The dynamic between the coastal elites and the heartland of flyover states has not been dismantled in this latest election cycle, and congressional Democrats need to reformulate their strategy if they hope to capture a plurality of this key base. This is as much about emancipating the working class through a platform more representative of their needs as it is about listening to the call of the voters themselves. Even a casual glance at various ballot initiatives and referendums reflects this sentiment. Anti-gerrymandering initiatives were passed in Colorado, Michigan, Missouri, and Utah; Medicaid was expanded in Idaho, Nebraska, and Utah; affordable housing and rent control scored major victories in cities like Chicago and Austin. While only a sample of the many other progressively minded ballot initiatives, these results reveal that even in traditionally Republican-held states, voters are combating the institutionalized barriers that have tried to stifle their vote, cut back their access to healthcare, and gentrify their cities. In the interests of its more prosperous constituency, the Democratic Party is ceding these opportunities to recapture a true ‘Blue Wave,’ leaving the door open for fake populists and demagogues to remain competitive in 2020.

The Teachers’ Strikes

The wave of educational workers’ strikes that characterized much of the first half of the year also yielded many lessons that shouldn’t be forgotten. For those unfamiliar, the strikes took place primarily in West Virginia, Oklahoma, Arizona, and Georgia, with more localized strikes in Colorado, Kentucky, North Carolina, and Virginia, all sharing many core features. Budget cuts, low wages and salaries, low per-pupil spending, and, in many states, right-to-work laws and school choice and voucher systems all culminated in a boiling pot of austerity measures that was met with fierce backlash in these traditionally Republican-held states. The most prominent success proved to be West Virginia, where strikes were held statewide and marches on the state capitol were organized; coupled with some highly publicized confrontations with the governor, all of this yielded significant benefits. While the teachers didn’t have all of their demands met, they were able to secure a 5% pay raise, and their victory galvanized other workers nationwide to stage their own protests. Tactics such as these, especially emerging from rank-and-file union members, might prove a crucial step towards a resurgence in the labor movement that arguably hasn’t been seen in 20 years.

But the horizon for labor must be viewed with trepidation, especially in light of the Supreme Court’s decision in Janus v. AFSCME, a case that on its surface determined that union fees collected from non-union members in the public sector violated the First Amendment. In her dissent, Justice Elena Kagan emphasized the way in which this implicates worker choice in workplace governance, saying the First Amendment has been weaponized so that “judges, now and in the future, (can) intervene in economic and regulatory policy.” The likelihood that judges with an anti-labor bent will use this as precedent to further curtail the power of public sector unions cannot be ignored. In light of this, labor activists must not narrow the scope of their solidarity by excluding ‘free riders.’ As Chris Brooks notes, this model proved devastating to public employees in Tennessee, who became increasingly divided and weakened, unable to collectively bargain under so small a banner. This only underscores the urgency for public sector workers, and their private sector counterparts in solidarity, to keep mobilizing, even if the odds are stacked against them.

The Road Ahead

I hope this piece leaves you with a sense of optimism about what is possible, but clear-eyed and resolved on what must be done in 2019. Far from an in-depth review of 2018, this cursory glance at the current political climate leaves many gaps I intend to cover in the future. One thing is clear, though: for this year, and many years after, the Left needs to capitalize on the gains it has made and entrench them. Concurrent with electoral politics, labor movements fighting for a better standard of living in the reddest of states have proven their merits. If such tactics, imbued with progressive ideals, can thrive in areas rife with voter suppression, then running candidates who match this vigor is mandatory. Many of these candidates did not come to Washington to move back to the center, compromising with their esteemed colleagues on ‘sensible proposals’; indeed, many progressive members of Congress are not afraid to show their bona fides at the time of this writing. And it is perhaps our last hope that this energetic and unapologetic approach to politics, the kind of understanding of power and how to leverage it that has made the Republican Party so successful, is what the Democratic Party needs to stay true to its mandate as the party of the people.



While Silicon Valley has become an undoubted source of innovation and technological progress, recent scandals in the operations of companies like Uber and Facebook have shown that there is most certainly more to its identity beneath the surface. Much like the financiers of Wall Street, or the glut of oil tycoons at the dawn of the 20th century, these are a sacred class: standard-bearers of the free market and symbols of rugged individualism, fulfilling the maxim that any American could reach such heights of success with a bold vision and the grit to back it. Just as those past industries came under the scrutiny of citizens and governments alike, the tech industry is due for its time under a more watchful eye.

“The Epitome of a Stanford-Fueled Startup”

To best articulate a few of the tech industry's defects, it's useful to look at the curious rise and fiery fall of what the Wall Street Journal called "the epitome of a Stanford-fueled startup": Clinkle. The company, founded in 2011 by 19-year-old Stanford student Lucas Duplan, attracted peers and faculty alike, with many leaving the school to work on the startup. Coupled with a highly successful round of funding from venture capitalists, as well as the blessing of many heavyweights in Silicon Valley, Clinkle seemed well on its way to joining the ranks of other household names. There was one problem, however: no one quite knew what "Clinkle" was. Two years of signups and promises of beta testing had amounted to, by September 2014, a phone app that was little more than a glorified digital wallet, with functionality that mimicked existing apps. It didn't help that while the company was in a protracted "stealth mode", it quietly laid off 25% of its employees. Ex-Clinkle workers anonymously reported how the young CEO lied to employees about their stock option pool, paid his staff far below the market rate while he took a six-figure salary, and kept a product launch date that had been in constant flux since the company's founding. Duplan's penchant for extravagant displays of wealth and absurd promotional stunts ultimately spelled the company's demise, with investors demanding a return of funds by 2016 from the wreckage of the tech world's darling startup.

It would be easy to take away from this story only the shortcomings of a young CEO, and indeed, much of the coverage of Clinkle at the time turned Duplan into a pariah. But the startup had attracted major support in the industry, from the prominent tech venture capital firm Andreessen Horowitz to entrepreneurs like Peter Thiel. If a company under their watch could implode, it should give pause to those who praise tech startup culture, and perhaps call into question how exactly startups receive funding. Beyond these concerns are others that Clinkle exemplified: employees encouraged to make sacrifices and take lower wages while CEOs raised their own salaries; tech companies "disrupting" existing industries as a euphemism for circumventing existing laws; and, at the heart of the matter, how Silicon Valley is redefining the relationship between capital and its consumers.

Live Fast, Burn Out Young

Joining the ranks of the techies populating the land from Palo Alto to San Francisco is by no means a cheap affair. As author Corey Pein reveals in Live Work Work Work Die, the life of a young tech worker is often characterized by Airbnb rentals, shared Uber commutes, and obscene work hours. In this industry, there is an enigmatic charm to the idea of working well into the night and spilling over into the weekends. Blake Robbins, a former Google intern turned tech investor, offered his experiences in the tech world in a series of tweets. On the 'cool' factor of working longer, Robbins says:

“Not hanging with friends and family because you’re working isn’t ‘cool.’ Burning out isn’t ‘cool.’ I promise you…your competition isn’t beating you because they are working more hours than you. It’s because they are working smarter.”

Many former interns and tech workers have echoed this sentiment, with the idea of personal sacrifice and laborious hours being integral to the identity of the Silicon Valley work ethic. One can find the most devoted followers of the tech industry recounting stories of idols like Elon Musk, who kept a sleeping bag near the Tesla production line, as proof that their own tireless efforts and sleepless nights will pay off. Nor does it seem to deter many of these aspiring entrepreneurs that the failure rate often touted for startups hovers around 90% (recent studies show it to be closer to 79%). The figure instead serves as a sort of mantra to defy the odds, and a justification for the failure of those who haven't yet made it. Rather than prompting introspection, it dismisses bad behaviors and practices, and misleads startup founders about what characterizes success in this industry. It is often the presentation of an idea, rather than the technical merits of the product being pitched, that entices venture capitalists' approval; marketing and returns trump functionality in this self-professed nerd meritocracy.

Beyond the world of venture capital and Mark Zuckerberg acolytes, factory and labor workers for these companies don't fare much better. Amazon's practices regarding its warehouse employees have brought this concern to the forefront in recent years; for one, Amazon doesn't consider them its employees. Many of the hires are brought on via partnerships with temp firms like Integrity Staffing Solutions, allowing Amazon to avoid maintaining a permanent workforce that expects raises, good benefits, and workers' compensation. This also allows a measure of control against worker disgruntlement: under these conditions, temp workers cannot organize into unions, given the flux in which the company rotates its workforce. Amazon put so much stock in keeping an orderly watch over its workers' performance that its Germany facility used HESS, a security subcontractor with numerous ties to neo-Nazi organizations, as a means to intimidate and suppress the warehouse's foreign workers. Granted, practices as overt as these appear to be few and far between, yet they remain emblematic of an industry whose attitude toward laborers is akin to that of Gilded-Age factory bosses.

Disrupting Laws and Challenging Norms

Silicon Valley and the tech industry at large pride themselves on companies that can "disrupt" an existing market, industry, or trend. What does disruption actually mean, though? Those who throw the word around in their market pitches and high-production advertisements envision themselves as forerunners rewriting the rules of any given market, challenging the ways incumbents operate and undercutting them through cheaper, simpler, or more "innovative" means. But practically speaking, as Judith Shulevitz explains, these disrupters believe in a kind of free-market absolutism that goes hand in hand with technological growth: existing institutions, particularly publicly funded ones, operate inefficiently; tech entrepreneurs can do it better, and for cheaper, to the benefit of the customer.

Convenience has been a driving factor in the success stories of companies like Uber, Netflix, and Airbnb. The cost of such efficiency, namely in Uber's case, has been paid through legal loopholes and aggressive lobbying campaigns. When it emerged in 2010 as UberCab, it billed itself as "a one-click service to hire licensed, professional drivers". This was, of course, false on both counts: the company was not licensed, and it hired drivers whose sole qualification was owning a car. Beyond misleading advertisements, the company also designated its drivers as "independent contractors", allowing it to avoid minimum wage laws, payroll taxes, health insurance, and other legal benefits. In spite of these initial hurdles, it successfully bankrolled political consultants, courting local city officials and other state actors to legitimize the company and legalize the service. Uber, like many companies before it, was using the time-honored strategy that Corey Pein sardonically describes as "break laws first, buy influence later".

Upend the Disruption

It's important to take this collection of company horror stories not as separate, unrelated anecdotes, but collectively as part of a wider narrative. Stories of workers collapsing from relentless schedules, companies using surveillance software to evade law enforcement, and startups providing services and products that fabricate a fictional demand are mounting evidence that the world of tech thrives on boom-and-bust cycles. With the tech press fawning over the latest gadgets while tech moguls threaten censorship against critical coverage, Silicon Valley has maintained a stranglehold on its media image.

As the evidence presented here shows, coupled with the massive backlash against companies like Facebook over consumer privacy, the tides do seem to be changing. Tech investors, writer Erin Griffith notes, have become warier of investing in startups with dubious founders and bad PR. If this is the case, perhaps we won't see another Lucas Duplan for quite some time. As investor Hunter Walk comments, "It's the exact same story of too many people with too much money. That breeds arrogance, bad behavior, and jealousy, and society just loves to take it down. Tech has now become an institution." Such an institution calls for more scrutiny over a world we've taken for granted in the pursuit of economic progress and technological innovation. Much like the robber barons of the early 20th century, or the Wall Street investors behind the 2008 crash, the Elon Musks and Mark Zuckerbergs of the world are due for the same oversight and regulation that we apply to economic bubbles once deemed "too big to fail". It's time to disrupt the disruption.
