On September 21, 1945—five months after Franklin Roosevelt’s death—President Harry Truman assembled his cabinet for a meeting that one historian has called “a turning point in the American century.” The purpose of the meeting was to discuss Secretary of War Henry Stimson’s proposal to share atomic bomb information with the Soviets. Stimson, who had overseen the Manhattan Project, maintained that the only way to make the Soviets trustworthy was to trust them. In his proposal to Truman, he wrote that not sharing the bomb with the Soviets would “almost certainly stimulate feverish activity on the part of the Soviets . . . in what will in effect be a secret armament race of a rather desperate character.”

Cold War militarism achieved its own coherence and legitimacy by adopting economic logic and criteria.

Henry Wallace, the secretary of commerce and former vice president, agreed with Stimson, as did Undersecretary of State Dean Acheson (though he later changed his position), but Secretary of the Navy James Forrestal laid down the definitive opposition. “The Russians, like the Japanese,” he argued, “are essentially Oriental in their thinking, and until we have a longer record of experience with them . . . it seems doubtful that we should endeavor to buy their understanding and sympathy. We tried that once with Hitler. There are no returns on appeasement.” Forrestal, a skilled bureaucratic infighter, had made his fortune on Wall Street and frequently framed his arguments in economic terms. The bomb and the knowledge that produced it, Forrestal argued, were “the property of the American people”—control over them, like the U.S. seizure of Japan’s former Pacific Island bases, needed to be governed by the concept of “sole Trusteeship.”

Truman sided with Forrestal. Stimson retired that very same day, his swan song ignored, and Wallace, soon to be forced out of the Truman administration for his left-wing views, described the meeting as “one of the most dramatic of all cabinet meetings in my fourteen years of Washington experience.” Forrestal, meanwhile, went on to be the country’s first secretary of defense in 1947 and is the man who illustrates perhaps more than anyone else how Cold War militarism achieved its own coherence and legitimacy by adopting economic logic and criteria—that is, by envisioning military power as an independent domain of capital expenditure in the service of a political economy of freedom. From his pivotal work in logistics and procurement during World War II, to his assiduously cultivated relationships with anti–New Deal congressmen and regional business leaders sympathetic to the military, Forrestal both helped to fashion and occupied the nexus of an emerging corporate-military order. He served as defense secretary for only eighteen months (he committed suicide under suspicious circumstances in 1949), but on the day of that fateful cabinet meeting, he won the decisive battle, advocating for what he once called a state of ongoing “semi-war.”

The post–World War II rise of a U.S. military-industrial complex is well understood, but it still remains hidden in plain sight. Today warnings about Donald Trump’s assault on the “liberal international order” are commonplace; less examined is how we arrived at a point where democratic and “peacetime” governance entails a global military infrastructure of 800 U.S. military bases in more than 70 countries.

The post–World War II rise of a U.S. military-industrial complex is well understood, but it still remains hidden in plain sight.

Moreover, this infrastructure is under the command of one person, supported by a labor force numbering in the millions, and oriented to a more-or-less permanent state of war. If a politics of threat inflation and fear is one part of the answer, the other, more prosaic component is that the system itself is modeled after the scope of business and finance. By managing a diverse portfolio of assets and liabilities and identifying investment opportunities, it envisions a preeminently destructive enterprise as a series of returns calibrated to discretionary assessment of threats and a preponderance of force. This was Forrestal’s bailiwick.


A little-known anecdote about Truman’s 1947 call to Congress for decisive intervention in the Greek civil war—generally viewed as the official declaration of the Cold War—illustrates this point. Truman’s speech is famous for its emphasis on political freedom, particularly the idea of protecting peoples’ rights to self-determination against “armed minorities”—“the terrorist activities of several thousand armed men, led by communists.” “One of the primary objectives of the foreign policy of the United States,” Truman said, establishing the characteristic linkage between World War II and the Cold War, “is the creation of conditions in which we and other nations will be able to work out a way of life free from coercion. Our victory was won over countries which sought to impose their will, and their way of life, upon other nations.”

The moral and rhetorical heightening of the opposition between democracy and communism (and, incipiently, terrorism) was a conscious choice. Truman was famously advised by Republican senator Arthur Vandenberg that securing public and congressional support for unprecedented and costly peacetime intervention into European affairs entailed “scaring the hell out of the American people.” Another, less visible choice, however, was to downplay the role of the accountant’s ledger, which was more overt in an early draft of Truman’s speech. That draft argued that emergency financial support for Greece (and Turkey) was now a requirement of world capitalism: “Two great wars and an intervening world depression have weakened the [capitalist] system almost everywhere except in the United States. If, by default, we permit free enterprise to disappear in other countries of the world, the very existence of our democracy will be gravely threatened.” Acknowledging the less-than-compelling purchase of this argument, Acheson remarked derisively that it made “the whole thing sound like an investment prospectus.”

The moral and rhetorical heightening of the opposition between democracy and communism was a conscious choice.

Truman’s delivered address, by contrast, made use of the words “free” and “freedom” twenty-four times in a few minutes, as if talismanic repetition were enough to hitch the defense of private capital accumulation to the maintenance of popular democracy the world over. Yet, despite the inflated rhetoric, economic considerations remained the skeletal core of the Truman Doctrine. Buried inside the address was the acknowledged collapse of British imperial policy in the region, along with an “invitation” from a dubiously democratic, right-wing Greek government for “financial and other assistance” in support of “better public administration.” The imperatives of democracy and self-government—preeminent political values understood by the U.S. public—were subordinated to building “an economy in which a healthy democracy can flourish.” In a final nod to the bean counters, Truman noted that the amount he was requesting was a mere fraction of what the United States spent during World War II, and no less justified as “an investment in world freedom and world peace.”

The challenge for U.S. policy makers going forward was to reconcile a lofty rhetorical and moral emphasis upon the principle of political self-determination with the necessity of investing military force (i.e., “other assistance”) whose paramount end was securing the market freedoms of national and international capitalists. The teleological (and tautological) proposition that a substratum of properly capitalist economic relations organically yielded a democratic harvest would be the farmer’s almanac of a rising generation of modernization theorists. But the reality on the ground—in a world where the main provenance of self-determination was defined by the bloody rearguard defense of colonial prerogatives on the part of the United States’ most important allies and industrial partners—was bitter, and far less susceptible to universalizing nostrums. Straight-talking U.S. policy makers, particularly those at the center of the military apparatus, knew it.

The following year, for example, George Kennan, author of the “containment” doctrine, a protégé of Forrestal, and the single most influential strategic foreign policy thinker of the moment, offered a strikingly candid version of the task at hand, in a classified memo that consciously punctured the universalist ambit of the Truman Doctrine:

We have about 50% of the world’s wealth but only 6.3% of its population. This disparity is particularly great as between ourselves and the peoples of Asia. In this situation, we cannot fail to be the object of envy and resentment. Our real task in the coming period is to devise a pattern of relationships which will permit us to maintain this position of disparity without positive detriment to our security. To do so, we will have to dispense with all sentimentality and day-dreaming; and our attention will have to be concentrated everywhere on our immediate national objectives. We need not deceive ourselves that we can afford today the luxury of altruism and world-benefaction. (emphasis added)

When thinking about nations and peoples, particularly those outside of Europe, Kennan again foregrounded a logic of investment and risk management, and he advised restraint and limitation of liability, especially with respect to “the peoples of Asia . . . [who] are going to go ahead, whatever we do, with the development of their political forms and mutual interrelationships in their own way.” Kennan warned that the coming period would be neither “liberal” nor “peaceful,” and that such countries were likely to “fall, for varying periods, under the influence of Moscow, whose ideology has a greater lure for such peoples, and probably greater reality, than anything we could oppose to it . . . [or that] our people would ever willingly concede to such a purpose.” In this light, he concluded that the United States needed to dispense with commitments, rhetorical and otherwise, to “unreal objectives such as human rights, the raising of living standards, and democratization. The day is not far off when we are going to have to deal in straight power concepts.”

Despite the rhetoric of ‘freedom,’ economic considerations remained the skeletal core of the Truman Doctrine.

This view is sometimes depicted as an exemplary instance of realism—wiser and more in tune with the messy, uneven world that emerged from World War II—and a point of view that, had it been heeded, might have prevented the costly overreach of global cold war, especially “blunders” such as the Vietnam War (which Kennan, long retired to academia, opposed). The concept of realism, however, fails to grasp the functional logic of risk and threat assessment—the insistent and anxious hedging and speculation that made the careers and fortunes of Kennan, Forrestal, and many who followed them. Forrestal fretted obsessively in his diary along these lines: “I am more impressed than ever as things develop in the world today that policy may be frequently shaped by events unless someone has a strong and clear mental grasp of events; strong enough and clear enough so that he is able to shape policy rather than letting it be developed by accidents.” This recurrent epistemic anxiety initiated an insistent demand for anticipatory policy, abiding mistrust, and the maintenance of a preponderance of force. As Forrestal bluntly put it, “Power is needed until we are sure of the reign of law.”

Despite his long period of service within a New Deal liberal political milieu, Forrestal (like Kennan) was uninterested in universalizing the scope of political self-determination overseas, recognizing as more pressing the preservation of a capitalist economy built on uneven development and asymmetric military power at a world scale. Electrified upon reading Kennan’s “Long Telegram” (1946), Forrestal viewed his fellow Princeton man as a kindred soul, one who had intuited similar grounds of Orientalist menace, inscrutability, and immunity to anything but the language of force in Soviet conduct. It was Forrestal who brought Kennan to Washington, D.C., from Moscow and into the policy-making apparatus; both men were solicitous toward the value of rank and privilege, tolerant of authoritarian deviations from liberal standards, and assured that freedom from coercion was the province of those who, in Kennan’s words, were already imbued with “Anglo-Saxon traditions of compromise.”

Forrestal framed his own deference to hierarchy in terms of the prerogatives of corporate capitalism—the idea that practical men of business, rather than reformers and intellectuals, had won World War II and needed to be running the world going forward. Among his more forceful conclusions was that liberal globalism would be disastrous if it were not steeled with counterrevolutionary animus. As he confided to diplomat Stanton Griffis:

Between Hitler, your friends to the east, and the intellectual muddlers who have had the throttle for the last ten years, the practical people are going to have a hell of a time getting the world out of receivership, and when the miracles are not produced the crackpots may demand another chance in which to really finish the job. At that time, it will be of greatest importance that the Democratic Party speaks for the liberals, but not for the revolutionaries.

Forrestal used corporate capitalism to frame his own deference to hierarchy: practical men of business, rather than reformers and intellectuals, had won World War II.

For these realists, even more than the wooly moralists they sometimes ridiculed, it was the credibility of U.S. threats of force that ensured the freedom and mobility of productive capital and supported its resource needs and allied interests across an ever-widening sphere. Of a more aristocratic and consciously anti-democratic mien, Kennan likewise recognized that the animating logic was not strictly anti-communist but counterrevolutionary—indeed even racial. The inevitable dissolution of the colonial system meant that the challenge of U.S. policy in the coming period was broader than the struggle with Soviet communism, as “all persons with grievances, whether economic or racial will be urged to seek redress not in mediation and compromise, but in defiant, violent struggle.” Inspired by communist appeals, “poor will be set against rich, black against white, young against old, newcomers against established residents.”


By conflating Soviet designs with those of heterogeneous movements demanding effective sovereignty and challenging material deprivation, Forrestal and his colleagues contributed to a perverse recasting of the dynamic of European colonial disintegration as the field of Soviet imperial expansion. This rhetorical and ideological frame practically demanded the militarization of U.S. foreign policy, with U.S. “counterforce” the only alternative to a world ruled by force. As such, along with Arthur Radford, Forrestal was instrumental in developing the Central Intelligence Agency (CIA), and that agency’s work soon echoed his. In 1948, for instance, a CIA document entitled “The Break-Up of Colonial Empires and Its Implications for US Security” defined expressions of “economic nationalism” and “racial antagonism” as primary sources of “friction between the colonial powers and the US on the one hand, and the states of the Near and Far East on the other.”

The CIA’s analysts suggested that poverty and a legacy of anti-colonial grievances rendered colonized and formerly colonized peoples “peculiarly susceptible to Soviet penetration” and warned that the “gravest danger” facing the United States was that decolonizing nations might fall into alignment with the USSR. At the same time, they faulted Europe’s colonial powers for their failure to satisfy “the aspirations of their dependent areas” and advised them to “devise formulae that will retain their good will as emergent or independent states.” Envisioning U.S. responsibility to author such formulae in the future, the classified brief concluded that the United States should adopt “a more positive and sympathetic attitude toward the national aspirations of these areas,” including policy that “at least partially meets their demands for economic assistance.” Otherwise “it will risk their becoming actively antagonistic toward the US,” including loss of access to previously “assured sources of raw materials, markets, and military bases.”

The rhetoric of ‘counterforce’ practically demanded the militarization of U.S. foreign policy.

While the emerging U.S. foreign policy clearly accepted the unresolvable antagonism toward the Soviet Union, the challenge of the future, as the CIA argued, was how the United States should address the “increasing fragmentation of the non-Soviet world,” or, in a word, decolonization. The means for assessing risk and reward in this expansive and heterogeneous terrain of imperial disintegration were by no means clear. But it is revealing that the possibility of alignments between decolonizing nations and Soviet power was far less concrete and worrisome to the United States than the more definite and delineated material losses faced by the United States and the colonial powers with which it had aligned itself—namely, being deprived of access to formerly “assured sources of raw materials, markets and military bases.” In other words, the challenge of the future, as Kennan had underlined, was to devise “formulae” to buttress the forms of political authority that sustained economic inequality (at a world scale) in the face of inevitable revolt and revolution against such authority and the social conditions it supported.

Despite his later misgivings, Kennan had authored the concept whose rhetorical elasticity and ideological indeterminacy proved crucial to fashioning a nemesis that suited this consciously expansionist vision of U.S. economic and military power. With the creation of the CIA, the National Security Council, and Forrestal’s own new position of secretary of defense, these years saw the growth of a national security bureaucracy that was divorced from meaningful oversight and public accountability for its actions, including myriad moral failures and calamities. A covert anti-Soviet destabilization campaign in Eastern Europe, for example, greenlit by Forrestal and Kennan, enlisted Ukrainian partisans who had worked with the Nazis. This type of activity would become routine in Latin America, Asia, and Africa, where Kennan derided respect for the “delicate fiction of sovereignty” that undeserving, “unprepared peoples” had been allowed to extend over the resources of the earth.


Over the next quarter century, fewer than 400 individuals operated the national security bureaucracy, with some enjoying decades of influence. That the top tier was dominated by white men who were Ivy League–educated lawyers, bankers, and corporate executives (often with ties to armament-related industries) lends irony to official fearmongering about armed conspiracies mounted by small groups, let alone the idea that the role of the United States was to defend free choice against coercion imposed by nonrepresentative minorities. This fact, perhaps more than any other, suggests that, as much as the Cold War represented a competition between incompatible, if by no means coeval or equally powerful, systems of rule (i.e., communist and capitalist), it was marked by convergences too. The Soviet “empire of justice” and the U.S. “empire of liberty” engaged in mimetic cross-national interventions, clandestine counter-subversive maneuvers, and forms of clientelism that were all dictated by elite, ideologically cohesive national security bureaucracies immune from popular scrutiny and democratic oversight.

Those charged with governing the controlling seat of U.S. globalism consistently doubted the compatibility of normative democratic requirements and the security challenges they envisioned, harboring a distrust that often bordered on contempt for the publics in whose name they claimed to act. “We are today in the midst of a cold war, our enemies are to be found abroad and at home,” remarked Bernard Baruch, coining the term that names this era. In this context, “the survival of the state is not a matter of law,” Acheson famously declared, an argument similar to one being advanced by former Nazi jurist Carl Schmitt. Vandenberg, echoing defenders of Roosevelt’s accretive accumulation of war powers, was positively wistful in lamenting “the heavy handicap” that the United States faced “when imperiled by an autocracy like Russia where decisions require nothing but a narrow Executive mandate.” For Forrestal, “the most dangerous spot is our own country because the people are so eager for peace and have such a distaste for war that they will grasp for any sign of a solution of a problem that has had them deeply worried.”

As much as the Cold War represented a competition between incompatible systems of rule, it was marked by convergences too.

Forrestal felt that the danger at home manifested itself most frustratingly in the threat that congressional budgeting posed to military requirements. The preservation of a state of peace was a costly proposition when it revolved around open-ended threat prevention the world over. Upholding the permanent preponderance of U.S. military power at a global scale required a new type of fiscal imagination, one that had to be funded by the future promise of tax receipts. During his final year in office, Forrestal’s diary records in mind-numbing detail his worries about acquiring Pentagon funding adequate to his projections for global military reach. In Forrestal’s view, budgetary considerations were captive to the wrong baseline of “peak of war danger” and combatting “aggression” rather than to “maintenance of a permanent state of adequate military preparation.”

A fascinating aspect of these budget wrangles is Forrestal’s manic efforts to translate future-oriented geostrategic needs into precise dollar values. Just months before his forced retirement and eventual suicide, he confided to Walter G. Andrews:

Our biggest headache at the moment, of course, is the budget. The President has set the ceiling at 14 billion 4 against the pared down requirements that we put in of 16 billion 9. I am frank to say, however, I have the greatest sympathy with him because he is determined not to spend more than we take in in taxes. He is a hard-money man if ever I saw one.

Despite his grudging admiration for the stolid Truman, Forrestal’s Wall Street background had left him at ease in a more speculative or liquid universe; at that precise moment, he was devising accounting gimmicks to offset near billion-dollar costs of stockpiling raw materials as a “capital item” that could be “removed from the budget.” The important point to emphasize is the relationship between two interrelated forms of speculation and accounting—economic and military—in which an absolute inflation of threats tempted a final break with lingering hard-money orthodoxies and a turn to deficit spending. Forrestal did not live to see the breakthrough, but his work paid off.

As Acheson described it, the Korean War—the first hot war of the Cold War era—“saved” the fledgling national security state. With its outbreak, the dream of eternal military liquidity was realized when Leon Keyserling, the liberal economist serving as Truman’s chairman of the Council of Economic Advisers, argued that military expenditures functioned as an economic growth engine. That theory then underpinned NSC 68, the document that justified massive U.S. defense outlays for the foreseeable future and which was authored by another Forrestal protégé, Paul Nitze. By yoking dramatically increased federal spending to security prerogatives, military Keynesianism thus secured a permanent augmentation of U.S. state capacity no longer achievable under appeals to Keynesianism alone.


The embedding of the global priorities of a national security state, which sometimes appears inevitable in retrospect, was by no means assured in the years leading up to the Korean War. It was challenged by uncooperative allies, a war-weary or recalcitrant U.S. public, and politicians who were willing to cede U.S. military primacy and security prerogatives in the name of international cooperation. But by 1947, men such as Forrestal had laid the groundwork for rejecting the Rooseveltian internationalist inheritance, arguing it was necessary to “accept the fact that the concept of one world upon which the United Nations was based is no longer valid and that we are in political fact facing a division into two worlds.” Although the militarization of U.S. policy is often understood to have been reactive and conditioned by threats from the outside, his ruminations illustrate how militarized globalism was actively conceived as anticipatory policy (in advance of direct confrontations with the Soviet Union) by just a few architects and defense intellectuals—men under whose sway we continue to live and die.

The Cold War says more about how U.S. elites imagined their ‘freedom’ than it does about enabling other people to be free.

Ultimately, the declaration of the Cold War says more about how these U.S. elites represented and imagined their “freedom” and envisioned the wider world as a domain for their own discretionary action and accumulation than it does about enabling other people to be free, let alone shaping the terms of a durable and peaceful international order. As early as 1946, Forrestal began taking important businessmen on tours of the wreckage of Pacific Island battlegrounds, which also happened to be future sites for U.S. nuclear testing. Forrestal described these ventures as “an effort to provide long-term insurance against the disarmament wave, the shadows of which I can already see peeping over the horizon.” The future of the bomb and the empire of bases were already on his mind.

Forrestal recognized that force and threat are always fungible things to be leveraged in the service of the reality that truly interested him, the reality made by men who own the future. For those of his cast of mind, “international order” was never more than the fig leaf of wealth and power. As he noted in a 1948 letter to Hanson Baldwin of the New York Times: “It has long been one of my strongly held beliefs that the word ‘security’ ought to be stricken from the language, and the word ‘risk’ substituted. I came to that conclusion out of my own business experience.” It was the job, after all, of these East Coast lawyers and moneymen to make sure all bets were hedged, and Forrestal knew that speculation could turn into “an investment gone bad.” As a leading investor in the Cold War project, he wanted a guaranteed return, even if the rule of law never arrived and even when the price was ruin.