- Industry: Printing & publishing
- Number of terms: 1330
- Number of blossaries: 0
- Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
Although males aged between eighteen and twenty-five are required to register for possible conscription in the US, there has been no compulsory military service since 1973. Even before that date, the ideal of a voluntary army and exemptions for those with families or other reasons not to fight had precluded the idea of universal service.
Moreover, conscription has been a fierce battleground for issues of patriotism and independence, especially in the era of the Vietnam War.
Colonial practices of universal white male military readiness were replaced in the nineteenth century by a voluntary military, although wartime conscription was instituted by both the North and the South in the Civil War. The Selective Service Act of 1917 authorized wartime conscription, but lapsed thereafter. The first peacetime draft, in 1940, accompanied preparation for the Second World War, but it too lapsed afterwards, in 1947.
Reconstituted in 1948 as a response to dwindling interest in the military, however, the Selective Service was extended for war and peacetime needs in 1950, 1951, 1955, 1959, 1963 and 1967.
As protest erupted against the Vietnam War, the draft became a special focus of attention. On the one hand, skillful use of exemptions (especially for college) allowed middle- and upper-class white males to avoid service: Bill Clinton was later accused of dodging the draft, while George W. Bush served in the Texas National Guard. Other opponents of the war favored direct resistance: burning draft cards, storming induction centers or leaving the country. The burden of military service thus fell on poor and minority inductees, for whom the lottery ranking birthdates from 1 to 365 (or 366 in leap years) evoked the specter of Shirley Jackson’s “The Lottery.”
Inductions ended in 1973 and registration ended a year later. Since 1980, males have been required to register, but military actions have been conducted by all-volunteer forces.
Industry:Culture
Although many Americans consider this an abstract field, the study of language and meaning has, in fact, become a center for interdisciplinary interests and discussion. Its reach includes fields of communication and information processing linked to artificial intelligence and computers, questions of cognition and development from psychology and education, questions of meaning and truth from philosophy, issues of style and voice in literature, and questions of society and culture. In all these, American scholars and thinkers have made important contributions to our understanding of language in general, as well as of its roles within American culture.
Indeed, from the early days of the American nation, scholars confronted questions of what language and identity meant, even if only to establish American standards for lexicon (Noah Webster) or education. Descriptive linguists later confronted the unique linguistic heritage of Native American tribes, documenting their languages in some cases as they disappeared. Anthropologists Edward Sapir and Benjamin Whorf also drew important contrasts between the worldviews embodied in these languages and those of Western Europe. Other approaches stressed function and relation among parts of language, while American linguistics learned from European structural linguistics in the 1930s and 1940s.
Noam Chomsky fostered a revolution in American linguistics in the 1950s and 1960s with transformational-generative grammar, which has thrived as both tool and theoretical arena. His emphases on individual effort, mechanisms of production, and grammaticality as accepted speech all evoke themes of American individualism and community. Some critics have argued that his concerns with free will and truth also intersect with his political activism.
Since the 1970s, linguists have also explored relations of language, society, meaning and change. William Labov used sociological methods to document social divisions and changes in language; he also worked with Black English formations as variations or alternatives to Standard English, a theme that has drawn conflicting attention from scholars, educators and families (see Rickford 1999). John Gumperz examined the strategic construction of conversations. Others have focused on the politics and ideologies of language, the nature and changes of bilingual communities, and critical relations of language, gender and power. The last has reached a wide popular audience in self-help books and works by Deborah Tannen, as well as feminist and gay critics.
Recent work can be reviewed in central journals of the field, including Language and Language in Society.
Industry:Culture
Although medical alteration of the body tackles congenital problems and the results of accidents and violence, molding the body towards beauty stereotypes underpins plastic surgery’s gender and age configuration. Moreover, since most surgery is elective (uninsured), the field is associated with the wealth of both practitioners and clients. Yet, Americans undergo nearly 1 million such operations annually; females constitute 89 percent of all clients. While breast implants have been associated with severe hazards, they remain popular alongside face lifts; baby-boom males focus on abdomens and chests. Most plastic surgeons also are male, evoking a Pygmalion syndrome within the profession and its depictions.
Industry:Culture
Although more than 10 million Americans claim French ancestry (more than Poles or Native Americans), this pales beside the claims of German and Irish ethnics. Franco-Americans are also divided by their historical experiences of coming to the US, directly or through other colonial possessions. Yet relations of international history, culture and class tend to separate these immigrants from general American principles of ethnicity.
Perhaps no other European group is so closely identified with its homeland rather than with an American ethnic identity. France has a strong historical image in the US as a co-revolutionary ally, from Lafayette and the American Revolution to shared struggles in the World Wars. The Statue of Liberty, consummate symbol of the US, is, after all, a centennial gift from France. France has also been a model of elegance for fashion, architecture and style, and an image of sensuality from Bernhardt to Bardot. While hordes of American tourists and expatriates, and mass cultural marketing—movies and McDonald’s—may alarm French cultural critics (see American images abroad), the two countries maintain a close, continuing relationship on many levels which seems to overshadow the immigrant experience.
Direct migration from France to the US was also early and especially evident in the formative periods of the American Republic, when France provided leadership and service in American Catholicism and important commentaries on the American dream through de Tocqueville and Crèvecoeur. This migration was swamped by larger waves from other parts of Europe and by later global immigration. French migrants (or claimants to “Frenchness”) are often associated with positions of education, art, fashion or other high cultural realms; this might be exemplified in Jacqueline Bouvier Kennedy as a public citizen of French descent. Between 1941 and 1990, however, French immigration actually surpassed legal Irish immigration.
French-American citizenship has also been constituted through secondary migrations and expansion. Early Huguenot settlers in South Carolina, for example, were joined by French West Indian planters escaping unrest in Hispaniola. Cajuns and Creoles in Louisiana, meanwhile, were incorporated into the United States through the acquisition of this former French colony.
French Canadians, an ancestry claimed by 2–5 million Americans, represent a special secondary migration in many New England areas where families sought industrial opportunities in periods of underdevelopment in contiguous Quebec. These immigrants built strong settlements around churches and families. Other French Canadians have created more temporary “colonies” in the US around winter homes and retirement complexes in Florida and the Sunbelt.
In the late twentieth and early twenty-first centuries, France has actively promoted a pan-French identity of language and culture, often represented through high cultural and language activities—art shows, university programs, film festivals or the Alliance Française. Yet despite this external support—or perhaps because of it—French Americans lack the cohesive identity one associates with Germans, Irish or Italians. One interesting test point for this identity, in fact, will be the growth of Caribbean (Haitian) and African Francophones as American ethnics.
Industry:Culture
Although not governed by the Anglican Church in Great Britain, the Protestant Episcopal Church in the United States of America has remained in the Anglican communion, which has allowed the church to enjoy a richly multicultural outlook. More than half of all Anglicans are non-white, and the world church has worked to ordain local clergy, of whom Nobel Peace Prize winner Archbishop Desmond Tutu is a notable example.
In spite of their origins in the Church of England, Episcopalians had a rocky start on American shores. Most colonial settlements were started by the Puritans (Congregationalists), Baptists and Quakers, with Episcopalians settling primarily in Virginia and the coastal areas of South Carolina and Georgia. The Church of England largely ignored the spiritual needs of its colonial members, refusing to permit an American bishop and the establishment of a colonial church organization. Over half of those who signed the Declaration of Independence were Episcopalians, but the church suffered during and after the Revolutionary War because many of the clergy were British loyalists.
During the nineteenth century, Episcopalians remained largely on the East Coast, unlike Methodists and Baptists, who followed waves of settlers into the interior of the country. The Episcopal Church ordained women as deaconesses starting in 1855, but did not officially sanction the ordination of women as priests and bishops until 1976. Ordained men and women may both marry. The Episcopal Church made few attempts to include minorities in integrated congregations or in the church’s hierarchy, but in the 1950s African American men were enrolled in Episcopal seminaries. A black woman, Barbara Harris, was consecrated as the first female bishop in 1989. Although Episcopalians of all races now worship together, conservative members have resisted female clergy, and some Episcopalians have converted to the more traditionally patriarchal Eastern Orthodox and Roman Catholic faiths. Conversely, roughly half of all US Episcopalians were not raised in that faith. Controversy over the ordination of homosexual clergy and the marriage of gay parishioners has also caused internal turmoil for the church. While some congregations forbid openly gay clergy, others accept celibate gay clergy, and a few welcome gay clergy who live in committed partnerships.
Three pillars sustain Episcopalianism: tradition, faith and reason. Sunday and weekday services link Episcopalianism to the beginnings of Christianity, despite the church’s sixteenth-century break from Roman Catholicism and papal authority. Religious services, rites and doctrines are published in The Book of Common Prayer, which was updated in 1979. Episcopalians may develop new services, such as rites for healing, and incorporate rites from other Anglican communion prayer books. Some disagreements are solved by having two or more forms of services and prayers, from which individual parishes may choose. The Bible is not considered the literal word of God, and Episcopalians are encouraged to use reason in their interpretation of the Bible.
Despite its small numbers (only 2.5 million Americans consider themselves Episcopalian), the church has often taken on the responsibilities of a national church, as evidenced by the seventy-five-year effort to construct the National Cathedral in Washington, DC. Open to persons of all faiths, the cathedral was where Martin Luther King, Jr. preached his last Sunday sermon.
Industry:Culture
Although Swedes participated in the colonial settlement of the Mid-Atlantic, most Scandinavian immigrants arrived in the US between 1880 and 1920, including some 300,000 from Denmark, 750,000 Norwegians and 1.25 million Swedes. While smaller than some other groups, these numbers were extremely large for the populations involved—one-fifth of all Swedes lived in the US at the turn of the twentieth century.
Subsequent immigration, however, has been primarily professionals and sojourning business people; in the 1970s, emigration to Sweden (a haven for Vietnam protestors) exceeded immigration to the US.
While Danes scattered throughout the northern United States, Norwegians and Swedes often chose rural homesteads in the upper Midwest—especially Minnesota, Wisconsin, Illinois and South Dakota. Nonetheless, in 1900 Chicago, IL was the second-largest Swedish city in the world after Stockholm. Lutheran churches, an ethnic press and societies, and a liberal populist political orientation united these communities, although all were highly assimilated by the 1930s. By this time, Swedish Americans had become American heroes (aviator Charles Lindbergh) and interpreters (poet Carl Sandburg).
As older institutions and language faded, some pan-Scandinavian associations solidified through Lutheran church mergers, social clubs and parades attuned to the new ethnicity of the 1960s and 1970s. In the 1980s, radio essayist Garrison Keillor also popularized a nostalgic yet tender vision of “bachelor Norwegian” farmers and Scandinavian American rural life in his fictional Lake Wobegon.
Industry:Culture
Although television presented a postwar door to the future, even its visionary series have faced difficulties balancing family audiences (fewer monsters), small screens (diminished special effects) and continuing characters. Hence, science-fiction television has tended to borrow plots from other genres—especially westerns and war movies—and sometimes recycled sets and props as well (evident in Star Trek or Time Tunnel, and even in later shipboard ensembles like Battlestar Galactica). Space and technological wizardry, nonetheless, have underpinned children’s programs, including the 1950s Space Patrol and Tom Corbett, Space Cadet and the animated sitcom The Jetsons (ABC, 1962–3).
Beyond science fiction, moreover, TV crime adapted sci-fi technology and formula morality to anachronistic settings and international intrigue (The Wild Wild West).
Meanwhile, a submarine framed high-tech melodrama in Voyage to the Bottom of the Sea (ABC, 1964–8), and robots and monsters enlivened a family in Lost in Space (CBS, 1965–8). The Night Stalker (ABC, 1972) even brought newsroom humor to revisit classic monsters.
Nevertheless, a few series really tested the medium and its mass audiences.
The Outer Limits (ABC, 1963–5) and Rod Serling’s eerily masterful Twilight Zone (CBS, 1958–65; 1985–7; syndicated 1987–8) both became classics, not only on the basis of their disquieting imaginations, but also through careful direction and vivid acting. Each set itself apart from ordinary television in a famous introductory sequence: The Outer Limits warned viewers not to attempt to adjust their sets, which had been taken over by unknown forces, while Twilight Zone offered a surrealist montage with Serling’s clipped voiceover. Individual episodes, as well as the series themselves, remain classics.
The 1980s and 1990s proved dry decades for science-fiction television, unlike film, although a new sci-fi cable network presented movies and older series. Alien Nation, however, extended the premise of a refugee alien population in the United States beyond the 1988 movie’s themes of race and drugs to encompass family, culture and sexuality. The closing of the millennium, moreover, coincided with The X-Files (FOX, 1993–), in which aliens and unexplained phenomena converge with paranoia about the government itself. In 1999 Matt Groening, creator of The Simpsons, launched Futurama, vaulting a New York pizza delivery boy into the year 3000.
Sci-fi TV betrays its roots in American media and culture. Only after decades did Star Trek’s universes produce female and black leadership, although the franchise offered a wider range of characters than many mainstream shows. Family, nation, consumption and morality, moreover, seem continuous, even as futuristic worlds lament twentieth-century war, pollution, racism and poverty. Perhaps the primary divide, however, lies between shows like Star Trek, which present the future as a continuation of the American way, and alternatives like Twilight Zone, Outer Limits and X-Files, which underscore an unease with the way things are and might be.
Industry:Culture
Although the legal dissolution of marriage has existed throughout American history, divorce rates increased dramatically during the late twentieth century. This resulted from many factors, including the relaxation of legislation, secularization and heightened expectations for emotional satisfaction within marriage. Only two out of 1,000 American marriages ended in divorce in 1866 (the world’s highest divorce rate at the time). By 1929 that figure had increased to one out of six; by 1990 it was nearly one out of two. Even among cultural and religious traditions that proscribe divorce, it has become increasingly accepted.
For most of American history, adultery was the only acceptable justification for divorce. In the liberal social climate of the 1970s, the “no-fault” divorce became a critical innovation, allowing couples to divorce by citing “irreconcilable differences” or an “irretrievable breakdown of the marriage.” The ease with which many people divorced caused some to argue that there has been an unprecedented breakdown in American family life—a “divorce epidemic”—with harmful effects on the children involved and on American society as a whole. Others argue that the freedom to divorce is necessary because it reduces the stigma associated with ending what would otherwise be an unhappy or perhaps seriously abusive marriage. One might note that many conservative Republicans championing traditional values—Ronald Reagan, Bob Dole and Newt Gingrich—have themselves been divorced.
Though divorce may be less stigmatized than in former times, it still creates emotional and financial distress for those involved, especially for women (who tend to lose more economically) and offspring. Divorce has been cited as one of the highest causes of stress in American life, second only to the death of a loved one. As a result, there has been much debate about what to do—if anything—about this cultural pattern. For example, in an effort to reduce divorce rates, the Louisiana legislature passed the 1997 Covenant Marriage Act, which permits each couple marrying in Louisiana to limit the legal grounds for divorce in their case to adultery, abandonment, physical or sexual abuse, felony conviction, or separation of at least two years. Others, notably feminist leaders, have protested these trends as a regression to an era of stifled choices and oppressive living arrangements.
Divorce is also commonly presented in mass media, both in television and movie narratives and in celebrity lives reported in the press. Economic, social and cultural issues of divorce, in fact, provide a continuous thread in long-running soap operas and in movies from Adam’s Rib (1949) through Kramer vs. Kramer (1979) to the darkly comedic The War of the Roses (1989) and beyond. In Hollywood, divorce can be seen as a happy ending—or a beginning.
Freedom and individuality are central to American culture, and, in some ways, the rising divorce rate reflects this emphasis on personal liberties. Predictions that rising divorce rates portend the end of marriage, however, seem greatly exaggerated.
Industry:Culture
Ambivalent category of Hollywood creators, whose role took shape with the feature film (especially sound). Over the years, Hollywood has hired top writers, from William Faulkner to Ben Hecht to John Irving, to adapt their own works and those of others. It has also produced notable talents within the industry. Yet the role of the screenwriter competes unsuccessfully in public acclaim and auteur theory with that of the director (a role some screenwriters have also assumed). Moreover, Hollywood productions may call on multiple writers and script doctors, especially in expensive and problematic projects, undercutting the vision of a single author.
Industry:Culture
America formally ushered in the “nuclear age” on August 6, 1945, when it dropped an atomic bomb on Hiroshima, Japan, to end the Second World War. Nuclear weaponry had been the American military’s long-sought goal during the war, as the government devoted over 2 billion dollars to the top-secret “Manhattan Project,” based in Los Alamos, New Mexico. At the bomb’s first test, J. Robert Oppenheimer, the controversial scientific director of the project, drew on the Bhagavad Gita and famously commented: “Now I am become Death, the destroyer of worlds.” This ominous sense of doom pervaded America’s, and ultimately the world’s, relationship to the new technology. Initially produced within a climate of warfare, nuclear energy never lost the trappings of awe and fear that surrounded its birth, despite years of attempts to reshape public perceptions.
Initially, America held sole possession of nuclear weapons and took advantage of its short-lived technological superiority to establish itself as an unchallenged international military power. When the Soviet Union successfully tested its own nuclear weapon in 1949, the arms race between what were now two superpowers began in earnest. Over the next forty years, America and the Soviet Union devoted an extraordinary amount of financial and political capital to building more, as well as more advanced, nuclear-weapons systems. In America, this arms race and the war games that detailed how such weapons might be used became the stuff of popular culture, with phrases like “first strike capability” and “MAD: Mutually Assured Destruction”—concepts that originated on the desks of military planners—entering the common vocabulary.
For many years, Americans maintained an absorbing political focus on nuclear weaponry. Generals and the public genuinely debated whether to use nuclear bombs in both the Korean War and the Vietnam War. In the 1960 election John Kennedy falsely accused the Eisenhower administration of permitting a “missile gap” to arise that benefited the Soviets. As president, Kennedy himself engaged in nuclear brinkmanship with the Soviet Union over Cuba. In 1964 Lyndon Johnson implied in a famous television commercial that, if elected, Republican opponent Barry Goldwater would cause a nuclear war. The bomb-shelter craze that swept the country in the late 1950s and early 1960s reflected this public obsession, as did the regular “nuclear attack” drills held in schools, which urged children to “duck and cover” should a bomb fall in the vicinity.
Although continuing to accelerate nuclear arms development and production, American political leaders simultaneously called for other countries—particularly the Soviet Union and the People’s Republic of China—to slow down and limit their nuclear arsenals. To that end, Kennedy, for example, proposed a nuclear test-ban treaty in 1963.
During his first term, Richard Nixon focused on a policy of détente with the Soviet Union—an attempt to cool down Cold-War tensions—and negotiated the SALT I (Strategic Arms Limitation Treaty) accord. President Reagan reversed this course in the 1980s by denouncing the SALT II agreement negotiated under the Carter administration (though never ratified by the Senate) and extensively funding research into a new, highly speculative missile-defense system to be based in Earth orbit, known popularly as the “Star Wars” initiative.
Subsequent presidential administrations defunded this project, but the proliferation of nuclear weapons around the world and the absence of genuine multilateral arms-limitation agreements reflect the long-term impact of America’s abandonment of nuclear disarmament (outside of the Russia–US arena) as an important policy goal.
Yet nuclear weapons are but one aspect of America’s tormented relationship with the nuclear age. American policy-makers attempted to transform the public relationship with nuclear power from one of terror to one of affection. Public utilities saw the possibilities of a cheap energy source and began a campaign to convince Americans to welcome nuclear power plants. Initially, the public relations efforts had some success, and plants began to dot the American landscape in the 1950s and 1960s. Yet, just as grassroots “Ban the Bomb” movements had helped reshape disarmament goals in the 1960s, so too did resistance to nuclear energy on environmental and safety grounds begin to take hold in the 1970s. Protestors challenging the image of a safe, benign nuclear-energy capability received the proof they needed in 1979, when the nuclear power plant at Three Mile Island in central Pennsylvania experienced a serious accident and exposed millions to the risk of radiation poisoning. This was enough to convince many that their communities should remain “nuclear free,” as a growing political movement argued in the 1980s.
Movies like The China Syndrome (1979) and Silkwood (1983) reflected this growing opposition to nuclear energy.
In fact, the effects of the nuclear age have been a standard theme in American popular culture.
Not surprisingly, the 1950s proved to be the high-water mark for cinematic representations of nuclear disaster. Countless B-movies showed the impact of radiation on the natural world. Although exposure to radiation usually produced massive and deadly growth (giant ants in Them! and a giant woman in Attack of the 50 Foot Woman), sometimes it led to diminution (The Incredible Shrinking Man). These now pleasurably “campy” films provide ample evidence of an American culture trying to come to terms with a new and frightening technology.
Although the collapse of the Soviet Union after 1989 has freed some Americans from their longstanding anxiety over an imminent nuclear holocaust, others have pointed to the unchecked proliferation of nuclear weaponry among many smaller countries as an even greater source of concern. The longstanding American security policy of “Mutually Assured Destruction”—which argued that the United States and the Soviet Union could not risk a war because no one would be left at the end—holds no force for many nations that currently have nuclear-weapons capability. Their conflicts are more local, and the attraction of nuclear weaponry more tactical. In the eyes of many, the real “nuclear age” has only now begun.
Industry:Culture