
Saturday, 24 November 2012

The Legacy of Gandamak and the Crisis of Counterinsurgency

“Where there is a visible enemy to fight in open combat, the answer is not so difficult. Many serve, all applaud, and the tide of patriotism runs high. But when there is a long, slow struggle, with no immediately visible foe, your choice will seem hard indeed.” – President John F. Kennedy to the graduating class of the United States Military Academy at West Point in 1962.

"The Last Stand of the 44th Regiment at Gandamak" painted by William Barnes Wollen in 1898 depicts the last stand near Gandamak village on the 13 January 1842 by the survivors of the British retreat from Kabul.

In 1839, British forces invaded Afghanistan and captured Kabul. By 1842 the occupation had sparked uprisings by the Afghan population which soon routed British forces. The commanding officer of the British garrison in Kabul, Major General William Elphinstone, planned a retreat of the garrison, which consisted of British and British Indian soldiers along with their wives and children. The retreating column was continuously harassed by Afghan tribesmen, and its members were eventually massacred near the village of Gandamak, killed by the harsh wintry conditions and lack of supplies, or captured. Only one member of the garrison, an assistant medical officer by the name of William Brydon, both survived the ordeal and made it back to the British garrison at Jalalabad.

Ever since this massacre near the village of Gandamak in January 1842, Afghanistan has proved the setting for risky geopolitical jousts, dangerous strategic manoeuvring, and foolhardy military interventions by global great powers. Whilst the motives for the invasions of Afghanistan were different for each power, the course and outcomes of their conflicts have all arguably been analogous – indeed such is the legacy of Gandamak. The British invaded for strategic and imperial rationales – seeking to counter Russian influence in Central Asia and to protect imperial India. The Soviets invaded for strategic and political rationales – seeking to counter American influence on the southern flanks of the USSR and to support the Afghan communist regime of the Saur Revolution. The Americans invaded for security and political rationales – seeking to topple the Taliban regime that was supporting Al Qaeda, to kill or capture Osama bin Laden, and to bring capitalism and democracy to the country. Indeed these imperial and military misadventures by the great powers profoundly reflect an ignorance and misunderstanding of the diversity of Afghan political and cultural dynamics. Suffice to say, as Seth Jones has termed it, Afghanistan has been the "graveyard of empires" through changing international and regional strategic, military, demographic and socioeconomic circumstances – the Great Game of the 19th century, the Cold War of the 20th century, the current War on Terror (or Global Counterinsurgency, as per David Kilcullen), and the emerging New Great Game.

Afghan children beneath graffiti of a crossed-out pistol with writing that reads "freedom" in Kabul (20 June 2012 | Associated Press / Ahmad Nazar).

Now in 2012, 170 years since the withdrawal of British forces and their massacre in the First Anglo-Afghan War, and 23 years since the withdrawal of Soviet forces and their loss of the Soviet-Afghan War, the troops of the 2009 surge ordered by the United States have been demobilised. Indeed, the history of Gandamak invites comparisons between the Afghan ventures of the British forces and of ISAF. The United States Secretary of Defence, Leon Panetta, announced the finalisation of the drawdown whilst in New Zealand in September, with little pomp or circumstance in the American or international media. At the NATO Summit in Chicago in May this year, an exit strategy and transition were planned, entailing decreases in the NATO-led ISAF troop commitments with incremental handovers to Afghan security forces. Other member nations of ISAF, such as Australia, have also begun to scale down operations and troop commitments. There is seemingly a recognition by the publics and governments of ISAF member nations that a continued, prolonged and sustained military presence in Afghanistan has become too detrimental due to overriding political, financial and human costs.

Indeed, the international military presence in Afghanistan, a continuous conflict since 2001, has largely turned into a quagmire. Efforts to train Afghan forces have resulted in growing numbers of "green on blue" attacks. Levels of corruption and obstruction in the Afghan authorities have been rife. The increasing utilisation of drone strikes, surgical or not, as well as extrajudicial killings, as tools of statecraft to fight insurgents in the contested Federally Administered Tribal Areas and along the Afghan-Pakistani border more broadly, has killed civilians, categorically exacerbating existing conflicts and facilitating radicalisation. The continued deaths and wounding of troops by Improvised Explosive Devices have been unforgiving. The troop surges, whilst they may have appeared an effective strategic decision hoping to mimic the success of the Iraq troop surge, have operationally manifested as poor tactical deployments that proved ineffective at stemming the flow of violence and Taliban movement.

Recent phenomena, such as the Arab Spring, along with long-term economic trends, namely the Asian Century, have profound geopolitical and strategic implications for the conflict in Afghanistan. The pivot that Central Asia has historically offered is drastically changing due to economic, technological and social change in the wider Middle East and Asia-Pacific regions. The British made their mistake by pouring troops in from India in the nineteenth century, as did the Soviets across their own border in the twentieth. Now the Americans and ISAF are scaling down troops whilst attempting to train Afghan security forces and bequeath a new self-dependence. The Soviet campaigns of carpet bombing villages with attack helicopters and airstrikes are analogous to the current drone strikes by the United States. Indeed a tragedy presents itself – Mikhail Gorbachev couldn't win the war in Afghanistan and yet couldn't acknowledge this fact. Although Gorbachev, a moderate in the politburo, planned a withdrawal deadline for Soviet forces, military expenditure actually increased until Soviet forces crossed the last bridge out of Afghanistan. Similarly, Barack Obama faces an increasingly unwinnable conflict and yet steadily increased American forces until scaling back the troop surge just last month. Indeed, Afghanistan is a war without an end. Fundamentally, Afghanistan remains unstable and, as David Petraeus notes, any progress that has been actualised is fragile and reversible.

"Remnants of an Army" painted by Lady Elizabeth Thompson in 1879 depicts William Brydon, an assistant medical officer, arriving at the gates of Jalalabad as the only survivor of the British retreat from Kabul of 1842. 

Theorised, developed and put forth by maverick military commanders and strategists thinking outside the box at the Pentagon, counterinsurgency strategy was hailed as a panacea and a game changer for the future of warfare. The pioneers of counterinsurgency, figures such as David Petraeus, John Nagl, David Kilcullen and Herbert McMaster, have marked the rise of a new intellectual warrior class: combat soldiers with doctorates and higher academic qualifications. Counterinsurgency at its crux is an operational strategy centred on protecting population centres from insurgents, civil institutional capacity building, and information operations. Indeed the work on counterinsurgency strategy by the former Australian Army officer, government advisor and political anthropologist Dr David Kilcullen has been fundamentally important; his theory of the "accidental guerrilla" is a key to understanding insurgencies around the world throughout modern history. A complex range of diagrams and flow charts, explained in verbose management language with a focus on quantitative methods, is now part and parcel of counterinsurgency strategy as it manifests in the academic literature and military doctrine. The pioneers of counterinsurgency led the charge for the strategy to be adopted by the United States Military as official doctrine by presenting it as the grand unified theory of everything for unconventional warfare. Consequently the United States Military formulated the Counterinsurgency Field Manual, and the Army and Marine Corps have implemented training programs, established research centres, and formulated operational tactics guidelines for small frontline units. The United States Department of State formulated a counterinsurgency guide for the civil agencies of the United States Government. The NATO-led ISAF commanders have also established counterinsurgency manuals. Indeed, counterinsurgency strategy has become the orthodoxy of unconventional warfare for the United States Military.

Fundamentally though, counterinsurgency has largely failed at countering the widespread insurgency in Afghanistan, and yet it is becoming almost dogmatic, a "one size fits all" strategy for commanders, planners and policymakers. Critically, counterinsurgency is becoming the unquestioned orthodoxy and an institutionalised narrative of unconventional warfare for the United States Military. There is, however, an ongoing debate surrounding the effectiveness of counterinsurgency, inside the United States military and government and outside in academia and think-tanks. Even President John F. Kennedy warned the graduating class of the United States Military Academy at West Point in 1962 that counterinsurgency is problematic: "Where there is a visible enemy to fight in open combat, the answer is not so difficult. Many serve, all applaud, and the tide of patriotism runs high. But when there is a long, slow struggle, with no immediately visible foe, your choice will seem hard indeed." Indeed, unsatisfying wars are the stock in trade of counterinsurgency; rarely, if ever, will counterinsurgency end with a surrender ceremony or look akin to the victories of conventional warfare. And yet, unconventional operations, from counterinsurgency to foreign military assistance, have been almost exclusively the operations waged by the United States Military since the Vietnam War, with the notable exception of the First Gulf War. Compared to other operational doctrinal changes in the United States Military, counterinsurgency went largely unquestioned during its emergence and adoption. The doctrine of AirLand Battle, official doctrine for the United States Military from 1982 to 1998, was fundamentally questioned in over 110 articles written for military journals between 1976 and 1982, before and during its adoption. Counterinsurgency, by contrast, faced only marginal questioning up until its adoption in 2006, with significant critiques emerging only during its operational and doctrinal implementation. The paradigms of counterinsurgency that originated in the Boer War, were developed during the conflicts in Malaya, Algeria, Vietnam and Northern Ireland, and were then redeveloped for the insurgency in Iraq have largely failed in Afghanistan. Moreover, it is still questionable to assume that counterinsurgency and its associated operations were a success in Iraq. In Afghanistan, counterinsurgency has been inadequately implemented due to practical failings, ignorantly theorised through dogmatic assumptions, and fundamentally an ineffective foil against the blades of cross-cutting ethnic, religious and political conflicts.

There have been scathing criticisms of counterinsurgency since its official adoption as doctrine by the United States Military. There is seemingly a schism in the command structure over the validity of counterinsurgency, with a number of active United States Army officers openly critiquing their superior officers. United States Army Colonel Dr Gian Gentile and retired United States Army Colonel Dr Douglas Macgregor are emblematic of this increasingly present schism within the United States Military. Outside of the military there have been systematic articles criticising counterinsurgency. Adam Curtis, a documentarian with the BBC, has written a very insightful examination of counterinsurgency theory critiquing the American redevelopment of it for Iraq and Afghanistan. Sean Liedman, a United States Navy officer and Fellow of the Weatherhead Centre for International Affairs at Harvard University, has written a dissertation entitled "Don't Break The Bank With COIN: Resetting U.S. Defence Strategy After Iraq and Afghanistan" which offers an insightful critique of counterinsurgency. The website Reassessing Counterinsurgency also provides a comprehensive library of articles evaluating counterinsurgency operations and strategy. Critically, counterinsurgency strategy as it has been practised in Afghanistan is flawed – the troop surges have failed, the ethics of counterinsurgency operations are questionable, and counterinsurgency has become more or less a paradigm of armed nation building.

A bullet-riddled map of Afghanistan painted on a wall of an abandoned school in Zharay district of Kandahar province in southern Afghanistan (9 June 2012 | Reuters / Shamil Zhumatov).

Within the context of the recent demobilisation of the troops from the 2009 surge in Afghanistan ordered by Obama, it is important to examine the impact of the surges associated with counterinsurgency operations and strategy. The primary rationale for the Afghanistan troop surge of 2009 stemmed from the seemingly successful troop surge in Iraq in 2007, where violence significantly declined. Military commanders and counterinsurgency strategists claimed that it was the troop surge that led to the decline in violence; however, on closer examination, a range of other factors were at play. Certainly there was a decline in violence in Iraq in 2007 coinciding with the troop surge, but correlation does not equate to causation. The Sunni Awakening, the ceasefire by the Mahdi Army, the progressive sectarian segregation of Sunnis and Shiites, and the positive inroads of the United States Military are all critical factors in explaining the decline in violence. Moreover, compared to Afghanistan, Iraq was more conducive to counterinsurgency operations due to its largely static population centres, working urban infrastructure, relative ethnic homogeneity and better economic conditions. The population of Afghanistan, conversely, is far more rural and sparsely located, with more ethnic heterogeneity than Iraq.

Thus the starting rationale for the troop surge in Afghanistan wasn't as clear cut or significant as supposed. The failings of the Afghanistan troop surge stem from this misunderstanding of the flaws of the Iraq troop surge, but also from a misunderstanding of the dynamics of the Afghan insurgents, or "accidental guerrillas" as per Kilcullen. Increasing troop commitments exacerbated unrest, and consequently more accidental guerrillas were killed by the superior firepower of the United States Military's forward operating bases and patrols. Efforts to protect population centres with increased security failed simply because an increased troop presence didn't deter the Taliban from making threats against, and carrying out punishments of, those that explicitly or implicitly cooperated with ISAF. Moreover, cross-cutting ethnic and political conflicts were further exacerbated by the increased presence of ISAF troops and their efforts in civil cooperation and capacity building. Afghan civilians didn't attend marketplaces patrolled by ISAF troops and migrated away from their bases. Thus the crux of counterinsurgency, population-centric operations, failed: increased troops either simply deterred civilians before engagement or killed the farmers that infrequently fired at ISAF troops (the "accidental guerrillas"), thus fostering further radicalisation and alienation. The civil programs of the United States Department of State and the United States Agency for International Development were simply not effective, coordinated or specialised for Afghanistan. There was no sustainable socioeconomic development or long-term institutional capacity building by such civil programs, which were critically important for supporting the military operations in the overall counterinsurgency strategy in Afghanistan.

For the surge and its accompanying counterinsurgency strategy to prevail in Afghanistan, four main things needed to occur: the Afghan government had to be a willing partner, the Pakistani government had to crack down on insurgent sanctuaries on its soil, the Afghan army had to be ready and willing to assume control of areas that had been cleared of insurgents by American troops, and the Americans had to be willing to commit troops and money for years on end. Fundamentally, none of these conditions was attained.

A graph from an ISAF report of September 2012 measuring attacks launched by the Taliban and associated insurgents against NATO forces, month by month from January 2008 to August 2012 (obtained via "Military's Own Report Card Gives Afghan Surge an F" from Wired.com).

Coinciding with the rise of counterinsurgency, there has been a militarisation of the social sciences by the United States Military. The relationship between the social sciences, particularly anthropology, and the military has had a contentious history since the Second World War, with Iraq and Afghanistan proving flashpoints. One of the pioneers of counterinsurgency, David Kilcullen, completed his Doctor of Philosophy in Political Anthropology at the University of New South Wales in 2000 with a thesis, utilising ethnographic methods, entitled “The Political Consequences of Military Operations in Indonesia 1945-99: A Fieldwork Analysis of the Political Power-Diffusion Effects of Guerrilla Conflict”.

Indeed, there has been an acknowledgement of the importance of cultural intelligence in counterinsurgency, and the United States Military has established the horrifically named Human Terrain System Project, staffed by social scientists and deployed with combat forces in Iraq and Afghanistan. Attempts to rectify cross-cultural problems and teach intercultural understanding to the military commanders of ISAF have been made by anthropologists and other social scientists. There has also been extensive writing on the critical importance of understanding local and regional cultural dynamics with ethnographic depth for counterinsurgency strategy and peacekeeping operations. Whilst individuals such as Kilcullen have indeed made constructive inroads in achieving a cross-culturally competent American military establishment, a range of external and operational factors have contributed to the renewed failings of counterinsurgency in Afghanistan.

Yet this militarisation of the social sciences to enhance counterinsurgency efforts in Iraq and Afghanistan has been met with scepticism and controversy. The American Anthropological Association and various social scientists have raised ethical objections. Legal questions pertaining to the combatant status of non-military members of the Human Terrain System Project and their ability to engage in aspects of war fighting are also important. Organisations such as the Network of Concerned Anthropologists and Anthropologists for Justice and Peace have been active and vocal in criticising counterinsurgency and the Human Terrain System Project. The Network of Concerned Anthropologists has even published a book entitled The Counter-Counterinsurgency Manual that systematically critiques counterinsurgency strategy and the employment of social scientists by the military. David Price, a Professor of Anthropology at St. Martin's College, has also written the comprehensive book Weaponising Anthropology: Social Science in Service of the Militarized State, which likewise rebukes the militarisation of the social sciences.

United States Army soldiers of the 173rd Airborne Brigade Combat Team, carry a wounded colleague after he was injured in an IED blast during a patrol in Logar province (13 October 2012 | AFP / Munir Uz Zaman).

The paradigm of nation building is arguably a valid concept in international development; however, within counterinsurgency operations it is pursued primarily by the military. This is all but inevitable given the massive disparity in funding and resources between the United States Military and the foreign civil development programs of the United States Government. Yet such a situation is not ideal. Thus it is accurate to classify counterinsurgency as armed nation building conducted by an organisation not versed in civic development.

This coincides with the existential crisis the United States Army is going through over what role it is to take in a post-Cold War and post-9/11 world. Pentagon officials and strategic theorists believe the future will be dominated by sea-air warfare, cyberwarfare and unconventional warfare; on this view, the historical role of the Army in infantry operations and conventional warfare is now defunct. Therefore, the top United States Army officers believe that they must adapt to the changing strategic and security environment or risk decreased funding and irrelevancy. Yet whilst adapting to unconventional warfare may seem a critical and imperative decision for the United States Army and the military at large, it is merely a superficial approach that ignores the underlying factors for such conflict in the first instance. Moreover, the adaptation to counterinsurgency and asymmetrical warfare by the United States Military arguably incentivises and fosters an interventionist posture. A change of posture of the United States Military from conventional forces towards a holistic counterinsurgency doctrine will likely incentivise continued entrances into unconventional conflicts that would traditionally not have been a consideration. Gian Gentile, a United States Army Colonel and History Professor at West Point, posits that overconfidence in the validity of counterinsurgency not only incentivises future interventions into conflicts but also prevents the development of capabilities to counter conventional threats.

A United States Army soldier of the 1st Airborne Brigade Combat Team of the 82nd Airborne Division fires a machine gun at insurgent forces in Ghazni province (15 June 2012 | U.S. Army / Mike MacLeod).

Rather than have the United States Military adapt to asymmetrical warfare, efforts must be taken to increase the funding and resources of civil and humanitarian programs such as USAID and the various organisations of the Department of State. Only then can the underlying conditions that foster radicalisation, extremism and the accidental guerrilla syndrome be dealt with. Insurgencies will continue to exist despite the exertions of counterinsurgency operations by the United States Army and Marine Corps. Thus, rather than increasing military presence, non-military humanitarian and development capabilities need to be favoured. Rather than drone strikes and counterinsurgency operations, the United States and ISAF as a whole should be focusing on civil, stabilisation and capacity-building programs: education, social, economic and infrastructural development projects, stabilisation policing and security assistance. Whilst military forces will be required to provide security and protection to such civil programs, the military option should not be the first option of choice, or even the tenth. All other avenues are akin to attempting to crack a nut with a jackhammer.

--
Tasman Bain is a second year Bachelor of Arts (Anthropology) and Bachelor of Social Science (International Development) student at the University of Queensland and is currently undertaking a Summer Research Scholarship at the Sustainable Minerals Institute at the University of Queensland. He is interested in economic, evolutionary and medical anthropology and enjoys endurance running, reading Douglas Adams, and playing piano.

Friday, 24 August 2012

The Contentious History of Evolutionary Theory in the Anthropological Academy: From Boasian Historical Particularism to Wilson's Sociobiology



Since the publication of On the Origin of Species, Darwin’s theory of evolution by natural selection has been one of the most profound theories in the biological sciences, expounding and analysing the physical, genetic and behavioural diversity of animals. Darwinian evolution has also had a remarkable impact outside of the biological sciences, namely in the social sciences throughout their history. During the nineteenth century, “versions of Darwinian evolution took centre stage in political and social philosophy and in the human sciences.” A number of anthropologists came to understand cultural variation in terms of a linear progression towards a cultural apex, then considered to be Western civilisation. This interpretation was rebuked by the anthropological school of cultural relativism, and it became essentially taboo within the academy to utilise evolutionary theory in the social sciences. Moreover, during the early to mid-twentieth century a number of sociopolitical movements, such as the Nazi party and the eugenics movement, appropriated Darwinism to justify the genocide of demographics deemed “unfavourable” and “subhuman”. Such justifications were strongly condemned by evolutionary scientists as pseudoscientific and immoral, but the utilisation of evolutionary theory in the social sciences remained seriously contentious. Then in 1975 the American entomologist Wilson developed the field of sociobiology as the “systematic study of the biological basis of all social behaviour in the context of evolution”, and in 1976 the British zoologist Dawkins developed the gene-centred view of evolution based on “selfish genes” determining natural selection and the consequent behaviour of an organism. Whilst Wilson’s and Dawkins’s primary aims were the study of non-human animals, their theories flowed over into the realms of the social sciences. There was a profound backlash in the social sciences academy claiming that sociobiology and the gene-centred view were ethnocentric, reductionist, determinist and flawed in explaining human nature. By the 1990s, the debate over human sociobiology culminated in the development of evolutionary psychology, which attempted to respond to the criticisms against evolutionary theory in the social sciences. Led by Barkow, Cosmides and Tooby, evolutionary psychology aimed to understand the “neurobiology of the human brain as a series of evolutionary adaptations and that human behaviour and culture thus stem from the genetics and evolution of the brain.” Yet contentions from the social sciences, particularly cultural anthropology, against sociobiology and evolutionary psychology still persist. This essay will examine the history of the contributions and criticisms of evolutionary theory in the social sciences, from its racist and pseudoscientific past to its current contributions. It will then examine the theory of sociobiology and evolutionary psychology as applied to anthropology in explaining human nature, and the contentions and controversy they have provoked in the anthropological academy. Overall this essay will not delve into the technical details of evolutionary theory, but rather examine its claims and the responses to them in the anthropological academy.

Darwin’s theory of evolution by natural selection has presented an interpretation of human nature at odds with prevailing theoretical paradigms throughout its history. The theorisation of human nature through conceptions of evolution and instinct has been undertaken by such figures as Darwin himself, Hume, Smith and Huxley, who proposed that the mosaic of human nature stems from innate human instincts. Whilst this paradigm of evolutionary social science culminated as essentially “passive and benign contemplations”, the theory of evolution by natural selection also manifested as bigotry, racism and the apparent justification of white Anglo male supremacy based on a linear interpretation of history. The school of social evolutionism in the anthropological academy led by Tylor, Morgan and Spencer became the dominant paradigm in the late nineteenth century and was appropriated as Social Darwinism in popular discourses. This paradigm utilised the framework of evolution to describe the differences between developed Western civilisations and non-developed “savage” cultures as stemming from the biological inferiority of the “savages”, who were considered more closely related to chimpanzees than to the superior Anglo-Saxons. Social movements also took up this school of thought and supported social policies of eugenics and the forced sterilisation of certain demographics, such as those of low socioeconomic status, those with disabilities or mental illness, or those from non-white ethnicities. Key responses to this school came from Boas and Kroeber in the anthropological schools of historical particularism and cultural relativism. These schools posited that the history of humanity is not a linear progression to the technological civilisation of the West but rather that each culture must be understood by “its own conditions and own particular cultural history.” Yet there have still been individuals and groups that supported Social Darwinism, culminating in cases such as the forced assimilation of Australian Aboriginals in the early twentieth century and the genocide committed by the Nazi regime from 1933 to 1945. After the Second World War, as Degler posits, “the utilisation of the theory of the biological sciences in the social sciences and the ‘biologicisation’ of human nature became a taboo” due to the profound consequences of its appropriations.

During the 1970s there was a revival of evolutionary theory in the social sciences, spurred by advances in molecular biology, genetics, computer science and mathematical game theory. This revival was primarily aimed at explaining non-human animal behaviour and was largely led by Wilson and Dawkins along with other evolutionary theorists. Although this was the primary aim of the science, both Wilson and Dawkins still theorised on the evolution of human behaviour using the same paradigm of evolutionary theory as the study of non-human behaviour. There was a profound backlash in the social sciences, primarily by Lewontin, Rose and Kamin, and by Sahlins, against the scientific reductionism and biological determinism of sociobiology in explaining culture, but also controversy surrounding its ideological and ethical implications. The theorists of sociobiology responded to such criticism and controversy with the development of the field of evolutionary psychology. Led by Barkow, Cosmides and Tooby, evolutionary psychology attempted to redress the claims of reductionism and determinism by moving past the dichotomy of nature and nurture through holistically studying the neurobiological, cognitive and psychological factors of human nature. Yet the paradigm of evolutionary psychology has also been met with profound criticism from the anthropological academy. Despite such criticism over legitimacy and usefulness, the fields of sociobiology and evolutionary psychology, and the general application of evolutionary theory in the social sciences, have been increasingly taken up in the biological and social sciences academies.



The majority of ethical, practical and theoretical contentions in the anthropological academy surrounding the application of evolutionary theory to explain human nature stem from the reductionism and genetic determinism of evolutionary theory. The question of how complex social structures and cultural norms can come about within the confines of the supposedly savage, impersonal and selfish world of Darwinian natural selection is indeed important. Sociobiology and evolutionary psychology as they manifest in the academic and popular literature have been rebuked by anthropologists and other cultural theorists as being “ethnocentric, reductionist, determinist, and philosophically reprehensible.” Critics allege that evolutionary theory “is merely academic fancy foot work away from the archaic and pseudoscientific” school of social evolutionism of the nineteenth century, that it explicitly and implicitly makes “false, flawed and unsubstantiated assumptions about social, political, economic, and cultural processes”, and that even the presumption of a “human nature itself is flawed.” Indeed, as Sahlins has stated, evolutionary theory in the social sciences is “at its worst pseudoscientific and racist and at its best it is quasi-scientific based on flawed principles and methodology with a profound misunderstanding of the dynamics of culture.” The responses to these claims by evolutionary theorists have centred on pointing out the “fundamental biological basis of humans, being an animal species just like any other” but also the “naturalistic fallacy between making descriptive and normative judgements."

The theory of sociobiology and evolutionary psychology is predicated, by definition, on reducing human behaviour to an evolutionary and biological basis. According to Lewontin and Sahlins, this does away with the cultural forces of acculturation and diffusion and other social, economic and political dynamics. Explaining human behaviour by “reducing it down to the genes of the body and modules in the brain” fundamentally “neglects to recognise the power of culture in shaping and reshaping the human mind.” Thus the theorisation of the gene and/or the brain as the paramount determinant of human behaviour is flawed, as it “restricts the interpretation of behaviour and its cultural context.” Rather, Lewontin propounds a dialectical and interactionist interpretation of human behaviour in response to the reductionism, as “it is not just that wholes are more than the sum of their parts, it is that parts become qualitatively new by being parts of the whole.” Lewontin proposes that dialectical explanations are more effective and holistic in explaining human behaviour in contrast to the “reductionist calculus of the evolutionary neurobiological and gene-centred view of culture.” In response, evolutionary theorists propose that reductionism is an “important scientific principle.” Moreover, such theorists as Barkow and Wilson propose that evolutionary psychology also seeks a holistic interpretation of human nature via genetic, cognitive, neurobiological and psychological processes “based on the fact that humans have evolved to environments with culture – that culture is not independent of evolution, but rather biology is the precursor.” Indeed “culture is sometimes advanced as competing with explanations that invoke evolutionary psychology, most frequently when cross-cultural variability is observed” and these “cultural explanations invoke the notion that differences between groups are prima facie evidence that culture is an autonomous causal agent.” Evolutionary theorists respond to these criticisms by stating that “cultural explanations are more or less cultural reductionism” and “ignorant of the role of biology and innate characteristics” that have evolved in the human species.

Alongside the claims of reductionism against sociobiology and evolutionary psychology are claims of biological determinism. Critics label sociobiology and evolutionary psychology as biologically determinist, arguing that evolutionary theory is ignorant of the forces of nurture and the capacity of culture and social environments to shape and reshape human nature, but also that evolutionary theory facilitates and entrenches racism, sexism and prejudice. Indeed, a major criticism against sociobiology and evolutionary psychology stems from the ethical, political and social implications of their theoretical underpinnings and findings. The critics of evolutionary psychology propose that evolutionary theory promotes, or at least enables, racism and sexism and serves to re-entrench outdated perceptions of sex and race. This criticism arose from key findings in evolutionary psychology that the male and female brains evolved differently and thus possess different cognitive and behavioural hardwiring, and that certain ethnicities are more likely to behave in certain ways or are more susceptible to certain diseases. Indeed, some evolutionary theorists, such as Jensen, have even claimed that intelligence is heritable, that certain races are more intelligent than others, and that racial economic equality is unattainable. Thus it is proposed that just as the historically dominant class ideologies that supported the oppression of women and ethnic minorities had strong pseudoscientific justifications, in the form of assertions that women and ethnic minorities were genetically inferior, sociobiology and evolutionary psychology make it possible again to hold such beliefs.

All this criticism has been strongly responded to by evolutionary theorists, primarily by pointing to the naturalistic fallacy and asserting that “sociobiology and evolutionary psychology are scientific disciplines with no social agenda.” It is also put forward that the frameworks of sociobiology and evolutionary psychology dissolve the dichotomies of nature versus nurture, innate versus learned, and biological versus cultural. This is not biological determinism but rather an understanding that genes and other biological factors predispose certain behavioural traits and therefore culture. Moreover, it is proposed that the perception of evolutionary theory in the social sciences “as being antithetical to social or political change is evidently historically falsified.” Evolutionary theorists respond that evolutionary psychology does not privilege or prejudice individuals or groups but rather seeks only to describe, and that the claims by Jensen and Herrnstein that racial inequality is inevitable have been discredited within evolutionary theory by fellow theorists such as De Waal and Pinker. Indeed, it has been asserted that critics have been putting forth critiques based on personal political and ethical values rather than any empirical or explanatory factors, and thus the attacks against evolutionary theory have been made on “non-scientific grounds”. Sociobiologists and evolutionary psychologists “should and do acknowledge the role of ideology and politics in the formation and support of scientific paradigms” but do not let it influence their own paradigm. Moreover, it is noted that “genetically determined mechanisms do not imply genetically determined behaviour” and thus the theory of sociobiology and evolutionary psychology is not predicated on genetic determinism. Fundamentally, critics do not recognise the naturalistic fallacy in their critiques of the ethical implications of evolutionary theory. Indeed “an explanation is not a justification”, and neither sociobiology nor evolutionary psychology attempts to justify the existence of social hierarchies, racism or sexism – “when they are and have been used to justify such then evidently that is not scientific.” It is posited that any “politically incorrect assertions of evolutionary psychology are based on considerable empirical evidence” and critics are welcome to challenge the evidence or provide testable alternative explanations. Overall, it is a profound misunderstanding of sociobiology and evolutionary psychology to claim they are biologically determinist when they take in the genetic, neurobiological and cultural evolution of human behaviour. When a theoretical paradigm fails to achieve such a spread of genetic, neurobiological and cultural factors, evolutionary theorists agree with critics that such a paradigm is indeed flawed.


The resurgence of evolutionary theory in the social sciences has indeed been contentious and controversial, with much criticism levelled against it, but it has also managed to make constructive contributions to the anthropological academy. With its archaic and pseudoscientific beginnings in the schools of social evolutionism and Social Darwinism of Tylor, Morgan and Spencer arguably behind it, Wilson and Dawkins, and then Barkow, Cosmides, Tooby and others, transformed the application of evolutionary theory in the social sciences. Although the theory of sociobiology and evolutionary psychology was met with critical claims of ethnocentrism, determinism and reductionism by Sahlins, Lewontin, Rose, Kamin and others, it responded with arguments stemming from the naturalistic fallacy and the contention that it is a misunderstanding of the theory to label it determinist. Indeed, the majority of theorists, both evolutionary and non-evolutionary, acknowledge that it is flawed and invalid to make purely reductionist and biologically determinist explanations of human nature, specifically culture. Thus evolutionary theory attempts to employ a holistic interpretation based on neurobiological, genetic and cultural factors, firmly grounded in the understanding that humans have evolved with culture. Overall, whilst evolutionary theory in the social sciences, particularly cultural anthropology, has been and still largely is contentious, it is becoming the popular and prevailing paradigm once again. Sociobiology and evolutionary psychology must not revert to their natal beginnings in the human sciences of justifying racism, sexism and other forms of violence and prejudice, as in the times of Social Darwinism. Fundamentally, evolutionary theory must progress cautiously in explaining politically, socially and morally sensitive issues. Making politically incorrect findings through evolutionary theory is essentially inevitable and should not be refrained from, but its theorists must recognise the consequences as they manifest in the social environment in which the theory exists.

--
Tasman Bain is a second year Bachelor of Arts (Anthropology) and Bachelor of Social Science (International Development) student at the University of Queensland. He is interested in evolutionary anthropology, social epidemiology and the philosophy of science and enjoys endurance running, reading Douglas Adams, and playing the glockenspiel.

Thursday, 23 August 2012

'In the future, everyone will be world-famous for 15 minutes': Why Popularity has become More Concentrated not Less


Reflecting on Warhol

Andy Warhol's prediction that everyone will be world-famous for fifteen minutes was intended to undermine the idea that anyone 'deserved' to be famous and to highlight that with modern media a broader collection of people could be known by everyone (for only a little time, admittedly). This is related to the broader idea that globalisation and the end of the old media etc. would lead to more voices being heard and a decrease in the dominance of cultural conversations by a few individuals. I have highlighted before (in my post http://reciprocans-reciprocans.blogspot.com.au/2012/04/is-trade-in-ideas-free-interrogating.html) that the marketplace for ideas is fundamentally unfree, and in this post I wish to examine whether the broader idea that we have a more pluralistic (or less 'concentrated', perhaps) culture is at all valid.

This raises two questions: how do people get famous? And is fame more or less concentrated than before?


By this I don't mean the marketing of celebrity (which is detailed, for those interested, in a reasonably old Economist post: http://www.economist.com/node/4344144), nor do I want to get into a debate about the merits of Madonna or Lady Gaga etc. (whom I have previously defended at http://reciprocans-reciprocans.blogspot.com.au/2012/05/baby-im-your-biggest-fan-ill-follow-you.html); I mean the process by which the works, writings and knowledge of the lives of famous individuals spread through society.

I want to make two claims: really famous works (or the fame of celebrities) tend to spread at a slow rate until they reach a 'critical mass' at which point they spread exponentially (seemingly without effort) and that this and the processes of globalised capitalism make a 'winner take all' culture more pervasive (with qualifications) than before. 

Warhol may have been right that many people are famous for short periods, but it is still true that there are particular subjects whose fame does not fade so easily and who still dominate our culture.

How Things Get Popular

Gabriel Rossman in his excellent book Climbing the Charts discusses how most ideas (or works etc.) spread either 'within' a social network (think of your friends recommending a song or a new cardigan) or 'from without' (e.g. promoting the Batman film with a huge advertising campaign). The latter type produces the more predictable pattern: the work does huge business initially but then fades away quickly, as with Twilight's box office sales.


Charts courtesy of Sociological Images

Now, most works function like this- they have some initial sales (which obviously vary in scale) and then peter out. But some films or songs etc. work in the first way- they become 'viral', which means that their sales are 'S-shaped': they are unpopular initially and then, suddenly, when they reach a critical mass of popularity, spike! As an example: the box office results for My Big Fat Greek Wedding:


This even applies to baby names: as you can see in the chart below, Isabella spiked as a social phenomenon from without, whereas Madison spiked initially due to the movie Splash (released in 1984) and then became a runaway success until fading in the late 1990s.


Now, what does this all mean? Well, if you do get famous- with the exception of those whose 'fame' is a very brief glimpse on the nightly news- you tend to stay famous for a longer period than Warhol's quip might suggest- see the Kardashians, for instance. Once you get people initially interested in a product, work etc. it can spread 'virally' throughout social networks till the popularity of that product or work is self-sustaining, at least for a time.
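To make Rossman's two patterns concrete, here is a minimal sketch of the classic Bass diffusion model- the standard economists' toy for exactly this distinction- written in Python with parameter values I have invented for illustration. The p term stands in for 'from without' promotion and the q term for 'within network' word of mouth; it illustrates the two spreading shapes rather than reproducing any analysis from Climbing the Charts.

```python
def bass_adoption(p, q, m, periods):
    """Cumulative adoption under the Bass diffusion model.

    p: coefficient of innovation ('from without' promotion)
    q: coefficient of imitation ('within network' word of mouth)
    m: total market potential
    """
    adopters = [0.0]
    for _ in range(periods):
        n = adopters[-1]
        # New adopters arrive via external promotion (p) plus word of mouth,
        # which grows with the share who have already adopted (q * n/m).
        adopters.append(n + (p + q * n / m) * (m - n))
    return adopters

# Heavy promotion, weak word of mouth: big opening, then a quick fade.
blockbuster = bass_adoption(p=0.20, q=0.05, m=1_000_000, periods=30)
# Weak promotion, strong word of mouth: slow start, then an S-shaped spike.
sleeper = bass_adoption(p=0.002, q=0.50, m=1_000_000, periods=30)

for week in (1, 5, 10, 20, 30):
    print(f"week {week:>2}: blockbuster {blockbuster[week]:>9.0f}, "
          f"sleeper {sleeper[week]:>9.0f}")
```

Run it and the 'blockbuster' adds its biggest numbers in week one and fades, while the 'sleeper' crawls along before spiking into an S-curve- the My Big Fat Greek Wedding pattern.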

But, since I'm an economist at heart, what are the consequences then for the monetary side?

The Rise of Winner Takes All Markets

As Adorno puts it in The Culture Industry, previously you might've had a tenor for each major town and a group of tenors in a major city- but now, thanks to technology, everyone can listen to Pavarotti- who is almost certainly not so much better than other tenors that he deserves most of the attention and profit, but might be a bit better and easier to market. With the invention of the internet in particular, it is now very easy for the works of certain people to spread through the whole population without limitations such as actually having to get to a concert hall or wait for a new print run of Harry Potter.

Now, this is obviously not entirely the case- 'within' trends do exist, as I noted, evidenced by the explosion of new acts and writers who have risen from the internet (for example Justin Bieber). But the idea that new technology was simply going to lead to pluralism or the demise of persistent celebrity is false- indeed, popular Western acts like Madonna have displaced some locally famous acts across the developed and developing world. As there is an ability to reach more people, the 'market' for fame and status can devolve into a 'winner takes all' market. In economics, you can have a market with increasing returns to scale, where scaling up actually increases the returns you make on an investment (as opposed to the perhaps more intuitive case where scaling up decreases efficiency). If this is the case, then superstars can extract a great deal of profit once they reach a certain critical mass of popularity.
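A toy simulation shows how quickly such a market can concentrate. The sketch below (Python again, with made-up numbers- it models no real market) lets each consumer either follow existing popularity or pick an artist at random; even a modest bias towards the already popular hands most of the plays to a few winners.

```python
import random

def simulate_market(artists=100, choices=100_000, herding=0.9, seed=42):
    """'Rich get richer' market: each consumer either follows the crowd
    (with probability `herding`) or samples an artist uniformly at random."""
    random.seed(seed)
    plays = [1] * artists  # every artist starts with a single play
    for _ in range(choices):
        if random.random() < herding:
            # Preferential attachment: choice weighted by current popularity.
            pick = random.choices(range(artists), weights=plays)[0]
        else:
            pick = random.randrange(artists)
        plays[pick] += 1
    return sorted(plays, reverse=True)

plays = simulate_market()
top_ten_share = sum(plays[:10]) / sum(plays)
print(f"The top 10 of 100 artists capture {top_ten_share:.0%} of all plays")
```

Set herding to zero and the shares come out roughly equal; raise it and the distribution tips towards 'winner takes all'- the superstar dynamic of Adorno's tenors.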

What are the limitations on this? Well, most new adoptions only reach a certain level of popularity before they peter out- with the exception of television, there is almost no technology owned by close to the whole population of the United States, for instance. Also, as we have seen time and time again (particularly with the youth), people will either intentionally or otherwise break the mould of the system- perhaps creating alternative or subculture communities as a consequence.

Conclusion

The tide of any cultural change is hard to predict- who would have thought that 'Call Me Maybe' would become so popular, or that a book about a boy wizard who goes off to wizarding school would enthral a huge reading public? But the general outline of the complex and varied system that is 'culture' can at least be traced. Fame might be quicker to obtain now than before, but it still often lasts, and can provide particularly high profits. In the end, some people are still famous for a lot longer than fifteen minutes, and most people are only noticed for fifteen seconds, at best.

--
Dan Gibbons is a third year Bachelor of Commerce (Economics) student at the University of Melbourne. He has a forthcoming publication in Intergraph: A Journal of Dialogic Anthropology (about memory and nationalism) and is currently submitting papers on the rise of modern consumerism, the role of criminology theory in literary criticism and the institutional theory of nationalism. Dan is a keen debater and public speaker.

Friday, 20 July 2012

Almost Human: On Great Apes, Selfhood and Rights


The Great Apes are always seen as humanlike- which is probably why films like The Planet of the Apes resonate so much- after all, could we really relate to a 'Planet of the Elephants', even if we know elephants are intelligent? And when we see apes in pain or being mistreated, this tends to tear at our heartstrings more than with most animals, save perhaps dogs or cats in Western cultures. While we aren't directly descended from chimpanzees (contra Darwin's initial musings on human evolution), we are very closely related- so this does indeed make sense. But is there a scientific basis to this feeling that we aren't too different from apes?

In particular, after a recent Australs debate (to the effect that this house would grant the Great Apes more rights than other animals), I was prompted to think about some of the scientific underpinnings of that debate- do apes have selfhood? Should we grant rights on this basis? Do apes have unique cognitive capabilities? This very complex series of questions is far too much for a blog post of this length to deal with entirely- so for those particularly interested I recommend The Age of Empathy by Frans de Waal, or indeed any of de Waal's masterful works. I will briefly outline two claims: that apes have many humanlike capacities and do have selfhood (or something very closely equivalent), and that attendant to this we should grant animal rights on a spectrum (because such rights should exist for purposes that aren't just human benefit).


"Beware the beast Man, for he is the Devil's pawn": What separates apes from man?
A good summary answer would be: effectively, a lot of apes' capacities are simply gradations of the capacities of fully mentally able humans.

Let's start with recognising others: that capability is a lot more basic- it exists in many more species than can recognise themselves- for instance, social insects are aware of what the other members of the colony are feeling but hardly care much for their own being. The most basic capacity of any social creature is emotional contagion- that is, the ability to perceive others' emotions and feel them yourself. This is what newborn babies in hospitals can do- cry when others cry, even if they don't know why they do it. The next stage is consolation- this is what the higher primate species (as well as dolphins, some lower primates and a few other species) can do- have direct concern for others. An example is that male chimpanzees are often comforted by direct relatives and friends after losing a fight. The final kind is targeted helping- hearing a scream, say, and knowing to rush to the source and deal with the danger itself. This exists somewhat in non-hominid higher primates, but humans have a more finely attuned capacity for it (though this has negative effects too- it enhances our capacity to torture as well). Humans do indeed also have a more evolved capacity for imitation- giving rise to stronger memes or 'units of cultural transmission' (analogous to, though not the same as, genes).

More controversial, though, is the question of whether we can find selfhood in non-human animals. This is a very philosophical question (with increasing argumentation from psychologists like Susan Blackmore that the idea of a truly independent self doesn't exist at all- see her book The Meme Machine), so I will largely confine my treatment to self-recognition. One standard test of self-recognition is to put a dab of paint (visible but impossible to feel) on an animal's forehead (or equivalent) and see if they try to rub it off when they see themselves in a mirror. In a very young human child (say, under two) this fails- they can't yet recognise themselves, and so they don't try to rub the dab off. But an older child or a chimpanzee will indeed try to rub the mark off- showing that they recognise that it is indeed themselves in the mirror (and circumventing the problem of having to ask children or chimps).

The fact that these capacities exist in non-humans isn't troubling at all- if they didn't, the traits would be evolutionarily new and thus not particularly 'deep' in our neural architecture. As de Waal notes in The Age of Empathy, if we were the only species to recognise ourselves and feel empathy, these would be particularly weak traits of ours- and that is certainly not the case.

So, humans are only separated from apes by gradations of these capacities- not the cosmic leaps that were once supposed in the philosophy of mind (and what a relief- such philosophies are so supremely arrogant about humans that they were often allied with attempts to put our little rock of a planet at the centre of the universe).


How Should We Grant Rights Then?
I would find granting rights merely on the basis of capabilities deeply problematic- I am not a professional philosopher, but it would seem to me that there is little distinction between the capabilities of the mentally impaired or young children and those of particular animal species, yet I would prefer the state to give more rights to the humans (and certainly never to withdraw rights, wherever they can be given, on the basis of incapacity alone). But a capability consideration in how we view rights seems to make intuitive sense- fish, after all, feel pain in a less brutal way than a chimpanzee does, and I would feel much less guilty about the pain of a fish.

I would therefore propose that animal rights exist on a spectrum (which is already partly recognised in law, though I think the law should be changed to reflect human purposes less). Obviously there are other reasons to grant animal rights- torturing animals reflects badly on humans, and society wants to minimise the amount of pain in the world. But to the extent that animal rights are 'inherent' (which I would argue they partly are), they need to be framed in the context of the capacities of the animal in question. And possibly not even in terms of how 'human' those capacities are, but merely in the contexts of empathy and selfhood (for example, if animals had other ways of expressing either of those ideas, it would still make sense to grant them rights).

In particular, such rights might include being treated differently in experimental trials or having particular guarantees on the kinds of environments in which Great Apes are kept.

Conclusion
Any ethical conversation, particularly about animals, is always very divisive. But this post has attempted to explain a few of the surface scientific and philosophical issues about the Great Apes and their rights. In particular, it has claimed that they have a kind of selfhood, and so should be afforded more rights on a spectrum. After all, if Great Apes are so like us- it should be unbearable to see them suffer.
--

Dan Gibbons is a third year Bachelor of Commerce (Economics) student at the University of Melbourne. He has a forthcoming publication in Intergraph: A Journal of Dialogic Anthropology (about memory and nationalism) and is currently submitting papers on the rise of modern consumerism, the role of criminology theory in literary criticism and the institutional theory of nationalism. Dan is a keen debater and public speaker.

Saturday, 19 May 2012

'You've Got a Friend in Me, When the Road Looks Rough Ahead': On Patterns of Friendship

Introduction: 'I get by with a little help from my friends'
Humans are deeply unusual creatures- we are nearly alone among species in forming 'long-standing, non-reproductive unions'- that is, we have friends! From C.S. Lewis and J.R.R. Tolkien, to Boswell and Samuel Johnson, to Gertrude Stein and Ernest Hemingway, to even fictional friendships like that of Achilles and Patroclus- friendships are some of the most important relationships we have. Indeed, a decline in friendships in the United States (a study in the American Sociological Review found that the proportion of people with at least one close confidant dropped from 80% in 1985 to 57% in 2004) has been linked to an increase in psychological disorders. But why do we have friends at all? And perhaps more interestingly: who are we likely to be friends with?

I will trace evidence that cooperation is important in human societies and that this likely explains the psychological rewards of friendship. I will also explore new evidence that, even in tribal societies, we tend to befriend people who cooperate to a similar degree as we do, share similar genes with us (even among non-relations), and are physically and socially similar to us.

'Lean on Me, When You're Not Strong': The Evolution of Human Cooperation
There is strong evidence from chimpanzees on the antecedents of friendship- for chimps, non-reproductive connections provide a form of direct reciprocity: support in a fight, the loan of valuable tools, food in times of scarcity (as documented in particular by Pruetz and Lindshield). While these aren't exactly friendships as we'd categorise them- they are based too heavily on reciprocal giving and taking- they do provide clues as to why friendships make evolutionary sense.

Further, it has been documented in primates that individuals better able to form coalitions have an evolutionary advantage over their competitors- which has been posed as one possible origin of friendship, the logic being that many of the characteristics we prize in friends (a giving nature and so on) are the same ones we prize in potential allies.

Baboons who form strong non-reproductive bonds also show better immune function and save energy- effects that have been attributed to being relieved of the burden of continuous vigilance against potential challenges and attacks, and to a reduced sense of vulnerability.

As Bowles and Gintis (who, on a side note, wrote papers for MLK Jr.'s Poor People's March back in the day) document in A Cooperative Species, the relatively warlike nature of hunter-gatherer existence and the rapid extinction of many groups precipitated the genetic and cultural evolution of social emotions such as shame and guilt, because these conferred an advantage on any member of a relatively cooperative group. It is theorised that these emotions provided the jump from so-called 'contingent cooperation' (think: if you buy coffee for your co-workers, you expect them to buy you coffee back at a relatively fixed point in the future) to true friendship.
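To make 'contingent cooperation' concrete, here is a minimal sketch (in Python, with invented names- none of this comes from Bowles and Gintis themselves) of a tit-for-tat agent, the classic model of reciprocity that keeps score in exactly the way true friendship does not:

# A toy model of 'contingent cooperation': each agent cooperates only
# if its partner reciprocated last round. Names and data are illustrative.

def tit_for_tat(partner_history):
    """Cooperate first, then simply copy the partner's last move."""
    return partner_history[-1] if partner_history else True  # True = cooperate

def play_rounds(n, strategy_a, strategy_b):
    moves_a, moves_b = [], []
    for _ in range(n):
        a = strategy_a(moves_b)  # A reacts to B's past behaviour
        b = strategy_b(moves_a)  # B reacts to A's past behaviour
        moves_a.append(a)
        moves_b.append(b)
    return moves_a, moves_b

defector = lambda history: False  # always free-rides

print(play_rounds(4, tit_for_tat, defector))
# ([True, False, False, False], [False, False, False, False]):
# the contingent cooperator withdraws as soon as the favour isn't returned.

The contrast is the point: a friend, on the Bowles and Gintis account, is precisely someone with whom you stop running this ledger.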

But can we thus shed any light on who we become friends with?

'Don't walk in front of me; I may not follow. Just walk beside me and be my friend': Who are we more likely to be friends with?
Friendship is obviously a culturally contingent phenomenon- witness the breakdown in affectionate male friendship, particularly in Anglo-American society, that occurred after the Oscar Wilde trial (and from which the Anglo-American world has never really recovered- men used to walk arm in arm in Hyde Park; would many straight men ever do that again?). However, studies of groups as diverse as Americans and the Hadza people of north-central Tanzania have shown that there are a few common threads amongst those we choose as friends. Broadly speaking, interpersonal similarity is the strongest predictor: we are rarely friends with those who are completely dissimilar to us (except where, through repeated interaction, we grow to like them).

Much like Erving Goffman's 'matching hypothesis' for couples, there is evidence that people often pick friends of similar 'worth' as defined by various cultural characteristics- looks, intelligence, interests and so on. Apicella et al. found that 'social assortativity' (a measure of the regularity of interactions, based on the idea that we tend to interact more with our friends) is highest amongst those who cooperate in similar amounts (this is unsurprising- we like friends who are friendly!). A similar result has been found for US students and Honduran adult villagers, meaning it is likely to be robust to cultural variation. Physical similarities are also prized amongst the Hadza- after all, foraging is labour-intensive, and friends who can physically help more will contribute more to your life or group. This may also explain why, even in modern society, we tend to group with people of reasonably similar physical attractiveness to ourselves- although this is obviously also socially attuned: more attractive people are also more popular. Similar positions in a social group are likewise a strong predictor of friendship- they both bring people together more often and increase the desire for continued social interaction.
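As a rough illustration of what a measure like 'social assortativity' is getting at, the sketch below (purely illustrative- it is not Apicella et al.'s actual method, and the names and numbers are invented) correlates a trait, such as contributions in a cooperation game, across the two ends of every friendship tie; a value near 1 means like befriends like:

# Illustrative only: a simple assortativity score for a trait, computed as
# the Pearson correlation between trait values at the two ends of each tie.
from statistics import correlation  # available from Python 3.10

cooperation = {"Asha": 8, "Baraka": 7, "Chiku": 2, "Dalila": 3, "Eshe": 9}  # invented scores
ties = [("Asha", "Baraka"), ("Asha", "Eshe"), ("Baraka", "Eshe"), ("Chiku", "Dalila")]  # invented friendships

xs, ys = [], []
for a, b in ties:                            # list each tie in both directions
    xs += [cooperation[a], cooperation[b]]   # so the measure is symmetric
    ys += [cooperation[b], cooperation[a]]

print(round(correlation(xs, ys), 2))  # close to 1: friends cooperate similarly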

There is also interesting new evidence that people may befriend those with similar genotypes- in particular, a study by James Fowler found that whether a person carries DRD2 (which has been linked to alcoholism) or CYP2A6 (which has been linked to openness) is strongly associated with whether they befriend people who carry those genes or people who lack them, even after accounting for social proximity. This is particularly bad news for alcoholics: not only are they more likely to be genetically predisposed to drink to excess, they may also be genetically predisposed to befriend others who share that predisposition. But it raises an interesting broader point- is friendship also for the benefit of the genes? If we follow a Dawkins-style logic, some of the purpose of friendship may actually be to benefit our genes. It should also be noted that the Fowler study found that four other genes were not linked to friendship- so this question needs further exploration.

Some Further Questions
Obviously this is an area where many new discoveries are being made- studies of the evolution of cooperation more broadly are at the frontier of science, having been largely ignored by evolutionary biology for so long. But there is interesting evidence that, far from being mere social constructs, friendships were evolutionarily advantageous to humans as a form of reciprocity, social association and possibly even genetic association. None of this is to downplay how important and varied friendships really are- it just asks an interesting question: how was I able to feel this way towards others in the first place?

--
Dan Gibbons is a third year Bachelor of Commerce (Economics) student at the University of Melbourne. He has a forthcoming publication in Intergraph: A Journal of Dialogic Anthropology (about memory and nationalism) and is currently submitting papers on the rise of modern consumerism, the role of criminology theory in literary criticism and the institutional theory of nationalism. Dan is a keen debater and public speaker.

Saturday, 12 May 2012

'Baby I'm Your Biggest Fan, I'll Follow You Until You Love Me': Why pop culture isn't 'low culture'


Popular culture, and pop music in particular, is attacked from both the right and the left- by the former for undermining 'traditional values', and by the latter for embracing what Theodor Adorno termed the 'culture industry' (think EMI, the Murdoch Group, Disney etc.). It is also generally attacked by the bourgeoisie, hipsters and other intelligentsia for lacking 'substance'- a charge I'm certainly guilty of having made in the past.

It is also often ignored by academics (though this trend is changing)- which is deeply silly, because by examining popular culture we learn far more than by reading texts (some of which I certainly enjoy) that no one else reads. It is wrong to cast aspersions on all pop culture as 'valueless' and to treat it as an undifferentiated mass- the music, books and so on, and the reactions to them, are often as heterogeneous and interesting as their alternatives.

I want to contend that the label of 'low culture' says more about those who wield the distinction than about any medium that falls into either category. I'd like to deconstruct two main arguments about the distinction between 'high' and 'low' culture: 1) whether pop culture is 'contentless', and 2) whether commercialisation has somehow 'cheapened' culture or enslaved us (the argument that the culture industry has captured us all has some merit, I think- with qualifications).

'So, Call me Maybe': Is all pop culture free of 'content' and what is 'content', anyway?
It is often claimed (perhaps fairly in the case of, say, Rebecca Black's 'Friday') that pop culture lacks 'content'. Theodor Adorno in particular claims in The Culture Industry that modern society invented the concept of a contentless 'free time' and 'leisure' in order to tie entertainment to the culture industry.

The first issue with this is that the word 'content' is a loaded one- in what way, for instance, does Beethoven's 9th Symphony contain more 'content' than, say, Lady GaGa's 'Alejandro'? One could claim that the 9th Symphony has stood the test of time, and that is certainly true (but how can we tell what of modern culture will last? My guess is that it won't be an indie band, though). However, much of what we now consider classic was once 'pop culture', and some classics we might even consider crude and relatively 'content-free' now. I am thinking of many of Chaucer's Canterbury Tales in particular- they are more vulgar than most modern fiction, not to mention that The Prioress's Tale is one of the more anti-Semitic texts in existence. If we take 'content' as requiring 'skill', the test is also problematic: skill is subjective, and what we now value isn't necessarily the most ornate- it is mostly just what previous generations valued (who can say objectively, for example, that Shakespeare was a more skilful playwright than Marlowe?).



The second issue with this charge is that even if we take a less vague definition of 'content'- say, 'emotional range' or 'thematic range'- pop culture can live up to the test. It is worth noting first that the judgment of the present makes little difference to what is remembered later: the Impressionists were considered vulgar in their day and Ernest Meissonier was considered the height of French art, yet who is remembered now? And popular culture of the twentieth and twenty-first centuries has notable range- from the almost mythological Lord of the Rings to the wizarding world of Harry Potter, from the kitschy pleasures of Glee to the geeky Big Bang Theory, from the haunting satire of American Beauty to the classic romance of Casablanca, and from the iconic Elvis to the controversially Grammy Award-winning Arcade Fire.


The key problem, though, is that the charge of 'lacking content' really says something about those who make it. Most people who reflexively claim to hate anything popular actually look down on the masses- either as commercial slaves or as a cultural proletariat. Disliking anything popular has become the social equivalent of sumptuary laws: one thing for a 'higher' class of connoisseur, another for the rest. I would not claim that popular culture is harmless- it can be sexist, racist, voyeuristic and deeply glib at times (and I think a lot of it is terrible- but then much of every media form is terrible; you have to churn through a lot of any sort of music, literature or art to get to a few gems- look at poetry). But it should not be dismissed out of hand just because it is popular. And these charges are not exactly new- ballads, the pop music of the Middle Ages, were accused by the authorities of 'debasing' those who heard them (and indeed they were often deliciously subversive of chivalric and social norms).

But does the commercialisation charge have any weight, then? This brings us to whether culture has been 'cheapened'.

'We are Living in a Material World, And I am a Material Girl': Are we the slaves of industry? Has Culture been Cheapened?
Adorno saw all mass culture as creating false needs, supplanting the 'true needs' of freedom, creativity and genuine happiness. The issue with this theory is that humans have, for the longest time, turned to others to produce entertainment that merely entertained- from ancient Greek theatre to modern television. Indeed, very little of modern culture is as debauched as the ancient Bacchanalia!


A more serious indictment might be that the 'culture industry' of which Adorno speaks has taken control of our culture- a charge that carries weight given the influence of News Corp. and all its subsidiaries. Murdoch's tendrils run deep.

However, while corporations have certainly used cultural media to make a profit, they are not the only source of culture, and various subcultures and counter-cultures (gay subcultures, the Beatniks, the mods and so on) demonstrate that hegemony can be resisted.

Certainly, modern pop culture is displacing many traditional cultures, which is a cause for concern in many societies. Even within Western societies it may be homogenising culture into an accumulation of American tastes and values. These are serious concerns- but they are rarely actually addressed by those who raise them. I have no comprehensive solution to offer here, save that there may be a role for governments and other organisations in fostering language and other cultural customs, especially for indigenous groups- provided those customs do not actually harm the participants (practices which oppress women or minorities should not be encouraged, no matter their importance to anyone's culture).

On cheapness, I would argue that culture is as 'cheap' as it has always been- people have mostly always looked for the same sorts of things from their entertainment. As Sherman Young points out in The Book is Dead: Long Live the Book, the idea that there was ever a vast, educated public reading the literary classics is a fallacy.

'Don't You Step On My Blue Suede Shoes': Some Conclusions
Pop culture can be as terrible as any art form, but it is wrong to hate something just because it is popular. So dislike Lady GaGa if you think she is derivative or if you dislike sex-positive feminism (I think you're missing out on how fun her music is, but whatever); dislike Britney if her music doesn't mean anything to you; dislike Game of Thrones if you think it is too violent or bad fantasy; dislike Glee if you think it is poor-quality music (again missing its kitschy fun appeal, but again, whatever); and dislike Lord of the Rings if you think it is too long-winded. But don't hate anything just because it is mainstream, and especially not if you consider the mainstream 'below' you. I know I've been guilty of this in the past, but it is a poor error to make.

Now certainly there are some questions that should be considered:
1. How much do corporations really control modern culture and is this actually new?
2. How much does modern culture alienate minorities, the poor etc.?
3. Is pop culture any more derivative than other art forms?
etc.
But none of these take away from my key point- pop culture is an important part of our society and doesn't deserve to be reflexively looked down upon or ignored by the chattering classes.

And who knows- if, like me until recently, you've ignored large swathes of pop culture, maybe you'll actually find a lot of it deliciously fun.

For those who are interested, some interesting texts on this subject are:
- Theodor Adorno's The Culture Industry
- Ross King's The Judgment of Paris (on the rise of the Impressionists)
- Ken Gelder's Popular Fiction: The Logics and Practice of a Literary Field
- Hannah Arendt's The Crisis in Culture

---
Dan Gibbons is a third year Bachelor of Commerce (Economics) student at the University of Melbourne. He has a forthcoming publication in Intergraph: A Journal of Dialogic Anthropology (about memory and nationalism) and is currently submitting papers on the rise of modern consumerism, the role of criminology theory in literary criticism and the institutional theory of nationalism. Dan is a keen debater and public speaker.