Taking North American White Supremacist Groups Seriously: The Scope and the Challenge of Hate Speech on the Internet

This article aims to address two questions: how does hate speech manifest on North American white supremacist websites; and is there a connection between online hate speech and hate crime? Firstly, hate speech is defined and the research methodology upon which the article is based is explained. The ways that ‘hate’ groups utilize the Internet and their purposes in doing so are then analysed, with the content and the functions of their websites as well as their agenda examined. Finally, the article explores the connection between hate speech and hate crime. I argue that there is sufficient evidence to suggest that speech can and does inspire crime. The article is based in the main on primary sources: a study of many ‘hate’ websites; and interviews and discussions with experts in the field.


Introduction
On 12 August 2017, James Alex Fields Jr rammed his car into a crowd of anti-fascist protesters united against a white supremacist rally, Unite the Right, in Charlottesville, Virginia, United States of America (USA). Fields killed 32-year-old Heather Heyer and injured dozens of others. Prior to this attack, Fields had associated himself with the alt-right movement,2 which includes white supremacists and neo-Nazis. On his Facebook account, Fields expressed support for racism and extreme right-wing movements. Photos showed Fields with members of Vanguard America, a neo-Nazi group that is part of the Nationalist Front.
This incident illustrates the danger that the white supremacist movement poses to American society, and the close connection between hate online and hate crimes. Yet many liberals, from the USA in particular, tend to object to general hate speech regulation. They believe that legal restrictions on racist or hate speech are not warranted because they violate the speaker's autonomy. Baker (1992, 1997), for instance, argues that almost all of the harm inflicted by free speech is eventually mediated by the mental processes of the audience: the listeners determine their own response to speech, so any consequences of that response must, in the end, be attributed to the listeners. The result is the right of speakers to present their views even if assimilation by the listeners leads to or constitutes serious harm. Baker (1997, 2012), like many American liberal philosophers and First Amendment scholars, wishes to protect freedom of expression notwithstanding the harm that the speech might inflict on the audience (Abrams 2017; Cohen 1993; Hardin 2002; Meiklejohn 1965, 2000; Nye, Zelikow and King 1997; Richards 1986; Scanlon 1979, 1995; Stone 2005; Volokh 2003, 2015; see also interviews and discussions with those listed in Appendix 2 with citation identifier A). Consequently, many of my interviewees argue that American liberals tend to underestimate the harm in hate speech (interviews and discussions with those listed in Appendix 2 with citation identifier B).
Rather than speculating about what racists are saying, this article presents evidence that is largely based on primary sources. First, it reports on the author's long-term study of 'hate' websites, documenting what radical members of the White movement are saying. In this respect, the article aims to illustrate the apparent mind-set, concerns and language of the people who contribute to the hate sites. Second, it is informed by interviews and discussions with over 40 specialists (identified in Appendix 2) in seven countries over the period 2006-2016. The article documents links between speech and action, arguing that hate speech should not be dismissed as 'mere speech' and that the preferred American liberal approach of fighting ideas with ideas, speech with speech, is insufficient. Hate speech needs to be taken more seriously by the legal authorities than it currently is.
This article addresses two questions: how does hate speech manifest on North American white supremacist websites; and is there evidence of a connection between online hate speech and hate crime? The article firstly defines the concepts of hate speech and akrasia (acting against one's better judgement) and explains the methodology. I then analyse the ways that hate groups are utilizing the Internet and their purposes in doing so. Internet hate can be found on thousands of websites, file archives, chat rooms, newsgroups and mailing lists. To understand the challenge of responding to this hate, it is important to reflect on the content and agenda of hate mongers. It is also important to examine the functions of hate sites. Like many other social groups, hate groups utilize the Internet to socialize and to link with other people. Like other ideological and political groups, hate sites also aim to raise funds, share a particular worldview, propagate ideas and recruit new members. Like terrorist groups, hate groups also engage in the promotion of violence (Cohen-Almagor 2012a, 2017a). The final part of the article emphasises the connection between hate speech and hate crime. Hate crime concerns the threat or the use of physical harm motivated by prejudice (Perry 2001, 2005; USLegal.com website). Not all forms of hate speech lead to hate crimes, and hate crimes can, but do not necessarily, involve hate speech. Still, there is sufficient evidence to suggest that speech can and does inspire crime.

Concepts and methodology
The concept of hate speech
Hate speech is not a simple concept. Hate speech definitions concern both legal and illegal speech. The same speech might be illegal in Canada but legal in the USA. For example, while Canada does not permit the establishment of a Nazi party, the USA does not ban the American Nazi Party (Cohen-Almagor 2005; Neier 1979; Village of Skokie v The National Socialist Party of America 1978). Different countries also have different stands on Holocaust denial, which is a form of hate speech (Behrens, Jensen and Terry 2017; Cohen-Almagor 2009; Appendix 2, citation identifier C). The concept of hate speech thus covers a variety of speech and behavior on the Internet, as well as various motivations for production of that speech. For this reason, the sources of hate speech are manifold. Here my concern is solely with racist, white supremacist hate speech. Previously, I have defined hate speech as 'a bias-motivated, hostile, malicious speech aimed at a person or a group of people because of some of their actual or perceived innate characteristics' (Cohen-Almagor 2011). Hate speech expresses 'discriminatory, intimidating, disapproving, antagonistic and/or prejudicial attitudes toward those characteristics, which include sex, race, religion, ethnicity, colour, national origin, disability, or sexual orientation' (Cohen-Almagor 2011). Hate speech is intended to 'injure, dehumanize, harass, intimidate, debase, degrade, and victimize the targeted groups, and to foment insensitivity and brutality against them' (Cohen-Almagor 2011). A hate site is defined as a site that carries any form of hateful textual, visual or audio-based rhetoric.
The article is also informed by more than 40 semi-structured interviews and discussions I have held in Canada, the USA, Israel, France, England, Ireland and Portugal during the last decade (2006-2016). The interviews and discussions were with leading Internet scholars, security experts, and human rights activists and experts. They were designed to learn about the scope of Internet hate and what can be done to counter hate mongers' activities. Interviews varied in length from one hour to two and a half hours. The interviewees and discussants provided information and insight about the structure and functions of the Internet, the possibilities it opens for abuse, the ways the Internet has been utilised by hate organizations and individuals, and the dangers of hate speech and its links to violence and to other criminal activities. Appendix 2 compiles the names of the interviewees and discussants cited in this article together with locations and dates.

Studying racist websites
The devil is in the details. To comprehend the seriousness of the challenge that white supremacists pose to society, it is essential to understand their agenda, aims, priorities and mode of operation. Hate groups use the Internet as other users do but their intentions are sinister, antisocial and violent. Here I focus on propaganda, socializing, linking, fundraising, recruitment and the promotion of violence.

Propaganda and sharing ideology
White supremacist websites promote messages of racial superiority and attack certain religions or gays and lesbians. White supremacist groups such as the Ku Klux Klan, skinheads, neo-Nazis and the National Association for the Advancement of White People have websites, blogs, 'rants and raves' forums, discussion groups, photos and videos on the Internet. These websites (see Appendix 1) are readily accessible through Internet search engines. While a main website is set up by a group's leader, multiple sites are also set up by district or state chapters as well as by individual members. These sites usually contain the history of the sponsoring group, a mission statement, and text by group members. To attract the reader, they offer eye-catching teasers such as symbols and pictures.
For example, Northwest Front has a clear agenda to create a White Homeland in the Pacific Northwest, which they promote through a distinct flag (blue, white and green), a constitution and a set of principles on migration and citizenship. They believe in a clear demarcation in accordance with 'white blood and race' and argue for ousting those who do not belong. Yet they claim they are not about promoting hatred. They are about promoting freedom: 'We don't stand for hating people, we stand for freeing people - our people - from a yoke of tyranny and oppression that has become impossible for us to live with. We stand for preserving our race from biological and cultural extinction' (Northwest Front website, 'Dear White American'). Non-White and Jewish people and homosexuals are not welcome in their new country called The Northwest American Republic (Northwest Front website, 'Nationhood and Citizenship').
Hate mongers talk to each other, thereby reinforcing their commonly held views, empowering people who share their beliefs and identifying ways to offend their targets. This dichotomy between 'us' and 'them' is necessary as it fulfils both functions of creating a sense of belonging and of marking the bounds of unity. White supremacist websites and chat groups such as Stormfront promulgate the belief that White people are the oppressed group and that society is in danger of being overrun by ignorant, welfare-loving minorities who desire White women.
Much effort is invested in appealing to young people through video games and music that teach children that violence is acceptable (Anti-Defamation League 2012; Appendix 2, citation identifier D). In this hate propaganda, the racial 'other' is represented as a social polluter. They are metaphorically associated with disease and cast as a viral presence whose very existence on (often American) soil is sufficient to undermine its social stability and the values which have made it a strong and powerful nation. The foreigner is the enemy (Roversi 2008: 93-94). The Internet allows the ignorant and the prejudiced to send these anonymous messages to those whom they despise (Delgado and Stefancic 2004: 24). For example, Jewish Ritual Murder (Holywar website) claimed that the two principal feast-days of Judaism, Purim and Passover, were associated with the murder of Christians.
The writings of Dr William Pierce have become glorified and much celebrated within these circles to promote hatred against Jewish people (see, for example, National Alliance website). Pierce's Turner Diaries, published in 1978 under the pseudonym Andrew Macdonald, provides a fictional account of a race war by white supremacists against government officials, intellectuals, Jewish and Black people in order to establish an Aryan world. Timothy McVeigh, the Oklahoma City bomber, actively promoted the book and appeared to have carefully read some of Turner's instructions prior to the 1995 bombing that resulted in the death of 168 people: 'The plan roughly is this … Unit 8 will secure a large amount of explosives … We will then drive into the FBI building's freight-receiving area, set the fuse, and leave the truck' (Macdonald 1978: Chapter IV).
Organizations dedicated to fighting hatred are concerned with The Turner Diaries (Appendix 2, citation identifier E).

Socializing
Encouraging interpersonal socialization in the offline world is a key strategy of white supremacist websites. For instance, the Hammerskin Nation is one of the most organized and most violent neo-Nazi skinhead groups in the USA (Foxman and Wolf 2013: 13). In 2017, its website proclaimed 'REBEL HELL TOTAL WAR' and invited people to 'Beers, Bands and Brotherhood,' an exciting event that also included merchandise and a raffle (Hammerskins website). Similarly, the Nordic Fest (obsolete Southside Antifa website) is an annual white patriotic rally and music festival in Dawson Springs, Kentucky. The group claimed that comrades from all over the world travelled to this and other events held by the IKA (Imperial Klans of America). In 2017, The Brotherhood of Klans Knights of the Ku Klux Klan also convened a Summer Unity Gathering (Stormfront forum). These kinds of websites cultivate a sense of community and offer interested parties opportunities for mingling and socializing. It is one thing to find like-minded people on the Internet. It is quite another to actually meet and strike up more than a virtual friendship. Hate groups strive for both.
White power rock n' roll has been instrumental to the racist movement. Some sites have free downloadable music with lyrics promoting hate. The lyrics are violent and derogatory, calling for a racial war and the murder of Black and gay people and other 'undesired' groups. For example, Resistance Records, which offered merchandise as well as music, was a financially successful label founded in 1993 that pioneered the music of a dozen high-quality Skinhead bands, such as Cute Girl (Atkins 2011).

Fund-raising
Hate mongers are able to make blatant appeals for funding over their websites because none has been designated as a terrorist organization by official state agencies; therefore, they are not as concerned about state interference with funding channels. Appeals come in three forms: general appeals for funds needed to sustain operations; fund-raisers for legal representation for members who have been arrested; and donations towards 'official membership,' which entitles subjects to additional material, such as newsletter subscriptions (Gruen 2004: 139; Appendix 2, citation identifier G). Many hate sites also generate revenue through product sales (coins, jewellery, belt buckles, t-shirts, hats, patches, pins, flags, sports items, music, videos, comic books, memorabilia, decorations, knives and survival defence items) (see, for example, Final Conflict website; Third Reich Books website; Tightrope website). One typical website, Aryan Wear (no longer active), explained that 'Through our support of Altermedia.info and Newsnet14.com Aryan Wear helps get out news and information that is hidden by the controlled media.'

Recruitment
The Internet introduces people to new ideas. People surf the Internet, encounter intriguing ideas and get interested. Often this makes the Internet the starting point for further contact. Some people then continue to explore and may initiate contact with people who are more experienced. Then they become identifiable targets for recruitment (Angie et al. 2011; Foxman and Wolf 2013; Appendix 2, citation identifier H). Online hate sites are also used to recruit individuals to offline hate groups and coordinate group efforts (Chan, Ghose and Seamans 2016; Hall et al. 2017; Ibanga 2009; Wines and Saul 2015). Much of this is geared to teenage and young adult males. As mentioned above, music plays an important role in this recruitment. When children and youth surf the Internet for music, they may chance on sites that offer hate music, sometimes free of charge. Such sites are often linked to hate newsgroups and chat rooms (Chernynkaya 2010; Shekhvtsov 2013).

Promotion of violence
Racist websites provide links to information on terrorism. The now obsolete Aryan Nations website celebrated the 11 September 2001 Islamist extremist terrorist attacks in the USA, which killed almost 3,000 people, the rationale being that the enemy of the USA is, 'for now at least, our friend' (Wallace 2001; Appendix 2, citation identifier I). There is compelling evidence of direct connections between such materials and violent, terrorist actions (Chan, Ghose and Seamans 2016; Cohen-Almagor 2017b). In the following section, I consider this link between hate speech and violence.

From hate speech to hate crimes
On average, USA residents experienced approximately 250,000 hate crime victimizations each year between 2004 and 2015, of which about 230,000 were violent victimizations (Masucci and Langton 2017). Those who are opposed to hate speech regulation argue that venting hate speech is preferable to violent action (Baker 1997, 2012; Richards 1986). They support freedom of speech and net neutrality,3 notwithstanding its most troubling content. Further arguments are that regulation of hate speech is ineffective, futile, makes martyrs out of haters and might even help them achieve their goals (Henthoff 1992: 134; Appendix 2, citation identifier J). But absolute net neutrality in itself constitutes a form of clear-eyed akrasia because it entails an abrogation of moral and social responsibility for Internet content. Indeed, the trouble with these arguments, as Allport (1954) and others have observed, lies with their empirical assumptions.
Furthermore, government failure to act against hate speech helps to normalize or even authorize the relevant hate speakers to carry on doing what they are doing (thus reinforcing the original akratic inaction). Victims of unrecognised hate speech end up lacking protection (Brown 2017: 604). In his critique of First Amendment scholars in the USA, Waldron (2012: 165) rightly notes that hate speech harms the dignity of its targets by undermining public assurance and support. Waldron (2012: 171) explains: 'To the extent that the message conveyed by the racist already puts them on the defensive, and distracts them from the ordinary business of life ... to that extent, the racist speech has already succeeded in one of its destructive aims'. In contrast, supporters of free speech such as Baker (1992, 1997, 2012) give no convincing reason why society should tolerate hate speech when the pain may be so strong, so immediate and so penetrating that people do not have the luxury of choosing their response. A recent study by Chan, Ghose and Seamans (2016) found that some 14,000 Internet sites contained hate-related content. Using a large-scale dataset and econometric techniques, they found a positive relationship between Internet penetration and offline racial hate crime. This correlation is most evident in areas with higher levels of racism than others, indicated by higher levels of segregation and a higher propensity for people in those areas to search for racially charged words. Chan, Ghose and Seamans (2016) also observed a link between online hate sites and the incidence of racial hate crimes executed by lone wolf perpetrators. My own research spanning two decades concludes there is evidence for this link.
For example, in 1999, 21-year-old Benjamin Nathaniel Smith shot and killed two innocent people and wounded eight others after being exposed to Internet racial propaganda (Anti-Defamation League 2003a; Apologetics Index website; Berkowitz 1999; Greyhavens 2007; Church of the Creator website (registration required); Appendix 2, citation identifier K). Smith said: 'It wasn't really 'til I got on the Internet, read some literature of these groups that … it really all came together' (Wolf 2004b). He maintained: 'It's a slow, gradual process to become racially conscious' (Wolf 2004a, 2004b; Chan, Ghose and Seamans 2016).
The same year Buford Furrow embarked on a hate-motivated shooting spree after visiting hate sites, including Stormfront and a macabre site called Gore Gallery on which explicit photos of brutal murders were posted (Levin 2002: 959). Furrow killed one person and wounded five others.
Throughout the 2000s, there were numerous cases in the USA of active users of white supremacist Internet sites committing offences of racial violence and intimidation (Fuoco 2001; Gruen 2004: 128; United States v Magleby 2001; for discussion on the cross burning phenomenon, see Bell 2004; Gey 2004; Newton 2014). In a Canadian case, the Canadian Human Rights Commission concluded that the materials used by such offenders were likely to expose those of the Jewish faith, Aboriginal peoples, francophones, Black people and others to hatred and contempt: 'They are undoubtedly as vile as one can imagine and not only discriminatory but threatening to the victims they target' (Warman v Harrison 2006: 23-24; CBC News 2006). The danger is well exemplified in the case of Keith Luke who, in 2009, murdered two Black people and raped and nearly killed a third on the morning after Barack Obama was inaugurated as president.
When he was captured, Luke told police that he intended to go to a synagogue that night and kill as many Orthodox Jewish people as possible. Luke also told the police that he had been reading white power websites for about six months (in other words, from about the time that Obama won the Democratic nomination) and had concluded that the White race was being subjected to a genocide in America. Therefore, he had to act (Ellement 2009;Holthouse 2009).
Later the same year, on 10 June 2009, James von Brunn entered the USA Holocaust Memorial Museum in Washington DC and killed Security Guard Stephen Tyrone Johns. Von Brunn, a diehard white supremacist anti-Semite, was an active neo-Nazi for decades (Beirich 2009;Martin 2015). For this Holocaust denier, the Holocaust Museum was the most appropriate place for the shooting as it served the greatest hoax of all time.
There is some similarity between von Brunn and the 73-year-old American Nazi Frazier Glenn Miller, who in 2014 shot and killed three people at Jewish community facilities in Overland Park, Kansas.
Together, these cases demonstrate that online hate speech and hate threats need to be taken seriously. When harmful speech is closely linked to harmful action, to the extent that one does not know where the harmful speech ends and the harmful action begins, those speech-acts do not warrant protection (Cohen-Almagor 2006; George 2017). Incitement warrants legal intervention. Overly permissive and tolerant attitudes towards hate speech are a form of akrasia, whereby people act against their better judgment. Not just those who post but also those who allow such postings on their servers are culpable for their akratic conduct. Whether through ignorance, indifference or insistence on clinging to freedom of speech without caring about dangerous consequences, such conduct is unjustifiable. Internet service providers are expected to abide by a basic code of conduct, one that objects to rather than celebrates violence and its promotion. When it comes to hate speech on the Internet, society and its regulators cannot continue to remain akratic and avoid responsibility for the harm that is inflicted. As Christopher Wolf, Chair of the Internet Task Force of the Anti-Defamation League, argues: 'The evidence is clear that hate online inspires hate crimes' (Wolf 2004b).

Conclusion
Hate is a powerful emotion. People who allow themselves to develop hatred towards others move in vicious circles. With the help of the Internet, they find like-minded people and then engage in discussions about why their hatred is justified, and what can be done to fight their targets of hate. The entire conversation is negative, dark and destructive. The bigots incite each other to hatred, and push those who are prone to violence to act upon their hatred. This article shows that hateful messages are destructive. They directly harm the victimized targets and they might indirectly desensitize the public on important issues (such as Holocaust denial). Allowing hate to propagate freely shows a form of irresponsible akrasia, acting against one's better judgement through weakness of will (Stroud 2014) or, worse, through an intention to express bigotry and hate.
This article focused on the study of websites and their conduciveness to hate crime. Hate groups are quite varied and many do not allow access except through direct personal contact, not through the Internet (Appendix 2, citation identifier L). However, some hate mongers make the most of the Internet and the communication options that are now open to them beyond websites: blogs, email, Usenet Newsgroups (computer discussion forums), Web-based bulletin boards, clubs and groups on social networks, chat rooms, Internet Relay Chat and instant messaging. With the help of the Internet, hate groups are able to reach places that were closed to them before: homes, schools, offices. Social networking sites are particularly well suited for connecting social outcasts, angry and isolated individuals on the fringe of society who find solace and comfort in cyberspace. Facebook, Twitter and YouTube are increasingly the platforms used to disseminate hate and to target teens and children to become supporters of hate (Fuchs 2014; KhosraviNik and Unger 2015; Werts 2000). Further research may analyse the ways social media apps are used in spreading hate speech, the way modern technologies are exploited to spread hate speech and whether search engines and social networking sites should continue to assist hate groups in their agenda. I have suggested some counter-measures to tackle Internet hate elsewhere (Cohen-Almagor 2014).
We also need more research that compares the utilization of the Internet to spread hatred to the way it is being used by other anti-social groups such as paedophiles (Cohen-Almagor 2013) and terrorists. From my interviews with experts on children's safety, terrorism, crime and hate, there seem to be many commonalities between the modes of operation of these groups. Such comparative studies may help security agencies in the fight against these phenomena (Appendix 2, citation identifier M).
The Internet became commercial and widespread only in the early 1990s. In historical terms, this is new technology in the making. As has been the case with any other powerful innovation that shapes our lives, people quickly learn to cope with the benefits that technology yields. Society and

1 I am grateful to Abraham Cooper, Harvey Goldberg, Oren Segal, Stephanos Stavros, Jonathan Vick, Richard Warman and Chris Wolf for important information. I thank my interviewees for their time and many valuable insights.
2 'Alt-right' or 'alternative right' is defined by Urban Dictionary (2017) (at https://www.urbandictionary.com/define.php?term=alt-right accessed 27 March 2018) as 'a name embraced by some white supremacists and white nationalists to refer to themselves and their ideology, which emphasizes preserving and protecting the white race in the United States in addition to, or over, other traditional conservative positions such as limited government, low taxes and strict law-and-order. The movement is a mix of racism, white nationalism and populism. Its members criticize multiculturalism, feminists, Jews, Muslims, gays, immigrants and other minorities. They reject the American democratic ideal'.
3 Net neutrality is the basic principle that prohibits Internet service providers from speeding up, slowing down or blocking any content, applications or websites.