Historical_General

joined 2 years ago
[–] [email protected] 2 points 1 year ago (1 children)

If they tied a Bookwyrm comments section to an ISBN, for example, then anybody or any site could easily embed it, making it a universal tool rather than one specifically connected to a piracy site.

[–] [email protected] 1 points 1 year ago

Yeah, it's silly and odd and likely done to push customers towards formats that they have greater control over.

Those epubs that aren't really epubs, the random blocking of azw3 files (which they officially support!!!) from being downloaded directly in the Kindle's built-in browser, and other restrictive behaviour are all part of this. That's why I'm eventually looking to enable epubs on the Kindle once the people at MobileRead find a way to do it. Apparently Calibre can also be set up to send files via email, so that's another option.

[–] [email protected] 1 points 1 year ago (4 children)

They're not, though. They only do over-the-cloud conversions from epub to a proprietary Amazon format, which can make the covers or formatting go awry.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

It's been disabled for now because of CSAM spam.

https://feddit.nl/post/7266922?scrollToComments=true

If the community is on another server, I recommend using an alt from yet another server, making the alt a mod and then adding it that way. It worked for me.

[–] [email protected] 7 points 1 year ago (4 children)

I'm envisioning Bookwyrm behaving as a comments section for Anna's Archive (or possibly any decentralised book repository), but with reviews instead of comments. I'm reminded of the Disqus or Facebook comment boxes that you often see embedded on certain sites.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Mate, Palestinians just happen to exist and want to thrive. Stop ascribing some violent fantasy that never was to those poor people.

[–] [email protected] 1 points 1 year ago

That's a throwback.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (2 children)

It's not worthless; there's 500 million dollars' worth of gas north of Gaza that Israel wants to secure. They're already stealing it and have been for years. And the 75-year-long occupation must end, of course.

[–] [email protected] 2 points 1 year ago

I would have thought this was common knowledge. I suspect these redditors just don’t put any effort into recall or thinking in general.

[–] [email protected] -3 points 1 year ago (1 children)

No you dimwit. I read the papers. Normal people do do that. Dimwit.

 

cross-posted from: https://lemm.ee/post/12865151

Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

A witch-hunter nicknamed ‘The Bloody Juglar’ appears to have used a retractable needle to prick his victims without drawing blood, while another responsible for the deaths of many innocent women turned out to be a woman herself


At Spynie Palace in 1662, John Innes of Leuchars had a serious problem on his hands. Local people were complaining to him about milkless cows, shrivelling crops and dying children. Pretty obvious that a witch was on the loose. As the local law enforcement thereabouts, John was expected to do something, but witch-hunting was not in Mr Innes’s skill set.

It must have been a relief when a slight young man almost magically appeared in front of him: John Dickson’s the name, and witch-hunting’s the game. Bags of experience. Happy to sort the problem out. Possibly dropped the name of superstar witch-hunter John Kincaid into the conversation, a Tranent man with a reputation as Scotland's most fearsome witch pricker or ‘brodder’.

The Scots didn't do witch-ducking. We went for the needle. The Devil gave his followers marks somewhere on their bodies. Where the Devil left his mark, there would be no blood, and no pain. Kincaid and his like would use the needle to ‘prick’ the accused. The words prick and needle are misleading. This needle was no dainty thing to be lost easily in a haystack. These were more like hefty great crochet hooks. The ‘pricking’ was more of a violent slam into the body.

The mark could be anywhere. The accused were stripped and shaved, and the needle plunged in. Some victims didn’t move, scream or bleed – the mark had been found. Possibly they couldn’t move. They may have been in deep shock. These were pious times.

Women rarely left home without covering their heads; now they stood publicly naked, shaved and exhausted. There may well have been little or no bleeding, if the needle hit a part of the body with a poor blood supply. Or perhaps the needle was retractable.

There are clues to such trickery. In the late 17th century, a witch-hunter nicknamed “The Bloody Juglar” turned up in Berwick-upon-Tweed. Pretty quickly his trusty needle pricked a victim and drew no blood. A witch, ready for trial and execution. Hold up, said Colonel Fenwick, the town’s military governor. He called in the mayor and the magistrates. He was worried that this evidence was falsely procured. He had his suspicions about that needle.

Why not get The Bloody Juglar to do the pricking again, but with a council-provided needle? Our boy baulked – “by no means would he be induced unto”. To the good people of Berwick, this “was a sufficient Discovery of Knavery”. The Juglar was busted.

John Kincaid may have been a knave, but between 1649 and 1662 he rampaged freely. It was lucrative. He pocketed £6 for a discovery of a witch at Burntcastle estate. They chucked in another £3 to cover the booze bill for him and his manservant.

The year 1659 was a busy one. Kincaid seems to have pricked profitably in East Lothian, where 18 accused witches were executed. In 1661, Forfar was so chuffed with his efforts that they gave him the freedom of the burgh.

Perhaps young John Dickson was inspired by Kincaid. Seemed a good trade for a lad, finding God's enemies and being very well paid for it, too. John headed north, fetched up at Spynie Palace and appeared before the harassed Innes, who wasted no time in signing up his new witch-hunter to an exclusive contract.

John was on a good retainer with performance-related bonuses, six shillings a day expenses plus £6 per witch caught. In no time at all, our man on the make had two servants and a very fancy horse. He was on-call and carried out witch-pricking in Elgin, Forres, Inverness and Tain. He possibly pricked Isobel Goudie, Scotland’s most famous witch.

He had a particular take on the procedure. Folk called him the Pricker “because of his use of a long brasse pin”. He had his victims stripped naked, then the “spell spot was seen and discovered. After rubbing over the whole body with his palms.” In a vicious witch-hunt/clan war in Wardlaw on the banks of Loch Ness, 14 women and one man were treated so savagely under John’s direct supervision that some of them died.

Our boy was on a roll, until he did something stupid. He pricked a man named John Hay, a former messenger to the Privy Council. Now, this was not a man to mess with. He had connections. He wrote to Edinburgh complaining in an incredibly civil servant manner, denouncing the witch-pricker who worked on his case as a “cheating fellow” who carried out the torture without a licence. Even witch-hunters need the correct paperwork.

The Privy Council in Edinburgh agreed. They called the maverick Mr Dickson in for a word. And they made a terrible discovery: John Dickson was a woman. Her name was Christian Caddell, and she came from Fife. Oh, she could tell a witch, no doubt about it. She claimed she spotted them by looking into their eyes and seeing an upside-down cross.

Of course, this was not the scientifically accepted manner of witch-finding. A needle must be used. And, obviously, you needed to be a man.

Christian stood trial, not for fake witch hunting, torturing or even for those murderous deaths, but for wearing men’s clothing. She was sentenced to transportation, and on May 6 she sailed from the port of Leith on the ship Mary, bound for Barbados.

On the day she left Scotland, Isobel Elder and Isabel Simson, pricked by John Dickson, aka Christian Caddell, were burned in Forres. Just because you were discovered to be a witch in the wrong way didn’t mean to say you were innocent. They were the last two victims of the cross-dressing counterfeit witch-pricker.

 

cross-posted from: https://lemm.ee/post/12600657


Seventeenth-century English antiquarians thought that Stonehenge was built by Celtic Druids. They were relying on the earliest written history they had: Julius Caesar’s narrative of his two unsuccessful invasions of Britain in 54 and 55 BC. Caesar had said the local priests were called Druids. John Aubrey (1626–1697) and William Stukeley (1687–1765) cemented the Stonehenge/Druid connection, while self-styled bard Edward Williams (1747–1826), who changed his name to Iolo Morganwg, invented “authentic” Druidic rituals.

Druidism has come a long way since. In 2010, The Druid Network was listed as a charity in England and Wales, essentially marking the official recognition of Druidism as a religion. (74,000 called themselves Druids in a recent census.) Historian Carole M. Cusack positions Druidism as one of the branches of the tree of Paganism and/or New Age-ism(s), which burst into all sorts of growth during the twentieth century. Modern Druidism fits into the smorgasbord of what Cusack calls the “deregulated spiritual marketplace” of our times.

But there’s a disconnect here. In the popular imagination, Stonehenge and Druidism now go together like tea and crumpets. Historically, Stonehenge, a product of Neolithic Britain, predates Caesar by thousands of years. It had nothing to do with Druids and certainly nothing to do with modern Druidism.

“The false association of [Stonehenge] with the Druids has persisted to the present day,” Cusack writes, “and has become a form of folklore or folk-memory that has enabled modern Druids to obtain access and a degree of respect in their interactions with Stonehenge and other megalithic sites.”

Meanwhile, archaeologists continue to explore the centuries of construction at Stonehenge and related sites like Durrington Walls and the Avenue that connects Stonehenge to the River Avon. Neolithic Britons seem to have come together to transform Stonehenge into the ring of giant stones—some from 180 miles away—we know today. Questions about construction and chronology continue, but current archaeological thinking is dominated by the findings and analyses of the Stonehenge Riverside Project of 2004–2009. The project's surveys and excavations were the first major archaeological explorations of Stonehenge and its surroundings since the 1980s. The project archaeologists postulate that Stonehenge was a long-term cemetery for cremated remains, with Durrington Walls serving as the residences and feasting center for its builders.

The hippie-turned-New Age movements birthed in the 1960s and 1970s resulted in a surge of interest in Stonehenge. Tens of thousands, not all of them Druids, attended the Stonehenge Free People’s Festival starting in 1974. In 1985, the festival was halted by English Heritage, the organization that maintains Stonehenge today, because of the crowds, disorder, and vandalism. Druids were also banned from performing rituals on site. However, English Heritage and the Druids soon came to an understanding: Druids could use the site as long as there was no associated festival.

So the clash of academic archaeology and what might be called folk archaeology comes into stark focus at Stonehenge.

Modern paganism is not without interest, of course, but continuing revelations about prehistory—whether of Neolithic Britain or elsewhere—should be a lot more interesting. So are the techniques used to extract data from the past. One example used to telling effect by the Stonehenge Riverside Project is the analysis of lipid residues on pottery: we can tell whether a pot held dairy products or the fat of ruminants or pigs, giving insights into diet four thousand years ago. Another example: strontium isotopes in bovine molars show that beef consumed at Durrington Walls was raised at least thirty miles away.

Of course, all this is not as photogenically mysterious/magical as robed Druids in the long shadows of a midwinter sunset. Academic archaeology, which suffers from charges of “elitism” in the reactionary populist politics of anti-intellectualism and anti-science, has a hard time competing with the popular irrationality of mysticism. Maybe the real Stonehenge needs more publicists.


Subscribe to [email protected] and [email protected]

 

cross-posted from: https://lemm.ee/post/10945207

Long Read Review: Hitler’s American Model: The United States and the Making of Nazi Race Law by James Q. Whitman

*In Hitler’s American Model: The United States and the Making of Nazi Race Law, legal scholar James Q. Whitman examines how Nazi Germany looked to the model of the Jim Crow laws in the USA when formulating the Nuremberg Laws in the 1930s. This is a carefully researched and timely analysis of how racist ideology can penetrate the political and institutional fabric of societies, further underscoring its continued impact in the USA today, writes Thomas Christie Williams.*

After the full horrors of Nazism were exposed at the end of World War II, eugenics – in Francis Galton’s words, the ‘science which deals with all influences that improve the inborn qualities of a race’ – as a social and scientific movement slowly faded from public view. The fact that Ronald Fisher, the founder of the modern discipline of genetics, and John Maynard Keynes, the economist whose ideas underpinned the New Deal, were active members of the Eugenics Society is now rarely discussed at Cambridge University, where they spent much of their academic careers. In 1954, the name of scientific journal the Annals of Eugenics was changed to the Annals of Human Genetics, and in 1965 the incoming recipient of the Chair of Eugenics at UCL, Harry Harris, became instead the Galton Professor of Human Genetics.

However, two groups of people have worked hard to keep memories of this great enthusiasm for a ‘scientific’ approach to institutionalised racism alive. The first are those who see understanding the history of the twentieth century as important, in order that we do not make the same mistakes again. They argue that whilst Nazism was the extreme end of the spectrum, it espoused views on nationality and race that were, if not mainstream, definitely recognised as acceptable by many sectors of society in Europe and the Americas. James Q. Whitman, author of Hitler’s American Model: The United States and the Making of Nazi Race Law, falls into this camp.

A legal scholar, Whitman identifies many commonalities between Nazi legislation in the early 1930s, which sought to exclude Jews from German public life, and the ‘Jim Crow’ laws enacted to exclude African Americans in the United States. Moving beyond commonalities, he argues that Nazi lawyers and the German public had a keen interest in US race law. As an example, he cites a 1936 article on racial policy in Neues Volk (New Volk), a propaganda newsletter from the National Socialist Office, which included a US map labelled ‘Statutory Restrictions on Negro Rights’, detailing disenfranchisement and anti-miscegenation laws in the 48 mainland US states.

The second group comprises the far-right movements arguably edging into the mainstream in the United States and Europe (in Hungary or Holland, for example). The chants of ‘Blood and Soil’ from the recent white supremacist rallies in Charlottesville, Virginia were an explicit reference to the Nazi ideal of ‘Blut und Boden’, and those gathered there are united by their fascination with fascist ideology and rhetoric. Vanguard America argues in its manifesto for an economy ‘free from the influence of international corporations, led by a rootless group of international Jews, which place profit beyond the interests of our people’. Membership of the National Socialist Movement (described on their website as ‘America’s Premier White Civil Rights Organization’) is ‘open to non-Semitic heterosexuals of European descent’, and a popular blogger for the alt-right, Mike Peinovich, who spoke at Charlottesville, hosts a chatshow entitled ‘The Daily Shoah’.

Hitler’s American Model is therefore a timely and sobering outline of how racist ideology can make its way into the political fabric of a country. It focuses on the changes introduced by Nazi lawyers post-1933, but we also learn much about how this developed in the United States. Whilst in the latter the case law excluding non-whites from public life developed over decades, in Nazi Germany the Nuremberg Laws were drafted and introduced in 1935, just two years after Hitler became Chancellor. Whitman’s main premise is that in this accelerated process, German lawyers and officials took inspiration and concrete guidance from legal practice across the Atlantic.

Reading the book, two sets of records stand out, one for their presence, and the other for their absence. The first is the stenographic report of a 5 June 1934 meeting of the Commission on Criminal Law Reform. Whitman’s twenty-page description of this transcript makes for gripping reading, and is the highlight of the book (94-113). The second is the lack of documentation regarding a September 1935 US study tour by 45 German lawyers (132). The trip was apparently a reward for their success in finalising the Nuremberg Race Laws, laid out by Hermann Göring at a rally only a few weeks earlier. As Dr. Heubner, chief of the Nazi Jurists’ Association, told the tour group before they left: ‘through this study trip the upholder of German law [will] gain the necessary compensation for an entire year of work’ (133). According to Whitman, the historical record tells us that on arrival in New York, at a reception organised by the New York City Bar Association, the group were met by a noisy demonstration lasting six hours and requiring a police presence. However, in Whitman’s words: ‘sadly it does not seem possible to learn more about how […] the group fared on their study trip’. From the first set of records we learn much about how German lawyers saw their American counterparts; from the second (missing) set, we might have learnt more about how the American establishment viewed legal developments in the Third Reich.

Assembled at the 1934 meeting were seventeen lawyers and officials, and their brief was to respond to the demands of the Prussian Memorandum of September 1933. This document argued that the ‘task of the National Socialist State is to check the race-mixing that has been underway in Germany over the course of the centuries, and strive towards the goal of guaranteeing that Nordic blood, which is still determinative in the German people, should put its distinctive stamp on our life again’ (85). The final outcome of such meetings was the Nuremberg Laws, which consisted of three parts. The first, the Flag Law for the Reich, declared the swastika to be the only German national flag. The second, the Citizenship Laws, created a difference between German nationals – ‘any person who belongs to the mutual protection association of the German Reich’ – and the citizen – ‘a national of German blood’ who was the ‘sole bearer of full political rights’ (29). The third, the Nuremberg Blood Laws, made a criminal offence of marriage or extramarital sex between ‘Jews and nationals of German blood’ (31).

Whitman’s description of the 1934 meeting is gripping for a number of reasons. Firstly, it allows the opportunity to witness the mechanics of discrimination at work. We learn how a group of highly educated professionals – civil servants, legal academics, medical doctors – came together to formulate a set of profoundly exclusionary and undemocratic laws. The committee was faced with a number of questions. How could one define race in legal terms? Could it be possible to criminalise an act (in this case, sexual relations between a German and a Jew) to which two competent parties had consented? Secondly, for a non-American reader, it further underscores the deeply institutionalised discrimination within US law at this time, belying the idea that a supposedly independent judiciary can act to protect the rights of all citizens.

In Whitman’s interpretation, two groups were pitted against each other at the 1934 meeting. The first were juristic moderates, who felt that a policy of criminalising German and Jewish sexual relations was not in keeping with the German legal tradition. German criminal law, they argued, was based on clear and unambiguous concepts (105). Race, and in particular Jewishness, was difficult to ‘scientifically’ define (105); judges could not be expected to convict on the basis of vague concepts. Their adversaries were Nazi radicals, who argued that a new Criminal Code should be drawn up using the ‘fundamental principles of National Socialism’ (96). According to Whitman, it was these radicals who championed American law, already touched on in the Prussian Memorandum.

As it turns out, the American approach to defining race was not greatly troubled by the absence of a scientific conceptualisation. For the Nazi radicals, this was a heartening example. Roland Freisler, a State Secretary attached to the Ministry of Justice, pointed out: ‘How have they gone about doing this [defining race]? They have used different means. Several states have simply employed geographical concepts […] others have conflated matters, combining geographical origin with their conception of a particular circle of blood relatedness’ (107). Freisler continued:

they name the races in some more primitive way […] and therefore I am of the opinion that we can proceed with the same primitivity that is used by these American states (109).

Contrary to established German tradition, Nazi radicals believed that judges should be given freedom to institute racist legislation, without the need to come up with a scientifically satisfactory definition of race.

It is hard to argue with Whitman’s assertion that Nazi jurists and policymakers took a sustained interest in American race law, and that this helped shape the legal and political climate that led to the promulgation of the Nuremberg Laws. What Whitman moves on to in his conclusion is the extent to which the American legal and political system as a whole, beyond Jim Crow, was permeated with racism: laws related to race-based immigration, race-based citizenship and race-based anti-miscegenation. He makes the unsettling argument that America and Nazi Germany were united by a strong egalitarian, if not libertarian (in the Nazi case), ethos. This ethos, he argues, is that of all white men being equal, and thus it was not surprising that Nazism – in Whitman’s view an egalitarian social revolution for those self-defining as of German origin – turned to America for inspiration. As Whitman points out, white supremacy has a long history in the US, from 1691 when Virginia adopted the first anti-miscegenation statute, to 1790, when the First Congress opened naturalisation to ‘any alien, being a free white person’ (145), to the anti-immigration laws that followed the San Francisco Gold Rush and the segregation laws that followed the Civil War. In the wake of the Charlottesville protests, he would probably argue against Senator John McCain’s assertion that ‘white supremacists and neo-Nazis are, by definition, opposed to American patriotism and the ideals that define us as a people and make our nation special’.

Whitman also questions whether the US common law system really serves to protect the freedom of individuals against an over-reaching state. He points out that the Nazis, rather than taking over the pre-existing German civil law system, reformed it according to a common law model. Nazi officials were given discretion to act in what they believed to be the ‘spirit of Hitler’ (149), brushing aside the legal scientific tradition of the moderates of the 1934 meeting. He argues that when it came to race, American ‘legal science’ tended to yield to American politics and left much racist legislation untouched.

So where does that leave the ‘science’ of eugenics, and the ‘legal science’ of the jurists working in a civil code system? Does a logically consistent approach of any kind protect individual liberties, or rather open up a way to discriminate based on supposedly objective measures? An important point, not explicitly made by Whitman but implicit throughout the book, is that the supposed objectivity of a scientific approach (whether in biology or the law) can easily be misused by those whose aims are clearly undemocratic and unegalitarian. On ‘The Daily Shoah’ and other racist websites, substantial discussion is devoted to ‘metrics’ related to, for example, race and IQ or sexual orientation and the chance of conviction for paedophile offences.

The Charlottesville protests were sparked by the decision to remove a statue of Robert E. Lee, a Confederate General in the Civil War: proponents of the removal argued that it served as a monument to white supremacy. Conversely, in the United Kingdom, a similar controversy surrounding a petition to remove Cecil Rhodes’s statue in Oriel College Oxford failed to lead to its removal, and the Galton Institute in London (which acknowledges its founding as the Eugenics Education Society in 1907, but disassociates itself from any interest in the theory and practice of eugenics) continues to fund research and award essay prizes on genetics for A Level students. Clearly retaining the material legacy of historical figures runs the risk of allowing their glorification (as in Charlottesville), whitewashing or suggesting implicit sanction of their actions.

However, in Whitman’s view, to try to forget or ignore these figures and their ongoing influence on society today is the more dangerous option. Hitler’s American Model is a thoughtful and carefully researched account of how the legal community in the US and Germany proved ‘incapable of staving off the dangers of the politicization of criminal law’ (159). He worries that:

the story in this book […] is not done yet […] what Roland Freisler saw, and admired, in American race law eighty years ago is still with us in the politics of American criminal justice (160).

Given recent developments in American politics, this should perhaps give us all pause for thought.


Subscribe to [email protected] :)

 

cross-posted from: https://lemm.ee/post/10358195

The road from Rome

The fall of the Roman Empire wasn’t a tragedy for civilisation. It was a lucky break for humanity as a whole

For an empire that collapsed more than 1,500 years ago, ancient Rome maintains a powerful presence. About 1 billion people speak languages derived from Latin; Roman law shapes modern norms; and Roman architecture has been widely imitated. Christianity, which the empire embraced in its sunset years, remains the world’s largest religion. Yet all these enduring influences pale against Rome’s most important legacy: its fall. Had its empire not unravelled, or had it been replaced by a similarly overpowering successor, the world wouldn’t have become modern.

This isn’t the way that we ordinarily think about an event that has been lamented pretty much ever since it happened. In the late 18th century, in his monumental work The History of the Decline and Fall of the Roman Empire (1776-1788), the British historian Edward Gibbon called it ‘the greatest, perhaps, and most awful scene in the history of mankind’. Tankloads of ink have been expended on explaining it. Back in 1984, the German historian Alexander Demandt patiently compiled no fewer than 210 different reasons for Rome’s demise that had been put forward over time. And the flood of books and papers shows no sign of abating: most recently, disease and climate change have been pressed into service. Wouldn’t only a calamity of the first order warrant this kind of attention?

It’s true that Rome’s collapse reverberated widely, at least in the western – mostly European – half of its empire. (A shrinking portion of the eastern half, later known as Byzantium, survived for another millennium.) Although some regions were harder hit than others, none escaped unscathed. Monumental structures fell into disrepair; previously thriving cities emptied out; Rome itself turned into a shadow of its former grand self, with shepherds tending their flocks among the ruins. Trade and coin use thinned out, and the art of writing retreated. Population numbers plummeted.

But a few benefits were already being felt at the time. Roman power had fostered immense inequality: its collapse brought down the plutocratic ruling class, releasing the labouring masses from oppressive exploitation. The new Germanic rulers operated with lower overheads and proved less adept at collecting rents and taxes. Forensic archaeology reveals that people grew to be taller, likely thanks to reduced inequality, a better diet and lower disease loads. Yet these changes didn’t last.

The real payoff of Rome’s demise took much longer to emerge. When Goths, Vandals, Franks, Lombards and Anglo-Saxons carved up the empire, they broke the imperial order so thoroughly that it never returned. Their 5th-century takeover was only the beginning: in a very real sense, Rome’s decline continued well after its fall – turning Gibbon’s title on its head. When the Germans took charge, they initially relied on Roman institutions of governance to run their new kingdoms. But they did a poor job of maintaining that vital infrastructure. Before long, nobles and warriors made themselves at home on the lands whose yield kings had assigned to them. While this relieved rulers of the onerous need to count and tax the peasantry, it also starved them of revenue and made it harder for them to control their supporters.

When, in the year 800, the Frankish king Charlemagne decided that he was a new Roman emperor, it was already too late. In the following centuries, royal power declined as aristocrats asserted ever greater autonomy and knights set up their own castles. The Holy Roman Empire, established in Germany and northern Italy in 962, never properly functioned as a unified state. For much of the Middle Ages, power was widely dispersed among different groups. Kings claimed political supremacy but often found it hard to exercise control beyond their own domains. Nobles and their armed vassals wielded the bulk of military power. The Catholic Church, increasingly centralised under an ascendant papacy, had a lock on the dominant belief system. Bishops and abbots cooperated with secular authorities, but carefully guarded their prerogatives. Economic power was concentrated among feudal lords and in autonomous cities dominated by assertive associations of artisans and merchants.


Read more through the link. And join lemm.ee/c/history

 

Do not forget them

Thousands of Indigenous children suffered and died in residential ‘schools’ around the world. Their stories must be heard

Between 1890 and 1978, at Kamloops Indian Residential School in the Canadian province of British Columbia, thousands of Indigenous children were taught to ‘forget’. Separated from their families, these children were compelled to forget their languages, their identities and their cultures. Through separation and forgetting, settler governments and teachers believed they were not only helping Indigenous children, but the nation itself. Canada would make progress, settlers hoped, if Indigenous children could just be made more like white people.

In 1890, this curriculum of forgetting was forcibly taught in the few wooden classrooms and living quarters that comprised Kamloops Indian Residential School. But in the early 20th century, the institution expanded, and a complex of redbrick buildings was constructed to accommodate an increase in students. In every year of the 1950s, the total enrolment at the ‘school’ exceeded 500 Indigenous children, making this the largest institution of its kind in Canada.

Today, the redbrick buildings are still standing on the Tk’emlúps te Secwépemc First Nation’s land. You can still look through the glass windows and see the old classrooms and halls. You can walk the grounds, toward the site of the former orchard or the banks of the nearby river. And you can stand over the graves of 215 children who died right here, at Kamloops Indian Residential School. Some never saw their fourth birthday.

You might think the Kamloops ‘school’ and its unmarked graves are an isolated and regrettable part of Canadian history, which we have now moved beyond. But that is a lie. Those 215 graves are part of a much larger political project that continues to this day.

When the burial sites at Kamloops were identified in May 2021 using ground-penetrating radar, news of the ‘discovery’ spread through international media. First-hand accounts of former students and Indigenous community members began to spread, too, and it soon became clear to the wider world that the ‘discovery’ was really a confirmation of what Indigenous peoples in Canada had known for generations. As Rosanne Casimir, the current Kúkpi7 (chief) of Tk’emlúps te Secwépemc, explains it, the search for bodies was a deliberate attempt to verify a knowing:

We had a knowing in our community that we were able to verify. To our knowledge, these missing children are undocumented deaths … Some were as young as three years old. We sought out a way to confirm that knowing out of deepest respect and love for those lost children and their families, understanding that Tk’emlúps te Secwépemc is the final resting place of these children.

The testimonies from survivors and their descendants were met with expressions of shock and disbelief from settler Canadians: how could this have happened? Why didn’t we know anything about this? But the knowledge was no secret. It was publicly available in institutional records; it was in the testimonies of Indigenous peoples; and it was in 20th-century reports made by government officials. We didn’t just choose to forget, we participated in a grand project of forgetting.

During the past decade or so, I have been finding out what I can – as a white British psychologist with longstanding interests in education and social justice – about this forgetting and the attempts made to forcibly assimilate Indigenous peoples through residential ‘schooling’. I am grateful beyond measure to the Indigenous peoples from Canada and elsewhere who have generously shared their experiences and stories with me over the years. Very often, their parting advice to me has been something along the lines of: ‘You should educate your own people about this.’ This essay is my most recent attempt to do so.

Abuses didn’t take place only in the dim and distant past

Yes, I’ve been honoured and privileged to have had Indigenous survivors of ‘educational’ systems, and their descendants, share their experiences and perspectives with me. But hearing the truth directly isn’t the only way for settlers and Europeans to learn and remember. The records are there, filled with the stories of those left to drown in the wake of settler colonisation. So, what does that say for our apparent ‘shock’? What does our ‘surprise’ really mean?

These questions become more confronting when we accept that abuses didn't take place only in the dim and distant past. Consider this 1998 testimony from Willie Sport, who was a student at Alberni Indian Residential School in British Columbia in the 1930s:

I spoke Indian in front of Reverend Pitts, the principal of the Alberni school. He said: ‘Were you speaking Indian?’ Before I could answer, he pulled down my pants and whipped my behind until he got tired. When I moved, he put my head between his knees and hit me harder. He used a thick conveyor belt, from a machine, to whip me.

That Principal Pitts was trying to kill us. He wouldn’t tell parents about their kids being sick and those kids would die, right there in the school. The plan was to kill all the Indians they could, so Pitts never told the families that their kids had tuberculosis.

I got sick with TB and Pitts never told anyone. I was getting weaker each day, and I would have died there with all those others but my Dad found out and took me away from that school. I would be dead today if he hadn’t come.

Abuses took place well into the 20th century. The revelation of the burial sites at Kamloops and the ensuing ‘shock’ of settler Canadians shows that forgetting – in the form of unlearning, concealment, or deception – is an integral part of the very system that killed those children and erased them from settler memories.

*Read the rest through the link.*


The previous spammer on this community would post gory stories from history as sensationalist clickbait, using enthralling tales of horror to push malware onto readers. But it is a different experience to realise that the horrors are not past and dead, but present and pervasive as you read this.

 

cross-posted from: https://lemm.ee/post/10323811

Frickles talks Harry Potter fanfiction, writing it, and more!

(Author of A Malignant Ruse Harry/Daphne, A Discordant Pattern, etc)

 


Michael Gambon, the Irish-English actor best known for his role as Hogwarts headmaster Albus Dumbledore in six of the “Harry Potter” movies, has died, Variety has confirmed. He was 82.

“We are devastated to announce the loss of Sir Michael Gambon,” his family said in a statement. “Beloved husband and father, Michael died peacefully in hospital with his wife Anne and son Fergus at his bedside, following a bout of pneumonia.”

While it is easier for a character actor, often working in supporting roles, to rack up a large number of credits than it is for lead actors, Gambon was enormously prolific, with over 150 TV or film credits in an era when half that number would be impressive and unusual — and this for a man whose body of stage work was also prodigious.

He played two real kings of England: King Edward VII in “The Lost Prince” (2003) and his son, King George V, in “The King’s Speech” (2010); Winston Churchill in his later years in the 2015 ITV/PBS “Masterpiece” telepic “Churchill’s Secret”; U.S. President Lyndon Johnson in John Frankenheimer’s 2002 HBO telepic “Path to War,” for which he was Emmy-nominated; and a fictional British prime minister in “Ali G Indahouse,” also in 2002. And as Hogwarts headmaster in the “Harry Potter” movies, he presided over the proceedings therein. In 2016, he served as the narrator for the Coen brothers’ paean to golden-age Hollywood, “Hail, Caesar!”

But Gambon was just as likely to play a gangster as an eminence grise: He recurred on David Milch’s HBO horse-racing drama “Luck” in 2011-12 as a powerful adversary of Dustin Hoffman’s mobster Ace Bernstein, but if there is a single film role for which Gambon should be remembered, it is his thunderous, sulfurous foray as the thief of the title — a gangster if ever there was one — in Peter Greenaway’s 1989 “The Cook, the Thief, His Wife and Her Lover.” This role, after decades of appearing in movies, is what really brought him to the attention of the film world. Roger Ebert declared: “The thief’s thuggish personality stands astride the movie and browbeats the others into submission. He is a loud, large, reprehensible criminal, played by Michael Gambon as the kind of bully you can only look at in wonder, that God does not strike him dead.”

Playing another excellent gangster in Matthew Vaughn’s 2005 British crime film “Layer Cake,” Gambon was handed one of the best lines: “England. Typical. Even drug dealers don’t work weekends.” (Ebert said that Eddie Temple, Gambon’s character, is “the kind of man whose soul has warts on its scars.”)

But Gambon could equally well play upper crust, as in Robert Altman’s 2001 film “Gosford Park” or the 2008 rendition of “Brideshead Revisited.”

And he played an excellent villain in Michael Mann’s whistleblower film “The Insider,” in which the actor portrayed the head of a tobacco company.

Gambon took over the role of Albus Dumbledore after the death of Richard Harris, who had played the role in the first two films. Gambon admitted that he had never read the “Harry Potter” books, and he told the U.K.’s the Independent, “I’d never seen any of the previous films, but working on the series was huge fun — and for lots of dosh.”

“We are incredibly saddened to hear of the passing of Sir Michael Gambon. He brought immeasurable joy to Harry Potter fans from all over the world with his humour, kindness and grace. We will forever hold his memory in our hearts.” — Harry Potter (@harrypotter), September 28, 2023

Gambon was also among the stars of the 2015 BBC/HBO miniseries based on J.K. Rowling’s novel “The Casual Vacancy.”

In addition to his nomination for outstanding lead actor in a miniseries or movie for “Path to War” in 2002, Gambon was Emmy-nominated for supporting actor in a miniseries or movie for playing Mr. Woodhouse in the 2009 adaptation of Jane Austen’s “Emma” that starred Romola Garai in the title role.

The actor won four BAFTA TV Awards for best actor, first for his career-changing role in 1986’s “The Singing Detective,” next for 1999’s “Wives and Daughters,” then for 2000’s exquisite telepic “Longitude” and then the following year for “Perfect Strangers.”

His TV career also included starring as the legendary French police inspector in the Granada Television series “Maigret,” which aired on PBS in the early 1990s, and more recently included starring, in 2015, in the Scandinavian-set series “Fortitude,” airing in the U.S. on Pivot.

Gambon made his movie debut in “Othello,” starring Laurence Olivier, in 1965. While his craggy appearance as an older man may make it hard to believe, he played romantic leads in film and TV for a time. He was, for example, the swashbuckling Gavin Ker in BBC series “The Borderers” in the early 1970s. And, in 1970, Gambon was asked by James Bond producer Albert “Cubby” Broccoli to audition for the role of 007 to replace George Lazenby.

Gambon’s first role in a film where Americans might have noticed him was as the zookeeper who helps Ben Kingsley and Glenda Jackson abscond with the sea turtles in 1985’s delightful, eccentric romance “Turtle Diary.”

After decades in British television, the actor starred in Dennis Potter’s extraordinary 1986 musical mystery miniseries “The Singing Detective,” drawing a BAFTA TV Award for best actor. The series later aired on PBS and won a Peabody Award.

In his long and illustrious stage career, he was, in addition to Shakespeare, most associated with the works of Alan Ayckbourn (including the “Norman Conquests” trilogy) and Harold Pinter.

In 2004, Gambon starred with Annette Bening in Istvan Szabo’s “Being Julia,” playing the theater impresario who taught Bening’s Julia much of what she knows.

He won three Laurence Olivier Awards (the highest honors in British theater, equivalent to a Tony): in 1986, for best comedy performance, for Ayckbourn’s “A Chorus of Disapproval”; in 1988, for best actor, for Arthur Miller’s “A View From the Bridge”; and in 1990, for comedy performance, for Ayckbourn’s “Man of the Moment.” He was also nominated for best actor a further 10 times.

Despite a long career on the stage in the U.K., Gambon appeared on Broadway only once, starring in David Hare’s play “Skylight” in 1996 and drawing a Tony nomination for best actor.

Michael John Gambon was born in Cabra, Dublin, Ireland. He attended the Royal Academy of Dramatic Art from the ages of 18 to 21, all the while apprenticing as a toolmaker (and forever maintaining a fascination with machines big and small, collecting antique guns, clocks and watches as well as classic cars).

Gambon made his professional stage debut in the Gate Theatre Dublin’s 1962 production of Othello; he was 24, and toured with the Gate before catching the attention of Laurence Olivier, who brought him into the newly formed National Theatre Company. In 1967, Gambon departed to join the Birmingham Repertory Company, where he had the chance to take on the starring roles in the Shakespearean canon, his favorite of which was the title role in “Othello,” though he also essayed “Macbeth” and “Coriolanus.” In his early 40s, he impressed critics and audiences with his take on the title role in “King Lear” at Stratford.

Impressed by the young actor, Ralph Richardson once dubbed him the Great Gambon; decades later, in July 2012, the BBC included Gambon on its list of the top 10 British character actors.

In 2004, he played Sir John Falstaff in Nicholas Hytner’s National Theatre production of “Henry IV,” Parts 1 and 2, fulfilling a lifelong ambition.

In addition to the three Olivier Awards he won, Gambon’s additional 10 nominations, all for best actor, were for Harold Pinter’s “Betrayal” in 1979; Bertolt Brecht’s “The Life of Galileo” in 1980; Christopher Hampton’s “Tales From Hollywood” in 1983; David Hare’s “Skylight” in 1997; Stephen Churchett’s historical drama “Tom and Clem” in 1998; Yasmina Reza’s “The Unexpected Man” in 1999; Pinter’s “The Caretaker” in 2001; Caryl Churchill’s “A Number” in 2003; Samuel Beckett’s “Endgame” in 2005; and Pinter’s “No Man’s Land” in 2009.

In February 2015, at the age of 74, Gambon announced that he was retiring from stage acting because memory loss was making it increasingly difficult for him to remember his lines. He had, for several years before that, relied on an earpiece through which he could be prompted if he forgot his lines. A few years earlier, he had been rushed to a hospital after panic attacks brought on by forgetting his lines.

Gambon was loath to reveal details of his private life. He married Anne Miller in 1962 and had a child, Fergus, in 1964. Fergus, schooled in part by his father, appeared as an expert on the BBC version of “Antiques Roadshow.”

In 2002, Gambon moved out of the home he shared with his wife in Kent and soon introduced Philippa Hart as his girlfriend. In addition to son Fergus, he is survived by Hart and two young sons by her, Michael, born in 2007, and William in 2009.
