The Obama administration's most consequential decision was to address the Great Financial Crisis by bailing out the finance sector, rather than borrowers. It was an unforced error, directed by Goldman Sachs bankers elevated to the ranks of finance regulators, and we are still living with its consequences.
The choice to enact quantitative easing (rather than debt relief, direct transfers to consumers, or regulation of mortgage-backed securities) triggered the foreclosure crisis, wiped out family wealth (especially Black wealth, which declined more under Obama than any other president), and produced a still-inflating asset bubble.
It also set a precedent, shifting the Overton window in a way that made the trillions that Trump pumped into the capital markets (first through massive tax-cuts, then through covid programs) seem bipartisan.
Today, the capital markets are utterly uncoupled from the real economy. The stock market's unprecedented bull run coincided with a decline in the fortunes of real businesses and real workers, and the overslosh produced bubbles in other asset classes, including some absolutely absurd ones (cryptos/NFTs, wine, art, supercars, sports collectibles, etc).
The rich have too much money and nowhere to put it, and so the economy is in metastatic, stage-four Ponzi-ism.
Many asset bubbles are indirectly harmful – e.g. the climate consequences of crypto, or the way easy capital has spurred even more mergers and monopolization, with the attendant layoffs and worsening labor conditions. Asset inflation has also spurred investment in predatory enterprises like Uber and Doordash, who use investor cash to subsidize a money-losing operation that strangles real, locally owned businesses.
But there's one asset bubble that has an immediate and direct harmful effect: the housing bubble. Since the Great Financial Crisis, Wall Street has been hell-bent on acquiring single-family homes, converting them to rental property, gouging on rent, skimping on maintenance, and evicting on the flimsiest pretense.
By every metric, Wall Street investors are the worst landlords, and they're the fastest-growing class of landlord. These two facts are related. Big firms are able to buy up so much housing because they are able to borrow cheaply, issuing bonds that "securitize" the rent payments from tenants. Access to this capital is dependent on the ability to raise rents and scare tenants into silence over dangerous living conditions. In other words, Wall Street firms can only corner the market on housing if they promise their investors that they'll brutalize and beggar their tenants.
Wall Street's plan to financialize the roof over your head is a deeply corrupting project, one that involves pumping hundreds of millions in dark money into defeating tenants' rights and rent controls:
Stein's Law predicts that "If something cannot go on forever, it will stop." Bubbles can't go on forever. The stock market has been swinging like a drunkard around a lamppost for the past week. Cryptos are tumbling. Rugpulls are dominating the NFT market. I don't know about sports memorabilia, but I wouldn't be surprised if that was in trouble, too.
But the rich still have too much money, and that money is fleeing stocks and collectibles and cryptos. Where's it going? Housing.
In this interview with the Saker, Michael Hudson discusses the coming tsunami of real-estate inflation, in which the suckers who lost everything to the other bubbles go bankrupt and lose their homes to "investors" who plan to securitize a stream of rents from immiserated tenants.
"Inflation" is the topic du jour, wielded primarily as a whip to drive us to austerity. Most of the time, "inflation" is code for "regular people have too much money and we need to take some of it away" – by reducing benefits and suppressing wages.
It's 100% true that asset inflation is being driven by too much money chasing too few goods – the money in this case is the capital gains of the investor class and the "goods" are productive businesses with growth potential. Thus we see investors flocking to pyramid schemes and destructive strip-miners whose "growth" comes from looting good businesses and discarding their husks.
But do regular people really have too much money? Are we all just trying to buy too many vegetables from farmers, too much paper from pulp mills, too many carbon-steel bike-frames from Chinese factories?
That proposition is a lot muddier. Sure, there are supply shocks due to covid's impact on the over-optimized, brittle supply chains the finance sector has demanded of real businesses. But there's also obvious monopolies and oligopolies, whose CEOs are openly touting their ability to raise prices under cover of the pandemic.
The more we tell the story of "too many working people's dollars chasing not enough goods," the more we empower monopolists to increase their margins and jack up prices. As Hudson says, "For meat, eggs and other farm produce, the farmers are not receiving higher prices for their crops and produce. The middlemen are gouging out more fees for themselves, thanks to the monopoly position of Cargill et al."
Not all price-gouging is created equal. Wealthy, empowered consumers are able to shop more widely, and when they catch monopolists at their profiteering, they have the ears of policy-makers. If you are a monopolist looking to really ratchet up prices, your safest bet is to pick on poor people, who have fewer retail options and less political cachet.
That's something that Jack Monroe has made clear with her new "Vimes Boot Index" (inspired by Terry Pratchett's character Sam Vimes, who ruminates on how replacing cheap boots costs more than buying one long-lasting but expensive pair, creating a tax on poverty).
Monroe's project started with a viral tweet complaining that the official UK inflation figures pegged inflation at 5%, but that this did not reflect the massive price-hikes on the very cheapest food at the supermarket.
Monroe cites triple-digit inflation on goods like pasta, rice, baked beans, peanut butter, etc. This week on the BBC's More or Less podcast, Tim Harford and Monroe take a closer look at the index:
Neither offers a theory of why price inflation is hitting the cheapest goods so much harder than the most expensive, but I think the answer lies in political economy: it's safer to abuse poor people than rich people, and monopolists know it.
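The mechanics of why a headline figure can hide this are easy to show in a toy calculation. All prices and weights below are invented for illustration (not Monroe's actual data): a spending-weighted index counts each good in proportion to its share of total spending, so triple-digit rises in the cheapest staples barely register.

```python
# Hypothetical prices (before, after) and spending weights -- made-up
# numbers chosen only to illustrate how weighting hides staple inflation.
basket = {
    # item: (old_price, new_price, spending_weight)
    "value pasta":   (0.29, 0.70, 0.01),
    "value rice":    (0.45, 1.00, 0.01),
    "premium items": (10.00, 10.40, 0.98),
}

def pct_change(old, new):
    return (new - old) / old * 100

# Spending-weighted "headline" inflation across the whole basket:
headline = sum(w * pct_change(old, new) for old, new, w in basket.values())

# Inflation as experienced by someone who only buys the cheap staples:
staples = ["value pasta", "value rice"]
budget = sum(pct_change(*basket[i][:2]) for i in staples) / len(staples)

print(f"headline: {headline:.1f}%")  # headline: 6.6%
print(f"staples:  {budget:.1f}%")    # staples:  131.8%
```

The headline number stays in single digits even while the cheapest goods more than double in price, which is exactly the gap between the official 5% figure and Monroe's supermarket receipts.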
(Image: Library of Congress)
Despite what you may have heard, cops have a relatively safe job. Cops are injured and killed with less frequency than roofers, truckers, fishermen, and pizza-delivery people. Cops are basically armed bureaucrats and their primary role is to file reports about crimes, not intervene in dangerous situations.
Now, you may have heard that cop deaths are way, way, way up over the past two years. That is actually true – cops have been slain in unprecedented numbers since the pandemic began. Nearly all those deaths are the result of catching covid. Naturally, police unions (which are not actually unions) are fighting tooth-and-nail against vaccine and mask requirements for cops.
(You've heard of "suicide by cop?" This is "suicide by cop union.")
The rhetoric about the dangerous life of a cop doesn't merely serve to make cops feel romantic about their form-filling and rule-enforcing. It's the foundation of the narrative that makes it okay for police officers to murder people suspected of minor crimes using overwhelming, unjustifiable force: that force is hand-waved away as the inevitable result of the daily terror of being a cop on the mean, mean streets.
The latest mutation of this mean-streets story is the nonsensical claim that police officers are at daily risk of dying because they might be touched by someone experiencing a fentanyl overdose and, in so doing, absorb a fatal dose of fentanyl through their fingertips.
This isn't a thing. There's a reason fentanyl users snort it and inject it, rather than rubbing it between their fingers.
It's not a thing when the Sacramento Bee reports it:
It's not a thing when CNN reports it:
Now, looking at these reports, it seems that some cops actually believe they have been poisoned (either that or they're putting on quite a show). That doesn't make it real. History is full of extraordinary popular delusions, imaginary diseases spread by social contagion.
The delusional belief in fentanyl overdose by contact high doesn't just hurt impressionable cops who scare themselves into flopping around on the ground, moaning. Those same cops then go on to charge people who experience fentanyl overdose with assaulting an officer by means of their imaginary Opioid Death-Touch.
These additional charges are adding years to the sentences of people experiencing addiction or just those guilty of simple possession. This despite the fact that the DEA has revised its guidance and now admits that there's no serious risk of fentanyl skin absorption:
The myth is all-pervasive in cop circles and has spread to other first responders, with many now hesitating to resuscitate people in overdose. 80% of NYC first responders now believe in skin-penetrating fentanyl:
As Tim Cushing writes for Techdirt, this delusion is so strong because "courts and lawmakers cut cops all sorts of slack under the assumption that cops should be given every opportunity to be wrong."
It's true that cops experience some danger on the job – just not as much as the pizza-delivery person who dropped off your pepperoni pie yesterday.
#20yrsago John Ashcroft covers up blind justice's booby https://web.archive.org/web/20020124151626/https://abcnews.go.com/sections/us/HallsOfJustice/hallsofjustice.html
#15yrsago Google founder regrets censoring China https://www.theguardian.com/technology/2007/jan/27/news.newmedia
#15yrsago Japan’s health minister: Women are “birth-giving machines” https://web.archive.org/web/20070305015449/http://www.tokyomango.com/tokyo_mango/2007/01/health_minister.html
#10yrsago Here’s the utterly inconsequential recording that resulted in NZ PM John Key ordering raids on the free press https://web.archive.org/web/20120130200045/https://juha.saarinen.org/7955
#10yrsago Chief EU ACTA sponsor quits in disgust at lack of democratic fundamentals in global copyright treaty https://memex.craphound.com/2012/01/27/political-contributions-from-financial-sector-increased-700-since-1990/
#10yrsago Polish MPs wear Guy Fawkes masks to protest ACTA https://web.archive.org/web/20120127040038/http://sg.news.yahoo.com/poland-signs-copyright-treaty-drew-protests-102302237.html
#10yrsago Twitter adopts country-specific censorship regime – how will that work? https://www.theguardian.com/technology/2012/jan/27/twitter-censor-tweets-by-country
#10yrsago Software piracy is vital to preservation https://web.archive.org/web/20120310162820/https://www.pcworld.com/article/248571/why_history_needs_software_piracy.html
#10yrsago EMI VP opposes SOPA, thinks better products at better prices will solve piracy https://torrentfreak.com/emi-boss-opposes-sopa-says-piracy-is-a-service-issue-120125/
#5yrsago 19 crooks, 7,000 false identities, 1,800 drop addresses, and $200 million in credit card fraud https://www.justice.gov/usao-nj/pr/fugitive-arrested-200-million-credit-card-fraud-scam
#5yrsago White House confuses Theresa May, UK Prime Minister, with Teresa May, porn star https://timesofindia.indiatimes.com/world/uk/white-house-confuses-uk-pm-theresa-mays-name-with-a-porn-stars/articleshow/56815083.cms
#5yrsago Immigration officers raid Good Samaritan Family Resource Center in San Francisco’s Mission District https://sfist.com/2017/01/26/ice_agents_descend_on_missions_good/
#5yrsago US towns that pandered to anti-immigrant sentiment had to raise taxes and borrow to cover the millions in losses https://www.washingtonpost.com/business/economy/in-these-six-american-towns-laws-targeting-the-illegals-didnt-go-as-planned/2017/01/26/b3410c4a-d9d4-11e6-9f9f-5cdb4b7f8dd7_story.html
#5yrsago France aggressively prosecutes citizens for “solidarity crimes”: feeding and housing migrants https://www.aljazeera.com/features/2017/1/25/france-prosecuting-citizens-for-crimes-of-solidarity
#5yrsago Brexit, Chicken and Ulysses Pacts: the negotiating theory behind the UK-EU stalemate https://timharford.com/2017/01/brexit-as-a-game-of-chicken/
#1yrago Goldman CEO gets $17.5m reward for $4.5b fraud https://pluralistic.net/2021/01/27/viral-colonialism/#failing-up
#1yrago Facebook champions (its own) privacy https://pluralistic.net/2021/01/27/viral-colonialism/#ico-schtum
#1yrago Casino mogul steals First Nation's vaccine https://pluralistic.net/2021/01/27/viral-colonialism/#seriously-fuck-that-guy
Today's top sources: Naked Capitalism (https://www.nakedcapitalism.com/).
Moral Hazard, a short story for MIT Tech Review's 12 Tomorrows. Yesterday's progress: 373 words (2658 words total).
A Little Brother short story about remote invigilation. PLANNING
A Little Brother short story about DIY insulin PLANNING
Spill, a Little Brother short story about pipeline protests. SECOND DRAFT COMPLETE
A post-GND utopian novel, "The Lost Cause." FINISHED
A cyberpunk noir thriller novel, "Red Team Blues." FINISHED
Currently reading: Analogia by George Dyson.
Latest podcast: Science Fiction is a Luddite Literature (https://craphound.com/news/2022/01/10/science-fiction-is-a-luddite-literature/)
Dangerous Visions and New Worlds: Radical Science Fiction, 1950 to 1985 (City Lights), Feb 27
Emerging Technologies For the Enterprise, Apr 19-20
Moral Panic (Drug Science Podcast)
"How to Destroy Surveillance Capitalism": an anti-monopoly pamphlet analyzing the true harms of surveillance capitalism and proposing a solution. https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59 (print edition: https://bookshop.org/books/how-to-destroy-surveillance-capitalism/9781736205907) (signed copies: https://www.darkdel.com/store/p2024/Available_Now%3A__How_to_Destroy_Surveillance_Capitalism.html)
"Little Brother/Homeland": A reissue omnibus edition with a new introduction by Edward Snowden: https://us.macmillan.com/books/9781250774583; personalized/signed copies here: https://www.darkdel.com/store/p1750/July%3A__Little_Brother_%26_Homeland.html
"Poesy the Monster Slayer" a picture book about monsters, bedtime, gender, and kicking ass. Order here: https://us.macmillan.com/books/9781626723627. Get a personalized, signed copy here: https://www.darkdel.com/store/p1562/_Poesy_the_Monster_Slayer.html.
This work licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to pluralistic.net.
Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.
Blog (no ads, tracking, or data-collection):
Newsletter (no ads, tracking, or data-collection):
Mastodon (no ads, tracking, or data-collection):
Medium (no ads, paywalled):
(Latest Medium column: "A Bug in Early Creative Commons Licenses Has Enabled a New Breed of Superpredator" https://doctorow.medium.com/a-bug-in-early-creative-commons-licenses-has-enabled-a-new-breed-of-superpredator-5f6360713299)
Twitter (mass-scale, unrestricted, third-party surveillance and advertising):
Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):
"When life gives you SARS, you make sarsaparilla" -Joey "Accordion Guy" DeVilla
ssh-agent was in the news recently due to a widely reported compromise. The main takeaway from that incident was that one should avoid the agent-forwarding (-A) functionality when ProxyCommand can do the job, and consider multi-factor authentication on the server side, for example using libpam-google-authenticator or libpam-yubico.
That said, there are also two options to ssh-add that can help reduce the risk of someone else with elevated privileges hijacking your agent to make use of your ssh keys.
The first option is
-c which will require you to
confirm each use of your ssh key by pressing Enter when a graphical
prompt shows up.
Simply install an
ssh-askpass frontend like
apt install ssh-askpass-gnome
and then use this when adding your key to the agent:
ssh-add -c ~/.ssh/key
ssh-add -D will remove all identities (i.e. keys)
from your ssh agent, but requires that you remember to run it
manually once you're done.
That's where the second option comes in. Specifying
-t when adding a key will automatically remove that
key from the agent after a while.
For example, I have found that this setting works well at work:
ssh-add -t 10h ~/.ssh/key
where I don't want to have to type my ssh password every time I push a git branch.
At home on the other hand, my use of ssh is more sporadic and so I don't mind a shorter timeout:
ssh-add -t 4h ~/.ssh/key
I couldn't find a configuration file to make these settings the default and so I ended up putting the following line in my shell configuration:
alias ssh-add='ssh-add -c -t 4h'
so that I can continue to use ssh-add as normal and not have to remember to include these extra options.
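Collecting the pieces above in one place, here is a sketch of the relevant shell configuration (the key path and the 4-hour timeout are just examples, adjust to taste):

```shell
# Require graphical confirmation for each key use (-c, needs an
# ssh-askpass frontend installed) and expire keys after 4 hours (-t 4h):
alias ssh-add='ssh-add -c -t 4h'

# Typical usage with the alias in effect:
#   ssh-add ~/.ssh/key   # add a key with confirmation + timeout applied
#   ssh-add -l           # list the identities currently loaded
#   ssh-add -D           # drop all identities immediately
```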
In 1992 John Conway raised a question about the patterns in his famous mathematical Game of Life: "Is there a Godlike still-life, one that can only have existed for all time (apart from things that don't interfere with it)?" Conway closed his note by adding "Well, I'm going out to get a hot dog now..." And then, nearly 30 years later, a mathematical blog reports: Ilkka Törmä and Ville Salo, a pair of researchers at the University of Turku in Finland, have found a finite configuration in Conway's Game of Life such that, if it occurs within a universe at time T, it must have existed in that same position at time T-1 (and therefore, by induction, at time 0)... The configuration was discovered by experimenting with finite patches of repeating 'agar' and using a SAT solver to check whether any of them possess this property. The blogger also shares some other Game of Life-related news: David Raucci discovered the first oscillator of period 38; the remaining unsolved periods are 19, 34, and 41. Darren Li has connected Charity Engine to Catagolue, providing approximately 2000 CPU cores of continuous effort and searching slightly more than 10^12 random initial configurations per day. Nathaniel Johnston and Dave Greene have published a book on Conway's Game of Life, featuring both the theoretical aspects and the engineering that's been accomplished in the half-century since its conception. Unfortunately it was released slightly too early to include the Törmä-Salo result or Raucci's period-38 oscillator. Thanks to Slashdot reader joshuark for sharing the story.
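The searches described above all rest on the ordinary Life update rule. For readers who haven't implemented it, a minimal Python sketch follows; the still-life check on the 2x2 block is just an illustration, not the Törmä-Salo configuration, which is far larger.

```python
from itertools import product

def step(live):
    """Advance a Game of Life pattern (a set of live (x, y) cells) one tick."""
    counts = {}
    for x, y in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    # A cell is alive next tick if it has exactly 3 live neighbors,
    # or 2 live neighbors and is already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# The 2x2 block is a still life: it maps to itself under step().
block = {(0, 0), (0, 1), (1, 0), (1, 1)}
print(step(block) == block)  # True
```

Törmä and Salo's result asks a much stronger question than this forward check: not whether a pattern maps to itself, but whether any predecessor other than itself could produce it, which is why a SAT solver (searching backwards over all possible prior states) was the right tool.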
Read more of this story at Slashdot.
How do I even write about this? Joel Coen has created a stunning, often terrifying, German Expressionist-ish take on Macbeth that, when it chooses to be, tips into full horror. While it isn’t my favorite take on the play (that would be Akira Kurosawa’s Throne of Blood) this is the first time I’ve seen a Shakespeare adaptation and immediately wanted to rewatch it before the credits were even done rolling.
I’m going to assume that if you’re reading this you know the plot of Macbeth, so allow me to hasten to the important stuff: Joel Coen chose to take this on as a solo project, without his usual partner Ethan, and he adapted the play himself. What he chose to do in this adaptation was mine every vein of horror in the play, no matter how tiny, to create a movie about a curse grinding someone down to powder. I’ll talk more about the horror elements in a sec, but first:
Denzel Washington is goddamn incandescent as Macbeth. I mean, I expected him to be good, he’s Denzel Washington—but this trampled all over my expectations. He begins the story as a man already maybe a little too curious about finding an easy path to glory, but also prone to an excellent sardonic wit. Watching him wring himself inside out with paranoia and guilt is simply beautiful. And the best part, for me, is that his Macbeth becomes more compelling as his crimes pile up. Rather than becoming increasingly paranoid and defeated, Washington’s Macbeth becomes increasingly paranoid and powerful, seeming to gain strength from knowing that everyone has turned on him.
Frances McDormand is the Lady Macbeth I’ve wanted all my life. She brings a desert-dry wit to the role that makes lines like “Who’d have thought the old man to have so much blood in him” fucking sing. Would I risk it all to murder my beloved king and usurp his throne if she asked me to?
I wouldn’t even ask her to repeat the question.
But here’s the thing, the reason I wanted to write about it here—Macbeth is a horror, right? Macbeth, who as far as we know at the opening is a decent man, a loyal subject to his King, and great in battle, is, for reasons that are never made clear, targeted by a trio of witches. They tell him a prophecy, knowing full well this will completely derail his life. Do they know about Lady Macbeth’s admirable bloodthirstiness? Do they believe that such a prophecy will be the undoing of any man? Is this a riff on the Book of Job, where a powerful supernatural entity decides to test a mortal’s moral character? Or do they wake up and say “Fuck the Thane of Glamis in particular”? Because they give a slightly more reasonable prophecy to Banquo—by telling him that his sons will be kings, they give him hope for the future. Delayed gratification. He can sit back and wait for destiny to come to him, while Macbeth looks at an alive king and his two alive sons, and decides he needs to take a more active role to make his prophecy a reality.
We’re left with the basic Oedipal question: if Macbeth had just continued being Thane of Glamis and Cawdor, a loyal subject and friend to the King, would other events have transpired to give him the Throne? Would the royal family have all died of pneumonia after staying the night in Macbeth’s enormous drafty castle? Or maybe both sons would have abdicated and run away to study at Wittenberg, like a certain Danish prince should have done, leaving the King with no choice but to name his trusted friend Macbeth as his heir? Or, I don’t know, multiple horse accidents! That had to happen a lot in the Middle Ages?
But no, the Macbeths go leaping straight for regicide.
In both Washington’s and McDormand’s performances, this felt to me like the two of them were bound to this path by a terrible destiny. They don’t enjoy their rule—only the anticipation of it. Macbeth begins seeing ghosts immediately, and Lady Macbeth only holds herself together slightly longer, seemingly because she’s trying to add some steel to her husband’s spine. But then, as she cracks, he grows stronger again? I loved the idea that the two of them are locked together, feeding from each other without even realizing. Their first few scenes are also… well, one hesitates to say “nice”, but they’re really a team! While I was watching it occurred to me that they’re the only couple in Shakespeare who treat their marriage as a true partnership, and then I found a quote from Joel Coen saying the same thing: “In the context of Shakespeare it’s a good marriage, they love each other. They happen to be plotting a murder, but hey, it’s OK.”
But what’s at the heart of the horror?
First, Coen creates an inescapable sense of atmosphere. This film, maybe more than any version of Macbeth I’ve seen, feels like the thane’s crime has knocked nature itself out of joint. Thick mists roil across the camera, crows burst out of fields like malevolent clouds, the stars seem a little too bright—I began to wonder if Coen was going to go off book and bring some eldritch monster down on the Macbeths. The entire film plays out like a nightmare—but the standard issue nightmare is nothing compared to the scenes with The Witches.
I don’t want to spoil The Witches—if you haven’t seen the film yet, and plan to, skip down a paragraph so you can meet them, as I did, with NO IDEA what was about to happen. I’ll let you know in bold when the spoilers are over.
The Witches are played by Kathryn Hunter, who you might have seen as Arabella Figg in the Harry Potter movies. When they first appear—well, only one appears, because Coen works the camera so that you, the viewer, become the other two Weird Sisters, and Sister #1 is speaking to you directly. This is fucking terrifying, first of all, but it also makes you the viewer complicit in whatever curse or temptation is being laid upon Macbeth. (Their scenes are also where Coen chooses to cross his German Expressionist influences with something that feels like The Seventh Seal.) Only later do we see all three Witches, and then it’s like this:
Their second appearance is somehow even scarier? They come to Macbeth, and even as they warp his castle around him and conjure a scrying pool in what used to be his floor, he seems to think that he’s the one commanding them. When he’s told that no man born of woman can kill him, he immediately files it under “I’m immortal now, and that’s rad” instead of questioning the way these prophecies seem to be changing, or seeing any potential pitfalls. He is a man whose very reality is being manipulated before his eyes, but he thinks he’s in control. Meanwhile The Witches hover in the rafters above him, looking down on him and waiting for him to take their bait. Coen also changes the ending of the play a tiny bit to hammer home the idea that The Witches have been twisting everyone around their gnarled fingers, and it’s great.
SPOILING OF WITCHES IS OVER!
Corey Hawkins and Moses Ingram are both excellent as the couple MacDuff—if this cold, stark film has a heart, it’s shared between the two of them. And as if all of this wasn’t enough, Stephen Root is in the movie! Our Greatest Living Character Actor turns up as the Porter, a tiny comic relief role that he milks for all its fun and Shakespearean gross-out humor. Carter Burwell’s score is ominous and boils up just like the mists.
I’ve said a lot here, but I’m still not sure exactly how to talk about this movie. What it reminded me of most was an earlier Coen outing, A Serious Man. That movie also wound itself around questions of fate and choice, as one beleaguered professor tried to understand what it means to be a “good” person, particularly a good Jew, in the face of impossible odds and possible divine retribution, and it seems that no matter what choices he makes, Larry Gopnik’s life falls to pieces around him. But that movie is resolutely real, solid, Mid-Century Modern—only in the very last scene does it seem to tip over into the mythic.
With The Tragedy of Macbeth, Joel Coen seems to be imagining something more like a Calvinist horror of predestination, or maybe a Bergman-esque, old school Lutheran slasher film? We don’t see Macbeth fight the prophecy. We don’t see him deny it. (And his wife isn’t so much a spider being held over a flame as a spider enthusiastically cannonballing into the inferno.) Instead it seems that he’s dragged along by fate with his eyes wide open to all the uncanny horror of being chosen to suffer—up until the moment he seems to think he’s invulnerable, which is when his destiny really begins toying with him. Joel Coen has created a world that is an unrelenting tightening fist, and turned Macbeth into even more of an existentialist horror than it already was.
It’s hard to believe it’s been over 15 years since Daniel Craig was announced as the new James Bond. Much has changed in that time but one thing that stayed constant was Craig’s gritty, grounded approach to the world’s most famous super spy. Craig’s Bond has always been a bit more real. A bit more relatable than the Bonds of the past. And now, with his final film No Time to Die out in the world, a documentary is available to commemorate the ride.
The documentary is called Being James Bond and while it was previously available to Apple TV+ subscribers, it’s now available in a much wider capacity on YouTube. In it, “Daniel Craig candidly reflects on his 15-year tenure as James Bond,” says the official website. “Including never-before-seen archival footage spanning from Casino Royale to No Time To Die, Craig shares his personal memories in conversation with 007 producers, Michael G. Wilson and Barbara Broccoli, in the lead up to his final performance as the iconic secret agent.” Check it out.
Because this was released before No Time to Die, Craig doesn’t talk about its surprising ending (but you can read him discuss those big spoilers here), but he does get very contemplative. “A lot of people here have worked on five pictures with me,” Craig says in the film. “I’ve loved every single second of these movies, and especially this one because I’ve got up every morning and I’ve had the chance to work with you guys, and that has been one of the greatest honors of my life.”
Since a new Bond has yet to be cast, and Craig’s is still so fresh in our minds, it’s hard to accurately rank him alongside the Bonds of the past. Does he overtake the legend of Sean Connery? Probably not. Has the popularity of his run been more influential than that of Pierce Brosnan? Probably. But certainly a few years and a new Bond will let us more properly think about his legacy.
Let us know what you think of the documentary, and where you think Craig ranks among the Bonds, below.
Wondering where our RSS feed went? You can pick the new one up here.
We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.
There has been a notable, and long overdue, flurry of antitrust actions targeting Big Tech, launched by users, entrepreneurs, and governments alike. And in the US and abroad, policymakers are working to revamp our antitrust laws so they can be more effective at promoting user choice.
These are positive developments, but this renewed focus on antitrust risks losing sight of another powerful legal lever: copyright. Because there’s copyrighted software in every digital device and online service we use, and because the internet is essentially a giant machine for copying digital data, copyright law is a major force that shapes technology and how we use it. That gives copyright law an enormous role in enabling or impeding competition.
The Digital Millennium Copyright Act (DMCA) is a case in point. It contains two main sections that have been controversial since they went into effect in 2000. The "anti-circumvention" provisions (sections 1201 et seq. of the Copyright Act) bar circumvention of access controls and technical protection measures. The "safe harbor" provisions (section 512) protect service providers who meet certain conditions from monetary damages for the infringing activities of their users and other third parties on the net.
Congress ostensibly passed Section 1201 to discourage would-be infringers from defeating DRM and other access controls and copy restrictions on creative works. In practice, it’s done little to deter infringement – after all, large-scale infringement already invites massive legal penalties. Instead, Section 1201 has been used to block competition and innovation in everything from printer cartridges to garage door openers, videogame console accessories, and computer maintenance services. It’s been used to threaten hobbyists who wanted to make their devices and games work better. And the problem only gets worse as software shows up in more and more places, from phones to cars to refrigerators to farm equipment. If that software is locked up behind DRM, interoperating with it so you can offer add-on services may require circumvention. As a result, manufacturers get complete control over their products, long after they are purchased, and can even shut down secondary markets (as Lexmark did for printer ink, and Microsoft tried to do for Xbox memory cards).
On the other hand, Section 512’s “safe harbors” are essential to internet innovation, because they protect service providers from monetary liability based on their users’ infringing activities. To receive these protections, service providers must comply with the conditions set forth in Section 512, including “notice and takedown” procedures that give copyright holders a quick and easy way to disable access to allegedly infringing content. Without these protections, the risk of potential copyright liability would prevent many online intermediaries—from platforms to small community websites to newspapers and ISPs—from hosting and transmitting user-generated content. Without the DMCA, much of big tech wouldn’t exist today – but it is equally true that if we took it away now, new competitors would never emerge to challenge today’s giants. Instead, the largest tech companies would strike lucrative deals with major entertainment companies and other large copyright holders, and everyone else who hosted or transmitted third-party content would just have to shoulder the risk of massive and unpredictable financial penalties—a risk that would deter investment.
There is a final legal wrinkle: filtering mandates. The DMCA’s hair-trigger takedown process did not satisfy many rightsholders, so large platforms, particularly Google, also adopted filtering mechanisms and other automated processes to take down content automatically, or prevent it from being uploaded in the first place. In the EU, those mechanisms are becoming mandatory, thanks to a new copyright law that conditions DMCA-like safe harbors on preventing users from uploading infringing content. Its proponents insisted that filters aren't required, but in practice that’s the only way service providers will be able to comply. That’s created a problem in the EU – as the Advocate General of the EU Court of Justice acknowledged last year, automated blocking necessarily interferes with the human right to free expression.
But filtering mandates create yet another problem: they are expensive. Google has famously spent more than $100 million on developing its Content ID service – a cost few others could bear. If the price of hosting or transmitting content is building and maintaining a copyright filter, investors will find better ways to spend their money, and the current tech giants will stay comfortably entrenched.
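To see why filtering at scale is so costly, consider even the most naive version: exact-hash matching of uploads against a registered catalog. Real systems like Content ID use perceptual audio/video fingerprinting that survives re-encoding, cropping, and pitch-shifting, which is vastly harder to build; this toy sketch only illustrates the basic check-before-publish flow, with all names invented:

```python
import hashlib

# Toy upload filter: exact-hash matching against a reference database.
# Real filters use perceptual fingerprints that survive re-encoding;
# exact hashes, used here for simplicity, do not.
reference_db = set()

def register_work(data: bytes) -> None:
    """A rightsholder registers a reference copy of a work."""
    reference_db.add(hashlib.sha256(data).hexdigest())

def allow_upload(data: bytes) -> bool:
    """Block the upload if it exactly matches a registered work."""
    return hashlib.sha256(data).hexdigest() not in reference_db

register_work(b"registered song bytes")
print(allow_upload(b"registered song bytes"))  # False: blocked before publication
print(allow_upload(b"original user video"))    # True: allowed through
```

The expensive part is everything this sketch leaves out: robust fingerprinting, dispute handling, and infrastructure to scan every upload in real time.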
If we want to create space for New Tech to challenge Big Tech, antitrust law can’t be the only solution. We need balanced copyright policies as well, in the U.S. and around the world. That’s why we fought to stop the EU’s mandate and continue to fight to address the inevitable harms of implementation. It’s why we are working hard to stop the current push to mandate filters in the U.S. as well. We also need the courts to do their part. To that end, EFF just this month asked a federal appeals court to block enforcement of the copyright rules in Section 1201 that violate the First Amendment and criminalize speech about technology. We have also filed amicus briefs in numerous cases where companies are using copyright to shut out competition. And we’ll keep fighting, in courts, legislatures, agencies, and the public sphere, to make sure copyright serves innovation rather than thwarting it.
A unique study led by researchers from the Harvard T.H. Chan School of Public Health has reported some of the first robust evidence that multiple sclerosis (MS) is primarily caused by infection with the Epstein-Barr virus (EBV). Tracking more than 10 million US military personnel over a 20-year period, the landmark study found that EBV infection leads to a 32-fold increase in the risk of developing MS.
Unless you’re a lawyer, there’s a pretty good chance you’ve never read through a website’s entire terms of service. There’s a simple reason for that. Far too often, they’re too long and difficult to parse. Some services offer summary statements, but they’re the exception, not the norm.
A bipartisan group of lawmakers, made up of Representative Lori Trahan and Senators Bill Cassidy of Louisiana and Ben Ray Luján of New Mexico, wants to change that. They’ve introduced the Terms-of-service Labeling, Design, and Readability Act – that’s TLDR for short. Under the proposal, online businesses would be required to include a “nutrition label-style” summary at the top of their terms of service agreements and make the contracts easy for researchers to examine through the use of XML tags. It would also require them to disclose any recent data breaches, as well as provide information on whether a user can delete their data and how they would go about doing that.
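The bill leaves the exact markup to regulators, so any schema is speculative. Here is one guess at what a machine-readable "nutrition label" could look like, built with Python's standard `xml.etree` module; every tag and field name below is invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: the TLDR Act does not define one, so all of the
# tag names and fields here are invented for illustration.
summary = ET.Element("tos-summary", version="1.0")
ET.SubElement(summary, "data-collected").text = "email, location, browsing history"
ET.SubElement(summary, "user-can-delete-data").text = "yes"
ET.SubElement(summary, "deletion-process").text = "account settings, privacy, delete account"
breaches = ET.SubElement(summary, "recent-breaches")
ET.SubElement(breaches, "breach", date="2021-06-01").text = "user email addresses exposed"

xml_blob = ET.tostring(summary, encoding="unicode")
print(xml_blob)
```

Because the fields are tagged rather than buried in prose, a researcher could pull the same facts out of thousands of sites with a few lines of parsing code, which is the point of the XML requirement.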
“For far too long, blanket terms of service agreements have forced consumers to either ‘agree’ to all of a company’s conditions or lose access to a website or app entirely. No negotiation, no alternative, and no real choice,” said Representative Trahan. As the basis for the need for the TLDR Act, the group cites a study that found it would take the average American 76 workdays to read all the terms of service contracts they’ve agreed to in order to use their favorite online services. Should the legislation pass, it would empower the Federal Trade Commission and state attorneys general to enforce it.
<SimonSapin> nox: the history of packaging in python is
<nox> SimonSapin: All I need to know is, is setuptools old stuff or new stuff?
<SimonSapin> nox: its been both
<SimonSapin> in that order
What do Minecraft and modern DOOM have in common? Not a lot, but a modder decided to merge them together to create DOOMED: Demons of the Nether, and it's pretty darn good.
Giving you a single-player campaign, this free-to-download mod is inspired by the modern DOOM games, borrowing plenty of design elements. Technically, in Minecraft land, it's an adventure map pack, but it really does change the game.
Installation is easy enough too, and I've tested it working nicely on Linux. Make sure you have a fresh install of Minecraft 1.16.3, download the pack, and then simply extract the contents into your Minecraft saves folder, usually found in ~/.minecraft/saves/. That's it, as it comes with everything needed.
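If you prefer to script those steps, here's a minimal sketch using only Python's standard library. The download path and filename in the example call are hypothetical, so adjust them to wherever your browser actually saved the pack:

```python
import zipfile
from pathlib import Path

def install_map(pack: Path, saves: Path = Path.home() / ".minecraft" / "saves") -> Path:
    """Extract a downloaded map pack into the Minecraft saves folder."""
    saves.mkdir(parents=True, exist_ok=True)   # create the folder on a fresh install
    with zipfile.ZipFile(pack) as zf:
        zf.extractall(saves)                   # the pack ships everything it needs
    return saves

# Hypothetical filename; change it to match the actual download:
# install_map(Path.home() / "Downloads" / "doomed-demons-of-the-nether.zip")
```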
Once done, load Minecraft up and select it from the Singleplayer menu and it will guide you through a few settings you might need to tweak to make it look as intended. Absolutely brilliant.
The first detailed observations of lightning's emergence inside a cloud have exposed how electric fields grow strong enough to let bolts fly.
2021 saw record numbers of people checking out ebooks from libraries, reports the Boston Globe — 500 million, according to figures from the ebook-lending platform OverDrive. But some Massachusetts lawmakers want to require publishers to make sure all their digital products are available to libraries — and at "reasonable terms" — because currently libraries pay much more than consumers: According to the American Library Association, libraries currently pay three to five times as much as consumers for ebooks and audiobooks. Thus, an ebook selling for $10 at retail could cost a library $50. In addition, the library can only buy the right to lend the book for a limited time — usually just two years — or for a limited number of loans — usually no more than 26. James Lonergan, director of the Massachusetts Board of Library Commissioners, believes that publishers settled on 26 checkouts after calculating that this is the number of times a printed book can be checked out before it's worn out and in need of replacement. And that's effectively what happens to a digital book after 26 checkouts: the library must "replace" it by paying full price for the right to lend it out 26 more times. Lonergan admits that this approach makes a certain sense. Traditional printed books can only be borrowed by one user at a time, but in theory a digital book could be loaned to thousands of patrons at once. Also, printed books wear out and must be repurchased, but digital books last indefinitely. "You can't have a book be available forever at the same price point," Lonergan said. "The publishers need to make money." Lonergan thinks libraries and publishers can work out less expensive and more flexible terms. Publishers might charge a lower up-front cost for their digital products, for instance. Or they might expand the number of times libraries can lend out an ebook or audiobook. Lonergan believes that passing the Massachusetts law would give publishers further incentive to deal.
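The per-loan math behind those figures is easy to run. The prices and the 26-loan cap are the article's; the arithmetic below is mine:

```python
retail_price = 10.00        # consumer price for the example ebook above
library_price = 50.00       # three to five times retail, per the ALA figures
loans_per_license = 26      # typical checkout cap before the library must rebuy

cost_per_loan = library_price / loans_per_license
print(f"library cost per loan: ${cost_per_loan:.2f}")      # ≈ $1.92

# Lending the title 260 times means buying the license ten times over:
loans = 260
licenses_needed = -(-loans // loans_per_license)           # ceiling division
print(f"cost to supply {loans} loans: ${licenses_needed * library_price:.2f}")
```

A popular title that circulates for years thus costs the library the full $50 again and again, while a consumer pays $10 once.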
But the Association of American Publishers (AAP), which represents most of the nation's leading publishers, is ready to fight. An emailed statement from AAP said the Massachusetts bill "raises significant constitutional and federal copyright law concerns and is an unjustified intrusion into a vibrant and thriving market for ebooks and audiobooks that benefits authors and publishers, booksellers, libraries, and the general public." The AAP has already sued in federal court to block enforcement of a similar law in Maryland, arguing that only the federal government can regulate digital publishing practices. The head of the Massachusetts Board of Library Commissioners counters that if enough individual states pass ebook-pricing laws, "then Congress will step in and do something about this on the federal level."
This year’s passage of the Infrastructure Investment and Jobs Act (IIJA)—also known as the bipartisan infrastructure package—delivered on a goal EFF has sought for years. It finally creates a way for people to remedy a serious problem: a severe lack of fiber-to-the-home connectivity. Fiber optics lie at the core of all future broadband access options, because fiber is the superior medium for moving data, with no close comparisons. As a result, global demand for fiber infrastructure is extremely high: China is seeking to connect 1 billion of its citizens to symmetrical gigabit access, and many advanced EU and Asian nations are rapidly approaching near-universal deployment. Crucially, these countries did not reach these outcomes through market forces alone, but rather by passing infrastructure policies much like the IIJA.
Now it’s up to elected officials in states, from governors to state legislators, to work to ensure the federal infrastructure program delivers 21st-century ready infrastructure to all people. Some states are ahead of the curve. In 2021, California embraced a fiber infrastructure for all effort with the legislature unanimously passing a historic investment in public fiber. State Senator Lena Gonzalez led this effort by introducing the first fiber broadband-for-all bill; EFF was a proud sponsor of this bill in Sacramento.
Other states are behind the curve, overly restricting the ability of local governments and utilities to plug the gaps that private internet service providers (ISPs) have left for sixteen years and counting. (2005 was when private fiber-to-the-home deployment really kicked off.) Maintaining those barriers, even as federal dollars are finally released, guarantees those states’ failure to deliver universal fiber; the federal law, while important, isn’t sufficient on its own. Success requires maximum input from local efforts to make the most of this funding.
Understanding what progress we’ve made this year—and what still needs to be done—requires understanding the IIJA itself. The basic structure of the law is a collaboration between the federal government’s National Telecommunications and Information Administration (NTIA), the Federal Communications Commission (FCC), and the states and territories. Congress appropriated $65 billion in total. That includes $45 billion for construction funds and $20 billion for efforts promoting affordability and digital inclusion. This money can be paired with state money, which will be essential in many states facing significant broadband gaps.
Responsibility for different parts of this plan falls to different people. The NTIA will set up a grant program, provide technical guidance to the states, and oversee state efforts. The FCC will issue regulations that require equal access to the internet, produce mapping data that will identify eligible zones for funding, and implement a new five-year subsidy of $30 per month to improve broadband access for low-income Americans. Both agencies will be resources to the states, which will be responsible for creating their own multi-year action plan that must be approved by the NTIA.
The timelines behind these steps vary. The NTIA’s grant program must be established by around May 2022; states will then take in that guidance and develop their own action plans. Every state will receive $100 million, plus additional funding to reflect its share of the national unserved population—a statistic that the FCC will estimate.
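The allocation rule just described (a $100 million floor per state, with the remainder split in proportion to each state's share of the national unserved population) can be sketched in a few lines. The pool size and the population counts below are invented purely for illustration; the real unserved estimates will come from the FCC's maps:

```python
def allocate(pool: float, unserved: dict, floor: float = 100e6) -> dict:
    """$100M floor per state, remainder split by share of the national
    unserved population (illustrative; real figures come from the FCC)."""
    remainder = pool - floor * len(unserved)
    national = sum(unserved.values())
    return {state: floor + remainder * n / national for state, n in unserved.items()}

# Invented counts for three hypothetical states and a $1B pool:
shares = allocate(1_000e6, {"A": 300_000, "B": 100_000, "C": 100_000})
print({state: f"${amount / 1e6:.0f}M" for state, amount in shares.items()})
# A gets $520M, B and C each get $240M
```

The floor guarantees every state a meaningful baseline, while the proportional remainder steers most of the money toward the states with the largest unserved populations.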
Congress also ordered the FCC to issue “digital discrimination” (also known as digital redlining) rules that ban deployment decisions based on income, race, ethnicity, color, religion, or national origin. EFF and many others have sought such digital redlining bans. Without these kinds of rules, we risk cementing first- and second-class internet infrastructure based on income status. Currently, companies offer high-income earners ever cheaper and faster broadband, while middle- and low-income users are stuck on legacy infrastructure that grows more expensive to maintain and ever slower relative to expanding broadband needs.
The digital discrimination provisions do allow carriers to exempt themselves from the rules if they can show economic and technical infeasibility for building in a particular area, which will limit the impact of these rules in rural markets. However, there should be no mistake that there is no good excuse for discriminatory deployment decisions in densely populated urban markets. These areas are fully profitable to serve, which is why the major ISPs that don’t want to serve everyone, such as AT&T and Comcast, fought so hard to remove these provisions from the bipartisan agreement. But this rulemaking is how we fix the access problem. It is time to move past a world where kids go to fast-food parking lots to do their homework and where school districts' only solution is to rent a slow mobile hotspot. This rulemaking is how we change things for those kids and for all of us.
The states are going to need to embrace new models of deployment that focus on fostering the development of local ISPs, as well as openly accessible fiber infrastructure. The federal law explicitly prioritizes projects that can “easily scale” speeds over time to “meet evolving connectivity needs” and “support 5G [and] successor wireless technologies.” Any objective reading of this leads to the conclusion that pushing fiber optics deep into a community should lie at the core of every project (satellite and 5G rely on fiber optics). That’s true whether it is wired or wireless delivery at the end. A key challenge will be how to build one infrastructure to service all of these needs. The answer is to deploy the fiber and make it accessible to all players.
Shared fiber infrastructure is going to be essential in order to extend its reach far and wide. EFF has produced cost-model data demonstrating that the most efficient means to reach the most people with fiber connections is deploying it on an open-access basis. This makes sense when considering that all 21st-century broadband options from satellite to 5G rely on fiber optics, but not all carriers intend to build redundant, overlapping fiber networks in any place other than urban markets. The shared infrastructure approach is already happening in Utah, where local governments are deploying fiber infrastructure and enabling several small private ISPs to offer competitive gigabit fiber services. Similarly, California’s rural county governments have banded together to jointly build open-access fiber to all people through the EFF-supported state infrastructure law.
Needless to say, states have to move past the idea that a handful of grants and subsidies will fix their long-term infrastructure problems. They have to recognize that we’ve done that already and understand the mistakes of the past. This is, in fact, the second wave of $45 billion in broadband funding. The previous $45 billion was spent on slow speeds and non-future-proofed solutions, which is why we have nothing to show for it in most states. Only fully embracing future-proofed projects with fiber optics at their core is going to deliver the long-term value Congress is seeking with the priority provisions written into the statute.
Here is a fact: It is unlikely Congress will come around again to produce a national broadband infrastructure fund. A number of states will do it right this time, which will alleviate the political pressure to have Congress act again. A number of states will take the lessons of 2021 and of the past when planning how to spend their infrastructure funding. In a handful of years, those states are probably going to have a super-majority of their residents connected to fiber. But, unfortunately, it’s possible some states will fall for the lie—often pushed by big ISPs—that slow networks save money.
We know that the “good enough for now” mindset doesn’t work. Taking this path will waste every dollar, with nothing to show for it. Networks good enough for 2021 will look slow by 2026, forcing communities to replace them to remain economically competitive. The truth is, speed-limited networks cost a fortune in the long run because they will face obsolescence quickly as needs grow. On average, we use about 21% more data each year, and that trend has been with us for decades. Furthermore, with the transition towards distributed work, and the increasingly remote delivery of services such as healthcare and education, the need for ever-increasing upload speeds and symmetrical speeds will continue to grow.
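Compounding makes that 21% figure bite quickly. Taking the article's growth rate at face value, here is what it implies over the five years after 2021:

```python
annual_growth = 0.21                   # ~21% more data used each year, per the article

multiple = (1 + annual_growth) ** 5    # compound growth from 2021 to 2026
print(f"demand multiple after 5 years: {multiple:.2f}x")   # ≈ 2.59x
```

In other words, a network sized as "good enough" for 2021 would need to serve roughly two and a half times that demand by 2026, which is exactly why speed-limited builds face obsolescence so fast.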
The slow broadband grift will come from industry players who are over-invested in "good enough for now" deployment strategies. It is worth billions of dollars to them for states to get this wrong. And so they will repeat their 2021 playbook, deploying their lobbyists in the states just as they did in Congress—though there they mostly failed. Industry players failed to sway Congress because everyone understands the simple fact that we will need more and more broadband with each passing year.
Any ISP that comes to a state leader with a suggested plan needs to have its suggestions scrutinized using the technical objectives Congress has laid out this year. Can their deployment plan “easily scale” into ever increasing speeds? Will it meet community needs and enable 5G and successor wireless services? And, most importantly, will it deliver low-cost, high-quality broadband access?
Many of these questions are answerable with proper technical vetting. There are no magical secrets of technology, just physics and financial planning. But it remains to be seen whether the states will let politically well-connected legacy industries make the call for them, or will rely on objective analysis focused on long-term value to their citizens. EFF worked hard in 2021 to make 21st-century-ready broadband-for-all a reality for every community. We will continue to do everything we can to ensure the best long-term outcome for people. If you need help convincing your local leadership to do the right thing for the public—connecting everyone to 21st-century internet access through fiber optics laid deep into your community—you have a partner in EFF.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2021.
We all know the line: With great power comes great responsibility. What you might not know is that Uncle Ben’s creed has been widely adopted by the courts and has been cited nearly 100 times on a variety of legal issues, ranging from debt collection to patent law and everything in-between. Ironically, Spider-Man: No Way Home fails to learn or apply its own lesson, particularly where MIT is concerned. This week, we’ll turn the tables on Spider-Man and see how his latest adventure places the blame in all the wrong places.
In the weeks leading up to No Way Home, trailers and promotional materials confirmed the appearance of five exciting key villains: the Green Goblin, Doctor Octopus, Electro, Lizard, and Sandman. Fans speculated that the movie might introduce a secret sixth villain to complete the famous Sinister Six.
Coming out of the theater, the prevailing view was that there was no sixth villain. That view is wrong. There was a sixth villain who single-handedly set the conflict in motion in No Way Home: the MIT admissions officer. More than any other villain, the admissions officer had the power to destroy Spider-Man’s life, to separate him from his friends and loved ones, and to jeopardize his future happiness and well-being.
The events of No Way Home get rolling when MIT rejects Peter, Ned, and MJ. Determined to find a way in, Peter visits Dr. Strange, with the hope that he will help Peter “magic” his way into the elite institution. As expected, the spell backfires and Peter’s world is breached by a series of villains and Spider-Men from other dimensions, who leave a trail of chaos and destruction in their wake.
Now, you may think we can’t really view admissions officers as supervillains. After all, they’re just trying to find the best students to fill out their class ranks. Surely there’s nothing wrong with that, right?
One of the factors that makes Spider-Man such a timeless character is that his problems are relatable. Along those lines, Peter’s struggles with college admissions are shared by millions of students across the country. In 2020, the acceptance rate at MIT was 7.3%; Stanford’s acceptance rate was 5.2%; Harvard’s was 5%. Even without a Spider-Man-related public image problem, the likelihood that three friends would each be accepted into MIT is a minuscule 0.04%. The difficulties and stress associated with the admissions process are widely known and well-documented. Studies show that high-achieving high school students should be treated as an “at-risk” population when it comes to suicide, depression, and substance abuse.
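That joint figure follows from treating the three applications as independent events at MIT's quoted 7.3% rate; the product comes out to about 0.04%:

```python
rate = 0.073                 # MIT's 2020 acceptance rate, quoted above
p_all_three = rate ** 3      # assuming the three decisions are independent
print(f"{p_all_three:.4%}")  # 0.0389%, i.e. roughly 0.04%
```

Independence is of course a simplification, since strong applicants' chances are correlated, but it gives the right order of magnitude.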
The widespread nature of these problems makes it easy for viewers to empathize with Peter in No Way Home and to understand why he felt compelled to do whatever it took to find his way into MIT — from Peter’s perspective, the rejection from MIT essentially amounted to the end of the world. It is thus fitting that it sets into motion events that almost lead to the literal end of the world.
Admissions officers and the educational institutions that employ them are well aware of these problems. Yet, the admissions process continues, year after year without any meaningful change. In other words, the admissions officers (or, more aptly, the institutions they represent) have great power, but consistently choose to wield that power irresponsibly — in a way that inflicts tremendous harm on students and their families. In real terms, that makes them supervillains. It also potentially gives rise to legal liability.
Under the law, there are a few instances where one person can be held responsible for the acts of another. For example, an employer can be held responsible for the acts of an employee, and a parent can be held responsible for the acts of a child. As is relevant here, the law holds an individual responsible for the unlawful acts of another when the individual takes action with knowledge that the action will cause another person to act unlawfully.
I explained it in depth in my article on time travel torts: “When it comes to avoiding harm, the law is concerned with predictability rather than personal responsibility.” If Alice leaves her car running while she goes inside a restaurant to pick up an order, and Bob takes the opportunity to steal her car and soon crashes it into another car, Alice would in several states be legally responsible for the damage Bob caused. This is because courts deem it “reasonably foreseeable” that a thief would steal a readily available car and reasonably foreseeable that a thief would drive negligently in his efforts to flee, which in turn renders Alice negligent.
One could argue that, under this theory, admissions officers and academic institutions like MIT should be held liable for the negative consequences of the college admissions process, including the multiversal conflict and destruction in Spider-Man: No Way Home.
Alternatively, one could argue that admissions officers should be held liable under an expanded version of an “attractive nuisance” theory. Ordinarily, a landowner is not responsible for injuries that befall trespassers. Thus, if Alice slips and falls while walking on Bob’s land without permission, she cannot sue Bob for failing to install guardrails. However, several states recognize an exception to this rule when it comes to children. Under the “attractive nuisance” doctrine, landowners can be held liable for injuries to trespassing children if the injury results from a dangerous object or installation that attracted children to the property. For example, an owner would be liable for injuries resulting from a trespassing child’s use of a trampoline.
Taken at face value, the attractive nuisance doctrine would not apply to any of the issues arising from No Way Home. Nevertheless, the principles motivating the doctrine apply with full force: The admissions officers operate a system that is likely to attract minors and that is likely to cause harm to those minors (and others). Further, even though the affected individuals are in their later-teen years, many are still unable to appreciate or handle the risks associated with the admissions process. Thus, there is good reason to argue that courts should expand the attractive nuisance doctrine to hold institutions and admissions officers legally responsible for the foreseeable harms resulting from their broken system.
While I think the theories I have advanced are persuasive and that universities should face liability for their negligent admissions scheme, these theories would certainly not hold water in court. When it comes to holding individuals liable for the actions of others, courts have a fairly narrow view of what constitutes “reasonably foreseeable.” A court may agree that it would be reasonably foreseeable for students (or even Spider-Man specifically) to react negatively to a rejection, but it probably would not find it reasonably foreseeable that a rejection would lead to (a) magic (b) that is poorly executed (c) in a way that risks the destruction of the multiverse. Likewise, courts would not look favorably on such an aggressive expansion of the attractive nuisance doctrine.
Yet, the court’s rejection of these arguments does not serve as a redemption of MIT and its place in the No Way Home Sinister Six. The fact remains that elite universities wield tremendous power but do not take responsibility for any of the significant negative consequences resulting from that power. While universities might not be liable for the literal destruction of the multiverse, I would be surprised if there were not a viable class action that could be levied against universities for negligent infliction of emotional distress. Like several supervillains, admissions administrators adhering to rigid processes mean well, but their thirst for superiority and dominance has caused them to lose track of right and wrong, resulting in tremendous harm.
None of this analysis is meant to excuse the actions of Spider-Man or Doctor Strange. There is no doubt that their efforts to, in essence, forget-me-now the entire population are unethical and illegal. But in the grand scheme of things, that is low-hanging fruit. The more interesting issues relate to the villain hiding in plain sight.
At first glance, Spider-Man: No Way Home looks like standard-fare MCU fun. But if we look past the literal, we can view No Way Home as a frightening metaphor for the struggle students face in connection with the college admissions process.
Peter Parker could not handle his rejection from MIT. In the wake of the rejection, he found himself face to face with personal demons from years past — demons that he could not simply banish or ignore, but instead had to confront, defeat, and reform. To defeat those villains, he had to turn inward, reflect on the different paths his life could take, and draw strength from the various facets of his personality — from the different Spider-Men that he could become. Peter was able to defeat the villains, but only by sacrificing his most important relationships. He ends the movie alone and isolated, without a clear place in the world.
After removing the fantastical elements of Spider-Man: No Way Home, all that is left behind is the tragic story of a talented high school student who lost everything because he couldn’t handle the stress of the admissions process.
With great power comes great responsibility — it’s high time our elite institutions live up to that responsibility.
A study of the impact of national face mask laws on Covid-19 mortality in 44 countries with a combined population of nearly a billion people found that—over time—the increase in Covid-19 related deaths was significantly slower in countries that imposed mask laws compared to countries that did not.
For the first time, a spacecraft has made contact with the sun. During a recent flyby, NASA’s Parker Solar Probe entered the sun’s atmosphere.
“We have finally arrived,” Nicola Fox, director of NASA’s Heliophysics Science Division in Washington, D.C., said December 14 in a news briefing at the fall meeting of the American Geophysical Union. “Humanity has touched the sun.”
Parker left interplanetary space and crossed into solar territory on April 28, 2021, during one of its close encounters with the sun. While there, the probe took the first measurements of exactly where this boundary, called the Alfvén critical surface, lies. It was about 13 million kilometers above the sun’s surface, physicists reported at the meeting, held online and in New Orleans, and in Physical Review Letters on December 14.
“We knew the Alfvén critical surface had to exist,” solar physicist Justin Kasper of the University of Michigan in Ann Arbor said at the news briefing. “We just didn’t know where it was.”
Finding this crucial layer was one of Parker’s main goals when it launched in 2018 (SN: 7/5/18). The Alfvén critical surface is important because it marks where packets of plasma can separate from the sun and become part of the solar wind, the speedy stream of charged particles that constantly emanates from the sun (SN: 8/18/17). The solar wind and other, more dramatic forms of space weather can wreak havoc on Earth’s satellites and even on life (SN: 2/26/21). Scientists want to pinpoint exactly how the wind gets started to better understand how it can impact Earth.
The Alfvén critical surface also may hold the key to one of the biggest solar mysteries: why the sun’s corona, its wispy outer atmosphere, is so much hotter than the sun’s surface (SN: 8/20/17). With most heat sources, temperatures drop as you move farther away. But the sun’s corona sizzles at more than a million degrees Celsius, while the surface is only a few thousand degrees.
In 1942, physicist Hannes Alfvén proposed a solution to the mystery: A type of magnetic wave might carry energy from the solar surface and heat up the corona. It took until 2009 to directly observe such waves, in the lower corona, but they didn’t carry enough energy there to explain all the heat (SN: 3/19/09). Solar physicists have suspected that what happens as those waves climb higher and meet the Alfvén critical surface might play a role in heating the corona. But until now, scientists didn’t know where this frontier began.
With the boundary identified, “we’ll now be able to witness directly how coronal heating happens,” Kasper said.
As Parker crossed the invisible boundary, its instruments recorded a marked increase in the strength of the local magnetic field and a drop in the density of charged material. Out in the solar wind, waves of charged particles gush away from the sun. But below the Alfvén critical surface, some of those waves bend back toward the surface of the sun.
Surprisingly, Parker’s measurements showed that the Alfvén critical surface is wrinkly. “That was one of the big outstanding questions,” says solar physicist Craig DeForest of the Southwest Research Institute in Boulder, Colo., who is a member of the Parker probe team but was not part of this measurement.
“There was some debate in the community about whether the Alfvén surface would exist as a surface at all,” he says. Decades ago, scientists imagined the boundary as a smooth sphere surrounding the sun like a snow globe. More recently, some thought it would be so ragged that it wouldn’t be apparent when the spacecraft crossed it.
Neither of those images turned out to be correct. The surface is smooth enough that the moment of crossing was noticeable, Kasper said. But during the spacecraft’s close approach to the sun in April, it crossed in and out of the boundary three times. The first dip lasted about five hours, the last only half an hour.
“The surface clearly has some structure and warp to it,” Kasper said.
That structure could influence everything from the way solar eruptions leave the sun to the way the solar wind interacts with itself farther out from the sun, DeForest says. “That has consequences that we don’t know yet, but are likely to be profound,” he says. “This is very exciting. It’s terra incognita.”
Parker is still orbiting the sun and planning to make several more close approaches over the next few years, eventually getting within 6 million kilometers of the solar surface. That should bring Parker into the solar corona again and again, solar physicist Nour Raouafi of the Johns Hopkins Applied Physics Laboratory in Laurel, Md., said in the news briefing. The spacecraft may have made another journey past the Alfvén critical surface in August and will have another opportunity in January.
“The expectation is that as we fly closer and closer to the sun, we’ll keep crossing this boundary,” Raouafi said. But the boundary might not be in the same place every time. As the sun’s activity changes, the level of the Alfvén critical surface is expected to rise and fall as if the corona is breathing in and out, he said.
That’s another thing that scientists hope to observe for the first time.
Probably the best working science fiction director around has his next job lined up. Denis Villeneuve, who directed the recently released Dune, has signed on to direct a movie adaptation of Rendezvous with Rama, the classic 1973 novel from Arthur C. Clarke. The novel is about a group of space explorers who are tasked with intercepting a spaceship hurtling through the universe, which they believe will lead to humanity’s first contact with an alien race.
While not the seminal novel that Dune was, or even Clarke’s own 2001: A Space Odyssey, Rendezvous with Rama is definitely an interesting piece of science fiction from one of the genre’s most legendary writers, and it is in safe hands with Denis Villeneuve. Villeneuve has shown an incredible knack for bringing sci-fi to the screen, especially thanks to his ability to pull out the human threads that other directors often glaze over in favor of special effects and space battles. Alcon Entertainment is financing the film.
Broderick Johnson, Andrew Kosove, Morgan Freeman, and Lori McCreary will produce the film, with Freeman having previously owned the rights to the book. It’s not clear if he’ll have any role in the film.
“This is one of the most intelligent works of fiction in the genre; it poses as many questions as it does answers, and is a work for our time,” said Johnson and Kosove. “It’s perfectly fitted to our friend and collaborator Denis’ brilliant sensibilities and specifically to his love and passion for science fiction. We are also pleased to work with Morgan and Lori, who have a long-standing passion for this IP.”
It isn’t certain when Denis Villeneuve will begin working on the Rendezvous with Rama movie adaptation. The director is looking at an incredibly busy future as he begins turning Dune into a cinematic franchise with both the sequel film confirmed and a spin-off series, Dune: The Sisterhood, coming.
A visualization of the Internet made using network routing data. Image: Barrett Lyon, opte.org.
Imagine being able to disconnect or redirect Internet traffic destined for some of the world’s biggest companies — just by spoofing an email. This is the nature of a threat vector recently removed by a Fortune 500 firm that operates one of the largest Internet backbones.
Based in Monroe, La., Lumen Technologies Inc. [NYSE: LUMN] (formerly CenturyLink) is one of more than two dozen entities that operate what’s known as an Internet Routing Registry (IRR). These IRRs maintain routing databases used by network operators to register their assigned network resources — i.e., the Internet addresses that have been allocated to their organization.
The data maintained by the IRRs help keep track of which organizations have the right to access what Internet address space in the global routing system. Collectively, the information voluntarily submitted to the IRRs forms a distributed database of Internet routing instructions that helps connect a vast array of individual networks.
There are about 70,000 distinct networks on the Internet today, ranging from huge broadband providers like AT&T, Comcast and Verizon to many thousands of enterprises that connect to the edge of the Internet for access. Each of these so-called “Autonomous Systems” (ASes) makes its own decisions about how and with whom it will connect to the larger Internet.
Regardless of how they get online, each AS uses the same language to specify which Internet IP address ranges they control: It’s called the Border Gateway Protocol, or BGP. Using BGP, an AS tells its directly connected neighbor AS(es) the addresses that it can reach. That neighbor in turn passes the information on to its neighbors, and so on, until the information has propagated everywhere.
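That neighbor-to-neighbor flooding can be made concrete with a toy sketch. This is not real BGP (no sessions, policies, or best-path tiebreaking), and the AS numbers and prefix are invented documentation-range values; it only illustrates how an announcement spreads outward from the origin, with each network learning a path back to it.

```python
# Toy sketch of BGP route propagation, NOT the real protocol or any
# router implementation: the origin AS announces a prefix, and each
# neighbor that hears about it passes the word along, prepending
# itself to the AS path.
from collections import deque

def propagate(neighbors, origin_as, prefix):
    """Flood an announcement from origin_as through the topology;
    return the AS path each network ends up with for the prefix."""
    paths = {origin_as: [origin_as]}      # AS -> path back to the origin
    queue = deque([origin_as])
    while queue:
        current = queue.popleft()
        for peer in neighbors.get(current, []):
            if peer not in paths:         # keep the first (shortest) path heard
                paths[peer] = [peer] + paths[current]
                queue.append(peer)
    return paths

# A tiny made-up topology: AS 64500 originates 192.0.2.0/24 and is
# connected to two transit networks, which both reach a fourth AS.
topology = {
    64500: [64501, 64502],
    64501: [64500, 64503],
    64502: [64500, 64503],
    64503: [64501, 64502],
}
routes = propagate(topology, 64500, "192.0.2.0/24")
print(routes[64503])  # → [64503, 64501, 64500]
```

The point of the sketch is the trust model: each AS simply believes what its neighbor tells it, which is why the registries described below matter.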
A key function of the BGP data maintained by IRRs is preventing rogue network operators from claiming another network’s addresses and hijacking their traffic. In essence, an organization can use IRRs to declare to the rest of the Internet, “These specific Internet address ranges are ours, should only originate from our network, and you should ignore any other networks trying to lay claim to these address ranges.”
In the early days of the Internet, when organizations wanted to update their records with an IRR, the changes usually involved some amount of human interaction — often someone manually editing the new coordinates into an Internet backbone router. But over the years the various IRRs made it easier to automate this process via email.
For a long time, any changes to an organization’s routing information with an IRR could be processed via email as long as one of the following authentication methods was successfully used:
- CRYPT-PW: A password is added to the text of an email to the IRR containing the record they wish to add, change or delete (the IRR then compares that password to a hash of the password);
- PGPKEY: The requestor signs the email containing the update with an encryption key the IRR recognizes;
- MAIL-FROM: The requestor sends the record changes in an email to the IRR, and the authentication is based solely on the “From:” header of the email.
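To see why the last method is so weak, consider that the From: header is just text the sender types. The sketch below builds (but does not send) a message claiming to come from a network's registered contact; the addresses and the route object are entirely made up for illustration, and no real IRR's submission format is being reproduced here.

```python
# Sketch of why MAIL-FROM authentication fails: the From: header is
# attacker-supplied text that nothing in email itself verifies.
# All addresses and the route object below are hypothetical.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "noc@example.net"      # spoofed: the sender chooses this freely
msg["To"] = "auto-dbm@irr.example"   # hypothetical IRR robot address
msg["Subject"] = "route object update"
msg.set_content(
    "route: 192.0.2.0/24\n"
    "origin: AS64500\n"
    "mnt-by: EXAMPLE-MNT\n"
)

# An IRR relying on MAIL-FROM compares only this header string against
# the contact on file -- a check the attacker passes by construction.
print(msg["From"])  # → noc@example.net
```

CRYPT-PW and PGPKEY at least require a secret the attacker doesn't hold; MAIL-FROM requires nothing at all beyond knowing the contact address, which, as noted below, is public.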
Of these, MAIL-FROM has long been considered insecure, for the simple reason that it’s not difficult to spoof the return address of an email. And virtually all IRRs have disallowed its use since at least 2012, said Adam Korab, a network engineer and security researcher based in Houston.
All except Level 3 Communications, a major Internet backbone provider acquired by Lumen/CenturyLink.
“LEVEL 3 is the last IRR operator which allows the use of this method, although they have discouraged its use since at least 2012,” Korab told KrebsOnSecurity. “Other IRR operators have fully deprecated MAIL-FROM.”
Importantly, the name and email address of each Autonomous System’s official contact for making updates with the IRRs is public information.
Korab filed a vulnerability report with Lumen demonstrating how a simple spoofed email could be used to disrupt Internet service for banks, telecommunications firms and even government entities.
“If such an attack were successful, it would result in customer IP address blocks being filtered and dropped, making them unreachable from some or all of the global Internet,” Korab said, noting that he found more than 2,000 Lumen customers were potentially affected. “This would effectively cut off Internet access for the impacted IP address blocks.”
The recent outage that took Facebook, Instagram and WhatsApp offline for the better part of a day was caused by an erroneous BGP update submitted by Facebook. That update took away the map telling the world’s computers how to find its various online properties.
Now consider the mayhem that would ensue if someone spoofed IRR updates to remove or alter routing entries for multiple e-commerce providers, banks and telecommunications companies at the same time.
“Depending on the scope of an attack, this could impact individual customers, geographic market areas, or potentially the [Lumen] backbone,” Korab continued. “This attack is trivial to exploit, and has a difficult recovery. Our conjecture is that any impacted Lumen or customer IP address blocks would be offline for 24-48 hours. In the worst-case scenario, this could extend much longer.”
Lumen told KrebsOnSecurity that it continued offering MAIL-FROM: authentication because many of its customers still relied on it due to legacy systems. Nevertheless, after receiving Korab’s report the company decided the wisest course of action was to disable MAIL-FROM: authentication altogether.
“We recently received notice of a known insecure configuration with our Route Registry,” reads a statement Lumen shared with KrebsOnSecurity. “We already had mitigating controls in place and to date we have not identified any additional issues. As part of our normal cybersecurity protocol, we carefully considered this notice and took steps to further mitigate any potential risks the vulnerability may have created for our customers or systems.”
Level3, now part of Lumen, has long urged customers to avoid using “Mail From” for authentication, but until very recently they still allowed it.
KC Claffy is the founder and director of the Center for Applied Internet Data Analysis (CAIDA), and a resident research scientist of the San Diego Supercomputer Center at the University of California, San Diego. Claffy said there is scant public evidence of a threat actor using the weakness now fixed by Lumen to hijack Internet routes.
“People often don’t notice, and a malicious actor certainly works to achieve this,” Claffy said in an email to KrebsOnSecurity. “But also, if a victim does notice, they generally aren’t going to release details that they’ve been hijacked. This is why we need mandatory reporting of such breaches, as Dan Geer has been saying for years.”
But there are plenty of examples of cybercriminals hijacking IP address blocks after a domain name associated with an email address in an IRR record has expired. In those cases, the thieves simply register the expired domain and then send email from it to an IRR specifying any route changes.
While it’s nice that Lumen is no longer the weakest link in the IRR chain, the remaining authentication mechanisms aren’t great. Claffy said after years of debate over approaches to improving routing security, the operator community deployed an alternative known as the Resource Public Key Infrastructure (RPKI).
“The RPKI includes cryptographic attestation of records, including expiration dates, with each Regional Internet Registry (RIR) operating as a ‘root’ of trust,” wrote Claffy and two other UC San Diego researchers in a paper that is still undergoing peer review. “Similar to the IRR, operators can use the RPKI to discard routing messages that do not pass origin validation checks.”
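The origin-validation check the researchers describe can be sketched in a few lines. This is a simplified rendering of the RFC 6811-style logic (real validators also handle certificate chains, expiry and multiple ROAs per prefix); the ROA and AS numbers are invented documentation values.

```python
# Minimal sketch of RPKI route origin validation, greatly simplified:
# an announcement is "valid" if some ROA covers the prefix, the origin
# AS matches, and the announced prefix is no longer than maxLength.
# The ROA data below is hypothetical.
import ipaddress

def origin_validate(roas, prefix, origin_as):
    prefix = ipaddress.ip_network(prefix)
    covered = False
    for roa_prefix, max_len, roa_as in roas:
        roa_net = ipaddress.ip_network(roa_prefix)
        if prefix.subnet_of(roa_net):
            covered = True                # a ROA speaks for this space
            if origin_as == roa_as and prefix.prefixlen <= max_len:
                return "valid"
    # covered but wrong origin/length => invalid; otherwise no ROA exists
    return "invalid" if covered else "not-found"

# One hypothetical ROA: 192.0.2.0/24, maxLength 24, authorized AS 64500.
roas = [("192.0.2.0/24", 24, 64500)]
print(origin_validate(roas, "192.0.2.0/24", 64500))     # → valid
print(origin_validate(roas, "192.0.2.0/24", 64666))     # → invalid (hijack)
print(origin_validate(roas, "198.51.100.0/24", 64500))  # → not-found
```

The "not-found" outcome is the crux of the researchers' point: routes with no ROA at all fall back to IRR data, or to nothing, which is why the two systems now run in parallel.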
However, the additional integrity RPKI brings also comes with a fair amount of added complexity and cost, the researchers found.
“Operational and legal implications of potential malfunctions have limited registration in and use of the RPKI,” the study observed (link added). “In response, some networks have redoubled their efforts to improve the accuracy of IRR registration data. These two technologies are now operating in parallel, along with the option of doing nothing at all to validate routes.”
I borrowed some descriptive text in the 5th and 6th paragraphs from a CAIDA/UCSD draft paper — IRR Hygiene in the RPKI Era (PDF).
I first fell in love with wuxia when I was around eight or so. I remember running around swinging the bright yellow handle of my toy broom as a sword, calling a sprawling tiger stuffed toy my master and pretending the shower was a waterfall I could learn the secrets of the universe under. I ran on tiptoe because that was somehow more like flying—or “hing gung” 輕功, the art of lightness, as I would eventually become fond of translating it.
But even before then I was deeply familiar with the genre; its many conventions have become baked into the everyday language of the Hong Kong I grew up in. My relatives all played Mahjong and, much like with sports, discussions around these games borrowed heavily from the language of sparring martial artists. I’d ask at the end of every Sunday what the results of the battles were. When asking for a family recipe, someone would joke that they’d have to become the apprentice of this or that auntie. Later, there was the world of study guides and crib sheets, all calling themselves secret martial arts manuals. The conventions around martial artists going into seclusion to perfect their craft and going mad in the pursuit of it take on new meaning as slang around cramming for exams.
Which is all to say, I really love wuxia.
“Wuxia”, literally meaning “martial hero”, is a genre about martially powerful heroes existing in a world parallel to and in the shadows of Chinese imperial history.
The archetypal wuxia hero is someone carving out his own path in the world of rivers and lakes, cleaving only to their own personal code of honour. These heroes are inevitably embroiled in personal vengeance and familial intrigue, even as they yearn for freedom and seek to better their own skills within the martial arts. What we remember of these stories are the tournaments, the bamboo grove duels and the forbidden love.
Parallels are often drawn to knights errant of medieval romances, with many older translations favouring a chivalric vocabulary. There are also obvious comparisons to be made with the American western, especially with the desperados stumbling into adventures in isolated towns in search of that ever-elusive freedom.
It is easy to think of wuxia in these universal terms with broad themes of freedom, loyalty and justice, but largely divorced from contemporary politics. These are stories, after all, that are about outlaws and outcasts, existing outside of the conventional hierarchies of power. And they certainly do have plenty to say about these big universal themes of freedom, loyalty and justice.
But this is also a genre that has been banned by multiple governments within living memory. Its development continues to happen in the shadows of fickle Chinese censorship and at the heart of it remains a certain defiant cultural and national pride intermingled with nostalgia and diasporic yearning. The vast majority of the most iconic wuxia texts are not written by Chinese authors living comfortably in China, but by a dreaming diaspora amid or in the aftermath of vast political turmoil.
Which is all to say that the world of wuxia is fundamentally bound up with those hierarchies of power it seeks to reject. Much like there is more to superheroes than dorky names, love triangles, and broad universal ideals of justice, wuxia is grounded in the specific time and place of its creation.
Biography of Old Dragon-beard (虯髯客傳) by Du Guangting (杜光庭, 850-933) is commonly cited as the first wuxia novel. It chronicles the adventures of the titular Old Dragon-beard, who along with the lovers, Hongfu 紅拂 and Li Jing 李靖, make up the Three Heroes of the Wind and Dust. But the story isn’t just supernatural adventures; they also help Li Shimin 李世民 found the Tang Dynasty (618–906). The martial prowess and the seemingly eccentric titles of the characters aside, the act of dynastic creation is unavoidably political. 虯髯客傳 pivots around Hongfu’s ability to discern the true worth of a man, which leads her to abandon her prior loyalties and cleave her love to Li Jing and his vision for a better empire. Not to mention Du wrote this and many of his other works whilst in exile with the Tang imperial court in the south, after rebels sacked the capital and burnt his books. Knowing this, it is difficult not to see Du as mythologising the past into a parable of personal resonance, that perhaps he too was making decisions about loyalties and legacies, which court or emperor he should stay with, asking himself if the Tang would indeed rise again (as he himself, as a taoist, had prophesied).
Other commonly cited antecedents to the modern wuxia genre are the 14th Century classics like Romance of the Three Kingdoms (三國演義) and Outlaws of the Marsh (水滸傳), the former of which is all about the founding of dynasties and gives to Chinese the now ubiquitously cited The empire, long divided, must unite; long united, must divide. Thus it has ever been (话说天下大势．分久必合，合久必分).
Revolutionaries, Rebels and Race in the Qing Dynasty
No era of imperial China was in possession of a “free press”, but the literary inquisitions under the Qing Dynasty (1644–1911) were particularly bloody and thorough. The Manchu elite suppressed any openly revolutionary sentiment in fiction, however metaphorical, and what is written instead is a literature that sublimates much of that discontent into historical fiction nostalgic for the eras of Han dominance. Wandering heroes of the past were refashioned into a pariah elite, both marginalised from mainstream society but also superior to it with their taoist-cultivated powers.
Whilst earlier quasi-historical epics and supernatural tales are replete with gods and ghosts, late Qing wuxia begins to shed these entities and instead grounds itself in a world where taoist self-cultivation grants immense personal powers but not divinity itself. In each of the successive reprintings of Three Heroes and Five Gallants (三俠五義), editors pruned the text of anachronisms and supernatural flourishes.
The parallel world of secret societies, foreign cults, bickering merchants and righteous martial clans came to be known as jianghu, literally “rivers and lakes”. As a metaphor, it was first coined by taoist philosopher, Zhuangzi 莊子, to describe a utopian space outside of cutthroat court politics, career ambitions and even human attachments. This inspires subsequent generations of literati in their pursuits of aesthetic hermitism, but the jianghu we know today comes also from the waterways that form the key trade routes during the Ming Dynasty (1368–1644). To the growing mercantile classes, jianghu referred to the actual rivers and canals traversed by barges heavy with goods and tribute, a byname for the prosperous Yangtze delta.
These potent lineages of thought intermingle into what jianghu is within martial arts fiction today, that quasi historical dream time of adventure. But there is also another edge to it. In Stateless Subjects: Chinese Martial Arts Literature and Postcolonial History, Petrus Liu translates jianghu as “stateless”, which further emphasizes the hero’s rejection of and by the machineries of government. Jianghu is thus a world that rejects the dictates of the state in favor of divine virtue and reason, but also of a sense of self created through clan and community.
The name of the genre, wuxia (“武俠“) comes from Japanese, where a genre of martially-focused bushido-inspired fiction called bukyō (“武侠”) was flourishing. It was brought into Chinese by Liang Qichao 梁启超, a pamphleteer writing in political exile in Japan, seeking to reawaken what he saw as Han China’s slumbering and forgotten martial spirit. In his political work, he holds up the industrialisation and militarisation of Meiji Japan (and its subsequent victory against Russia) as inspiration and seeks a similar restoration of racial and cultural pride for the Han people to be the “master of the Continent” above the hundreds of different races who have settled in Asia.
Wuxia is fundamentally rooted in these fantasies of racial and cultural pride. Liang Qichao’s visions of Han exceptionalism were a response to subjugation under Manchu rule and Western colonialism, a martial rebuttal to the racist rhetoric of China being the “Sick Man of Asia”. But it is still undeniably ethno-nationalism built around the descendants of the Yellow Emperor conquering again the continent that is their birthright. Just as modern western fantasy has as its bones the nostalgia for a pastoral, premodern Europe, wuxia can be seen as a dramatisation of Sinocentric hegemony, where taoist cultivation grants power and stalwart heroes fight against an ever-barbaric, ever-invading Other.
Dreams of the Diaspora
Jin Yong 金庸 remains synonymous with the genre of wuxia in Chinese and his foundational mark on it cannot be overstated. His Condor Trilogy (射鵰三部曲) was serialised between 1957-63 and concerns three generations of heroes during the turbulent 12th-13th centuries. The first concerns a pair of sworn brothers, one loyal and righteous, the other clever and treacherous. Their friendship deteriorates as the latter falls into villainy, scheming with the Jin Empire (1115–1234) to conquer his native land. The second in the trilogy follows their respective children repeating and atoning for the mistakes of their parents whilst the Mongols conquer the south. The last charts the internal rivalries within the martial artists fighting over two peerless weapons whilst its hero leads his secret society to overthrow the Yuan Dynasty (1271–1368).
It’s around here that English articles about him start comparing him to Tolkien, and it’s not wholly unjustified, given how both created immensely popular and influential legendaria that draw heavily upon ancient literary forms. Entire genres of work have sprung up around them and even subversions of their work have become themselves iconic. Jin Yong laid down what would become the modern conventions of the genre, from the way fights are imagined with discrete moves, to the secret martial arts manuals and trap-filled tombs.
Unlike Tolkien, however, Jin Yong’s work is still regularly (even aggressively) adapted. There are in existence nine tv adaptations of each instalment of the Condor Trilogy, for example, as well as a video game and a mobile game. And at time of writing, eight feature films and nine tv series based on his work are in production.
But Jin Yong’s work was not always so beloved by mainland Chinese audiences. For a long time he, along with the rest of wuxia, was banned, and the epicentre of the genre was in colonial Hong Kong. It is a detail often overlooked in the grand history of wuxia, so thoroughly has the genre been folded into contemporary Chinese identity. It is hard at times to remember how much of the genre was created by these artists in exile. Or perhaps that is the point, as Hong Kong’s own unique political and cultural identity is being subsumed into that of the People’s Republic, so too is its literary legacy. Literalist readings of his work as being primarily about historical martial artists defang the political metaphors and pointed allegories.
Jin Yong’s work is deeply political. Even in the most superficial sense, his heroes intersect with the politics of their time, joining revolutionary secret societies, negotiating treaties with Russia and fighting against barbarian invaders. They are bound up in the temporal world of hierarchy and power. Legend of the Condor Hero (射鵰英雄傳)’s Guo Jing 郭靖 becomes the sworn brother to Genghis Khan’s son, Tolui, and joins the Mongol campaign against the Khwarezmid Empire. Book and Sword (書劍恩仇錄)’s Chen Jialuo 陳家洛 is secretly the Qianlong Emperor’s half brother. The Deer and the Cauldron (鹿鼎記)’s Wei Xiaobao 韋小寶 is both best friends with the Kangxi Emperor and also heavily involved in a secret society dedicated to overthrowing the aforementioned emperor. Even Return of the Condor Hero (神鵰俠侶)‘s Yang Guo 楊過 ends up fighting to defend the remains of the Song Empire against the Mongols.
But it goes deeper than that. Jin Yong was a vocal critic of the Cultural Revolution, penning polemics against Mao Zedong and the Gang of Four during the late 60s. Beyond the immediate newspaper coverage, Jin Yong edited and published many more works both documenting and dissecting the Cultural Revolution.
Jin Yong described himself as writing every day one novel instalment and one editorial against the Gang of Four. Thus did they bleed together, the villains of Laughing in the Wind (笑傲江湖) becoming recognisable caricatures as it too rejected senseless personality cults.
In this light, his novels seem almost an encyclopaedia of traditional Chinese culture, its values and virtues, a record of it to stand bulwark against the many forces that would consign it all to oblivion. It is a resounding rebuttal to principles of the May Fourth Movement, that modernisation and westernisation are equivalents. To Jin Yong the old and the traditional were valuable, and it is from this we must build our new literature.
Taken together, Jin Yong’s corpus offers an alternate history of the Han people spanning over two thousand years from the Eastern Zhou (771–256 B.C.) to the Qing Dynasty (1644–1911). He fills in the intriguing gaps left in official records with folk heroes, court gossip and conspiracy theories. His text is dense with literary allusions and quotations from old Chinese poems.
His stories are almost all set during times of turmoil when what can be termed “China”, or at least, the Han people are threatened by barbarian invasion and internal corruption; pivotal moments in history that make heroes and patriots out of ordinary men and women. All this Jin Yong immortalises with a deep yearning for a place and past that never quite was; nostalgia in the oldest sense of the word, with all the pain and pining and illusion that it implies.
It is arguably this very yearning, this conjuring of a real and relevant past from dry history books that makes Jin Yong’s work so endlessly appealing to the Chinese diaspora, as well as the mainland Chinese emerging from the Cultural Revolution. This alternate history dramatises the complexities of Han identity, all the times it has been threatened, disrupted and diluted in history, but at the same time it gave hope and heroics. These were stories as simple or as complex as the reader wanted them to be.
Chinese Imperialism and Han Hegemony
It is sometimes hard to remember that Jin Yong and all the rest of wuxia was once banned in the People’s Republic of China, so thoroughly have they now embraced his work. As late as the 1990s, Jin Yong was decried as one of the “Four Great Vulgarities of Our Time” (alongside the four heavenly kings of cantopop, Jackie Chan and sappy Qiong Yao romances).
In recent decades, the CCP has rather dramatically changed its relationship with the past. The censorship machine is still very active, but it does not have in its crosshairs the decadent and feudal genre of wuxia (though there have been exceptions, especially during the run up to the Republic’s 70th anniversary when all frivolous dramas were put on pause; it is important to remember that the censors are not always singular or consistent in their opinions). But more importantly, the Party no longer draws power from a radical rejection of the past; instead the past is embraced utterly, celebrated at every turn. Traditionalism now forms a core pillar of their legitimacy, with all five thousand years of that history validating their rule. The State now actively promotes all those superstitions and feudal philosophies it once held in contempt.
Along with this shifting use of history to inspire nationalism, Jin Yong has been rehabilitated and canonised. It’s arguably that revolutionary traditionalism—that he was preserving history in a time of its destruction—that makes him so easy to rehabilitate. Jin Yong’s work appeals both to the conservative mind with its love of tradition and patriotic themes, but also to rebels in its love of outlaw heroes.
It isn’t that these stories have nothing to say on themes of a more abstract or universal sense of freedom or justice, but that they are also very much about the specifics of Han identity and nationalism. Jin Yong’s heroes often find themselves called to patriotism; even as they navigate their complex or divided loyalties, they must defend “China” in whatever form it exists in at the time against barbaric, alien invaders. Even as they function as straightforward stories of nationalistic defence, they are also dramatising disruptions of a simplistic or pure Chinese identity, foregrounding characters from marginalised (if also often exoticised) ethnicities and religions.
Jin Yong’s hero Guo Jing is Han by birth and Mongol by adoption. He ultimately renounces his loyalty to Genghis Khan and returns to his Han homeland to defend it from Mongol conquest. Whilst one can read Jin Yong’s sympathy and admiration for the Mongols as an attempt to construct an inclusive nationalism for modern China, Guo Jing’s participation as a Han hero in the conquest of Central Asia also functions as a justification of modern Han China’s political claim on that imperial and colonial legacy.
Book and Sword has this even more starkly as it feeds the popular Han fantasy that the Kangxi Emperor is not ethnically Manchu but instead, a Han changeling. He is forced by the hero of the novel Chen Jialuo to swear an oath to acknowledge his Han identity and overthrow the Manchus, but of course, he then betrays them and subjugates not only the Han but also the “Land of Wei” (now known as Xinjiang, where the genocide is happening). Still there is something to be said about how this secret parentage plot attributes the martial victories of the Qing to Han superiority and justifies the Han inheritance of former Qing colonies.
The Uyghur tribes are portrayed with sympathy in Book and Sword. They are noble and defiant and devout. Instead of savages who need to be brought to heel, they are fellow resistance fighters. It alludes to an inclusive national identity, one in which Han and Uyghur are united by their shared suffering under Manchu rule. It can also be argued that their prominence disrupts the ideal of a pure Han-centric Chineseness. But what good is inclusion and unity to those who do not want to be part of that nation? Uyghurs, being a people suffering occupation, actively reject the label of “Chinese Muslims”.
Furthermore, the character of Kasili in Book and Sword, based on the legend of the Fragrant Concubine, is drenched in orientalist stereotype. Chen first stumbles upon her bathing naked in a river, her erotic and romantic availability uncomfortably paralleling that of her homeland. When the land of Wei falls to the emperor’s sword and Kasili is taken as a concubine, she remains loyal to the Han hero she fell in love with, ultimately killing herself to warn Chen of the emperor’s duplicity. Conquest and imperial legacy are thus dramatised as a love triangle between a Uyghur princess, a Han rebel and a Manchu emperor.
Chen, it should be noted, falls in love and marries a different Uyghur princess for his happy ending.
Amid other far more brutal policies meant to forcibly assimilate and eradicate Uyghur identity, the PRC government encouraged Han men to take Uyghur women as wives. Deeply unpleasant adverts still available online extolled the beauty and availability of Uyghur women, as something and somewhere to be conquered. It is impossible not to be reminded of this when reading about the beautiful and besotted Kasili.
There is no small amount of political allegory to be read between the lines of Jin Yong, something he became increasingly frank about towards the end of his life. The Condor Trilogy, with its successive waves of northern invaders, can be seen as echoing the Communist takeover of China. The success of Wei Xiaobao’s affable cunning can be read as a satire on the hollowness of materialistic 70s modernity. But Jin Yong himself proved to be far less radical than his books as he sided with the conservative anti-democracy factions within Hong Kong during the Handover.
In a 1994 interview, Jin Yong argued against the idea that China was ever under “foreign rule”, instead proposing that the many ethnic groups within China are simply taking turns at being in the ascendant. All wars are thus civil wars, and he neatly aligns his novels with the current Chinese policies that oppress in the name of unity, harmony and assimilation, of “inclusive” nationalism.
The legacy of Jin Yong is a complex one. His work, like all art, contains multitudes and can sustain any number of seemingly contradictory interpretations. It is what is beautiful about art. But I cannot but feel that his rapid canonisation over the last decades in mainland China is a stark demonstration of how easily those yearning dreams of the diaspora can become nationalistic fodder.
I did not come to bury wuxia, but to praise it. I wanted to show you a little bit of its complexities and history, as well as the ideals and ideologies that simmer under its surface.
For me, I just think it is too easy to see wuxia as a form of salvation. Something to sustain and inspire me in a media landscape hostile to people who look like me. To give me the piece of me that I have felt missing, to heal a deep cultural wound. After all, Hollywood or broader Anglophone media might be reluctant to make stories with Asian protagonists, but I can turn to literally all of wuxia. American TV series won’t make me a fifty episode epic about two pretty men eyefucking each other that also has a happy ending, but I will always have The Untamed.
It’s this insidious feeling of hope. That this genre is somehow wholly “unproblematic” because I am reconnecting with my cultural roots, that it can nourish me. That it can be safe that way. It is, after all, untouched by all the problematic elements in the Anglophone mainstream that I have analysed to death and back. That it is some sort of oasis, untouched by colonialism and western imperialism. That it therefore won’t or can’t have that taint of white supremacy; it’s not even made by white people.
Perhaps it is just naive of me to have ever thought these things, however subconsciously. Articulating it now, it’s ridiculous. Han supremacy is a poisonous ideology that is destroying culture, hollowing out communities and actively killing people. In the face of its all-consuming genocide-perpetuating ubiquity, the least I can do is recognise its presence in a silly little genre I love. It just doesn’t seem too much to ask.
Jeannette Ng is originally from Hong Kong but now lives in Durham, UK. Her MA in Medieval and Renaissance Studies fed into an interest in medieval and missionary theology, which in turn spawned her love for writing gothic fantasy with a theological twist. She runs live roleplay games and is active within the costuming community, running a popular blog. Jeannette has been a finalist for the John W. Campbell Award for Best New Writer and the Sydney J Bounds Award (Best Newcomer) in the British Fantasy Awards 2018.
A weakness in the algorithm used to encrypt cellphone data in the 1990s and 2000s allowed hackers to spy on some internet traffic, according to a new research paper. Motherboard: The paper has sent shockwaves through the encryption community because of what it implies: The researchers believe that the mathematical probability of the weakness being introduced by accident is extremely low. Thus, they speculate that a weakness was intentionally put into the algorithm. After the paper was published, the group that designed the algorithm confirmed this was the case.
Researchers from several universities in Europe found that the encryption algorithm GEA-1, which was used in cellphones when the industry adopted GPRS standards in 2G networks, was intentionally designed to include a weakness that at least one cryptography expert sees as a backdoor. The researchers said they obtained two encryption algorithms, GEA-1 and GEA-2, which are proprietary and thus not public, "from a source." They then analyzed them and realized they were vulnerable to attacks that allowed for decryption of all traffic.
When trying to reverse-engineer the algorithm, the researchers wrote that (to simplify), they tried to design a similar encryption algorithm using a random number generator often used in cryptography and never came close to creating an encryption scheme as weak as the one actually used: "In a million tries we never even got close to such a weak instance," they wrote. "This implies that the weakness in GEA-1 is unlikely to occur by chance, indicating that the security level of 40 bits is due to export regulations."
Researchers dubbed the attack "divide-and-conquer," and said it was "rather straightforward." In short, the attack allows someone who can intercept cellphone data traffic to recover the key used to encrypt the data and then decrypt all traffic. The weakness in GEA-1, the oldest algorithm, developed in 1998, is that it provides only 40-bit security.
That's what allows an attacker to get the key and decrypt all traffic, according to the researchers.
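For a sense of scale, a 40-bit keyspace is tiny by modern standards. The back-of-the-envelope check below is just shell arithmetic, nothing GEA-1-specific; the comparison against a 64-bit key assumes GEA-1's nominal key length, which the paper discusses.

```shell
# ~1.1 trillion candidate keys at 40 bits -- searchable on commodity hardware
echo "2^40 = $(( 1 << 40 )) keys"
# a full 64-bit keyspace would be 2^24 times larger
echo "2^64 is $(( 1 << 24 )) times larger"
```

The point of the researchers' "million random tries" experiment is exactly this gap: schemes this far below their nominal strength essentially never arise by chance.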
Read more of this story at Slashdot.
SaltStack has released a security update to Salt to address three critical vulnerabilities. We strongly recommend that you prioritize this update.
This is a security release. The following CVEs were fixed as part of this release:
The post November 2020 SaltStack CVEs: CVE-2020-16846, CVE-2020-17490, CVE-2020-25592 appeared first on Salt Project.
For my work on Debian, i want to use my debian.org email address, while for my personal projects i want to use my gmail.com address.
One way to change the user.email git config value is to git config --local in every repo, but that's tedious, error-prone and doesn't scale very well with many repositories (and the chances to forget to set the right one on a new repo are ~100%).
The solution is to use the git-config ability to include extra configuration files, based on the repo path, by using includeIf:
Content of ~/.gitconfig:

[user]
    name = Sandro Tosi
    email = <personal.address>@gmail.com

[includeIf "gitdir:~/deb/"]
    path = ~/.gitconfig-deb
Every time the git path is in ~/deb/ (which is where i have all Debian repos) the file ~/.gitconfig-deb will be included; its content:
[user]
    email = firstname.lastname@example.org

That results in my personal address being used on all repos not part of Debian, and my Debian email address on those under ~/deb/. This approach can be extended to every other git configuration value.
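To sanity-check a setup like this, the sketch below runs the whole thing in a throwaway HOME (the addresses and paths are stand-ins, not my real config):

```shell
# Verify that includeIf picks the right identity, using an isolated HOME
export HOME="$(mktemp -d)" GIT_CONFIG_NOSYSTEM=1
mkdir -p "$HOME/deb/pkg" "$HOME/src/proj"

cat > "$HOME/.gitconfig" <<'EOF'
[user]
    name = Sandro Tosi
    email = personal@gmail.com
[includeIf "gitdir:~/deb/"]
    path = ~/.gitconfig-deb
EOF

cat > "$HOME/.gitconfig-deb" <<'EOF'
[user]
    email = debian@example.org
EOF

git init -q "$HOME/deb/pkg"
git init -q "$HOME/src/proj"
git -C "$HOME/deb/pkg" config user.email    # the Debian identity wins here
git -C "$HOME/src/proj" config user.email   # the personal identity elsewhere
```

Note that a gitdir pattern ending in `/` implicitly matches everything below it, which is why every repo under ~/deb/ gets the include.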
Studies from around the world suggest that success depends on class size, distancing, the age of the students, and how prevalent the virus is locally.
Been putting this together for a while... more to come.
In no particular order, though grouped by composer.
To be clear, I'm in no way saying these are unknown themes or not loved. In my limited experience, they just don't get the same acclaim as some more well-known scores, and I feel they deserve recognition! These are just pieces of music uncannily suited to their films, and work perfectly in the movie while also standing alone as wonderful pieces of music.
And while I haven't completely steered away from the John Williamses and Jerry Goldsmiths of the world, I have tried to include slightly more off-kilter selections that are truly fantastic.
Klendathu Drop - Starship Troopers
Robocop Theme - Robocop
Riddle of Steel & Riders of Doom - Conan the Barbarian
Love Theme - Cinema Paradiso
Complete Score - The Thing
Ecstasy of Gold - The Good, The Bad, and The Ugly
Going The Distance & The Final Bell - Rocky
Main Theme - The Right Stuff
Main Theme - Capricorn One
Main Theme - Gremlins II (and Gremlins... just a great performance of it)
Main Title - Planet of the Apes
The Enterprise - Star Trek: The Motion Picture
Erich Wolfgang Korngold
Main Title - Kings Row (also... the inspiration for Star Wars)
Main Title - Reunion - The Sea Hawk
Main Theme - Seven Years in Tibet (one of his best)
Main Theme - Born on the Fourth of July
With Malice Towards None - Lincoln
Main Theme - Predator
Main Theme - Contact (Maybe my fav on the list... I'm a sucker for sentimentality... Sue me)
Captain America March - Captain America: The First Avenger
Junkie XL - Mad Max: Fury Road
Daft Punk - Tron Legacy
James Horner - Commando
Wow, man. Some of us take on more extreme projects during the Great Coronavirus Quarantine than others.
This ambitious fellow shows you how to build a Nintendo Switch, with a beautiful and wholesome purpose: “to Starve Online Price Gougers” who are jacking up the prices because demand is high for Nintendo Switch, and availability is nil.
Here's their introduction to the HOWTO gallery, which is amazing and stupendous.
After playing New Horizons and hyping it up to my friends, they decided they wanted a Switch. They called around to different retailers every day for a week with no luck finding anyone who had one in stock. No one knew when the next shipment would be. This led to searching online marketplaces like Craigslist, OfferUp, and eBay.
Unfortunately everyone knows the rest. Upwards of $450 to $600 in the Seattle area for a used Switch. Some with and without all the accessories. This enraged me to the point of telling them I could build one cheaper out of spare parts. So they hired me to do just that. If anyone is interested in doing the same here is my step by step buying guide along with assembly instructions and a pricing guide.
1. Game Cartridge Card Slot Socket Board w/Headphones Port - $15
2. NS Console Micro SD TF Memory Card Slot Port Socket Reader - $5
3. Nintendo Switch HAC-001 CPU Cooling Heatsink - $7
4. Game Cartridge Card Plastic Cover - $1
5. Console Speaker Replacement Parts For Nintendo Switch Built in speaker - $8
6. Wifi Antenna Connecting Cable (Short) $2
7. Wifi Antenna Connecting Cable (Long) $2
8. Internal Cooling Fan - $3
9. Power & Volume Button control flex cable (w/ buttons and rubber conductor) - $4
10. Side Slider Sliding Rail Flex Cable (Left) - $3
11. Side Slider Sliding Rail Flex Cable (Right) - $3
12. Replacement Top Housing Shell Case Face plate -$6
13. Nintendo Switch Console Replacement Battery (New) - $15
14. Replacement Bottom Housing Shell Transparent Case Face plate -$5
15. Touch Screen Digitizer Adhesive - $0.50
16. Touch Screen Digitizer - $9
17. LCD Display Screen Replacement - $12
18. Shield Plate - $2
19. Iron Middle Frame - $6
20. (Not Pictured Here) - 100% WORKING OEM NINTENDO SWITCH REPLACEMENT LOGIC BOARD MOTHERBOARD - $95
21. (Not Pictured Here) - Full Screw Replacement Set - $2
22. (Not Pictured Here) - (Removal of Copper Sticker on CPU)
Grand Total For Used Parts Build: = $199
Ebay Average Price Jan 2020: = (between $175 and $225)
Ebay Average Price April 2020: = (between $300 and $400)
I am sure I made mistakes in this post so feel free to correct me if I am wrong about anything.
And screw you if you are one of the bad guys making a buck off of a crisis.
Here you go...
It used to be that being a couch potato was almost universally deemed a negative—but it’s funny how it only takes a contagious epidemic to turn the normal state of things on its head. Fortunately, nobody with a computer need be without ways to occupy their time.
Publishers, studios, and other media agencies are providing free offerings to give people plenty to do to ride out the corona lockdowns—as well as tools to assist self-education or learning at home. Here are a few of them I’ve noticed.
Educational/children’s book publisher Scholastic is offering a free 20-day learn-at-home program for grades K-9 via its web site—very handy for those in areas whose schools have closed down.
Would your children like to learn more about whales? Seattle-based research institute Oceans Initiative has launched a free Virtual Marine Biology Camp to teach school-closed children more about aquatic life. They’re holding live sessions every Monday and Thursday at 11 a.m. Pacific (2 p.m. Eastern) to help give those out-of-school children something educational to do.
Audiobook publisher and Amazon subsidiary Audible.com is making hundreds of audiobook titles available for free for the duration of school closures, via stories.audible.com.
NPR, the Sarasota Herald-Tribune, and CNET, among others, have articles collecting a lot of other free entertainment and education sources that weren’t free before the Corona quarantines. (Indeed, all you need do is google “coronavirus free entertainment” to find all the others who had the same idea.) But there are also still plenty of things that were already free and still are.
Baen’s Free Library is, of course, still just as free as it ever was. If you’re a member of a compatible public library, Hoopla Digital will let you borrow a limited number of ebooks, audiobooks, albums, movies, or TV episodes per month for free. And you still have access to Project Gutenberg, Librivox for audiobooks, Archive.org for all sorts of content, and all the other public-domain sites out there.
If you’re looking for something interesting to watch, Open Culture has links to over 200 free documentary films online, on subjects as diverse as Hayao Miyazaki and M.C. Escher. The site also includes links to free ebooks, audiobooks, online courses, and textbooks.
If you’re into anime, most of Crunchyroll‘s anime titles are available to watch for free (save for the very newest episode). Resolution may be limited, and you may have to put up with advertisements—but free is free, right? Pluto TV has over 250 channels of free video content, too, with mobile apps for iOS and Android available. And YouTube has its usual countless hundreds of thousands of hours of enjoyable ways to entertain or improve yourself, including its “Learning” category.
If you’re more into computer games, you could check out the Homecoming City of Heroes servers. Coming up on a full year since the game originally returned, it has thousands of players once again enjoying life in the early-2000s superhero MMO. (I play primarily on the Torchbearer shard, myself, and am always happy to help out new or returning players.)
There are many more free education or entertainment resources than I could even list, and there will doubtless be more the longer this lockdown goes on. How about adding your favorites in the comments?
Photo by Eric Antunes on Pexels.com
If you found this post worth reading and want to kick in a buck or two to the author, click here.
I have been late to adopt an on-premise cloud solution as the security of Owncloud a few years ago wasn't so stellar (cf. my comment from 2013 in Encryption files ... for synchronization across the Internet). But the follow-up product Nextcloud has matured quite nicely and we use it for collaboration both in the company and in FLOSS related work at multiple nonprofit organizations.
There is a very annoying "feature" in Nextcloud though: the designers think menu items for apps at the top need to be limited to eight or fewer to prevent information overload in the header. The whole issue discussion is worth reading as it is an archetypical example of design prevalence vs. user choice.
And of course designers think they are right. That's a feature of the trade.
And because they know better there is no user configurable option to extend those 8 items to maybe 12 or so, which would prevent the annoying overflow menu we are seeing with 10 applications in use:
Luckily code can be changed and there are many comments floating around the Internet on how to change minAppsDesktop = 8. The catch is that the constant ends up in slightly compressed form (aka "minified") in core/js/dist/main.js, and you probably don't want to build the whole beast locally just to change one constant. The source gets compressed during build time to become part of one 15,000+ character line that contains, among everything else, the minAppsDesktop constant.
Well, we can still patch that, can't we?
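A one-line sed substitution is the obvious workaround. The sketch below demonstrates it on a stand-in file; on a real install the target would be core/js/dist/main.js (back it up first), and the exact token "minAppsDesktop:8" is an assumption, so grep your main.js for the minified form before patching.

```shell
# Demo of the patch on a stand-in for the minified bundle
demo=$(mktemp)
printf 'foo,minAppsDesktop:8,bar' > "$demo"
sed -i 's/minAppsDesktop:8/minAppsDesktop:12/g' "$demo"
cat "$demo"
```

Keep in mind a patched dist file is overwritten by the next Nextcloud upgrade, so the change has to be reapplied each time.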
As part of our Blackhat Europe talk “Reverse Engineering and Exploiting Builds in the Cloud” we publicly released a new tool called Terrier.
In this blog post, I am going to show you how Terrier can help you identify and verify container and image components for a wide variety of use-cases, be it from a supply-chain perspective or forensics perspective. Terrier can be found on Github https://github.com/heroku/terrier.
In this blog post, I am not going to go into too much detail about containers and images (you can learn more here) however it is important to highlight a few characteristics of containers and images that make them interesting in terms of Terrier. Containers are run from images and currently the Open Containers Initiative (OCI) is the most popular format for images. The remainder of this blog post refers to OCI images as images.
Essentially images are tar archives that contain multiple tar archives and meta-information representing the “layers” of an image. The OCI format makes images relatively simple to work with, which in turn makes analysis straightforward. If you only had access to a terminal and the tar command, you could pretty much get what you need from the image’s tar archive.
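To make the "just tar" point concrete, here is a sketch on a fabricated stand-in archive (a real `docker save` output would carry real layer tars and richer metadata, but the same listing commands apply):

```shell
# Build a toy archive shaped like an image (a tar containing a tar), then list it
cd "$(mktemp -d)"
printf '{}' > manifest.json
printf 'layer contents' > rootfs
tar -cf layer.tar rootfs
tar -cf myImage.tar manifest.json layer.tar
tar -tf myImage.tar   # lists manifest.json and the inner layer archive
```

On a real image you would extract the outer archive, then list or extract each layer's tar to see the files it contributes.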
When images are utilised at runtime for a container, their contents become the contents of the running container and the layers are essentially extracted to a location on the container runtime host (the host that is running and maintaining the containers). This location contains a few folders of interest, particularly the "merged" folder, which contains the contents of the image and any changes that have occurred in the container since its creation. For example, if the image contained a location such as /usr/chris/stuff and, after creating a container from this image, I created a file in /usr/chris/stuff, both the directory and the new file would appear under the "merged" folder on the container runtime host.
Now that we have a brief understanding of images and containers, we can look at what Terrier does. Often it is the case that you would like to determine if an image or container contains a specific file. This requirement may be due to a forensic analysis need or to identify and prevent a certain supply-chain attack vector. Regardless of the requirement, having the ability to determine the presence of a specific file in an image or container is useful.
Terrier can be used to determine if a specific image contains a specific file. In order to do this, you need the following: (1) the image exported as a tar archive and (2) the SHA256 hash of the file you are looking for.
The first point can be easily achieved with Docker by using the following command:
$ docker save imageid -o myImage.tar
The command above uses a Docker image ID which can be obtained using the following command:
$ docker images
Once you have your image exported as a tar archive, you will then need to establish the SHA256 hash of the particular file you would like to identify in the image. There are multiple ways to achieve this but in this example, we are going to use the hash of the Golang binary go1.13.4 linux/amd64, which can be calculated with the following command on a Linux host:
$ cat /usr/local/go/bin/go | sha256sum
The command above should result in the following SHA256 hash:

82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
Now that we have a hash, we can use it to determine if the Golang binary is in the image myImage.tar. To achieve this, we need to populate a configuration file for Terrier. Terrier makes use of YAML configuration files and below is our config file, which we save as cfg.yml:

mode: image
image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
The config file above has multiple entries which allow us to specify the mode that Terrier will operate in. In this case we are working with an image file (tar archive), so the mode is image. The image file we are working with is myImage.tar and the hash we are looking to identify is the SHA256 of the Golang binary calculated above.
We are now ready to run Terrier. Running the command below should result in output similar to the following:

$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[!] Found file '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759/usr/local/go/bin/go' with hash: 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
We have identified a file, /usr/local/go/bin/go, located in layer 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759, that has the same SHA256 hash as the one we provided. We now have verification that the image “myImage.tar” contains a file with the SHA256 hash we provided.
This example can be extended upon and you can instruct Terrier to search for multiple hashes. In this case, we are going to search for a malicious file. Recently a malicious Python library was identified in the wild and went by the name “Jeilyfish”. Terrier could be used to check if a Docker image of yours contained this malicious package. To do this, we can determine the SHA256 of one of the malicious Python files that contains the backdoor:
$ cat jeIlyfish-0.7.1/jeIlyfish/_jellyfish.py | sha256sum
cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c
We then update our Terrier config to include the hash calculated above.
mode: image
image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
  - hash: 'cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c'
We then run Terrier again and analyse the results:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[!] Found file '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759/usr/local/go/bin/go' with hash: 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
The results above indicate that our image did not contain the malicious Python package.
There is no limit as to how many hashes you can search for however it should be noted that Terrier performs all its actions in-memory for performance reasons so you might hit certain limits if you do not have enough accessible memory.
Terrier can also be used to determine if a specific image contains a specific file at a specific location. This can be useful to ensure that an image is using a specific component, i.e. a binary, shared object or dependency. This can also be seen as “pinning” components by ensuring that your images are using specific components, i.e. a specific version of cURL.
In order to do this, you need the following: (1) the image exported as a tar archive, (2) the path of the file in the image and (3) the SHA256 hash of a trusted instance of that file.
The first point can be easily achieved with Docker by using the following command:
$ docker save imageid -o myImage.tar
The command above utilises a Docker image id which can be obtained using the following command:
$ docker images
Once you have your image exported as a tar archive, you will need to determine the path of the file you would like to identify and verify in the image. For example, if we would like to ensure that our images are making use of a specific version of cURL, we can run the following commands in a container or some other environment that resembles the image.
$ which curl /usr/bin/curl
We now have the path to cURL and can generate the SHA256 of this instance of cURL because, in this case, we trust it. We could also determine the hash by other means; for example, many binaries are released with a corresponding hash from the developer, which can be acquired from the developer’s website.
$ cat /usr/bin/curl | sha256sum
9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
With this information, we can now populate our config file for Terrier:
mode: image
image: myImage.tar
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
We’ve saved the above config as cfg.yml and when we run Terrier with this config, we get the following output:

$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (1/1)
[!] All components were identified and verified: (1/1)
$ echo $?
0
The output above indicates that the file /usr/bin/curl was successfully identified and verified, meaning that the image contained a file at the location /usr/bin/curl and that the SHA256 of that file matched the hash we provided in the config. Terrier also makes use of return codes and if we analyse the return code from the output above, we can see that the value is 0, which indicates a success. If Terrier cannot identify or verify all the provided files, a return code of 1 is returned, which indicates a failure. The setting of return codes is particularly useful in testing environments or CI/CD environments.
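That exit-code contract makes it straightforward to gate a pipeline step. A minimal sketch follows; the stub function stands in for the real Terrier invocation (e.g. ./terrier -cfg=cfg.yml), which a real pipeline would call directly.

```shell
# Fail a CI step when component verification fails.
verify_image() { true; }   # stand-in; the real call exits 0 on success, 1 on failure
if verify_image; then
  echo "all components identified and verified"
else
  echo "verification failed, blocking the release" >&2
  exit 1
fi
```

Because the script exits non-zero on failure, any CI system that checks step exit codes will halt the build without further wiring.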
We can also run Terrier with verbose mode enabled to get more information:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[!] Identified instance of '/usr/bin/curl' at: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560/usr/bin/curl
[!] Verified matching instance of '/usr/bin/curl' at: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560/usr/bin/curl with hash: 9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (1/1)
[!] All components were identified and verified: (1/1)
The output above provides some more detailed information, such as which layer the cURL file was located in. If you want even more information, you can enable the veryveryverbose option in the config file but beware, this produces a lot of output and grep will be your friend.
There is no limit on how many hashes you can specify for a file. This can be useful when you want to allow more than one version of a specific file, i.e. multiple versions of cURL. An example config with multiple hashes for a file might look like:
mode: image
image: myImage.tar
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
      - hash: 'aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545'
      - hash: '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759'
      - hash: 'd4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c'
The config above allows Terrier to verify whether the identified cURL instance matches one of the provided hashes. There is also no limit on the number of files Terrier can attempt to identify and verify.
Terrier’s Github repo also contains a useful script called convertSHA.sh which can be used to convert a list of SHA256 hashes and filenames into a Terrier config file. This is useful when converting the output from other tools into a Terrier friendly format. For example, we could have the following contents of a file:
8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d ./bin/uname
6de8254cfd49543097ae946c303602ffd5899b2c88ec27cfcd86d786f95a1e92 ./bin/gzexe
74ff9700d623415bc866c013a1d8e898c2096ec4750adcb7cd0c853b4ce11c04 ./bin/wdctl
61c779de6f1b9220cdedd7dfee1fa4fb44a4777fff7bd48d12c21efb87009877 ./bin/dmesg
7bdde142dc5cb004ab82f55adba0c56fc78430a6f6b23afd33be491d4c7c238b ./bin/which
3ed46bd8b4d137cad2830974a78df8d6b1d28de491d7a23d305ad58742a07120 ./bin/mknod
e8ca998df296413624b2bcf92a31ee3b9852f7590f759cc4a8814d3e9046f1eb ./bin/mv
a91d40b349e2bccd3c5fe79664e70649ef0354b9f8bd4658f8c164f194b53d0f ./bin/chown
091abe52520c96a75cf7d4ff38796fc878cd62c3a75a3fd8161aa3df1e26bebd ./bin/uncompress
c5ebd611260a9057144fd1d7de48dbefc14e16240895cb896034ae05a94b5750 ./bin/echo
d4ba9ffb5f396a2584fec1ca878930b677196be21aee16ee6093eb9f0a93bf8f ./bin/df
5fb515ff832650b2a25aeb9c21f881ca2fa486900e736dfa727a5442a6de83e5 ./bin/tar
6936c9aa8e17781410f286bb1cbc35b5548ea4e7604c1379dc8e159d91a0193d ./bin/zforce
8d641329ea7f93b1caf031b70e2a0a3288c49a55c18d8ba86cc534eaa166ec2e ./bin/gzip
0c1a1f53763ab668fb085327cdd298b4a0c1bf2f0b51b912aa7bc15392cd09e7 ./bin/su
20c358f7ee877a3fd2138ecce98fada08354810b3e9a0e849631851f92d09cc4 ./bin/bzexe
01764d96697b060b2a449769073b7cf2df61b5cb604937e39dd7a47017e92ee0 ./bin/znew
0d1a106dc28c3c41b181d3ba2fc52086ede4e706153e22879e60e7663d2f6aad ./bin/login
fb130bda68f6a56e2c2edc3f7d5b805fd9dcfbcc26fb123a693b516a83cfb141 ./bin/dir
0e7ca63849eebc9ea476ea1fefab05e60b0ac8066f73c7d58e8ff607c941f212 ./bin/bzmore
14dc8106ec64c9e2a7c9430e1d0bef170aaad0f5f7f683c1c1810b466cdf5079 ./bin/zless
9cf4cda0f73875032436f7d5c457271f235e59c968c1c101d19fc7bf137e6e37 ./bin/chmod
c5f12f157b605b1141e6f97796732247a26150a0a019328d69095e9760b42e38 ./bin/sleep
b9711301d3ab42575597d8a1c015f49fddba9a7ea9934e11d38b9ff5248503a8 ./bin/zfgrep
0b2840eaf05bb6802400cc5fa793e8c7e58d6198334171c694a67417c687ffc7 ./bin/stty
d9393d0eca1de788628ad0961b74ec7a648709b24423371b208ae525f60bbdad ./bin/bunzip2
d2a56d64199e674454d2132679c0883779d43568cd4c04c14d0ea0e1307334cf ./bin/mkdir
1c48ade64b96409e6773d2c5c771f3b3c5acec65a15980d8dca6b1efd3f95969 ./bin/cat
09198e56abd1037352418279eb51898ab71cc733642b50bcf69d8a723602841e ./bin/true
97f3993ead63a1ce0f6a48cda92d6655ffe210242fe057b8803506b57c99b7bc ./bin/zdiff
0d06f9724af41b13cdacea133530b9129a48450230feef9632d53d5bbb837c8c ./bin/ls
da2da96324108bbe297a75e8ebfcb2400959bffcdaa4c88b797c4d0ce0c94c50 ./bin/zegrep
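A hash list in this format is simply the output of sha256sum. As a minimal sketch (the demo directory and file here are stand-ins, not part of the original example), such a list could be generated like so:

```shell
# Sketch: produce a sha256sum-format trusted-hash list for every
# file under a directory (demo/bin is a stand-in for a real ./bin)
mkdir -p demo/bin
printf 'hello\n' > demo/bin/sample
find demo/bin -type f -exec sha256sum {} + > demo-hashes.txt
cat demo-hashes.txt
```

Each output line is `<sha256> <path>`, the same shape as the trusted list above.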
The file contents above are trusted SHA256 hashes for specific files. If we would like to use this list to ensure that a particular image contains the files listed above, we can do the following:
$ ./convertSHA.sh trustedhashes.txt terrier.yml
The script above takes the input file trustedhashes.txt, which contains our trusted hashes listed above, and converts it into a Terrier-friendly config file, terrier.yml, which looks like the following:
mode: image
image: myImage.tar
files:
  - name: '/bin/uname'
    hashes:
      - hash: '8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d'
  - name: '/bin/gzexe'
    hashes:
      - hash: '6de8254cfd49543097ae946c303602ffd5899b2c88ec27cfcd86d786f95a1e92'
  - name: '/bin/wdctl'
    hashes:
      - hash: '74ff9700d623415bc866c013a1d8e898c2096ec4750adcb7cd0c853b4ce11c04'
  - name: '/bin/dmesg'
    hashes:
      - hash: '61c779de6f1b9220cdedd7dfee1fa4fb44a4777fff7bd48d12c21efb87009877'
  - name: '/bin/which'
    hashes:
      - hash: '7bdde142dc5cb004ab82f55adba0c56fc78430a6f6b23afd33be491d4c7c238b'
  - name: '/bin/mknod'
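Terrier ships convertSHA.sh for this step; purely as an illustrative sketch (this is a hypothetical stand-in, not the real script, and the file names here are demo stand-ins), the same transformation can be expressed with awk:

```shell
# Hypothetical stand-in for convertSHA.sh: rewrite "<hash> <path>"
# lines into Terrier's files/hashes YAML structure
cat > demo-hashes.txt <<'EOF'
8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d ./bin/uname
EOF
{
  printf 'mode: image\nimage: myImage.tar\nfiles:\n'
  awk '{ sub(/^\./, "", $2)   # strip the leading dot from the path
         printf "  - name: '\''%s'\''\n    hashes:\n      - hash: '\''%s'\''\n", $2, $1 }' demo-hashes.txt
} > demo-terrier.yml
cat demo-terrier.yml
```

The emitted YAML has the same structure as the terrier.yml shown above.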
The config file terrier.yml is now ready to be used with Terrier:
$ ./terrier -cfg=terrier.yml
[+] Loading config: terrier.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] Not all components were identifed: (4/31)
[!] Component not identified: /bin/uncompress
[!] Component not identified: /bin/bzexe
[!] Component not identified: /bin/bzmore
[!] Component not identified: /bin/bunzip2
$ echo $?
1
As we can see from the output above, Terrier was unable to identify 4 of the 31 components provided in the config. The return code is 1, which indicates a failure. If we were to remove the components that are not in the provided image, the output from the previous command would look like the following:
$ ./terrier -cfg=terrier.yml
[+] Loading config: terrier.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (27/27)
[!] Not all components were verified: (26/27)
[!] Component not verified: /bin/cat
[!] Component not verified: /bin/chmod
[!] Component not verified: /bin/chown
[!] Component not verified: /bin/df
[!] Component not verified: /bin/dir
[!] Component not verified: /bin/dmesg
[!] Component not verified: /bin/echo
[!] Component not verified: /bin/gzexe
[!] Component not verified: /bin/gzip
[!] Component not verified: /bin/login
[!] Component not verified: /bin/ls
[!] Component not verified: /bin/mkdir
[!] Component not verified: /bin/mknod
[!] Component not verified: /bin/mv
[!] Component not verified: /bin/sleep
[!] Component not verified: /bin/stty
[!] Component not verified: /bin/su
[!] Component not verified: /bin/tar
[!] Component not verified: /bin/true
[!] Component not verified: /bin/uname
[!] Component not verified: /bin/wdctl
[!] Component not verified: /bin/zdiff
[!] Component not verified: /bin/zfgrep
[!] Component not verified: /bin/zforce
[!] Component not verified: /bin/zless
[!] Component not verified: /bin/znew
$ echo $?
1
The output above indicates that Terrier was able to identify all of the components provided, but that many could not be verified: their hashes did not match. Once again, the return code is 1 to indicate this failure.
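Because Terrier signals failure through its exit code, it can gate a script directly. A minimal sketch (here `false` is a stand-in for a failing `./terrier -cfg=terrier.yml` run, and `run_terrier` is a hypothetical wrapper):

```shell
# 'false' stands in for a './terrier -cfg=terrier.yml' run that exits 1
run_terrier() { false; }

if run_terrier; then
    echo "all components identified and verified"
else
    status=$?   # exit status of run_terrier
    echo "verification failed (exit code $status)"
fi
```

A real script would typically `exit "$status"` in the failure branch to abort subsequent steps.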
The previous sections focused on identifying files in images, which can be referred to as a form of “static analysis”; however, it is also possible to perform this analysis on running containers. To do this, you need the location of the running container’s merged folder on the underlying host, as well as the hashes of the files you want to identify. The merged folder is Docker-specific; in this case, we are using it because this is where the contents of the running Docker container reside on the host, and it might be in another location if a different container runtime were used. The location of the container’s merged folder can be determined by running the following commands. First, obtain the container’s ID:
$ docker ps
CONTAINER ID   IMAGE    COMMAND   CREATED        STATUS        PORTS   NAMES
b9e676fd7b09   golang   "bash"    20 hours ago   Up 20 hours           cocky_robinson
Once you have the container’s ID, you can run the following command, which will help you identify the location of the merged folder on the underlying host:
$ docker exec b9e676fd7b09 mount | grep diff
overlay on / type overlay (rw,relatime,lowerdir=/var/lib/docker/overlay2/l/7ZDEFE6PX4C3I3LGIGGI5MWQD4:/var/lib/docker/overlay2/l/EZNIFFIXOVO2GIT5PTBI754HC4:/var/lib/docker/overlay2/l/UWKXP76FVZULHGRKZMVYJHY5IK:/var/lib/docker/overlay2/l/DTQQUTRXU4ZLLQTMACWMJYNRTH:/var/lib/docker/overlay2/l/R6DE2RY63EJABTON6HVSFRFICC:/var/lib/docker/overlay2/l/U4JNTFLQEKMFHVEQJ5BQDLL7NO:/var/lib/docker/overlay2/l/FEBURQY25XGHJNPSFY5EEPCFKA:/var/lib/docker/overlay2/l/ICNMAZ44JY5WZQTFMYY4VV6OOZ,upperdir=/var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/diff,workdir=/var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/work)
From the results above, we are interested in two entries, upperdir and workdir, because these two entries will provide us with the path to the container’s merged folder. From the results above, we can determine that the container’s merged folder is located at /var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/merged on the underlying host.
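Assuming Docker with the overlay2 storage driver is available, docker inspect can also report the merged directory directly, which avoids parsing mount output; a hedged sketch (the container ID is the one from the example above, and the fallback message is ours):

```shell
# Ask Docker for the container's merged directory (overlay2 driver);
# falls back to a message where no Docker daemon or container exists
docker inspect -f '{{ .GraphDriver.Data.MergedDir }}' b9e676fd7b09 \
  2>/dev/null || echo "docker unavailable or container not found"
```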
Now that we have the location, we need some files to identify, and in this case we are going to reuse the SHA256 hashes from the previous section. Let’s now go ahead and populate our Terrier configuration with this new information.
mode: container
path: merged
#image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
  - hash: 'cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c'
The configuration above shows that we have changed the mode to container and added the path to our merged folder. We have kept the two hashes from the previous section. If we run Terrier with this configuration from the directory that contains the merged folder (/var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/), we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Container
[!] Found matching instance of '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd' at: merged/usr/local/go/bin/go with hash:82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
From the output above, we know that the container (b9e676fd7b09) does not contain the malicious Python package, but it does contain an instance of the Golang binary, which is located at /usr/local/go/bin/go inside the container.
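The per-file check Terrier performs is equivalent to recomputing a file's SHA256 and comparing it against the trusted value. A self-contained sketch using sha256sum's check mode (sample.bin is a hypothetical stand-in file, not part of the example container):

```shell
# Create a stand-in file, record its hash, then verify it the way
# Terrier does: recompute the SHA256 and compare against the record
printf 'stand-in contents\n' > sample.bin
sha256sum sample.bin > sample.sha256
sha256sum -c sample.sha256   # prints "sample.bin: OK" on a match
```

If the file were modified, `sha256sum -c` would report FAILED and exit non-zero, mirroring Terrier's "not verified" result.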
And as you might have guessed, Terrier can also be used to verify and identify files at specific paths in containers. To do this, we again need the location of the container’s merged folder, along with the names and hashes of the files to check. These can be determined using the same procedures described in the previous sections. Below is an example Terrier config file that we could use to identify and verify components in a running container:
mode: container
path: merged
verbose: true
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
  - name: '/usr/local/go/bin/go'
    hashes:
      - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
If we run Terrier with the above config, we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Container
[!] Found matching instance of '/usr/bin/curl' at: merged/usr/bin/curl with hash:9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
[!] Found matching instance of '/usr/local/go/bin/go' at: merged/usr/local/go/bin/go with hash:82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[!] All components were identified: (2/2)
[!] All components were identified and verified: (2/2)
$ echo $?
0
From the output above, we can see that Terrier was able to successfully identify and verify all the files in the running container. The return code is 0, which indicates a successful execution of Terrier.
In addition to being used as a standalone CLI tool, Terrier can also be integrated easily with existing CI/CD technologies such as GitHub Actions and CircleCI. Below are two example configurations that show how Terrier can be used to identify and verify certain components of Docker images in a pipeline and prevent the pipeline from continuing if any verification fails. This can be seen as an extra mitigation against supply-chain attacks.
Below is a CircleCI example configuration using Terrier to verify the contents of an image.
version: 2
jobs:
  build:
    machine: true
    steps:
      - checkout
      - run:
          name: Build Docker Image
          command: |
            docker build -t builditall .
      - run:
          name: Save Docker Image Locally
          command: |
            docker save builditall -o builditall.tar
      - run:
          name: Verify Docker Image Binaries
          command: |
            ./terrier
Below is a GitHub Actions example configuration using Terrier to verify the contents of an image.
name: Go
on: [push]
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - name: Get Code
        uses: actions/checkout@master
      - name: Build Docker Image
        run: |
          docker build -t builditall .
      - name: Save Docker Image Locally
        run: |
          docker save builditall -o builditall.tar
      - name: Verify Docker Image Binaries
        run: |
          ./terrier
In this blog post, we have looked at how to perform multiple actions on Docker (and OCI) containers and images via Terrier. These actions allowed us to identify specific files according to their hashes in images and containers, and to identify and verify multiple components in both. Such checks are useful when attempting to prevent certain supply-chain attacks.
We have also seen how Terrier can be used in a DevOps pipeline via GitHub Actions and CircleCI.
Learn more about Terrier on GitHub at https://github.com/heroku/terrier.
For decades, architectural critic and photographer John Margolies obsessively documented roadside attractions: vernacular architecture, weird sculpture, odd businesses and amusements. By his death in 2016, his collection consisted of more than 11,000 slides (he published books of his favorites, with annotations).
The Library of Congress purchased the Margolies archive and has released it to the public domain, with hi-rez scans of 11,710 slides.
Almost all of Margolies’ work was done in the interest of preserving images of what would otherwise be lost to time. Even his first book, published in 1981, was elegiacally called The End of the Road: Vanishing Highway Architecture in America. From the start, Margolies knew the quirky motels, miniature golf courses, diners, billboards, and gas stations were being endangered by franchising and changing fashions — not to mention changing patterns of automobile traffic. (For decades now, most drivers have, of course, opted for the high speed-limits of superhighways and the convenience of service areas, leaving the old local highways in the lurch.)
John Margolies’ Photographs of Roadside America [Public Domain Review]
John Margolies [Library of Congress]
Roadside America [Library of Congress/Flickr Commons]
The first anatomically correct model of the visual cortex seeks to capture how the brain sees the world.
Last weekend, Jeanette Ng won the John W Campbell Award for Best New Writer at the 2019 Hugo Awards at the Dublin Worldcon; Ng's acceptance speech calls Campbell, one of the field's most influential editors, a "fascist" and expresses solidarity with the Hong Kong pro-democracy protesters.
I am a past recipient of the John W Campbell Award for Best New Writer (2000) as well as a recipient of the John W Campbell Memorial Award (2009). I believe I'm the only person to have won both of the Campbells, which, I think, gives me unique license to comment on Ng's remarks, which have been met with a mixed reception from the field.
I think she was right -- and seemly -- to make her remarks. There's plenty of evidence that Campbell's views were odious and deplorable. For example, Heinlein apologists like to claim (probably correctly) that his terrible, racist, authoritarian, eugenics-inflected yellow peril novel Sixth Column was effectively a commission from Campbell (Heinlein based the novel on one of Campbell's stories). This seems to have been par for the course for JWC, who liked to micro-manage his writers: Campbell also leaned hard on Tom Godwin to kill the girl in "The Cold Equations" in order to turn his story into a parable about the foolishness of women and the role of men in guiding them to accept the cold, hard facts of life.
So when Ng held Campbell "responsible for setting a tone of science fiction that still haunts the genre to this day. Sterile. Male. White. Exalting in the ambitions of imperialists and colonisers, settlers and industrialists," she was factually correct.
Not just factually correct: also correct to be saying this now. Science fiction (like many other institutions) is having a reckoning with its past and its present. We're trying to figure out what to do about the long reach that the terrible ideas of flawed people (mostly men) had on our fields. We're trying to reconcile the legacies of flawed people whose good deeds and good art live alongside their cruel, damaging treatment of women. These men were not aberrations: they were following an example set from the very top and running through fandom, to the great detriment of many of the people who came to fandom for safety and sanctuary and community.
It's not a coincidence that one of the first organized manifestations of white nationalism as a cultural phenomenon was within fandom, and while fandom came together to firmly repudiate its white nationalist wing, these assholes weren't (all) entryists who showed up to stir trouble in someone else's community. The call (to hijack the Hugo award) was coming from inside the house: these guys had been around forever, and we'd let them get away with it, in the name of "tolerance" even as these guys were chasing women, queer people, and racialized people out of the field.
Those same Nazis went on to join Gamergate, then take up on /r/The_Donald, and they were part of the vanguard of the movement that put a boorish, white supremacist grifter into the White House.
The connection between the tales we tell about ourselves and our past and futures has a real, direct effect on the future we arrive at. White supremacist folklore, including the ecofascist doctrine that says we can only avert climate change by murdering all the brown people, comes straight out of sf folklore, where it's completely standard for every disaster to be swiftly followed by an underclass mob descending on their social betters to eat and/or rape them (never mind the actual way that disasters go down).
When Ng took the mic and told the truth about his legacy, she wasn't downplaying his importance: she was acknowledging it. Campbell's odious ideas matter because he was important, a giant in the field who left an enduring mark on it. No one disagrees about that. What we want to talk about today is what that mark is, and what it means.
There are still people in our community who knew Campbell personally, and many many others one step removed, who idolize and respect the writers Campbell took under his wing. And there are people — and once again I raise my hand — who are in the field because of the way Campbell shaped it as a place where they could thrive. Many if not most of these folks know about his flaws, but even so it’s hard to see someone with no allegiance to him, either personally or professionally, point them out both forcefully and unapologetically. They see Campbell and his legacy abstractly, and also as an obstacle to be overcome. That’s deeply uncomfortable.
He's not wrong, and the people who counted Campbell as a friend are legitimately sad to confront the full meaning of his legacy. I feel for them. It's hard to reconcile the mensch who was there for you and treated his dog with kindness and doted on his kids with the guy who alienated and hurt people with his cruel dogma.
Here's the thing: neither one of those facets of Campbell cancel the other one out. Just as it's not true that any amount of good deeds done for some people can repair the harms he visited on others; it's also true that none of those harms cancel out the kindnesses he did for the people he was kind to.
Life is not a ledger. Your sins can't be paid off through good deeds. Your good deeds are not cancelled by your sins. Your sins and your good deeds live alongside one another. They coexist in superposition.
You (and I) can (and should) atone for our misdeeds. We can (and should) apologize for them to the people we've wronged. We should do those things, not because they will erase our misdeeds, but because the only thing worse than being really wrong is not learning to be better.
People are flawed vessels. The circumstances around us -- our social norms and institutions -- can be structured to bring out our worst natures or our best. We can invite Isaac Asimov to our cons to deliver a lecture on "The Power of Posterior Pinching" in which he literally advises men on how to grope the women in attendance, or we can create and enforce a Code of Conduct that would bounce anyone, up to and including the Con Chair and the Guest of Honor, who tried a stunt like that.
We, collectively, through our norms and institutions, create the circumstances that favor sociopathy or generosity. Sweeping bad conduct under the rug isn't just cruel to the people who were victimized by that conduct: it's also a disservice to the flawed vessels who are struggling with their own contradictions and base urges. Creating an environment where it's normal to do things that -- in 10 or 20 years -- will result in your expulsion from your community is not a kindness to anyone.
There are shitty dudes out there today whose path to shitty dudehood got started when they watched Isaac Asimov deliver a tutorial on how to grope women without their consent and figured that the chuckling approval of all their peers meant that whatever doubts they might have had were probably misplaced. Those dudes don't get a pass because they learned from a bad example set by their community and its leaders -- but they might have been diverted from their path to shitty dudehood if they'd had better examples. They might not have scarred and hurt countless women on their way from the larval stage of shittiness to full-blown shitlord, and they themselves might have been spared their eventual fate, of being disliked and excluded from a community they joined in search of comradeship and mutual aid. The friends of those shitty dudes might not have to wrestle with their role in enabling the harm those shitty dudes wrought.
Jeannette Ng's speech was exactly the speech our field needs to hear. And the fact that she devoted the bulk of it to solidarity with the Hong Kong protesters is especially significant, because of the growing importance of Chinese audiences and fandom in sf, which exposes writers to potential career retaliation from an important translation market. There is a group of (excellent, devoted) Chinese fans who have been making noises about a Chinese Worldcon for years, and speeches like Ng's have to make you wonder: if that ever comes to pass, will she be able to get a visa to attend?
Back when the misogynist/white supremacist wing of SF started to publicly organize to purge the field of the wrong kind of fan and the wrong kind of writer, they were talking about people like Ng. I think that this is ample evidence that she is in exactly the right place, at the right time, saying the right thing.
And I am so proud to be part of this. To share with you my weird little story, an amalgam of all my weird interests, so much of which has little to do with my superficial identities and labels.
But I am a spinner of ideas, of words, as Margaret Cavendish would put it.
So I need say, I was born in Hong Kong. Right now, in the most cyberpunk city in the world, protesters struggle with the masked, anonymous stormtroopers of an autocratic Empire. They have literally just held their largest illegal gathering in their history. As we speak they are calling for a horological revolution in our time. They have held laser pointers to the skies and tried to impossibly set alight the stars. I cannot help but be proud of them, to cry for them, and to lament their pain.
I’m sorry to drag this into our fantastical worlds, you’ve given me a microphone and this is what I felt needed saying.
John W. Campbell, for whom this award was named, was a fascist. [Jeannette Ng/Medium]
This week the New York Times published a five-years-later retrospective on Gamergate and its aftereffects, which is chilling and illuminating, and you should go read it. It makes an excellent case — several excellent written cases, actually — that “everything is Gamergate,” that it and its hate-screeching online mobs were the prototype for all the culture and media wars since and to come.
Sadly, the lesson expounded herein by the NYT is one which they — and other media — do not yet seem to have actually learned themselves.
Let’s look at another piece which called Gamergate a template for cultural warfare, using the media as a battleground. This one was written back in 2014, by one Kyle Wagner, in Deadspin, and its scathing, take-no-prisoners real-time analysis was downright prophetic. A few of its most important passages:
Gamergate is […] a relatively small and very loud group of video game enthusiasts who claim that their goal is to audit ethics in the gaming-industrial complex and who are instead defined by the campaigns of criminal harassment that some of them have carried out against several women […] What’s made it effective, though, is that it’s exploited the same basic loophole in the system that generations of social reactionaries have: the press’s genuine and deep-seated belief that you gotta hear both sides … that anyone more respectable than, say, an avowed neo-Nazi is operating in something like good faith
It is now clear to us all that that last statement is no longer correct … in that it is far too optimistic. Two years ago, the NYT made it apparent that they are in fact willing to assume “an avowed neo-Nazi is operating in something like good faith,” when they published a piece about “the Nazi sympathizer next door,” one variously called “chummy” (Quartz), “sympathetic” (Business Insider), and “normalizing” (NYT readers themselves, among many others.)
Back to Wagner in Deadspin:
The demands for journalistic integrity coming from Gamergate have nothing at all to do with the systemic corruption of the gaming media … The claims from what we like to call the “bias journalisms” school of media criticism aren’t meant to express anything in particular, or even, perhaps, to be taken seriously; they’re meant to work the referees, to get them looking over their shoulders, to soften them up in the hopes that a particular grievance, whatever its merits, might get a better hearing next time around.
How does it play out? Like this: Earlier this month, the New York Times covered Intel’s capitulation in the face of a coordinated Gamergate campaign, called “Operation Disrespectful Nod.”
Here’s that NYT piece from five years ago. It, in turn, begins:
For a little more than a month, a firestorm over sexism and journalistic ethics has roiled the video game community, culminating in an orchestrated campaign to pressure companies into pulling their advertisements from game sites.
That campaign won a big victory in recent days with a decision by Intel, the chip maker, to pull ads from Gamasutra, a site for game developers.
Intel’s decision added to a controversy that has focused attention on the treatment of women in the games business and the power of online mobs. The debate intensified in August, partly because of the online posts of a spurned ex-boyfriend of a female game developer.
Wagner’s inescapable conclusion:
The story continued in this vein—cautious, assiduously neutral, lobotomized […] Both sides were heard. And thus did Leigh Alexander’s commentary on the pluralism of gaming today get equal time with a campaign bent on silencing her. …Make it a story about an oppressive and hypocritical media conspiracy, and all of a sudden you have a cause, a side in a “debate.”
Gamergate, like so many bad-faith movements since, followed a variant of the “motte and bailey” strategy, which is
when you make a bold, controversial statement. Then when somebody challenges you, you claim you were just making an obvious, uncontroversial statement, so you are clearly right and they are silly for challenging you. Then when the argument is over you go back to making the bold, controversial statement.
Here, the motte is an ugly or vile cause — in Gamergate’s case, vicious misogyny — and the bailey is an entirely different purported argument — for Gamergate, “it’s about ethics in games journalism.” They work the latter argument for credibility, but entirely in bad faith, because it is tacitly understood, both internally and externally, albeit in a quasi-deniable way, that what they actually care about is their ugly cause.
This has become the playbook for so many modern disputes, because it continues to be a thoroughly effective way to manipulate the mainstream media. Arguments about purported “grievance politics,” or “the decline of America sanctioned by the elites,” or a manufactured, fictional “immigration crisis,” all continue to be treated by the media as legitimate grievances, and/or good-faith disputes, rather than a thin pretext for bald-faced racism and xenophobia.
Every so often the motte is accidentally revealed, as when the head of the USCIS said, just this week, that the famous poem which adorns the Statue of Liberty referred to “people coming from Europe.” But in general the pretense of the bailey is upheld.
Let me reiterate: the pretense. These are arguments knowingly made in bad faith. What’s more, the actual cause soon becomes apparent to those who investigate the subject with open and searching minds. Good journalists should not be willing accept such distorted pretenses at face value, nor assume good faith without evidence. The NYT clearly made that mistake, fell into that trap, with Gamergate five years ago. As Wagner put it then,
What we have in Gamergate is a glimpse of how these skirmishes will unfold in the future—all the rhetorical weaponry and siegecraft of an internet comment section brought to bear on our culture, not just at the fringes but at the center.
How right he was. And yet it is all too apparent that, in the heart and at the heights of the New York Times, nothing of significance has been learned. How else to explain how, five years after Gamergate, and two years after “readers accuse(d) us of normalizing a Nazi sympathizer,” the NYT continues to treat exactly the same kind of bad-faith arguments as if they are meaningful, important, and valid? Most visibly with its most recent headline debacle, but that is only the tip of the wilfully ignorant iceberg.
In the aftermath of that headline incident, Dean Baquet, its executive editor, told CNN a remarkable thing: “Our role is not to be the leader of the resistance.” In other words, the publisher of this excellent recent Gamergate exegesis has learned nothing from it.
The NYT’s role should be to lead a resistance — not necessarily against any individual political party or figure, but a resistance of critical thinking, and searching analysis, against deceptive motte-and-bailey arguments. But they don’t seem willing to recognize that they are being manipulated by such bad-faith movements, much less accept that one of them has grown to occupy much of America’s political landscape. One wonders when the Gray Lady will finally open her eyes.
As we all know, Wyze uses a Linux kernel, so adding NFS (or SMB, or...) shouldn't be that difficult to do.
Well, the same author that found out how to use the Sensors without the need of a hub also has a way to enable NFS share support on the WyzeCam. This means that you can have your WyzeCam write directly to an NFS share, and not the microSD card, and from there, you could write the files to a cloud service of your choice, or do many things that weren't possible before.
Currently, I am doing a cron job to concat every hour's worth of data, then do some external processing on the footage.
For the info on how this was done, read
For the needed files, you need to clone (or...) https://github.com/HclX/WyzeHacks
If Wyze would just enable this from the start, and even offer different protocols like SMB, their cams would become so much more popular.
Dan Hon (previously at BB) noticed that Star Trek's meetings and conferences always involve military officers and usually occur with ample time for preparation, yet invariably have them just talking to one another. If there are any graphics involved, they are simple, concise and expressive.
This is of course nothing whatsoever like any military on earth or off it. So Hon decided to photoshop what such meetings would actually entail: PowerPoint, and lots of it.
Sorry not sorry. Bajoran Stability / Maquis Dynamics - GOVERNANCE
Here's "Overall Weekly Dominion Attack Trends for Stardate 51145.3 - 51247.5"
Overall Weekly Dominion Attack Trends
Stardate 51145.3 - 51247.5 pic.twitter.com/uL7jZWCyUS
— dan hon is back (@hondanhon) July 19, 2019
As reviewed by Lt. Cmdrs. Worf, Data, and LaForge, and Capt. Picard:
L-R: Worf, Data, Geordi and Picard review the latest overall weekly Dominion attack trends. pic.twitter.com/wACdfEC1vP
— dan hon is back (@hondanhon) July 19, 2019
Dave Rutledge, however, plays for the other team:
It's now much easier to ask for permission to fly drones in controlled airspace even if you're only doing it for fun. The FAA is giving recreational drone pilots access to the Low Altitude Authorization and Notification Capability (LAANC) system -- t...
NPR notes today's "supercomputer-driven" weather modelling can crunch huge amounts of data to accurately forecast the weather a week in advance -- pointing out that "a six-day weather forecast today is as good as a two-day forecast was in the 1970s." Here's some highlights from their interview with Andrew Blum, author of The Weather Machine: A Journey Inside the Forecast : One of the things that's happened as the scale in the system has shifted to the computers is that it's no longer bound by past experience. It's no longer, the meteorologists say, "Well, this happened in the past, we can expect it to happen again." We're more ready for these new extremes because we're not held down by past expectations... The models are really a kind of ongoing concern. ... They run ahead in time, and then every six hours or every 12 hours, they compare their own forecast with the latest observations. And so the models in reality are ... sort of dancing together, where the model makes a forecast and it's corrected slightly by the observations that are coming in... It's definitely run by individual nations -- but individual nations with their systems tied together... It's a 150-year-old system of governments collaborating with each other as a global public good... The positive example from last month was with Cyclone Fani in India. And this was a very similar storm to one 20 years ago, that tens of thousands of people had died. This time around, the forecast came far enough in advance and with enough confidence that the Indian government was able to move a million people out of the way.
Read more of this story at Slashdot.
Computer engineer George Hilliard says he has built an electronic business card running Linux. From his blog post: It is a complete, minimal ARM computer running my customized Linux firmware built with Buildroot. It has a USB port in the corner. If you plug it into a computer, it boots in about 6 seconds and shows up over USB as a flash drive and a virtual serial port that you can use to log into the card's shell. The flash drive has a README file, a copy of my resume, and some of my photography. The shell has several games and Unix classics such as fortune and rogue, a small 2048, and a small MicroPython interpreter. All this is accomplished on a very small 8MB flash chip. The bootloader fits in 256KB, the kernel is 1.6MB, and the whole root filesystem is 2.4MB. So, there's plenty of space for the virtual flash drive. It also includes a writable home directory, on the off chance that anyone creates something they want to keep. This is also saved on the flash chip, which is properly wear leveled with UBI. The whole thing costs under $3. It's cheap enough to give away. If you get one from me, I'm probably trying to impress you. In a detailed write-up, Hilliard goes on to explain how he came up with the design and assembled all the components. Naturally, there were some problems that arose during the construction that he had to troubleshoot: "first, the USB port wasn't long enough to reliably make contact in many USB ports. Less critically, the flash footprint was wrong, which I worked around by bending the leads under the part by hand..." Impressively, the total cost of the card (not including his time) was $2.88 -- "cheap enough that I don't feel bad giving it away, as designed!"
<Roosevelt> !choose do work, play games
<RoBoBo> Choice: do work
<Roosevelt> !choose listen to a stupid bot, don't listen to a stupid bot
<RoBoBo> Choice: listen to a stupid bot
Internet Archive founder Brewster Kahle created The Game of Oligarchy, which "shows that the 'free market' leads inexorably to one person getting all the money and everyone else going broke. And fast."
The game's rules are simple: everyone starts with $100 in play money; players take turns picking an opponent for a coin toss, with the winner taking 50% of the smaller of the two players' pots (if both have $100, the winner takes $50 from the loser).
Very quickly, the winners of the initial coin tosses wipe out the remaining players, and then each other, producing an outcome with a single winner with all the money. What's more interesting than the ability of small amounts of random chance to produce oligarchic outcomes is the psychological effect of playing the game: over the duration of the very short games, the winners arrive at a "feeling of righteous empowerment based on being successful" and players experience class divisions.
Kahle based his game on an article in Scientific American: "Is Inequality Inevitable? Wealth naturally trickles up in free-market economies, model suggests." Neal Krawetz has implemented the game so it can run automatically in browsers.
What is amazing is that even though each toss is “fair,” in that it is a 50-50 chance to win a straight amount of money, the results show one player winning all the money, and really quickly.
Two nephews and their partners, Mary, and I played 4 rounds in about an hour, and we discovered social classes (we called the broke ones “organ sellers”), a feeling of righteous empowerment based on being successful (even though it was completely random), but also that the “free market” ended with all but one of us in a bad situation really quickly.
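Kahle's rule set is simple enough to simulate. Here is a minimal Python sketch (my own, not Kahle's game or Krawetz's browser implementation) that assumes two solvent players are paired at random each round, with the coin-toss winner taking half of the smaller of the two pots:

```python
import random

def play_round(wealth):
    """One coin toss between two randomly chosen players with money.
    The winner takes 50% of the smaller of the two pots."""
    contenders = [i for i, w in enumerate(wealth) if w > 0]
    if len(contenders) < 2:
        return
    a, b = random.sample(contenders, 2)
    stake = min(wealth[a], wealth[b]) / 2
    if random.random() < 0.5:
        wealth[a] += stake
        wealth[b] -= stake
    else:
        wealth[b] += stake
        wealth[a] -= stake

def simulate(players=8, start=100, rounds=10_000):
    """Everyone starts with the same pot; total wealth is conserved."""
    wealth = [float(start)] * players
    for _ in range(rounds):
        play_round(wealth)
    return wealth

if __name__ == "__main__":
    random.seed(1)
    final = simulate()
    # Sorted pots; the top player typically ends up holding nearly everything.
    print(sorted(round(w, 2) for w in final))
```

Because the loser surrenders half of the smaller pot, each loss halves a poor player's wealth, so the gap compounds even though every individual toss is fair -- which is the mechanism behind the "trickle up" result in the Scientific American model.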