u/RunsWithApes explains the real reason the Rosa Parks biography was banned in Florida schools [Published articles]
Inaugural flight of Relativity's 3D-printed rocket Terran 1 launches and passes Max Q [Published articles]
Why You Should Opt Out of Sharing Data With Your Mobile Provider [Published articles]
A new breach involving data from nine million AT&T customers is a fresh reminder that your mobile provider likely collects and shares a great deal of information about where you go and what you do with your mobile device — unless and until you affirmatively opt out of this data collection. Here’s a primer on why you might want to do that, and how.
Telecommunications giant AT&T disclosed this month that a breach at a marketing vendor exposed certain account information for nine million customers. AT&T said the data exposed did not include sensitive information, such as credit card or Social Security numbers, or account passwords, but was limited to “Customer Proprietary Network Information” (CPNI), such as the number of lines on an account.
Certain questions may be coming to mind right now, like "What the heck is CPNI?" And, "If it's so 'customer proprietary,' why is AT&T sharing it with marketers?" Also maybe, "What can I do about it?" Read on for answers to all three questions.
AT&T’s disclosure said the information exposed included customer first name, wireless account number, wireless phone number and email address. In addition, a small percentage of customer records also exposed the rate plan name, past due amounts, monthly payment amounts and minutes used.
CPNI refers to customer-specific “metadata” about the account and account usage, and may include:
-Called phone numbers
-Time of calls
-Length of calls
-Cost and billing of calls
-Premium services, such as directory call assistance
According to a succinct CPNI explainer at TechTarget, CPNI is private and protected information that cannot be used for advertising or marketing directly.
“An individual’s CPNI can be shared with other telecommunications providers for network operating reasons,” wrote TechTarget’s Gavin Wright. “So, when the individual first signs up for phone service, this information is automatically shared by the phone provider to partner companies.”
Is your mobile Internet usage covered by CPNI laws? That’s less clear, as the CPNI rules were established before mobile phones and wireless Internet access were common. TechTarget’s CPNI primer explains:
“Under current U.S. law, cellphone use is only protected as CPNI when it is being used as a telephone. During this time, the company is acting as a telecommunications provider requiring CPNI rules. Internet use, websites visited, search history or apps used are not protected CPNI because the company is acting as an information services provider not subject to these laws.”
Hence, the carriers can share and sell this data because they’re not explicitly prohibited from doing so. All three major carriers say they take steps to anonymize the customer data they share, but researchers have shown it is not terribly difficult to de-anonymize supposedly anonymous web-browsing data.
“Your phone, and consequently your mobile provider, know a lot about you,” wrote Jack Morse for Mashable. “The places you go, apps you use, and the websites you visit potentially reveal all kinds of private information — e.g. religious beliefs, health conditions, travel plans, income level, and specific tastes in pornography. This should bother you.”
Happily, all of the U.S. carriers are required to offer customers ways to opt out of having data about how they use their devices shared with marketers. Here’s a look at some of the carrier-specific practices and opt-out options.
AT&T’s policy says it shares device or “ad ID”, combined with demographics including age range, gender, and ZIP code information with third parties which explicitly include advertisers, programmers, and networks, social media networks, analytics firms, ad networks and other similar companies that are involved in creating and delivering advertisements.
AT&T said the data exposed on 9 million customers was several years old, and mostly related to device upgrade eligibility. This may sound like the data went to just one of its partners who experienced a breach, but in all likelihood it also went to hundreds of AT&T’s partners.
According to the Electronic Privacy Information Center (EPIC), T-Mobile seems to be the only company out of the big three to extend to all customers the rights conferred by the California Consumer Privacy Act (CCPA).
EPIC says T-Mobile customer data sold to third parties uses another unique identifier called mobile advertising IDs or “MAIDs.” T-Mobile claims that MAIDs don’t directly identify consumers, but under the CCPA MAIDs are considered “personal information” that can be connected to IP addresses, mobile apps installed or used with the device, any video or content viewing information, and device activity and attributes.
T-Mobile customers can opt out by logging into their account and navigating to the profile page, then to “Privacy and Notifications.” From there, toggle off the options for “Use my data for analytics and reporting” and “Use my data to make ads more relevant to me.”
According to Wired.com’s tutorial, Verizon users can opt out by logging into their Verizon account through a web browser or the My Verizon mobile app. From there, select the Account tab, then click Account Settings and Privacy Settings on the web. For the mobile app, click the gear icon in the upper right corner and then Manage Privacy Settings.
On the privacy preferences page, web users can choose “Don’t use” under the Custom Experience section. On the My Verizon app, toggle any green sliders to the left.
EPIC notes that all three major carriers say resetting the consumer’s device ID and/or clearing cookies in the browser will similarly reset any opt-out preferences (i.e., the customer will need to opt out again), and that blocking cookies by default may also block the opt-out cookie from being set.
T-Mobile says its opt out is device-specific and/or browser-specific. “In most cases, your opt-out choice will apply only to the specific device or browser on which it was made. You may need to separately opt out from your other devices and browsers.”
Both AT&T and Verizon offer opt-in programs that gather and share far more information, including device location, the phone numbers you call, and which sites you visit using your mobile and/or home Internet connection. AT&T calls this their Enhanced Relevant Advertising Program; Verizon’s is called Custom Experience Plus.
In 2021, multiple media outlets reported that some Verizon customers were being automatically enrolled in Custom Experience Plus — even after those customers had already opted out of the same program under its previous name — “Verizon Selects.”
If none of the above opt out options work for you, at a minimum you should be able to opt out of CPNI sharing by calling your carrier, or by visiting one of their stores.
Why should you opt out of sharing CPNI data? For starters, some of the nation’s largest wireless carriers don’t have a great track record in terms of protecting the sensitive information that you give them solely for the purposes of becoming a customer — let alone the information they collect about your use of their services after that point.
In January 2023, T-Mobile disclosed that someone stole data on 37 million customer accounts, including customer name, billing address, email, phone number, date of birth, T-Mobile account number and plan details. In August 2021, T-Mobile acknowledged that hackers made off with the names, dates of birth, Social Security numbers and driver’s license/ID information on more than 40 million current, former or prospective customers who applied for credit with the company.
Last summer, a cybercriminal began selling the names, email addresses, phone numbers, SSNs and dates of birth on 23 million Americans. An exhaustive analysis of the data strongly suggested it all belonged to customers of one AT&T company or another. AT&T stopped short of saying the data wasn’t theirs, but said the records did not appear to have come from its systems and may be tied to a previous data incident at another company.
However frequently the carriers may alert consumers about CPNI breaches, it’s probably nowhere near often enough. Currently, the carriers are required to report a consumer CPNI breach only in cases “when a person, without authorization or exceeding authorization, has intentionally gained access to, used or disclosed CPNI.”
But that definition of breach was crafted eons ago, back when the primary way CPNI was exposed was through "pretexting," such as when the phone company's employees are tricked into giving away protected customer data.
In January, regulators at the U.S. Federal Communications Commission (FCC) proposed amending the definition of “breach” to include things like inadvertent disclosure — such as when companies expose CPNI data on a poorly-secured server in the cloud. The FCC is accepting public comments on the matter until March 24, 2023.
While it’s true that the leak of CPNI data does not involve sensitive information like Social Security or credit card numbers, one thing AT&T’s breach notice doesn’t mention is that CPNI data — such as balances and payments made — can be abused by fraudsters to make scam emails and text messages more believable when they’re trying to impersonate AT&T and phish AT&T customers.
The other problem with letting companies share or sell your CPNI data is that the wireless carriers can change their privacy policies at any time, and you are assumed to be okay with those changes as long as you keep using their services.
For example, location data from your wireless device is most definitely CPNI, and yet until very recently all of the major carriers sold their customers’ real-time location data to third party data brokers without customer consent.
What was their punishment? In 2020, the FCC proposed fines totaling $208 million against all of the major carriers for selling their customers’ real-time location data. If that sounds like a lot of money, consider that all of the major wireless providers reported tens of billions of dollars in revenue last year (e.g., Verizon’s consumer revenue alone was more than $100 billion last year).
If the United States had federal privacy laws that were at all consumer-friendly and relevant to today’s digital economy, this kind of data collection and sharing would always be opt-in by default. In such a world, the enormously profitable wireless industry would likely be forced to offer clear financial incentives to customers who choose to share this information.
But until that day arrives, understand that the carriers can change their data collection and sharing policies when it suits them. And regardless of whether you actually read any notices about changes to their privacy policies, you will have agreed to those changes as long as you continue using their service.
Celebrate Kurt Russell's Birthday With This Very Fun Escape From New York Fan Film [Published articles]
It’s March 17, and we all know what that means. It’s Kurt Russell's birthday! A group of John Carpenter super-fans have picked this occasion to release their fan film: Call Me Snake, a note-perfect homage to the director’s 1981 sci-fi classic Escape From New York, starring Russell as eye patch-wearing rebel Snake Plissken.
Call Me Snake, directed and written by Sean E. McCarthy, takes place in the early 21st century, a time when free-thinkers and hellraisers have fled the U.S., where vices are banned (smoking, drinking, gambling, swearing) and bigots have taken over, with gay people and anyone interested in religious freedom paying the price. In a lawless corner of the world—the appropriately named New Vegas, Thailand—an underground duel introduces its opponents: the wild-eyed “Kabuki Joe” (Davis Noir) and a certain mystery man (Matt Kohler) who’s “of few words, and a giver of even fewer fucks.” Guess who?
Yep, he’s a stone-cold badass in a fight, but he can barely contain his disgust and horror at being recognized by an admirer... or, egads, the sight of an umbrella in his cocktail. It’s worth noting that the fan film takes place in between the events of Escape From New York (which is set in 1997) and Escape From L.A. (which was released in 1996, but takes place in 2000), in case you’re wondering where it falls in Snake’s own dystopian timeline. As we find him, he’s not actively trying to escape a particular place for once, but he’s definitely still a guy who effortlessly makes enemies everywhere he goes.
Escape From New York—its tone, its storyline, its setting, its surly hero, its distinctive musical score—has been highly influential in the four decades since its release. (Let us speak not of the long-rumored, much-dreaded remake.) Call Me Snake nails the qualities that’ve made Escape such a beloved cult favorite among fans and filmmakers alike; the character with the cackling laugh is a nice touch, as are the knowingly cheesy moments sprinkled throughout. Hey, team Call Me Snake—can we get a Big Trouble in Little China homage in time for next March 17?
Want more io9 news? Check out when to expect the latest Marvel, Star Wars, and Star Trek releases, what’s next for the DC Universe on film and TV, and everything you need to know about the future of Doctor Who.
Cloning Dell charger chip [Published articles]
NASA’s Magellan Data Reveals Volcanic Activity on Venus [Published articles]
It took a TikToker barely 30 minutes to doxx me [Published articles]
In 30 minutes or less, TikToker and Chicago-based server Kristen Sotakoun can probably find your birth date. She’s not a cybersecurity expert, despite what some of her followers suspect, but has found a hobby in what she calls “consensual doxxing.”
“My first thing is to be entertaining. My second thing is to show you cracks in your social media, which was the totally accidental thing that I became on TikTok,” Sotakoun, who goes by @notkahnjunior, told me.
It’s not quite doxxing, which usually refers to making private information publicly available with malicious intent. Instead, it’s known in the cybersecurity field as open-source intelligence, or OSINT. People unknowingly spell out private details about their lives as a bread crumb trail across social media platforms that, when gathered together, paint a picture of their age, families, embarrassing childhood memories and more. In malicious cases, hackers gather information based on what you or your loved ones have published on the web to get into your accounts, commit fraud, or even socially engineer a user to fall for a scam.
Sotakoun mostly just tracks down an anonymous volunteer's birth date. She doesn’t have malicious intent or interest in a security career; she said she just likes to solve logic puzzles. Before TikTok, that meant spending a ride home from a friend’s birthday dinner at Medieval Times discovering the day job of their “knight.” Sotakoun just happened to eventually go viral for her skills.
So, to show me her process, I let Sotakoun “consensually doxx” me. She found my Twitter pretty quickly, but because I keep it pretty locked down, it wasn’t super helpful. Information in author bios from my past jobs, however, helped her figure out where I went to college.
My name plus where I studied led her to my Facebook account, another profile that didn’t reveal much. It did, however, lead her to my sister, who had commented on my cover photo nine years ago. She figured out it was my sister because we shared a last name, and we’re listed as sisters on her Facebook. That’s important to note because I don’t actually share a last name with most of my other siblings, which could’ve been an additional roadblock.
My sister and I have pretty common names though, so Sotakoun also found my stepmom on my sister’s profile. Searching my stepmom’s much more unique name on Instagram led Sotakoun to mine and my sister’s Instagram accounts, as opposed to one of the many other Malones online.
Still, my Instagram account is private. So, it was my sister’s Instagram account – that she took off “private” for a Wawa giveaway that ultimately won her a t-shirt – featuring years-old birthday posts that led Sotakoun to the day I was born. That took a ton of scrolling and, to correct for the fact that a birthday post could come a day late or early, Sotakoun relied on the fact that my sister once shared that my birthday coincided with World Penguin Day, April 25.
Then, to find the year, she cross-referenced the year I started college, which was 2016 according to my public LinkedIn, with information in my high school newspaper. My senior year of high school, I won a scholarship only available to seniors, Sotakoun discovered, revealing that I graduated high school in 2016. From there, she counted back 18 years, and told me that I was born on April 25, 1998. She was right.
“My goal is always to find context clues, or find people who care less about their online presence than you do,” Sotakoun said.
Many people will push back on the idea that having personal information online is harmful, according to Matt Edmondson, an OSINT instructor at cybersecurity training organization SANS Institute. While there are obvious repercussions to having your social security number blasted online, people may wonder what the harm is in seemingly trivial information like having your pet’s name easily available on social media. But if that also happens to be the answer to a security question, an attacker may be able to use that to get into your Twitter account or email.
In my case, I’ve always carefully tailored my digital footprint to keep my information hidden. My accounts are private and I don’t share a lot of personal information. Still, Sotakoun’s OSINT methods found plenty to work with.
Facebook and Instagram are Sotakoun’s biggest help for finding information, but she said she has also used Twitter, and even Venmo to confirm relationships. She specifically avoids resources like records databases that could easily give away information.
That means that there’s still a lot of data out there on each of us that Sotakoun isn’t looking for. Especially if you’re in the US, data like your date of birth, home address and more are likely already out there in some form, according to Steven Harris, an OSINT specialist that teaches at SANS.
“Once the data is out there, it’s very hard to take back,” Harris said. “What protects people is not that the information is securely locked away, it’s that most people don’t have the knowledge or inclination to go and find out.”
There are simple things you can do to keep attackers from using these details against you. Complex passwords and multi-factor authentication make it harder for unauthorized users to get into your account, even if they know the answers to your security questions.
That gets a bit more complicated, though, when we think about how much our friends and family post for us. In fact, Sotakoun said she noticed that even if a person takes many measures to hide themselves online, the lack of control over their social circle can help her discover their birth date.
“You have basically no control on your immediate social circle, or even your slightly extended social circle and how they present themselves online,” she said.
This article originally appeared on Engadget at https://www.engadget.com/it-took-a-tiktoker-barely-30-minutes-to-doxx-me-120022880.html?src=rss
No One Knows if You Need Another Covid Booster [Published articles]
It’s cellular immunity, not antibodies, that probably protects against the coronavirus’s worst effects—and scientists haven’t worked out how long it lasts.
Terrier: An Open-Source Tool for Identifying and Analyzing Container and Image Components [Published articles]
As part of our Blackhat Europe talk “Reverse Engineering and Exploiting Builds in the Cloud” we publicly released a new tool called Terrier.
In this blog post, I am going to show you how Terrier can help you identify and verify container and image components for a wide variety of use-cases, be it from a supply-chain perspective or forensics perspective. Terrier can be found on Github https://github.com/heroku/terrier.
In this blog post, I am not going to go into too much detail about containers and images (you can learn more here) however it is important to highlight a few characteristics of containers and images that make them interesting in terms of Terrier. Containers are run from images and currently the Open Containers Initiative (OCI) is the most popular format for images. The remainder of this blog post refers to OCI images as images.
Essentially, images are tar archives that contain multiple tar archives and meta-information representing the “layers” of an image. This OCI format makes images, and hence their analysis, relatively simple to work with. If you only had access to a terminal and the tar command, you could pretty much get what you need from the image’s tar archive.
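To see how little tooling this takes, here is a minimal Python sketch using only the standard library. It assumes an archive produced by "docker save" with a top-level manifest.json mapping the image to its nested layer archives; the exact layout can vary between tools and versions.

```python
import json
import tarfile

def list_layers(image_path):
    # A "docker save" archive is itself a tar file; manifest.json inside it
    # maps the image to the layer tar archives nested alongside it.
    with tarfile.open(image_path) as image:
        manifest = json.load(image.extractfile("manifest.json"))
        return manifest[0]["Layers"]
```

Pointing this at an exported image returns the relative paths of the layer archives, which is the starting point for any per-layer analysis.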
When images are utilised at runtime for a container, their contents become the contents of the running container and the layers are essentially extracted to a location on the container’s runtime host. The container runtime host is the host that is running and maintaining the containers. This location contains a few folders of interest, particularly the "merged" folder. The "merged" folder contains the contents of the image and any changes that have occurred in the container since its creation. For example, if the image contained a location such as /usr/chris/stuff and, after creating a container from this image, I created a file in /usr/chris/stuff, the resulting path would appear under the "merged" folder on the container runtime host.
Now that we have a brief understanding of images and containers, we can look at what Terrier does. Often it is the case that you would like to determine if an image or container contains a specific file. This requirement may be due to a forensic analysis need or to identify and prevent a certain supply-chain attack vector. Regardless of the requirement, having the ability to determine the presence of a specific file in an image or container is useful.
Terrier can be used to determine if a specific image contains a specific file. In order to do this, you need the following:
-The image exported as a tar archive
-The SHA256 hash of the file you would like to identify
The first point can be easily achieved with Docker by using the following command:
$ docker save imageid -o myImage.tar
The command above uses a Docker image ID which can be obtained using the following command:
$ docker images
Once you have your image exported as a tar archive, you will then need to establish the SHA256 hash of the particular file you would like to identify in the image. There are multiple ways to achieve this but in this example, we are going to use the hash of the Golang binary go1.13.4 linux/amd64, which can be obtained with the following command on a Linux host:
$ cat /usr/local/go/bin/go | sha256sum
The command above should result in the following SHA256 hash:
82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
Now that we have a hash, we can use it to determine if the Golang binary is in the image. To achieve this, we need to populate a configuration file for Terrier. Terrier makes use of YAML configuration files and below is our config file, which we save as cfg.yml:
mode: image
image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
The config file above has multiple entries which allow us to specify the mode that Terrier will operate in; in this case, we are working with an image file (tar archive), so the mode is image. The image file we are working with is myImage.tar and the hash we are looking to identify is 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd.
We are now ready to run Terrier; running it with the command shown below should result in output similar to the following:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[!] Found file '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759/usr/local/go/bin/go' with hash: 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
We have identified a file in layer 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759 that has the same SHA256 hash as the one we provided. We now have verification that the image “myImage.tar” contains a file with the SHA256 hash we provided.
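The search Terrier just performed can be approximated in a short Python sketch (not Terrier's actual implementation, just the same idea): walk every nested layer archive and hash each regular file. As before, this assumes a "docker save" style archive with layer tars nested inside it.

```python
import hashlib
import tarfile

def find_hash_in_image(image_path, target_sha256):
    """Yield 'layer/path' strings for files whose SHA256 matches."""
    with tarfile.open(image_path) as image:
        for member in image.getmembers():
            if not member.name.endswith(".tar"):
                continue  # only descend into nested layer archives
            with tarfile.open(fileobj=image.extractfile(member)) as layer:
                for entry in layer.getmembers():
                    if not entry.isfile():
                        continue
                    data = layer.extractfile(entry).read()
                    if hashlib.sha256(data).hexdigest() == target_sha256:
                        yield member.name + "/" + entry.name
```

Note that this reads each file fully into memory to hash it, which mirrors the in-memory behaviour (and memory limits) described for Terrier below.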
This example can be extended upon and you can instruct Terrier to search for multiple hashes. In this case, we are going to search for a malicious file. Recently a malicious Python library was identified in the wild and went by the name “Jeilyfish”. Terrier could be used to check if a Docker image of yours contained this malicious package. To do this, we can determine the SHA256 of one of the malicious Python files that contains the backdoor:
$ cat jeIlyfish-0.7.1/jeIlyfish/_jellyfish.py | sha256sum
cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c
We then update our Terrier config to include the hash calculated above.
mode: image
image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
  - hash: 'cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c'
We then run Terrier again and analyse the results:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[!] Found file '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759/usr/local/go/bin/go' with hash: 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
The results above indicate that our image did not contain the malicious Python package.
There is no limit on how many hashes you can search for, however it should be noted that Terrier performs all its actions in-memory for performance reasons, so you might hit certain limits if you do not have enough available memory.
Terrier can also be used to determine if a specific image contains a specific file at a specific location. This can be useful to ensure that an image is using a specific component, i.e. a binary, shared object or dependency. This can also be seen as “pinning” components: ensuring that your images are using specific components, i.e. a specific version of cURL.
In order to do this, you need the following:
-The image exported as a tar archive
-The path of the file you want to verify in the image
-The SHA256 hash of a trusted instance of that file
The first point can be easily achieved with Docker by using the following command:
$ docker save imageid -o myImage.tar
The command above utilises a Docker image id which can be obtained using the following command:
$ docker images
Once you have your image exported as a tar archive, you will need to determine the path of the file you would like to identify and verify in the image. For example, if we would like to ensure that our images are making use of a specific version of cURL, we can run the following commands in a container or some other environment that resembles the image.
$ which curl
/usr/bin/curl
We now have the path to cURL and can generate the SHA256 hash of this instance of cURL because, in this case, we trust it. We could also determine the hash by other means; for example, many binaries are released with a corresponding hash from the developer, which can be acquired from the developer’s website.
$ cat /usr/bin/curl | sha256sum
9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
With this information, we can now populate our config file for Terrier:
mode: image
image: myImage.tar
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
We’ve saved the above config as cfg.yml and when we run Terrier with this config, we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (1/1)
[!] All components were identified and verified: (1/1)
$ echo $?
0
The output above indicates that the file /usr/bin/curl was successfully identified and verified, meaning that the image contained a file at the location /usr/bin/curl and that the SHA256 of that file matched the hash we provided in the config. Terrier also makes use of return codes and if we analyse the return code from the output above, we can see that the value is 0, which indicates success. If Terrier cannot identify or verify all the provided files, a return code of 1 is returned, which indicates failure. The setting of return codes is particularly useful in testing environments or CI/CD environments.
We can also run Terrier with verbose mode enabled to get more information:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[!] Identified instance of '/usr/bin/curl' at: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560/usr/bin/curl
[!] Verified matching instance of '/usr/bin/curl' at: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560/usr/bin/curl with hash: 9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (1/1)
[!] All components were identified and verified: (1/1)
The output above provides more detailed information, such as which layer the cURL file was located in. If you want even more information, you can enable the veryveryverbose option in the config file, but beware: this produces a lot of output and grep will be your friend.
There is no limit on how many hashes you can specify for a file. This can be useful when you want to allow more than one version of a specific file, e.g. multiple versions of cURL. An example config with multiple hashes for a file might look like:
mode: image
image: myImage.tar
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
      - hash: 'aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545'
      - hash: '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759'
      - hash: 'd4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c'
The config above allows Terrier to verify whether the identified cURL instance matches one of the provided hashes. There is also no limit on the number of files Terrier can attempt to identify and verify.
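Each `hash` value in the config is simply the SHA-256 of the file's contents, so generating entries is a matter of running `sha256sum` against the files you trust. A quick sketch, hashing a throwaway file rather than a real binary like /usr/bin/curl:

```shell
# Create a stand-in file; in practice you would hash the trusted binary
# itself (e.g. /usr/bin/curl from a known-good system or image).
printf 'hello\n' > /tmp/example-binary

# The first field of sha256sum's output is the hash to paste into cfg.yml:
sha256sum /tmp/example-binary | awk '{print $1}'
# -> 5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03
```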
Terrier’s GitHub repo also contains a useful script called convertSHA.sh, which can be used to convert a list of SHA256 hashes and filenames into a Terrier config file. This is useful when converting the output from other tools into a Terrier-friendly format. For example, we could have a file with the following contents:
8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d ./bin/uname
6de8254cfd49543097ae946c303602ffd5899b2c88ec27cfcd86d786f95a1e92 ./bin/gzexe
74ff9700d623415bc866c013a1d8e898c2096ec4750adcb7cd0c853b4ce11c04 ./bin/wdctl
61c779de6f1b9220cdedd7dfee1fa4fb44a4777fff7bd48d12c21efb87009877 ./bin/dmesg
7bdde142dc5cb004ab82f55adba0c56fc78430a6f6b23afd33be491d4c7c238b ./bin/which
3ed46bd8b4d137cad2830974a78df8d6b1d28de491d7a23d305ad58742a07120 ./bin/mknod
e8ca998df296413624b2bcf92a31ee3b9852f7590f759cc4a8814d3e9046f1eb ./bin/mv
a91d40b349e2bccd3c5fe79664e70649ef0354b9f8bd4658f8c164f194b53d0f ./bin/chown
091abe52520c96a75cf7d4ff38796fc878cd62c3a75a3fd8161aa3df1e26bebd ./bin/uncompress
c5ebd611260a9057144fd1d7de48dbefc14e16240895cb896034ae05a94b5750 ./bin/echo
d4ba9ffb5f396a2584fec1ca878930b677196be21aee16ee6093eb9f0a93bf8f ./bin/df
5fb515ff832650b2a25aeb9c21f881ca2fa486900e736dfa727a5442a6de83e5 ./bin/tar
6936c9aa8e17781410f286bb1cbc35b5548ea4e7604c1379dc8e159d91a0193d ./bin/zforce
8d641329ea7f93b1caf031b70e2a0a3288c49a55c18d8ba86cc534eaa166ec2e ./bin/gzip
0c1a1f53763ab668fb085327cdd298b4a0c1bf2f0b51b912aa7bc15392cd09e7 ./bin/su
20c358f7ee877a3fd2138ecce98fada08354810b3e9a0e849631851f92d09cc4 ./bin/bzexe
01764d96697b060b2a449769073b7cf2df61b5cb604937e39dd7a47017e92ee0 ./bin/znew
0d1a106dc28c3c41b181d3ba2fc52086ede4e706153e22879e60e7663d2f6aad ./bin/login
fb130bda68f6a56e2c2edc3f7d5b805fd9dcfbcc26fb123a693b516a83cfb141 ./bin/dir
0e7ca63849eebc9ea476ea1fefab05e60b0ac8066f73c7d58e8ff607c941f212 ./bin/bzmore
14dc8106ec64c9e2a7c9430e1d0bef170aaad0f5f7f683c1c1810b466cdf5079 ./bin/zless
9cf4cda0f73875032436f7d5c457271f235e59c968c1c101d19fc7bf137e6e37 ./bin/chmod
c5f12f157b605b1141e6f97796732247a26150a0a019328d69095e9760b42e38 ./bin/sleep
b9711301d3ab42575597d8a1c015f49fddba9a7ea9934e11d38b9ff5248503a8 ./bin/zfgrep
0b2840eaf05bb6802400cc5fa793e8c7e58d6198334171c694a67417c687ffc7 ./bin/stty
d9393d0eca1de788628ad0961b74ec7a648709b24423371b208ae525f60bbdad ./bin/bunzip2
d2a56d64199e674454d2132679c0883779d43568cd4c04c14d0ea0e1307334cf ./bin/mkdir
1c48ade64b96409e6773d2c5c771f3b3c5acec65a15980d8dca6b1efd3f95969 ./bin/cat
09198e56abd1037352418279eb51898ab71cc733642b50bcf69d8a723602841e ./bin/true
97f3993ead63a1ce0f6a48cda92d6655ffe210242fe057b8803506b57c99b7bc ./bin/zdiff
0d06f9724af41b13cdacea133530b9129a48450230feef9632d53d5bbb837c8c ./bin/ls
da2da96324108bbe297a75e8ebfcb2400959bffcdaa4c88b797c4d0ce0c94c50 ./bin/zegrep
The file contents above are trusted SHA256 hashes for specific files. If we would like to use this list to ensure that a particular image makes use of the files listed above, we can do the following:
$ ./convertSHA.sh trustedhashes.txt terrier.yml
The script above takes the input file trustedhashes.txt, which contains our trusted hashes listed above, and converts it into a Terrier-friendly config file, terrier.yml, which looks like the following:
mode: image
image: myImage.tar
files:
  - name: '/bin/uname'
    hashes:
      - hash: '8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d'
  - name: '/bin/gzexe'
    hashes:
      - hash: '6de8254cfd49543097ae946c303602ffd5899b2c88ec27cfcd86d786f95a1e92'
  - name: '/bin/wdctl'
    hashes:
      - hash: '74ff9700d623415bc866c013a1d8e898c2096ec4750adcb7cd0c853b4ce11c04'
  - name: '/bin/dmesg'
    hashes:
      - hash: '61c779de6f1b9220cdedd7dfee1fa4fb44a4777fff7bd48d12c21efb87009877'
  - name: '/bin/which'
    hashes:
      - hash: '7bdde142dc5cb004ab82f55adba0c56fc78430a6f6b23afd33be491d4c7c238b'
  - name: '/bin/mknod'
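For illustration, the core of that hashes-to-YAML transformation can be sketched in a few lines of awk. This is an assumption for illustration only; the real convertSHA.sh may work differently:

```shell
# One sample "HASH ./path" line standing in for trustedhashes.txt:
printf '%s\n' '8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d ./bin/uname' > /tmp/trustedhashes.txt

# Emit a Terrier-style config: strip the leading "." from each path and
# nest each hash under its file entry.
awk 'BEGIN { print "mode: image"; print "image: myImage.tar"; print "files:" }
     { sub(/^\./, "", $2)
       printf "  - name: '\''%s'\''\n    hashes:\n      - hash: '\''%s'\''\n", $2, $1 }' \
    /tmp/trustedhashes.txt > /tmp/terrier-sketch.yml

cat /tmp/terrier-sketch.yml
```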
The config file terrier.yml is ready to be used:
$ ./terrier -cfg=terrier.yml
[+] Loading config: terrier.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] Not all components were identifed: (4/31)
[!] Component not identified: /bin/uncompress
[!] Component not identified: /bin/bzexe
[!] Component not identified: /bin/bzmore
[!] Component not identified: /bin/bunzip2
$ echo $?
1
As we can see from the output above, Terrier was unable to identify 4 of the 31 components provided in the config. The return code is also 1, which indicates a failure. If we were to remove the components that are not in the provided image, the output from the previous command would look like the following:
$ ./terrier -cfg=terrier.yml
[+] Loading config: terrier.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (27/27)
[!] Not all components were verified: (26/27)
[!] Component not verified: /bin/cat
[!] Component not verified: /bin/chmod
[!] Component not verified: /bin/chown
[!] Component not verified: /bin/df
[!] Component not verified: /bin/dir
[!] Component not verified: /bin/dmesg
[!] Component not verified: /bin/echo
[!] Component not verified: /bin/gzexe
[!] Component not verified: /bin/gzip
[!] Component not verified: /bin/login
[!] Component not verified: /bin/ls
[!] Component not verified: /bin/mkdir
[!] Component not verified: /bin/mknod
[!] Component not verified: /bin/mv
[!] Component not verified: /bin/sleep
[!] Component not verified: /bin/stty
[!] Component not verified: /bin/su
[!] Component not verified: /bin/tar
[!] Component not verified: /bin/true
[!] Component not verified: /bin/uname
[!] Component not verified: /bin/wdctl
[!] Component not verified: /bin/zdiff
[!] Component not verified: /bin/zfgrep
[!] Component not verified: /bin/zforce
[!] Component not verified: /bin/zless
[!] Component not verified: /bin/znew
$ echo $?
1
The output above indicates that Terrier was able to identify all the components provided, but many could not be verified because their hashes did not match; once again, the return code is 1 to indicate this failure.
The previous sections focused on identifying files in images, which can be referred to as a form of “static analysis”; however, it is also possible to perform this analysis on running containers. In order to do this, you need the location of the container’s merged folder on the underlying host. The merged folder is Docker specific; we use it here because it is where the contents of the running container reside, and the equivalent location may differ for other container runtimes. The location of the container’s merged folder can be determined by running the following commands. First, obtain the container’s ID:
$ docker ps
CONTAINER ID   IMAGE    COMMAND   CREATED        STATUS        PORTS   NAMES
b9e676fd7b09   golang   "bash"    20 hours ago   Up 20 hours           cocky_robinson
Once you have the container’s ID, you can run the following command, which will help you identify the location of the merged folder on the underlying host:
$ docker exec b9e676fd7b09 mount | grep diff
overlay on / type overlay (rw,relatime,
lowerdir=/var/lib/docker/overlay2/l/7ZDEFE6PX4C3I3LGIGGI5MWQD4:/var/lib/docker/overlay2/l/EZNIFFIXOVO2GIT5PTBI754HC4:/var/lib/docker/overlay2/l/UWKXP76FVZULHGRKZMVYJHY5IK:/var/lib/docker/overlay2/l/DTQQUTRXU4ZLLQTMACWMJYNRTH:/var/lib/docker/overlay2/l/R6DE2RY63EJABTON6HVSFRFICC:/var/lib/docker/overlay2/l/U4JNTFLQEKMFHVEQJ5BQDLL7NO:/var/lib/docker/overlay2/l/FEBURQY25XGHJNPSFY5EEPCFKA:/var/lib/docker/overlay2/l/ICNMAZ44JY5WZQTFMYY4VV6OOZ,
upperdir=/var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/diff,
workdir=/var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/work)
From the results above, we are interested in two entries, upperdir and workdir, because these two entries provide us with the path to the container’s merged folder. From the results above, we can determine that the container’s merged folder is located at /var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/merged on the underlying host.
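If parsing mount output feels fiddly, the merged path can usually be obtained directly from `docker inspect` when the overlay2 storage driver is in use, e.g. `docker inspect -f '{{ .GraphDriver.Data.MergedDir }}' b9e676fd7b09`. A small self-contained sketch of pulling the same value out of raw inspect-style JSON (the helper name and sample path are illustrative stand-ins):

```shell
# Extract the "MergedDir" value from docker-inspect-style JSON on stdin.
extract_merged_dir() {
  sed -n 's/.*"MergedDir": *"\([^"]*\)".*/\1/p'
}

# Sample fragment, standing in for real `docker inspect` output:
printf '%s\n' '"MergedDir": "/var/lib/docker/overlay2/abc123/merged",' | extract_merged_dir
# -> /var/lib/docker/overlay2/abc123/merged
```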
Now that we have the location, we need some files to identify and in this case, we are going to reuse the SHA256 hashes from the previous section. Let’s now go ahead and populate our Terrier configuration with this new information.
mode: container
path: merged
#image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
  - hash: 'cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c'
The configuration above shows that we have changed the mode to container and added the path to our merged folder. We have kept the two hashes from the previous section. If we run Terrier with this configuration from the directory that contains the merged folder, we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Container
[!] Found matching instance of '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd' at: merged/usr/local/go/bin/go with hash:82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
From the output above, we know that the container (ID b9e676fd7b09) does not contain the malicious Python package, but it does contain an instance of the Golang binary, which is located at /usr/local/go/bin/go in the container.
And as you might have guessed, Terrier can also be used to verify and identify files at specific paths in containers. To do this, we again need the location of the container’s merged folder and the trusted hashes for the files in question, which can be determined using the same procedures described in the previous sections. Below is an example Terrier config file that we could use to identify and verify components in a running container:
mode: container
path: merged
verbose: true
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
  - name: '/usr/local/go/bin/go'
    hashes:
      - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
If we run Terrier with the above config, we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Container
[!] Found matching instance of '/usr/bin/curl' at: merged/usr/bin/curl with hash:9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
[!] Found matching instance of '/usr/local/go/bin/go' at: merged/usr/local/go/bin/go with hash:82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[!] All components were identified: (2/2)
[!] All components were identified and verified: (2/2)
$ echo $?
0
From the output above, we can see that Terrier was able to
successfully identify and verify all the files in the running
container. The return code is also
0 which indicates a
successful execution of Terrier.
In addition to being used as a standalone CLI tool, Terrier can also be integrated easily with existing CI/CD technologies such as GitHub Actions and CircleCI. Below are two example configurations that show how Terrier can be used to identify and verify certain components of Docker images in a pipeline and prevent the pipeline from continuing if any verification fails. This can be seen as an extra mitigation against supply-chain attacks.
Below is a CircleCI example configuration using Terrier to verify the contents of an image.
version: 2
jobs:
  build:
    machine: true
    steps:
      - checkout
      - run:
          name: Build Docker Image
          command: |
            docker build -t builditall .
      - run:
          name: Save Docker Image Locally
          command: |
            docker save builditall -o builditall.tar
      - run:
          name: Verify Docker Image Binaries
          command: |
            ./terrier
Below is a GitHub Actions example configuration using Terrier to verify the contents of an image.
name: Go
on: [push]
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - name: Get Code
        uses: actions/checkout@master
      - name: Build Docker Image
        run: |
          docker build -t builditall .
      - name: Save Docker Image Locally
        run: |
          docker save builditall -o builditall.tar
      - name: Verify Docker Image Binaries
        run: |
          ./terrier
In this blog post, we have looked at how to perform multiple actions on Docker (and OCI) images and containers via Terrier. These actions allowed us to identify specific files according to their hashes, and to identify and verify multiple components in images and containers, which is useful when attempting to prevent certain supply-chain attacks.
We have also seen how Terrier can be used in a DevOps pipeline via GitHub Actions and CircleCI.
Learn more about Terrier on GitHub at https://github.com/heroku/terrier.
Daniel Lange: Fixing the Nextcloud menu to show more than eight application icons [Published articles]
I have been late to adopt an on-premise cloud solution as the security of Owncloud a few years ago wasn't so stellar (cf. my comment from 2013 in Encryption files ... for synchronization across the Internet). But the follow-up product Nextcloud has matured quite nicely and we use it for collaboration both in the company and in FLOSS related work at multiple nonprofit organizations.
There is a very annoying "feature" in Nextcloud though: the designers think menu items for apps at the top need to be limited to eight or fewer to prevent information overload in the header. The whole issue discussion is worth reading as it is an archetypical example of design prevalence vs. user choice.
And of course designers think they are right. That's a feature
of the trade.
And because they know better there is no user configurable option to extend those 8 items to maybe 12 or so, which would prevent the annoying overflow menu we are seeing with 10 applications in use:
Luckily code can be changed and there are many comments floating around the Internet on how to change the constant minAppsDesktop = 8. In the shipped release it is in slightly compressed form (aka "minified") in core/js/dist/main.js and you probably don't want to build the whole beast locally to change one constant.
The source gets compressed during build time to become part of one 15,000+ character line, and the relevant portion contains the minAppsDesktop constant. Well, we can still patch that, can't we?

Continue reading "Fixing the Nextcloud menu to show more than eight application icons"
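A rough sketch of the patch idea, assuming the minified token really is `minAppsDesktop=8` in your build (check the file first, as the exact token and value may differ between Nextcloud versions). The file below is a throwaway stand-in for core/js/dist/main.js:

```shell
# Stand-in for the minified core/js/dist/main.js:
printf 'foo,this.minAppsDesktop=8,bar\n' > /tmp/main.js

# Bump the app-menu limit from 8 to 12 in place (GNU sed syntax):
sed -i 's/minAppsDesktop=8/minAppsDesktop=12/' /tmp/main.js

cat /tmp/main.js
```

Keep in mind that any Nextcloud upgrade will regenerate main.js, so the patch has to be reapplied after each update.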
Daily 11 minute brisk walk enough to reduce risk of early death [Published articles]
One in ten early deaths could be prevented if everyone managed at least half the recommended level of physical activity, says a team of researchers. The researchers say that 11 minutes a day (75 minutes a week) of moderate-intensity physical activity -- such as a brisk walk -- would be sufficient to lower the risk of diseases such as heart disease, stroke and a number of cancers.
Stuck at home? Entertain or educate yourself for free [Published articles]
It used to be that being a couch potato was almost universally deemed a negative—but it’s funny how it only takes a contagious epidemic to turn the normal state of things on its head. Fortunately, nobody with a computer need be without ways to occupy their time.
Publishers, studios, and other media agencies are providing free offerings to give people plenty to do to ride out the corona lockdowns—as well as tools to assist self-education or learning at home. Here are a few of them I’ve noticed.
Educational/children’s book publisher Scholastic is offering a free 20-day learn-at-home program for grades K-9 via its web site—very handy for those in areas whose schools have closed down.
Would your children like to learn more about whales? Seattle-based research institute Oceans Initiative has launched a free Virtual Marine Biology Camp to teach school-closed children more about aquatic life. They’re holding live sessions every Monday and Thursday at 11 a.m. Pacific (2 p.m. Eastern) to help give those out-of-school children something educational to do.
Audiobook publisher and Amazon subsidiary Audible.com is making hundreds of audiobook titles available for free for the duration of school closures, via stories.audible.com.
NPR, the Sarasota Herald-Tribune, and CNET, among others, have articles collecting a lot of other free entertainment and education sources that weren’t free before the Corona quarantines. (Indeed, all you need do is google “coronavirus free entertainment” to find all the others who had the same idea.) But there are also still plenty of things that were already free and still are.
Baen’s Free Library is, of course, still just as free as it ever was. If you’re a member of a compatible public library, Hoopla Digital will let you borrow a limited number of ebooks, audiobooks, albums, movies, or TV episodes per month for free. And you still have access to Project Gutenberg, Librivox for audiobooks, Archive.org for all sorts of content, and all the other public-domain sites out there.
Online academic database JSTOR has over 6,000 ebooks and 150 journals that are available to the general public, and could also help to fill the education gap with schools closed down.
If you’re looking for something interesting to watch, Open Culture has links to over 200 free documentary films online, on subjects as diverse as Hayao Miyazaki and M.C. Escher. The site also includes links to free ebooks, audiobooks, online courses, and textbooks.
If you’re into anime, most of Crunchyroll‘s anime titles are available to watch for free (save for the very newest episode). Resolution may be limited, and you may have to put up with advertisements—but free is free, right? Pluto TV has over 250 channels of free video content, too, with mobile apps for iOS and Android available. And YouTube has its usual countless hundreds of thousands of hours of enjoyable ways to entertain or improve yourself, including its “Learning” category.
If you’re more into computer games, you could check out the Homecoming City of Heroes servers. Coming up on a full year since the game originally returned, it has thousands of players once again enjoying life in the early-2000s superhero MMO. (I play primarily on the Torchbearer shard, myself, and am always happy to help out new or returning players.)
There are many more free education or entertainment resources than I could even list, and there will doubtless be more the longer this lockdown goes on. How about adding your favorites in the comments?
Photo by Eric Antunes on Pexels.com
If you found this post worth reading and want to kick in a buck or two to the author, click here.
'How to build a Nintendo Switch' for coronavirus #StayAtHome gaming [Published articles]
Wow, man. Some of us take on more extreme projects during the
Great Coronavirus Quarantine than others.
This ambitious fellow shows you how to build a Nintendo Switch, with a beautiful and wholesome purpose: “to Starve Online Price Gougers” who are jacking up the prices because demand is high for Nintendo Switch, and availability is nil.
Here's their introduction to the HOWTO gallery, which is amazing and stupendous.
After playing New Horizons and hyping it up to my friends, they decided they wanted a Switch. They called around to different retailers every day for a week with no luck finding anyone who had one in stock. No one knew when the next shipment would be. This led to an online search like Craigslist, OfferUp, and Ebay.
Unfortunately everyone knows the rest. Upwards of $450 to $600 in the Seattle area for a used Switch. Some with and without all the accessories. This enraged me to the point of telling them I could build one cheaper out of spare parts. So they hired me to do just that. If anyone is interested in doing the same here is my step by step buying guide along with assembly instructions and a pricing guide.
1. Game Cartridge Card Slot Socket Board w/Headphones Port - $15
2. NS Console Micro SD TF Memory Card Slot Port Socket Reader - $5
3. Nintendo Switch HAC-001 CPU Cooling Heatsink - $7
4. Game Cartridge Card Plastic Cover - $1
5. Console Speaker Replacement Parts For Nintendo Switch Built in speaker - $8
6. Wifi Antenna Connecting Cable (Short) $2
7. Wifi Antenna Connecting Cable (Long) $2
8. Internal Cooling Fan - $3
9. Power & Volume Button control flex cable (w/ buttons and rubber conductor) - $4
10. Side Slider Sliding Rail Flex Cable (Left) - $3
11. Side Slider Sliding Rail Flex Cable (Right) - $3
12. Replacement Top Housing Shell Case Face plate -$6
13. Nintendo Switch Console Replacement Battery (New) - $15
14. Replacement Bottom Housing Shell Transparent Case Face plate -$5
15. Touch Screen Digitizer Adhesive - $0.50
16. Touch Screen Digitizer - $9
17. LCD Display Screen Replacement - $12
18. Shield Plate - $2
19. Iron Middle Frame - $6
20. (Not Pictured Here) - 100% WORKING OEM NINTENDO SWITCH REPLACEMENT LOGIC BOARD MOTHERBOARD - $95
21. (Not Pictured Here) - Full Screw Replacement Set - $2
22. (Not Pictured Here) - (Removal of Copper Sicker on CPU)
Grand Total For Used Parts Build: = $199
Ebay Average Price Jan 2020: = (between $175 and $225)
Ebay Average Price April 2020: = (between $300 and $400)
I am sure I made mistakes in this post so feel free to correct me if I am wrong about anything.
And screw you if you are one of the bad guys making a buck off of a crisis.
Here you go...
How to Build A Nintendo Switch to Starve Online Price Gougers
My list of underrated movie scores and themes... Part 1 [Published articles]
Been putting this together for a while... more to come.
In no particular order, though grouped by composer.
To be clear, I'm in no way saying these are unknown themes or not loved. In my limited experience, they just don't get the same acclaim as some more well-known scores, and I feel they deserve recognition! These are just pieces of music uncannily suited to their films, and work perfectly in the movie while also standing alone as wonderful pieces of music.
And while I haven't completely steered away from the John Williams' and Jerry Goldsmiths of the world, I have tried to include slightly more off-kilter selections that are truly fantastic.
Klendathu Drop - Starship Troopers
Robocop Theme - Robocop
Riddle of Steel & Riders of Doom - Conan the Barbarian
Love Theme - Cinema Paradiso
Complete Score - The Thing
Ecstasy of Gold - The Good, The Bad, and The Ugly
Going The Distance & The Final Bell - Rocky
Main Theme - The Right Stuff
Main Theme - Capricorn One
Main Theme - Gremlins II (and Gremlins... just a great
performance of it)
Main Title - Planet of the Apes
The Enterprise - Star Trek: The Motion Picture
Erich Wolfgang Korngold
Main Title - Kings Row (also... the inspiration for Star Wars)
Main Title - Reunion - The Sea Hawk
Main Theme - Seven Years in Tibet (one of his best)
Main Theme - Born on the Fourth of July
With Malice Towards None - Lincoln
Main Theme - Predator
Main Theme - Contact (Maybe my fav on the list... I'm a sucker
for sentimentality... Sue me)
Captain America March - Captain America: The First Avenger
Junkie Xl - Mad Max: Fury Road
Daft Punk - Tron Legacy
James Horner - Commando
An Acceptance, in rough times [Published articles]
Some Countries Reopened Schools. What Did They Learn About Kids and Covid? [Published articles]
Studies from around the world suggest that success depends on class size, distancing, the age of the students, and how prevalent the virus is locally.
Sandro Tosi: Multiple git configurations depending on the repository path [Published articles]
For my work on Debian, i want to use my debian.org email address, while for my personal projects i want to use my gmail.com address.
One way to change the user.email git config value is to run git config --local in every repo, but that's tedious, error-prone and doesn't scale very well with many repositories (and the chances of forgetting to set the right one on a new repo are ~100%).
The solution is to use the git-config ability to include extra configuration files, based on the repo path, by using includeIf:
Content of ~/.gitconfig:

[user]
    name = Sandro Tosi
    email = <personal.address>@gmail.com

[includeIf "gitdir:~/deb/"]
    path = ~/.gitconfig-deb
Every time the git path is in ~/deb/ (which is where i have all Debian repos) the file ~/.gitconfig-deb will be included; its content:
[user]
    email = firstname.lastname@example.org

That results in my personal address being used on all repos not part of Debian, where i use my Debian email address. This approach can be extended to every other git configuration value.
🧟♂️ ZOMBIES4TEST " Modpack " 🧟♀️ [Published articles]
"ZOMBIES4TEST", a very simple modpack which adds some types of zombies to your world and some other very simple things; I hope you like it.
What are some albums that have ZERO bad songs? [Published articles]
submitted by /u/JJacobb- to r/AskReddit
Absolutely LOVED this scene as a kid…..still do. _ Dragonslayer (1981) [Published articles]
|submitted by /u/WildDog3000 to
What's a story where the "bad guys" are actually, completely, 100% right, to the point where it's weird the story keeps calling them the bad guys? [Published articles]
submitted by /u/HallZac99 to
I just helped discover the second closest black hole to Earth!!! [Published articles]
Paper here, with yours truly as 3rd author! (Note: preprint, we still have to undergo peer review)
TL; DR: new black hole ~3800 light years from us, spotted via a star it's in orbit with!
Now first thing to clarify is, this is truly the lead author's discovery, Kareem El-Badry, who is an amazing astronomer. What he's been doing is going into the Gaia catalog (which carefully tracks the precise movement of billions of sources) and being great at finding "needle in a haystack" type things. In this case, the thing was a red giant star, about the same mass as our sun, orbiting an unseen companion that we've concluded must be a black hole, named Gaia BH2.
How do you do this? Well as you might recall, orbital mechanics state that if you have two objects in space gravitationally bound, they will orbit a common center of mass. When this happens, you'll see the objects "wobble" in their movement back and forth over the course of their mutual orbit (which is how we find many exoplanets, in fact!) What Kareem did, strictly speaking, was find a star with a weird "wobble" in the data... and that "wobble" indicated the star's orbit was in a period of P= 1277 days, and the companion it was orbiting would be a compact object ~9x the mass of the sun.
Now, a star 9x the mass of the sun would be stupid bright, and very obvious because this visible star is pretty bright on its own (12th magnitude). Definitely nothing there in follow-up observations, so it's not a star. So basically at this point, the argument is "if only we knew of something that was very massive, so massive light doesn't escape it... oh yeah, a black hole!"
Now the trick is some black holes do emit at low levels, thanks to accreting dust onto them- this happens in closer star- black hole pairs, called X-ray binaries. This emission is basically created as particles get close to the event horizon of the black hole, "feeding" it, and how we can spot them usually in radio and X-rays. And, well, we know this star pretty well because we can see it, and every star will have some amount of particles coming off of it in a stellar wind (like the sun does, and how we get the aurora), which is pretty well understood for stars of this type. So then the question is- is Gaia BH2 emitting at any wavelength?
Now this is where I come in, in my role of someone who knows a thing or two about how to get radio observations of weird black holes. :) Kareem is in my institute and came in to tell me about this object a few months ago, and that he'd discovered the closest approach in its ~3.5 year orbit was happening this month! (Yes, that's a bit of luck- in science it's good to be lucky sometimes!) So if you want to detect particles interacting with the black hole, your best chance of seeing it is basically now. Also, it was a very southern hemisphere object, so not just any telescope can look at it.
So, what I did was file for emergency time to use the MeerKAT telescope in South Africa, the best telescope on Earth to do this observation, asking for a several-hour observation of Gaia BH2. Luckily, they agreed and granted the time, so we took a look a few weeks ago! (And I have now officially hung up my shingle as a "black hole consultant" btw- my rates are very reasonable! :) )
Now, the bad news is, we did not detect any radio emission from Gaia BH2 (nor did the Chandra X-ray telescope.) You can see the details in Figure 10 of the paper linked at top. But the good news is this is actually massively helpful, because there is so much we don't understand about black holes! For example, how does this accretion process work for emission from black holes? Our data is good enough that we can say most of those stellar wind particles never reach the event horizon- maybe there are strong winds blowing them away, or similar. Not as exciting as a detection, but still really useful!
Anyway, moving on from that, Gaia BH2 is exciting because as the name implies, it's the second such Gaia black hole- the first being Gaia BH1. This discovery happened a few months ago (press release if you missed it then), and that one happens to be the closest black hole to Earth that we know of (and why Gaia BH2 is second- this one has the largest orbit known for a black hole though). This is super exciting because it now implies that these black holes in orbits are actually rather common in space- more common than ones where the black hole and star are closer at this rate!- and the trouble is detecting them. (It's also not clear how they form, so some nice work for theorists to do.) Well, for now- the good news is Gaia is still taking data, and its next data release (in ~2026) will have a lot more of these stars with mystery black hole companions in it! So, guess there will be a lot more to do!
R.I.P. Gerald Fried, Emmy-winning composer of Star Trek's famous "fight music" [Published articles]
Gerald Fried has died. A veteran musician and long-time Hollywood composer, Fried contributed work to dozens of films, as well as some of the defining TV shows of the 1960s and ’70s and beyond, including scores for Roots, Star Trek, and Gilligan’s Island. Although his work was sometimes overshadowed by the frequently …
500-Year-Old Leonardo Da Vinci Sketches Show Him Grappling With Gravity [Published articles]
An anonymous reader quotes a report from Gizmodo: A team of engineers studying the 500-year-old, backward writings of Leonardo da Vinci has found evidence that the Italian polymath was working out gravity a century before its foundations were established by Galileo Galilei. The team's findings come from a revisit of the Codex Arundel, a compilation of documents written by da Vinci that detail various experiments and personal notes taken down in the last 40 years of his life. The codex is freely accessible online courtesy of the British Library. The team's research is published in the MIT Press journal Leonardo. Mory Gharib, an engineer at Caltech, said he stumbled across the writings in 2017 while looking for some of da Vinci's work on flow in hearts. Though the codex was written over a long span of da Vinci's later years, Gharib suspects the gravitational musings were written sometime in the last 15 or so years of his life. Gharib recruited co-author Flavio Noca, a researcher at the University of Applied Sciences and Arts Western Switzerland, to translate the Italian's backward writing on the subject. Da Vinci understood some fundamentals of objects in motion. He wanted to run an experiment testing how the motion of a cloud would correspond to the hail it produced: whether the cloud's velocity, and any changes to it, corresponded with the falling hail's velocity. In lieu of control of the weather, da Vinci substituted a pitcher for the cloud and sand or water for the hail. Reliable clocks weren't available until about 140 years after da Vinci's death in 1519, the researchers note, so the inventor was forced to substitute the constant of time with space: by assuming that the time it took each water/sand particle to fall from the pitcher was constant, he just kept the pitcher at the same height throughout the tests. Da Vinci's sketch shows the positions of the falling material over the course of its trajectory toward the ground.
By drawing a line through the position of the material at each instance in time, da Vinci realized that a triangle could be formed, with the drawn line being the hypotenuse. By changing the acceleration of the pitcher over the course of the experiment, one would change the shape of the triangle. Leonardo knew that the falling material would accelerate and that the acceleration is downward. What he wasn't wholly certain of -- hence the experiment -- was the relationship between the falling material's acceleration and the pitcher's acceleration. In one particular case, when the pitcher's motion was accelerated to the same rate as the falling material being affected by gravity, an equilateral triangle was formed. Literally, as da Vinci noted, an "Equatione di Moti" or an "equalization of motions." The researchers modeled da Vinci's experiment and found that the polymath was wrong in his understanding of the relationship between the falling object and time. "What we saw is that Leonardo wrestled with this, but he modeled it as the falling object's distance was proportional to 2 to the t power [with t representing time] instead of proportional to t squared," said Chris Roh, a researcher at Cornell University and a co-author of the research, in a Caltech release. "It's wrong, but we later found out that he used this sort of wrong equation in the correct way." The team interpreted tick marks on da Vinci's sketches as data points the polymath made based on his eyeballing of the experiment in action. In lieu of a timepiece, da Vinci found the gravitational constant to nearly 98% accuracy.
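It is easy to check numerically why da Vinci's wrong model could still be "used in the correct way." A minimal sketch (the proportionality constants here are arbitrary choices for illustration, not from the paper) compares the correct law, distance proportional to t squared, with da Vinci's exponential model:

```python
# Compare the correct law of falling bodies, d proportional to t^2,
# with da Vinci's (incorrect) model, d proportional to 2^t.
# Constants are set to 1; only the shapes of the curves matter.

def galileo(t):
    """Distance fallen under constant acceleration: d = t^2 (arbitrary units)."""
    return t ** 2

def da_vinci(t):
    """Da Vinci's exponential model: d = 2^t (arbitrary units)."""
    return 2.0 ** t

# Over short fall intervals the two curves cross twice (at t = 2 and
# t = 4) and stay fairly close in between, so eyeballed tick marks
# could not easily distinguish them.
for t in [1, 2, 3, 4]:
    print(f"t={t}: t^2={galileo(t)}, 2^t={da_vinci(t)}")
```

The curves agree exactly at t = 2 and t = 4, which suggests how an exponential fit could track real falling-body data over the limited range da Vinci could observe by eye.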
Read more of this story at Slashdot.
Latest Attack on PyPI Users Shows Crooks Are Only Getting Better [Published articles]
Read more of this story at Slashdot.
Python projects with best practices on Github? [Published articles]
I'm always looking for ways to sharpen my Python skills and write well-refactored code the right way. Having just dived deep into Python, I've been looking around GitHub for Python projects that are examples of best practices, with a strong, well-built architecture. I want ones that are organized and have proper documentation.
Finding the ones I really want might take some time, and I might end up with something I don't want. However, I did notice a few mentions of certain well-known ones in the sub. So with that being said, the ones that I believe match what I'm searching for are these repos:
If you've got any more links that you think would be worth checking out, that would be nice. Thanks in advance to anyone who can suggest something.
Coffee won’t give you extra energy, just borrow a bit that you’ll pay for later [Published articles]
Why is New York City Removing Free Broadband In Favor of Charter? [Published articles]
In January 2020, former New York City Mayor Bill de Blasio announced New York City’s Internet Master Plan, setting a path to deliver broadband for low-income New Yorkers by investing in public fiber infrastructure. The plan was a clear response to the gap created by systemic digital redlining (an industry practice EFF has called for governments to ban) that every American city deals with today. Shortly after the announcement by de Blasio, the COVID-19 pandemic hit and made the need for public fiber in low-income areas greater than ever before.
In response, former union workers at Spectrum opted to build their own broadband cooperative, called People’s Choice Communications, to deliver free high-speed access. This was unequivocally a good thing. The de Blasio Administration itself was even in the process of contracting with the worker-owned cooperative to build new networks. But since the election of Mayor Eric Adams, this critical progress has not only come to a halt, it is also now actively being undermined, to the benefit of large cable corporations. Instead of pursuing long-term solutions to low-income access, as outlined by the Internet Master Plan, Mayor Adams has abandoned that plan. Now, the Adams administration is pushing an extraordinarily wasteful proposal dubbed “Big Apple Connect” that literally just hands money over to cable companies.
Let's be crystal clear: Going from a plan to invest millions into building public infrastructure to a plan to subsidize cable companies is a gigantic waste. Building multi-generational public infrastructure that can eventually deliver free access is the only means of achieving long-term sustainable support. Giving money to cable companies to pay their inflated bills will build nothing, and it won't deliver 21st century infrastructure to those most denied it. It simply pads the profits of companies that have long-neglected these communities and failed to improve access—even when granted money to do so.
The original NYC proposal captures exactly what needs to be done to deliver permanent solutions. It would have created infrastructure that can lead to the creation of more local solutions like the People’s Choice Communications. NYC’s population density makes it attractive to small, local providers because there is such high demand for broadband that even small networks can find customers. Accessible fiber that is provisioned on an open and affordable basis dramatically lowers the barrier to entering the broadband market. This would both create competition and drive down prices for everyone, not just low-income people, as new entrants enter the market delivering gigabit-level connectivity.
As if all of that wasn't aggravating enough, now we know that the Adams administration is actively dismantling equipment that People's Choice Cooperative installed in public housing. This equipment offers free, unsubsidized broadband access—sometimes at speeds greater than legacy cable connections. Why? To make space for expensive, subsidized cable. No government entity should be taking access away from people. But the existence of a free, unsubsidized connection would not only embarrassingly raise questions about the Big Apple Connect program’s entire premise, it would also threaten the cable monopoly model of high prices for inferior speeds across the country.
Fiber infrastructure requires a one-time installation cost so that a network can be useful for broadband purposes for decades to come. It is more efficient than legacy infrastructure and is set on a trajectory to deliver faster speeds at lower prices. This is why the Biden Administration made clear in its own infrastructure program that “only end-to-end fiber” can deliver future-proof access. Every dollar spent on building out end-to-end fiber will not need to be spent again to enable connectivity in the future. By contrast, every dollar spent on subsidizing legacy, obsolete infrastructure is a waste. This is why slower networks will cost more in the long term, and why public investments like Big Apple Connect are entirely the wrong idea. Mayor Adams should stop removing local choices for broadband access such as People’s Choice Communications, abandon the Big Apple Connect boondoggle, and re-embrace the long-term vision set out with the Internet Master Plan.
When Ted Turner asked Carl Sagan if he is a socialist [Published articles]
submitted by /u/doterobcn to
pilot accidentally gives passenger announcement to air traffic control [Published articles]
submitted by /u/ColugoLT to
The Enchiridion by Epictetus [Published articles]
How fingerprints form was a mystery — until now [Published articles]
Scientists have finally figured out how those arches, loops and whorls formed on your fingertips.
While in the womb, fingerprint-defining ridges expand outward in waves starting from three different points on each fingertip. The raised skin arises in a striped pattern thanks to interactions between three molecules that follow what’s known as a Turing pattern, researchers report February 9 in Cell. How those ridges spread from their starting sites — and merge — determines the overarching fingerprint shape.
Fingerprints are unique and last for a lifetime. They’ve been used to identify individuals since the 1800s. Several theories have been put forth to explain how fingerprints form, including spontaneous skin folding, molecular signaling and the idea that ridge pattern may follow blood vessel arrangements.
Scientists knew that the ridges that characterize fingerprints begin to form as downward growths into the skin, like trenches. Over the few weeks that follow, the quickly multiplying cells in the trenches start growing upward, resulting in thickened bands of skin.
Since budding fingerprint ridges and developing hair follicles have similar downward structures, researchers in the new study compared cells from the two locations. The team found that both sites share some types of signaling molecules — messengers that transfer information between cells — including three known as WNT, EDAR and BMP. Further experiments revealed that WNT tells cells to multiply, forming ridges in the skin, and to produce EDAR, which in turn further boosts WNT activity. BMP thwarts these actions.
To examine how these signaling molecules might interact to form patterns, the team adjusted the molecules’ levels in mice. Mice don’t have fingerprints, but their toes have striped ridges in the skin comparable to human prints. “We turn a dial — or molecule — up and down, and we see the way the pattern changes,” says developmental biologist Denis Headon of the University of Edinburgh.
Increasing EDAR resulted in thicker, more spaced-out ridges, while decreasing it led to spots rather than stripes. The opposite occurred with BMP, since it hinders EDAR production.
That switch between stripes and spots is a signature change seen in systems governed by Turing reaction-diffusion, Headon says. This mathematical theory, proposed in the 1950s by British mathematician Alan Turing, describes how chemicals interact and spread to create patterns seen in nature (SN: 7/2/10). Though, when tested, it explains only some patterns (SN: 1/21/14).
Mouse digits, however, are too tiny to give rise to the elaborate shapes seen in human fingerprints. So, the researchers used computer models to simulate a Turing pattern spreading from the three previously known ridge initiation sites on the fingertip: the center of the finger pad, under the nail and at the joint’s crease nearest the fingertip.
By altering the relative timing, location and angle of these starting points, the team could create each of the three most common fingerprint patterns — arches, loops and whorls — and even rarer ones. Arches, for instance, can form when finger pad ridges get a slow start, allowing ridges originating from the crease and under the nail to occupy more space.
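The stripes-versus-spots switch described above is characteristic of reaction-diffusion systems in general. As a rough illustration of the idea (this is the classic Gray-Scott model with conventional textbook parameter values, not the actual WNT/EDAR/BMP system the researchers modeled), a short simulation shows how turning one "dial" moves the system between pattern regimes:

```python
import numpy as np

def gray_scott(F, k, n=64, steps=2000, seed=0):
    """Run a Gray-Scott reaction-diffusion simulation on an n x n grid.

    u is the substrate chemical and v the activator; F is the feed rate
    and k the kill rate. Different (F, k) pairs tend toward stripes or
    spots, analogous to the molecular "dials" in the fingerprint study.
    """
    rng = np.random.default_rng(seed)
    Du, Dv, dt = 0.16, 0.08, 1.0
    u = np.ones((n, n))
    v = np.zeros((n, n))
    # Perturb a small central square so patterns can nucleate,
    # loosely analogous to a ridge initiation site.
    s = n // 8
    u[n//2 - s:n//2 + s, n//2 - s:n//2 + s] = 0.50
    v[n//2 - s:n//2 + s, n//2 - s:n//2 + s] = 0.25
    u += 0.01 * rng.random((n, n))
    v += 0.01 * rng.random((n, n))

    def lap(z):
        # 5-point Laplacian with periodic boundary conditions
        return (np.roll(z, 1, 0) + np.roll(z, -1, 0)
                + np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z)

    for _ in range(steps):
        uvv = u * v * v
        u += dt * (Du * lap(u) - uvv + F * (1 - u))
        v += dt * (Dv * lap(v) + uvv - (F + k) * v)
    return u, v

# Two conventional parameter choices: one in a stripe-like regime,
# one in a spot-like regime.
u_stripes, _ = gray_scott(F=0.022, k=0.051)
u_spots, _ = gray_scott(F=0.035, k=0.065)
```

Plotting `u_stripes` and `u_spots` (e.g., with `matplotlib.pyplot.imshow`) shows qualitatively different patterns from the same equations, the same kind of regime change the researchers produced by dialing EDAR and BMP up and down.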
“It’s a very well-done study,” says developmental and stem cell biologist Sarah Millar, director of the Black Family Stem Cell Institute at the Icahn School of Medicine at Mount Sinai in New York City.
Controlled competition between molecules also determines hair follicle distribution, says Millar, who was not involved in the work. The new study, she says, “shows that the formation of fingerprints follows along some basic themes that have already been worked out for other types of patterns that we see in the skin.”
Millar notes that people with gene mutations that affect WNT and EDAR have skin abnormalities. “The idea that those molecules might be involved in fingerprint formation was floating around,” she says.
Overall, Headon says, the team aims to aid formation of skin structures, like sweat glands, when they’re not developing properly in the womb, and maybe even after birth.
“What we want to do, in broader terms, is understand how the skin matures.”
u/PoopMobile9000 explains the history of the US government's "debt ceiling" and how an outdated procedural formality is exploited for political gain with potentially catastrophic effects on the economy [Published articles]
submitted by /u/Super_Jay to
So... how do you go about building a playgroup? [Published articles]
Long story short, after coming to the realisation that my IRL friends and I wanted different things at the gaming table, I've bounced around from Discord server to Discord server, running a handful of games here and there but moving on after a month or so.
Thing is, this is getting a bit knackering, and I'd like to start building a stable group. I tried reaching out to some of the people I'd campaigned with previously to see if they'd be interested and got noncommittal non-answers.
So reddit, if you were in my position how would you go about building a steady playgroup?
Social media the last two days [Published articles]
submitted by /u/Drewshbag14 to
The City-ship of the Ancients: Atlantis 3D printed and illuminated [Published articles]
submitted by /u/BlackMage13 to
What makes a sandwich go from boring to amazing? [Published articles]
submitted by /u/arisal3 to r/AskReddit
The Legacy of Pixar's Wall-E, Now a Member of the Criterion Collection [Published articles]
It’s hard to even imagine. A world ravaged by climate change. People totally consumed by technology. Mega corporations in control of everything. Robots performing menial tasks. Wait, did we say “hard” to imagine? We meant we’re literally living it. The “it” being Wall-E, Pixar’s 2008 Oscar-winning masterpiece co-written and directed by Andrew Stanton.
The tale of a lone robot left to clean up the Earth who finds himself on an intergalactic adventure to protect the future of the planet wowed audiences when it was released and is considered to be one of Pixar’s best films to date. Since then, Wall-E has only gotten more poignant and been more revered, so it’s only fitting that, on November 22, it becomes Pixar’s first film ever released by the Criterion Collection, a company specializing in the best, most comprehensive, obsessive Blu-ray releases around.
To mark the occasion, io9 sat for a video chat with Wall-E’s director to find out how the film made it to Criterion, what his favorite special features are, what he thinks about our world being so close to Wall-E’s, and whether there was ever talk of a sequel or theme park ride—as well as his work on Obi-Wan Kenobi and For All Mankind. Check it out.
Germain Lussier, io9: So how did you find out that the
movie is going to be a part of the Criterion Collection? Because
that was a big deal and it’s Pixar’s first
Andrew Stanton: I approached them. I pressed them as a filmmaker. This was not a studio thing. It was me asking a favor of Alan Bergman, who is the president of Walt Disney Studios, and saying, “Look, I’ve been out in the world making TV shows for about seven years. I’ve met so many people in the industry now, filmmakers that I revere, filmmakers that are budding, that really sort of get the cinema DNA and inspirations that were in the molecules of Wall-E.” I made it with such a love of cinema. It was great to see that it had that effect on a lot of peers, and I felt like there’s something there that qualifies it to possibly be in their library. And so I said, “Can I ask them?” Because I know it breaks precedent with what Disney does, so it was a bit of a favor, and he said, “Well, if Criterion bites, yeah, we’ll see if we can make it work.” And so that was in 2019. Then the pandemic hit and everything just paused. They said yes, but it was very frustrating because then the world just stopped. Then it was really last year that we got very serious about it.
And the real question was “What’s the Criterion angle on this?” We’ve done such a good job and a thorough job of showing the behind-the-scenes on other DVDs. So I really left it up to them. And [Criterion producer] Kim Hendrickson and the rest of their wonderful team really just drilled down. I let them lead about “What is interesting to you guys? What is it you guys want to know?” So that really led the angle on all the doc materials and the booklet and everything. I’ve been a consumer of Criterion since they existed in the late ‘80s. So I was pleased as punch to see everything from the cover to what their perspective was and what was interesting to them, because I feel a lot of the times [Pixar] DVDs get used as a babysitter, and they’re not necessarily going to the crowd that we want to talk to, because I would love to talk to other people that love film as much. And so I feel like, “Oh, this is finally for an audience that I would be in.”
io9: Yeah, it’s my favorite Pixar movie, so going through the disc and exploring a little bit was excellent. Now, you buy a Criterion for the transfer, the sound, but mostly the features. And here there’s just so much. So, if someone buys this disc, they watch the movie—but after that, what’s the first thing they should go watch and why?
Stanton: Well, it’s hard for me to know if there’s a better order, but I find it fascinating to be able to finally do a Master Class and just talk about the actual under-the-hood work that we do [at Pixar], and how much we really control and work on and nuance the story down to a beat-by-beat level. I mean, I could have done that with any scene and any other filmmaker in the studio could have done that with their movies and their scenes. And so it was a chance to slow down and actually have a literal Master Class. And then I also loved being able to talk about all the cinematic influences because again, we’re such filmgoers first and filmmakers second that—any one of these films, but I think particularly Wall-E—had such deep, deep, deep influences from some of the earliest cinematic movies. There really was such a major Keaton and Chaplin influence. So I think it’s great to sort of see how cinema from any era keeps inspiring the latest films. That it just keeps passing it on.
io9: One of the things I did watch was the Master Class and it was really, really good. And I also watched the “Wall-E A to Z” feature that you guys did for this.
Stanton: To me, we could have gone A to... If that alphabet was twice as long we could have kept finding things.
io9: Oh for sure. I bring it up though because I found it interesting how it really spoke to the way Wall-E was ahead of its time, or at least forward-looking with a lot of things—technology, the terrible world we’re in. So out of all those things that our world has in common with Wall-E, does anything, in particular, stand out, for better or worse?
Stanton: Well, I certainly didn’t expect to be seeing the dire state of the world climate-wise in such a short amount of time. Didn’t want to be right about that! And I’ve noted back at the time of press for the movie when it originally released, that wasn’t something that I was preaching. I just sort of leveraged off of the truth of what I’d always thought. I was raised in the ‘60s and ‘70s not to pollute and that the environment is fragile. That was always in my world and culture. So I just went with that logic to get this robot alone. I wasn’t hoping to be right. I wasn’t being a Lorax, but I was not anti. So I’m horrified that I was right in that regard.
The other thing that happened, equal or greater, than I expected, was the siloing of everybody and their technology. I knew I was right about that. I was one of the first adopters of an iPhone and I was like “This is like smoking cigarettes.” I can’t stop it, you know? I just knew. And I was just sitting getting coffee this morning in New York and watching everybody pass by on their way to work and counting. And it was like one in every six people was looking forward and everybody else was just looking at their iPhones and not navigating. I’m like, “Oh my God, I’m sitting on the Axiom right now.”
io9: That’s hilarious-slash-terrifying. But okay. The movie has this distinguished legacy but it’s also one of the few Pixar movies from that time that didn’t get a sequel. I get that the credits are kind of the sequel but did Disney ever pressure you guys to say like, “Hey, any thoughts on a sequel?”
Stanton: I’ve been getting this question since 1995.
Stanton: And everybody wants Disney to be the big bad guy. And I’m sure that they sometimes are on things. But for [Pixar], what they’ve always said is “Whatever you guys want to do, we just would love a sequel whenever it comes to you naturally.” Economically, [Pixar] wouldn’t exist if we didn’t have our third feature be Toy Story 2, and if we didn’t continue to try and find other ways. So we try to find them organically and we try to find them honestly. And we certainly don’t want to spend four years working on a lesser-than product because that’s just too much of your life. And, frankly, having been behind several sequels, after about six months, it’s an original. Anything you think you gain from it, it’s sometimes even harder to crack.
So there’s never pressure from them like, “We need exactly this at this time.” We’ve never had that. But we’ve had our own private pressure of like, “How do we keep a balance so that we can keep the lights on?” or else you guys don’t get to watch anything ever again. It’s always been that. So there’s not any “If you left us alone, we’d never make a sequel.” But Wall-E just never felt right to me. I mean, I’m not anti and I’m very sober to the fact that I don’t own this movie. They can do whatever they want with it. And if I get hit by a bus tomorrow, who knows? But it doesn’t feel like it’s asking for that. And on the success chart of our movies, it wasn’t one of the bigger moneymakers. So I don’t even feel like the crass business guy is going “We need another one.” So it kind of protects it a bit.
io9: One of the other things Disney likes to do, obviously, is theme park rides and pretty much—not all of them, but a lot—of the Pixar movies from that era are in theme parks as an attraction. Wall-E obviously has a million things merch-wise and it’s honestly a little bit more pessimistic towards the world, at least at the start, but were there ever talks about bringing the movie into the parks as a ride?
Stanton: Again, that’s a direct reflection of the fact that it wasn’t that big of a box office movie. It hit that sweet spot: it did just well enough that nobody’s embarrassed by it, but just low enough that everybody’s like, we’re just going to let that move on. And it’s kind of stayed pure in that sense.
io9: Gotcha. Now, you said you approached Criterion for this. You were a fan. So were there any features or materials you had been sitting on in case something like this ever came to pass?
Stanton: No. I think the thing that was frustrating though, still even on making this disc, is we shot so much behind the scenes [footage]. We just let the cameras run in so many meetings and I think our Criterion producer Kim Hendrickson saw more than I’ve ever seen. And she said that they could have made a whole box set of just watching how we make the donuts, you know? Which is frustrating because there’s so much stuff that we do that uses other materials. Like sometimes other music and we’ll never be able to show it because we don’t have the rights to the music. It’ll always be a bit of frustration that we can never really, really, really, really show a Get Back, behind-the-scenes talk.
io9: Yeah that would be incredible. Okay, I’ll come back to Wall-E but as you can see [from the art] behind me, I’m obviously a huge Star Wars fan. And you helped write the final two episodes of Obi-Wan Kenobi which have some really crucial moments in Star Wars history. So I want to know, what’s the process for that? Obviously, you want to tell a great story. But also with Star Wars, it’s got to fit in with all the canon and everything else. So how did that work out?
Stanton: That was the blessing and the curse of it. It’s like one, you’re geeking out that you get to type “Vader says” this and “Kenobi says” that. You pause and say “I can’t believe I’m actually getting paid to type this. I can’t believe these words may be said.” But then another part of you, it has to go through such a rigorous like “Does that fit the canon?” And I feel like it’s bittersweet. [The reason that happens is] because people care, but it also kind of doesn’t allow, sometimes, things to venture beyond where maybe they should to tell a better story. So it can sometimes really handicap what I think are better narrative options.
And so I was frustrated sometimes—not a lot—but I just felt it wasn’t as conducive to [the story]. So I love it when something like Andor is in a safe spot. And it can just do whatever the heck it wants. But I felt, you know, Joby [Harold, Obi-Wan Kenobi co-writer and executive producer], to his credit, kept the torch alive and kept trying to thread the needle so that the story wouldn’t suffer but it would please all the people that were trying to keep it in the canon. But I got some moments in there that I’m very happy with.
io9: Yeah, that sounds like a tough balance. Thank you. The other show you’ve been working on in the last couple of years is For All Mankind, which I recently caught up on and loved. What is it like working on something that is obviously so good, but it’s a little under the radar—and then also, did any of your space knowledge from Wall-E translate over?
Stanton: Well, we had a lot of NASA consultants at the time for Wall-E. And so it felt like I’d done a little bit of research. I mean, [For All Mankind’s] stuff is so thoroughly vetted by the writers’ room and the showrunners, and we have a consultant on set and an astronaut that’s actually there. And so you know that usually by the time you’re shooting and you’re reading what the scene is, that it’s already been vetted. But I just geek out and want to do it correctly. I love as a storyteller, working within those limitations. Like this is what would really happen, there wouldn’t be a window here, they float at this moment, they wouldn’t float here. I just love that challenge of just going, “Okay, then how do you tell that moment?” It’s such a great crew and show and I was pleased as punch to be able to come back and work with the same family again.
io9: Oh it’s the best. Finally, to wrap up back on Wall-E, it was a critical hit. Now we have this Criterion disc. Over a decade later, when you look back at it, what are you most proud of with the film?
Stanton: That you get just as caught up now. That’s all I care about. That’s the drug for me, still. I just want the lights to go down, and I want to be fully engaged and forget where I am, forget who I am, and then the lights come up, and I was just 100% in. And that’s really all it is that I’m trying to find again with every movie I buy a ticket for, and trying to do with every scene if I’m behind the telling of something. And I could tell I was hitting a real pure vein for so much of Wall-E. And it’s nice to come back to it now and feel that that hasn’t faded. You can just get just as caught up in it as you did on day one. It’s like having a song where everything’s just harmonized so well and you pick the arrangements just right. You don’t see a way to improve it. Plus, it’s a hummable tune. It’s something that your foot taps against your will. I feel like you don’t get to say when you’ve found songs that are that strong, and the same with movies, and I thought I did at the time. And it’s nice to look back and go, “Oh yeah, I did.”
Yes. Yes, he did. The Wall-E Criterion Collection disc is out November 22.
They forged enduring friendships from massive, same-name group chats [Published articles]
Last month, journalist Matt Cohen tweeted about his years-long Instagram group chat made up of fellow Matt Cohens, which he calls “the most wholesome thing I’m a part of.”
In the chat, one Matt Cohen shared that he “had [his] first day of college classes today,” to which a Matt Cohen responded “Nice. Just started my first job. Real world is brutal enjoy college man.”
“Got married!” and “Just started my dream job!” chimed in fellow Matt Cohens. Another Matt Cohen announced he had launched a weed brand. The Matt Cohens, who have turned a shared name into an informal online club, planned a Zoom Happy Hour to catch up.
Your name clones usually lurk around you like a shadow. You get their junk mail, their emails, their Google results; glimpses of their intimate moments via their digital ephemera. They are strangers — but they don’t have to be.
Around the world, people are maintaining multigenerational, global friendships with their same-named counterparts — Jake Wright, William Hodgson, Jordan DaSilva, and Josh Brown, to name a few. Sometimes, name twins commiserate about shared experiences: a sixteen-member Council of Aaron Johnson chat laments the viral Key and Peele sketch that introduced the now-inescapable A-A-Ron nickname. Perhaps the best, or at least the most publicized, example of same-name camaraderie is the Josh Fight, when a group chat of Josh Swains organized an April 2021 meeting in Lincoln, Nebraska to fight for the “right” to the name. More than 900 Joshes showed up.
The Paul O’Sullivan Band has four members with one thing in common: the name Paul O’Sullivan. The quartet materialized after Baltimore Paul started “indiscriminately adding other Paul O’Sullivans on Facebook” and realized that a few of them were musicians. These days, the four Paul O’Sullivans, who hail from Baltimore, Rotterdam, Manchester, and Pennsylvania, have come together to form a bona fide musical group.
Since its early days, the social internet has been lauded as a way for niche interest groups to connect, and name twins are no exception. A chat titled “Council of Bens” hosts 2500 Benjamins and Bens, and when one Ben caught wind of a similar group chat of Sydneys, he created a chat just for people named Sydney or Ben, which has been going strong for months. Chris Lenaghan added 7 other Chris Lenaghans to a chat, and soon he had same-name friends from Ohio to Belfast to Birmingham. In a Josh Kaplan group chat on Twitter, fellow Josh Kaplans use the chat to congratulate each other on achievements and awards: “A win for one JK is a win for all.”
Samuel Stewart, a 19-year-old Exeter student living in London, formed an Instagram chat of fellow Samuel Stewarts after reading about the Josh Fight. For a few weeks, they chatted about their days; older Sam Stewarts gave advice to younger Sam Stewarts. “They seemed to take me under their wing as if I were a younger version of them,” said the 19-year-old when we talked on the phone. But the chat went awry when one Samuel Stewart started asking for money. “I felt a bond with the fellow Samuel Stewarts, but the name connection wasn’t quite strong enough for me to start giving away my college fund,” Sam told me.
“They seemed to take me under their wing as if I were a younger version of them.”
The chats aren’t strictly social — sometimes, they’re the most practical way to sort through same-name mixups. Will Packer, a strategist in New York, recently used the Will Packer chat to see if any of his name brothers had been contributing to his inbox clutter. “Any of you from Queensland?” he asked. “Someone tried to create a PlayStation account with my email.”
College student Nolen Young says, “I once created a Facebook Messenger group chat with everyone I could find on Facebook with my exact same name, spelling and all. There were only two other people. One of them considered giving me a job, and the other was an old man who started commenting on all my photos. I've messaged the former a few times because he owns every domain name and email I've ever wanted, and he keeps telling me I can only have them when he dies.”
It’s easier than ever to connect with same-name pals today, but the uncanny allure of name clones predates social media. Tahnee Gehm, an artist and animator based in L.A., organized a Web 1.0 catalog of Tahnees when she was a teenager.
“My dad was into computers and he got me a URL with my name,” she says. “I built an atrocious ‘90s website in 2001 as an eighth grader, and I started getting messages from girls all over the world named Tahnee.”
To catalog her new pen pals, she created a “Hall of Tahnees” webpage with a photo, bio, and hometown for every Tahnee she could find. The site’s “Tahnee-only area” was a “weird, unique club.” Once, she says, a singer from the band Hanson used the website to track down a girl named Tahnee he’d met at a concert. And the Tahnee bond has lasted decades: Tahnee Gehm has maintained a long-distance friendship with Tahneé Engelen since they were in high school. A few years ago, Gehm spent two weeks visiting Engelen in Paris, where she works as a neurobiologist.
“It’s nice to know that my name buddy is living my alternate life and absolutely killing it,” she told Input over the phone.
Sometimes, all it takes to spark a friendship is a similar email address. Seth Capron met an older Seth Capron after noticing their similar interests based on the emails he mistakenly received — soon, they realized their physical resemblance, too. These days, the older Seth jokes that he could pass on his career. “I was actually considering that as I move into retirement, the Younger could just carry on in my former role of Seth Capron, affordable housing consultant,” said “Seth the Older.”
Name buddies sometimes have a parasocial relationship with each other’s digital footprint. As a kid, Chris Lenaghan found online videos of a different Chris Lenaghan doing wheelies and “cool BMX shit” and immediately told all his friends that it was him in the videos. Years later, thanks to a big group chat, Chris Lenaghan met the BMX trickster, who he now calls “Ohio Chris,” and they ended up becoming close friends.
The chats don’t always advance beyond acquaintanceship, though. Evan Quigley, a University of Florida student, says that the Evan Quigley group chat is “more like a running joke than true friendship.” (The Evan Quigleys, bonded by name alone, proclaim unconditional public support for one another by commenting “way to go, Evan Quigley” on each other’s posts).
People with uncommon first names can bond over shared experiences — mispronunciations, playground taunts, and misspellings. More than a dozen Zaviens have come together via Snapchat. “None of us had ever talked to another Zavien,” one Zavien told Input. And a 14-member-strong “Council of Ethyns” chat, which started on Instagram in 2019, is mostly dedicated to tongue-in-cheek malice toward Ethans (with an “a”). They also just pop in the chat to say “love you Ethyn” a lot.
Still, the unlikely connections evoke nostalgia for a simpler internet, one less cluttered with surveillance and corporate interests, where people went to meet new friends. Wholesome chance encounters still happen: “text door neighbors,” people whose phone numbers are one digit apart, show how easy it is to stumble upon an unlikely friend. The most notable wrong-number-gone-right story belongs to Wanda and Jamal, whose viral mix-up has grown into a six-year-and-counting Thanksgiving tradition and is now set to be featured in an upcoming Netflix movie.
It’s a big world out there — lots of Matt Cohens, more Alex Stewarts, and even more James Smiths — and your name buddies have never been easier to befriend. And I think that’s beautiful.
But just because we’re psychologically inclined to like our own names doesn’t mean a connection with your name clones is guaranteed. Just ask Kelly Hildebrandt and Kelly Hildebrandt, the couple who tied the knot a year after meeting through a Facebook name search and then, four years later, called off the marriage due to irreconcilable differences. It’s not all in a name.
Missouri’s school funding system undermines its own goals for equity, experts say [Published articles]
Missouri’s school funding strategy recognizes that some children and communities need more financial support to meet education standards.
It directs extra funds to districts that have a harder time raising local property taxes, and to children who have special needs, are learning English or are living in poverty.
But experts say while the system has good intentions, the devil is in the details.
Parts of Missouri’s funding strategies undermine its equity goals, school finance researchers say, pointing to flaws in how the formula is designed and applied.
Overall, this leads to a situation where some districts receive state aid they don’t need while others are stretched thin as they attempt to serve children who need more resources.
“We have a large portion of students across the state who experience some form of economic disadvantage,” said Cameron Anglum, an assistant professor of education, policy and equity at the Saint Louis University School of Education.
“It’s really important that the state funding formula serve those kids effectively, particularly those kids that live in districts that don’t have the local property wealth … to provide an adequate education.”
Bruce Baker, a professor and chair of the department of teaching and learning at the University of Miami School of Education and Human Development, said the main goals of a school finance formula should be adequacy and equity.
Adequacy means there’s enough funding for the school to meet certain goals. Equity acknowledges that some students or schools may require greater funding to meet those standards.
To illustrate, Baker referred to the School Finance Indicators Database run by the Albert Shanker Institute and Rutgers Graduate School of Education.
The database calculates that in 2019, the latest data available, the 20% of districts with the highest poverty rates in Missouri needed nearly $12,000 more per student to reach national average test scores than the 20% of districts with the lowest poverty rates.
Instead, the database shows students in highest-poverty districts were receiving only about $1,000 more than students in the most affluent schools.
Baker, one of the main researchers for the database, said the numbers are based on a statistical model that uses data on student characteristics, hiring costs and district size to calculate necessary funding levels, which are different in each state.
James Shuls, an associate professor of educational leadership and policy studies at the University of Missouri-St. Louis, said school funding “reflects the values of people” and appropriate levels should be determined through the political process.
Shuls said he personally values choice, equity and efficiency in education funding. He previously worked for the Show-Me Institute, where he authored a Missouri school finance formula primer. The institute is a think tank “dedicated to promoting free markets and individual liberty” and supportive of policies that increase “school choice.”
An ideal funding formula should be “dynamic,” said Shuls, reflecting changing local resources and specific student needs to better promote equity.
Instead, Missouri’s formula reflects outdated property values and school funding levels, Shuls said.
Missouri’s school funding formula starts with an “adequacy target,” the amount of money needed to educate a single student. It multiplies that number based on student attendance, area cost of living and, in some cases, student characteristics that might require extra funds such as disability or learning English.
The formula then factors in how much funding districts can raise from local property taxes.
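The mechanics described in the last two paragraphs can be sketched as a small calculation. This is only an illustration of the structure the article describes, not the state's actual computation; the 6,375 adequacy target, the 1.05 cost adjustment, the 120 extra weighted units, and the local revenue figure are all invented for the example:

```python
# Sketch of a foundation-formula calculation of the kind described above:
# an adjusted per-student adequacy target times weighted attendance,
# minus what the district can raise locally. All figures are hypothetical.

def state_aid(adequacy_target, weighted_attendance, cost_adjustment,
              local_revenue):
    """State aid = adjusted target x weighted attendance, minus local
    revenue, floored at zero (the state never claws money back)."""
    need = adequacy_target * cost_adjustment * weighted_attendance
    return max(need - local_revenue, 0.0)

# Hypothetical district: 1,000 students in average daily attendance plus
# 120 extra weighted units (disability, English learners, poverty), with
# $4.2M in local property-tax capacity.
aid = state_aid(adequacy_target=6_375.0,
                weighted_attendance=1_000 + 120,
                cost_adjustment=1.05,
                local_revenue=4_200_000.0)
print(round(aid))  # 3297000
```

Note how a wealthier district with the same enrollment but more local revenue would simply see its aid shrink toward zero, which is the equalizing intent of the design.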
Anglum, the SLU professor, said one equity challenge is that the state doesn’t manage a majority of the funding that goes to schools.
Some state funding also goes through programs that don’t have the same equity focus as the main formula.
Missouri ranks 47th out of 50 states when it comes to the percentage of school funding that comes from the state. When all sources — local, state and federal — are combined, 2018-19 data from the National Center for Education Statistics shows that K-12 per-student spending in Missouri ranks 32nd in the nation.
Additionally, 2021 data from the National Education Association, a prominent teachers union, shows that not counting the District of Columbia, Missouri has the second-highest percentage of funding coming from local sources and the smallest percentage coming from state sources.
“When we are relying predominantly on local resources in order to fund education, higher-wealth districts are going to win out and lower-wealth districts are going to lose out,” Anglum said.
Traci Gleason, vice president for external affairs at the Missouri Budget Project, said lower state funding can cause localities with fewer resources to choose between underfunding services — including education — or imposing burdensome levels of property and sales taxes.
When legislators reformed school finance in 2005, they also included “hold harmless” provisions to ensure no district would receive less state money under the new formula.
Shuls said that was a sensible way to prevent abrupt funding dips for some districts under the new system. But the “hold harmless” provisions didn’t phase out, meaning many districts are still being funded at outdated levels instead of updated, equitable ones.
When the formula calculates districts’ ability to raise local property taxes, it’s using property values that are now more than 16 years old. That means the state is giving districts with growing property values more funds than they need to meet targets, instead of distributing that money in other ways.
Baker, the University of Miami professor with experience in Missouri and Kansas, said Missouri’s “hold harmless” provisions aren’t even the biggest factor that prevents the state from having a “progressive” funding system. (In this case, “progressive” means districts with greater need spend more money per student.)
Hold harmless provisions tend to partially undo reforms, he agreed. “But I’m not convinced that any of the changes they were making would have very aggressively moved it in the right direction anyway.”
Instead, he said the state’s method of calculating attendance financially penalizes districts that most need support.
Missouri calculates the number of students in each district by using the average daily attendance instead of the total number of students.
That means a school with an 80% attendance rate on the average day could see its funding cut by 20% compared to an otherwise identical school with perfect attendance.
Baker said that’s especially problematic when it comes to equity because schools with lower attendance rates tend to have higher rates of students living in poverty.
“It’s been explained to policymakers in every damn state that it is discriminatory and erases any need adjustment to fund on average daily attendance, and only a few states are bold enough to still do it,” he said. “There’s no excuse for doing it. There’s no legitimate incentive that funding on average daily attendance will, you know, cause attendance to improve.”
Anglum and Shuls agreed that using average daily attendance penalizes poorer schools, although Shuls said there are pros and cons in all methods of calculating attendance.
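The penalty the researchers describe is straightforward proportional arithmetic: under attendance-based funding, dollars scale with the average daily attendance count rather than with enrollment. A minimal hypothetical sketch (the per-student amount is invented):

```python
# Hypothetical illustration of funding on average daily attendance (ADA):
# dollars scale with the attendance count, not the enrollment count.
PER_STUDENT = 10_000  # invented per-student amount, for illustration only

def ada_funding(enrollment, attendance_rate):
    """Funding when each enrolled student counts only at their ADA rate."""
    return enrollment * attendance_rate * PER_STUDENT

full = ada_funding(1_000, 1.00)  # identical school, perfect attendance
low = ada_funding(1_000, 0.80)   # 80% average daily attendance
print(int(full), int(low))
print(round(1 - low / full, 2))  # the 80% school loses 20% of its funding
```

The cut applies to the whole budget even though most costs (buildings, staff) do not fall when a student misses a day, which is why critics call the method discriminatory.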
Another quirk of Missouri’s system is that while it “weights” students who are typically more costly to educate, it does so only when the percentage of students in a given category exceeds a set threshold.
For example, the threshold for students receiving free and reduced-price lunch — a common way to estimate the number of low-income students — is a bit more than 30%. Schools above that threshold get extra funding. Meanwhile, a district where 25% of students fall in that category receives no more funding than a district serving 5%.
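A threshold of this kind can be sketched in a few lines. This is a hypothetical illustration of the mechanism, not the state's actual weighting; the 0.31 threshold mirrors the "a bit more than 30%" figure above, while the 0.25 weight is invented:

```python
# Hypothetical sketch of threshold-based weighting: extra weighted-student
# units accrue only for the share of students ABOVE the threshold, so a
# district at 25% free/reduced-price lunch gets the same zero as one at 5%.
THRESHOLD = 0.31  # "a bit more than 30%", per the article
WEIGHT = 0.25     # invented extra weight per qualifying student

def poverty_weight_units(enrollment, frl_share):
    """Extra weighted-student units generated by the poverty category."""
    excess = max(frl_share - THRESHOLD, 0.0)
    return round(enrollment * excess * WEIGHT, 2)

print(poverty_weight_units(1_000, 0.05))  # 0.0
print(poverty_weight_units(1_000, 0.25))  # 0.0, same as the 5% district
print(poverty_weight_units(1_000, 0.40))  # 22.5 extra weighted units
```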
Shuls, Baker and Anglum all criticized the use of thresholds. Shuls suggested the state could even differentiate the amounts granted for special-needs students — who can have very different funding needs — to better create a system where money “follows the student.”
Baker said that over the past decades, Kansas has strengthened its school finance system while Missouri’s has weakened.
Baker formerly taught at the University of Kansas and was involved in discussions surrounding school finance reform in both Missouri and Kansas. He recently published “School Finance and Education Equity: Lessons from Kansas,” which he said includes many comparisons with Missouri.
“The costs to get to the same outcomes are a little lower in Kansas, but Kansas also much more robustly funds their system,” Baker said.
In Kansas, 58% of districts are spending above the adequate level and achieving results above the national average, Baker said. In Missouri, only 43% of districts are doing the same.
Meanwhile, about 13% of Kansas districts are spending below the targets and achieving below the national average. Nearly 30% of Missouri districts are in the same boat.
“There’s much more inequality in Missouri; there’s far more kids in inadequately funded districts that then have inadequate outcomes to go along with that,” Baker said. “Kansas has just done much better in that regard, over time.”
The Beacon is working on a larger story about Kansas’ school finance formula.
A report from the Missouri Budget Project shows Missouri’s overall K-12 funding target for each student, adjusted for inflation, is less than the 2007 amount — by about $1,000.
Gleason, the project spokesperson, said that while Kansans reacted to abrupt funding cuts several years ago and restored funding, Missourians haven’t been as aware of gradual funding cuts.
“Missouri has been more like the frog in the frying pan, or boiling water … We just haven’t noticed because it’s happened so slowly over time.”
The post Missouri’s school funding system undermines its own goals for equity, experts say appeared first on The Beacon.
A (a few) ops lessons we all learn the hard way [Published articles]
Ops is hard. What have we learned so far?
Internet of Things [Published articles]
Managed expectations manage suffering [Published articles]
Rituals for Engineering Teams [Published articles]
Last weekend I happened to pick up a book called “Rituals For Work: 50 Ways To Create Engagement, Shared Purpose, And A Culture That Can Adapt To Change.” It’s a super quick read, more comic book than textbook, but I liked it.
It got me thinking about the many rituals I have initiated and/or participated in over the course of my career. Of course, I never thought of them as such — I thought of them as “having fun at work” — but now I realize these rituals disproportionately contribute to my favorite moments and the most precious memories of my career.
Rituals (a definition): Actions that a person or group does repeatedly, following a similar pattern or script, in which they’ve imbued symbolism and meaning.
I think it is extremely worth reading the first 27 pages of the book — the Introduction and Part One. To briefly sum up the first couple chapters: the power of creative rituals comes from their ability to link the physical with the psychological and emotional, all with the benefit of “regulation” and intentionality. Physically going through the process of a ritual helps people feel satisfied and in control, with better emotional regulation and the ability to act in a steadier and more focused way. Rituals also powerfully increase people’s sense of belonging, giving them a stable feeling of social connection. (p. 5-6)
The thing that grabbed me here is that rituals create a sense of belonging. You show that you belong to the group by participating in the ritual. You feel like you belong to the group by participating in the ritual. This is powerful shit!
It seems especially relevant these days when so many of us are atomized and physically separated from our teammates. That ineffable sense of belonging can make all the difference between a job that you do and a role that feeds your soul. Rituals are a way to create that sense of belonging. Hot damn.
So I thought I’d write up some of the rituals for engineering teams I remember from jobs past. I would love to hear about your favorite rituals, or your experience with them (good or bad). Tell me your stories at @mipsytipsy.
At Linden Lab, in the ancient era of SVN, we had something called the “Feature Fish”. It was a rubber fish that we kept in the freezer, frozen in a block of ice. We would periodically cut a branch for testing and deployment and call a feature freeze. Merging code into the branch was painful and time-consuming, so if you wanted to get a feature in after the code freeze, you had to first take the fish out of the freezer and unfreeze it.
This took a while, so you would have to sit there and consider your sins as it slowly thawed. Subtext: Do you really need to break code freeze?
You were supposed to pair with another engineer for code review. In your commit message, you had to include the name of your reviewer or your merge would be rejected. But the template would also accept the name “Stuffy”, to confess that your only reviewer had been…Stuffy, the stuffed animal.
However if your review partner was Stuffy, you would have to narrate the full explanation of Stuffy’s code review (i.e., what questions Stuffy asked, what changes he suggested and what he thought of your code) at the next engineering meeting. Out loud.
We had a matted green felt headband with ogre ears on it, called the Shrek Ears. The first time an engineer broke production, they would put on the Ears for a day. This might sound unpleasant, like a dunce cap, but no — it was a rite of passage. It was a badge of honor! Everyone breaks production eventually, if they’re working on something meaningful.
If you were wearing the Shrek Ears, people would stop you throughout the day and excitedly ask what happened, and reminisce about the first time they broke production. It became a way for 1) new engineers to meet lots of their teammates, 2) to socialize lots of production wisdom and risk factors, and 3) to normalize the fact that yes, things break sometimes, and it’s okay — nobody is going to yell at you.
This is probably the number one ritual that everybody remembers about Linden Lab. “Congratulations on breaking production — you’re really one of us now!”
We had a stuffed Vorpal Bunny, duct taped to a 3″ high speaker stand, and the operations engineer on call would put the bunny on their desk so people knew who it was safe to interrupt with questions or problems.
At some point we lost the bunny (and added more offices), but it lingered on in company lore since the engineers kept on changing their IRC nick to “$name-bunny” when they went on call.
There was also a monstrous, 4-foot-long stuffed rainbow trout that was the source of endless IRC bot humor… I am just now noticing what a large number of Linden memories involve stuffed animals. Perhaps not surprising, given how many furries were on our platform.
Whenever an engineer really took one for the team and dove headfirst into a spaghetti mess of tech debt, we would award them the “Tiara of Technical Debt” at the weekly all hands. (It was a very sparkly rhinestone wedding tiara, and every engineer looked simply gorgeous in it.)
Examples included refactoring our golang rewrite code to support injection, converting our entire jenkins fleet from AWS instances to containers, and writing a new log parser for the gnarliest logs anyone had ever seen (for the MongoDB pluggable storage engine update).
We spent nearly 2.5 years rewriting our entire ruby/rails API codebase to golang. Then there was an extremely long tail of getting rid of everything that used the ruby unicorn HTTP server, endpoint by endpoint, site by site, service by service.
When we finally spun down the last unicorn workers, I brought in a bunch of rainbow unicorn paper sculptures and a jug of lighter fluid, and we ceremonially set fire to them in the Facebook courtyard, while many of the engineers in attendance gave their own (short but profane) eulogies.
This one requires a bit of backstory.
Finally we caved and got on board. We were excited! I announced the migration and started providing biweekly updates to the infra leadership groups. Four months later, when the migration was half done, I get a ping from the same exact members of Facebook leadership:
“What are you doing?!?”
“You can’t do that, there are security issues!”
“No it’s fine, we have a fix for it.”
“There are hardware issues!”
“No it’s cool, we got it.”
“You can’t do this!!!”
ANYWAY. To make an EXTREMELY long and infuriating story short, they pulled the plug and canned the whole project. So I printed up a ten foot long “Mission Accomplished” banner (courtesy of George W Bush on the aircraft carrier), used Zuck’s credit card to buy $800 of top-shelf whiskey delivered straight to my desk (and cupcakes), and we threw an angry, ranty party until we all got it out of our systems.
I honestly don’t remember what this one was about, but I have extensive photographic evidence to prove that I shaved the heads of and/or dyed the hair blue of at least seven members of engineering. I wish I could remember why! but all I remember is that it was fucking hilarious.
Coincidentally (or not), I have no memories of participating in any rituals at the jobs I didn’t like, only the jobs I loved. Huh.
One thing that stands out in my mind is that all the fun rituals tend to come bottom-up. A ritual that comes from your VP can run the risk of feeling like forced fun, in a way it doesn’t if it’s coming from your peer or even your manager. I actually had the MOST fun with this shit as a line manager, because 1) I had budget and 2) it was my job to care about teaminess.
There are other rituals that it does make sense for executives to create, but they are less about hilarious fun and more about reinforcing values. Like Amazon’s infamous door desks are basically just a ritual to remind people to be frugal.
Rituals tend to accrue mutations and layers of meaning as time goes on. Great rituals often make no sense to anybody who isn’t in the know — that’s part of the magic of belonging.
Claydream Is an Inspiring Yet Cautionary Show-Biz Tale [Published articles]
Claydream, Marq Evans’ new documentary about animator Will Vinton, addresses the elephant in the room immediately: yes, this is the guy who lost his company to his most deep-pocketed investor, Nike founder Phil Knight. It’s something that looms over the film, but it’s not the only melancholy element that colors this portrait of Vinton’s life and career.
Made with the cooperation of Vinton himself, who died of cancer in 2018 but is interviewed extensively here, Claydream offers a visual history of his remarkable accomplishments. Not only do we get a look at the progression of Vinton’s work over the years (from Closed Mondays, the Oscar-winning 1974 short he created with Bob Gardiner, to his company’s instantly recognizable commercial work from the ‘80s and ‘90s, including the California Raisins), we also get access to home movies, as well as firsthand accounts from friends, family members, and former coworkers. After sparking to filmmaking while at UC Berkeley in the 1960s, Vinton (who prized experimentation and creative fulfillment above all else, and was definitely a bit of a hippie) set up a small workshop with his collaborators in Portland, Oregon, a location that kept their productions deliberately removed from the Hollywood machine—the same machine he’d end up pursuing years later, when Will Vinton Studios was at its peak.
Most of Claydream keeps the focus on Vinton’s work—again, this movie is a visual feast, jam-packed with clips and other ephemera (including answering-machine messages from a California Raisins-obsessed Michael Jackson) that illustrate the narrative of Vinton’s career every step of the way. But for all his success, and for the admirable way he bounced back from his periodic failures and missteps, he never achieved the heights of his idol, Walt Disney, whose life trajectory he emulated, down to plans for a never-realized “Claymation Station” amusement park. Though he was well-liked as a person, not everyone he worked with is full of praise; there were issues over the years of sharing credit with the other animators who toiled on his projects, as well as some bad business decisions that meant, for instance, that Will Vinton Studios didn’t share in the licensing for the insanely marketable California Raisins—and also that Vinton passed on selling his company to Pixar during its pre-Disney era. A contentious split with the troubled Gardiner soon after their shared Oscar win haunted Vinton until Gardiner’s death in 2005. But as Claydream amply illustrates, the Phil Knight debacle ended up being the biggest tragedy of Vinton’s creative life.
Neither Knight nor his son Travis Knight are interviewed in Claydream; we see them in deposition and archival footage only. Travis Knight, now a film director known for the stop-motion feature Kubo and the Two Strings as well as the live-action Transformers spin-off Bumblebee, comes off particularly badly just on the basis of the facts presented: a failed rapper, he was hired at Will Vinton Studios after his father invested in it, where he developed his (by all accounts) true talent and passion for animation. But there’s no escaping the “nepotism baby” aroma that envelops him in this context, especially when the documentary points out that he became head of Will Vinton Studios—renamed Laika—after Vinton, who was unable to rescue his financially struggling company, was pushed out.
It’s juicy show-biz stuff, for sure, but Vinton makes a point of turning what was obviously an incredibly devastating blow into something positive. Looking back several years after he lost his studio, he sounds genuinely proud of its continued success, specifically in the way that Laika—which has since become a Hollywood powerhouse with acclaimed titles like Coraline, ParaNorman, The Boxtrolls, Missing Link, and Knight’s Kubo—brought stop-motion to an ever-wider audience while innovating on the art form. It couldn’t have been easy for Vinton to make peace with the situation, but Claydream sure makes it seem like he was able to. Perhaps, as in his earliest days as a counterculture animator, it all came down to what really mattered: making an end product that was cool as it could possibly be. Even if Vinton wasn’t directly involved in any of Laika’s titles, his legacy lives on.
Claydream hits select theaters today, August 5.
Lego Kansas City Skyline Instructions [Published articles]
I’m Brandon Sanderson, a bestselling fantasy author who somehow produced the highest-funded Kickstarter campaign of all time. AMA! [Published articles]
I’m Brandon Sanderson, a bestselling fantasy author. Best known for The Stormlight Archive, Mistborn, and for finishing Robert Jordan’s The Wheel of Time, I’m now also known for having the highest-funded campaign in Kickstarter’s history for four books I wrote during the quarantine. If you want to stay up to date with me, you should check out my YouTube channel (where you can watch me give my answers to the questions below) and my Facebook, Twitter, and Instagram. Ask me any questions you like, but I’m less likely to answer questions with massive spoilers for the books. I’ll be taking questions today only.
EDIT: I'm off the livestream and have had some dinner. The transcription of some questions is still coming, as...well, I talk a lot. Those answers will be posted soon, or you can see them on the VOD of my answers on the YouTube channel.
Apologies for the stream-of-consciousness wall-of-text answers. This was a new thing for us, finding a way for me to be able to give answers for people while also getting piles of pages signed. I hope you can make sense of the sometimes rambling answers I give. They might flow better if you watch them be spoken.
Thanks, all, for the wonderful AMA. And as I said, some answers are still coming (and I might pop in and write out a few others that I didn't get to.)
We’re heading for a messy, and expensive, breakup with natural gas [Published articles]
Russia’s invasion of Ukraine has exacerbated a number of fault lines already present within the global energy supply chain. This is especially true in Europe, where many countries were reliant on Russian natural resources and are now hastily looking to cut ties before the supply is shut off. The crisis has revealed the fragility of Europe’s energy market and driven up demand and prices for consumers all over the globe.
In the UK, things are becoming increasingly dire and energy prices are skyrocketing. Bad planning on the infrastructure side and the cancellation of several major domestic energy efficiency programs are exacerbating the problem. It’s clear that real, useful action on the national level isn’t coming any time soon. So, I wondered, what would happen if I, personally, simply tried to break up with natural gas on my own? It’s relatively straightforward but, as it turns out, it comes at a cost that only one-percenters will be able to bear.
I live in a four-bedroom, end-terraced house that’s around 150 years old, and I’ve tried, as best as I can, to renovate it in an eco-friendly way. Since we bought it almost a decade ago, my wife and I have insulated most of the rooms and installed a new gas central heating system and hot water cylinder. We are, like nearly 20 million other households in the UK, reliant on natural gas for our home heating, hot water and cooking. And in the period between January 8th and April 7th, 2022, I was billed on the following usage:
[Table: electricity and gas usage and cost per unit (GBP), each including the standing charge, plus the total including tax and other charges; the figures were lost in this copy.]
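For readers unfamiliar with UK billing, a dual-fuel bill like the one above is typically assembled from a daily standing charge plus a per-kWh unit rate for each fuel, with VAT on domestic energy charged at the reduced 5% rate. A minimal sketch, with all rates and usage figures invented for illustration rather than taken from my actual tariff:

```python
# Sketch of how a UK dual-fuel quarterly bill is assembled: a daily
# standing charge plus a per-kWh unit rate for each fuel, then 5% VAT.
# Every rate and usage figure below is invented, not an actual tariff.
VAT_RATE = 0.05  # reduced VAT rate on UK domestic energy

def fuel_cost(kwh, unit_rate_gbp, days, standing_gbp_per_day):
    """Pre-VAT cost of one fuel over a billing period."""
    return kwh * unit_rate_gbp + days * standing_gbp_per_day

days = 90  # roughly January 8th to April 7th
electricity = fuel_cost(kwh=800, unit_rate_gbp=0.28,
                        days=days, standing_gbp_per_day=0.45)
gas = fuel_cost(kwh=4_500, unit_rate_gbp=0.07,
                days=days, standing_gbp_per_day=0.27)
total = (electricity + gas) * (1 + VAT_RATE)
print(round(total, 2))
```

One design note: the standing charges accrue every day regardless of consumption, which is part of why simply using less gas can only shrink a bill so far.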
Essentially, I paid around $1,300 for my natural gas and electricity in the first quarter of 2022. That figure is likely to rise significantly, as the UK’s mandatory price cap on energy rose by more than 50 percent in April. A further price rise is scheduled for October, with the figure set at £2,800 per year, even though wholesale energy prices are no longer increasing. It’s likely that my energy bill for the first quarter of 2023 will be nearly twice what I’ve just paid. In 2020, the UK reported that 3.16 million households were unable to pay for their energy costs; that figure is likely to leap by 2023.
In the US, the EIA says that monthly utility bills rose to a national average of $122 in 2021, with Hawaii ($178 per month) and Utah ($82 per month) the most expensive and cheapest states, respectively, to buy energy in. The average price per kWh is around 13.7 cents, less than half the comparable price in the UK as it currently stands. The average natural gas price for residential customers, meanwhile, was $10.84 per thousand cubic feet in 2020.
Much of Europe is reliant on natural gas, a significant proportion of which was supplied by Russia. Despite a rapid decline in domestic production, Europe sought to make natural gas the bedrock of its energy policy in the medium term. A 2013 policy paper written by Sami Andoura and Clémentine d’Oultremont outlined the reasons why officials were banking on it. “An economically attractive option for investors, a potential backup source for renewables and the cleanest fossil fuel, natural gas is expected to play an important role in the European transition towards a low-carbon economy by 2050.” This is despite the fact that “European energy resources are being depleted, and energy demand is growing.”
In 2007, then EU Energy Commissioner Andris Piebalgs said that the bloc was “dependent on imports for over one half of our energy use.” He added that energy security is a “European security issue,” and that the bloc was vulnerable to disruption. “In 10 years, from 1995 to 2005, natural gas consumption in the EU countries has increased from 369 billion to 510 billion m3 [of gas] per year,” he said. He added that the EU’s own production capacity and reserves peaked in the year 2000.
The EU’s plan was to pivot toward Liquified Natural Gas (LNG), methane which has been filtered and cooled to a liquid for easier transportation. It enables energy supplies from further afield to be brought over to Europe to satisfy the continent’s need for natural gas. But the invasion of Ukraine by Russia has meant that this transition has now needed to be accelerated as leaders swear off Russian-sourced gas and oil. And while the plan is to push more investment into renewables, LNG imports are expected to fill much of the gap for now.
Except, and this is crucial, many of the policy decisions made during this period seem to have been made in the belief that nothing bad would, or could, disrupt supply. Here in the UK, wholesale gas prices have risen fivefold since the start of 2021, but there’s very little infrastructure available to mitigate price fluctuations.
The Rough Field is a region in the North Sea situated 18 miles off the coast of Yorkshire, and was previously a source of natural gas for the UK. In 1985, however, it was converted into a natural gas storage facility with a capacity of 3.31 billion cubic meters. This one facility was able to fulfill the country’s energy needs for a little more than a week at a time and was considered a key asset to maintaining the UK’s energy security.
However, Centrica, the private company spun out of the former state-owned British Gas, opted to close the field in 2017. It cited safety fears and the high cost of repair as justification for the move, saying that alternative sources of gas – in the form of LNG – were available. At the time, one gas trader told Bloomberg that the closure would “boost winter prices” and “create seasonal swings in wholesale energy costs.” He added that the UK would now be “competing with Asia for winter gas cargoes,” raising prices and increasing reliance on these shipments.
And, unsurprisingly, the ramifications of this decision were felt in the summer of 2017, when a pair of LNG tankers from Qatar changed course. The vessels were bound for the UK, and when they diverted, Bloomberg reported that prices started to shift upward almost instantly.
Analysis from TransitionZero, reported by The Guardian, says that the costs associated with natural gas are now so high that it’s no longer worth investing in as a “transition fuel.” It says that the cost to switch from coal to gas is around $235 per ton of CO2, compared to just $62 for renewables as well as the necessary battery storage.
In order to break up with gas in my own home, I’ll need to swap out my stovetop (not so hard) and my whole central heating system (pretty hard). The former I can likely achieve for a few hundred dollars, plus or minus the cost of installation. (Some units just plug in to a standard wall socket, so I may be able to do much of the work myself if I’m feeling up to the task.) Of course, getting a professional to unpick the gas pipeline that connects to my stovetop is going to be harder.
Unfortunately, replacing a 35kW condensing gas boiler (I have the Worcester Bosch Greenstar 35CDi) is going to be a lot harder. The obvious choice is an Air Source Heat Pump (ASHP), or even a geothermal Ground Source Heat Pump (GSHP), both of which are more environmentally-friendly. After all, both are more energy-efficient than a gas boiler, and both run on electricity which is theoretically cleaner.
More generally, the UK’s Energy Saving Trust, a Government-backed body with a mission to advocate for energy efficiency, says that the average Briton should expect to pay between £7,000 and £13,000 to install an ASHP. Much of that figure is dependent on how much of your home’s existing hardware you’ll need to replace. A GSHP is even more expensive, with the price starting at £14,000 and rising to closer to £20,000 depending on both your home’s existing plumbing and the need to dig a bore hole outside.
In my case, heat pump specialists told me that, give or take whatever nasties were found during installation, I could expect to pay up to £27,000 ($33,493). This included a new ASHP, radiators, hot water and buffer cylinders, pumps, piping, controllers, parts and labor. Mercifully, the UK is launching a scheme to offer a £5,000 ($6,200) discount on any new heat pump installations. But that still means that I’m paying north of £20,000 (and ripping out a lot of existing materials with plenty of life left in them) to make the switch.
In the US, there’s plenty of difference on a state level, but at the federal level, you can get a tax credit on the purchase of a qualifying GSHP. A system installed before January 1st, 2023, will earn a 26 percent credit, while a unit running before January 1st, 2024, will be eligible for a 22 percent credit. Purchasers of a qualifying ASHP, meanwhile, were entitled to a $300 tax credit until the end of 2021.
The contractors also provided me with a calculation of my potential energy savings over the following seven years. It turns out that I’d actually be spending £76 more on fuel per year, or £532 over the whole period. On one hand, if I had the cash to spare, that’s a small price to pay to dramatically reduce my personal carbon emissions. On the other, I was hoping that the initial investment would help me reduce costs overall, but that’s not the case while gas remains (ostensibly) cheaper than electricity. (This will, of course, change as energy prices surge in 2023, but I can only look at the data as it presently stands.)
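A back-of-the-envelope version of that calculation shows why a heat pump can cost more to run than a boiler even while using far less energy. All of the unit prices, efficiencies and the heat-demand figure below are hypothetical assumptions for illustration; none of them come from the contractors' actual quote:

```python
# Back-of-the-envelope comparison of heating running costs: gas boiler vs
# air-source heat pump. All inputs are illustrative assumptions.

HEAT_DEMAND_KWH = 12000    # assumed annual heat demand of the house
GAS_PRICE = 0.07           # assumed GBP per kWh of gas
ELEC_PRICE = 0.28          # assumed GBP per kWh of electricity
BOILER_EFFICIENCY = 0.90   # condensing boiler: ~90% of gas energy becomes heat
HEAT_PUMP_COP = 3.0        # heat pump: ~3 kWh of heat per kWh of electricity

# Fuel needed = heat demand / efficiency; cost = fuel x unit price
gas_cost = HEAT_DEMAND_KWH / BOILER_EFFICIENCY * GAS_PRICE
heat_pump_cost = HEAT_DEMAND_KWH / HEAT_PUMP_COP * ELEC_PRICE

print(f"Gas boiler: £{gas_cost:,.0f}/year")
print(f"Heat pump:  £{heat_pump_cost:,.0f}/year")
print(f"Difference: £{heat_pump_cost - gas_cost:+,.0f}/year")
```

With these assumed prices, electricity costs four times as much per kWh as gas, so even a heat pump producing three units of heat per unit of electricity ends up slightly more expensive to run — the same dynamic the contractors' seven-year projection found.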
An aside: To be honest with you all, I was fully aware that the economic case for installing a heat pump was always going to be a shaky one. When I spoke to industry figures last year, they pointed out that the conversation around “payback” never comes up when installing standard gas boilers. It doesn’t help that, at present, levies on energy mean that natural gas is effectively subsidized relative to electricity, disincentivizing people from making the switch. The rise of electric cars, too, means that demand for power is going to increase sharply as more people switch, forcing greater investment in generation. What’s required just as urgently is a series of measures to promote energy efficiency, to reduce overall demand for both gas and electricity.
The UK has had an on-again, off-again relationship with climate change mitigation measures, which has helped sow the seeds of this latest crisis. The country, with low winter temperatures, relies almost exclusively on natural gas to heat its homes, its largest energy-consuming sector. As I reported last year, around 85 percent of UK homes are heated by burning natural gas in domestic boilers.
Work to reduce the UK’s extraordinary demand for natural gas was sabotaged by government in 2013. In 2009, under the previous Labour government, a series of levies on energy companies were introduced under the Community Energy Saving Programme. These levies were added to domestic energy bills, with the proceeds funding works to install wall or roof insulation, as well as energy-efficient heating systems and heating controllers for people on low incomes. The idea was to reduce demand for gas by making homes, and the systems that heated them, far more efficient since most of the UK’s housing stock was insufficiently insulated when built.
But in 2013, then-Conservative-Prime Minister David Cameron was reportedly quoted as saying that he wanted to reduce the cost of domestic energy bills by getting “rid of all the green crap.” At the time, The Guardian reported that while the wording was not corroborated by government officials, the sentiment was. Essentially, that meant scrapping the levies, which at the time GreenBusinessWatch said was around eight percent of the total cost of domestic energy. Cameron’s administration also scrapped a plan to build zero-carbon homes, and effectively banned the construction of onshore windfarms which would have helped reduce the cost of domestic electricity generation.
In 2021, the UK’s Committee on Climate Change examined the fallout from this decision, saying that Cameron’s decision kneecapped efforts to reduce demand for natural gas. As Carbon Brief highlighted at the start of 2022, in 2012, there were nearly 2.5 million energy efficiency improvements installed. By 2013, that figure had fallen to just 292,593. The drop off, the Committee on Climate Change believes, has caused insulation installations to fall to “only a third of the rate needed by 2021” to meet the national targets for curbing climate emissions.
Carbon Brief’s report suggests that the elimination of these small levies – the “green crap” – has cost UK households around £2.5 billion in missed savings. In recent years, a pressure group, Insulate Britain, has undertaken protests at major traffic intersections to highlight the need for a new retrofit program. The current government’s response to their pleas has been to call for tougher criminal penalties for protesters, including a jail term of up to six months.
Looking back through my energy bills over the last few years, my household’s annual electricity consumption is around 4,500kWh per year. A heat pump would likely add a further 6,000kWh to my energy bill, not to mention any additional cost for switching to all-electric cooking. It would be sensible to see if I could generate some, or all, of my own energy at home using solar panels to help reduce the potential bill costs.
The Energy Saving Trust says that the average homeowner can expect to pay £6,500 for a 4.2kWp system on the roof of their home. Environmental factors, such as the country you live in and the orientation of your property, mean you can’t be certain how much power you’ll get out of a specific solar panel, but we can make educated guesses. For instance, the UK’s Renewable Energy Hub says you can expect to get around 850kWh per year out of a 1kWp system. For a theoretical 5kWp system in my location, the Energy Saving Trust thinks I’ll be able to generate around 4,581kWh per year.
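That rule of thumb can be turned into a quick estimate of how much of a household's demand a given array might cover. The yield factor and the consumption figures are the ones cited in this piece; the system sizes are just examples:

```python
# Estimating annual solar generation from installed capacity, using the
# UK rule of thumb of roughly 850 kWh per year per kWp installed.

YIELD_KWH_PER_KWP = 850  # rule-of-thumb annual yield per kWp in the UK

def annual_generation(system_kwp: float) -> float:
    """Estimated annual generation in kWh for a system of the given size."""
    return system_kwp * YIELD_KWH_PER_KWP

# Household demand: ~4,500 kWh of current electricity use plus an
# estimated 6,000 kWh added by a heat pump.
household_kwh = 4500 + 6000

for kwp in (4.2, 5.0, 6.0):
    gen = annual_generation(kwp)
    print(f"{kwp} kWp system: ~{gen:,.0f} kWh/yr, "
          f"covering {gen / household_kwh:.0%} of demand")
```

Note that the Energy Saving Trust’s location-specific estimate of 4,581kWh for a 5kWp system is a little more optimistic than this flat 850kWh-per-kWp rule, which gives 4,250kWh; either way, even a 6kWp array would cover only around half of an all-electric household’s demand.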
Sadly, I live in an area where, even though my roof is brand new and strong enough to take panels, they aren’t allowed. This is because it is an area of “architectural or historic interest where the character and appearance [of the area] needs to be protected or improved.” Consequently, I needed to explore work to ground-mount solar panels in my back garden, which gets plenty of sunlight.
While I expected ground-mounted panel installations to be much cheaper, they apparently aren’t. Two contractors I spoke to said that while their average roof-based installation is between £5,000 and £7,000, a 6kWp system on the ground would cost closer to £20,000. It would, in fact, be cheaper to build a sturdy shed in the bit of back yard I had my eye on and install a solar system on top of it than to get the mounting set up on the ground. That’s likely to spool out the cost even further, and that’s before we get to the question of battery storage.
For this rather nifty thought experiment, the cost for me to be able to walk away from natural gas entirely would be north of £30,000 ($37,000). Given that the average UK salary is roughly £38,000, it’s a sum that is beyond the reach of most people without taking out a hefty loan. This is, fundamentally, why the need for government action is so urgent, since it is certainly beyond the ability of most people to achieve this change on their own.
In fact, it’s going to require significant movement from central government not just in the UK but elsewhere to really shake our love-hate relationship with natural gas. Unfortunately, given that it’s cheap, cleaner than coal and the energy lobby has plenty of muscle behind it, that’s not likely to happen soon. And so we’re stuck in a trap – it’s too expensive to do it ourselves (although that’ll certainly be an interesting experiment to undertake) and there’s no help coming, despite the energy crisis that’s unfurling around us.
Climate scientists reconsider the meaning and implications of drought in light of a changing world [Published articles]
Maps of the American West have featured ever darker shades of red over the past two decades. The colors illustrate the unprecedented drought blighting the region. In some areas, conditions have blown past severe and extreme drought into exceptional drought. But rather than add more superlatives to our descriptions, one group of scientists believes it's time to reconsider the very definition of drought.
QDB - 7634 [Published articles]
<SimonSapin> nox: the history of packaging in python is
<nox> SimonSapin: All I need to know is, is setuptools old stuff or new stuff?
<SimonSapin> nox: its been both
<SimonSapin> in that order
The History and Politics of Wuxia [Published articles]
I first fell in love with wuxia when I was around eight or so. I remember running around swinging the bright yellow handle of my toy broom as a sword, calling a sprawling tiger stuffed toy my master and pretending the shower was a waterfall I could learn the secrets of the universe under. I ran on tiptoe because that was somehow more like flying—or “hing gung” 輕功, the art of lightness, as I would eventually become fond of translating it.
But even before then I was deeply familiar with the genre; its many conventions have become baked into the everyday language of the Hong Kong I grew up in. My relatives all played Mahjong and, much like with sports, discussions around these games borrowed heavily from the language of sparring martial artists. I’d ask, at the end of every Sunday, what the results of the battles were. When asking for a family recipe, someone would joke that they’d have to become the apprentice of this or that auntie. Later, there was the world of study guides and crib sheets, all calling themselves secret martial arts manuals. The conventions around martial artists going into seclusion to perfect their craft, and going mad in the pursuit of it, take on new meaning as slang for cramming for exams.
Which is all to say, I really love wuxia.
“Wuxia”, literally meaning “martial hero”, is a genre about martially powerful heroes existing in a world parallel to, and in the shadows of, Chinese imperial history.
The archetypal wuxia hero is someone carving out their own path in the world of rivers and lakes, cleaving only to their own personal code of honour. These heroes are inevitably embroiled in personal vengeance and familial intrigue, even as they yearn for freedom and seek to better their own skills within the martial arts. What we remember of these stories are the tournaments, the bamboo grove duels and the forbidden love.
Parallels are often drawn to knights errant of medieval romances, with many older translations favouring a chivalric vocabulary. There are also obvious comparisons to be made with the American western, especially with the desperados stumbling into adventures in isolated towns in search of that ever-elusive freedom.
It is easy to think of wuxia in these universal terms with broad themes of freedom, loyalty and justice, but largely divorced from contemporary politics. These are stories, after all, that are about outlaws and outcasts, existing outside of the conventional hierarchies of power. And they certainly do have plenty to say about these big universal themes of freedom, loyalty and justice.
But this is also a genre that has been banned by multiple governments within living memory. Its development continues to happen in the shadows of fickle Chinese censorship and at the heart of it remains a certain defiant cultural and national pride intermingled with nostalgia and diasporic yearning. The vast majority of the most iconic wuxia texts are not written by Chinese authors living comfortably in China, but by a dreaming diaspora amid or in the aftermath of vast political turmoil.
Which is all to say that the world of wuxia is fundamentally bound up with those hierarchies of power it seeks to reject. Much like there is more to superheroes than dorky names, love triangles, and broad universal ideals of justice, wuxia is grounded in the specific time and place of its creation.
Biography of Old Dragon-beard (虯髯客傳) by Du Guangting (杜光庭, 850-933) is commonly cited as the first wuxia novel. It chronicles the adventures of the titular Old Dragon-beard, who, along with the lovers Hongfu 紅拂 and Li Jing 李靖, makes up the Three Heroes of the Wind and Dust. But the story isn’t just supernatural adventure; the three also help Li Shimin 李世民 found the Tang Dynasty (618–906). The martial prowess and the seemingly eccentric titles of the characters aside, the act of dynastic creation is unavoidably political. 虯髯客傳 pivots around Hongfu’s ability to discern the true worth of a man, which leads her to abandon her prior loyalties and cleave to Li Jing and his vision for a better empire. Not to mention that Du wrote this and many of his other works whilst in exile with the Tang imperial court in the south, after rebels sacked the capital and burnt his books. Knowing this, it is difficult not to see Du as mythologising the past into a parable of personal resonance: perhaps he too was making decisions about loyalties and legacies, about which court or emperor he should stay with, asking himself if the Tang would indeed rise again (as he himself, as a taoist, had prophesied).
Other commonly cited antecedents to the modern wuxia genre are 14th-century classics like Romance of the Three Kingdoms (三國演義) and Outlaws of the Marsh (水滸傳). The former is all about the founding of dynasties and gives the Chinese language the now ubiquitously cited line: The empire, long divided, must unite; long united, must divide. Thus it has ever been (话说天下大势．分久必合，合久必分).
Revolutionaries, Rebels and Race in the Qing Dynasty
No era of imperial China was in possession of a “free press”, but the literary inquisitions under the Qing Dynasty (1644–1911) were particularly bloody and thorough. The Manchu elite suppressed any openly revolutionary sentiment in fiction, however metaphorical, and what was written instead was a literature that sublimated much of that discontent into historical fiction nostalgic for the eras of Han dominance. Wandering heroes of the past were refashioned into a pariah elite, at once marginalised from mainstream society and superior to it, thanks to their taoist-cultivated powers.
Whilst earlier quasi-historical epics and supernatural tales are replete with gods and ghosts, late Qing wuxia begins to shed these entities and instead grounds itself in a world where taoist self-cultivation grants immense personal powers but not divinity itself. In each of the successive reprintings of Three Heroes and Five Gallants (三俠五義), editors pruned the text of anachronisms and supernatural flourishes.
The parallel world of secret societies, foreign cults, bickering merchants and righteous martial clans came to be known as jianghu, literally “rivers and lakes”. As a metaphor, it was first coined by the taoist philosopher Zhuangzi 莊子 to describe a utopian space outside of cutthroat court politics, career ambitions and even human attachments. This inspired subsequent generations of literati in their pursuit of aesthetic hermitism, but the jianghu we know today also derives from the waterways that formed the key trade routes during the Ming Dynasty (1368–1644). To the growing mercantile classes, jianghu referred to the actual rivers and canals traversed by barges heavy with goods and tribute, a byname for the prosperous Yangtze delta.
These potent lineages of thought intermingle into what jianghu is within martial arts fiction today: that quasi-historical dream time of adventure. But there is also another edge to it. In Stateless Subjects: Chinese Martial Arts and Postcolonial History, Petrus Liu translates jianghu as “stateless”, which further emphasizes the hero’s rejection of, and by, the machineries of government. Jianghu is thus a world that rejects the dictates of the state in favor of divine virtue and reason, but also of a sense of self created through clan and community.
The name of the genre, wuxia (“武俠“), comes from Japanese, where a genre of martially-focused, bushido-inspired fiction called bukyō (“武侠”) was flourishing. It was brought into Chinese by Liang Qichao 梁启超, a pamphleteer writing in political exile in Japan, seeking to reawaken what he saw as Han China’s slumbering and forgotten martial spirit. In his political work, he holds up the industrialisation and militarisation of Meiji Japan (and its subsequent victory against Russia) as inspiration, and seeks a similar restoration of racial and cultural pride for the Han people, to be the “master of the Continent” above the hundreds of different races who have settled in Asia.
Wuxia is fundamentally rooted in these fantasies of racial and cultural pride. Liang Qichao’s visions of Han exceptionalism were a response to subjugation under Manchu rule and Western colonialism, a martial rebuttal to the racist rhetoric of China being the “Sick Man of Asia”. But it is still undeniably ethno-nationalism built around the descendants of the Yellow Emperor conquering again the continent that is their birthright. Just as modern western fantasy has as its bones the nostalgia for a pastoral, premodern Europe, wuxia can be seen as a dramatisation of Sinocentric hegemony, where taoist cultivation grants power and stalwart heroes fight against an ever-barbaric, ever-invading Other.
Dreams of the Diaspora
Jin Yong 金庸 remains synonymous with the genre of wuxia in Chinese, and his foundational mark on it cannot be overstated. His Condor Trilogy (射鵰三部曲) was serialised between 1957 and 1963 and concerns three generations of heroes during the turbulent 12th and 13th centuries. The first novel follows a pair of sworn brothers, one loyal and righteous, the other clever and treacherous; their friendship deteriorates as the latter falls into villainy, scheming with the Jin Empire (1115–1234) to conquer his native land. The second follows their respective children repeating and atoning for the mistakes of their parents whilst the Mongols conquer the south. The last charts the rivalries among martial artists fighting over two peerless weapons, whilst its hero leads his secret society to overthrow the Yuan Dynasty (1271–1368).
It’s around here that English articles about him start comparing him to Tolkien, and it’s not wholly unjustified, given how both created immensely popular and influential legendaria that draw heavily upon ancient literary forms. Entire genres of work have sprung up around them and even subversions of their work have become themselves iconic. Jin Yong laid down what would become the modern conventions of the genre, from the way fights are imagined with discrete moves, to the secret martial arts manuals and trap-filled tombs.
Unlike Tolkien, however, Jin Yong’s work is still regularly (even aggressively) adapted. There are in existence nine tv adaptations of each instalment of the Condor Trilogy, for example, as well as a video game and a mobile game. And at time of writing, eight feature films and nine tv series based on his work are in production.
But Jin Yong’s work was not always so beloved by mainland Chinese audiences. For a long time he, along with the rest of wuxia, was banned, and the epicentre of the genre was colonial Hong Kong. It is a detail often overlooked in the grand history of wuxia, so thoroughly has the genre been folded into contemporary Chinese identity. It is hard at times to remember how much of the genre was created by these artists in exile. Or perhaps that is the point: as Hong Kong’s own unique political and cultural identity is being subsumed into that of the People’s Republic, so too is its literary legacy. Literalist readings of his work as being primarily about historical martial artists defang the political metaphors and pointed allegories.
Jin Yong’s work is deeply political. Even in the most superficial sense, his heroes intersect with the politics of their time, joining revolutionary secret societies, negotiating treaties with Russia and fighting against barbarian invaders. They are bound up in the temporal world of hierarchy and power. Legend of the Condor Hero (射鵰英雄傳)’s Guo Jing 郭靖 becomes the sworn brother to Genghis Khan’s son, Tolui, and joins the Mongol campaign against the Khwarezmid Empire. Book and Sword (書劍恩仇錄)’s Chen Jialuo 陳家洛 is secretly the Qianlong Emperor’s half brother. The Deer and the Cauldron (鹿鼎記)’s Wei Xiaobao 韋小寶 is both best friends with the Kangxi Emperor and also heavily involved in a secret society dedicated to overthrowing the aforementioned emperor. Even Return of the Condor Hero (神鵰俠侶)‘s Yang Guo 楊過 ends up fighting to defend the remains of the Song Empire against the Mongols.
But it goes deeper than that. Jin Yong was a vocal critic of the Cultural Revolution, penning polemics against Mao Zedong and the Gang of Four during the late 60s. Beyond the immediate newspaper coverage, Jin Yong edited and published many more works both documenting and dissecting the Cultural Revolution.
Jin Yong described himself as writing every day one novel instalment and one editorial against the Gang of Four. Thus did they bleed together, the villains of Laughing in the Wind (笑傲江湖) becoming recognisable caricatures as it too rejected senseless personality cults.
In this light, his novels seem almost an encyclopaedia of traditional Chinese culture, its values and virtues, a record of it to stand bulwark against the many forces that would consign it all to oblivion. It is a resounding rebuttal to the May Fourth Movement principle that modernisation and westernisation are equivalent. To Jin Yong, the old and the traditional were valuable, and it is from them that we must build our new literature.
Taken together, Jin Yong’s corpus offers an alternate history of the Han people spanning over two thousand years from the Eastern Zhou (771–256 B.C.) to the Qing Dynasty (1644–1911). He fills in the intriguing gaps left in official records with folk heroes, court gossip and conspiracy theories. His text is dense with literary allusions and quotations from old Chinese poems.
His stories are almost all set during times of turmoil, when what can be termed “China”, or at least the Han people, is threatened by barbarian invasion and internal corruption; pivotal moments in history that make heroes and patriots out of ordinary men and women. All this Jin Yong immortalises with a deep yearning for a place and past that never quite was; nostalgia in the oldest sense of the word, with all the pain and pining and illusion that it implies.
It is arguably this very yearning, this conjuring of a real and relevant past from dry history books, that makes Jin Yong’s work so endlessly appealing to the Chinese diaspora, as well as to the mainland Chinese who emerged from the Cultural Revolution. This alternate history dramatises the complexities of Han identity, all the times it has been threatened, disrupted and diluted in history, but at the same time it offers hope and heroics. These were stories as simple or as complex as the reader wanted them to be.
Chinese Imperialism and Han Hegemony
It is sometimes hard to remember that Jin Yong, and all the rest of wuxia, were once banned in the People’s Republic of China, so thoroughly has his work now been embraced. As late as the 1990s, Jin Yong was decried as one of the “Four Great Vulgarities of Our Time” (alongside the four heavenly kings of cantopop, Jackie Chan and sappy Qiong Yao romances).
In recent decades, the CCP has rather dramatically changed its relationship with the past. The censorship machine is still very active, but it does not have in its crosshairs the decadent and feudal genre of wuxia (though there have been exceptions, especially during the run-up to the Republic’s 70th anniversary, when all frivolous dramas were put on pause; it is important to remember that the censors are not always singular or consistent in their opinions). But more importantly, the Party no longer draws power from a radical rejection of the past; instead the past is embraced utterly, celebrated at every turn. Traditionalism now forms a core pillar of the Party’s legitimacy, with all five thousand years of that history validating its rule. The State now actively promotes all those superstitions and feudal philosophies it once held in contempt.
Along with this shifting use of history to inspire nationalism, Jin Yong has been rehabilitated and canonised. It’s arguably that revolutionary traditionalism — that he was preserving history in a time of its destruction — that makes him so easy to rehabilitate. Jin Yong’s work appeals to the conservative mind with its love of tradition and patriotic themes, but also to rebels with its love of outlaw heroes.
It isn’t that these stories have nothing to say about freedom or justice in a more abstract or universal sense, but that they are also very much about the specifics of Han identity and nationalism. Jin Yong’s heroes often find themselves called to patriotism: even as they navigate their complex or divided loyalties, they must defend “China”, in whatever form it exists at the time, against barbaric, alien invaders. Even as they function as straightforward stories of nationalistic defence, they also dramatise disruptions of a simplistic or pure Chinese identity, foregrounding characters from marginalised (if also often exoticised) ethnicities and religions.
Jin Yong’s hero Guo Jing is Han by birth and Mongol by adoption. He ultimately renounces his loyalty to Genghis Khan and returns to his Han homeland to defend it from Mongol conquest. Whilst one can read Jin Yong’s sympathy and admiration for the Mongols as an attempt to construct an inclusive nationalism for modern China, Guo Jing’s participation as a Han hero in the conquest of Central Asia also functions as a justification of modern Han China’s political claim on that imperial and colonial legacy.
Book and Sword renders this even more starkly, as it feeds the popular Han fantasy that the Kangxi Emperor is not ethnically Manchu but, instead, a Han changeling. He is forced by the hero of the novel, Chen Jialuo, to swear an oath to acknowledge his Han identity and overthrow the Manchus but, of course, he then betrays them and subjugates not only the Han but also the “Land of Wei” (now known as Xinjiang, where the genocide is happening). Still, there is something to be said about how this secret-parentage plot attributes the martial victories of the Qing to Han superiority and justifies the Han inheritance of former Qing colonies.
The Uyghur tribes are portrayed with sympathy in Book and Sword. They are noble and defiant and devout. Instead of savages who need to be brought to heel, they are fellow resistance fighters. The novel gestures towards an inclusive national identity, one in which Han and Uyghur are united by their shared suffering under Manchu rule. It can also be argued that their prominence disrupts the ideal of a pure Han-centric Chineseness. But what good is inclusion and unity to those who do not want to be part of that nation? Uyghurs, being a people suffering occupation, actively reject the label of "Chinese Muslims".
Furthermore, the character of Kasili in Book and Sword, based on the legend of the Fragrant Concubine, is drenched in orientalist stereotype. Chen first stumbles upon her bathing naked in a river, her erotic and romantic availability uncomfortably paralleling that of her homeland. When the land of Wei falls to the emperor's sword and Kasili is taken as a concubine, she remains loyal to the Han hero she fell in love with, ultimately killing herself to warn Chen of the emperor's duplicity. Conquest and imperial legacy are thus dramatised as a love triangle between a Uyghur princess, a Han rebel and a Manchu emperor.
Chen, it should be noted, falls in love and marries a different Uyghur princess for his happy ending.
Amid other far more brutal policies meant to forcibly assimilate and eradicate Uyghur identity, the PRC government encouraged Han men to take Uyghur women as wives. Deeply unpleasant adverts still available online extolled the beauty and availability of Uyghur women, as something and somewhere to be conquered. It is impossible not to be reminded of this when reading about the beautiful and besotted Kasili.
There is no small amount of political allegory to be read between the lines of Jin Yong, something he became increasingly frank about towards the end of his life. The Condor Trilogy, with its successive waves of northern invaders, can be read as echoing the Communist takeover of China. The success of Wei Xiaobao's affable cunning can be read as a satire on the hollowness of materialistic 70s modernity. But Jin Yong himself proved far less radical than his books, siding with the conservative anti-democracy factions within Hong Kong during the Handover.
In a 1994 interview, Jin Yong argued against the idea that China was ever under "foreign rule," proposing instead that the many ethnic groups within China are simply taking turns at being in the ascendant. All wars are thus civil wars, and he neatly aligned his novels with the current Chinese policies that oppress in the name of unity, harmony and assimilation, of "inclusive" nationalism.
The legacy of Jin Yong is a complex one. His work, like all art, contains multitudes and can sustain any number of seemingly contradictory interpretations. That is what is beautiful about art. But I cannot help but feel that his rapid canonisation over the last few decades in mainland China is a stark demonstration of how easily those yearning dreams of the diaspora can become nationalistic fodder.
I did not come to bury wuxia, but to praise it. I wanted to show you a little bit of its complexities and history, as well as the ideals and ideologies that simmer under its surface.
For me, I just think it is too easy to see wuxia as a form of salvation. Something to sustain and inspire me in a media landscape hostile to people who look like me. To give me the piece of me that I have felt missing, to heal a deep cultural wound. After all, Hollywood or broader Anglophone media might be reluctant to make stories with Asian protagonists, but I can turn to literally all of wuxia. American TV series won’t make me a fifty episode epic about two pretty men eyefucking each other that also has a happy ending, but I will always have The Untamed.
It's this insidious feeling of hope. That this genre is somehow wholly "unproblematic" because I am reconnecting with my cultural roots, that it can nourish me. That it can be safe that way. It is, after all, untouched by all the problematic elements of the Anglophone mainstream that I have analysed to death and back. That it is some sort of oasis, untouched by colonialism and western imperialism. That it therefore won't or can't have that taint of white supremacy; it's not even made by white people.
Perhaps it is just naive of me to have ever thought these things, however subconsciously. Articulating it now, it’s ridiculous. Han supremacy is a poisonous ideology that is destroying culture, hollowing out communities and actively killing people. In the face of its all-consuming genocide-perpetuating ubiquity, the least I can do is recognise its presence in a silly little genre I love. It just doesn’t seem too much to ask.
Jeannette Ng is originally from Hong Kong but now lives in Durham, UK. Her MA in Medieval and Renaissance Studies fed into an interest in medieval and missionary theology, which in turn spawned her love for writing gothic fantasy with a theological twist. She runs live roleplay games and is active within the costuming community, running a popular blog. Jeannette has been a finalist for the John W. Campbell Award for Best New Writer and the Sydney J Bounds Award (Best Newcomer) in the British Fantasy Awards 2018.
Report Finds Phone Network Encryption Was Deliberately Weakened [Published articles]
A weakness in the algorithm used to encrypt cellphone data in the 1990s and 2000s allowed hackers to spy on some internet traffic, according to a new research paper. Motherboard: The paper has sent shockwaves through the encryption community because of what it implies: the researchers believe that the mathematical probability of the weakness being introduced by accident is extremely low. Thus, they speculate that the weakness was intentionally put into the algorithm. After the paper was published, the group that designed the algorithm confirmed this was the case.

Researchers from several universities in Europe found that the encryption algorithm GEA-1, which was used in cellphones when the industry adopted the GPRS standards in 2G networks, was intentionally designed to include a weakness that at least one cryptography expert sees as a backdoor. The researchers said they obtained two encryption algorithms, GEA-1 and GEA-2, which are proprietary and thus not public, "from a source." They then analyzed them and realized they were vulnerable to attacks that allowed for decryption of all traffic.

When trying to reverse-engineer the algorithm, the researchers wrote that (to simplify) they tried to design a similar encryption algorithm using a random number generator often used in cryptography, and never came close to creating a scheme as weak as the one actually used: "In a million tries we never even got close to such a weak instance," they wrote. "This implies that the weakness in GEA-1 is unlikely to occur by chance, indicating that the security level of 40 bits is due to export regulations."

The researchers dubbed the attack "divide-and-conquer," and said it was "rather straightforward." In short, the attack allows someone who can intercept cellphone data traffic to recover the key used to encrypt the data and then decrypt all traffic. The weakness in GEA-1, the older of the two algorithms, developed in 1998, is that it provides only 40-bit security.
That's what allows an attacker to get the key and decrypt all traffic, according to the researchers.
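To see why an effective security level of 40 bits is so damning, some back-of-the-envelope arithmetic helps. The sketch below is not the researchers' divide-and-conquer attack itself; it is a minimal illustration, under an assumed (hypothetical) trial rate of one billion key guesses per second on a single machine, of how a 2^40 keyspace compares to the 2^64 one the cipher nominally offered.

```python
# Rough feasibility estimate for exhaustive key search.
# RATE is an assumption for illustration, not a measured figure.

def brute_force_seconds(effective_bits: int, guesses_per_second: float) -> float:
    """Worst-case time to try every key in a 2**effective_bits keyspace."""
    return (2 ** effective_bits) / guesses_per_second

RATE = 1e9  # assumed: 10^9 key trials per second

t40 = brute_force_seconds(40, RATE)  # roughly 1100 seconds
t64 = brute_force_seconds(64, RATE)  # hundreds of years

print(f"40-bit keyspace: about {t40 / 60:.0f} minutes")
print(f"64-bit keyspace: about {t64 / 3.15e7:.0f} years")
```

At the assumed rate, a 40-bit keyspace falls in under twenty minutes, while 64 bits would take centuries; the actual published attack is far cheaper still, since it exploits the algorithm's structure rather than searching blindly.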
Read more of this story at Slashdot.
An Indian art form called Rangoli [Published articles]
|submitted by /u/Hades0504 to