u/PoopMobile9000 explains the history of the US government's "debt ceiling" and how an outdated procedural formality is exploited for political gain with potentially catastrophic effects on the economy
Long story short, after realising that my IRL friends and I wanted different things at the gaming table, I've bounced around from Discord server to Discord server, running a handful of games here and there but moving on after a month or so.
Thing is, this is getting a bit knackering, and I'd like to start building a stable group. I tried reaching out to some of the people I'd campaigned with previously to see if they'd be interested, and got noncommittal non-answers.
So reddit, if you were in my position, how would you go about building a steady playgroup?
To hear the first results from the James Webb Telescope, 200 astronomers descended on the Space Telescope Science Institute for three days in December, reports the New York Times, with an update on what may be 2022's biggest science story. The $10 billion telescope "is working even better than astronomers had dared to hope" -- and astronomers are ecstatic:

At a reception after the first day of the meeting, John Mather of NASA's Goddard Space Flight Center and Webb's senior project scientist from the start, raised a glass to the 20,000 people who built the telescope, the 600 astronomers who had tested it in space and the new generation of scientists who would use it. "Some of you weren't even born when we started planning for it," he said. "Have at it!"

Launched on Christmas one year ago, the Webb telescope "is seven times as powerful as its predecessor, the Hubble Space Telescope," the Times reports -- sharing what was revealed in that auditorium in December:

One by one, astronomers marched to the podium and, speaking rapidly to obey the 12-minute limit, blitzed through a cosmos of discoveries. Galaxies that, even in their relative youth, had already spawned supermassive black holes. Atmospheric studies of some of the seven rocky exoplanets orbiting Trappist-1, a red dwarf star that might harbor habitable planets. (Data suggest that at least two of the exoplanets lack the bulky primordial hydrogen atmospheres that would choke off life as we know it, but they may have skimpy atmospheres of denser molecules like water or carbon dioxide.) "We're in business," declared Bjorn Benneke of the University of Montreal, as he presented data on one of the exoplanets. Megan Reiter of Rice University took her colleagues on a "deep dive" through the Cosmic Cliffs, a cloudy hotbed of star formation in the Carina constellation, which was a favorite early piece of sky candy.
She is tracing how jets from new stars, shock waves and ionizing radiation from more massive nearby stars that were born boiling hot are constantly reshaping the cosmic geography and triggering the formation of new stars. "This could be a template for what our own sun went through when it was formed," Dr. Reiter said in an interview.

Between presentations, on the sidelines and in the hallways, senior astronomers who were on hand in 1989 when the idea of the Webb telescope was first broached congratulated one another and traded war stories about the telescope's development. They gasped audibly as the youngsters showed off data that blew past their own achievements with the Hubble. The telescope is a joint project of NASA, the European Space Agency and the Canadian Space Agency.

And appropriately for New Year's Eve, the article concludes with a look to the future: Thus far the telescope, bristling with cameras, spectroscopes and other instruments, is exceeding expectations. (Its resolving power is twice as good as advertised.) The telescope's flawless launch, reported Dr. Jane Rigby, Webb's operations project scientist, has left it with enough maneuvering fuel to keep it working for 26 years or more. "These are happy numbers...."

The closing talk fell to Dr. Mather. He limned the telescope's history and gave a shout-out to Barbara Mikulski, the former senator from Maryland, who supported the project in 2011 when it was in danger of being canceled. He also previewed NASA's next big act: a 12-meter space telescope called the Habitable Worlds Observatory that would seek out planets and study them.
Read more of this story at Slashdot.
There’s a lot of Star Trek, and a lot of it is very watchable—especially in the iconic sophomore series The Next Generation, which helped truly catapult the franchise into the pop culture stratosphere. Asking a Trek neophyte to dive into over five days of TV is a daunting task, however. So why not just give them an appetizer of everything?
This incredible mash-up of all 178 episodes of Star Trek: The Next Generation by Sentinel of Something condenses the seven seasons of boldly going done by Captain Jean-Luc Picard and the crew of the U.S.S. Enterprise into a little over nine minutes... by giving you three random seconds of every episode. It’s unhinged and it’s perfect.
There’s an artistry to the consideration here. Do you pick an iconic visual, a perfect but short enough line of dialogue, a joke, a sad moment, or a shot of action? Just how do you distill an entire episode of TNG, from the very best to the very worst, into just three seconds? The answer is that you don’t take Star Trek too seriously, so what you get is three manic seconds of out-of-context weirdness, 178 times in a row.
Okay, it’s probably not helpful to a Star Trek newbie looking to shave some time off of a marathon. But for TNG fans, it’s a delightfully zany whirlwind trip through one of the best sci-fi TV shows of all time.
LastPass revealed today that attackers stole customer vault data after breaching its cloud storage earlier this year using information stolen during an August 2022 incident. BleepingComputer reports:

This follows a previous update issued last month, when the company's CEO, Karim Toubba, said only that the threat actor gained access to "certain elements" of customer information. Today, Toubba added that the cloud storage service is used by LastPass to store archived backups of production data. The attacker gained access to LastPass' cloud storage using a "cloud storage access key and dual storage container decryption keys" stolen from its developer environment.

"The threat actor copied information from backup that contained basic customer account information and related metadata including company names, end-user names, billing addresses, email addresses, telephone numbers, and the IP addresses from which customers were accessing the LastPass service," Toubba said today. "The threat actor was also able to copy a backup of customer vault data from the encrypted storage container which is stored in a proprietary binary format that contains both unencrypted data, such as website URLs, as well as fully-encrypted sensitive fields such as website usernames and passwords, secure notes, and form-filled data."

Fortunately, the encrypted data is secured with 256-bit AES encryption and can only be decrypted with a unique encryption key derived from each user's master password. According to Toubba, the master password is never known to LastPass and is not stored on its systems. Customers were also warned that the attackers might try to brute-force their master passwords to gain access to the stolen encrypted vault data. However, this would be very difficult and time-consuming if you've been following the password best practices recommended by LastPass.

If you have, "it would take millions of years to guess your master password using generally-available password-cracking technology," Toubba added. "Your sensitive vault data, such as usernames and passwords, secure notes, attachments, and form-fill fields, remain safely encrypted based on LastPass' Zero Knowledge architecture."
New scientific records are set every year, and 2022 was no exception. A bacterial behemoth, a shockingly speedy supercomputer and a close-by black hole are among the most notable superlatives of the year.
The first known surgical operation was a leg amputation (SN: 10/8/22 & 10/22/22, p. 5). That’s the conclusion researchers came to after investigating the skeleton of a person who lived on the Indonesian island of Borneo about 31,000 years ago. Healed bone where the lower left leg had been removed suggests the individual survived for several years after the procedure. The discovery pushes surgery’s origin back by some 20,000 years.
Bacteria normally dwell in the microscopic world. Not Thiomargarita magnifica. Averaging about a centimeter long, this newfound bacterium is visible to the naked eye (SN: 7/16/22 & 7/30/22, p. 17). T. magnifica, which lives in the mangrove forests of the Caribbean’s Lesser Antilles, is about 50 times larger than other species of big bacteria and about 5,000 times larger than typical bacteria. Why this species evolved into such a giant is unknown.
A supercomputer named Frontier crunched numbers with mind-blowing speed this year: 1.1 quintillion operations per second (SN Online: 6/1/22). That makes the machine, run by Oak Ridge National Laboratory in Tennessee, the first exascale computer — a computer that can perform at least 10^18 operations per second. The next fastest computer tops out at 442 quadrillion (that’s 10^15) operations per second. Exascale computing is expected to lead to breakthroughs in everything from climate science to health to particle physics.
Deep off the coast of Antarctica, icefish congregate in a breeding colony as big as Orlando, Fla. Some 60 million nests of Jonah’s icefish (Neopagetopsis ionah) stretch across at least 240 square kilometers of seafloor (SN: 2/12/22, p. 12). Previously, nest-building species of fish were known to gather in only the hundreds. An abundant food supply and access to a zone of unusually warm water may explain the exceptionally large group.
By sifting through data released by the Gaia spacecraft, astrophysicists discovered a black hole that’s just over 1,560 light-years from Earth (SN Online: 11/4/22). Dubbed Gaia BH1, it’s about twice as close as the previously nearest known black hole. But that record may not stand. About 100 million black holes are predicted to exist in the Milky Way. Since most are invisible, they’re hard to find. But when Gaia, which is precisely mapping a billion stars, releases its next batch of data in a few years, even closer black holes may turn up.
A research team has linked the mutation that causes Huntington's disease to metabolic changes that impair development of the brain's oligodendrocyte cells. They found that high doses of thiamine and biotin can restore normal processes.
What keeps bones able to remodel themselves and stay healthy? A team has discovered clues to the key function of non-collagen protein compounds and how they help bone cells react to external load. The scientists used fish models to examine bone samples with and without bone cells to elucidate differences in microstructures and the incorporation of water. Using 3D neutron tomography, they succeeded for the first time in precisely measuring water diffusion across bone material -- with a surprising result.
While the stunning images from the James Webb Space Telescope captured space fans’ attention this year, other telescopes and spacecraft were busy on Earth and around the solar system (SN Online: 12/7/22). Here are some of the coolest space highlights that had nothing to do with JWST.
After several aborted attempts, NASA launched the Artemis I mission on November 16. That was a big step toward the goal of landing people on the moon as early as 2025 (SN: 12/3/22, p. 14). No human has set foot there since 1972. Artemis I included a new rocket, the Space Launch System, which had previously suffered a series of hydrogen fuel leaks, and the new Orion spacecraft. No astronauts were aboard the test flight, but Orion carried a manikin in the commander’s seat and two manikin torsos to test radiation protection and life-support systems, plus a cargo hold full of small satellites that went off on their own missions. On December 11, the Orion capsule successfully returned to Earth, splashing down in the Pacific Ocean near Mexico (SN Online: 12/12/22).
NASA’s DART spacecraft successfully nudged an asteroid into a new orbit this year. On September 26, the Double Asteroid Redirection Test slammed into asteroid Dimorphos, about 11 million kilometers from Earth at the time of impact. In October, NASA announced that the impact shortened Dimorphos’ roughly 12-hour orbit around its sibling asteroid, Didymos, by 32 minutes (SN: 11/5/22, p. 14). Dimorphos posed no threat to Earth, but the test will help inform future missions to divert any asteroids on a potentially dangerous collision course with our home planet, researchers say.
The InSight Mars lander is going out on a high note. After scientists reported in May that InSight had recorded the largest known Marsquake, roughly a magnitude 5, news came in October that the lander’s seismometer had also detected the rumblings of the two biggest meteorite impacts ever observed on Mars. Those impacts created gaping craters and sent seismic waves rippling along the top of the planet’s crust.
The details of how those waves and others moved through the Red Planet gave researchers new intel on the structure of Mars’ crust, which is hard to study any other way. The data also suggest that some Marsquakes are caused by magma moving beneath the surface (SN: 12/3/22, p. 12). The solar panels that power the lander are now covered in dust after four years on Mars, a death knell for the mission.
All five bases in DNA and RNA have been found in rocks that fell to Earth. Three of the nucleobases, which combine with sugars and phosphates to make up the genetic material of all known life, had previously been found in meteorites. But the last two — cytosine and thymine — were reported from space rocks only this year (SN: 6/4/22, p. 7). The find supports the idea that life’s precursors could have come to Earth from space, researchers say.
The supermassive black hole at the center of the Milky Way became the second black hole to get its close-up. After releasing a picture of the behemoth at the heart of galaxy M87 in 2019, astronomers used data from the Event Horizon Telescope, a network of radio telescopes around the world, to assemble an image of Sagittarius A* (SN: 6/4/22, p. 6). The image, released in May, shows a faint fuzzy shadow nestled in the glowing ring of the accretion disk. That may not sound impressive on its own, but the result provides new details about the turbulence roiling near our black hole’s edge.
Scientists have finally managed to bottle the sun.
At 1:03 a.m. PST on December 5, researchers with the National Ignition Facility in Livermore, Calif., ignited controlled nuclear fusion that, for the first time, resulted in the net production of energy. A 3-million-joule burst emerged from a peppercorn-sized capsule of fuel when it was heated with a 2-million-joule laser pulse. Details of the long-awaited achievement, which mimics how the sun makes energy, were revealed in a news conference December 13 by U.S. Department of Energy officials.
“This is a monumental breakthrough,” says physicist Gilbert Collins of the University of Rochester in New York, who is a former NIF collaborator but was not involved with the research leading to the latest advance. “Since I started in this field, fusion was always 50 years away…. With this achievement, the landscape has changed.”
Fusion potentially provides a clean energy source. The fission reactors now used to generate nuclear energy rely on heavy atoms, like uranium, to release energy when they break down into lighter atoms, including some that are radioactive. While it’s comparatively easy to generate energy with fission, it’s an environmental nightmare to deal with the leftover radioactive debris that can remain hazardous for hundreds of millennia.
Controlled nuclear fusion, on the other hand, doesn’t produce such long-lived radioactive waste, but it’s technically much harder to achieve in the first place. In nuclear fusion, light atoms fuse together to create heavier ones. In the sun, that typically occurs when a proton, the nucleus of a hydrogen atom, combines with other protons to form helium.
Getting atoms to fuse requires a combination of high pressure and temperature to squeeze the atoms tightly together. Intense gravity does much of the work in the sun.
At the National Ignition Facility, 192 lasers directed at a small capsule filled with deuterium and tritium, heavy types of hydrogen, provided a blast of energy that did the trick instead. About 4 percent of that fuel was fused in the process. The new result far surpassed the 1.3 million joules of energy produced by an earlier NIF experiment that marked the first time the team managed to ignite nuclear fusion.
“These recent results [at] NIF are the first time in a laboratory anywhere on Earth [that] we were able to demonstrate more energy coming out of a fusion reaction than was put in,” NIF physicist Tammy Ma said at the news conference. She predicted that pilot projects for power plants based on the fusion approach will be built in the “coming decades.”
But this latest fusion burst still didn’t produce enough energy to run the laser power supplies and other systems of the NIF experiment. It took about 300 million joules of energy from the electrical grid to get a hundredth of the energy back in fusion.
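As a quick sanity check on those figures, the gain depends on where you draw the system boundary: the fusion yield exceeded the laser energy delivered to the target, but not the electricity drawn from the grid to power the facility. Using the approximate numbers reported above:

```python
# Rough arithmetic on the reported NIF numbers (approximate values from the text).
laser_energy_j = 2e6   # ~2 million joules delivered by the laser pulse
fusion_yield_j = 3e6   # ~3 million joules released by the fusion burst
grid_energy_j = 3e8    # ~300 million joules drawn from the electrical grid

target_gain = fusion_yield_j / laser_energy_j    # "net gain" in the headline sense
wall_plug_gain = fusion_yield_j / grid_energy_j  # gain relative to total electricity used

print(f"target gain: {target_gain:.1f}")        # 1.5 -- more out than the laser put in
print(f"wall-plug gain: {wall_plug_gain:.2%}")  # 1.00% -- the "hundredth" in the text
```

This is why a power plant remains far off: the physics gain at the target must eventually overcome roughly a factor of 100 in laser and facility inefficiency before the whole system produces net electricity.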
“The net energy gain is with respect to the energy in the light that was shined on the target, not with respect to the energy that went into making that light,” says University of Rochester physicist Riccardo Betti, who was also not involved with the research. “Now it’s up to the scientists and engineers to see if we can turn these physics principles into useful energy.”
Despite that, it’s a potential turning point in the technology comparable to the invention of the transistor or the Wright brothers’ first flight, says Collins. “We now have a laboratory system that we can use as a compass for how to make progress very rapidly,” he says.
Over 100 million years ago, the chirps of insects known as katydids dominated the sounds of Earth’s nights. Now, fossils reveal what the katydid ears that heard those sounds looked like.
Twenty-four fossils of roughly 160 million-year-old katydids unearthed in China represent the earliest known insect ears, researchers report December 12 in Proceedings of the National Academy of Sciences.
These ancient sensors of sound — identical to the ones found on today’s katydids — may have picked up the first short-range, high-frequency calls of any kind, helping the insects hide from predators.
Insects were the first land dwellers to send sound waves through the air, allowing the creatures to communicate over longer distances than sight often allows (SN: 7/15/21). While some insects use their antennae to detect vibrations in the air, katydids have mammal-like ears that use an eardrum to hear (SN: 11/15/12). Yet because well-preserved insect eardrums are rare in the fossil record, it’s unclear how katydid ears evolved, say paleontologist Chunpeng Xu of the Nanjing Institute of Geology and Paleontology in China and colleagues.
Analyses of the Chinese fossils push the known record of male and female katydid ears’ ability to listen for potential mates or male competitors to the mid-Jurassic, between 157 million and 166 million years ago. The previous record holders for oldest insect ears, katydids and crickets found in Colorado, are around 50 million years old.
What’s more, sound-producing structures on 87 fossilized male katydid wings from China, South Africa and Kyrgyzstan — which date from about 157 million to 242 million years ago — may have generated a variety of chirps, including high-frequency calls up to 16 kilohertz. (Humans, by comparison, can hear frequencies from roughly 20 hertz to 20 kilohertz.)
High-frequency chirps don’t travel far, which would have allowed katydids to communicate over short distances. Such a trait may have been useful because mammal hearing was improving around the same time, Xu says. Limiting the range of some calls could have helped katydids hide from predatory eavesdroppers on the hunt for an insect feast.
It’s hard to even imagine. A world ravaged by climate change. People totally consumed by technology. Mega corporations in control of everything. Robots performing menial tasks. Wait, did we say “hard” to imagine? We meant we’re literally living it. The “it” being Wall-E, Pixar’s 2008 Oscar-winning masterpiece co-written and directed by Andrew Stanton.
The tale of a lone robot left to clean up the Earth who finds himself on an intergalactic adventure to protect the future of the planet wowed audiences when it was released and is considered to be one of Pixar’s best films to date. Since then, Wall-E has only gotten more poignant and been more revered, so it’s only fitting that, on November 22, it becomes Pixar’s first film ever released by the Criterion Collection, a company specializing in the best, most comprehensive, obsessive Blu-ray releases around.
To mark the occasion, io9 sat for a video chat with Wall-E’s director to find out how the film made it to Criterion, what his favorite special features are, what he thinks about our world being so close to Wall-E’s, and whether there was ever talk of a sequel or theme park ride—as well as his work on Obi-Wan Kenobi and For All Mankind. Check it out.
Germain Lussier, io9: So how did you find out that the movie is going to be a part of the Criterion Collection? Because that was a big deal and it’s Pixar’s first.
Andrew Stanton: I approached them. I pressed them as a filmmaker. This was not a studio thing. It was me asking as a favor to Alan Bergman, who is the president of Walt Disney Studios, and saying, “Look, I’ve been out in the world making TV shows for about seven years. I’ve met so many people in the industry now, filmmakers that I revere, filmmakers that are budding, that really sort of get the cinema DNA and inspirations that were in the molecules of Wall-E.” I made it with such a love of cinema. It was great to see that it had that effect on a lot of peers, and I felt like there’s something there that qualifies it to possibly be in their library. And so I said, “Can I ask them?” Because I know it breaks precedent with what Disney does, so it was a bit of a favor, and he said, “Well, if Criterion bites, yeah, we’ll see if we can make it work.” And so that was in 2019. Then the pandemic hit and everything just paused. They said yes. But it was very frustrating because then the world just stopped. Then it was really last year that we got very serious about it.
And the real question was “What’s the Criterion angle on this?” We’ve done such a good job and a thorough job of showing the behind-the-scenes on other DVDs. So I really left it up to them. And [Criterion producer] Kim Hendrickson and the rest of their wonderful team really just drilled down. I let them lead about “What is interesting to you guys? What is it you guys want to know?” So that really led the angle on all the doc materials and the booklet and everything. I’ve been a consumer of Criterion since they existed in the late ‘80s. So I was pleased as punch to see everything from the cover to what their perspective was and what was interesting to them, because I feel a lot of the times [Pixar] DVDs get used as a babysitter, and they’re not necessarily going to the crowd that we want to talk to, because I would love to talk to other people that love film as much. And so I feel like, “Oh, this is finally for an audience that I would be in.”
io9: Yeah, it’s my favorite Pixar movie, so going through the disc and exploring a little bit was excellent. Now, you buy a Criterion for the transfer, the sound, but mostly the features. And here there’s just so much. So, if someone buys this disc, they watch the movie—but after that, what’s the first thing they should go watch and why?
Stanton: Well, it’s hard for me to know if there’s a better order, but I find it fascinating to be able to finally do a Master Class and just talk about the actual under-the-hood work that we do [at Pixar], and how much we really control and work on and nuance the story down to a beat-by-beat level. I mean, I could have done that with any scene and any other filmmaker in the studio could have done that with their movies and their scenes. And so it was a chance to slow down and actually have a literal Master Class. And then I also loved being able to talk about all the cinematic influences because again, we’re such filmgoers first and filmmakers second that—any one of these films, but I think particularly Wall-E—had such deep, deep, deep influences from some of the earliest cinematic movies. There really was such a major Keaton and Chaplin influence. So I think it’s great to sort of see how cinema from any era keeps inspiring the latest films. That it just keeps passing it on.
io9: One of the things I did watch was the Master Class and it was really, really good. And I also watched the “Wall-E A to Z” feature that you guys did for this.
Stanton: To me, we could have gone A to... If that alphabet was twice as long we could have kept finding things.
io9: Oh for sure. I bring it up though because I found it interesting how it really spoke to the way Wall-E was ahead of its time, or at least forward-looking with a lot of things—technology, the terrible world we’re in. So out of all those things that our world has in common with Wall-E, does anything, in particular, stand out, for better or worse?
Stanton: Well, I certainly didn’t expect to be seeing the dire state of the world climate-wise in such a short amount of time. Didn’t want to be right about that! And I’ve noted back at the time of press for the movie when it originally released, that wasn’t something that I was preaching. I just sort of leveraged off of the truth of what I’d always thought. I was raised in the ‘60s and ‘70s not to pollute and that the environment is fragile. That was always in my world and culture. So I just went with that logic to get this robot alone. I wasn’t hoping to be right. I wasn’t being a Lorax, but I was not anti. So I’m horrified that I was right in that regard.
The other thing that happened, equal or greater, than I expected, was the siloing of everybody and their technology. I knew I was right about that. I was one of the first adopters of an iPhone and I was like “This is like smoking cigarettes.” I can’t stop it, you know? I just knew. And I was just sitting getting coffee this morning in New York and watching everybody pass by on their way to work and counting. And it was like one in every six people was looking forward and everybody else was just looking at their iPhones and not navigating. I’m like, “Oh my God, I’m sitting on the Axiom right now.”
io9: That’s hilarious-slash-terrifying. But okay. The movie has this distinguished legacy but it’s also one of the few Pixar movies from that time that didn’t get a sequel. I get that the credits are kind of the sequel but did Disney ever pressure you guys to say like, “Hey, any thoughts on a...”
Stanton: I’ve been getting this question since 1995.
Stanton: And everybody wants Disney to be the big bad guy. And I’m sure that they sometimes are on things. But for [Pixar], they’ve always said, “Whatever you guys want to do, we just would love a sequel whenever it comes to you naturally.” Economically, [Pixar] wouldn’t exist if we didn’t have our third feature be Toy Story 2, and if we didn’t continue to try and find other ways. So we try to find them organically and we try to find them honestly. And we certainly don’t want to spend four years working on a lesser-than product because that’s just too much of your life. And, frankly, having been behind several sequels, after about six months, it’s an original. Anything you think you gain from it, it’s sometimes even harder to crack.
So there’s never pressure from them like, “We need exactly this at this time.” We’ve never had that. But we’ve had our own private pressure of like, “How do we keep a balance so that we can keep the lights on?” or else you guys don’t get to watch anything ever again. It’s always been that. So there’s not any “If you left us alone, we’d never make a sequel.” But Wall-E just never felt right to me. I mean, I’m not anti and I’m very sober to the fact that I don’t own this movie. They can do whatever they want with it. And if I get hit by a bus tomorrow, who knows? But it doesn’t feel like it’s asking for that. And on the success chart of our movies, it wasn’t one of the bigger moneymakers. So I don’t even feel like the crass business guy is going “We need another one.” So it kind of protects it a bit.
io9: One of the other things Disney likes to do, obviously, is theme park rides, and pretty much—not all of them, but a lot—of the Pixar movies from that era are in theme parks as an attraction. Wall-E obviously has a million things merch-wise and it’s honestly a little bit more pessimistic towards the world, at least at the start, but were there ever talks about bringing the movie into the parks as a...
Stanton: Again, that’s a direct reflection of it wasn’t that big of a box office movie. It did that sweet spot of it did just well enough that nobody’s embarrassed by it, but it did just low enough that everybody is like, we’re just going to let that move on. And it’s kind of stayed pure in that sense.
io9: Gotcha. Now, you said you approached Criterion for this. You were a fan. So were there any features or materials you had been sitting on in case something like this ever came to pass?
Stanton: No. I think the thing that was frustrating though, still even on making this disc, is we shot so much behind the scenes [footage]. We just let the cameras run in so many meetings and I think our Criterion producer Kim Hendrickson saw more than I’ve ever seen. And she said that they could have made a whole box set of just watching how we make the donuts, you know? Which is frustrating because there’s so much stuff that we do that uses other materials. Like sometimes other music and we’ll never be able to show it because we don’t have the rights to the music. It’ll always be a bit of frustration that we can never really, really, really, really show a Get Back, behind-the-scenes talk.
io9: Yeah that would be incredible. Okay, I’ll come back to Wall-E but as you can see [from the art] behind me, I’m obviously a huge Star Wars fan. And you helped write the final two episodes of Obi-Wan Kenobi, which have some really crucial moments in Star Wars history. So I want to know, what’s the process for that? Obviously, you want to tell a great story. But also with Star Wars, it’s got to fit in with all the canon and everything else. So how did that work out?
Stanton: That was the blessing and the curse of it. It’s like one, you’re geeking out that you get to type “Vader says” this and “Kenobi says” that. You pause and say “I can’t believe I’m actually getting paid to type this. I can’t believe these words may be said.” But then another part of you, it has to go through such a rigorous like “Does that fit the canon?” And I feel like it’s bittersweet. [The reason that happens is] because people care, but it also kind of doesn’t allow, sometimes, things to venture beyond where maybe they should to tell a better story. So it can sometimes really handicap what I think are better narrative options.
And so I was frustrated sometimes—not a lot—but I just felt it wasn’t as conducive to [the story]. So I love it when something like Andor is in a safe spot. And it can just do whatever the heck it wants. But I felt, you know, Joby [Harold, Obi-Wan Kenobi co-writer and executive producer], to his credit, kept the torch alive and kept trying to thread the needle so that the story wouldn’t suffer but it would please all the people that were trying to keep it in the canon. But I got some moments in there that I’m very happy with.
io9: Yeah, that sounds like a tough balance. Thank you. The other show you’ve been working on in the last couple of years is For All Mankind, which I recently caught up on and loved. What is it like working on something that is obviously so good, but it’s a little under the radar—and then also, did any of your space knowledge from Wall-E translate over?
Stanton: Well, we had a lot of NASA consultants at the time for Wall-E. And so it felt like I’d done a little bit of research. I mean, [For All Mankind’s] stuff is so thoroughly vetted by the writers’ room and the showrunners, and we have a consultant on set and an astronaut that’s actually there. And so you know that usually by the time you’re shooting and you’re reading what the scene is, that it’s already been vetted. But I just geek out and want to do it correctly. I love as a storyteller, working within those limitations. Like this is what would really happen, there wouldn’t be a window here, they float at this moment, they wouldn’t float here. I just love that challenge of just going, “Okay, then how do you tell that moment?” It’s such a great crew and show and I was pleased as punch to be able to come back and work with the same family again.
io9: Oh it’s the best. Finally, to wrap up back on Wall-E, it was a critical hit. Now we have this Criterion disc. Over a decade later, when you look back at it, what are you most proud of with the film?
Stanton: That you get just as caught up now. That’s all I care about. That’s the drug for me, still. I just want the lights to go down, and I want to be fully engaged and forget where I am, forget who I am, and then the lights come up, and I was just 100% in. And that’s really all it is that I’m trying to find again with every movie I buy a ticket for, and trying to do with every scene if I’m behind the telling of something. And I could tell I was hitting a real pure vein for so much of Wall-E. And it’s nice to come back to it now and feel that that hasn’t faded. You can just get just as caught up in it as you did on day one. It’s like having a song where everything’s just harmonized so well and you pick the arrangements just right. You don’t see a way to improve it. Plus, it’s a hummable tune. It’s something that your foot taps against your will. I feel like you don’t get to say when you’ve found songs that are that strong, and the same with movies, and I thought I did at the time. And it’s nice to look back and go, “Oh yeah, I did.”
Yes. Yes, he did. The Wall-E Criterion Collection disc is out November 22.
Want more io9 news? Check out when to expect the latest Marvel, Star Wars, and Star Trek releases, what’s next for the DC Universe on film and TV, and everything you need to know about James Cameron’s Avatar: The Way of Water.
<Roosevelt> !choose do work, play games
<RoBoBo> Choice: do work
<Roosevelt> !choose listen to a stupid bot, don't listen to a stupid bot
<RoBoBo> Choice: listen to a stupid bot
Computer engineer George Hilliard says he has built an electronic business card running Linux. From his blog post: It is a complete, minimal ARM computer running my customized Linux firmware built with Buildroot. It has a USB port in the corner. If you plug it into a computer, it boots in about 6 seconds and shows up over USB as a flash drive and a virtual serial port that you can use to log into the card's shell. The flash drive has a README file, a copy of my resume, and some of my photography. The shell has several games and Unix classics such as fortune and rogue, a small 2048, and a small MicroPython interpreter. All this is accomplished on a very small 8MB flash chip. The bootloader fits in 256KB, the kernel is 1.6MB, and the whole root filesystem is 2.4MB. So, there's plenty of space for the virtual flash drive. It also includes a writable home directory, on the off chance that anyone creates something they want to keep. This is also saved on the flash chip, which is properly wear leveled with UBI. The whole thing costs under $3. It's cheap enough to give away. If you get one from me, I'm probably trying to impress you. In a detailed write-up, Hilliard goes on to explain how he came up with the design and assembled all the components. Naturally, there were some problems that arose during the construction that he had to troubleshoot: "first, the USB port wasn't long enough to reliably make contact in many USB ports. Less critically, the flash footprint was wrong, which I worked around by bending the leads under the part by hand..." Impressively, the total cost of the card (not including his time) was $2.88 -- "cheap enough that I don't feel bad giving it away, as designed!"
Read more of this story at Slashdot.
As part of our Blackhat Europe talk “Reverse Engineering and Exploiting Builds in the Cloud” we publicly released a new tool called Terrier.
In this blog post, I am going to show you how Terrier can help you identify and verify container and image components for a wide variety of use-cases, be it from a supply-chain perspective or forensics perspective. Terrier can be found on Github https://github.com/heroku/terrier.
In this blog post, I am not going to go into too much detail about containers and images (you can learn more here) however it is important to highlight a few characteristics of containers and images that make them interesting in terms of Terrier. Containers are run from images and currently the Open Containers Initiative (OCI) is the most popular format for images. The remainder of this blog post refers to OCI images as images.
Essentially, images are tar archives that contain multiple tar archives and meta-information representing the “layers” of an image. The OCI format makes images relatively simple to work with, which makes analysis relatively simple. If you only had access to a terminal and the tar command, you could pretty much get what you need from the image’s tar archive.
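To make that layout concrete, here is a rough Python sketch (not part of Terrier, and assuming the `docker save` archive format, where a manifest.json inside the tar records the layer tarballs) that lists an image’s layers:

```python
import json
import tarfile

def list_layers(image_tar_path):
    """Return the layer archive names recorded in the image's manifest.json."""
    with tarfile.open(image_tar_path) as image:
        manifest = json.load(image.extractfile("manifest.json"))
        # manifest is a list with one entry per image stored in the archive
        return [layer for entry in manifest for layer in entry["Layers"]]
```

Everything here is plain stdlib tar and JSON handling, which is exactly why the OCI layout is so analysis-friendly.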
When images are utilised at runtime for a container, their contents become the contents of the running container and the layers are essentially extracted to a location on the container’s runtime host (the host that is running and maintaining the containers). This location contains a few folders of interest, particularly the "merged" folder. The "merged" folder contains the contents of the image and any changes that have occurred in the container since its creation. For example, if the image contained a location such as /usr/chris/stuff and, after creating a container from this image, I created a file in that location, the new file would appear under the "merged" folder’s copy of /usr/chris/stuff on the container runtime host.
Now that we have a brief understanding of images and containers, we can look at what Terrier does. Often it is the case that you would like to determine if an image or container contains a specific file. This requirement may be due to a forensic analysis need or to identify and prevent a certain supply-chain attack vector. Regardless of the requirement, having the ability to determine the presence of a specific file in an image or container is useful.
Terrier can be used to determine if a specific image contains a specific file. In order to do this, you need the following: the image exported as a tar archive, and the SHA256 hash of the file you would like to identify.
The first point can be easily achieved with Docker by using the following command:
$ docker save imageid -o myImage.tar
The command above uses a Docker image ID which can be obtained using the following command:
$ docker images
Once you have your image exported as a tar archive, you will then need to establish the SHA256 hash of the particular file you would like to identify in the image. There are multiple ways to achieve this but in this example, we are going to use the hash of the Golang binary go1.13.4 linux/amd64 which can be achieved with following command on a Linux host:
$ cat /usr/local/go/bin/go | sha256sum
The command above should result in the following SHA256 hash:

82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
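For illustration, the same digest can be computed programmatically; a minimal Python equivalent of the cat | sha256sum pipeline above (streamed in chunks so large binaries do not need to fit in memory):

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    """Stream a file through SHA-256, producing the same hex digest as sha256sum."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```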
Now that we have a hash, we can use it to determine if the Golang binary is in the image. To achieve this, we need to populate a configuration file for Terrier. Terrier makes use of YAML configuration files and below is our config file, which we save as cfg.yml:
mode: image
image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
The config file above has multiple entries which allow us to specify the mode that Terrier will operate in. In this case, we are working with an image file (tar archive), so the mode is image. The image file we are working with is myImage.tar and the hash we are looking to identify is 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd.
We are now ready to run Terrier and this can be done with the following command:

$ ./terrier

The command above should result in output similar to the following:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[!] Found file '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759/usr/local/go/bin/go' with hash: 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
We have identified a file, /usr/local/go/bin/go, located in layer 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759, that has the same SHA256 hash as the one we provided. We now have verification that the image “myImage.tar” contains a file with the SHA256 hash we provided.
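Conceptually, the layer scan works like the following Python sketch (a simplified illustration, not Terrier’s actual Go implementation, and assuming the `docker save` archive format): open each layer tarball inside the image, hash every regular file, and report matches.

```python
import hashlib
import json
import tarfile

def find_hash_in_image(image_tar_path, wanted_hash):
    """Yield 'layer/path' entries whose SHA-256 matches wanted_hash."""
    with tarfile.open(image_tar_path) as image:
        manifest = json.load(image.extractfile("manifest.json"))
        for entry in manifest:
            for layer_name in entry["Layers"]:
                # Each layer is itself a tar archive nested inside the image tar
                with tarfile.open(fileobj=image.extractfile(layer_name)) as layer:
                    for member in layer.getmembers():
                        if not member.isfile():
                            continue
                        data = layer.extractfile(member).read()
                        if hashlib.sha256(data).hexdigest() == wanted_hash:
                            yield f"{layer_name}/{member.name}"
```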
This example can be extended upon and you can instruct Terrier to search for multiple hashes. In this case, we are going to search for a malicious file. Recently a malicious Python library was identified in the wild and went by the name “Jeilyfish”. Terrier could be used to check if a Docker image of yours contained this malicious package. To do this, we can determine the SHA256 of one of the malicious Python files that contains the backdoor:
$ cat jeIlyfish-0.7.1/jeIlyfish/_jellyfish.py | sha256sum
cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c
We then update our Terrier config to include the hash calculated above.
mode: image
image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
  - hash: 'cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c'
We then run Terrier against the image and analyse the results:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[!] Found file '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759/usr/local/go/bin/go' with hash: 82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
The results above indicate that our image did not contain the malicious Python package.
There is no limit to how many hashes you can search for; however, it should be noted that Terrier performs all its actions in-memory for performance reasons, so you might hit certain limits if you do not have enough accessible memory.
Terrier can also be used to determine if a specific image contains a specific file at a specific location. This can be useful to ensure that an image is using a specific component, i.e. a binary, shared object or dependency. This can also be seen as “pinning” components by ensuring that your images are using specific components, i.e. a specific version of cURL.
In order to do this, you need the following: the image exported as a tar archive, the path of the file you would like to identify, and a trusted SHA256 hash for that file.
The first point can be easily achieved with Docker by using the following command:
$ docker save imageid -o myImage.tar
The command above utilises a Docker image ID which can be obtained using the following command:
$ docker images
Once you have your image exported as a tar archive, you will need to determine the path of the file you would like to identify and verify in the image. For example, if we would like to ensure that our images are making use of a specific version of cURL, we can run the following commands in a container or some other environment that resembles the image.
$ which curl
/usr/bin/curl
We now have the path to cURL and can generate the SHA256 of this instance of cURL because, in this case, we trust this instance. We could also determine the hash by other means; for example, many binaries are released with a corresponding hash from the developer, which can be acquired from the developer’s website.
$ cat /usr/bin/curl | sha256sum
9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
With this information, we can now populate our config file for Terrier:
mode: image
image: myImage.tar
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
We’ve saved the above config as cfg.yml and when we run Terrier with this config, we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (1/1)
[!] All components were identified and verified: (1/1)
$ echo $?
0
The output above indicates that the file /usr/bin/curl was successfully identified and verified, meaning that the image contained a file at the location /usr/bin/curl and that the SHA256 of that file matched the hash we provided in the config. Terrier also makes use of return codes, and if we analyse the return code from the output above, we can see that the value is 0, which indicates a success. If Terrier cannot identify or verify all the provided files, a return code of 1 is returned, which indicates a failure. The setting of return codes is particularly useful in testing or CI/CD environments.
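Because Terrier reports pass/fail through its exit status, a pipeline step or test harness can branch on it directly. A hypothetical Python wrapper (the ./terrier path and config name are simply the ones used in this post):

```python
import subprocess

def image_components_verified(terrier_path="./terrier", config="cfg.yml"):
    """Run Terrier and report success via its exit code (0 = all verified)."""
    result = subprocess.run([terrier_path, f"-cfg={config}"])
    return result.returncode == 0
```

In a shell-based pipeline the same branching is just `./terrier || exit 1`.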
We can also run Terrier with verbose mode enabled to get more information:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[!] Identified instance of '/usr/bin/curl' at: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560/usr/bin/curl
[!] Verified matching instance of '/usr/bin/curl' at: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560/usr/bin/curl with hash: 9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (1/1)
[!] All components were identified and verified: (1/1)
The output above provides more detailed information, such as which layer the cURL file was located in. If you wanted even more information, you could enable the veryveryverbose option in the config file, but beware: this produces a lot of output and grep will be your friend.
There is no limit to how many hashes you can specify for a file. This can be useful when you want to allow more than one version of a specific file, i.e. multiple versions of cURL. An example config with multiple hashes for a file might look like:
mode: image
image: myImage.tar
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
      - hash: 'aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545'
      - hash: '6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759'
      - hash: 'd4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c'
The config above allows Terrier to verify whether the identified cURL instance matches one of the provided hashes. There is also no limit to the number of files Terrier can attempt to identify and verify.
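The identify/verify distinction can be sketched as follows (again a simplified Python illustration of the idea, not Terrier’s actual Go implementation): a component is identified if any layer contains a file at the given path, and verified if one of those instances matches an allowed hash.

```python
import hashlib
import json
import tarfile

def check_component(image_tar_path, name, allowed_hashes):
    """Return (identified, verified) for a component path inside an image tar."""
    identified = verified = False
    with tarfile.open(image_tar_path) as image:
        manifest = json.load(image.extractfile("manifest.json"))
        for entry in manifest:
            for layer_name in entry["Layers"]:
                with tarfile.open(fileobj=image.extractfile(layer_name)) as layer:
                    for member in layer.getmembers():
                        # Tar member names usually lack the leading slash
                        if member.isfile() and "/" + member.name.lstrip("./") == name:
                            identified = True
                            data = layer.extractfile(member).read()
                            if hashlib.sha256(data).hexdigest() in allowed_hashes:
                                verified = True
    return identified, verified
```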
Terrier’s Github repo also contains a useful script called convertSHA.sh, which can be used to convert a list of SHA256 hashes and filenames into a Terrier config file. This is useful when converting the output from other tools into a Terrier-friendly format. For example, we could have a file with the following contents:
8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d ./bin/uname
6de8254cfd49543097ae946c303602ffd5899b2c88ec27cfcd86d786f95a1e92 ./bin/gzexe
74ff9700d623415bc866c013a1d8e898c2096ec4750adcb7cd0c853b4ce11c04 ./bin/wdctl
61c779de6f1b9220cdedd7dfee1fa4fb44a4777fff7bd48d12c21efb87009877 ./bin/dmesg
7bdde142dc5cb004ab82f55adba0c56fc78430a6f6b23afd33be491d4c7c238b ./bin/which
3ed46bd8b4d137cad2830974a78df8d6b1d28de491d7a23d305ad58742a07120 ./bin/mknod
e8ca998df296413624b2bcf92a31ee3b9852f7590f759cc4a8814d3e9046f1eb ./bin/mv
a91d40b349e2bccd3c5fe79664e70649ef0354b9f8bd4658f8c164f194b53d0f ./bin/chown
091abe52520c96a75cf7d4ff38796fc878cd62c3a75a3fd8161aa3df1e26bebd ./bin/uncompress
c5ebd611260a9057144fd1d7de48dbefc14e16240895cb896034ae05a94b5750 ./bin/echo
d4ba9ffb5f396a2584fec1ca878930b677196be21aee16ee6093eb9f0a93bf8f ./bin/df
5fb515ff832650b2a25aeb9c21f881ca2fa486900e736dfa727a5442a6de83e5 ./bin/tar
6936c9aa8e17781410f286bb1cbc35b5548ea4e7604c1379dc8e159d91a0193d ./bin/zforce
8d641329ea7f93b1caf031b70e2a0a3288c49a55c18d8ba86cc534eaa166ec2e ./bin/gzip
0c1a1f53763ab668fb085327cdd298b4a0c1bf2f0b51b912aa7bc15392cd09e7 ./bin/su
20c358f7ee877a3fd2138ecce98fada08354810b3e9a0e849631851f92d09cc4 ./bin/bzexe
01764d96697b060b2a449769073b7cf2df61b5cb604937e39dd7a47017e92ee0 ./bin/znew
0d1a106dc28c3c41b181d3ba2fc52086ede4e706153e22879e60e7663d2f6aad ./bin/login
fb130bda68f6a56e2c2edc3f7d5b805fd9dcfbcc26fb123a693b516a83cfb141 ./bin/dir
0e7ca63849eebc9ea476ea1fefab05e60b0ac8066f73c7d58e8ff607c941f212 ./bin/bzmore
14dc8106ec64c9e2a7c9430e1d0bef170aaad0f5f7f683c1c1810b466cdf5079 ./bin/zless
9cf4cda0f73875032436f7d5c457271f235e59c968c1c101d19fc7bf137e6e37 ./bin/chmod
c5f12f157b605b1141e6f97796732247a26150a0a019328d69095e9760b42e38 ./bin/sleep
b9711301d3ab42575597d8a1c015f49fddba9a7ea9934e11d38b9ff5248503a8 ./bin/zfgrep
0b2840eaf05bb6802400cc5fa793e8c7e58d6198334171c694a67417c687ffc7 ./bin/stty
d9393d0eca1de788628ad0961b74ec7a648709b24423371b208ae525f60bbdad ./bin/bunzip2
d2a56d64199e674454d2132679c0883779d43568cd4c04c14d0ea0e1307334cf ./bin/mkdir
1c48ade64b96409e6773d2c5c771f3b3c5acec65a15980d8dca6b1efd3f95969 ./bin/cat
09198e56abd1037352418279eb51898ab71cc733642b50bcf69d8a723602841e ./bin/true
97f3993ead63a1ce0f6a48cda92d6655ffe210242fe057b8803506b57c99b7bc ./bin/zdiff
0d06f9724af41b13cdacea133530b9129a48450230feef9632d53d5bbb837c8c ./bin/ls
da2da96324108bbe297a75e8ebfcb2400959bffcdaa4c88b797c4d0ce0c94c50 ./bin/zegrep
The file contents above are trusted SHA256 hashes for specific files. If we would like to use this list for ensuring that a particular image is making use of the files listed above, we can do the following:
$ ./convertSHA.sh trustedhashes.txt terrier.yml
The script above takes the input file trustedhashes.txt, which contains our trusted hashes listed above, and converts them into a Terrier-friendly config file, terrier.yml, which looks like the following:
mode: image
image: myImage.tar
files:
  - name: '/bin/uname'
    hashes:
      - hash: '8946690bfe12308e253054ea658b1552c02b67445763439d1165c512c4bc240d'
  - name: '/bin/gzexe'
    hashes:
      - hash: '6de8254cfd49543097ae946c303602ffd5899b2c88ec27cfcd86d786f95a1e92'
  - name: '/bin/wdctl'
    hashes:
      - hash: '74ff9700d623415bc866c013a1d8e898c2096ec4750adcb7cd0c853b4ce11c04'
  - name: '/bin/dmesg'
    hashes:
      - hash: '61c779de6f1b9220cdedd7dfee1fa4fb44a4777fff7bd48d12c21efb87009877'
  - name: '/bin/which'
    hashes:
      - hash: '7bdde142dc5cb004ab82f55adba0c56fc78430a6f6b23afd33be491d4c7c238b'
  - name: '/bin/mknod'
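For illustration, the same transformation can be sketched in a few lines of Python (convertSHA.sh itself is a shell script; this is just an equivalent rendering of "HASH ./path" lines into config entries):

```python
def hashes_to_terrier_config(lines, image="myImage.tar"):
    """Convert sha256sum-style 'HASH ./path' lines into a Terrier config string."""
    out = ["mode: image", f"image: {image}", "files:"]
    for line in lines:
        if not line.strip():
            continue
        digest, path = line.split()
        # './bin/uname' becomes '/bin/uname', matching Terrier's name entries
        out.append(f"  - name: '{path.lstrip('.')}'")
        out.append("    hashes:")
        out.append(f"      - hash: '{digest}'")
    return "\n".join(out)
```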
The config file terrier.yml is now ready to be used and we can run Terrier with it:
$ ./terrier -cfg=terrier.yml
[+] Loading config: terrier.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] Not all components were identifed: (4/31)
[!] Component not identified: /bin/uncompress
[!] Component not identified: /bin/bzexe
[!] Component not identified: /bin/bzmore
[!] Component not identified: /bin/bunzip2
$ echo $?
1
As we can see from the output above, Terrier was unable to identify 4/31 of the components provided in the config. The return code is also 1 which indicates a failure. If we were to remove the components that are not in the provided image, the output from the previous command would look like the following:
$ ./terrier -cfg=terrier.yml
[+] Loading config: terrier.yml
[+] Analysing Image
[+] Docker Image Source: myImage.tar
[*] Inspecting Layer: 34a9e0f17132202a82565578a3c2dae1486bb198cde76928c8c2c5c461e11ccf
[*] Inspecting Layer: 6539a80dd09da08132a525494ff97e92f4148d413e7c48b3583883fda8a40560
[*] Inspecting Layer: 6d2d61c78a65b6e6c82b751a38727da355d59194167b28b3f8def198cd116759
[*] Inspecting Layer: a6e646c34d2d2c2f4ab7db95e4c9f128721f63c905f107887839d3256f1288e1
[*] Inspecting Layer: aefc8f0c87a14230e30e510915cbbe13ebcabd611e68db02b050b6ceccf9c545
[*] Inspecting Layer: d4468fff8d0f28d87d48f51fc0a6afd4b38946bbbe91480919ebfdd55e43ce8c
[*] Inspecting Layer: dbf9da5e4e5e1ecf9c71452f6b67b2b0225cec310a20891cc5dedbfd4ead667c
[!] All components were identified: (27/27)
[!] Not all components were verified: (26/27)
[!] Component not verified: /bin/cat
[!] Component not verified: /bin/chmod
[!] Component not verified: /bin/chown
[!] Component not verified: /bin/df
[!] Component not verified: /bin/dir
[!] Component not verified: /bin/dmesg
[!] Component not verified: /bin/echo
[!] Component not verified: /bin/gzexe
[!] Component not verified: /bin/gzip
[!] Component not verified: /bin/login
[!] Component not verified: /bin/ls
[!] Component not verified: /bin/mkdir
[!] Component not verified: /bin/mknod
[!] Component not verified: /bin/mv
[!] Component not verified: /bin/sleep
[!] Component not verified: /bin/stty
[!] Component not verified: /bin/su
[!] Component not verified: /bin/tar
[!] Component not verified: /bin/true
[!] Component not verified: /bin/uname
[!] Component not verified: /bin/wdctl
[!] Component not verified: /bin/zdiff
[!] Component not verified: /bin/zfgrep
[!] Component not verified: /bin/zforce
[!] Component not verified: /bin/zless
[!] Component not verified: /bin/znew
$ echo $?
1
The output above indicates that Terrier was able to identify all the components provided, but many were not verifiable: their hashes did not match and, once again, the return code is 1 to indicate this failure.
The previous sections focused on identifying files in images, which can be referred to as a form of “static analysis”; however, it is also possible to perform this analysis on running containers. In order to do this, you need the location of the container’s merged folder on the container runtime host, along with the hashes of the files you would like to identify. The merged folder is Docker specific; in this case, we are using it because this is where the contents of the Docker container reside, but this might be another location if you were using a different container runtime. The location of the container’s merged folder can be determined by running the following commands. First, obtain the container’s ID:
$ docker ps
CONTAINER ID   IMAGE    COMMAND   CREATED        STATUS        PORTS   NAMES
b9e676fd7b09   golang   "bash"    20 hours ago   Up 20 hours           cocky_robinson
Once you have the container’s ID, you can run the following command, which will help you identify the location of the merged folder on the underlying host:
$ docker exec b9e676fd7b09 mount | grep diff
overlay on / type overlay (rw,relatime,lowerdir=/var/lib/docker/overlay2/l/7ZDEFE6PX4C3I3LGIGGI5MWQD4:/var/lib/docker/overlay2/l/EZNIFFIXOVO2GIT5PTBI754HC4:/var/lib/docker/overlay2/l/UWKXP76FVZULHGRKZMVYJHY5IK:/var/lib/docker/overlay2/l/DTQQUTRXU4ZLLQTMACWMJYNRTH:/var/lib/docker/overlay2/l/R6DE2RY63EJABTON6HVSFRFICC:/var/lib/docker/overlay2/l/U4JNTFLQEKMFHVEQJ5BQDLL7NO:/var/lib/docker/overlay2/l/FEBURQY25XGHJNPSFY5EEPCFKA:/var/lib/docker/overlay2/l/ICNMAZ44JY5WZQTFMYY4VV6OOZ,upperdir=/var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/diff,workdir=/var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/work)
From the results above, we are interested in two entries, upperdir and workdir, because these two entries provide us with the path to the container’s merged folder. From the results above, we can determine that the container’s merged folder is located at /var/lib/docker/overlay2/04f84ddd30a7df7cd3f8b1edeb4fb89d476ed84cf3f76d367e4ebf22cd1978a4/merged on the underlying host.
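Container mode is conceptually a filesystem walk rather than a tar walk; a rough Python sketch (an illustration of the idea, not Terrier’s actual implementation) of hashing every file under a merged directory:

```python
import hashlib
import os

def find_hash_in_tree(root, wanted_hash):
    """Yield paths under root whose SHA-256 matches wanted_hash."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest == wanted_hash:
                yield path
```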
Now that we have the location, we need some files to identify and in this case, we are going to reuse the SHA256 hashes from the previous section. Let’s now go ahead and populate our Terrier configuration with this new information.
mode: container
path: merged
#image: myImage.tar
hashes:
  - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
  - hash: 'cf734865dd344cd9b0b349cdcecd83f79a751150b5fd4926f976adddb93d902c'
The configuration above shows that we have changed the mode to container and added the path to our merged folder. We have kept the two hashes from the previous section. If we run Terrier with this configuration from the location that contains the merged folder, we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Container
[!] Found matching instance of '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd' at: merged/usr/local/go/bin/go with hash:82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
From the output above, we know that the container (ID b9e676fd7b09) does not contain the malicious Python package, but it does contain an instance of the Golang binary, which is located at merged/usr/local/go/bin/go.
And as you might have guessed, Terrier can also be used to verify and identify files at specific paths in containers. To do this, we need the location of the container’s merged folder along with the paths and trusted hashes of the files to check; these can be determined using the same procedures described in the previous sections. Below is an example Terrier config file that we could use to identify and verify components in a running container:
mode: container
path: merged
verbose: true
files:
  - name: '/usr/bin/curl'
    hashes:
      - hash: '9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96'
  - name: '/usr/local/go/bin/go'
    hashes:
      - hash: '82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd'
If we run Terrier with the above config, we get the following output:
$ ./terrier
[+] Loading config: cfg.yml
[+] Analysing Container
[!] Found matching instance of '/usr/bin/curl' at: merged/usr/bin/curl with hash:9a43cb726fef31f272333b236ff1fde4beab363af54d0bc99c304450065d9c96
[!] Found matching instance of '/usr/local/go/bin/go' at: merged/usr/local/go/bin/go with hash:82bce4b98d7aaeb4f841a36f7141d540bb049f89219f9e377245a91dd3ff92dd
[!] All components were identified: (2/2)
[!] All components were identified and verified: (2/2)
$ echo $?
0
From the output above, we can see that Terrier was able to successfully identify and verify all the files in the running container. The return code is also 0, which indicates a successful execution of Terrier.
In addition to Terrier being used as a standalone CLI tool, Terrier can also be integrated easily with existing CI/CD technologies such as GitHub Actions and CircleCI. Below are two example configurations that show how Terrier can be used to identify and verify certain components of Docker files in a pipeline and prevent the pipeline from continuing if all verifications do not pass. This can be seen as an extra mitigation for supply-chain attacks.
Below is a CircleCI example configuration using Terrier to verify the contents of an image.
version: 2
jobs:
  build:
    machine: true
    steps:
      - checkout
      - run:
          name: Build Docker Image
          command: |
            docker build -t builditall .
      - run:
          name: Save Docker Image Locally
          command: |
            docker save builditall -o builditall.tar
      - run:
          name: Verify Docker Image Binaries
          command: |
            ./terrier
Below is a Github Actions example configuration using Terrier to verify the contents of an image.
name: Go
on: [push]
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - name: Get Code
        uses: actions/checkout@master
      - name: Build Docker Image
        run: |
          docker build -t builditall .
      - name: Save Docker Image Locally
        run: |
          docker save builditall -o builditall.tar
      - name: Verify Docker Image Binaries
        run: |
          ./terrier
In this blog post, we have looked at how to perform multiple actions on Docker (and OCI) containers and images via Terrier. These actions allowed us to identify specific files according to their hashes, and to identify and verify multiple components, in both images and containers. Such checks are useful when attempting to prevent certain supply-chain attacks.
We have also seen how Terrier can be used in a DevOps pipeline via GitHub Actions and CircleCI.
Learn more about Terrier on GitHub at https://github.com/heroku/terrier.
Last month, journalist Matt Cohen tweeted about his years-long Instagram group chat comprised of fellow Matt Cohens, which he calls “the most wholesome thing I’m a part of.”
In the chat, one Matt Cohen shared that he “had [his] first day of college classes today,” to which a Matt Cohen responded “Nice. Just started my first job. Real world is brutal enjoy college man.”
“Got married!” and “Just started my dream job!” chimed in fellow Matt Cohens. Another Matt Cohen announced he had launched a weed brand. The Matt Cohens, who have turned a shared name into an informal online club, planned a Zoom Happy Hour to catch up.
Your name clones usually lurk around you like a shadow. You get their junk mail, their emails, their Google results; glimpses of their intimate moments via their digital ephemera. They are strangers — but they don’t have to be.
Around the world, people are maintaining multigenerational, global friendships with their same-named counterparts — Jake Wright, William Hodgson, Jordan DaSilva, and Josh Brown, to name a few. Sometimes, name twins commiserate about shared experiences: a sixteen-member Council of Aaron Johnson chat laments about the viral Key and Peele sketch that introduced the now-inescapable A-A-Ron nickname. Perhaps the best, or at least the most publicized, example of same-name camaraderie is the Josh Fight, when a group chat of Josh Swains organized an April 2021 meeting in Lincoln, Nebraska to fight for the “right” to the name. More than 900 Joshes showed up.
The Paul O’Sullivan Band has four members with one thing in common: the name Paul O’Sullivan. The group materialized after Baltimore Paul started “indiscriminately adding other Paul O’Sullivans on Facebook” and realized that a few of them were musicians. These days, the quartet, hailing from Baltimore, Rotterdam, Manchester, and Pennsylvania, is a bona fide musical group.
Since its early days, the social internet has been lauded as a way for niche interest groups to connect, and name twins are no exception. A chat titled “Council of Bens” hosts 2500 Benjamins and Bens, and when one Ben caught wind of a similar group chat of Sydneys, he created a chat just for people named Sydney or Ben, which has been going strong for months. Chris Lenaghan added 7 other Chris Lenaghans to a chat, and soon he had same-name friends from Ohio to Belfast to Birmingham. In a Josh Kaplan group chat on Twitter, fellow Josh Kaplans use the chat to congratulate each other on achievements and awards: “A win for one JK is a win for all.”
Samuel Stewart, a 19-year-old Exeter student living in London, formed an Instagram chat of fellow Samuel Stewarts after reading about the Josh Fight. For a few weeks, they chatted about their days; older Sam Stewarts gave advice to the younger ones. “They seemed to take me under their wing as if I were a younger version of them,” Sam said when we talked on the phone. But the chat went awry when one Samuel Stewart started asking for money. “I felt a bond with the fellow Samuel Stewarts, but the name connection wasn’t quite strong enough for me to start giving away my college fund,” Sam told me.
“They seemed to take me under their wing as if I were a younger version of them.”
The chats aren’t strictly social — sometimes, they’re the most practical way to sort through same-name mixups. Will Packer, a strategist in New York, recently used the Will Packer chat to see if any of his name brothers had been contributing to his inbox clutter. “Any of you from Queensland?” he asked. “Someone tried to create a PlayStation account with my email.”
College student Nolen Young says, “I once created a Facebook Messenger group chat with everyone I could find on Facebook with my exact same name, spelling and all. There were only two other people. One of them considered giving me a job, and the other was an old man who started commenting on all my photos. I've messaged the former a few times because he owns every domain name and email I've ever wanted, and he keeps telling me I can only have them when he dies.”
It’s easier than ever to connect with same-name pals today, but the uncanny allure of name clones predates social media. Tahnee Gehm, an artist and animator based in L.A., organized a Web 1.0 catalog of Tahnees when she was a teenager.
“My dad was into computers and he got me a URL with my name,” she says. “I built an atrocious ‘90s website in 2001 as an eighth grader, and I started getting messages from girls all over the world named Tahnee.”
To catalog her new pen pals, she created a “Hall of Tahnees” webpage with a photo, bio, and hometown for every Tahnee she could find. The site’s “Tahnee-only area” was a “weird, unique club.” Once, she says, a singer from the band Hanson used the website to track down a girl named Tahnee he’d met at a concert. And the Tahnee bond has lasted decades: Tahnee Gehm has maintained a long-distance friendship with Tahneé Engelen since they were in high school. A few years ago, Gehm spent two weeks visiting Engelen in Paris, where she works as a neurobiologist.
“It’s nice to know that my name buddy is living my alternate life and absolutely killing it,” she told Input over the phone.
Sometimes, all it takes to spark a friendship is a similar email address. Seth Capron met an older Seth Capron after noticing their similar interests based on the emails he mistakenly received — soon, they realized their physical resemblance, too. These days, the older Seth jokes that he could pass on his career. “I was actually considering that as I move into retirement, the Younger could just carry on in my former role of Seth Capron, affordable housing consultant,” said “Seth the Older.”
Name buddies sometimes have a parasocial relationship with each other’s digital footprint. As a kid, Chris Lenaghan found online videos of a different Chris Lenaghan doing wheelies and “cool BMX shit” and immediately told all his friends that it was him in the videos. Years later, thanks to a big group chat, Chris Lenaghan met the BMX trickster, who he now calls “Ohio Chris,” and they ended up becoming close friends.
The chats don’t always advance beyond acquaintanceship, though. Evan Quigley, a University of Florida student, says that the Evan Quigley group chat is “more like a running joke than true friendship.” (The Evan Quigleys, bonded by name alone, proclaim unconditional public support for one another by commenting “way to go, Evan Quigley” on each other’s posts).
People with uncommon first names can bond over shared experiences — mispronunciations, playground taunts, and misspellings. More than a dozen Zaviens have come together via Snapchat. “None of us had ever talked to another Zavien,” one Zavien told Input. And a 14-member-strong “Council of Ethyns” chat, which started on Instagram in 2019, is mostly dedicated to tongue-in-cheek malice toward Ethans (with an “a”). They also just pop in the chat to say “love you Ethyn” a lot.
Still, the unlikely connections evoke nostalgia for a simpler internet, less cluttered with surveillance and corporate interests, where people went to meet new friends. Wholesome chance encounters still happen: “text door neighbors,” people whose phone numbers are one digit apart, show how easy it is to stumble upon an unlikely friend. Most notable among the wrong-number-gone-right stories are Wanda and Jamal, whose viral wrong-number ordeal sparked a Thanksgiving tradition now six years long and counting, and is set to be featured in an upcoming Netflix movie.
It’s a big world out there — lots of Matt Cohens, more Alex Stewarts, and even more James Smiths — and your name buddies have never been easier to befriend. And I think that’s beautiful.
But just because we’re psychologically inclined to like our own name doesn’t mean a connection with your name clones is guaranteed. Just ask Kelly Hildebrandt and Kelly Hildebrandt, the couple who tied the knot a year after meeting through a Facebook name search, then called off the marriage four years later due to irreconcilable differences. It’s not all in a name.
Missouri’s school funding strategy recognizes that some children and communities need more financial support to meet education standards.
It directs extra funds to districts that have a harder time raising local property taxes, and to children who have special needs, are learning English or are living in poverty.
But experts say while the system has good intentions, the devil is in the details.
Parts of Missouri’s funding strategies undermine its equity goals, school finance researchers say. Overall, this leads to a situation where some districts receive state aid they don’t need while others are stretched thin as they attempt to serve children who need more resources.
“We have a large portion of students across the state who experience some form of economic disadvantage,” said Cameron Anglum, an assistant professor of education, policy and equity at the Saint Louis University School of Education.
“It’s really important that the state funding formula serve those kids effectively, particularly those kids that live in districts that don’t have the local property wealth … to provide an adequate education.”
Bruce Baker, a professor and chair of the department of teaching and learning at the University of Miami School of Education and Human Development, said the main goals of a school finance formula should be adequacy and equity.
Adequacy means there’s enough funding for the school to meet certain goals. Equity acknowledges that some students or schools may require greater funding to meet those standards.
To illustrate, Baker referred to the School Finance Indicators Database run by the Albert Shanker Institute and Rutgers Graduate School of Education.
The database calculates that in 2019, the latest data available, the 20% of districts with the highest poverty rates in Missouri needed nearly $12,000 more per student to reach national average test scores than the 20% of districts with the lowest poverty rates.
Instead, the database shows students in highest-poverty districts were receiving only about $1,000 more than students in the most affluent schools.
Baker, one of the main researchers for the database, said the numbers are based on a statistical model that uses data on student characteristics, hiring costs and district size to calculate necessary funding levels, which are different in each state.
James Shuls, an associate professor of educational leadership and policy studies at the University of Missouri-St. Louis, said school funding “reflects the values of people” and appropriate levels should be determined through the political process.
Shuls said he personally values choice, equity and efficiency in education funding. He previously worked for the Show-Me Institute, where he authored a Missouri school finance formula primer. The institute is a think tank “dedicated to promoting free markets and individual liberty” and supportive of policies that increase “school choice.”
An ideal funding formula should be “dynamic,” said Shuls, reflecting changing local resources and specific student needs to better promote equity.
Instead, Missouri’s formula reflects outdated property values and school funding levels, Shuls said.
Missouri’s school funding formula starts with an “adequacy target,” the amount of money needed to educate a single student. It multiplies that number based on student attendance, area cost of living and, in some cases, student characteristics that might require extra funds such as disability or learning English.
The formula then factors in how much funding districts can raise from local property taxes.
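The formula as described above can be sketched in a few lines. This is an illustrative simplification, not Missouri’s statutory formula, and every number in it is hypothetical.

```python
def state_aid(adequacy_target, weighted_ada, cost_of_living_factor, local_effort):
    """Sketch of the structure described above: a per-student adequacy target,
    multiplied by weighted attendance and an area cost-of-living factor,
    minus what the district can raise from local property taxes.
    All parameter values are hypothetical illustrations."""
    need = adequacy_target * weighted_ada * cost_of_living_factor
    # State aid cannot go below zero
    return max(need - local_effort, 0)

# Hypothetical district: 1,000 weighted students, average cost of living,
# $2.5 million raised locally
print(state_aid(6_375, 1_000, 1.0, 2_500_000))  # 3875000.0
```

A wealthier district that can raise more than its calculated need locally would, under this sketch, receive no equalization aid at all.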
Anglum, the SLU professor, said one equity challenge is that the state doesn’t manage a majority of the funding that goes to schools.
Some state funding also goes through programs that don’t have the same equity focus as the main formula.
Missouri ranks 47th out of 50 states when it comes to the percentage of school funding that comes from the state. When all sources — local, state and federal — are combined, 2018-19 data from the National Center for Education Statistics shows that K-12 per-student spending in Missouri ranks 32nd in the nation.
Additionally, 2021 data from the National Education Association, a prominent teachers union, shows that not counting the District of Columbia, Missouri has the second-highest percentage of funding coming from local sources and the smallest percentage coming from state sources.
“When we are relying predominantly on local resources in order to fund education, higher-wealth districts are going to win out and lower-wealth districts are going to lose out,” Anglum said.
Traci Gleason, vice president for external affairs at the Missouri Budget Project, said lower state funding can cause localities with fewer resources to choose between underfunding services — including education — or imposing burdensome levels of property and sales taxes.
When legislators reformed school finance in 2005, they also included “hold harmless” provisions to ensure no district would receive less state money under the new formula.
Shuls said that was a sensible way to prevent abrupt funding dips for some districts under the new system. But the “hold harmless” provisions didn’t phase out, meaning many districts are still being funded at outdated levels instead of updated, equitable ones.
When the formula calculates districts’ ability to raise local property taxes, it’s using property values that are now more than 16 years old. That means the state is giving districts with growing property values more funds than they need to meet targets, instead of distributing that money in other ways.
Baker, the University of Miami professor with experience in Missouri and Kansas, said Missouri’s “hold harmless” provisions aren’t even the biggest factor that prevents the state from having a “progressive” funding system. (In this case, “progressive” means districts with greater need spend more money per student.)
Hold harmless provisions tend to partially undo reforms, he agreed. “But I’m not convinced that any of the changes they were making would have very aggressively moved it in the right direction anyway.”
Instead, he said the state’s method of calculating attendance financially penalizes districts that most need support.
Missouri calculates the number of students in each district by using the average daily attendance instead of the total number of students.
That means a school with an 80% attendance rate on the average day could see its funding cut by 20% compared to an otherwise identical school with perfect attendance.
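The attendance penalty is simple arithmetic. A brief sketch, using a hypothetical per-pupil amount:

```python
def ada_funding(enrollment, attendance_rate, per_pupil_amount):
    # Funding follows average daily attendance (ADA), not enrollment
    return enrollment * attendance_rate * per_pupil_amount

# Two otherwise identical 1,000-student districts at a hypothetical
# $10,000 per pupil: the 80%-attendance district loses 20% of its funding,
# even though it still has to staff and serve all 1,000 enrolled students
print(ada_funding(1_000, 1.00, 10_000))  # 10000000.0
print(ada_funding(1_000, 0.80, 10_000))  # 8000000.0
```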
Baker said that’s especially problematic when it comes to equity because schools with lower attendance rates tend to have higher rates of students living in poverty.
“It’s been explained to policymakers in every damn state that it is discriminatory and erases any need adjustment to fund on average daily attendance, and only a few states are bold enough to still do it,” he said. “There’s no excuse for doing it. There’s no legitimate incentive that funding on average daily attendance will, you know, cause attendance to improve.”
Anglum and Shuls agreed that using average daily attendance penalizes poorer schools, although Shuls said there are pros and cons in all methods of calculating attendance.
Another quirk of Missouri’s system is that while it “weights” students who are typically more costly to educate, it only does so when the percentage of students in a specific category exceeds a specific threshold.
For example, the threshold for students receiving free and reduced-price lunch — a common way to estimate numbers of low-income students — is a bit more than 30%. Schools that serve a higher percentage than that get extra funding. Meanwhile, a district serving 25% students in that category doesn’t receive more funding than a district serving 5%.
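The cliff effect of a threshold can be illustrated in code. The threshold and weight values below are assumptions for illustration only, not Missouri’s statutory figures.

```python
def frl_extra_weight(frl_share, threshold=0.31, weight=0.25):
    # Extra weighting applies only to the share of free/reduced-price-lunch
    # students above the threshold (both parameters are illustrative)
    return weight * max(frl_share - threshold, 0.0)

# A district at 25% FRL gets the same extra weight as one at 5%: none
print(frl_extra_weight(0.25), frl_extra_weight(0.05))  # 0.0 0.0
print(frl_extra_weight(0.50))  # about 0.0475
```

Under this scheme, every district below the threshold is treated identically, regardless of how many low-income students it actually serves.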
Shuls, Baker and Anglum all criticized the use of thresholds. Shuls suggested the state could even differentiate the amounts granted for special-needs students — who can have very different funding needs — to better create a system where money “follows the student.”
Baker said that over the past decades, Kansas has strengthened its school finance system while Missouri’s has weakened.
Baker formerly taught at the University of Kansas and was involved in discussions surrounding school finance reform in both Missouri and Kansas. He recently published “School Finance and Education Equity: Lessons from Kansas,” which he said includes many comparisons with Missouri.
“The costs to get to the same outcomes are a little lower in Kansas, but Kansas also much more robustly funds their system,” Baker said.
In Kansas, 58% of districts are spending above the adequate level and achieving results above the national average, Baker said. In Missouri, only 43% of districts are doing the same.
Meanwhile, about 13% of Kansas districts are spending below the targets and achieving below the national average. Nearly 30% of Missouri districts are in the same boat.
“There’s much more inequality in Missouri; there’s far more kids in inadequately funded districts that then have inadequate outcomes to go along with that,” Baker said. “Kansas has just done much better in that regard, over time.”
The Beacon is working on a larger story about Kansas’ school finance formula.
A report from the Missouri Budget Project shows Missouri’s overall K-12 funding target for each student, adjusted for inflation, is less than the 2007 amount — by about $1,000.
Gleason, the project spokesperson, said that while Kansans reacted to abrupt funding cuts several years ago and restored funding, Missourians haven’t been as aware of gradual funding declines.
“Missouri has been more like the frog in the frying pan, or boiling water … We just haven’t noticed because it’s happened so slowly over time.”
The post Missouri’s school funding system undermines its own goals for equity, experts say appeared first on The Beacon.
Last weekend I happened to pick up a book called “Rituals For Work: 50 Ways To Create Engagement, Shared Purpose, And A Culture That Can Adapt To Change.” It’s a super quick read, more comic book than textbook, but I liked it.
It got me thinking about the many rituals I have initiated and/or participated in over the course of my career. Of course, I never thought of them as such — I thought of them as “having fun at work” — but now I realize these rituals disproportionately contribute to my favorite moments and the most precious memories of my career.
Rituals (a definition): Actions that a person or group does repeatedly, following a similar pattern or script, in which they’ve imbued symbolism and meaning.
I think it is extremely worth reading the first 27 pages of the book — the Introduction and Part One. To briefly sum up the first couple chapters: the power of creative rituals comes from their ability to link the physical with the psychological and emotional, all with the benefit of “regulation” and intentionality. Physically going through the process of a ritual helps people feel satisfied and in control, with better emotional regulation and the ability to act in a steadier and more focused way. Rituals also powerfully increase people’s sense of belonging, giving them a stable feeling of social connection. (p. 5-6)
The thing that grabbed me here is that rituals create a sense of belonging. You show that you belong to the group by participating in the ritual. You feel like you belong to the group by participating in the ritual. This is powerful shit!
It seems especially relevant these days when so many of us are atomized and physically separated from our teammates. That ineffable sense of belonging can make all the difference between a job that you do and a role that feeds your soul. Rituals are a way to create that sense of belonging. Hot damn.
So I thought I’d write up some of the rituals for engineering teams I remember from jobs past. I would love to hear about your favorite rituals, or your experience with them (good or bad). Tell me your stories at @mipsytipsy.
At Linden Lab, in the ancient era of SVN, we had something called the “Feature Fish”. It was a rubber fish that we kept in the freezer, frozen in a block of ice. We would periodically cut a branch for testing and deployment and call a feature freeze. Merging code into the branch was painful and time-consuming, so if you wanted to get a feature in after the code freeze, you had to first take the fish out of the freezer and unfreeze it.
This took a while, so you would have to sit there and consider your sins as it slowly thawed. Subtext: Do you really need to break code freeze?
You were supposed to pair with another engineer for code review. In your commit message, you had to include the name of your reviewer or your merge would be rejected. But the template would also accept the name “Stuffy”, to confess that your only reviewer had been…Stuffy, the stuffed animal.
However if your review partner was Stuffy, you would have to narrate the full explanation of Stuffy’s code review (i.e., what questions Stuffy asked, what changes he suggested and what he thought of your code) at the next engineering meeting. Out loud.
We had a matted green felt headband with ogre ears on it, called the Shrek Ears. The first time an engineer broke production, they would put on the Ears for a day. This might sound unpleasant, like a dunce cap, but no — it was a rite of passage. It was a badge of honor! Everyone breaks production eventually, if they’re working on something meaningful.
If you were wearing the Shrek Ears, people would stop you throughout the day and excitedly ask what happened, and reminisce about the first time they broke production. It became a way for 1) new engineers to meet lots of their teammates, 2) to socialize lots of production wisdom and risk factors, and 3) to normalize the fact that yes, things break sometimes, and it’s okay — nobody is going to yell at you.
This is probably the number one ritual that everybody remembers about Linden Lab. “Congratulations on breaking production — you’re really one of us now!”
We had a stuffed Vorpal Bunny, duct taped to a 3″ high speaker stand, and the operations engineer on call would put the bunny on their desk so people knew who it was safe to interrupt with questions or problems.
At some point we lost the bunny (and added more offices), but it lingered on in company lore since the engineers kept on changing their IRC nick to “$name-bunny” when they went on call.
There was also a monstrous, 4-foot-long stuffed rainbow trout that was the source of endless IRC bot humor… I am just now noticing what a large number of Linden memories involve stuffed animals. Perhaps not surprising, given how many furries were on our platform.
Whenever an engineer really took one for the team and dove headfirst into a spaghetti mess of tech debt, we would award them the “Tiara of Technical Debt” at the weekly all hands. (It was a very sparkly rhinestone wedding tiara, and every engineer looked simply gorgeous in it.)
Examples included refactoring our golang rewrite code to support injection, converting our entire jenkins fleet from AWS instances to containers, and writing a new log parser for the gnarliest logs anyone had ever seen (for the MongoDB pluggable storage engine update).
We spent nearly 2.5 years rewriting our entire ruby/rails API codebase to golang. Then there was an extremely long tail of getting rid of everything that used the ruby unicorn HTTP server, endpoint by endpoint, site by site, service by service.
When we finally spun down the last unicorn workers, I brought in a bunch of rainbow unicorn paper sculptures and a jug of lighter fluid, and we ceremonially set fire to them in the Facebook courtyard, while many of the engineers in attendance gave their own (short but profane) eulogies.
This one requires a bit of backstory.
Finally we caved and got on board. We were excited! I announced the migration and started providing biweekly updates to the infra leadership groups. Four months later, when the migration was half done, I get a ping from the same exact members of Facebook leadership:
“What are you doing?!?”
“You can’t do that, there are security issues!”
“No it’s fine, we have a fix for it.”
“There are hardware issues!”
“No it’s cool, we got it.”
“You can’t do this!!!”
ANYWAY. To make an EXTREMELY long and infuriating story short, they pulled the plug and canned the whole project. So I printed up a ten foot long “Mission Accomplished” banner (courtesy of George W. Bush on the aircraft carrier), used Zuck’s credit card to buy $800 of top-shelf whiskey delivered straight to my desk (and cupcakes), and we threw an angry, ranty party until we all got it out of our systems.
I honestly don’t remember what this one was about, but I have extensive photographic evidence to prove that I shaved the heads of and/or dyed the hair blue of at least seven members of engineering. I wish I could remember why! but all I remember is that it was fucking hilarious.
Coincidentally (or not), I have no memories of participating in any rituals at the jobs I didn’t like, only the jobs I loved. Huh.
One thing that stands out in my mind is that all the fun rituals tend to come bottom-up. A ritual that comes from your VP can run the risk of feeling like forced fun, in a way it doesn’t if it’s coming from your peer or even your manager. I actually had the MOST fun with this shit as a line manager, because 1) I had budget and 2) it was my job to care about teaminess.
There are other rituals that it does make sense for executives to create, but they are less about hilarious fun and more about reinforcing values. Like Amazon’s infamous door desks are basically just a ritual to remind people to be frugal.
Rituals tend to accrue mutations and layers of meaning as time goes on. Great rituals often make no sense to anybody who isn’t in the know — that’s part of the magic of belonging.
Claydream, Marq Evans’ new documentary about animator Will Vinton, addresses the elephant in the room immediately: yes, this is the guy who lost his company to his most deep-pocketed investor, Nike founder Phil Knight. It’s something that looms over the film, but it’s not the only melancholy element that colors this portrait of Vinton’s life and career.
Made with the cooperation of Vinton himself, who died of cancer in 2018 but is interviewed extensively here, Claydream offers a visual history of his remarkable accomplishments. Not only do we get a look at the progression of Vinton’s work over the years (from Closed Mondays, the Oscar-winning 1974 short he created with Bob Gardiner, to his company’s instantly recognizable commercial work from the ‘80s and ‘90s, including the California Raisins), we also get access to home movies, as well as firsthand accounts from friends, family members, and former coworkers. After sparking to filmmaking while at UC Berkeley in the 1960s, Vinton (who prized experimentation and creative fulfillment above all else, and was definitely a bit of a hippie) set up a small workshop with his collaborators in Portland, Oregon, a location that kept their productions deliberately removed from the Hollywood machine—the same machine he’d end up pursuing years later, when Will Vinton Studios was at its peak.
Most of Claydream keeps the focus on Vinton’s work—again, this movie is a visual feast, jam-packed with clips and other ephemera (including answering-machine messages from a California Raisins-obsessed Michael Jackson) that illustrate the narrative of Vinton’s career every step of the way. But for all his success, and for the admirable way he bounced back from his periodic failures and missteps, he never achieved the heights of his idol, Walt Disney, whose life trajectory he emulated, down to plans for a never-realized “Claymation Station” amusement park. Though he was well-liked as a person, not everyone he worked with is full of praise; there were issues over the years of sharing credit with the other animators who toiled on his projects, as well as some bad business decisions that meant, for instance, that Will Vinton Studios didn’t share in the licensing for the insanely marketable California Raisins—and also that Vinton passed on selling his company to Pixar during its pre-Disney era. A contentious split with the troubled Gardiner soon after their shared Oscar win haunted Vinton until Gardiner’s death in 2005. But as Claydream amply illustrates, the Phil Knight debacle ended up being the biggest tragedy of Vinton’s creative life.
Neither Knight nor his son Travis Knight is interviewed in Claydream; we see them in deposition and archival footage only. Travis Knight, now a film director known for the stop-motion feature Kubo and the Two Strings as well as the live-action Transformers spin-off Bumblebee, comes off particularly badly just on the basis of the facts presented: a failed rapper, he was hired at Will Vinton Studios after his father invested in it, where he developed his (by all accounts) true talent and passion for animation. But there’s no escaping the “nepotism baby” aroma that envelops him in this context, especially when the documentary points out that he became head of Will Vinton Studios—renamed Laika—after Vinton, who was unable to rescue his financially struggling company, was pushed out.
It’s juicy show-biz stuff, for sure, but Vinton makes a point of turning what was obviously an incredibly devastating blow into something positive. Looking back several years after he lost his studio, he sounds genuinely proud of its continued success, specifically in the way that Laika—which has since become a Hollywood powerhouse with acclaimed titles like Coraline, ParaNorman, The Boxtrolls, Missing Link, and Knight’s Kubo—brought stop-motion to an ever-wider audience while innovating on the art form. It couldn’t have been easy for Vinton to make peace with the situation, but Claydream sure makes it seem like he was able to. Perhaps, as in his earliest days as a counterculture animator, it all came down to what really mattered: making an end product that was cool as it could possibly be. Even if Vinton wasn’t directly involved in any of Laika’s titles, his legacy lives on.
Claydream hits select theaters today, August 5.
I’m Brandon Sanderson, a bestselling fantasy author. Best known for The Stormlight Archive, Mistborn, and for finishing Robert Jordan’s The Wheel of Time, I’m now also known for having the highest-funded campaign in Kickstarter’s history for four books I wrote during the quarantine. If you want to stay up to date with me, you should check out my YouTube channel (where you can watch me give my answers to the questions below) and my Facebook, Twitter, and Instagram. Ask me any questions you like, but I’m less likely to answer questions with massive spoilers for the books. I’ll be taking questions today only.
EDIT: I'm off the livestream and have had some dinner. The transcription of some questions is still coming, as...well, I talk a lot. Those answers will be posted soon, or you can see them on the VOD of my answers on the YouTube channel.
Apologies for the stream-of-consciousness wall-of-text answers. This was a new thing for us, finding a way for me to be able to give answers for people while also getting piles of pages signed. I hope you can make sense of the sometimes rambling answers I give. They might flow better if you watch them be spoken.
Thanks, all, for the wonderful AMA. And as I said, some answers are still coming (and I might pop in and write out a few others that I didn't get to.)
Russia’s invasion of Ukraine has exacerbated a number of fault lines already present within the global energy supply chain. This is especially true in Europe, where many countries were reliant on the superstate's natural resources, and are now hastily looking to cut ties before the supply is shut off. This has revealed the fragility of Europe’s energy market, and caused it to drive up demand and prices for consumers all over the globe.
In the UK, things are becoming increasingly dire and energy prices are skyrocketing. Bad planning on the infrastructure side and the cancellation of several major domestic energy efficiency programs are exacerbating the problem. It’s clear that real, useful action on the national level isn’t coming any time soon. So, I wondered, what would happen if I, personally, simply tried to break up with natural gas on my own? It’s relatively straightforward but, as it turns out, it comes at a cost that only one percenters will be able to bear.
I live in a four-bedroom, end-terraced house that’s around 150 years old and I’ve tried, as best as I can, to renovate it in an eco-friendly way. Since we bought it almost a decade ago, my wife and I have insulated most of the rooms, installed a new gas central heating system and hot water cylinder. We are, like nearly 20 million other households in the UK, reliant on natural gas to supply our home heating, hot water and cooking. And in the period between January 8th and April 7th, 2022, I was billed on the following usage:
[Table: cost per unit (GBP) for electricity (incl. standing charge) and gas (incl. standing charge), plus the total (incl. tax and other charges).]
Essentially, I paid around $1,300 for my natural gas and electricity in the first quarter of 2022. That figure is likely to rise significantly, as the UK’s mandatory price cap on energy rose by more than 50 percent in April. A further price rise is scheduled for October, with the figure set at £2,800 per year, even though wholesale energy prices are no longer increasing. It’s likely that my energy bill for the first quarter of 2023 will be nearly twice what I’ve just paid. In 2020, the UK reported that 3.16 million households were unable to pay for their energy costs; that figure is likely to leap by 2023.
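The projection above is easy to sanity-check. Here is a minimal Python sketch using the article’s rounded figures; the post-April cap level of about £2,000 a year is an assumption for illustration, not a number stated in the text:

```python
# Back-of-envelope check on the bill projection described above.
# Figures are the article's rounded numbers; assumed_april_cap is a guess.
q1_2022_bill = 1300          # USD paid for gas + electricity in Q1 2022
april_rise = 0.54            # April 2022 cap rise of "more than 50 percent"
october_cap = 2800           # announced annual cap for October 2022 (GBP)
assumed_april_cap = 2000     # ASSUMED annual cap level after the April rise (GBP)

# The October cap implies a further rise of roughly 40 percent on top of April's.
october_rise = october_cap / assumed_april_cap - 1

# Applying both rises in sequence to the same usage pattern:
q1_2023_estimate = q1_2022_bill * (1 + april_rise) * (1 + october_rise)
print(round(q1_2023_estimate))  # roughly double the Q1 2022 bill
```

Under these assumptions the same quarter a year later costs on the order of twice as much, which is where the “nearly twice what I’ve just paid” estimate comes from.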
In the US, the EIA says that monthly utility bills rose to a national average of $122 in 2021, with Hawaii ($178 per month) and Utah ($82 per month) the most expensive and cheapest states, respectively. The average price per kWh is around 13.7 cents, which is less than half the comparable price in the UK as it currently stands. The average natural gas price for residential customers, meanwhile, was $10.84 per thousand cubic feet in 2020.
Much of Europe is reliant on natural gas, a significant proportion of which was supplied by Russia. Despite a rapid decline in domestic production, Europe sought to make natural gas the bedrock of its energy policy in the medium term. A 2013 policy paper written by Sami Andoura and Clémentine d’Oultremont outlined the reasons why officials were banking on it. “An economically attractive option for investors, a potential backup source for renewables and the cleanest fossil fuel, natural gas is expected to play an important role in the European transition towards a low-carbon economy by 2050.” This is despite the fact that “European energy resources are being depleted, and energy demand is growing.”
In 2007, then EU Energy Commissioner Andris Piebalgs said that the bloc is “dependent on imports for over one half of our energy use.” He added that energy security is a “European security issue,” and that the bloc was vulnerable to disruption. “In 10 years, from 1995 to 2005, natural gas consumption in the EU countries has increased from 369 billion to 510 billion m3 [of gas] per year,” he said. He added that the EU’s own production capacity and reserves peaked in the year 2000.
The EU’s plan was to pivot toward Liquified Natural Gas (LNG), methane which has been filtered and cooled to a liquid for easier transportation. It enables energy supplies from further afield to be brought over to Europe to satisfy the continent’s need for natural gas. But the invasion of Ukraine by Russia has meant that this transition has now needed to be accelerated as leaders swear off Russian-sourced gas and oil. And while the plan is to push more investment into renewables, LNG imports are expected to fill much of the gap for now.
Except, and this is crucial, many of the policy decisions made during this period seem to have been made in the belief that nothing bad would, or could, disrupt supply. Here in the UK, wholesale gas prices have risen fivefold since the start of 2021, but there’s very little infrastructure available to mitigate price fluctuations.
The Rough field, situated in the North Sea 18 miles off the coast of Yorkshire, was previously a source of natural gas for the UK. In 1985, however, it was converted into a natural gas storage facility with a capacity of 3.31 billion cubic meters. This one facility was able to fulfill the country’s gas needs for a little more than a week at a time and was considered a key asset in maintaining the UK’s energy security.
However, Centrica, the private company spun out of the former state-owned British Gas, opted to close the field in 2017. It cited safety fears and the high cost of repair as justification for the move, saying that alternative sources of gas – in the form of LNG – were available. At the time, one gas trader told Bloomberg that the closure would “boost winter prices” and “create seasonal swings in wholesale energy costs.” He added that the UK would now be “competing with Asia for winter gas cargoes,” raising prices and increasing reliance on these shipments.
And, unsurprisingly, the ramifications of this decision were felt in the summer of 2017 when a pair of LNG tankers from Qatar changed course. The vessels were going to the UK, and when they shifted direction, Bloomberg reported that prices started to shift upward almost instantly.
Analysis from TransitionZero, reported by The Guardian, says that the costs associated with natural gas are now so high that it’s no longer worth investing in as a “transition fuel.” It says that the cost to switch from coal to gas is around $235 per ton of CO2, compared to just $62 for renewables as well as the necessary battery storage.
In order to break up with gas in my own home, I’ll need to swap out my stovetop (not so hard) and my whole central heating system (pretty hard). The former I can likely achieve for a few hundred dollars, plus or minus the cost of installation. (Some units just plug in to a standard wall socket, so I may be able to do much of the work myself if I’m feeling up to the task.) Of course, getting a professional to unpick the gas pipeline that connects to my stovetop is going to be harder.
Unfortunately, replacing a 35kW condensing gas boiler (I have the Worcester Bosch Greenstar 35CDi) is going to be a lot harder. The obvious choice is an Air Source Heat Pump (ASHP), or even a geothermal Ground Source Heat Pump (GSHP), both of which are more environmentally-friendly. After all, both are more energy-efficient than a gas boiler, and both run on electricity which is theoretically cleaner.
More generally, the UK’s Energy Saving Trust, a Government-backed body with a mission to advocate for energy efficiency, says that the average Briton should expect to pay between £7,000 and £13,000 to install an ASHP. Much of that figure is dependent on how much of your home’s existing hardware you’ll need to replace. A GSHP is even more expensive, with the price starting at £14,000 and rising to closer to £20,000 depending on both your home’s existing plumbing and the need to dig a bore hole outside.
In my case, heat pump specialists told me that, give or take whatever nasties were found during installation, I could expect to pay up to £27,000 ($33,493). This included a new ASHP, radiators, hot water and buffer cylinders, pumps, piping, controllers, parts and labor. Mercifully, the UK is launching a scheme to offer a £5,000 ($6,200) discount on any new heat pump installations. But that still means that I’m paying north of £20,000 (and ripping out a lot of existing materials with plenty of life left in them) to make the switch.
In the US, there’s plenty of difference on a state level, but at the federal level, you can get a tax credit on the purchase of a qualifying GSHP. A system installed before January 1st, 2023, will earn a 26 percent credit, while a unit running before January 1st, 2024, will be eligible for a 22 percent credit. Purchasers of a qualifying ASHP, meanwhile, were entitled to a $300 tax credit until the end of 2021.
The contractors also provided me with a calculation of my potential energy savings over the following seven years. It turns out that I’d actually be spending £76 more on fuel per year, or £532 over the whole period. On one hand, if I had the cash to spare, that’s a small price to pay to dramatically reduce my personal carbon emissions. On the other, I was hoping that the initial investment would help me reduce costs overall, but that’s not the case while gas remains (ostensibly) cheaper than electricity. (This will, of course, change as energy prices surge in 2023, but I can only look at the data as it presently stands.)
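The contractors’ comparison reduces to trivial arithmetic. Note that £76 × 7 = £532, so the £76 excess only reconciles with the seven-year total as an annual figure:

```python
# Reproducing the contractors' seven-year running-cost comparison.
# 76 * 7 = 532, so the quoted excess must be per year, not per month.
extra_fuel_cost_per_year = 76    # GBP more spent on fuel with the heat pump
period_years = 7

total_extra = extra_fuel_cost_per_year * period_years
print(total_extra)  # 532
```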
An aside: To be honest with you all, I was fully aware that the economic case for installing a heat pump was always going to be a shaky one. When I spoke to industry figures last year, they pointed out that the conversation around “payback” never comes up when installing standard gas boilers. It doesn’t help that, at present, levies on energy mean that natural gas is subsidized more heavily than electricity, disincentivizing people from making the switch. The rise of electric cars, too, means that demand for power will increase sharply as more people switch, forcing greater investment in generation. Just as urgent is a series of measures to promote energy efficiency, reducing overall demand for both gas and electricity.
The UK has had an on-again, off-again relationship with climate change mitigation measures, which has helped sow the seeds of this latest crisis. The country, with low winter temperatures, relies almost exclusively on natural gas to heat its homes, its largest energy-consuming sector. As I reported last year, around 85 percent of UK homes are heated by burning natural gas in domestic boilers.
Work to reduce the UK’s extraordinary demand for natural gas was sabotaged by the government in 2013. In 2009, under the previous Labour government, a series of levies on energy companies was introduced under the Community Energy Saving Programme. These levies were added to domestic energy bills, with the proceeds funding works to install wall or roof insulation, as well as energy-efficient heating systems and heating controllers for people on low incomes. The idea was to reduce demand for gas by making homes, and the systems that heated them, far more efficient, since most of the UK’s housing stock was insufficiently insulated when built.
But in 2013, then Conservative Prime Minister David Cameron reportedly said that he wanted to reduce the cost of domestic energy bills by getting “rid of all the green crap.” At the time, The Guardian reported that while government officials would not corroborate the wording, they confirmed the sentiment. Essentially, that meant scrapping the levies, which GreenBusinessWatch said then made up around eight percent of the total cost of domestic energy. Cameron’s administration also scrapped a plan to build zero-carbon homes, and effectively banned the construction of onshore windfarms, which would have helped reduce the cost of domestic electricity generation.
In 2021, the UK’s Committee on Climate Change examined the fallout from this decision, saying that Cameron’s decision kneecapped efforts to reduce demand for natural gas. As Carbon Brief highlighted at the start of 2022, in 2012, there were nearly 2.5 million energy efficiency improvements installed. By 2013, that figure had fallen to just 292,593. The drop off, the Committee on Climate Change believes, has caused insulation installations to fall to “only a third of the rate needed by 2021” to meet the national targets for curbing climate emissions.
Carbon Brief’s report suggests that the elimination of these small levies – the “green crap” – has cost UK households around £2.5 billion in missed savings. In recent years, a pressure group, Insulate Britain, has staged protests at major traffic intersections to highlight the need for a new retrofit program. The current government’s response to its pleas has been to call for tougher criminal penalties for protesters, including a jail term of up to six months.
Looking back through my energy bills over the last few years, my household’s annual electricity consumption is around 4,500kWh per year. A heat pump would likely add a further 6,000kWh to my energy bill, not to mention any additional cost for switching to all-electric cooking. It would be sensible to see if I could generate some, or all, of my own energy at home using solar panels to help reduce the potential bill costs.
The Energy Saving Trust says that the average homeowner can expect to pay £6,500 for a 4.2kWp system on the roof of their home. Environmental factors, such as the country you live in and the orientation of your property, mean you can’t be certain how much power you’ll get out of a specific solar panel, but we can make educated guesses. For instance, the UK’s Renewable Energy Hub says you can expect to get around 850kWh per year out of a 1kWp system. For a theoretical 5kWp system in my location, the Energy Saving Trust thinks I’ll be able to generate around 4,581kWh per year.
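That rule of thumb can be turned into a quick estimator. A sketch in Python, assuming the Renewable Energy Hub’s ~850kWh-per-kWp figure and the consumption numbers mentioned earlier; real output depends heavily on location, orientation and shading, as the site-specific 4,581kWh estimate shows:

```python
# Rough annual solar yield from the rule of thumb quoted above:
# ~850 kWh per year per kWp of installed capacity (UK Renewable Energy Hub).
KWH_PER_KWP_PER_YEAR = 850

def annual_yield_kwh(system_kwp: float) -> float:
    """Rough annual output (kWh) for a system of the given peak capacity."""
    return system_kwp * KWH_PER_KWP_PER_YEAR

# Current household electricity use plus the estimated heat pump load (kWh/yr).
annual_demand = 4500 + 6000

for size_kwp in (4.2, 5.0, 6.0):
    yield_kwh = annual_yield_kwh(size_kwp)
    print(f"{size_kwp} kWp -> ~{yield_kwh:.0f} kWh/yr "
          f"({yield_kwh / annual_demand:.0%} of projected demand)")
```

Even a 6kWp array, by this estimate, covers less than half of an all-electric household’s projected demand, which is why battery storage and grid top-ups remain part of the picture.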
Sadly, I live in an area where, even though my roof is brand new and strong enough to take panels, they aren’t allowed. This is because it is an area of “architectural or historic interest where the character and appearance [of the area] needs to be protected or improved.” Consequently, I needed to explore work to ground-mount solar panels in my back garden, which gets plenty of sunlight.
While I expected ground-mounted panel installations to be much cheaper, they apparently aren’t. Two contractors I spoke to said that while their average roof-based installation is between £5,000 and £7,000, a 6kWp system on the ground would cost closer to £20,000. It would, in fact, be cheaper to build a sturdy shed in the bit of back yard I had my eye on and install a solar system on top of it than to set up the mounting on the ground. That would push the cost up even further, and that’s before we get to the question of battery storage.
For this rather nifty thought experiment, the cost for me to be able to walk away from natural gas entirely would be north of £30,000 ($37,000). Given that the average UK salary is roughly £38,000, it’s a sum that is beyond the reach of most people without taking out a hefty loan. This is, fundamentally, why the need for government action is so urgent, since it is certainly beyond the ability of most people to achieve this change on their own.
In fact, it’s going to require significant movement from central government not just in the UK but elsewhere to really shake our love-hate relationship with natural gas. Unfortunately, given that it’s cheap, cleaner than coal and the energy lobby has plenty of muscle behind it, that’s not likely to happen soon. And so we’re stuck in a trap – it’s too expensive to do it ourselves (although that’ll certainly be an interesting experiment to undertake) and there’s no help coming, despite the energy crisis that’s unfurling around us.
Maps of the American West have featured ever darker shades of red over the past two decades. The colors illustrate the unprecedented drought blighting the region. In some areas, conditions have blown past severe and extreme drought into exceptional drought. But rather than add more superlatives to our descriptions, one group of scientists believes it's time to reconsider the very definition of drought.
<SimonSapin> nox: the history of packaging in python is
<nox> SimonSapin: All I need to know is, is setuptools old stuff or new stuff?
<SimonSapin> nox: its been both
<SimonSapin> in that order
I first fell in love with wuxia when I was around eight or so. I remember running around swinging the bright yellow handle of my toy broom as a sword, calling a sprawling tiger stuffed toy my master and pretending the shower was a waterfall I could learn the secrets of the universe under. I ran on tiptoe because that was somehow more like flying—or “hing gung” 輕功, the art of lightness, as I would eventually become fond of translating it.
But even before then I was deeply familiar with the genre; its many conventions have become baked into the everyday language of the Hong Kong I grew up in. My relatives all played Mahjong and, much like with sports, discussions around these games borrowed heavily from the language of sparring martial artists. I’d ask at the end of every Sunday what the results of the battles were. When asking for a family recipe, someone would joke that they’d have to become the apprentice of this or that auntie. Later, there was the world of study guides and crib sheets, all calling themselves secret martial arts manuals. The conventions around martial artists going into seclusion to perfect their craft and going mad in the pursuit of it take on new meaning as slang around cramming for exams.
Which is all to say, I really love wuxia.
“Wuxia”, literally meaning “martial hero”, is a genre about martially powerful heroes existing in a world parallel to, and in the shadows of, Chinese imperial history.
The archetypal wuxia hero is someone carving out their own path in the world of rivers and lakes, cleaving only to their own personal code of honour. These heroes are inevitably embroiled in personal vengeance and familial intrigue, even as they yearn for freedom and seek to better their own skills within the martial arts. What we remember of these stories are the tournaments, the bamboo grove duels and the forbidden love.
Parallels are often drawn to the knights errant of medieval romances, with many older translations favouring a chivalric vocabulary. There are also obvious comparisons to be made with the American western, especially with the desperados stumbling into adventures in isolated towns in search of that ever-elusive freedom.
It is easy to think of wuxia in these universal terms with broad themes of freedom, loyalty and justice, but largely divorced from contemporary politics. These are stories, after all, that are about outlaws and outcasts, existing outside of the conventional hierarchies of power. And they certainly do have plenty to say about these big universal themes of freedom, loyalty and justice.
But this is also a genre that has been banned by multiple governments within living memory. Its development continues to happen in the shadows of fickle Chinese censorship and at the heart of it remains a certain defiant cultural and national pride intermingled with nostalgia and diasporic yearning. The vast majority of the most iconic wuxia texts are not written by Chinese authors living comfortably in China, but by a dreaming diaspora amid or in the aftermath of vast political turmoil.
Which is all to say that the world of wuxia is fundamentally bound up with those hierarchies of power it seeks to reject. Much like there is more to superheroes than dorky names, love triangles, and broad universal ideals of justice, wuxia is grounded in the specific time and place of its creation.
Biography of Old Dragon-beard (虯髯客傳) by Du Guangting (杜光庭, 850-933) is commonly cited as the first wuxia novel. It chronicles the adventures of the titular Old Dragon-beard who, along with the lovers Hongfu 紅拂 and Li Jing 李靖, makes up the Three Heroes of the Wind and Dust. But the story isn’t just supernatural adventures; the three also help Li Shimin 李世民 found the Tang Dynasty (618–906). The martial prowess and the seemingly eccentric titles of the characters aside, the act of dynastic creation is unavoidably political. 虯髯客傳 pivots around Hongfu’s ability to discern the true worth of a man, which leads her to abandon her prior loyalties and cleave her love to Li Jing and his vision for a better empire. Not to mention that Du wrote this and many of his other works whilst in exile with the Tang imperial court in the south, after rebels sacked the capital and burnt his books. Knowing this, it is difficult not to see Du as mythologising the past into a parable of personal resonance: perhaps he too was making decisions about loyalties and legacies, about which court or emperor he should stay with, asking himself if the Tang would indeed rise again (as he himself, as a taoist, had prophesied).
Other commonly cited antecedents to the modern wuxia genre are the 14th Century classics like Romance of the Three Kingdoms (三國演義) and Outlaws of the Marsh (水滸傳), the former of which is all about the founding of dynasties and gives Chinese the now ubiquitously cited line: The empire, long divided, must unite; long united, must divide. Thus it has ever been (话说天下大势．分久必合，合久必分).
Revolutionaries, Rebels and Race in the Qing Dynasty
No era of imperial China was in possession of a “free press”, but the literary inquisitions under the Qing Dynasty (1644–1911) were particularly bloody and thorough. The Manchu elite suppressed any openly revolutionary sentiment in fiction, however metaphorical, and what was written instead was a literature that sublimated much of that discontent into historical fiction nostalgic for the eras of Han dominance. Wandering heroes of the past were refashioned into a pariah elite, marginalised from mainstream society yet superior to it with their taoist-cultivated powers.
Whilst earlier quasi-historical epics and supernatural tales are replete with gods and ghosts, late Qing wuxia begins to shed these entities and instead grounds itself in a world where taoist self-cultivation grants immense personal powers but not divinity itself. In each of the successive reprintings of Three Heroes and Five Gallants (三俠五義), editors pruned the text of anachronisms and supernatural flourishes.
The parallel world of secret societies, foreign cults, bickering merchants and righteous martial clans came to be known as jianghu, literally “rivers and lakes”. As a metaphor, it was first coined by the taoist philosopher Zhuangzi 莊子 to describe a utopian space outside of cutthroat court politics, career ambitions and even human attachments. This inspired subsequent generations of literati in their pursuits of aesthetic hermitism, but the jianghu we know today also comes from the waterways that formed the key trade routes during the Ming Dynasty (1368–1644). To the growing mercantile classes, jianghu referred to the actual rivers and canals traversed by barges heavy with goods and tribute, a byname for the prosperous Yangtze delta.
These potent lineages of thought intermingle in what jianghu means within martial arts fiction today: that quasi-historical dreamtime of adventure. But there is also another edge to it. In Stateless Subjects: Chinese Martial Arts Literature and Postcolonial History, Petrus Liu translates jianghu as “stateless”, which further emphasizes the hero’s rejection of, and by, the machineries of government. Jianghu is thus a world that rejects the dictates of the state in favor of divine virtue and reason, but also of a sense of self created through clan and community.
The name of the genre, wuxia (“武俠”), comes from Japanese, where a genre of martially focused, bushido-inspired fiction called bukyō (“武侠”) was flourishing. It was brought into Chinese by Liang Qichao 梁启超, a pamphleteer writing in political exile in Japan, seeking to reawaken what he saw as Han China’s slumbering and forgotten martial spirit. In his political work, he holds up the industrialisation and militarisation of Meiji Japan (and its subsequent victory against Russia) as inspiration, and seeks a similar restoration of racial and cultural pride for the Han people to be the “master of the Continent” above the hundreds of different races who have settled in Asia.
Wuxia is fundamentally rooted in these fantasies of racial and cultural pride. Liang Qichao’s visions of Han exceptionalism were a response to subjugation under Manchu rule and Western colonialism, a martial rebuttal to the racist rhetoric of China being the “Sick Man of Asia”. But it is still undeniably ethno-nationalism built around the descendants of the Yellow Emperor conquering again the continent that is their birthright. Just as modern western fantasy has as its bones the nostalgia for a pastoral, premodern Europe, wuxia can be seen as a dramatisation of Sinocentric hegemony, where taoist cultivation grants power and stalwart heroes fight against an ever-barbaric, ever-invading Other.
Dreams of the Diaspora
Jin Yong 金庸 remains synonymous with the genre of wuxia in Chinese and his foundational mark on it cannot be overstated. His Condor Trilogy (射鵰三部曲) was serialised between 1957-63 and concerns three generations of heroes during the turbulent 12th-13th centuries. The first concerns a pair of sworn brothers, one loyal and righteous, the other clever and treacherous. Their friendship deteriorates as the latter falls into villainy, scheming with the Jin Empire (1115–1234) to conquer his native land. The second in the trilogy follows their respective children repeating and atoning for the mistakes of their parents whilst the Mongols conquer the south. The last charts the internal rivalries within the martial artists fighting over two peerless weapons whilst its hero leads his secret society to overthrow the Yuan Dynasty (1271–1368).
It’s around here that English articles about him start comparing him to Tolkien, and it’s not wholly unjustified, given how both created immensely popular and influential legendaria that draw heavily upon ancient literary forms. Entire genres of work have sprung up around them and even subversions of their work have become themselves iconic. Jin Yong laid down what would become the modern conventions of the genre, from the way fights are imagined with discrete moves, to the secret martial arts manuals and trap-filled tombs.
Unlike Tolkien, however, Jin Yong’s work is still regularly (even aggressively) adapted. There are in existence nine tv adaptations of each instalment of the Condor Trilogy, for example, as well as a video game and a mobile game. And at the time of writing, eight feature films and nine tv series based on his work are in production.
But Jin Yong’s work was not always so beloved by mainland Chinese audiences. For a long time he, along with the rest of wuxia, was banned, and the epicentre of the genre was in colonial Hong Kong. It is a detail often overlooked in the grand history of wuxia, so thoroughly has the genre been folded into contemporary Chinese identity. It is hard at times to remember how much of the genre was created by these artists in exile. Or perhaps that is the point, as Hong Kong’s own unique political and cultural identity is being subsumed into that of the People’s Republic, so too is its literary legacy. Literalist readings of his work as being primarily about historical martial artists defang the political metaphors and pointed allegories.
Jin Yong’s work is deeply political. Even in the most superficial sense, his heroes intersect with the politics of their time, joining revolutionary secret societies, negotiating treaties with Russia and fighting against barbarian invaders. They are bound up in the temporal world of hierarchy and power. Legend of the Condor Hero (射鵰英雄傳)’s Guo Jing 郭靖 becomes the sworn brother to Genghis Khan’s son, Tolui, and joins the Mongol campaign against the Khwarezmid Empire. Book and Sword (書劍恩仇錄)’s Chen Jialuo 陳家洛 is secretly the Qianlong Emperor’s half brother. The Deer and the Cauldron (鹿鼎記)’s Wei Xiaobao 韋小寶 is both best friends with the Kangxi Emperor and also heavily involved in a secret society dedicated to overthrowing the aforementioned emperor. Even Return of the Condor Hero (神鵰俠侶)‘s Yang Guo 楊過 ends up fighting to defend the remains of the Song Empire against the Mongols.
But it goes deeper than that. Jin Yong was a vocal critic of the Cultural Revolution, penning polemics against Mao Zedong and the Gang of Four during the late 60s. Beyond the immediate newspaper coverage, Jin Yong edited and published many more works both documenting and dissecting the Cultural Revolution.
Jin Yong described himself as writing every day one novel instalment and one editorial against the Gang of Four. Thus did they bleed together, the villains of Laughing in the Wind (笑傲江湖) becoming recognisable caricatures as it too rejected senseless personality cults.
In this light, his novels seem almost an encyclopaedia of traditional Chinese culture, its values and virtues, a record of it to stand bulwark against the many forces that would consign it all to oblivion. It is a resounding rebuttal to principles of the May Fourth Movement, that modernisation and westernisation are equivalents. To Jin Yong the old and the traditional were valuable, and it is from this we must build our new literature.
Taken together, Jin Yong’s corpus offers an alternate history of the Han people spanning over two thousand years from the Eastern Zhou (771–256 B.C.) to the Qing Dynasty (1644–1911). He fills in the intriguing gaps left in official records with folk heroes, court gossip and conspiracy theories. His text is dense with literary allusions and quotations from old Chinese poems.
His stories are almost all set during times of turmoil when what can be termed “China”, or at least the Han people, are threatened by barbarian invasion and internal corruption; pivotal moments in history that make heroes and patriots out of ordinary men and women. All this Jin Yong immortalises with a deep yearning for a place and past that never quite was; nostalgia in the oldest sense of the word, with all the pain and pining and illusion that it implies.
It is arguably this very yearning, this conjuring of a real and relevant past from dry history books, that makes Jin Yong’s work so endlessly appealing to the Chinese diaspora, as well as the mainland Chinese emerging from the Cultural Revolution. This alternate history dramatises the complexities of Han identity, all the times it has been threatened, disrupted and diluted in history, but at the same time it gave hope and heroics. These were stories as simple or as complex as the reader wanted them to be.
Chinese Imperialism and Han Hegemony
It is sometimes hard to remember that Jin Yong and the rest of wuxia were once banned in the People’s Republic of China, so thoroughly has his work now been embraced. As late as the 1990s, Jin Yong was decried as one of the “Four Great Vulgarities of Our Time” (alongside the four heavenly kings of cantopop, Jackie Chan and sappy Qiong Yao romances).
In recent decades, the CCP has rather dramatically changed its relationship with the past. The censorship machine is still very active, but it does not have in its crosshairs the decadent and feudal genre of wuxia (though there have been exceptions, especially during the run up to the Republic’s 70th anniversary when all frivolous dramas were put on pause; it is important to remember that the censors are not always singular or consistent in their opinions). But more importantly, the Party no longer draws power from a radical rejection of the past; instead the past is embraced utterly, celebrated at every turn. Traditionalism now forms a core pillar of the Party’s legitimacy, with all five thousand years of that history validating its rule. The State now actively promotes all those superstitions and feudal philosophies it once held in contempt.
Along with this shifting use of history to inspire nationalism, Jin Yong has been rehabilitated and canonised. It is arguably that revolutionary traditionalism (that he was preserving history in a time of its destruction) that makes him so easy to rehabilitate. Jin Yong’s work appeals both to the conservative mind, with its love of tradition and patriotic themes, and to rebels, in its love of outlaw heroes.
It isn’t that these stories have nothing to say about freedom or justice in a more abstract or universal sense, but that they are also very much about the specifics of Han identity and nationalism. Jin Yong’s heroes often find themselves called to patriotism; even as they navigate complex or divided loyalties, they must defend “China”, in whatever form it exists at the time, against barbaric, alien invaders. Even as they function as straightforward stories of nationalistic defence, they also dramatise disruptions of a simplistic or pure Chinese identity, foregrounding characters from marginalised (if also often exoticised) ethnicities and religions.
Jin Yong’s hero Guo Jing is Han by birth and Mongol by adoption. He ultimately renounces his loyalty to Genghis Khan and returns to his Han homeland to defend it from Mongol conquest. Whilst one can read Jin Yong’s sympathy and admiration for the Mongols as an attempt to construct an inclusive nationalism for modern China, Guo Jing’s participation as a Han hero in the conquest of Central Asia also functions as a justification of modern Han China’s political claim on that imperial and colonial legacy.
Book and Sword presents this even more starkly, as it feeds the popular Han fantasy that the Kangxi Emperor was not ethnically Manchu but a Han changeling. He is forced by the hero of the novel, Chen Jialuo, to swear an oath to acknowledge his Han identity and overthrow the Manchus, but of course, he then betrays them and subjugates not only the Han but also the “Land of Wei” (now known as Xinjiang, where the genocide is happening). Still, there is something to be said about how this secret-parentage plot attributes the martial victories of the Qing to Han superiority and justifies the Han inheritance of former Qing colonies.
The Uyghur tribes are portrayed with sympathy in Book and Sword. They are noble and defiant and devout. Instead of savages who need to be brought to heel, they are fellow resistance fighters. It alludes to an inclusive national identity, one in which Han and Uyghur are united by their shared suffering under Manchu rule. It can also be argued that their prominence disrupts the ideal of a pure Han-centric Chineseness. But what good is inclusion and unity to those who do not want to be part of that nation? Uyghurs, being a people suffering occupation, actively reject the label of “Chinese Muslims”.
Furthermore, the character of Kasili in Book and Sword, based on the legend of the Fragrant Concubine, is drenched in orientalist stereotype. Chen first stumbles upon her bathing naked in a river, her erotic and romantic availability uncomfortably paralleling that of her homeland. When the land of Wei falls to the emperor’s sword and Kasili is taken as a concubine, she remains loyal to the Han hero she fell in love with, ultimately killing herself to warn Chen of the emperor’s duplicity. Conquest and imperial legacy is thus dramatised as a love triangle between a Uyghur princess, a Han rebel and a Manchu emperor.
Chen, it should be noted, falls in love and marries a different Uyghur princess for his happy ending.
Amid other far more brutal policies meant to forcibly assimilate and eradicate Uyghur identity, the PRC government encouraged Han men to take Uyghur women as wives. Deeply unpleasant adverts still available online extolled the beauty and availability of Uyghur women, as something and somewhere to be conquered. It is impossible not to be reminded of this when reading about the beautiful and besotted Kasili.
There is no small amount of political allegory to be read between the lines of Jin Yong, something he became increasingly frank about towards the end of his life. The Condor Trilogy, with its successive waves of northern invaders, can be seen as echoing the Communist takeover of China. The success of Wei Xiaobao’s affable cunning can be read as a satire on the hollowness of materialistic 70s modernity. But Jin Yong himself proved to be far less radical than his books when he sided with the conservative anti-democracy factions within Hong Kong during the Handover.
In a 1994 interview, Jin Yong argued against the idea that China was ever under “foreign rule”, proposing instead that the many ethnic groups within China simply take turns at being in the ascendance. All wars are thus civil wars, and he neatly aligned his novels with the current Chinese policies that oppress in the name of unity, harmony and assimilation, of “inclusive” nationalism.
The legacy of Jin Yong is a complex one. His work, like all art, contains multitudes and can sustain any number of seemingly contradictory interpretations. It is what is beautiful about art. But I cannot but feel that his rapid canonisation over the last decades in mainland China is a stark demonstration of how easily those yearning dreams of the diaspora can become nationalistic fodder.
I did not come to bury wuxia, but to praise it. I wanted to show you a little bit of its complexities and history, as well as the ideals and ideologies that simmer under its surface.
For me, I just think it is too easy to see wuxia as a form of salvation. Something to sustain and inspire me in a media landscape hostile to people who look like me. To give me the piece of me that I have felt missing, to heal a deep cultural wound. After all, Hollywood or broader Anglophone media might be reluctant to make stories with Asian protagonists, but I can turn to literally all of wuxia. American TV series won’t make me a fifty episode epic about two pretty men eyefucking each other that also has a happy ending, but I will always have The Untamed.
It’s this insidious feeling of hope. That this genre is somehow wholly “unproblematic” because I am reconnecting with my cultural roots, that it can nourish me. That it can be safe that way. It is, after all, untouched by all the problematic elements of the Anglophone mainstream that I have analysed to death and back. That it is some sort of oasis, untouched by colonialism and western imperialism. That it therefore won’t or can’t have that taint of white supremacy; it’s not even made by white people.
Perhaps it is just naive of me to have ever thought these things, however subconsciously. Articulating it now, it’s ridiculous. Han supremacy is a poisonous ideology that is destroying culture, hollowing out communities and actively killing people. In the face of its all-consuming genocide-perpetuating ubiquity, the least I can do is recognise its presence in a silly little genre I love. It just doesn’t seem too much to ask.
Jeannette Ng is originally from Hong Kong but now lives in Durham, UK. Her MA in Medieval and Renaissance Studies fed into an interest in medieval and missionary theology, which in turn spawned her love for writing gothic fantasy with a theological twist. She runs live roleplay games and is active within the costuming community, running a popular blog. Jeannette has been a finalist for the John W. Campbell Award for Best New Writer and the Sydney J Bounds Award (Best Newcomer) in the British Fantasy Awards 2018.
A weakness in the algorithm used to encrypt cellphone data in the 1990s and 2000s allowed hackers to spy on some internet traffic, according to a new research paper. Motherboard: The paper has sent shockwaves through the encryption community because of what it implies: The researchers believe that the mathematical probability of the weakness being introduced by accident is extremely low. Thus, they speculate that a weakness was intentionally put into the algorithm. After the paper was published, the group that designed the algorithm confirmed this was the case. Researchers from several universities in Europe found that the encryption algorithm GEA-1, which was used in cellphones when the industry adopted GPRS standards in 2G networks, was intentionally designed to include a weakness that at least one cryptography expert sees as a backdoor. The researchers said they obtained two encryption algorithms, GEA-1 and GEA-2, which are proprietary and thus not public, "from a source." They then analyzed them and realized they were vulnerable to attacks that allowed for decryption of all traffic. When trying to reverse-engineer the algorithm, the researchers wrote that (to simplify), they tried to design a similar encryption algorithm using a random number generator often used in cryptography and never came close to creating an encryption scheme as weak as the one actually used: "In a million tries we never even got close to such a weak instance," they wrote. "This implies that the weakness in GEA-1 is unlikely to occur by chance, indicating that the security level of 40 bits is due to export regulations." Researchers dubbed the attack "divide-and-conquer," and said it was "rather straightforward." In short, the attack allows someone who can intercept cellphone data traffic to recover the key used to encrypt the data and then decrypt all traffic. The weakness in GEA-1, the oldest algorithm developed in 1998, is that it provides only 40-bit security.
That's what allows an attacker to get the key and decrypt all traffic, according to the researchers.
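To put that 40-bit figure in perspective, here is a back-of-the-envelope sketch. The 10-million-keys-per-second rate is my illustrative assumption, not a number from the paper, and this models naive exhaustive search rather than the researchers' much faster divide-and-conquer attack:

```python
# Why 40-bit effective security is trivially brute-forceable:
# exhausting the whole keyspace is feasible even on a single core.
keyspace = 2 ** 40        # 1,099,511,627,776 candidate keys
rate = 10_000_000         # keys tested per second (illustrative assumption)
hours = keyspace / rate / 3600
print(f"{keyspace:,} keys, roughly {hours:.0f} hours on one core")
```

Parallelised across even a small cluster, that drops to minutes, and the paper's divide-and-conquer shortcut does substantially better than naive brute force.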
Read more of this story at Slashdot.
For my work on Debian, I want to use my debian.org email address, while for my personal projects I want to use my gmail.com address.
One way to set the user.email git config value is to run git config --local in every repo, but that's tedious, error-prone and doesn't scale very well with many repositories (and the chance of forgetting to set the right one on a new repo is ~100%).
The solution is to use the git-config ability to include extra configuration files, based on the repo path, by using includeIf:
Content of ~/.gitconfig:

[user]
    name = Sandro Tosi
    email = <personal.address>@gmail.com

[includeIf "gitdir:~/deb/"]
    path = ~/.gitconfig-deb

Every time the git path is in ~/deb/ (which is where I have all Debian repos) the file ~/.gitconfig-deb will be included; its content:

[user]
    email = firstname.lastname@example.org

That results in my personal address being used on all repos not part of Debian, while my Debian email address is used for repos under ~/deb/. This approach can be extended to every other git configuration value.
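The setup can be verified end-to-end with throwaway repos. A minimal sketch using a scratch HOME and stand-in addresses (personal@example.com and work@example.org are placeholders, not the real values):

```shell
# Build a scratch HOME with the includeIf setup, then check which
# identity git resolves inside and outside ~/deb/.
tmp=$(mktemp -d)
export HOME="$tmp"
mkdir -p "$HOME/deb/demo" "$HOME/personal/demo"

cat > "$HOME/.gitconfig" <<'EOF'
[user]
	name = Sandro Tosi
	email = personal@example.com
[includeIf "gitdir:~/deb/"]
	path = ~/.gitconfig-deb
EOF

cat > "$HOME/.gitconfig-deb" <<'EOF'
[user]
	email = work@example.org
EOF

git -C "$HOME/deb/demo" init -q
git -C "$HOME/personal/demo" init -q

deb_email=$(git -C "$HOME/deb/demo" config user.email)            # work@example.org
personal_email=$(git -C "$HOME/personal/demo" config user.email)  # personal@example.com
echo "$deb_email / $personal_email"
```

Note that a gitdir: pattern ending in / is treated by git as matching everything beneath that directory, which is why the nested demo repo picks up the include.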
Studies from around the world suggest that success depends on class size, distancing, the age of the students, and how prevalent the virus is locally.
Been putting this together for a while... more to come.
In no particular order, though grouped by composer.
To be clear, I'm in no way saying these are unknown themes or unloved. In my limited experience, they just don't get the same acclaim as some more well-known scores, and I feel they deserve recognition! These are pieces of music uncannily suited to their films, working perfectly in the movie while also standing alone as wonderful compositions.
And while I haven't completely steered away from the John Williamses and Jerry Goldsmiths of the world, I have tried to include slightly more off-kilter selections that are truly fantastic.
Basil Poledouris
Klendathu Drop - Starship Troopers
Robocop Theme - Robocop
Riddle of Steel & Riders of Doom - Conan the Barbarian
Ennio Morricone
Love Theme - Cinema Paradiso
Complete Score - The Thing
Ecstasy of Gold - The Good, The Bad, and The Ugly
Bill Conti
Going The Distance & The Final Bell - Rocky
Main Theme - The Right Stuff
Jerry Goldsmith
Main Theme - Capricorn One
Main Theme - Gremlins II (and Gremlins... just a great performance of it)
Main Title - Planet of the Apes
The Enterprise - Star Trek: The Motion Picture
Erich Wolfgang Korngold
Main Title - Kings Row (also... the inspiration for Star Wars)
Main Title - Reunion - The Sea Hawk
John Williams
Main Theme - Seven Years in Tibet (one of his best)
Main Theme - Born on the Fourth of July
With Malice Towards None - Lincoln
Alan Silvestri
Main Theme - Predator
Main Theme - Contact (Maybe my fav on the list... I'm a sucker for sentimentality... Sue me)
Captain America March - Captain America: The First Avenger
Junkie Xl - Mad Max: Fury Road
Daft Punk - Tron Legacy
James Horner - Commando
Wow, man. Some of us take on more extreme projects during the Great Coronavirus Quarantine than others.
This ambitious fellow shows you how to build a Nintendo Switch, with a beautiful and wholesome purpose: “to Starve Online Price Gougers” who are jacking up the prices because demand is high for Nintendo Switch, and availability is nil.
Here's their introduction to the HOWTO gallery, which is amazing and stupendous.
After I played New Horizons and hyped it up to my friends, they decided they wanted a Switch. They called around to different retailers every day for a week with no luck finding anyone who had one in stock. No one knew when the next shipment would be. This led to searching online marketplaces like Craigslist, OfferUp, and eBay.
Unfortunately everyone knows the rest. Upwards of $450 to $600 in the Seattle area for a used Switch. Some with and without all the accessories. This enraged me to the point of telling them I could build one cheaper out of spare parts. So they hired me to do just that. If anyone is interested in doing the same here is my step by step buying guide along with assembly instructions and a pricing guide.
1. Game Cartridge Card Slot Socket Board w/Headphones Port - $15
2. NS Console Micro SD TF Memory Card Slot Port Socket Reader - $5
3. Nintendo Switch HAC-001 CPU Cooling Heatsink - $7
4. Game Cartridge Card Plastic Cover - $1
5. Console Speaker Replacement Parts For Nintendo Switch Built in speaker - $8
6. Wifi Antenna Connecting Cable (Short) $2
7. Wifi Antenna Connecting Cable (Long) $2
8. Internal Cooling Fan - $3
9. Power & Volume Button control flex cable (w/ buttons and rubber conductor) - $4
10. Side Slider Sliding Rail Flex Cable (Left) - $3
11. Side Slider Sliding Rail Flex Cable (Right) - $3
12. Replacement Top Housing Shell Case Face plate -$6
13. Nintendo Switch Console Replacement Battery (New) - $15
14. Replacement Bottom Housing Shell Transparent Case Face plate -$5
15. Touch Screen Digitizer Adhesive - $0.50
16. Touch Screen Digitizer - $9
17. LCD Display Screen Replacement - $12
18. Shield Plate - $2
19. Iron Middle Frame - $6
20. (Not Pictured Here) - 100% WORKING OEM NINTENDO SWITCH REPLACEMENT LOGIC BOARD MOTHERBOARD - $95
21. (Not Pictured Here) - Full Screw Replacement Set - $2
22. (Not Pictured Here) - (Removal of Copper Sticker on CPU)
Grand Total For Used Parts Build: = $199
Ebay Average Price Jan 2020: = (between $175 and $225)
Ebay Average Price April 2020: = (between $300 and $400)
I am sure I made mistakes in this post so feel free to correct me if I am wrong about anything.
And screw you if you are one of the bad guys making a buck off of a crisis.
Here you go...
It used to be that being a couch potato was almost universally deemed a negative—but it’s funny how it only takes a contagious epidemic to turn the normal state of things on its head. Fortunately, nobody with a computer need be without ways to occupy their time.
Publishers, studios, and other media agencies are providing free offerings to give people plenty to do to ride out the corona lockdowns—as well as tools to assist self-education or learning at home. Here are a few of them I’ve noticed.
Educational/children’s book publisher Scholastic is offering a free 20-day learn-at-home program for grades K-9 via its web site—very handy for those in areas whose schools have closed down.
Would your children like to learn more about whales? Seattle-based research institute Oceans Initiative has launched a free Virtual Marine Biology Camp to teach school-closed children more about aquatic life. They’re holding live sessions every Monday and Thursday at 11 a.m. Pacific (2 p.m. Eastern) to help give those out-of-school children something educational to do.
Audiobook publisher and Amazon subsidiary Audible.com is making hundreds of audiobook titles available for free for the duration of school closures, via stories.audible.com.
NPR, the Sarasota Herald-Tribune, and CNET, among others, have articles collecting a lot of other free entertainment and education sources that weren’t free before the Corona quarantines. (Indeed, all you need do is google “coronavirus free entertainment” to find all the others who had the same idea.) But there are also still plenty of things that were already free and still are.
Baen’s Free Library is, of course, still just as free as it ever was. If you’re a member of a compatible public library, Hoopla Digital will let you borrow a limited number of ebooks, audiobooks, albums, movies, or TV episodes per month for free. And you still have access to Project Gutenberg, Librivox for audiobooks, Archive.org for all sorts of content, and all the other public-domain sites out there.
If you’re looking for something interesting to watch, Open Culture has links to over 200 free documentary films online, on subjects as diverse as Hayao Miyazaki and M.C. Escher. The site also includes links to free ebooks, audiobooks, online courses, and textbooks.
If you’re into anime, most of Crunchyroll’s anime titles are available to watch for free (save for the very newest episode). Resolution may be limited, and you may have to put up with advertisements—but free is free, right? Pluto TV has over 250 channels of free video content, too, with mobile apps for iOS and Android available. And YouTube has its usual countless hundreds of thousands of hours of enjoyable ways to entertain or improve yourself, including its “Learning” category.
If you’re more into computer games, you could check out the Homecoming City of Heroes servers. Coming up on a full year since the game originally returned, it has thousands of players once again enjoying life in the early-2000s superhero MMO. (I play primarily on the Torchbearer shard, myself, and am always happy to help out new or returning players.)
There are many more free education or entertainment resources than I could even list, and there will doubtless be more the longer this lockdown goes on. How about adding your favorites in the comments?
Photo by Eric Antunes on Pexels.com
If you found this post worth reading and want to kick in a buck or two to the author, click here.
I have been late to adopt an on-premise cloud solution as the security of Owncloud a few years ago wasn't so stellar (cf. my comment from 2013 in Encryption files ... for synchronization across the Internet). But the follow-up product Nextcloud has matured quite nicely and we use it for collaboration both in the company and in FLOSS related work at multiple nonprofit organizations.
There is a very annoying "feature" in Nextcloud though: the designers think menu items for apps at the top need to be limited to eight or fewer to prevent information overload in the header. The whole issue discussion is worth reading as it is an archetypal example of design prevalence vs. user choice.
And of course designers think they are right. That's a feature of the trade.
And because they know better, there is no user-configurable option to extend those 8 items to maybe 12 or so, which would prevent the annoying overflow menu we are seeing with 10 applications in use.
Luckily code can be changed, and there are many comments floating around the Internet on how to change minAppsDesktop = 8. The catch is that this constant gets compressed at build time into one 15,000+ character line, shipped in minified form as core/js/dist/main.js, and you probably don't want to build the whole beast locally just to change one constant.
Well, we can still patch that, can't we?
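Since the change is a single constant in a single file, a stream edit of the minified bundle is enough. A minimal sketch, assuming the bundle contains the literal minAppsDesktop=8 (the exact minified spelling may differ between Nextcloud versions, so grep for it first); the demo below runs on a throwaway stand-in file rather than a live install:

```shell
# Demo on a throwaway file standing in for core/js/dist/main.js;
# on a real install you would edit that file in place, and re-apply
# the patch after every Nextcloud upgrade rewrites the bundle.
printf 'e.minAppsDesktop=8,e.appMenu=1' > main.js.demo
sed -i 's/minAppsDesktop=8/minAppsDesktop=12/g' main.js.demo
cat main.js.demo   # -> e.minAppsDesktop=12,e.appMenu=1
```

On the live file the equivalent would be sed -i 's/minAppsDesktop=8/minAppsDesktop=12/g' core/js/dist/main.js, followed by a hard refresh in the browser so the cached bundle is re-fetched.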