In other words, as soon as a creator finds a way to take back control from intermediaries that have routinely derived excessive profits from the labor of others, the copyright world fights back with new legal straitjackets to stop other artists daring to do the same. That’s yet another reason for creators to retain full control of their works, and to shun traditional intermediaries that try to impose one-sided and unfair contracts.
So, the scenario has an awful lot of similarities to the Hunter Biden laptop story, right? Almost eerily so. But this time, Elon Musk is in charge, right? And so, obviously, he left this up, right? And he let people share it, right? Free speech absolutism, right? Right? Elon?
Hahaha, of course not.
Some weeks ago, I quietly shipped a new content type on A Working Library, such that I am now writing short, social-shaped posts on my site and then sending them off to the various platforms. This is not a novel mode of publishing, but rather one borrowed and adapted from the POSSE model (“publish on your site, syndicate elsewhere”) developed by the IndieWeb community. While one of the reasons oft declared for using POSSE is the ability to own your content, I’m less interested in ownership than I am in context. Writing on my own site has very different affordances: I’m not typing into a little box, but writing in a text file. I’m not surrounded by other people’s thinking, but located within my own body of work. As I played with setting this up, I could immediately feel how that would change the kinds of things I would say, and it felt good. Really good. Like putting on a favorite t-shirt, or coming home to my solid, quiet house after a long time away.
Ever since Platformer left Substack in January, readers have been asking us how it’s been going. Today, in keeping with our annual tradition of anniversary posts (here are one, two, and three), I’ll answer that question — and share some other observations on the state of independent media over the past year.
Isabel Linzer, Ariana Aboulafia, and Tim Harper at the Center for Democracy and Technology.
61% of responses had at least one type of insufficiency. Over one third of answers included incorrect information, making it the most common problem we observed. Incorrect information ranged from relatively minor issues (such as broken web links to outside resources) to egregious misinformation (including incorrect voter registration deadlines and falsely stating that election officials are required to provide curbside voting). Every model hallucinated at least once. Each one provided inaccurate information that was entirely constructed by the model, such as describing a law, a voting machine, and a disability rights organization that do not exist.