I write about technology at theluddite.org

  • 10 Posts
  • 306 Comments
Joined 1 year ago
Cake day: June 7th, 2023



  • I just want to emphasize that to set up a truly independent and unpaywalled piece of media, you probably need to abandon hope of it being even a viable side hustle. Quasi-independent media on, say, YouTube or Substack can make some money, but you’re then stuck on those corporate platforms. If you want to do your own website or podcast or whatever, that’s more independent, but you’re still dependent on Google if you run ads, or on Patreon if you do that sort of thing. The lesson of Twitter should make pretty clear the danger inherent to that ecosystem. Even podcasts that seem independent can easily get into huge trouble if, say, Musk were to buy Patreon or iHeart.

    I’ve been writing on my website for over two years now. My goal has always been to be completely independent of these kinds of platforms for the long term, no matter what, and the site’s popularity has frankly exceeded my wildest dreams. For example, I’m the #1 Google result for “anticapitalist tech”:

    [Screenshot of the Google search results]

    But I make no money. If I wanted this to be anything but a hobby, I’d have to sacrifice something that I think makes it valuable: I’d have to paywall something, or run ads, or have a paid discord server, or restrict the RSS feed. As things stand now, I don’t know my exact conversion rate because I don’t do any analytics and delete all web logs after a week, but I did keep the web logs from the most recent time that I went viral (top of hackernews and several big subreddits). I made something like 100 USD in tips, even though the web logs have millions of unique IPs. That’s a conversion rate of something like 0.00002 USD per unique visitor.
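
    For the curious, that figure is just tips divided by unique visitors. A minimal back-of-the-envelope sketch (the ~5 million visitor count below is an assumption implied by those two numbers, not an exact count from my logs):

    ```python
    # Rough conversion-rate arithmetic: tips earned per unique visitor.
    tips_usd = 100               # approximate tips from the viral spike
    unique_visitors = 5_000_000  # "millions of unique IPs" -- assumed ~5M here

    usd_per_visitor = tips_usd / unique_visitors
    print(f"{usd_per_visitor:.7f} USD per unique visitor")  # ~0.0000200
    ```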

    Honestly, if I got paid even $15/hr, I would probably switch to doing it at least as a part-time job, because I love it. Compare that to the right-wing ecosystem, where there’s fracking money and Thiel money just sloshing around, and it’s very, very obvious why Democrats are fucked, let alone an actual, meaningful left. Even Thiel himself was a right-wing weirdo before he was a tech investor, and a right-wing think tank funded his anti-DEI book. He then went on to fund Vance. It’s really hard to fight that propaganda machine part time.


  • Jesus, yeah, that’s a great point re: Musk/Twitter. I’m not sure it’s quite true yet as you wrote it, but I would definitely agree that it’s, at the very least, an excellent prediction. It might well already be functionally true as a matter of political economy, but it hasn’t been tested yet by a sufficiently big movement or financial crisis or whatever.

    +1 to everything that you said about organizing. It seems that we’re coming to the same realization that many 19th century socialists already had. There are no shortcuts to building power, and that includes going viral on Twitter.

    I’ve told this story on the fediverse before, but I have this memory from Occupy of a large news network interviewing my friend, an economist, then airing only a few seconds of that interview while running the entirety of an interview with a guy who was obviously unwell and probably homeless. Like you, it took me a while after Occupy to really unpack in my head what had happened in general, and I often think of that moment as an important microcosm. Not only was it grossly exploitative, but it is actually good that the Occupy camps welcomed and fed people like him. That is how our society ought to work. To have it used as a cudgel to delegitimize the entire camp was cynical beyond my comprehension at the time. To this day, I think about that moment to sorta calibrate the cynicism of the reaction, even to such a frankly ineffectual and disorganized threat as Occupy. A meaningful challenge to power had better be ready for one hell of a reaction.


  • Same, and thanks! We’re probably a similar age. My own political awakening was Occupy, and I got interested in theory as I participated in more and more protest movements that just sorta fizzled.

    I 100% agree re:Twitter. I am so tired of people pointing out that it has lost 80% of its value or whatever. Once you have a few billion, there’s nothing that more money can do to your material circumstances. Don’t get me wrong, Musk is a dumbass, but, in this specific case, I actually think that he came out on top. That says more about what you can do with infinite money than anything about his tactical genius, because it doesn’t exactly take the biggest brain to decide that you should buy something that seems important.





  • Totally agreed. I didn’t mean to say that it’s a failure if it doesn’t properly encapsulate all complexity, but that the inability to do so has implications for design. In this specific case (as in many cases), the error they’re making is not realizing that the root of the problem they’re trying to solve lies in that tension.

    The platform and environment are something you can shape even without an established or physical community.

    Again, couldn’t agree more! The platform is extremely powerful and can easily change user behavior in undesirable ways, which is actually the core thesis of that longer write-up I linked. That’s a big part of where ghosting comes from in the first place. My concern is that thinking you can just bolt a new thing onto the existing model repeats the original error.


  • This app fundamentally misunderstands the problem. Your friend sets you up on a date. Are you going to treat that person horribly? Of course not. Why? First and foremost, because you’re not a dick. Your date is a human being who, like you, is worthy and deserving of basic respect and decency. Second, because your mutual friendship holds you accountable. Relationships in communities have an overlapping structure; they mutually shape one another. Accountability is an emergent property of that structure, not something that can be implemented by an app. When you meet people via an app, you strip away both the humanity and the community, and with them goes individual and communal accountability.

    I’ve written about this tension before: As we use computers more and more to mediate human relationships, we’ll increasingly find that being human and doing human things is actually too complicated to be legible to computers, which need everything spelled out in mathematically precise detail. Human relationships, like dating, are particularly complicated, so to make them legible to computers, you necessarily lose some of the humanity.

    Companies that try to whack-a-mole patch the problems with that will find that their patches suffer from the same problem: their accountability structure is a flat, shallow version of genuine human accountability, and will itself produce pathological behavior. The problem is recursive.



  • Not directly to your question, but I dislike this NPR article very much.

    Mwandjalulu dreamed of becoming a carpenter or electrician as a child. And now he’s fulfilling that dream. But that also makes him an exception to the rule. While Gen Z — often described as people born between 1997 and 2012 — is on track to become the most educated generation, fewer young folks are opting for traditionally hands-on jobs in the skilled trade and technical industries.

    The entire article contains a buried classist assumption. Carpenters have just as much reason to study theater, literature, or philosophy as, say, project managers at tech companies (those three examples are from PMs I’ve worked with). Being educated and being a carpenter are only in tension because of decisions that we’ve made; having read Plato has as much in common with being a carpenter as it does with being a PM. Frankly, it would be fucking lit if our society had the most educated plumbers and carpenters in the world.

    NPR here is treating school as job training, which is, in my opinion, the root problem. Job training is definitely a part of school, but school and society writ large have a much deeper relationship: An educated public is necessary for a functioning democracy. One in five Americans is illiterate. If we want a functioning democracy, then we need to invest in everyone’s education for its own sake, rather than treat it as a distinguishing feature between the lower classes and the upper ones, and we need to treat blue-collar workers as people who also might wish to be intellectually fulfilled, rather than as a monolithic class of people with some innate desire to work with their hands and avoid book learning (though those kinds of people also need to be welcomed).

    Occupations such as auto technician with aging workforces have the U.S. Chamber of Commerce warning of a “massive” shortage of skilled workers in 2023.

    This is your regular reminder that the Chamber of Commerce is a private entity that represents capital. Everything they say should be taken with a grain of salt. There’s a massive shortage of skilled workers at the rates that businesses are willing to pay, which have been stagnant for decades as corporate profits have gone up. If you open literally any business and offer candidates enough money, you’ll have a line of applicants out the door.


  • Investment giant Goldman Sachs published a research paper

    Goldman Sachs researchers also say that

    It’s not a research paper; it’s a report. They’re not researchers; they’re analysts at a bank. This may seem like a nit-pick, but journalists need to (re-)learn to carefully distinguish between the thing that scientists do and corporate R&D, even though we sometimes use the word “research” for both. The AI hype in particular has been absolutely terrible for this. Companies have learned that putting out AI “research” that’s really just them poking at their own product, dressed up in a science-lookin’ paper, leads to an avalanche of free press from lazy, credulous morons gorging themselves on the hype. I’ve written about this problem a lot, for example in this post, which is about how Google wrote a so-called paper comparing their LLM to doctors, only for the press to uncritically repeat (and embellish) the results all over the internet. Had anyone in the press actually fucking bothered to read the paper critically, they would’ve noticed that it’s junk science.


  • Sounds very doable! My friend has an old claw-foot tub that he lights a fire under. If you want something a little less country, you can buy on-demand electric or propane water heaters and hook your hose up, though I’d expect the electric one wouldn’t be able to keep up at 120 V (rough numbers below). The hardest part of this project is probably moving the tub. I say go for it!
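
    To put rough numbers on the 120 V concern: a standard 120 V, 15 A circuit tops out around 1.8 kW, while heating hose water to bath temperature at even a modest flow takes several times that. A quick sketch of the heat math; the flow rate and temperature rise are my own illustrative assumptions:

    ```python
    # Why a plug-in 120 V tankless heater struggles to keep up.
    # Flow rate and temperature rise below are illustrative assumptions.
    SPECIFIC_HEAT_WATER = 4186       # J/(kg*K)

    flow_lpm = 4.0                   # assumed hose flow, liters per minute
    delta_t_c = 25.0                 # assumed rise, e.g. 15 C inlet -> 40 C bath

    flow_kg_per_s = flow_lpm / 60    # water is ~1 kg per liter
    power_needed_w = flow_kg_per_s * SPECIFIC_HEAT_WATER * delta_t_c
    power_120v_w = 120 * 15          # nominal max on a 15 A circuit

    print(f"needed: {power_needed_w / 1000:.1f} kW, available: {power_120v_w / 1000:.1f} kW")
    # -> needed: 7.0 kW, available: 1.8 kW
    ```

    Propane on-demand units are typically rated in the tens of kilowatts, which is why they can keep up where the plug-in electric can’t.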






  • I know that this kind of actually critical perspective isn’t the point of this article, but software always reflects the ideology of the power structure in which it was built. I covered something very similar in my most recent post, where I applied Philip Agre’s analysis of the so-called Internet Revolution to the AI hype, but you can find many similar analyses all over the STS literature, or throughout Agre’s work alone, which really ought to be required reading for anyone in software.

    edit to add some recommendations: If you think of yourself as a tech person, and don’t necessarily get or enjoy the humanities (for lack of a better word), I recommend starting here, where Agre discusses his own “critical awakening.”

    As an AI practitioner already well immersed in the literature, I had incorporated the field’s taste for technical formalization so thoroughly into my own cognitive style that I literally could not read the literatures of nontechnical fields at anything beyond a popular level. The problem was not exactly that I could not understand the vocabulary, but that I insisted on trying to read everything as a narration of the workings of a mechanism. By that time much philosophy and psychology had adopted intellectual styles similar to that of AI, and so it was possible to read much that was congenial – except that it reproduced the same technical schemata as the AI literature. I believe that this problem was not simply my own – that it is characteristic of AI in general (and, no doubt, other technical fields as well).


  • I’ve now read several of these from wheresyoured.at, and I find them well researched, well written, and very dramatic (if a little ranty), but they ultimately stop short of any structural or theoretical insight. It’s right and good to document the shady people inside these shady companies ruining things, but they are symptoms. They are people exploiting structural problems, not the root cause of our problems. The site’s perspective feels like that of someone who had a good career in tech that started before, say, 2014, and is angry at the people who are taking it too far, killing the party for everyone. I’m not saying that there’s anything inherently wrong with that perspective, but it’s certainly a very specific one, and one that I don’t particularly care for.

    Even “the rot economy,” which seems to be their big theoretical underpinning, has this problem. It puts at its center the agency of bad actors in venture capital who became overly obsessed with growth. I agree with the discussion of the fallout from that, but it’s just lacking a theory beyond “there are some shitty people being shitty.”