• 0 Posts
  • 127 Comments
Joined 1 year ago
Cake day: June 11th, 2023





  • Let me tell you something, folks. Scott Walker, he’s like a giant, stinking pile of shit. It’s unbelievable. Nobody’s seen anything like it before. You walk past it, and you know it’s bad right away. Everyone says it. People are talking about it. It’s huge, just sitting there, doing nothing, and it stinks—worse than anyone thought. And guess what? He thinks it’s good! Can you believe it? He’s out there, pretending like everything’s fine, while people can’t even stand to be around him. Total disaster, folks. Total mess. We’re going to clean it up, because that’s what we do. We clean up the mess left by people like Scott Walker, the human pile of shit. Believe me.



  • They’re supposed to be good at transformation tasks: language translation, creating x in the style of y, replicating a pattern, etc. LLMs are outstandingly good at language transformation tasks.

    Using an LLM as a fact-generating chatbot is actually a misuse. But they were trained on such a large dataset and have such a large number of parameters (175 billion!?) that they perform passably in that role… which is, at its core, to fill in a call+response pattern in a conversation.

    At a fundamental level it will never ever generate factually correct answers 100% of the time. That it generates correct answers > 50% of the time is actually quite a marvel.
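
    A toy way to see why 100% factual accuracy is out of reach (my own sketch, not from the comment above): if each generated token is correct with some probability p, the chance an entire n-token answer is correct decays exponentially, assuming independence between tokens, which is a simplification.

    ```python
    # Sketch: per-token accuracy p compounds over an n-token answer.
    # Independence between token errors is an assumption here, not a
    # property of real LLMs, but it illustrates the decay.
    def full_answer_accuracy(p: float, n: int) -> float:
        """Probability an n-token answer is entirely correct."""
        return p ** n

    # Even 99% per-token accuracy leaves a long answer mostly wrong:
    print(full_answer_accuracy(0.99, 100))  # roughly 0.366
    ```

    So "correct more than half the time" on open-ended factual questions really is notable, given how fast per-token errors compound.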









  • Max moved under braking multiple times, passed off track, ran into Norris not once but twice in the same corner, left the track maintaining his position, pushed Norris off track on the straight after the collision, then weaved and blocked Norris on the racing line on the following corner after it was clear he had a puncture.

    Norris was overly optimistic going into a corner once and gave the position back.

    But yeah, let’s go ahead and treat both of their behaviors as the same… /sarcasm