  • Yeah, the goddamn wooden spoon. I remember being noisy in a crib and my mom storming into the room screaming and busting the spoon in half on the side of the crib. She’d already hit me with it so I knew exactly what it meant. I got spoons, open hand, and hairbrushes for most of my childhood. Hair pulling, pinching, and ear-twisting too if we were in a situation where she couldn’t just haul off and hit me.

    The funny thing is, she called me up about a decade ago and asked if I could remember anything about my childhood that was bad. And rather than list everything off, I told her about the time she broke the spoon on the crib. That’s when I found out that it hadn’t happened at all, and in fact, if it had happened, it was because the spoon was old and brittle, and if she’d done anything at all, it would have been a light tap on the side of the crib to get my attention, and now that she remembers it, yeah, that’s exactly what happened. It just fell apart in her hands. We didn’t talk for a few years because of that and other things.

    After my daughter was born, she sent us a package that included two beautiful olivewood spoons from Israel. I use the fuckers when I’m making pasta. She calls or texts every once in a while warning me about protecting my daughter from dark, evil things in the world. This usually happens when she sees a picture of my kid playing with a toy spider or a Halloween skull. And I just chuckle and agree that there are dark, evil things in the world and I’m doing my damnedest to protect her from them.

  • As someone whose employer is strongly pushing them to use AI assistants in coding: no. At best, it’s like being tied to a shitty intern that copies code off Stack Overflow and then blows me up on Slack when it magically doesn’t work. I still don’t understand why everyone is so excited about them. The only tasks they can handle competently are tasks I can easily do on my own (and with a lot less re-typing).

    Sure, they’ll grow over the years, but Altman et al are complaining that they’re running out of training data. And even with an unlimited body of training data for future models, we’ll still end up with something about as intelligent as a kid that’s been locked in a windowless room with books their whole life and can either parrot opinions they’ve read or make shit up and hope you believe it. I think we’ll get a series of incompetent products with an increasing ability to make wrong shit up on the fly until the C-suite moves on to the next shiny bullshit.

    That’s not to say we’re not capable of creating a generally-intelligent system on par with or exceeding human intelligence, but I really don’t think LLMs will allow for that.

    tl;dr: there’s a lot of woo in the tech community that the Linux community isn’t as on board with