• 0 Posts
• 191 Comments
Joined 17 days ago
Cake day: May 3, 2025

• Yup, or any hex editor that could target memory addresses (some of them were limited to a certain file or whatever). But yeah, I used to do similar when I was a kid: I would go into my game files (all DOS games back then, of course) and change the text strings you could find in there with a hex editor. I’d just change goofy stuff like ‘Copyright’ to ‘Copyleft’, ‘The bandit strikes the princess!’ to ‘The dude slaps a ho’, etc. It was endlessly amusing when I was that age. :)
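The same trick is easy to sketch in modern terms. Here’s a hypothetical Python version of the string swap (the file name and strings are just examples; the key detail is that the replacement must not be longer than the original, or the file’s offsets shift and the executable breaks):

```python
from pathlib import Path

def patch_string(path: str, old: bytes, new: bytes) -> bool:
    """Replace `old` with `new` (padded to equal length) in a binary file."""
    if len(new) > len(old):
        raise ValueError("replacement must not be longer than the original")
    data = Path(path).read_bytes()
    offset = data.find(old)
    if offset == -1:
        return False  # string not present in this file
    padded = new.ljust(len(old), b" ")  # pad with spaces to preserve offsets
    Path(path).write_bytes(data[:offset] + padded + data[offset + len(old):])
    return True

# Example (on a copy of the game binary, not the original!):
# patch_string("GAME.EXE", b"Copyright", b"Copyleft")
```

Padding to the original length is the part everyone learned the hard way: shrink or grow a string inside an executable and everything after it lands at the wrong address.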


  • I see a fair amount of Christian-related posts in your post history so I’m gonna go ahead and suggest that this is probably a conversation you don’t want to have. I’m trying not to be an asshole here, but I am very well read on the subject of Christianity, so suffice to say that contradictions exist, they are widely known, and I find Christian apologia on the subject wholly unconvincing.

    That said, if I’m really the person you would like to go on this journey of discovery about your religion with then I will take you, but I can’t say that you are very likely to enjoy the results.



  • Yup, same. I would get this sense of deja vu, except instead of feeling like I’d been somewhere before it was feeling like I had previously dreamed the events that were about to happen. And yeah, it was always minor stuff: a conversation, mom coming home angry about having dropped something expensive at work, the solution to some coding problem a friend was about to tell me, etc. I tried playing with it, and if I changed anything (‘Oh, I know what you’re about to say’, etc.) it would disrupt it and not happen, but otherwise it happened the way I dreamed it every time. Sadly it got more and more uncommon as I got older, and now it’s been probably 10-15 years since the last time I remember it happening.




  • And the people who don’t know that you should check LLMs for hallucinations/errors (despite the fact that the press has been screaming that for a year) are definitely self-hosting their own, right? I’ve done it, it’s not hard, but it’s certainly not trivial either, and most of these folks would just go ‘lol what’s a docker?’ and stop there. So we’re advocating guard-rails for people in a use-case they would never find themselves in.

    You’re saying this like they’re equal.

    Not as if they’re equal, but as if they’re both unreliable and should be checked against multiple sources, which is what I’ve been advocating for since the beginning of this conversation.

    The problem is consistency. A con man will always be a con man. With an LLM you have no way to know if it’s bullshitting this time or not.

    But you don’t know a con man is a con man until you’ve read his book and put some of his ideas into practice and discovered that they’re bullshit, same as with an LLM. See also: check against multiple sources.


  • No, this was via DEBUG, a command that’s been included in MS-DOS since like version 2.0 (before there even was a Windows, much less full-OS Windows like 95/NT; 3.0/3.1 were just fancy launchers that sat on top of DOS). It lets you view and alter the contents of memory at a particular address, etc. We also used it to wipe hard drives by forcibly writing 0s to every block on the drive.
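For anyone who never saw it, a DEBUG session looked roughly like this (a sketch from memory; the parenthetical notes are mine, not DEBUG syntax, and exact addresses and byte values are just examples; DEBUG also wouldn’t write back to .EXE files directly, so you’d rename to .BIN first):

```
C:\> DEBUG GAME.BIN
-d 0100            (dump the bytes starting at offset 0100)
-e 0100 44 75 64   (enter new byte values at offset 0100)
-f 0100 01FF 00    (fill the range 0100-01FF with zeros)
-w                 (write the changes back to the file)
-q                 (quit)
```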


  • Huh, I haven’t treated my ceramic skillets special at all, just rinse 'em out when I’m done and throw 'em in the dishwasher, or if I have to hand-wash I can just scrub them real quick since they’re not nasty with food gunk all over them. To the best of my knowledge they don’t require special treatment, I only suggest not letting them sit with food on them because that’ll make anything harder to clean up.


  • I’m also not expecting people to be able to understand complex technical troubleshooting or anything either.

    No, you’re just calling them stupid for not having spent the time to learn things that you, with your technical expertise and high comfort level with technical subjects, think ought to be pretty simple. I agree that everyone could benefit from increasing their computer literacy, but I also understand that people prioritize the things they care about, and that they’re not stupid for not caring to learn the stuff you think they ought to.


  • You should check your sources when you’re googling or using ChatGPT too (most models I’ve seen now cite sources you can check when they’re reporting factual stuff); that’s not unique to those things. Yeah, LLMs might be more likely to give bad info, but people are unreliable too: they’re biased and flawed, often have an agenda, and are frequently, confidently wrong. Guess who writes books? Mostly people. So until we’re ready to apply that standard to all sources of information, it seems unreasonable to arbitrarily hold LLMs to some higher standard just because they’re new.