• 0 Posts
  • 48 Comments
Joined 1 year ago
Cake day: March 8th, 2024

  • MudMan@fedia.io to Games@lemmy.world · Doom (2016) now DRM free on GOG · 14 hours ago

    Sure! I mean, why not? Hell, release the game DRM free in the first place on all platforms, huh? Why did we have to wait a decade and buy it twice before we could get a DRM-free version of any part of it, after all?

    But you weren’t complaining about it yesterday and you’re way closer to the right outcome today. I would much rather have a DRM free version of some part of that game than not.


  • Wait, does it? Oh, man, it does! I actively remember the praise, so where did this Mandela effect come from? I didn’t even think to look it up, I was so certain.

    In any case, here’s to being actively wrong and still having made your point. Eternal is the lesser game in general, and I have played it much less, but it’s still telling I straight up forgot and invented an alternate scenario about it.



  • MudMan@fedia.io to Games@lemmy.world · Doom (2016) now DRM free on GOG · 22 hours ago

    Nobody did. It was part of that weird wave of interesting multiplayer setups that just didn’t have the competitive cleanness of the established stuff, and nobody ended up caring about them.

    It was mildly interesting to try out once, but let’s say there’s a reason they didn’t do an MP mode in the sequel and every reviewer praised that choice.


  • Hah. Yeah, I’ll do that as soon as you invent a way to freeze time.

    For what it’s worth, I’m pretty sure it’s less energy efficient to run a local open source LLM than to offload the task to a data center, but the flexibility and privacy are too big of a deal to ignore.

    In any case, chatbots suck at finding accurate information reliably, but they are actually pretty good at surfacing things you already know or can verify at a glance, with surprisingly little information. The fact that a piece of tech is often being misused doesn’t mean it’s useless. This simplistic black-and-white stuff is so dumb, and social media is so full of it. Speaking of often-misused technology, I suppose.


  • Not to be that guy, but if there’s a fictional character that made a career out of prompt engineering a surprisingly flaky AI, it’s Geordi La Forge. The guy has never turned down a “Computer!” interaction in his life.

    He literally fed his notes to a chatbot to make a custom assistant and then dated the custom assistant.



  • I don’t get it. There is a test that takes ten seconds of blowing into a tube. Why is “the right to refuse the breathalyzer” a thing? What’s the point if you’re still going to get tested in a less accurate way that takes longer? What right or freedom is being preserved there, other than the right to waste everybody’s time and risk a worse outcome? Why does it matter if it’s “on you”? There are other people involved, from the cop performing the test, to whoever else needs to get stopped or tested after you, to the public interest of not having drunk drivers zooming around. Why is it being “on you” relevant?

    It’s mostly trivial, but man, it is such a microcosm of weird-ass American/anarchocapitalist thinking about public/private interactions.


  • The mystique Americans have built around checking whether someone is drunk is so weird to me.

    Over here you take a breath test. It’s not optional. You breathe into the tube and either carry on or get fined and sleep it off before moving on.

    I understand that there is some weird hangup about compulsory checks in the US for some reason, as part of the weirdo libertarian nonsense they huff over there, but I’ve never understood the logic of how spending fifteen minutes having a cop decide whether they want to shoot you is the better alternative.


  • Honestly that’s because speeding up localizations by having the first pass be machine-made is not something that waited for GenAI to happen. It’s been going on for a while using good old machine translators.

    Now, Google Translate and similar tools have been reliant on machine learning for ages, people just weren’t freaking out about it because “AI” hadn’t gone viral. It’s been weird to watch this sort of thing play out.

    FWIW, if they are using the same loc workflow and GenAI works better than good old machine translation for a first pass, go ahead and use GenAI. From what I’ve seen casually, it’s not necessarily faster or more reliable, but I’m not working on loc professionally. Maybe that’s what he means when he talks about using it in “backend processes”?