404media.co

guyrocket, in gaming: Meet the Guy Preserving the New History of PC Games, One Linux Port at a Time
@guyrocket@kbin.social avatar

I appreciate that many older games are still available on Steam either "maintained" as in the article or "remastered". Someday soon I will buy Total Annihilation...again...on Steam this time.

But I do not understand why games are seen as disposable, temporary media. Sure, the latest titles are flashy, but there are plenty of fucking awesome older games that are still fun to play. And as physical media disappears, it becomes much more important for the gaming industry to stop pulling the ladder up behind itself. History matters. Old ≠ bad.

There should be an equivalent to the classic rock stations for video games. I greatly appreciate the efforts of the MAME, archive.org and Mr. Lee to keep the classics alive.

TwilightVulpine (edited), in gaming: Meet the Guy Preserving the New History of PC Games, One Linux Port at a Time

This is such important work, but large gaming companies now seem to want games to stop working so people will move to the next thing. That's one of the hidden business interests in tying everything to online services.

I do hope we can still manage to maintain compatibility using emulators, virtual machines and compatibility layers. Digital media is so trivial to copy and store that letting it be lost can only happen due to complete neglect.

pdqcp,

Don't forget your remaster/remake releases.

velox_vulnus, in gaming: Microsoft QA Contractors Say They Were Laid Off for Attempting to Unionize
@velox_vulnus@lemmy.ml avatar

On the bright side, Microsoft loses money trying to find new employees.

haui_lemmy, in gaming: Microsoft QA Contractors Say They Were Laid Off for Attempting to Unionize

Fuck microsoft

dan1101, in gaming: A Small Steam Game Shows How LLMs Could Kill the Dialogue Tree (re: Verbal Verdict demo)

I think real-time LLMs in games will happen, and I like this approach of having it run locally. Not sure why you got the port warning, though.

webghost0101, in gaming: A Small Steam Game Shows How LLMs Could Kill the Dialogue Tree (re: Verbal Verdict demo)

I wonder if we're gonna start seeing modular, specialized game drivers to save space and work.

We already have shared libraries for gamepad controllers and such. Why not one that handles a large language model, one for raytraced lighting? Maybe even an image generator for patterns in creative building games.

These would need to be standardized and able to be further molded, processed, and restricted by the actual games.

Obviously the Triple-Ass studios will want you to pay for online services, but I legitimately believe there is a future for open-source gaming, and this could potentially save a lot of hair-pulling for some nonprofit indie devs.
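
A very loose sketch of the "shared driver" idea, assuming nothing beyond plain Python and entirely hypothetical names: a standardized interface that a shared, game-agnostic module would expose, with the game layering its own restrictions on top.

```python
# Hypothetical sketch of a standardized "generative driver" interface that a
# shared library could expose, and a game-side wrapper that molds/restricts it.
from abc import ABC, abstractmethod


class GenerativeDriver(ABC):
    """What a shared, game-agnostic text-generation module might expose."""

    @abstractmethod
    def generate(self, prompt: str) -> str:
        ...


class RestrictedDriver(GenerativeDriver):
    """The game wraps the shared driver to restrict what it can produce."""

    def __init__(self, inner: GenerativeDriver, banned_words: set[str]):
        self.inner = inner
        self.banned_words = banned_words

    def generate(self, prompt: str) -> str:
        text = self.inner.generate(prompt)
        # Crude restriction layer; a real game would do far more than this.
        for word in self.banned_words:
            text = text.replace(word, "***")
        return text
```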

AlmightyTritan, in gaming: A Small Steam Game Shows How LLMs Could Kill the Dialogue Tree (re: Verbal Verdict demo)

I don’t think we’ll see this any time soon, because corpos probably won’t listen to any creative who pitches it, but what I want is something where the LLM runs locally and is just used to interpret what you are asking for, while the dialogue responses are all still written by a writer. Then you can make the user interaction feel more intuitive, but the design of the story and mechanics can just respond to the implied tone, questions, prompts, and keywords from the user.

Then you could have a dialogue tree that responds with a nice, well-constructed narrative, but a user who asked something casually vs. accusatorially might end up with slightly different information.
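
A minimal sketch of that split, assuming the local LLM is used purely as a classifier: it maps the player's free-form line to one of a fixed set of writer-defined intents and tones, and every response the player can ever see stays hand-written. The classify() function is a placeholder for the actual model call.

```python
# Sketch: local LLM as interpreter only; all reachable lines are writer-authored.
# classify() is a stand-in for the local model call (e.g. a constrained prompt
# that must answer with one intent label and one tone label).

# Writer-authored responses, keyed by (intent, tone).
RESPONSES = {
    ("ask_alibi", "casual"): "I was home all night, same as always.",
    ("ask_alibi", "accusatory"): "I don't like what you're implying, detective.",
    ("ask_about_boss", "casual"): "He's fair enough, I suppose.",
    ("ask_about_boss", "accusatory"): "Leave him out of this.",
}
FALLBACK = "I'm not sure what you're getting at."


def classify(player_text: str) -> tuple[str, str]:
    """Placeholder: ask the local LLM to pick an (intent, tone) pair."""
    # A real implementation would prompt the model with player_text plus the
    # allowed labels and parse its answer; here we just pretend.
    return ("ask_alibi", "accusatory")


def npc_reply(player_text: str) -> str:
    intent, tone = classify(player_text)
    return RESPONSES.get((intent, tone), FALLBACK)


print(npc_reply("Why didn't you tell your boss you were feeling sick?"))
```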

Fauxreigner,

Unless you’re willing to put in some kind of response that basically says “I’m not going to respond to that” (and that’s a sure way to break immersion), this is effectively impossible to do well, because the writer has to anticipate every possible thing a player could say and craft a response to it. If you don’t, you’ll end up finding a “nearest fit” that is not at all what the player was trying to say, and the reaction is going to be nonsensical from the player’s perspective.

LA Noire is a great example of this, although from the side of the player character: the dialogue was written with the “Doubt” option as “Press” (as in, put pressure on the other party). As a result, a suspect can say something, the player selects “Doubt”, and Phelps goes nuts making wild accusations instead of pointing out an inconsistency.

Except worse, because in this case, the player says something like “Why didn’t you say something to your boss about feeling sick?” and the game interpreted it as “Accuse them of trying to sabotage the business.”

memfree,

Ooooh, I’d like that! Well, there’s 3 parts to the (random user input / scripted game output) conundrum:

  1. I think it is fair that if you ask, ‘Why didn’t you say something?’ the NPC might either respond as if it is being accused of sabotage, answer the damn question, lie, or prefer not to talk about it (it’s personal).
  2. I’d keep a short list of standard options – probably in a collapsed scroller kinda thing so you could either verbally say or type whatever you want, OR you could click an arrow to pick from a list. That way lazy or stuck players wouldn’t have to think of all the options, and players interested in roleplaying could do as they please.
  3. I’m OK with, “I’m not going to respond to that”. I’d hope each character had several variations of that, but I think it is legitimate for NPCs to dislike being pestered. Shopkeepers might have replies like, “Are you gonna buy something or are you just here to bend my ear?” or “I don’t see how that relates to my inventory.” Random townies might reply, “Do I even know you?” or “Would you PLEASE stop bothering me.” or “You’re harshing my mellow, man. Shhhh… Just chill.” (A rough sketch of 2 and 3 follows this list.)
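
A rough sketch of points 2 and 3, with every name made up for illustration: the player can type anything or pick from a short list of standard options, and each NPC type has its own pool of brush-off lines for input that doesn't match any known intent.

```python
# Sketch: free text OR a pick-list of standard options, plus per-NPC brush-offs
# for anything the intent matcher can't place. Names are purely illustrative.
import random

STANDARD_OPTIONS = ["Ask about the victim", "Ask for an alibi", "Say goodbye"]

BRUSH_OFFS = {
    "shopkeeper": [
        "Are you gonna buy something or are you just here to bend my ear?",
        "I don't see how that relates to my inventory.",
    ],
    "townie": [
        "Do I even know you?",
        "Would you PLEASE stop bothering me.",
    ],
}


def respond(npc_type: str, matched_intent: str | None) -> str:
    """Return a writer-authored line, or a brush-off if nothing matched."""
    if matched_intent is None:
        return random.choice(BRUSH_OFFS[npc_type])
    return f"[writer-authored response for '{matched_intent}']"


print(respond("shopkeeper", None))      # pestering gets a deflection
print(respond("townie", "ask_rumors"))  # a matched intent gets real dialogue
```
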
savvywolf, in gaming: A Small Steam Game Shows How LLMs Could Kill the Dialogue Tree (re: Verbal Verdict demo)
@savvywolf@pawb.social avatar

I’ve played Ace Attorney and the writers put a lot of love and personality into the characters. I’d be sceptical that an AI could get close enough to any kind of writing style to “kill” writing in games like that.

Honestly, I’m getting fed up with AI doing a mediocre job of creating art and then people claiming it kills whole industries because it’s the “in” technology.

raccoona_nongrata,
@raccoona_nongrata@beehaw.org avatar

deleted_by_author

    spriteblood,

    The amount of time to build something like this seems like it would outweigh the effort it would take just to write good character dialogue. AI tools are basically word calculators, which means you have to provide data for the LLM, which means time to produce that data, time to build guardrails, etc. Even in this implementation, they say they had to build guardrails so that the characters don't say anything "harmful."

    There are also a number of lawsuits going on that will set a precedent for how training data can be utilized in commercial products. While I expect them to take the side of large corporations with vast resources at the expense of ethics, there's the possibility that they will do the right thing. This will affect how AI tools will be used in such contexts.

    SoupBrick,

    It will kill industries temporarily, until the corpos realize their success came from the artists.

    AceFuzzLord,

    I know the situation is different for everyone diagnosed with autism, but I like to compare AI writing to the writing of someone with autism (as someone with autism myself). It can look kinda emotionless and robotic at times, but other times it looks passable as something slightly less robotic.

    Fisch, in gaming: A Small Steam Game Shows How LLMs Could Kill the Dialogue Tree (re: Verbal Verdict demo)
    @Fisch@lemmy.ml avatar

    Since I learned about LLMs when ChatGPT became popular, the one thing I’ve wanted to see is games where you can actually talk to NPCs (using a locally running LLM like here, not ChatGPT), and it’s cool to see that we’re getting closer and closer to that.

    peter,
    @peter@feddit.uk avatar

    This is the only actually good use of LLMs I can really think of. As long as there is a good way to keep them within the bounds of the actual story, it would be great for that.

    blindsight,

    I think they also have potential for creating lots of dialogue variations pre-generated into a database and manually checked by a writer for QC.

    The problem with locally-run LLMs is that the good ones require massive amounts of video memory, so it’s just not feasible anytime soon. And the small ones are, well, crappy. And slow. And still huge (8GB+).

    That of course means you can’t get truly dynamic branching dialogue, but it can enable things like getting thousands of NPC lines instead of “I took an arrow to the knee” from every guard in every city.

    It can be used to generate dialogue, too, so not just one-liners but “real” NPC conversations (or rich branching dialogue options for players to select).

    I’m very skeptical that we’ll get “good” dynamic LLM content in games, running locally, this decade.
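
    A rough sketch of that pre-generate-then-hand-check pipeline, assuming SQLite for storage and a placeholder in place of whatever offline LLM tooling would actually produce the variations:

```python
# Sketch: batch-generate line variations offline, park them in SQLite with an
# 'approved' flag, and let a writer review them before anything ships.
# generate_variation() is a placeholder for the real offline LLM call.
import sqlite3


def generate_variation(base_line: str, seed: int) -> str:
    """Placeholder for an offline LLM call that rewrites base_line."""
    return f"{base_line} (variation {seed})"


BASE_LINES = ["I used to be an adventurer like you."]

conn = sqlite3.connect("npc_lines.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS lines ("
    "id INTEGER PRIMARY KEY, base TEXT, variant TEXT, approved INTEGER DEFAULT 0)"
)
for base in BASE_LINES:
    for seed in range(5):
        conn.execute(
            "INSERT INTO lines (base, variant) VALUES (?, ?)",
            (base, generate_variation(base, seed)),
        )
conn.commit()

# A writer later flips 'approved' to 1 for lines that pass QC; the game only
# loads approved rows, so nothing unchecked ever reaches players.
approved = conn.execute("SELECT variant FROM lines WHERE approved = 1").fetchall()
print(f"{len(approved)} approved lines ready to ship")
```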

    Fisch,
    @Fisch@lemmy.ml avatar

    Big breakthroughs are still being made when it comes to efficiency (so, same or better quality for less processing power), and game devs will probably figure out how best to instruct the LLM to do what they want over time. I think there’s still a lot that will happen in that regard in the next few years before it starts to slow down.

    derbis,

    There’s a Stardew Valley mod that does this, but it seems development has stalled.

    Fisch,
    @Fisch@lemmy.ml avatar

    That’s really cool. I also heard about a Skyrim mod that does this, and since I’m playing Skyrim in VR (modded ofc), that would make it even cooler.

    memfree, in gaming: A Small Steam Game Shows How LLMs Could Kill the Dialogue Tree (re: Verbal Verdict demo)

    NOTE: I just downloaded the game and on my first attempted launch, it complained that the port it wanted was not open. My only option was to close the game. I ran netstat and did not see the port listed, so I tried again. THAT time, it complained about my older video card :-/ The warning is clunky and there’s a typo, too (“withing” instead of “within”). It says (if I transcribed accurately):

    You are using an: NVIDIA GEOFORCE GTX 1080. This video card is currently not recognized withing the recommended specs. We only support a limited amount of NVIDIA GTX graphics cards, all NVIDIA RTX graphics cards or all AMD RX graphics cards since the local AI requires a lot of performance.

    So please note that the game might not work properly. Refer to the Steam guide for more information.

    When I closed that warning, the game loaded.

    zaphod,
    @zaphod@lemmy.ca avatar

    Wait… why the heck does it need to open a network port?

    Mixel,

    Probably for the AI, if I had to guess. In the backend it’s probably using something like LocalAI, koboldcpp, or llamacpp.

    ninjan,

    It likely starts the LLM it uses as a service, and that requires it to run on a port. They could of course have rewritten it to not use a port and instead use other mechanisms that are possible when you’re in control of the code, but that would require modifying the LLM project they use and would make updating its version harder, so such a thing would be reserved for the full release or skipped altogether because it’s not really a big deal. All this assumes that they do use one of the hundreds of open-source local LLM projects floating around GitHub.
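
    If it is one of those projects, the game side could be as small as the sketch below, assuming a llama.cpp-style HTTP server already running on a localhost port (the endpoint name and JSON fields vary between projects):

```python
# Sketch: a game process talking to a locally running LLM service over a
# localhost port. Assumes a llama.cpp-style server on port 8080; the exact
# endpoint and field names differ between projects.
import json
import urllib.request

LLM_URL = "http://127.0.0.1:8080/completion"  # local only, no internet needed


def ask_npc(prompt: str, max_tokens: int = 64) -> str:
    payload = json.dumps({"prompt": prompt, "n_predict": max_tokens}).encode()
    req = urllib.request.Request(
        LLM_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())["content"]


print(ask_npc("The player asks the suspect: 'Where were you last night?'"))
```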

    peter,
    @peter@feddit.uk avatar

    Why do you need to open a port if it's listening locally?

    memfree,

    It is probably easier. I used to run a program that ran its own mini server-like process to send input to other open programs. It used local ports. It didn’t need internet, but it did need ports. My first guess is that programmers already know a bunch of dev libraries that deal with ports, so it’s easier to use those than to write something else from scratch.

    Midnitte,

    If it runs the model as a service locally, it’s probably communicating to the game via a network port.

    Similar to how some single player campaigns are still technically running a “server” for the game, despite being single player.

    ReversalHatchery,

    Actually, it’s not rare for part of the interprocess communication between a piece of software’s processes to happen over localhost networking.

    peter,
    @peter@feddit.uk avatar

    But what firewall blocks that by default?

    ReversalHatchery,

    Probably none, but I don’t think OP said it was a firewall blocking it.

    peter,
    @peter@feddit.uk avatar

    OP said that the port wasn’t open, unless they meant the port was already in use rather than the port was closed?

    memfree,

    The warning message said the port was not open, but my guess is that the message was inexact. I doubt the port was ever restricted at all. In fact – and with no evidence one way or the other – it wouldn’t surprise me if the only issue was my old video card and the ‘port’ error was simply the first error message the game found on initial launch. For my theory to make sense, though, some initial setup piece must have completed on 1st launch such that the 2nd launch had a newly made config file or something and that extra piece let me proceed to a more accurate error.

    zaphod,
    @zaphod@lemmy.ca avatar

    So laziness. Got it.

    (They could easily move to an IPC mechanism that doesn’t require binding a port on a network interface, but that’d require time and effort, and why bother when the goal is to ship something fast and cheap while the AI hype is strong.)

    Sounds like a fun way to directly mess with their model though.

    ninjan,

    We’re talking about a demo here…

    CrayonRosary, in gaming: Watch This Guy Play ‘Doom’ on a Toothbrush
    ZeroHora,
    @ZeroHora@lemmy.ml avatar

    wtf

    cuchilloc, in gaming: Watch This Guy Play ‘Doom’ on a Toothbrush

    Great now run it in a disposable vape!

    ZeroHora, in gaming: Watch This Guy Play ‘Doom’ on a Toothbrush
    @ZeroHora@lemmy.ml avatar

    Nice paywall

    CrayonRosary,

    Sign up for free access to this post

    conciselyverbose,

    It’s not a paywall.

    But account walls are still gross.

    fushuan,

    That would be paying with your personal information, so it’s still a paywall.

    catloaf, in gaming: Watch This Guy Play ‘Doom’ on a Toothbrush

    And unlike the pregnancy test, this one appears to be actually running on it, not just using the case as a shell around your own hardware.
