pcgamesn.com

ImgurRefugee114, do gaming w Where Winds Meet players are tricking AI-powered NPCs into giving them rewards by using the 'Solid Snake method'

The ‘solid snake’ method?

lime,
@lime@feddit.nu avatar

without reading i’m going to assume it’s either dropping porn mags on the ground or clapping their dummy thicc asscheeks

ImgurRefugee114,

Dummy thicc asscheeks?

lime,
@lime@feddit.nu avatar

nnngh, colonel

albbi,

The Colonel’s daughter…

DebatableRaccoon,

The Colonel’s daughter…?

Endmaker,

It’s mentioned in the article.

ImgurRefugee114,

The article?

Endmaker,

Nevermind. You got me. Well-played.

sirico,
@sirico@feddit.uk avatar

I got you?

psx_crab,

You’re that ninja

RickyRigatoni,
@RickyRigatoni@retrolemmy.com avatar

Psycho mantis?

Rose_Thorne,

Metal… Gear‽

DoucheBagMcSwag,

a Surveillance camera?

Kepion,

Psycho mantis?

wildflowertea,

Wonderful.

The Solid Snake method of conversation has taken on meme status in recent years, as players noticed the Metal Gear icon simply repeated the last few words of anything anyone said to him as a question. As was discovered by ‘Hakkix’ on Reddit, you can do the same to game the NPCs in Where Winds Meet. If someone asked you, say, to “Find the buried treasure chest,” you’d respond by saying, “The buried treasure chest?” and so on. Eventually, the NPC gets so confused that they express their gratitude and end the conversation. Whether that’s due to confusion or exasperation is unclear, but the effect is the same.

mushroomman_toad,

The effect is the same?

wildflowertea,

“They express their gratitude and end the conversation”, which I believe is a synonym of “quest completed, here’s your reward!”

frank,

The Solid Snake method of conversation has taken on meme status in recent years, as players noticed the Metal Gear icon simply repeated the last few words of anything anyone said to him as a question. As was discovered by ‘Hakkix’ on Reddit, you can do the same to game the NPCs in Where Winds Meet. If someone asked you, say, to “Find the buried treasure chest,” you’d respond by saying, “The buried treasure chest?” and so on. Eventually, the NPC gets so confused that they express their gratitude and end the conversation. Whether that’s due to confusion or exasperation is unclear, but the effect is the same.

lime,
@lime@feddit.nu avatar

the effect is the same?

ICastFist,
@ICastFist@programming.dev avatar

Indeed. <absurdly long wall of expository text>, that’s why you must press on with the mission!

BarHocker,

Press on with the mission?

GeneralEmergency, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

Steam already sells enough slop without AI.

But you know for sure the moment Gaben sees all the money from AI games, that shit will be pushed to the max.

Burghler,

You have no idea why Steam dominates the market if you truly think this.

GeneralEmergency,

Aww dude.

Steam was plagued with asset flips for years and refused to do anything about it. Repeatedly saying it was a Unity issue.

Then there’s the slop of shitty “simulator” games

Digital Homicide

Bad Rats

Have we seriously forgotten about Day One: Garry’s Incident

Steam dominates the market because it paid publishers to use them as DRM for physical releases.

Burghler,

So does Nintendo’s estore and they don’t bother to filter or sort the slop out, it’s a worthless store to search through. At least steam filters out the slop trash and allows refunds if you somehow fall for garbage.

What is your point here? Some niche forgotten game from 12 years ago weighs that heavily on your mind?

Where do you even find asset flip games? I haven’t seen any in 5+ years and that period only existed because malicious people found the angle, which valve plugged.

Steam dominates the market because all my friends are there and we all have a great experience. Sales on PC games are better than physical console games ever get. Customer support and in general user experience has been phenomenal.

You can look past weaknesses that were addressed and solved, my guy. Greedy assholes will always try to game the system and valve plugs those quick.

Aww dude? Wake up?

GeneralEmergency,

Aww dude.

Good job ignoring Steam’s monopolistic anti consumer practices that caused their dominance.

Good job ignoring Steam’s multiple instances of willingly selling shit because they get paid either way.

Good job ignoring Steam had to be forced to give refunds by the EU.

You’re a good little sheep aren’t you.

Burghler,

You do know it’s a market place lil bro and the best marketplace available in general? The only better one I can think of is Costco and their tech return policies are worse than steam.

You do know DRM isn’t enforced right? You can release without it.

You do know the refund policy is MORE consumer friendly than what the legal obligation from EU requires right?

If calling me a sheep helps you feel better about your poorly researched (and incredibly outdated) takes, all the power to you. I’ve said my piece now; I have no further will to continue this fruitless yapping.

GeneralEmergency,

DRM isn’t enforced right?

Good job completely ignoring what I said about Valve paying publishers to use them as DRM to force an install base. That’s a good little lamb you are.

MORE consumer friendly

It is literally the bare minimum they are required to offer, something they spent years arguing against. And you say I’m making poorly researched takes.

I know it’s hard to come to terms with how you’ve been treated like a fool. But don’t worry, one of these days you’ll see the light.

Burghler,

¯⁠\⁠(⁠°⁠_⁠o⁠)⁠/⁠¯

Baaaaah

GeneralEmergency,

Typical G*mer troglodyte.

Been fed shit for so long you have a fetish for it.

Burghler,

ᕦ⁠(⁠ò⁠_⁠ó⁠ˇ⁠)⁠ᕤ

1v1 me scrub

GeneralEmergency,

I’d rather not catch the G*mer

ripcord,
@ripcord@lemmy.world avatar

Yes, yes, we get it. Everything is bad, nothing is good.

QuantumTickle, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

If “everyone will be using AI” and it’s not a bad thing, then these big companies should wear it as a badge of honor. The rest of us will buy accordingly.

Devial,

If “everyone will be using AI”, AI will turn to shit.

They can’t create originality, they’re only recycling and recontextualising existing information. But if you recycle and recontextualise the same information over and over again, it keeps degrading more and more.

It’s ironic that the very people who advocate for AI everywhere fail to realise just how dependent the quality of AI content is on having real, human-generated content to train the model on.

4am,

“The people who advocate for AI” are literally running around claiming that AI is Jesus and it is sacrilege to stand against it.

And by literally, I mean Peter Thiel is giving talks actually claiming this. This is not an exaggeration, this is not hyperbole.

They are trying to recruit techno-cultists.

EldritchFeminity,

Ironically, one of the defining features of the techno-cultists in Warhammer 40k is that they changed the acronym to mean “Abominable Intelligence” and not a single machine runs on anything more advanced than a calculator.

4am,

Sci Fi keeps trying to teach us lessons, and instead we keep using it as an instruction manual.

(Except, apparently, whenever it’s on the nose we interpret it as dramatic irony…)

Sl00k,

I think the grey area is what if you’re an indie dev and did the entire story line and artwork yourself, but have the ai handle more complex coding.

It is to our eyes entirely original but used AI. Where do you draw the line?

Devial, (edited )

The line, imo, is: are you creating it yourself, and just using AI to help you make it faster/more convenient, or is AI the primary thing that is creating your content in the first place?

Using AI for convenience is absolutely valid imo; I routinely use chatGPT to do things like debugging code I wrote, or rewriting data sets in different formats instead of doing it by hand, or for more complex search-and-replace jobs, if I can’t be fucked to figure out a regex to cover it.

For these kind of jobs, I think AI is a great tool.

More simply said, I personally generally use AI for small subtasks that I am entirely capable of doing myself, but are annoying/boring/repetitive/time consuming to do by hand.

Sl00k,

I definitely agree but I think that case would still get caught in the steam AI usage badge?

Default_Defect,
@Default_Defect@anarchist.nexus avatar

Disclose the AI usage and how it was used. Let people decide. There will always be “no AI at all, ever” types that won’t touch the game, but others will see that it was used as a tool rather than a replacement for creativity and will give it a chance.

irmoz,

That’s somewhat acceptable. The ideal use of AI is as a crutch - and I mean that literally. A tool that multiplies and supports your effort, but does not replace your effort or remove the need for it.

CatsPajamas,

How does this model collapse thing still get spread around? It’s not true. Synthetic data has actually helped bots get smarter, not dumber. And if you think that all Gemini3 does is recycle idk what to tell you

Devial,

If the model collapse theory weren’t true, then why do LLMs need to scrape so much data from the internet for training?

According to you, they should be able to just generate synthetic training data purely with the previous model, and then use that to train the next generation.

So why is there even a need for human input at all then? Why are all LLM companies fighting tooth and nail against their data scraping being restricted, if real human data is in fact so unnecessary for model training, and they could just generate their own synthetic training data instead?

You can stop models from deteriorating without new data, and you can even train them with synthetic data, but that still requires the synthetic data to either be modelled, or filtered by humans to ensure its quality. If you just take a million random chatGPT outputs, with no human filtering whatsoever, and use those to retrain the chatGPT model, and then repeat that over and over again, eventually the model will turn to shit. Each iteration some of the random tweaks chatGPT makes to their output are going to produce some low quality outputs, which are now presented to the new training model as a target to achieve, so the new model learns that the quality of this type of bad output is actually higher, which makes it more likely for it to reappear in the next set of synthetic data.

And if you turn off the random tweaks, the model may not deteriorate, but it also won’t improve, because effectively no new data is being generated.
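The degradation loop described above can be shown with a toy experiment (my own illustrative sketch, not anything from the article): repeatedly draw a finite sample from a token distribution, refit the distribution from that sample alone, and treat the estimate as the next generation’s ground truth. Tokens that miss a sample get probability zero and can never come back, so diversity only shrinks:

```python
import random
from collections import Counter

random.seed(0)

# Generation 0: a uniform "language" of 10 token types.
dist = {f"tok{i}": 0.1 for i in range(10)}

def next_generation(dist, sample_size=50):
    """Sample from the current model, then refit by raw frequency.
    Tokens that miss the sample get probability 0 -- permanently."""
    tokens, weights = zip(*dist.items())
    sample = random.choices(tokens, weights=weights, k=sample_size)
    counts = Counter(sample)
    return {t: c / sample_size for t, c in counts.items()}

for gen in range(300):
    dist = next_generation(dist)

print(f"surviving token types after 300 generations: {len(dist)}")
```

With no fresh human data entering the loop, the vocabulary can only lose types, never regain them, which is the essence of the collapse argument; real training pipelines avoid this precisely by curating and filtering what goes back in.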

CatsPajamas,

I stopped reading when you said according to me and then produced a wall of text of shit I never said.

Synthetic data is massively helpful. You can look it up. This is a myth.

Devial,

That is enormously ironic, since I literally never claimed you said anything except for what you did: Namely, that synthetic data is enough to train models.

According to you, they should be able to just generate synthetic training data purely with the previous model, and then use that to train the next generation.

Literally, the very next sentence starts with the words “Then why”, which clearly and explicitly means I’m no longer indirectly quoting you. Everything else in my comment is quite explicitly my own thoughts on the matter, and why I disagree with that statement, so in actual fact, you’re the one making up shit I never said.

exu,

Recycling sounds suspiciously like what “AAA” studios already do

Carighan,
@Carighan@piefed.world avatar

I wish they'd replace Tim Sweeney with AI. Would genuinely have better takes on most topics, too. Sigh.

RizzRustbolt,

Can we get an AI version of the old burnout Tim Sweeney? He was at least unintentionally funny.

reksas, do gaming w Where Winds Meet players are tricking AI-powered NPCs into giving them rewards by using the 'Solid Snake method'

that is the best way to interact with the bots, its more fun to just tell them crazy stories. Like the one near the general shrine, i told him that i was there to beat everyone up in the sparring ring and first it insisted there is no sparring in such holy place and in the end we “went in and slaughtered everyone for the emperor for being such heretics”. It would be even more fun though if they could question things more.

prole,

I feel like if I ever tried this game, I would spend the entire time fucking with the NPCs

reksas,

the talkable npc thing is one of the collectible systems. Basically you make “friends” with them and they send you stuff occasionally

Datz, do gaming w Where Winds Meet players are tricking AI-powered NPCs into giving them rewards by using the 'Solid Snake method'

I really need to play MGS. I bought 1-4 second hand and still didn’t get to it

frank, do gaming w Where Winds Meet players are tricking AI-powered NPCs into giving them rewards by using the 'Solid Snake method'

I’m not sure how I feel about AI chatbots as NPCs. On one hand, it does add near infinite dialogue options and flexibility to adapt to what a player does. That’s super cool and immersive.

On the other hand, it feels so damn lazy. Like I want to play games with dialogue/story as an art form, not as a “how much time can I spend here”

PonyOfWar,

I think it’s pretty cool. The game does have a lot of pre-written dialogue as well, so it’s just an additional interaction you can have with NPCs. It also does require a detailed backstory, motivations, personality etc to be written for each NPC you can chat with, so I wouldn’t exactly call it lazy.

lime,
@lime@feddit.nu avatar

you can build systems that allow freely chatting but will always stay in character. it just requires making your own training data, and training your own model. which nobody seems willing to do. mostly because it’s not feasible without bethesda-levels of dialogue.

Damarus,

I’m playing Where Winds Meet and imo the chatbots are one of the weakest points of the game. You are told it’s a bot, it feels like one, and as there is still a rigid game around this interaction, it’s essentially just a weird romancing minigame. The only reason I engage with this system is because it can be easily cheated. Nothing of value would be lost if this feature was entirely cut from the game.

yuri,

it’s a river that’s a mile wide and an inch deep

soulsource, do gaming w Where Winds Meet players are tricking AI-powered NPCs into giving them rewards by using the 'Solid Snake method'
@soulsource@discuss.tchncs.de avatar

That “Solid Snake Method” sounds a lot like the emacs doctor…

In case you don’t know what the emacs doctor is: it’s an Easter egg in the text editor emacs (it is, however, mentioned in the manual). The doctor is a chatbot based on ELIZA, meant to portray a psychotherapist. Since it is a rather simple script, it is very limited in what it can do, and mostly just reformulates user input as questions.
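The echo trick that both the doctor and the “Solid Snake method” lean on can be sketched in a few lines; this is a toy of my own, not the actual ELIZA script:

```python
import re

def snake_reply(utterance: str, tail_words: int = 4) -> str:
    """Echo the last few words of the input back as a question,
    ELIZA/Solid-Snake style. Purely illustrative."""
    # Strip trailing punctuation and split into words.
    words = re.sub(r"[.!?]+$", "", utterance.strip()).split()
    tail = " ".join(words[-tail_words:])
    # Capitalize the first word of the echoed fragment.
    return (tail[0].upper() + tail[1:] + "?") if tail else "...?"

print(snake_reply("Find the buried treasure chest."))
# -> The buried treasure chest?
```

No state, no understanding, just a reflection of the last thing said, which is exactly why it can loop an over-eager LLM NPC into giving up.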

Wilco, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

We need laws passed where AI has to be clearly labeled or the user faces severe fines. Robocalls and AI IVR phone systems should clearly tell you “this is AI”.

megopie, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

The reality is that it’s often stated that generative AI is an inevitability, that regardless of how people feel about it, it’s going to happen and become ubiquitous in every facet of our lives.

That’s only true if it turns out to be worth it: if the cost of using it is lower than the alternative, and the market is still willing to buy the result. If the current cloud-hosted tools cease to be massively subsidized, and consumers choose to avoid it, then it’s inevitably a historical footnote, like turbine-powered cars, Web 3.0, and LaserDisc.

Those heavily invested in it, either literally through shares of Nvidia, or figuratively through the potential to deskill and shift power away from skilled workers at their companies, don’t want that to be a possibility; they need to prevent consumers from having a choice.

If it was an inevitability in its own right, if it was just as good and easily substitutable, why would they care about consumers knowing before they paid for it?

U7826391786239,

relevant article theringer.com/…/ai-bubble-burst-popping-explained…

AI storytelling is an amalgam of several different narratives, including:

Inevitability: AI is the future; its eventual supremacy is both imminent and certain, and therefore anyone who doesn’t want to be left behind had better embrace the technology. See Jensen Huang, the CEO of Nvidia, insisting earlier this year that every job in the world will be impacted by AI “immediately.”

Functionality: AI performs miracles, and the AI products that have been released to the public wildly outperform the products they aim to replace. To believe this requires us to ignore the evidence obtained with our own eyes and ears, which tells us in many cases that the products barely work at all, but it’s the premise of every TV ad you watch out of the corner of your eye during a sports telecast.

Grandiosity: The world will never be the same; AI will change everything. This is the biggest and most important story AI companies tell, and as with the other two narratives, big tech seems determined to repeat it so insistently that we come to believe it without looking for any evidence that it’s true.

As far as I can make out, the scheme is essentially: Keep the ship floating for as long as possible, keep inhaling as much capital as possible, and maybe the tech will get somewhere that justifies the absurd valuations, or maybe we’ll worm our way so far into the government that it’ll have to bail us out, or maybe some other paradigm-altering development will fall from the sky. And the way to keep the ship floating is to keep peddling the vision and to seem more confident that the dream is inevitable the less it appears to be coming true.

speaking for myself, MS can thank AI for being the thing that made me finally and completely ditch Windows after using it for 30+ years

Katana314,

Don’t forget, “Turns out it was a losing bet to back DEI and Trans people”.

This is something scared, pathetic, loser, feral, spineless, sociopathic, moronic fascists come up with to try to win a crowd larger than an elevator; Assume the outcome as a foregone conclusion and try to talk around it, or claim it’s already happened.

Respond directly. “What? That’s ridiculous. I’ve never even seen ANY AI that I liked. Who told you it was going to pervade everything?”

WanderingThoughts,

That reminds me of how McDonald’s and other fast food chains are struggling. People figure it’s too expensive for what you get, after prices going up and quality going down for years. They forgot that people buy if the price and quality are good. Same with AI. It’s all fun if it’s free or dirt cheap, but people don’t buy expensive slop.

riskable,
@riskable@programming.dev avatar

If the cost of using it is lower than the alternative, and the market is still willing to buy the result. If the current cloud-hosted tools cease to be massively subsidized, and consumers choose to avoid it, then it’s inevitably a historical footnote, like turbine-powered cars, Web 3.0, and LaserDisc.

There’s another scenario: Turns out that if Big AI doesn’t buy up all the available stock of DRAM and GPUs, running local AI models on your own PC will become more realistic.

I run local AI stuff all the time from image generation to code assistance. My GPU fans spin up for a bit as the power consumed by my PC increases but other than that, it’s not much of an impact on anything.

I believe this is the future: Local AI models will eventually take over just like PCs took over from mainframes. There’s a few thresholds that need to be met for that to happen but it seems inevitable. It’s already happening for image generation where the local AI tools are so vastly superior to the cloud stuff there’s no contest.

CatsPajamas,

MIT, like two years out from a study saying there is no tangible business benefit to implementing AI, just released a study saying it is now capable of taking over more than 10% of jobs. Maybe that’s hyperbolic but you can see that it would require a massssssive amount of cost to make that not be worth it. And we’re still pretty much just starting out.

Jayjader,

I would love to read that study, as going off of your comment I could easily see it being a case of “more than 10% of jobs are bullshit jobs à la David Graeber so having an « AI » do them wouldn’t meaningfully change things” rather than “more than 10% of what can’t be done by previous automation now can be”.

CatsPajamas,

Summarized by Gemini

The study you are referring to was released in late November 2025. It is titled “The Iceberg Index: Measuring Workforce Exposure in the AI Economy.” It was conducted by researchers from MIT and Oak Ridge National Laboratory (ORNL). Here are the key details from the study regarding that “more than ten percent” figure:

  • The Statistic: The study found that existing AI systems (as of late 2025) already have the technical capability to perform the tasks of approximately 11.7% of the U.S. workforce.
  • Economic Impact: This 11.7% equates to roughly $1.2 trillion in annual wages and affects about 17.7 million jobs.
  • The “Iceberg” Metaphor: The study is named “The Iceberg Index” because the researchers argue that visible AI adoption in tech roles (like coding) is just the “tip of the iceberg” (about 2.2%). The larger, hidden mass of the iceberg (the other ~9.5%) consists of routine cognitive and administrative work in other sectors that is already technically automatable but not yet fully visible in layoff stats.
  • Sectors Affected: Unlike previous waves of automation that hit blue-collar work, this study highlights that the jobs most exposed are in finance, healthcare, and professional services. It specifically notes that entry-level pathways in these fields are collapsing as AI takes over the “junior” tasks (like drafting documents or basic data analysis) that used to train new employees.

Why it is different from previous studies: Earlier MIT studies (like one from early 2024) focused on economic feasibility (i.e., it might be possible to use AI, but it’s too expensive). This new 2025 study focuses on technical capacity: the AI can do the work right now, and for many of these roles, it is already cost-competitive.

…mit.edu/…/rethinking-ais-impact-mit-csail-study-…

Jayjader,

I’ll be honest, that “Iceberg Index” study doesn’t convince me just yet. It’s entirely built off of using LLMs to simulate human beings and the studies they cite to back up the effectiveness of such an approach are in paid journals that I can’t access. I also can’t figure out how exactly they mapped which jobs could be taken over by LLMs other than looking at 13k available “tools” (from MCPs to Zapier to OpenTools) and deciding which of the Bureau of Labor’s 923 listed skills they were capable of covering. Technically, they asked an LLM to look at the tool and decide the skills it covers, but they claim they manually reviewed this LLM’s output so I guess that counts.

Project Iceberg addresses this gap using Large Population Models to simulate the human–AI labor market, representing 151 million workers as autonomous agents executing over 32,000 skills across 3,000 counties and interacting with thousands of AI tools

from iceberg.mit.edu/report.pdf

Large Population Models is arxiv.org/abs/2507.09901 which mostly references github.com/AgentTorch/AgentTorch, which gives as an example of use the following:


user_prompt_template = "Your age is {age} {gender}, {unemployment_rate} the number of COVID cases is {covid_cases}."
# Using Langchain to build LLM Agents
agent_profile = "You are a person living in NYC. Given some info about you and your surroundings, decide your willingness to work. Give answer as a single number between 0 and 1, only."

The whole thing perfectly straddles the line between bleeding-edge research and junk science for someone who hasn’t been near academia in 7 years like myself. Most of the procedure looks like they know what they’re doing, but if the entire thing is built on a faulty premise then there’s no guaranteeing any of their results.

In any case, none of the authors of the recent study are listed in that article on the previous study, so this isn’t necessarily a case of MIT as a whole changing its tune.

(The recent article also feels like a DOGE-style ploy to curry favor with the current administration and/or AI corporate circuit, but that is a purely vibes-based assessment I have of the tone and language, not a meaningful critique)

krakenx, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

Use of AI should be disclosed the same way 3rd party DRM and EULA agreements are. And similarly it should mention some details. People are free to boycott Denuvo if they want, but people are also free to buy it anyways if they want. Disclosure is never a bad thing.

Darkness343, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

How would the computer controlled enemies work in a 100% AI free videogame?

MajorasTerribleFate,

I mean, the term “AI” as it’s used in this context refers to output from Large Language Models (or whatever other complex machine learning systems) that scrape the content of the internet and produce images, text, etc. based on the collective artistic/linguistic work of innumerable uncompensated, unaware human contributors.

Algorithms written by programmers that interpret internal variables and react based on that aren’t the kind of “AI” in question.
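The distinction is easy to see in code. A classic “game AI” enemy is just a hand-written state machine over internal variables, with no dataset or training anywhere; here is a minimal sketch of my own, with made-up thresholds:

```python
from dataclasses import dataclass

@dataclass
class Guard:
    """A classic scripted enemy: pure if/else over game state."""
    state: str = "patrol"

    def update(self, distance_to_player: float, player_visible: bool) -> str:
        # Hand-tuned thresholds, not learned from any data.
        if player_visible and distance_to_player < 5.0:
            self.state = "attack"
        elif player_visible:
            self.state = "chase"
        elif self.state != "patrol":
            self.state = "search"  # lost sight of the player
        return self.state

g = Guard()
print(g.update(20.0, False))  # -> patrol
print(g.update(10.0, True))   # -> chase
print(g.update(3.0, True))    # -> attack
print(g.update(8.0, False))   # -> search
```

Every behaviour here was typed in by a programmer and is fully deterministic, which is why nobody expects this kind of “AI” to carry a disclosure label.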

Darkness343,

How stupid are people to believe that fancy auto complete engines are AI?

87Six, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

Extremely common Valve W

daniskarma, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

The thing is that it’s kind of voluntary. Game developers could use AI to develop the game and, if they didn’t want to disclose it, no one would know.

Unless the use of AI is the very crappy “AI art” that’s easy to notice, the rest of the uses would be very hard or outright impossible to detect, so there’s no real way to audit the legitimacy of the tag.

And this will end like r/art where the mods deleted a post accusing the artist of using AI when it was not AI and the final mod answer was “change your art style so it doesn’t look like AI”. A brutal witch-hunt in the end.

kazerniel, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"
@kazerniel@lemmy.world avatar

I’m glad for those disclosures (because I’m not touching AI games), but tons of devs don’t disclose their AI usage, even in obvious cases, leaving us guessing :/

Bassman1805,

There’s also the massive gray area of “what do YOU define AI to mean?”

There are legitimate use cases for machine learning and neural networks besides LLMs and “art” vomit. Like, what AI used to mean to gamers: how the computer plays the game against you. That probably isn’t going to upset many people.

(IIRC, Steam’s AI disclosure is specifically about AI-generated graphics and music so that ambiguity might be settled here)

CatsPajamas,

Not voices, too?

AgentRocket,

I’d say it depends on whether or not the voice actor whose voice the AI is imitating has agreed and is fairly compensated.

I’m imagining a game, where instead of predefined dialog choices, you talk into your microphone and the game’s AI generates the NPCs answer.

Kage520, do games w Valve dev counters calls to scrap Steam AI disclosures, says it's a "technology relying on cultural laundering, IP infringement, and slopification"

I actually would kind of like ai in games. Not slop visuals though. What I really would love would be in a VR game, going up to an NPC, and getting a feel for different cultures of the world I’m in through talking. Maybe you have to have a certain type of conversation to find out the plot for a side quest, or talk to a guard at a bar and work your way to find out the shift rotation as he gets drunk or something so you can infiltrate the castle.

I feel like ai could be useful like that…but getting rid of artists in favor of ai slop is just the worst way to implement this AI thing.

Bahnd,
@Bahnd@lemmy.world avatar

Avoiding slopification seems to be the main priority, and for AI to be incorporated into a game, it would have to do something that AI is already passable at; otherwise it won’t clear that barrier and will get shunned like the rest of the slop.

For example, you could have an LLM act as a character or have a neural net incorporated into the game-ai like how tool assisted DOTA2 competitions work.

I see three main problems. First, you would need the hardware to run it locally, which may be a hard sell to some people depending on what the game is; only online experiences should indebt themselves to AWS, and if it’s single player, it’s going to lose a ton of sales there. Two, it’s really hard to convince audiences electrons have feelings; remember Final Fantasy (2001)? That’s what happened last time someone tried to personify a digital construct, and well… it went swimmingly (Microsoft’s Tay does not count). Lastly, impact: would a narrative-focused title have the same impact if an AI wrote the script? How would you feel after playing through a title like “Papers, Please” and when the credits roll it says “script generated by Copilot”? I feel like it would ring hollow; the feelings would be cheapened by it…

I would be interested to see how this plays out, but I’m content to support the titles and studios that do things the traditional way.
