
HEXN3T, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

Let’s compare two completely separate games to a game and a remaster.

Generational leaps then:

https://lemmy.blahaj.zone/pictrs/image/e87aac0f-e62f-4c07-a733-215139b87d43.webp

Good lord.

EDIT: That isn’t even the Zero Dawn remaster. That is literally two still-image screenshots of Forbidden West on both platforms.

https://lemmy.blahaj.zone/pictrs/image/8e8bd0ca-7a89-4a15-82b2-5634195a804f.webp

Good. Lord.

HEXN3T,

It is baffling to me that people hate cross-gen games so much. Like, how awful for PS4 owners, who don’t have to buy a new console to enjoy the game, and how awful for PS5 owners, for whom the game runs at the same fidelity at over 60 FPS, or at significantly higher fidelity at the same frame rate.

They should have made the PS4 version the only one. Better yet, we should never make consoles again, because apparently nothing short of making you comprehend four dimensions counts as new enough.

Maggoty,

The point isn’t about cross generation games. It’s about graphics not actually getting better anymore unless you turn your computer into a space heater rated for Antarctica.

HEXN3T,

It’s a pointless point. Complain about power draw. Push ARM.

Maggoty,

ARM isn’t going to magically make GPUs need less brute force energy in badly optimized games.

HEXN3T, (edited)

…So push ARM. By optimising games.

EDIT: This statement is like saying “Focusing on ARM won’t fix efficiency because we aren’t focusing on ARM”.

Maggoty,

Yeah no. You went from console to portable.

We’ve had absolutely huge leaps in graphical ability. Denying that we’re getting diminishing returns now is just ridiculous.

HEXN3T,

We’re still getting huge leaps. It simply doesn’t translate into massively improved graphics. What those leaps do result in, however, is major performance gains.

I have played Horizon Zero Dawn, its remaster, and Forbidden West. I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn. The differences are absolutely there, it’s just not as spectacular as the jump from 2D to 3D.

The post comes off like a criticism of hardware not getting better fast enough. Wait until we can create dirt, sand, water or snow simulations in real time, instead of having to fake the look of physics. Imagine real simulations of wind and heat.

And then there’s Gaussian splatting, which absolutely is a huge leap. Forget trees practically being arrangements of PNGs: what if each and every leaf and branch had volume? What if leaves actually fell off?

Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.

Combined with better and better storage and VR/AR, there is still plenty of room for tech to grow. Saying “diminishing returns” is like saying that fire burns you when you touch it.

Maggoty,

I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn.

Really? I’ve played both on PS5 and didn’t notice any real difference in performance or graphics. I did notice that the PC version of Forbidden West has vastly higher minimum requirements, though. Which is the opposite of performance gains.

Who the fuck cares if leaves are actually falling off or spawning in above your screen to fall?

And BG3 has notoriously low minimums, it is the exception, not the standard.

If you want to see every dimple on the ass of a horse, then that’s fine: build your expensive computer and leave the rest of us alone. Modern next-gen graphics aren’t adding anything to a game.

HEXN3T,

I’m assuming you’re playing on a bad TV. I have a 4K 120 Hz HDR OLED panel, and the difference is night and day.

I also prefer to enjoy new things, instead of not enjoying new things. It gives me a positive energy that disgruntled gamers seem to be missing.

Maggoty,

I’m playing on a normal TV because I’m not made of money.

HEXN3T,

So you’re claiming new hardware isn’t perceivably better, despite not using a display which is actually capable of displaying said improvements. I use such a display. I have good vision. The quality improvement is extremely obvious. Just because not everyone has a high end display doesn’t mean that new hardware is pointless, and that everyone else has to settle for the same quality as the lowest common denominator.

My best hardware used to be Intel on-board graphics. I still enjoyed games, instead of incessantly complaining how stagnant the gaming industry is because my hardware isn’t magically able to put out more pixels.

The PS5 is a good console. Modern GPUs are better than older ones. Games look better than they did five or ten years ago. Those are cold, hard, indisputable facts. Don’t like it? Don’t buy it.

I do like it.

metaldream,

HFW runs like butter on PC, and personally I noticed a big difference between HZD and HFW on PS5.

ICastFist,

What those leaps do result in, however, is major performance gains.

Which many devs will make sure you never feel, by “optimizing” the game only for the most bleeding-edge hardware.

Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.

See, if games were made with a performance-first mindset, that’d be possible already. Not to dunk on performance gains, but there’s a saying that every time hardware gets faster, programmers make their code slower. I mean, you can totally play emulated SNES games with minimal impact compared to leaving the computer idling.

Saying “diminishing returns” is like saying that fire burns you when you touch it.

Unless chip fabrication can figure out a way to make transistors “stack” on top of one another, effectively making 3D chips, they’ll continue to be “flat” sheets that can only increase core count horizontally. Single-core frequency peaked in the early-to-mid 2000s; since then, it’s been about adding more cores. Even the gains from an RTX 5090 vs. an RTX 4090 aren’t that big. Now compare that with the gains from a GTX 980 vs. a GTX 1080.

starman2112,

The fact that the Game Boy Advance looks that much better than the Super Nintendo despite being a handheld, battery powered device is insane

SilentStorms,

Is it that much better? The colours just look more saturated to me

starman2112,

There’s noticeably more detail, especially along the coastline. Also, the more saturated colors improve contrast.

HEXN3T,

The GBA just has reworked art. The SNES could easily do the same thing.

amon,

Because most GBA games were meant to be desaturated due to the terrible screen

DODOKING38,

What game is the first one?

AngryCommieKender,

It appears to be a Final Fantasy game, so likely either 4 or 6 (aka 2 or 3 in the US).

ICastFist,

Final Fantasy 4 (2 in the US)

ParadoxSeahorse, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

tbf I went from Wii to PS4 and shit a brick

pjwestin,

Yeah, but the Wii was a very underpowered system, and it didn’t even have HDMI. That transition wouldn’t have been as stark going from PS3 to PS4.

A_Union_of_Kobolds,

Horizon Zero Dawn was a stunning game, I did pretty much the same

I’m kinda annoyed bc my 2 BFFs JUST got PlayStations like for Xmas. I’ve been on PS4+PS5 for a long while now and played both Horizons for free. I really wanted to tell them to give Zero Dawn a whirl just to show what the PS5 could do with it… but for full price? Eh… I’ll leave that up to them.

Cethin, in games: Emulating PS2 for my Steam Deck, would love any recommendations!

Dark Cloud 2 (aka Dark Chronicle). That and MGS3 are basically the only PS2 games I actively think about still.

Dark Cloud 1 is also worth playing, but I’d play 2 first. It smooths out a lot of issues with the first and adds so much more to it.

renegadespork, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

This is true of literally any technology. There are so many things that can be improved in the early stages that progress seems very fast. Over time, the industry finds most of the optimal ways of doing things and starts hitting diminishing returns on research & development.

The only way to break out of this cycle is to discover a paradigm shift that changes the overall structure of the industry and forces a rethinking of existing solutions.

The automobile is a very mature technology and is thus a great example of these trends. Cars have achieved optimal design and slowed to incremental progress multiple times, only to have the cycle broken by paradigm shifts. The most recent one is electrification.

Maggoty,

Okay then why are they arbitrarily requiring new GPUs? It’s not just about the diminishing returns of “next gen graphics”.

renegadespork,

That’s exactly why. Diminishing returns means exponentially more processing power for minimal visual improvement.

Maggoty,

I think my real question is: at what point do we stop trying, until researchers make another breakthrough?

DasSkelett,

Researchers can’t make a breakthrough if they don’t try ^^

Maggoty,

AAA game designers don’t need to be the researchers.

AdrianTheFrog,

That’s what game engines are for

Maggoty,

Great, let the game engine people go wild. We don’t need to try and build the next Far Cry with all of their beta tech though.

AdrianTheFrog,

Path tracing is a paradigm shift: a completely different way of rendering a scene from the one normally used. It’s just a slow and expensive one (it has existed for many years, but only recently started to become possible in real time due to advancing GPU hardware).

Yes, usually the improvement is minimal. That is because games are designed around rasterization and have path tracing as an afterthought. The quality of path tracing still isn’t great because a bunch of tricks are currently needed to make it run faster.
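To make the paradigm difference concrete, here’s a toy sketch of the core idea (everything in it is made up for illustration: one diffuse sphere, light coming only from above, no cosine weighting or camera model). Instead of rasterizing triangles, you shoot a ray per pixel, bounce it in a random direction at each surface, and average many noisy samples:

```python
# Toy Monte Carlo path tracer: one diffuse sphere, lit only from above.
import math, random

SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0  # sphere center and radius
ALBEDO = 0.7                                # diffuse reflectance

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def mul(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def norm(a):   return mul(a, 1.0 / math.sqrt(dot(a, a)))

def sky(direction):
    # directional "sky" light: bright above the horizon, dark below
    return 1.0 if direction[1] > 0 else 0.0

def hit_sphere(origin, direction):
    # nearest intersection t of a unit-direction ray with the sphere, or None
    oc = sub(origin, SPHERE_C)
    b = dot(oc, direction)
    disc = b*b - (dot(oc, oc) - SPHERE_R*SPHERE_R)
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def random_hemisphere(normal):
    # rejection-sample a random unit direction on the hemisphere around normal
    while True:
        v = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0 < dot(v, v) <= 1:
            break
    v = norm(v)
    return v if dot(v, normal) > 0 else mul(v, -1)

def trace(origin, direction, depth=3):
    t = hit_sphere(origin, direction)
    if t is None:
        return sky(direction)  # ray escaped: collect sky light
    if depth == 0:
        return 0.0             # path cut off before reaching a light
    p = add(origin, mul(direction, t))
    n = norm(sub(p, SPHERE_C))
    # diffuse bounce: pick one random direction and recurse (one noisy sample)
    return ALBEDO * trace(p, random_hemisphere(n), depth - 1)

# one pixel looking straight at the sphere: average many noisy path estimates
estimates = [trace((0, 0, 0), (0, 0, -1.0)) for _ in range(256)]
print(sum(estimates) / len(estimates))  # converges toward ~0.35 as samples grow
```

There’s no rasterizer anywhere in that loop, which is why the cost (and the noise-versus-samples tradeoff) scales so differently from traditional rendering.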

You could say the same about EVs actually, they have existed since like the 1920s but only are becoming useful for actual driving because of advancing battery technology.

Maggoty,

Then let the tech mature more so it’s actually analogous with modern EVs and not EVs 30 years ago.

AdrianTheFrog,

Yea, it’s doing that. RT is getting cheaper, and PT is not really used outside of things like Cyberpunk’s “RT Overdrive” mode, which is basically just for show.

Maggoty,

Except it’s being forced on us and we have to buy more and more powerful GPUs just to handle the minimums. And the new stuff isn’t stable anyways. So we get the ability to see the peach fuzz on a character’s face if we have a water-cooled $5,000 spaceship. But the guy rocking solid GPU tech from 2 years ago has to deal with stuttering and crashes.

This is insane, and we shouldn’t be buying into this.

AdrianTheFrog,

It’s not really about detail, it’s about basic lighting especially in dynamic situations

(Sometimes it is used to provide more detail in shadows I guess, but that is also usually a pretty big visual improvement)

I think there’s currently a single popular game where RT is required? And I honestly doubt a card old enough to not support ray tracing would be fast enough for whatever alternate minimum setting it would have had instead. Maybe the people with 1080 Tis are missing out, but there aren’t that many of them, honestly. I haven’t played that game and don’t know all that much about it; it might be a pointless requirement for all I know.

Nowadays budget cards support RT; even integrated GPUs do (at probably unusable speeds, but still).

I don’t think every game needs RT, or that RT should be required, but it’s currently the only way to get the best graphics, and it has the potential to completely change what is possible with the visual style of games in the future.

Edit: also, the vast majority of solid new GPUs started supporting RT six years ago, with the 20 series from Nvidia.

Maggoty,

That’s my point though: the minimums are jacked up well beyond where they need to be in order to cram new tech in and get 1 percent better graphics, even without RT. There’s not been any significant upgrade to graphics in the last 5 years, but try playing a 2025 AAA game with a 2020 graphics card. It might work, but it’s certainly not supported, and some games are actually locking out old GPUs.

AdrianTheFrog,

Often the lighting systems used require some minimum amount of processing power, and to create a lower graphics setting you would need a whole separate lighting technique

Obelix,

If you think about it, gaming GPUs have been in a state of crisis for over half a decade. First came shortages because everybody used them to mine bitcoin, then the COVID chip shortages happened, and now AI is killing cheaper GPUs. As a result, many people are stuck with older hardware, Steam Decks, and consoles, and haven’t upgraded their systems; those highly flammable $1000+ GPUs will not lead to everyone upgrading their PCs. So games are targeting older GPUs.

Andere, in games: Steam Deck Gaming News

Great summary. Thanks!

lazorne, in games: Steam Deck Gaming News

RetroDECK (an emulation suite known for its use on the Steam Deck) has had a recent update and subsequent blog post

Now that’s a post!

We hope to have 0.9.1b out sometime next week 😅

PerfectDark,

Thank you!!!

I’m still cross (not really tho) that when I reached out to you all last year, you never wanted me to interview you!!!

RetroDECK has been my one true emulation love for so long now, and I’ve adored all your recent updates. You should all be so proud! I went into a little more detail on RetroDECK for the users here in my latest news post on Lemmy, too!

<3

Kolanaki, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

Has anyone ever really noticed how samey everything looks right now? It’s a bit hard to explain, because it’s not the aesthetics of any kind of art style used, but the tech employed and how it’s employed. Remember how a lot of early 3D in film just looked like it was plastic? It’s like that, but with a wider variety of materials than plastic. Yet every modern game kinda looks like it’s made using toys.

Like, 20 years from now, I think it would be possible to look at any given game that is contemporary right now and tell by how it looks when it was made. The way PS1-era games have a certain quality to them that marks when they were made, or how games of the early 2000s are denoted by their use of browns and grays.

soloner,

My guess is a lot of convergence to a smaller set of known game engines: Godot, Unreal, Unity, plus a few others and some in-house ones like Valve’s Source.

I could be wrong, but I presume in the past almost every game was made with its own custom engine. Now a lot of them have the “Unreal Engine” look.

But I’m not complaining. Looks great to me and leads to better performance and fewer bugs in the long run. Of course there are some caveats

Kolanaki,

Oh yeah this isn’t a complaint, because I think it looks good. It’s just I notice it, and it probably is from almost everything being made on UE5 these days. However, I think MGSV was one of the first games to have this particular look to it, and that’s on its own in-house engine (FOX Engine). It could just be how the lighting and shadowing are done. Those two things are getting so close to photorealism that it’s the texturing and modeling work that puts things (usually human characters) into the uncanny valley. A scene of a forest can look so real… And then you put a person walking through it and the illusion is lost. lol

The_Picard_Maneuver,

Yes, definitely. It has to be that they’re all using the exact same engines and methods or something.

ysjet,

It’s everyone using UE-based mocap tools that causes the hyperrealistic-yet-puffy faces, is what I suspect he’s talking about, along with the same photogrammetry tools/libraries.

MudMan,

What do you mean, "everything"?

I wish this place was better for images, but... just pulling from my recently played list disproves this hard.

A_Union_of_Kobolds,

Horizon really shone in movement and how fluid the environment felt. It came out a long time ago now, though.

I thought it had a pretty good art direction for what it was

AdrianTheFrog,

Honestly, the biggest things missing in general lighting are usually rough specular reflections and small-scale global illumination, which are very hard to do consistently without raytracing (or huge light bakes).

Activision has a good technique for baking static light maps with rough specular reflections. It’s fairly efficient; however, it’s still a lot of data. Their recent games have apparently been in the 100-200 GB range, and I’m sure light bakes make up a good portion of that. It’s also not dynamic, of course.

So, what I’m saying is, raytracing will help with this, hardware will advance, and everyone will get more realistic looking games hopefully.

gandalf_der_12te,

Games look samey because game studios don’t have ideas anymore. They just try to sell 20 hours of playtime that are essentially empty. It’s literally just a bunch of materials and “common techniques” squashed into a sellable product. In the early days of gaming, people had ideas before they had the techniques to implement them. Nowadays we have techniques and think the ideas are unimportant. It’s uninspired and uninspiring. That’s why.

RightHandOfIkaros, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

Ironically, Zelda: A Link to the Past ran at 60fps, while Ocarina of Time ran at 20fps.

The same framerates are probably in the Horizon pictures below lol.

Now, Ocarina of Time had to run at 20fps because it had one of the biggest draw distances of any N64 game at the time. This was so the player could see to the other end of Hyrule Field, or across other large spaces. They had to sacrifice framerate, but for the time it was totally worth it.

Modern games sacrifice performance for an improvement so tiny that most people would not be able to tell unless they are sitting 2 feet from a large 4k screen.

JoYo,

when i was a smol i thought i needed to buy the memory expansion pack whenever OoT fps tanked.

Maalus,

Had to, as in “they didn’t have enough experience to optimize the games”. Same for Super Mario 64. Some programmers decompiled the code and made it run like a dream on original hardware.

RightHandOfIkaros,

The programming knowledge did not exist at the time. It’s not that they lacked experience; it was impossible for them to have knowledge that hadn’t been developed yet. You can’t really count that against them.

Kaze optimizing Mario 64 is amazing, but it would have been impossible for Nintendo to have programmed the game like that, because Kaze is able to use programming techniques and knowledge that literally did not exist when the N64 was new. It’s like saying the NASA engineers who designed the Atlas LV-3B launch vehicle were bad engineers or incapable of making a good rocket design just because of what NASA engineers could design today, with knowledge that did not exist in the ’50s.

CancerMancer,

One of the reasons I skipped the other consoles but got a GameCube was because all the first party stuff was buttery smooth. Meanwhile trying to play shit like MechAssault on Xbox was painful.

RightHandOfIkaros,

I never had trouble with MechAssault, because the fun far outweighed infrequent performance drops.

I am a big proponent of a 60fps minimum, but I make an exception for consoles from the 5th and 6th generations. The technical leap and improvement, both in graphics technology and in gameplay innovation, far outweigh any performance dips that came as the cost of such improvement. The 7th generation is on a game-by-game basis, and personally, the 8th generation (Xbox One, Switch, and PS4) is where it became completely unacceptable to run even a single frame below 60fps. There is no reason that target could not have been met by then, let alone now. The Switch was especially disappointing here, since Nintendo made basically a 2015 mid-range smartphone but then tried to make games for a real game console, with performance suffering massively as a result. 11fps, docked, in Breath of the Wild’s Korok Forest or in Age of Calamity (anywhere in the game, take your pick) is totally unacceptable, even if it only happened once rather than consistently.

thisismyhaendel,

I’m usually tolerant of frame drops, especially when they make hard games easier (like on the N64), but I agree it has gotten much worse on recent consoles. Looking at you, Control on PS4 (with all the frame drops, it seems like it should just have been a PS5 game; even unpausing freezes the game for multiple seconds).

Regrettable_incident, in games: Steam Deck Gaming News

Thanks for taking the time to put this together. I dunno if you are aware, but there’s also a Steam Deck community here, and they’d probably be happy if this was posted there too.

scrubbles, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

And they’re shocked that no one bought the PS5 Pro for $800.

atomicbocks, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

The improvements are the same absolute size they used to be. It’s just that adding 100 MHz to a 100 MHz processor doubles your performance, while adding 100 MHz to a modern processor adds comparatively little.
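A toy back-of-the-envelope (numbers made up) of why the same absolute bump lands so differently:

```python
# toy numbers only: the same +100 MHz bump, then vs. now
old_clock = 100    # MHz, roughly a late-90s CPU
new_clock = 5000   # MHz, roughly a modern boost clock

print((old_clock + 100) / old_clock)  # 2.0  -> performance roughly doubles
print((new_clock + 100) / new_clock)  # 1.02 -> about a 2% bump
```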

bjoern_tantau,

Well, that’s what Moore’s Law was for. Processing power does increase massively with each generation. It’s just that at this point better graphics are less noticeable. There is not much difference to the eye between 100,000 and a million or more polygons.

We’ve basically reached the top. Graphics fidelity is just down to what the artists do with it.

Takumidesh,

I disagree (that we have reached the top).

Go watch a high-budget animated movie (think Pixar or Disney) and come back when real-time rendered graphics look like that.

Yea, games look good, but real-time rendering is still not as good as pre-rendered (and likely never will be). Modern games are rife with clipping and fakery.

If you watch the Horizon Forbidden West intro scene (as an example) and look at the details, how hair falls on characters’ shoulders, how clothing moves in relation to bodies, etc., and compare it to something like Inside Out 2, it’s a world of difference.

If we can pre-render it, then in theory it’s only a matter of time before we can render it in real time.

gandalf_der_12te,

If we can pre-render it, then in theory it’s only a matter of time before we can render it in real time.

Not really, because pre-renders are often optimized to only look good from one side. If you try to make a 3D model out of it and render that in real time in the game world, it might look ugly or weird from another angle.

Takumidesh, (edited)

Any given frame is just looking at something from one side, though; this is the case for video games as well, and it’s part of the reason why real-time rendering is so much slower. It’s an art and game-direction challenge to make things look good however you want, not a technical limitation (in the sense that you can make a video game look like a Pixar movie does today; it’s just going to render at days per frame instead of frames per second).

There isn’t really a conceptual difference between rendering a frame with the intent to save it and play it back later, and rendering a frame with the intent to display it as soon as it’s ready and dispose of it.

Toy Story 1 took days to render a single frame; now it could be rendered on a single home GPU at 24 fps, no problem, which would be real-time rendering.
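Back-of-the-envelope on the scale of that gap, assuming (purely for illustration) two days per frame for the original offline renders:

```python
# rough scale of the offline-to-real-time gap (illustrative numbers)
offline_seconds_per_frame = 2 * 24 * 3600  # assume ~2 days per frame
realtime_seconds_per_frame = 1 / 24        # a 24 fps target

print(offline_seconds_per_frame / realtime_seconds_per_frame)  # ~4.1 million x
```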

To clarify my first paragraph: the challenge is not that it is impossible to render a video game with movie-like graphics, it’s that the level of effort is higher because you don’t have the optimizations, and so art direction needs to account for that.

As far as considering unexpected behaviors, that is technically only a concern in pseudo-nondeterministic environments (e.g., dynamic physics rendering) where the complexity and number of potential outcomes is very high and hard to account for. This is a related issue but not really the same one, and it is effectively solved with more horsepower, the same as rendering.

I think the point you were making is that, potentially, artistic choices that are deliberately made can’t always be done in real time, which I could agree with. Something like ‘oh, this character’s hair looks weird the way it falls, let’s try it again and tweak this or that.’ That is afforded by the benefit of trial and error, and can only be replicated in real time by more robust physics systems.

Ultimately the medium is different, and while they are both technically deterministic, something like a game has potential for unwanted side effects. However, pseudo-nondeterminism isn’t a prerequisite for a game. The example that comes to mind is real-time rendered cutscenes. They aren’t fundamentally different from a movie in that regard, and most oddities in them are the result of bugs in the rendering engine rather than technical impossibilities. Similar bugs exist in 3D animation software; it’s just that Hollywood movies have the budget and attention to detail to fix them, or the luxury to try again.

I’ll end with this: given sufficient hardware, like the Pixar server farm, there is nothing that says they couldn’t render Luca or whatever in real time, or even faster than real time.

kitnaht, in gaming: Small, incremental improvements don't make shockwaves like the old massive tech leaps used to.

Kind of like smartphones. They all kind of blew up into this rectangular slab, and…

Nothing. It’s all the same shit. I’m using a OnePlus 6T from 2018, and I think I’ll have it easily for another 3 years. Things eventually just stagnate.

paraphrand,

What do you expect next? Folding phones? That would be silly!

user224,

I was hoping that eventually smartphones would evolve to do everything. Especially when things like Samsung DeX were introduced, it looked to me like maybe in the future phones could replace desktops, running a full desktop OS when docked and a simplified mobile UI + power saving in mobile mode.

But no, I only have a locked-down computer.

Trainguyrom,

Yeah, whatever happened to that? It was such a good idea and could have been absolutely game-changing if it had actually been marketed to the people who would benefit the most from it.

pufferfisherpowder,

I used it for a while when I worked two jobs. I’d clock out of job 1 (I had an agreement with them to be allowed to use the screen and input devices at my desk for job 2), then I’d plug in my Tab S8 and get to work, instead of having to carry two chunky laptops.
So it still exists! What I noticed is that a Snapdragon 8 Gen 1 feels underpowered and that Android (and this is the bigger issue) does not have a single browser that works as a full-fledged desktop version. All browsers I tested had some shortcomings, especially with drag and drop or context menus. Things work, but you’re constantly reminded that you’re running a mobile OS: weird behavior, oversized context menus, and so on.

I wish you could launch into a Linux VM instead of the DeX UI. Or that Samsung would double down on the concept. The Motorola Atrix was so ahead of its time. Like your phone transforming into your tablet, into your laptop, into your desktop. How fucking cool is that?
Apple would be in a prime position: their entire ecosystem is now ARM-based and they have chips with enough power. But it’s not their style to do something cool that threatens their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?

Trainguyrom,

It’s super easy to forget but Ubuntu tried to do it back in the day with Convergence as well, and amusingly this article also compares it to Microsoft’s solution on Windows Phone. It’s a brilliant idea but apparently no corporation with the ecosystem to make it actually happen has the will to risk actually changing the world despite every company talking about wanting an “iPhone moment”

Apple would be in a prime position: their entire ecosystem is now ARM-based and they have chips with enough power. But it’s not their style to do something cool that threatens their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?

Let’s be real, Apple’s biggest risk would be losing the entire student and young professional market by actually demonstrating that they don’t need a MacBook Pro to use the same 5 webapps that would work just as well on a decent Chromebook (if such a thing existed).

user224,

Linux VM

Or just something like Termux, a terminal emulator for Android. Example screenshot (XFCE desktop over VNC server), I didn’t know what to fit in there:
https://files.catbox.moe/zr7kem.png

Full desktop apps, running natively under Android. For better compatibility, Termux also has proot-distro (similar to chroot) where you can have… let me copy-paste:


```
Supported distributions (format: name < alias >):

  * Alpine Linux < alpine >
  * Arch Linux < archlinux >
  * Artix Linux < artix >
  * Chimera Linux < chimera >
  * Debian (bookworm) < debian >
  * deepin < deepin >
  * Fedora < fedora >
  * Manjaro < manjaro >
  * openKylin < openkylin >
  * OpenSUSE < opensuse >
  * Pardus < pardus >
  * Ubuntu (24.04) < ubuntu >
  * Void Linux < void >

Install selected one with: proot-distro install <alias>
```
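If you want to try it (assuming a stock Termux install): `pkg install proot-distro`, then `proot-distro install debian` and `proot-distro login debian` drop you into a Debian shell.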

Though there is apparently some performance hit. I just prefer Android, but maybe you could even run full LibreOffice under some distro this way.

If it can be done by Termux, then someone like Samsung could definitely make something like that too, but integrated with the system and with more software available in their repos.

What’s missing from the picture but is interesting too: an NGINX server (reverse proxy, lazy file sharing, serving wget-mirrored static websites), kiwix-serve (serving ZIM files, including the entire Wikipedia, from the SD card), and Navidrome (music server).
And all brought to any internet-connected computer via a Cloudflare Quick Tunnel (because it needs neither an account nor a domain name). Mobile data upload speed will finally matter, a lot.

You get the idea, GNU+Linux. And Android already has the Linux kernel part.

pufferfisherpowder,

Yeah, I remember trying it, and while it works, the performance hit was too big for my use case. But it’s been a while!

Fortunately I’m in a position where I don’t have to juggle two jobs anymore, so I barely use DeX these days.
Which in reverse is also why Samsung isn’t investing a lot into it, I suppose: it’s a niche use case. I would guess that generally people with a desktop setup would want something with more performance than a mobile chip.

mrvictory1,

Linux on DeX was a thing, but it was killed by Samsung.

AdrianTheFrog,

There is an official Android desktop mode; I tried it, and it isn’t great, ofc, but my phone manufacturer (OnePlus) has clearly put no work into making it functional.

MonkderVierte,

Maybe make the rectangular slab… smaller again?

CancerMancer,

I would love to have a smaller phone. Not thinner, smaller. I don’t care if it’s a bit thick, but I do care if the screen is so big I can’t reach across it with one hand.

secret300,

I miss physical keyboards on phones

starman2112,

One company put a stupid fucking notch in their screen and everyone bought that phone, so now every company has to put a stupid fucking notch in the screen

I just got my tax refund. If someone can show me a modern phone with a 9:16 aspect ratio and no notch, I will buy it right now

MonkderVierte,

You can easily keep a phone for 7 years.

mrvictory1,

The OnePlus 6 line of phones is one of the very few with good Linux support; I mean, GNU/Linux support. If custom ROMs no longer cut it, you can get even more years out of it with Linux. I had an iPhone, was eventually fed up, got an Android, aaand I realized I am done with smartphones lol. Gimme a laptop with phone stuff (push notifications w/o killing battery, VoLTE) and my money is yours, but no such product exists.

inlandempire, in games: Steam Deck Gaming News

Nice, thanks for all the effort, keep them coming!

plant_based_monero, in games: Steam Deck Gaming News

People like you make lemmy great :)

nickwitha_k, in games: Steam Deck Gaming News

As a preface, I used to do this a lot on Reddit. My hobby (sounds odd) was to make a little old-school-blog-style post, detailing what I found interesting in gaming in the last week or so. I got a name for it, for a time, but having long since abandoned Reddit, I thought I might try the same thing here, if you’ll indulge me!

Happy to have you posting!

VitoRobles,

Agreed! There’s apparently a huge ecosystem that I wasn’t aware of!
