Any true 2D game, because the console was designed for 2D games. The SuperFX chip used for Star Fox was also used for Yoshi’s Island, which did maintain 60Hz.
It’s almost like double frame buffers for 720p or larger, 16-bit PCM audio, memory-safe(ish) languages, streaming video, security sandboxes, rendering fully textured 3D objects with a million polygons in real time, etc. are all things that take up CPU and RAM.
I will run any game at 60fps if it was designed for this exact machine that does nothing but play games designed for it, and is also 16-bit with pixel graphics, and also has low-quality audio, and also fits in the memory of the cartridge.
I didn’t realize web browsing in Chrome required fully textured 3D objects. Not to mention playing 720p video with PCM audio in a separate app doesn’t grind everything to a halt.
Well, the GPU doesn’t care if it’s 2D or 3D, but you are rendering a whole bunch of textured triangles… (separated into tiles for fast partial or multithreaded re-rendering), and also just-in-time rasterizing fonts, running a complex constraint solver to lay out the UI, parsing three completely separate languages, communicating over multiple complex network protocols, and doing a whole bunch of interprocess communication in order to sandbox stuff.
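To make the tiling bit concrete, here’s a toy sketch of why compositors split a page into tiles. This is my own illustration, not Chromium’s actual code; the tile size and all the names (`TILE`, `Rect`, `dirtyTiles`) are made up:

```typescript
// Toy dirty-tile invalidation, loosely in the spirit of a browser compositor.
const TILE = 256; // tile edge in pixels; a common compositor choice

interface Rect { x: number; y: number; w: number; h: number; }

// Which (col, row) tiles overlap a damaged region? Only those get
// re-rasterized, instead of repainting the whole layer.
function dirtyTiles(damage: Rect): Array<[number, number]> {
  const tiles: Array<[number, number]> = [];
  const x0 = Math.floor(damage.x / TILE);
  const y0 = Math.floor(damage.y / TILE);
  const x1 = Math.floor((damage.x + damage.w - 1) / TILE);
  const y1 = Math.floor((damage.y + damage.h - 1) / TILE);
  for (let row = y0; row <= y1; row++) {
    for (let col = x0; col <= x1; col++) {
      tiles.push([col, row]);
    }
  }
  return tiles;
}

// A blinking 20x20 cursor invalidates a single 256x256 tile, not the page:
console.log(dirtyTiles({ x: 300, y: 40, w: 20, h: 20 })); // [[1, 0]]
```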
WebGL means the browser has access to the GPU. Also, the whole desktop tends to be rendered as a 3D space these days. It makes things like scaling and blur effects easier, among other benefits.
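If anyone doubts the “access to the GPU” part, getting a context takes a few lines of plain browser API (the canvas id `view` here is just an assumed example):

```typescript
// Standard WebGL2 bootstrapping; the gl.* calls below are executed by
// the GPU driver, not the page's JavaScript thread.
const canvas = document.getElementById('view') as HTMLCanvasElement;
const gl = canvas.getContext('webgl2');
if (gl) {
  gl.clearColor(0.1, 0.1, 0.1, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);
}
```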
The Super Nintendo’s interlaced video mode was almost never used. It could output 60Hz and more often than not did.
Only some games had a limited framerate, for various reasons: Another World was limited by cartridge RAM, and Star Fox by the power of the SuperFX. Yoshi’s Island also used the SuperFX and wasn’t limited like Star Fox was. Occasionally there was slowdown if a developer put too much on screen at once, but these moments were brief, similar to a game today hitching while it loads a new area during gameplay.
I’m just being nitpicky because you are using CRT interchangeably with television. CRTs are used in TVs but aren’t interlaced unless the circuitry driving them sends an interlaced signal. So no, interlacing is not native to CRTs; it only happens when they’re fed an interlaced signal. If I plugged a Nintendo into my old ViewSonic CRT, I wouldn’t get a picture, because it didn’t support interlaced NTSC input.
It’s like saying interlacing is native to LCDs. LCD TVs accept interlaced signals; the LCD panel itself doesn’t.
> I’m just being nitpicky because you are using CRT interchangeably with television.
That was intentional on my part, for the sake of the audience and good communication. You’re technically correct, but without a paragraph of tangential and irrelevant explanation your audience isn’t going to understand you. In modern parlance, a “television” isn’t the CRT appliance; it’s any appliance that shows the moving pictures and sound of television programming. If you walk into any store today and buy a TV, you’re going to get an LCD, AMOLED, or quantum-dot display. None of those are CRTs, yet everyone born after about 2002 will associate a TV with a flat-panel, non-CRT display.
> So no, interlacing is not native to CRTs; it only happens when they’re fed an interlaced signal.
And in nobody’s mind was the vision of plugging a SNES into a computer-monitor CRT. You introduced that idea only to show how it’s wrong. You win at pedantry, but lose at communication.
If someone says to you “I’m watching TV”, do you poke your head around the back of the unit to make sure it has a tuner in it and if it doesn’t you quip back to correct them “You’re not actually watching a TV, you’re watching a monitor. A TV requires a tuner, which this unit does not have, making it a monitor, not a TV”?
It is 59.94 fields per second, translating into 29.97 FPS. Interlaced video is fun. The reason it’s not a round 60 or 30 FPS is to maintain compatibility with black-and-white sets.
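For anyone who wants the arithmetic: when color was added, the line rate was nudged so the color subcarrier wouldn’t beat against the 4.5 MHz sound carrier, and the odd rates fall straight out of that (standard NTSC derivation, sketched here):

```latex
f_{\mathrm{line}} = \frac{4.5\,\mathrm{MHz}}{286} \approx 15{,}734.27\,\mathrm{Hz},
\qquad
f_{\mathrm{field}} = \frac{f_{\mathrm{line}}}{262.5} = \frac{60000}{1001} \approx 59.94\,\mathrm{Hz},
\qquad
f_{\mathrm{frame}} = \frac{f_{\mathrm{field}}}{2} = \frac{30000}{1001} \approx 29.97\,\mathrm{Hz}
```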
240p uses each field as a frame, though, while still staying compatible with NTSC. This is what most pre-6th-generation consoles used (same with PAL, but 288p at 50 FPS).
At 480i. SNES used 240p, which is technically not standard NTSC, but compatible. Nintendo called this “double strike”, since each field would display in the same location.
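The nonstandard-but-compatible trick is just the half line: interlace exists because each 262.5-line field starts half a line offset from the previous one, so consoles drop the half line and every field lands on the same scanlines. Roughly (real consoles’ exact master clocks put them a hair off these spec numbers):

```latex
\frac{f_{\mathrm{line}}}{262.5\ \text{lines}} \approx 59.94\,\mathrm{Hz}\ \text{(interlaced)}
\quad\longrightarrow\quad
\frac{f_{\mathrm{line}}}{262\ \text{lines}} \approx 60.05\,\mathrm{Hz}\ \text{(240p ``double strike'')}
```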
Sure, the output was technically 30 “frames” per second, but most games updated 60 times a second, even SMB on the NES. You only saw one half of what the console rendered internally, which is an output issue, not a rendering one.
Add 480p on top and you get 60 full frames per second, with every field being a complete frame.
Because the moon landing was rendered by the TV station and your TV only showed the end result. You can do the same with GeForce NOW or any other streaming service.
Unless this mod adds the N-word, it’s really not. There’s a lot of racism in the Elder Scrolls universe, but it’s there for worldbuilding rather than as thinly veiled stereotypes of real races like in Harry Potter*. Pretty much every race has issues with the others, just like they would in the real-life equivalent of the time period.
It’s not at all the same as other mods that add real-world bigotry into the game.
*One could argue that Khajiits are analogous to the Romani people, but it’s a stretch.