I think it would be tough to nail down one thing. There are the clear comparisons to Victoria 2, which I haven’t played, but my understanding is that 2 is more “detailed” in its simulation of some things. There will always be people who don’t like changes from the last game. The military aspect is a lot less engaging than something like Hearts of Iron, but I think the intent there was to keep the focus on the economic and political sides of things. Warfare received a minor overhaul around when I first tried the game that I’ve heard made things better, but it can still be a little frustrating at times.
Most of the complaints about the economic side that’s meant to take center stage are that your economy’s success boils down to how many construction points you can have going at once. That’s true, but I do like that you can’t pour everything into construction without balancing the foundation needed to support it, and doing only that can limit growth in other areas, like improving citizens’ lives, which can complicate your political affairs.
I feel like I’ve gotten a little lost in the weeds here. Overall, I think it has mixed reviews because Victoria 3 is still a work in progress. It’s a work in progress that I enjoy very much, but there is still room for improvement. I kind of fell off Stellaris between the Nemesis and Overlord expansions because it felt kind of bloated and repetitive, and I wasn’t wondering what kind of civilization I could play anymore. Victoria 3 has been successful at making me contemplate how I can manipulate the mechanics to achieve a specific outcome, even when I’m not playing.
With menu games like the ones Paradox makes, you gotta learn by playing the game. And by playing the game, I of course mean pausing the game every minute or two to spend way more minutes reading the tooltips, the tooltips within those tooltips, and then finding your way to a new menu you didn’t know existed referenced by those tooltips so you can read more tooltips!
It’s a beautiful cycle, and Victoria 3 has sucked me in as much as Stellaris did 7 years ago. If you have any questions or thoughts, I’d love to hear them!
Tatsu from Xenoblade Chronicles X is a really annoying little dude. I watched my buddy play through, and every time Tatsu said anything he’d tell the TV “shut up Tatsu.” It’s arguably more aggravating because the game seems aware of how annoying he is, since one of the main characters is constantly suggesting she cook him into a dish to eat. I’d say that would be the best outcome.
Saying you were 13/14 when horse armor came out doesn’t help your case arguing against their comment. It just means you were prime gaming age when DLC, map packs, and smaller content were replacing larger expansions. The acceptance of those (which, based on your demographic, you probably did accept) made it easier to transition to more and more egregious microtransactions.
There used to be (maybe still are) complete games released on mobile. They usually cost $6.99 and didn’t need more. If they want Elden Ring on mobile without tarnishing its reputation, they could sell a complete experience for $10 or $15, since it’s been a decade since those $6.99 prices. A complete experience at a flat price is what Elden Ring was, and it was widely praised. It’s what the rest of their games have done, and that has turned out well for them.
There may be servers for the multiplayer, but based on the fact that none of the other FromSoft games charged for it, the cost must be minimal.
It can be, depending on what you like. You have a flying drone to help you that isn’t in multiplayer, because there you all have different abilities to cover each other’s weaknesses.
Personally, I think single-player gets stale and lonely quickly; it’s just a lot more fun panicking and overcoming challenges with friends.
It’s one of my favorite games, and now is a good time to play it! It gives me a similar feeling to Halo, where humanity itself is on the back foot and nearly extinct the whole time, yet enduring as best it can. The difference being that you’re controlling a city fighting the snowpocalypse rather than a cyber-soldier fighting aliens.
I remember playing Assassin’s Creed II on PC with a 9500 GT and getting sub-20 fps constantly, to the point I had to wait for character animations to catch up with the dialogue so the next person could talk. Halfway through the game I upgraded to a GTX 560 and was astounded that everything was in sync and oh so smooth. I always remember that when I start getting annoyed I can’t get over 90 fps in a game. As long as it’s playable!
For me it depends on the game. A menu game from Paradox like Crusader Kings? 4k 60fps. A competitive shooter? Ideally the max resolution (for greater pinpoint accuracy) and 144fps, but between the two I’d want maximum fps for the reaction speed and responsiveness. A pretty game like Ori and the Will of the Wisps? Crank the graphics up and I’m happy with 60fps.
That article states people can perceive images as rapidly as one every 13 milliseconds, which they math out to roughly 75 fps, 25% higher than 60.
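Just to spell out that arithmetic (the 13 ms figure is from the study; the rest is plain unit conversion):

```python
# Convert the study's fastest perception interval into an equivalent frame rate.
ms_per_image = 13                      # fastest display interval tested in the study
fps_equivalent = 1000 / ms_per_image   # ~76.9 fps; the article rounds down to 75

print(f"{fps_equivalent:.1f} fps")     # ~76.9 fps

# Sanity check on the article's "25% higher than 60" framing:
print(75 / 60 - 1)                     # 0.25, i.e. 75 fps is 25% above 60 fps
```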
Looking at the study itself, they were testing whether participants could pick out a picture that displayed for 13-80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image as low as the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.
What this tells me is that your source has nothing to say about anything over 75 fps. It was also testing in a fundamentally different environment than a video game, where your brain constantly expects an image similar to and stemming from the one before it, rather than a completely different image. If you were to draw conclusions from the study despite those differences, it would suggest that knowing what to look for, as your brain does while gaming, makes you better able to pick out individual frames. So I’d say your source does not support your assertion, and that in a game you could perceive frame rates higher than 75 fps at a minimum.
From my own knowledge, there’s also a fundamental difference between perceiving reality and computer screens in the form of motion blur. Objects moving in the real world leave a faint blur behind as you perceive them, which your brain uses to fill in any blanks it may have missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it “bend.” Movies filmed at 24 fps capture this minute motion blur as they film, which makes it easier for our brains to watch them despite the lower frame rate. Real-time rendered video games do not have this effect, as there are no afterimages to fill in the blanks (unless you turn on motion blur, which doesn’t do a good job emulating this).
This means video games need to compensate, and the best way to do that is more frames per second so your brain doesn’t need to fill in the blanks with the motion blur it’s used to seeing in the real world. You’ll obviously get diminishing returns from the same increase, but there will still be returns.
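Those diminishing returns fall out of the arithmetic directly, since frame time is just 1000 ms divided by fps:

```python
# Each doubling of fps halves the frame time, but the absolute
# milliseconds saved shrink every time - that's the diminishing return.
for fps in (30, 60, 120, 240):
    frame_time = 1000 / fps  # ms each frame stays on screen
    print(f"{fps:>3} fps -> {frame_time:5.2f} ms per frame")

# Going 30 -> 60 fps cuts ~16.7 ms off each frame;
# going 60 -> 120 cuts only ~8.3 ms, and 120 -> 240 only ~4.2 ms.
```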
This is not a delay. They are updating the window from “Early 2023-24,” which the article states could mean anytime from this past year to the end of their fiscal year at the end of March, to “Q4 2023-24,” which also ends at the end of March. So there’s no real change (yet) to the latest it could release.