So to summarize, you're saying the performance overhead, no matter how small, is intolerable?
I think that argument, if that is indeed your argument, is reductive. For one, not all hardware is equally taxed: you can use the GPU for frame gen rather well in GSGs, since they tend not to use the GPU heavily.
Why on earth would you even want to use upscaling if the game isn't taxing your GPU heavily? The point of upscaling, afaik, is to run the game at a lower internal resolution, which is less taxing for the GPU and therefore lets you run at a higher FPS.
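To put rough numbers on that, here's a quick sketch of how much pixel work upscaling saves. The preset names and scale factors below are illustrative only; the exact names and ratios vary by vendor.

```python
# Rough sketch (illustrative only): how much GPU pixel work upscaling saves.
# Preset names and scale factors are made up to resemble common ones.

def render_pixels(output_w, output_h, scale):
    """Pixels the GPU actually shades per frame at a given upscale factor."""
    return int(output_w * scale) * int(output_h * scale)

native = render_pixels(3840, 2160, 1.0)
for name, scale in [("Quality", 2 / 3), ("Balanced", 0.58), ("Performance", 0.5)]:
    rendered = render_pixels(3840, 2160, scale)
    print(f"{name}: {rendered / native:.0%} of native pixel work")
```

At a 0.5 scale factor the GPU shades only a quarter of the pixels, which is where the FPS headroom comes from.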
Also, a locked frame rate seems to be a thing of the past, at least insofar as it being necessary goes.
It is one of the go-to solutions for improving game speed in Vic 3. In the Vic 3 benchmark thread, people have reported a fairly big impact:
There's been a lot of discussing Victoria 3's performance. Some people say it runs perfectly smooth the whole way through, others say it's unplayable after X number of years. Presumably the truth is somewhere in the middle. I decided to try and...
forum.paradoxplaza.com
As for the Vic 3 comment, I can't really give input there since I'm not a game dev, and it isn't strictly relevant to my counterclaim because it's an external application. But if this were designed for the game, surely that would be accounted for.
The upscaling being done by an external application is completely irrelevant. The problem is that when you use upscaling, you reduce the resolution the game renders at.
I think the whole conversation is probably moot, though, as I'd give it less than a 5 percent chance that PDX implements frame gen natively in their games. I'll just go back to using Lossless Scaling when the game runs poorly in the mid to late game, just like with every one of their games in recent memory.
The real performance issue in Paradox games is the simulation speed, not FPS. Them implementing native frame generation and/or upscaling won't do anything to improve that.
Frame generation's UI issues mostly arise when a UI element is surrounded by fast-moving scenery, like a shooter's aim reticle.
No. The primary problem is predicting what the numbers should be. Separating the UI from the rest of frame generation could potentially address that issue for Paradox games, but from what I understand, Nvidia's frame generation currently does not allow excluding UI elements from generation. Paradox could of course support only AMD and Intel frame generation, but that would probably cause a bigger uproar than not supporting frame generation at all.
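For illustration only (this is not any vendor's actual API), here's a toy sketch of the idea of compositing the UI after interpolation, so text and numbers are never guessed by the interpolator:

```python
# Toy illustration (not any real frame-gen API): interpolate between two
# scene frames, then composite the UI layer afterwards, so UI numbers and
# text are never "predicted" by the interpolator.

def interpolate(frame_a, frame_b, t=0.5):
    """Naive per-pixel blend standing in for motion-based frame generation."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

def composite_ui(scene, ui):
    """Overlay UI pixels (None = transparent) on top of the generated frame."""
    return [s if u is None else u for s, u in zip(scene, ui)]

scene_a = [0.0, 0.2, 0.4]
scene_b = [1.0, 0.6, 0.4]
ui      = [None, 0.9, None]  # only the middle pixel carries a UI element

generated = composite_ui(interpolate(scene_a, scene_b), ui)
print(generated)  # the UI pixel stays exact; scene pixels are interpolated
```

The point is the ordering: the generated frame is purely scene data, and the UI is stamped on afterwards, which is exactly the separation the current APIs reportedly don't expose.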
FPS very much is an issue in Paradox games, and it was further demonstrated in the EU5 preview videos, where the game ran like a slog.
The primary performance issue in all Paradox games is the game speed, not FPS.
The game doesn't magically run slower because there's less demand on GPU computation. Game logic will tax whatever the CPU can provide first, and whatever remains is diverted to drawing the scene. That's why the frame rate drops the faster the game runs: the game takes from the CPU all it can to run fast, at the expense of graphics.
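To make that claim concrete, here's a toy model of the mechanism being described. The numbers and the single-threaded fixed-frame-budget assumption are entirely made up, and the reply below disputes that this is how the engine actually works; it's just the claim in code form.

```python
# Toy model of the claim above: if each frame must run its simulation ticks
# plus a render pass on the same budget, more ticks per frame means fewer
# frames per second. All costs are invented for illustration.

def fps(sim_ticks_per_frame, tick_cost_ms=2.0, render_cost_ms=4.0):
    """FPS when a frame = (sim ticks * tick cost) + one render pass."""
    frame_ms = sim_ticks_per_frame * tick_cost_ms + render_cost_ms
    return 1000.0 / frame_ms

for speed, ticks in [("speed 1", 1), ("speed 3", 4), ("speed 5", 10)]:
    print(f"{speed}: {fps(ticks):.0f} FPS")
```

Under this model, cranking the game speed up monotonically drags the frame rate down, which matches the observation even if the real engine's scheduling is more sophisticated.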
That's not how things work, but there is nothing magical about it. I would suggest reading this old Vic 3 dev diary for some enlightenment:
Hello and welcome to this week's Victoria 3 dev diary. This time we will be talking a bit about performance and how the game works under the hood. It will get somewhat detailed along the way and if you are mostly interested in what has improved...
forum.paradoxplaza.com
Considering that EUV basically runs on the same engine, and that there is no good way around the cause of the issue, there is no way EUV won't face some of the same challenges when it comes to FPS and simulation speed.
Where would a GSG even use floating point? Aren't all the values in ranges that can be handled as integers (even if a decimal point is added for the UI)? And integer math, at least on the CPU, is much faster than floating point.
Fair point.