Why on earth would you even want to use upscaling if the game isn't taxing your GPU heavily? The point of using upscaling, afaik, is to run the game at a lower internal resolution, which is less taxing for the GPU and therefore lets you run at a higher FPS.
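To put rough numbers on it, here is a minimal sketch of the resolution math; the 0.67 per-axis scale is an assumption that roughly matches a typical "quality" preset, and the real ratios vary by upscaler and mode:

```python
# Minimal sketch: how upscaling trades render resolution for GPU load.
# The 0.67 per-axis scale roughly matches a typical "Quality" preset;
# actual ratios vary by upscaler and mode (assumption for illustration).

def render_resolution(output_w: int, output_h: int, scale: float = 0.67):
    """Internal resolution the GPU actually renders before upscaling."""
    return int(output_w * scale), int(output_h * scale)

out_w, out_h = 3840, 2160                     # 4K output
in_w, in_h = render_resolution(out_w, out_h)  # ~2572x1447 internal
pixel_ratio = (in_w * in_h) / (out_w * out_h)
print(f"Rendering {in_w}x{in_h}, ~{pixel_ratio:.0%} of the output pixels")
# Fewer shaded pixels per frame -> less GPU work -> higher FPS,
# assuming the GPU (and not the simulation) is the bottleneck.
```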

It is one of the go-to solutions for improving performance in Vic 3. In the Vic 3 benchmark thread people have reported a fairly big impact:

The upscaling being done by an external application is completely irrelevant. The problem is that when using upscaling you are reducing the resolution the game renders at.

The real performance issue in Paradox games is the simulation speed, not FPS. Implementing native frame generation and/or upscaling won't do anything to improve that.

No. The primary problem is predicting what the numbers should be. Separating the UI from the rest of the frame generation could potentially address that issue for Paradox games, but from what I understand, Nvidia's frame generation does not currently allow UI elements to be excluded from the generated frames. Paradox could of course support only AMD and Intel frame generation, but that would probably cause a bigger uproar than not supporting frame generation at all.

The primary performance issue in all Paradox games is the game speed, not FPS.

That's not how things work, but there is nothing magical about it either. I would suggest reading this old Vic 3 dev diary for some enlightenment:
Considering that EU5 basically runs on the same engine, and that there is no good way around the root cause of the issue, there is no way EU5 will avoid some of the same challenges when it comes to FPS and simulation speed.

Fair point.
Ah, we were discussing different scopes. I was specifically talking about frame interpolation, with no upscaling at all. PDX games often start dropping frames when sim speed is the bottleneck; play on speed 5, for example, and your frame rate will usually tank.
I think rendering is tied to sim speed, so if your game starts to stutter, interpolation can smooth those drops out.

As for the resources, I'll have to check those out another time, thanks for providing them.

Honestly, the frame drops and stutter are one of the biggest reasons I stop playing games like this, so interpolating frames has really helped me out in that respect, even if the game slows down.
 
PDX games often start dropping frames when sim speed is the bottleneck; play on speed 5, for example, and your frame rate will usually tank.
I think rendering is tied to sim speed, so if your game starts to stutter, interpolation can smooth those drops out.
But running frame generation when your frame rate is tanking will just introduce massive input delay, which in a mouse-driven, menu-heavy game is very annoying.
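As a rough, hedged illustration of why: interpolation-based frame generation has to hold back one real frame to interpolate between it and the next, so the added latency is on the order of one base frame time (ignoring the generation overhead itself):

```python
# Rough sketch of added latency from interpolation-based frame generation.
# Assumption: the interpolator must buffer one real frame, so it adds
# roughly one base-frame-time of delay (overhead ignored for simplicity).

def added_latency_ms(base_fps: float) -> float:
    """Approximate extra input latency from holding back one real frame."""
    return 1000.0 / base_fps

for base_fps in (60, 30, 15):
    print(f"{base_fps:>3} fps base -> ~{added_latency_ms(base_fps):.0f} ms extra latency")
# 60 fps -> ~17 ms, tolerable; 15 fps (a tanking frame rate) -> ~67 ms,
# which is very noticeable when clicking through menus.
```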
 
It's not really a matter of "letting." Paradox GSGs are performance limited by things that aren't amenable to GPU compute.
Actually, GPUs excel at the kind of thing Paradox games have a lot of: parallel floating-point arithmetic. But part of the problem is that a few tasks that take a long time can't be parallelised (each tag needs to be serialised, and the biggest tags tend to take the longest), so the benefit would likely be very limited compared to what you can achieve on a modern high-end CPU. From a development point of view it would be a nightmare, and it would likely cause all kinds of problems for people who are running low on VRAM, which would likely be a lot of people if such a feature were implemented.
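That serial bottleneck is essentially Amdahl's law in action. A minimal sketch, assuming (purely for illustration, not a measured figure from any Paradox title) that 70% of a tick parallelises cleanly while the biggest serial tag makes up the rest:

```python
# Minimal Amdahl's-law sketch: why a serial chunk caps the speedup.
# The 0.7 parallel fraction is a made-up illustration, not a measured
# number from any Paradox title.

def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Overall speedup when only part of the work parallelises."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

for workers in (4, 16, 1024, 10240):  # CPU cores vs. GPU-scale parallelism
    print(f"{workers:>5} workers -> {amdahl_speedup(0.7, workers):.2f}x")
# Even with ~10k GPU threads the speedup stalls near 1/0.3 = 3.33x,
# because the serial 30% (the biggest tag) still runs at full length.
```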
I can definitely imagine there being ways to wrap up some of their calculations into a format suitable for GPUs. I'm not really familiar with GPU acceleration myself, having only worked with CPU compute grids, but given that GPUs are good at parallelising simple operations like linear algebra on matrices, I bet there are ways to apply that to the kind of math they need for pop or trade calculations, AI decision making, pathfinding, etc.

One thing I have definitely learned, both from experience and from research as a programmer, is that there are times when using the hardware (or the language/library/etc.) the right way to brute-force a problem is much faster than a "smarter" solution, even though you're doing "more" work. So forming one big matrix product to multiply on the GPU might produce a lot of "waste" results that you throw away after the computation, yet still be faster than a more complex algorithm running only on the CPU (see the sketch below). Often this comes down to memory access patterns: modern machines crunch numbers very fast, but that doesn't matter if they have a hard time finding the data to feed in, so to speak.
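A minimal, hedged sketch of that brute-force idea using NumPy on the CPU (the same shape of computation is what a GPU would chew through); the pop/modifier setup and all numbers are invented for illustration:

```python
import numpy as np

# Hypothetical illustration: compute every pop's output as one big
# matrix-vector product instead of a per-pop loop. Names and numbers
# are invented; this is not how any Paradox title actually works.

rng = np.random.default_rng(0)
n_pops, n_modifiers = 100_000, 32

base_output = rng.uniform(1.0, 10.0, size=n_pops)           # per-pop base value
modifier_weights = rng.uniform(0.0, 0.1, size=n_modifiers)  # global modifier strengths
pop_modifiers = rng.integers(0, 2, size=(n_pops, n_modifiers)).astype(float)

# One dense matrix-vector product: computes a modifier sum for *every*
# pop, even ones whose rows are mostly zeros ("waste" work), but it is
# a single, cache- and SIMD-friendly pass over contiguous memory.
total_modifier = pop_modifiers @ modifier_weights
final_output = base_output * (1.0 + total_modifier)

print(final_output[:5])
```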

It's entirely possible PDX has already explored this and found it too difficult to implement in a way that won't frequently break or cause adverse side effects with different hardware, or just seems like too much work... or maybe they already have done it and we'll be surprised! (Although I heard the demos don't run that fast so...)

Where would a GSG use floating points? Aren't all the values in ranges that can be handled as integers (even if a decimal point is shown in the UI)? And isn't using integers, at least on the CPU, much faster than using floating points?
This is really not a significant difference on modern hardware (and depending on the device, if it has more FPUs or related hardware, floating-point operations may even be faster), especially compared to more significant bottlenecks like memory access or truly complex composite functions. To me it would make much more sense to compute everything as floats under the hood and round the UI elements that need it; way less headache in writing the math. All sorts of things would start breaking or behaving weirdly if you forced yourself to use integers under the hood (especially considering Paradox's love of modifiers, though at least EU historically hasn't used as many unintuitive functions as HOI, which has all sorts of combat equations with logarithmic and square-root dependencies).
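A tiny sketch of the float-under-the-hood, round-at-the-UI idea, and of the kind of drift you invite if you instead truncate to integers at every step (all values are made up):

```python
# Tiny illustration: compute with floats internally and round only for
# the UI, versus truncating to scaled integers after every intermediate
# step. All values are invented for the example.

base = 3.0
modifiers = [1.05, 0.97, 1.10, 1.02]  # multiplicative modifiers

# Float pipeline: full precision internally, round once for display.
value = base
for m in modifiers:
    value *= m
print(f"UI shows: {value:.2f}")  # 3.43

# Integer pipeline: store hundredths, truncate after every modifier.
int_value = 300  # 3.00 stored as hundredths
for m in modifiers:
    int_value = int_value * int(round(m * 100)) // 100  # truncation loses precision
print(f"UI shows: {int_value / 100:.2f}")  # 3.41 -- drifts low step by step
```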
 
No. The primary problem is predicting what the numbers should be. Separating the UI from the rest of the frame generation could potentially address that issue for Paradox games, but from what I understand, Nvidia's frame generation does not currently allow UI elements to be excluded from the generated frames. Paradox could of course support only AMD and Intel frame generation, but that would probably cause a bigger uproar than not supporting frame generation at all.

The primary performance issue in all Paradox games is the game speed, not FPS.

That's not how things work, but there is nothing magical about it either. I would suggest reading this old Vic 3 dev diary for some enlightenment:
Considering that EU5 basically runs on the same engine, and that there is no good way around the root cause of the issue, there is no way EU5 will avoid some of the same challenges when it comes to FPS and simulation speed.
You don't need to frame-gen the UI to begin with; it's not the UI that causes massive frame-rate drops. It updates just fine when the 3D parts of the screen aren't dragging it down.

No, that's your own opinion.

You have ignored the fact that the game can still run slowly even when paused, which will be even more of an issue in EU5 as demonstrated by previews.

Just add the option; I'll turn it on and you'll turn it off, and we'll both be happy. No need to make only you happy.
 
This is really not a significant difference on modern hardware (and depending on the device, if it has more FPUs or related hardware, floating-point operations may even be faster), especially compared to more significant bottlenecks like memory access or truly complex composite functions. To me it would make much more sense to compute everything as floats under the hood and round the UI elements that need it; way less headache in writing the math. All sorts of things would start breaking or behaving weirdly if you forced yourself to use integers under the hood (especially considering Paradox's love of modifiers, though at least EU historically hasn't used as many unintuitive functions as HOI, which has all sorts of combat equations with logarithmic and square-root dependencies).
We know at least some things are using integers for decimal numbers because Johan’s said so.
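For reference, a minimal sketch of what that usually looks like, i.e. fixed-point arithmetic with decimals stored as scaled integers; the 1/1000 scale is an assumed, illustrative choice, not a confirmed detail of any Paradox title:

```python
# Minimal fixed-point sketch: decimals stored as scaled integers.
# The 1/1000 scale is an assumed, illustrative choice.

SCALE = 1000  # 1.234 is stored as the integer 1234

def fp_mul(a: int, b: int) -> int:
    """Multiply two fixed-point values, rounding to nearest."""
    return (a * b + SCALE // 2) // SCALE

price = 2500     # 2.500
modifier = 1150  # 1.150
result = fp_mul(price, modifier)
print(result / SCALE)  # 2.875 -- identical on every machine, since only
                       # integer ops are involved (handy for multiplayer
                       # determinism and save/replay consistency)
```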
 