MULTI Unreal Engine 5 officially released; new tech test video from The Coalition

CerebralTiger

Expert
Apr 12, 2007
19,839
5,868
129
Islamabad
Agree about responsiveness. Frame times constantly hovering between 16.67ms and 33.33ms aren't great for playability. Remains to be seen what percentage of the time it stays at 16.67ms. VG Tech, where you at? :LOL: 🤜

Still, fluctuations between 16.67/33.33ms are better than larger frame time spikes 🤷‍♂️:LOL:
 
  • Like
Reactions: Danish_karachi

Necrokiller

Expert
Apr 16, 2009
13,594
5,127
129
Agree about responsiveness. Frame times constantly hovering between 16.67ms and 33.33ms aren't great for playability. Remains to be seen what percentage of the time it stays at 16.67ms. VG Tech, where you at? :LOL: 🤜

Still, fluctuations between 16.67/33.33ms are better than larger frame time spikes 🤷‍♂️:LOL:
Frame times hovering between 16.67ms and 33.33ms do not have the same effect when the frame rate is 30fps or 40fps or 60fps.

Lower frame rates are far worse because they last for a longer period of time, can happen at any time (even during combat), and also fall out of the VRR window (and no, LFC isn't gonna help with poor responsiveness) 🤷‍♂️:LOL:

Example:

Lulz of the Kingdom: 1st percentile 20fps, 5th percentile 24fps = shitty performing game with shitty responsiveness despite the frame times not exceeding 66.66ms

Lulzonetta 3: 60fps cap gameplay, 1st and 5th percentile 34fps and 39fps respectively = shitty performing game with shitty responsiveness, and it also has stuttering issues.

Jedi Survivor: Performance mode 1st and 5th percentile on both consoles 32-36fps = shitty performing game with shitty responsiveness, while also having MASSIVE spikes

Therefore, the "best performing" label is absooooolutely meaningless, and anyone who can enjoy the aforementioned games has no right to complain about PC performance as far as I'm concerned. 🤷‍♂️:LOL:
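For context, percentile figures like the ones above can be computed directly from captured frame times. A minimal sketch, with entirely made-up sample numbers (none of them from a real VG Tech capture):

```python
# Illustrative only: hypothetical frame time samples in ms, not real capture data.
frame_times_ms = [16.7] * 950 + [33.3] * 40 + [50.0] * 10

# Convert each frame time to an instantaneous frame rate, sorted worst-first.
fps_samples = sorted(1000.0 / ft for ft in frame_times_ms)

def percentile_fps(samples, pct):
    """Return the frame rate at the given low percentile (the worst frames)."""
    index = max(0, int(len(samples) * pct / 100) - 1)
    return samples[index]

print(percentile_fps(fps_samples, 1))   # worst 1% of frames, ~20 fps here
print(percentile_fps(fps_samples, 5))   # worst 5% of frames, ~30 fps here
```

This is why a game can average 60fps yet still have shitty 1st/5th percentile numbers: a small fraction of long frames drags the low percentiles down.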
 
Last edited:

CerebralTiger

Expert
Apr 12, 2007
19,839
5,868
129
Islamabad
Frame times hovering between 16.67ms and 33.33ms do not have the same effect when the frame rate is 30fps or 40fps or 60fps.
Frame rates are derived from frame times, not the other way around. The "last a longer period of time" information is covered by the percentage of the time frames are delivered at a certain frame time.

TOTK delivers frames at higher than the intended frame time 3.9% of the time. Completely fine and performs well.

Bayonetta 3 delivers frames at higher than the intended frame time 17.17% of the time w/ an additional 0.4% of stutters. The stutters are a bummer, but what are my options for playing the game in a better state? None.

Jedi Survivor delivers frames at higher than the intended frame time 25.63% of the time w/ an additional 0.4% of stutters. The stutters are a bummer, but what are my options for playing the game in a better state? None. The PC version is a disaster, and the Series X version performs worse than PS5.

Ultimately, I'll play the game I want to play on the platform that delivers the best possible experience. That is why "best performing" is very meaningful :LOL:🤜
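The frame-time-count view described here is simple arithmetic over the same capture data. A minimal sketch using the TOTK counts quoted above (the frame time distribution itself is hypothetical, only the 1227/31474 totals come from the post):

```python
# Frame-time-count view: what share of frames missed the intended frame time?
INTENDED_MS = 33.33            # 30 fps target
# Hypothetical capture: most frames on budget, 1227 delivered at 50 ms.
frame_times_ms = [33.33] * 30247 + [50.0] * 1227

over_budget = sum(1 for ft in frame_times_ms if ft > INTENDED_MS)
share = over_budget / len(frame_times_ms) * 100
print(f"{over_budget} of {len(frame_times_ms)} frames over budget ({share:.1f}%)")
# → "1227 of 31474 frames over budget (3.9%)"
```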
 

iampasha

Seasoned
Apr 4, 2013
2,754
1,475
129
28
Karachi
Frame rates are derived from frame times, not the other way around. The "last a longer period of time" information is covered by the percentage of the time frames are delivered at a certain frame time.

TOTK delivers frames at higher than the intended frame time 3.9% of the time. Completely fine and performs well.

Bayonetta 3 delivers frames at higher than the intended frame time 17.17% of the time w/ an additional 0.4% of stutters. The stutters are a bummer, but what are my options for playing the game in a better state? None.

Jedi Survivor delivers frames at higher than the intended frame time 25.63% of the time w/ an additional 0.4% of stutters. The stutters are a bummer, but what are my options for playing the game in a better state? None. The PC version is a disaster, and the Series X version performs worse than PS5.

Ultimately, I'll play the game I want to play on the platform that delivers the best possible experience. That is why "best performing" is very meaningful :LOL:🤜
Long thing me no
read
 
  • Haha
Reactions: Necrokiller

Necrokiller

Expert
Apr 16, 2009
13,594
5,127
129
Frame rates are derived from frame times, not the other way around.
When did I say otherwise?

The "last a longer period of time" information is covered by the percentage of the time frames are delivered at a certain frame time.
Like the 0.2% in Dead Space remake 🤜

I mean, a spike is literally measured in ms, while low frame rates are averaged over a longer period of time. 🤦‍♂️

The input response in all these console versions is awful due to low frame rate issues and directly impacts gameplay negatively 🤷‍♂️


Ultimately, I'll play the game I want to play on the platform that delivers the best possible experience. That is why "best performing" is very meaningful :LOL:🤜
You may not think you have a better option but that doesn't make the objectively poor versions playable :sick:
 

CerebralTiger

Expert
Apr 12, 2007
19,839
5,868
129
Islamabad
Like the 0.2% in Dead Space remake 🤜
Yes, just as I said 0.4% of stutters in Jedi Survivor on PS5 is a bummer, 0.2% of stutter isn't good either 🤜

I mean, a spike is literally measured in ms, while low frame rates are averaged over a longer period of time 🤦‍♂️
Frame time counts, which are what the percentage is based on, are not measured in ms, however. The count literally tells you the number of frames, out of the total rendered during the test, that are delivered at a particular frame time. It does not involve the measurement unit of frame time itself.

The input response in all these console versions is awful due to low frame rate issues and directly impacts gameplay negatively 🤷‍♂️
Frame rate is irrelevant. TOTK delivers only 1227 out of 31474 (3.9%) frames at 50ms. The input response is fine based on the frame time count breakdown for the rendered frames.

You may not think you have a better option but that doesn't make the objectively poor versions playable :sick:
The only thing that can objectively be proven here is that the option I chose is best 🤜
 

Necrokiller

Expert
Apr 16, 2009
13,594
5,127
129
Yes, just as I said 0.4% of stutters in Jedi Survivor on PS5 is a bummer, 0.2% of stutter isn't good either 🤜
TOTK delivers only 1227 out of 31474 (3.9%) frames at 50ms.
"only"
"bummer"

Notice how CerebralTiger tries his best to sugar coat the console performance issues.

I mean, 3.9% of worse motion (VRR and LFC can't save 20-30fps) AND input response, compared to 0.2% in Dead Space Remake. Exactly what I said: lower frame rates last for a longer period of time, and input response suffers the whole time it happens 🤜:ROFLMAO:

The only thing that can objectively be proven here is that the option I chose is best 🤜
Math says otherwise 🤷‍♂️:ROFLMAO:
 

CerebralTiger

Expert
Apr 12, 2007
19,839
5,868
129
Islamabad
"only"
"bummer"
It's a statement relative to the target platform and its intended frame time. Is a drop from the intended frame time of 33.33 ms to 50 ms 3.9% of the time bad? The input response is affected for sure, but input still registers. A drop to 66.66 ms? Yeah, that's terrible for a game that targets 33.33 ms or 30 fps, and can cause input registration issues.

In the case of Jedi Survivor, 0.4% in Performance mode on PS5 is bad because it indicates pauses that potentially eat up input. Even if it's momentary, it's horrible 🤷‍♂️

Math says otherwise 🤷‍♂️:ROFLMAO:
Please elaborate, math teacher :ROFLMAO:
 

LeGenD123

The One and Only
Sep 5, 2007
3,752
22
44
Lahore
I genuinely believe this "PC stutters in games" thing is blown way out of proportion... PCs have so many configurations, with a lot of people running tons of shit in the background, unstable overclocks, cracked games. I can see why complaints about bad performance pop up so often
Agreed.

My system has excellent resource management: it automatically shuts down unnecessary applications and minimizes background load when I run a PC game. I also buy original copies.

However, some game developers have come up with bad PC ports that are notorious for stuttering or performance issues.

Bad PC Ports Weren't Developed With PCs in Mind

Consoles aren't as powerful as high-end or even mid-range PCs. However, the components inside a console are all predictable. With this consistency in parts, developers can code their game in a way that makes the most out of the hardware it's expected to play on.

Consoles usually have a shared pool of system and graphics memory. On computers with dedicated graphics cards, however, the system and graphics memory aren't shared. Changing that very fundamental part of how the game works is quite tricky and can sometimes mean a massive overhaul, leading to performance issues.

Console games are developed to run as efficiently as possible on the hardware that they're given. Considering that console games are sold to console gamers first, the developers have every reason to make the game cater to that market the most, leaving PC gamers as an afterthought.

Corporate Greed Rushes PC Ports

Some ports have the potential to be good but are limited by investors that don't understand what it takes to make a game. You've probably heard of rushed game releases for the sake of keeping investors happy; the same happens with ports.

Ports have tight deadlines; working on an old existing project isn't seen as very profitable, so they're often quite rushed. To keep the main developers working on new projects, porting the game to PC is often outsourced to external companies with tight deadlines.

The infamous The Last of Us Part II PC port is an example of this outsourcing. The game was ported to PC by Iron Galaxy, the same studio outsourced to port The Last of Us Part I to PC. Part I was also a bad PC port, with both games suffering from memory and CPU management issues.

There are many more things that corporate greed is ruining; check out how the rising costs of games are affecting the industry as a whole to learn more.



This is not a PC hardware problem.
 

iampasha

Seasoned
Apr 4, 2013
2,754
1,475
129
28
Karachi
Agreed.

My system has excellent resource management: it automatically shuts down unnecessary applications and minimizes background load when I run a PC game. I also buy original copies.

However, some game developers have come up with bad PC ports that are notorious for stuttering or performance issues.

Bad PC Ports Weren't Developed With PCs in Mind

Consoles aren't as powerful as high-end or even mid-range PCs. However, the components inside a console are all predictable. With this consistency in parts, developers can code their game in a way that makes the most out of the hardware it's expected to play on.

Consoles usually have a shared pool of system and graphics memory. On computers with dedicated graphics cards, however, the system and graphics memory aren't shared. Changing that very fundamental part of how the game works is quite tricky and can sometimes mean a massive overhaul, leading to performance issues.

Console games are developed to run as efficiently as possible on the hardware that they're given. Considering that console games are sold to console gamers first, the developers have every reason to make the game cater to that market the most, leaving PC gamers as an afterthought.

Corporate Greed Rushes PC Ports

Some ports have the potential to be good but are limited by investors that don't understand what it takes to make a game. You've probably heard of rushed game releases for the sake of keeping investors happy; the same happens with ports.

Ports have tight deadlines; working on an old existing project isn't seen as very profitable, so they're often quite rushed. To keep the main developers working on new projects, porting the game to PC is often outsourced to external companies with tight deadlines.

The infamous The Last of Us Part II PC port is an example of this outsourcing. The game was ported to PC by Iron Galaxy, the same studio outsourced to port The Last of Us Part I to PC. Part I was also a bad PC port, with both games suffering from memory and CPU management issues.

There are many more things that corporate greed is ruining; check out how the rising costs of games are affecting the industry as a whole to learn more.



This is not a PC hardware problem.
I'm glad you're back. Do stick around. There aren't many of us old regulars left here.
 

CerebralTiger

Expert
Apr 12, 2007
19,839
5,868
129
Islamabad
The "so many configurations" is exactly why shader compilation stutters happen in the first place. As games get more complex, so does their shader work. With popular game engines using visual scripting and graphs for the creation of shaders, the dependencies between them can be significant in large-scale games. It's impossible to pre-compile all shaders for a vast number of hardware configurations in such cases. Async shader compilation helps on systems with available CPU headroom, but doesn't entirely solve the problem. Certain shaders still have to be created at runtime as a combination of two pre-compiled shaders, triggered by various actions in-game.

Secondly, streaming-related stutters happen due to not-so-great streaming technology in a game engine, or due to the effective throughput at which compressed data is transferred from storage to memory in a GPU-readable form on a platform. The DirectStorage API attempts to solve this problem on PC, and it has for the most part in a linear corridor shooter with scripted events that require fast streaming of data. It'll be interesting to see how it handles an open world game that actively requires fast streaming of data.
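The async compilation pattern described above (compile in the background, render with a cheap fallback until the real shader is ready) can be sketched outside any real graphics API. Everything here, including `compile_shader`, is a hypothetical stand-in for an expensive driver compile:

```python
# Toy sketch of async shader compilation with a fallback shader.
# No real graphics API is used; compile_shader() is a made-up stand-in.
import time
from concurrent.futures import ThreadPoolExecutor

def compile_shader(key):
    time.sleep(0.01)          # stand-in for an expensive driver compile
    return f"compiled:{key}"

cache = {}                    # shader key -> Future of compiled shader
pool = ThreadPoolExecutor(max_workers=2)

def get_shader(key, fallback="flat_shader"):
    """Return the compiled shader if ready; otherwise kick off compilation
    in the background and render this frame with a cheap fallback."""
    if key in cache:
        future = cache[key]
        if future.done():
            return future.result()
        return fallback       # still compiling: avoid a frame-time spike
    cache[key] = pool.submit(compile_shader, key)
    return fallback

print(get_shader("water"))    # first request: fallback while compiling
time.sleep(0.05)
print(get_shader("water"))    # a later frame: compiled variant is ready
```

The trade-off is visible in-game: with CPU headroom you get a brief moment of a simpler-looking shader instead of a hitch, which is exactly why this helps but doesn't fully solve the problem.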
 
  • Haha
Reactions: Necrokiller

Necrokiller

Expert
Apr 16, 2009
13,594
5,127
129
The "so many configurations" is exactly why shader compilation stutters happen in the first place. As games get more complex, so does their shader work. With popular game engines using visual scripting and graphs for the creation of shaders, the dependencies between them can be significant in large-scale games. It's impossible to pre-compile all shaders for a vast number of hardware configurations in such cases. Asyc shader compilation helps on systems with available CPU headroom, but doesn't entirely solve the problem. The creation of certain shaders is necessitated at runtime as a combination of two pre-compiled shaders due to various actions in-game.
And yet, there are tons of complex games on game engines that do not have this issue at all, inclooooding Unreal Engine. Huh, fancy that.


DirectStorage API attempts to solve this problem on PC, and it has for the most part in a linear, corridor shooter with scripted events that require fast streaming of data. It'll be interesting to see how it handles it in an open world game that actively requires fast streaming of data.
*

Spoiler:
*actively coping :ROFLMAO:
 
Last edited: