Metro Exodus Enhanced: 12900K and RTX 4090 bottleneck?

1. there is always some sort of bottleneck
2. you've been told how to find out which bottleneck you have
3. % CPU usage says absolutely nothing
4. if your GPU usage is below ~95% you are most likely CPU limited
5. DDR4 and DDR5 perform identically up to a point
6. DLSS will lower your GPU usage but not increase your CPU usage
7. there is always some sort of bottleneck
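Point 4 above is just a rule of thumb, but it can be sketched as a tiny classifier. This is a minimal sketch assuming you have averaged GPU-usage samples from any monitoring overlay; the 95% threshold and the function name are illustrative, not from any real tool.

```python
# Hypothetical helper for point 4: sustained GPU usage well below ~95 %
# usually points at a CPU (or engine) limit. The threshold is an assumption.

def classify_bottleneck(gpu_usage_samples, threshold=95.0):
    """Classify a run as GPU- or CPU-limited from averaged GPU usage (%)."""
    avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
    return "GPU-limited" if avg >= threshold else "likely CPU-limited"

print(classify_bottleneck([98, 99, 97, 99]))   # GPU-limited
print(classify_bottleneck([80, 85, 83, 88]))   # likely CPU-limited
```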
 
1. there is always some sort of bottleneck
2. you've been told how to find out which bottleneck you have
3. % CPU usage says absolutely nothing
4. if your GPU usage is below ~95% you are most likely CPU limited
5. DDR4 and DDR5 perform identically up to a point
6. DLSS will lower your GPU usage but not increase your CPU usage
7. there is always some sort of bottleneck
Ok, but why do I get the same fps in all DLSS modes in CP2077?
 
What do you mean exactly?
That's the main issue: he doesn't have a clue what he's talking about.
And he refuses to read to get his basics straight.
So he is just arguing blindly, comparing apples and bananas, and consequently talking trash.
Ok, but why do I get the same fps in all DLSS modes in CP2077?
Because you are CPU limited. :wall:
Or you don't know how DLSS 2.3 and DLSS 3.0 work, and you don't know what you are comparing.
Get your basics sorted before you compare apples and bananas.

 
see points 4+6
DLSS is only useful in GPU-limited scenarios, since it lowers the actual rendered resolution of the image and thereby takes load off the GPU. CPU utilisation stays the same. You can see the same behaviour by lowering your game's resolution from 1440p to 720p.
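The resolution reduction described above can be made concrete with the commonly cited per-axis render-scale factors for the DLSS 2.x quality modes. A minimal sketch; the factors are approximate public values and should be treated as assumptions, not a spec.

```python
# Approximate per-axis internal render scale for DLSS 2.x quality modes
# (commonly cited values; treat them as assumptions, not a specification).
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7 %
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Return the internal render resolution DLSS upscales from."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# At a 1440p output, Quality mode renders internally at about 1707x960 and
# Performance mode at 1280x720 -- GPU load drops, CPU load does not.
print(internal_resolution(2560, 1440, "Quality"))
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
```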
 
Ah ok, so what do you guys recommend for me? Switch to a 13900K and DDR5, or just stay with my current PC? :) (I want much better fps and better GPU usage) :)
 
(I want much better fps and better GPU usage)
Read benchmarks. Cut your wishes down to what is possible with current hardware.
Whatever your expectation is, you will likely be disappointed.
 
Ok, but why do I get the same fps in all DLSS modes in CP2077?
DLSS is a GPU feature: it mainly reduces the resolution the game is rendered at. DLSS will only give you more fps if you are in a GPU-limited scene; in an already CPU-limited scene, the fps will mostly stay the same :)

I mean DLSS 2.0 without Frame Generation!
 
Your fps will not increase by some magic amount just because you throw enough money at it.

1668272645476.png
 
DLSS is a GPU feature: it mainly reduces the resolution the game is rendered at. DLSS will only give you more fps if you are in a GPU-limited scene; in an already CPU-limited scene, the fps will mostly stay the same :)
In the benchmark I have 98% GPU usage at 1440p with RT Ultra and DLSS Quality, and swapping to DLSS Performance, Ultra Performance or Balanced doesn't impact fps; the fps stays the same. Weird.
Post automatically merged:
 


So I should use 2.25xDL, right? And then try changing DLSS in game and maybe it will work?
Why are you making fruit salad out of your statements? That statement isn't even apples and bananas...!
You are mixing everything up!
Get your basics straight!
 
In the benchmark I have 98% GPU usage at 1440p with RT Ultra and DLSS Quality, and swapping to DLSS Performance, Ultra Performance or Balanced doesn't impact fps; the fps stays the same. Weird.
Never take multiple references.

Switching from DLSS Quality to a "lower" DLSS option should increase your fps if your GPU was at 98% on the previous option.

Can you post some screenshots?
 
I found that the only way to get full GPU usage (99%) is to max out RT to Ultra, turn off DLSS, and run the game at native 1440p resolution.
 
I compared fps in Cyberpunk 2077 gameplay (1440p, DLSS Quality, RT Ultra) with an RTX 4090 on a Ryzen 9 7950X with DDR5-6000. I have a 12900K at stock with DDR4-3600.

And I get the same fps. So it's ok to say the CPUs are comparable?
 
If you want to compare CPUs, you don't do that at 1440p with RT Ultra but at 720p high with no RT. You want to take load away from the GPU to see true CPU performance. If your GPU is already struggling to keep up, your fps says absolutely nothing about CPU performance. The fact that you're getting the same fps is very much indicative that you are GPU limited.
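The argument above can be sketched with a toy model where the observed frame rate is simply the minimum of a CPU-side cap and a GPU-side cap. All fps numbers below are invented for illustration, not measurements.

```python
# Toy model: the slower side sets the frame rate, fps = min(cpu_cap, gpu_cap).

def observed_fps(cpu_fps, gpu_fps):
    """Frame rate is capped by whichever side finishes its work last."""
    return min(cpu_fps, gpu_fps)

# 1440p RT Ultra: a low GPU cap hides any CPU difference.
print(observed_fps(cpu_fps=120, gpu_fps=90))   # 90
print(observed_fps(cpu_fps=140, gpu_fps=90))   # 90 -- "same fps"

# 720p high, no RT: the GPU cap rises and the CPU difference shows.
print(observed_fps(cpu_fps=120, gpu_fps=300))  # 120
print(observed_fps(cpu_fps=140, gpu_fps=300))  # 140
```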
 
Hi. Is this real? I compared

13600K DDR4 , RTX 4090 -1440p max​

99fps at start​

link:

with

12700 DDR5 , RTX 4090 -1440p max​


118fps at start

link:

So why is there a 20 fps difference between the 12700 and the 13600K? The 13600K is the better CPU.
 
Turn DLSS off, and there would be no difference.
CP seems to scale with the number of cores.
So this time it's not Metro, it's Cyberpunk. Maybe you should observe 10 more different games to get 10 more different conclusions^^
Cyberpunk2077_2022_11_21_18_32_49_449.jpg


Use the Settings, do not use the Settings.
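The hunch a few lines up that Cyberpunk scales with core count can be sketched with Amdahl's law: if a fraction p of the per-frame work parallelises over n cores, the speedup over one core is 1 / ((1 - p) + p / n). The value of p below is purely an assumption for illustration.

```python
# Amdahl's-law sketch: diminishing returns from extra cores when part of
# each frame is serial. p = parallel fraction (assumed), cores = core count.

def speedup(p, cores):
    return 1.0 / ((1.0 - p) + p / cores)

# With an assumed p = 0.8, going from 6 to 8 to 12 cores helps only modestly.
for cores in (6, 8, 12):
    print(cores, round(speedup(0.8, cores), 2))
```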
 
Last edited:
Turn DLSS off, and there would be no difference.
In the videos the 12700 is faster; that's what he's wondering about.

Maybe CP77's core scaling is just better than in other games?
 
Well, the Intel parts hardly differ at all: 6 new-generation P-cores vs. 8 old-generation P-cores. That's not an incredibly big difference in count, especially since the small stuff is picked up by the e-cores, which have gained a lot of performance.

Cyberpunk2077_2022_11_21_18_46_08_558.jpg

Cyberpunk2077_2022_11_21_18_48_56_989.jpg
 
Last edited:
Then the 13600K should be equal or better in the videos he linked, but apparently there's about a 20% difference in favour of the 12700.

But who knows what he means by "max settings" this time, and whether he really used identical settings in both comparisons :d
 
You're right about that. I once ran it on the 24-core machine and I think 12 cores were well utilised, but I've been too lazy to retest it now.
What I can do quickly is switch the e-cores off^^

Cyberpunk2077_2022_11_21_18_52_53_536.jpg
 
In the 13600K video it says "e-cores off", and it's DDR4 vs. DDR5, which should also scale a bit at 1440p, so this comparison is useless to begin with imho.

The GPU clock speed is also higher in the 12700 video... cba looking for more differences :d
 
Your PC is totally fine and behaves like it should. My advice in this case: just play and enjoy the games instead of searching for problems that aren't problems.
 
I am using 1440p, DLSS Quality, RT Psycho. During gameplay GPU usage is 80-90%. When swapping to 1080p (DLSS Quality) I don't get an fps increase; the fps is the same.

Here is someone with an RTX 4090 (1440p, DLSS Quality, RT Psycho). During gameplay his usage is 85-93%. When he swaps to 1080p (DLSS Quality) he gets a 20 fps boost and lower GPU usage, while I don't get any fps increase.
Why don't I get an fps boost then? Link to his video, skip to 3:06



His usage is 85-93% at 1440p (DLSS Quality), so why does he get more fps and lower usage when going to 1080p (DLSS Quality)?
 
Last edited:

