Hi,
I did a bit of digging and found the answer.
Have a look at what ATI has to say about it.
So everything's fine.
-----------------------------------
Just a moment ago I was still assuming it would take a few days before we got an answer to the 20° discrepancy between the Control Panel and RivaTuner, and already the answers are in. To get straight to the point: RivaTuner reads the temperature correctly. On the 9800XT, ATI adds a so-called 20° compensation (relative to the 9800Pro), because the 9600XT uses an on-die diode whereas the 9800XT uses a sensor located near the die.
First the statement from Unwinder, the author of RivaTuner, followed by ATI's reply:
Well, well, well… It looks like I've found the reason. Instead of waiting for a reply from ATI, I tried to reverse-engineer (RE) the driver to understand the control panel's temperature-monitoring implementation. I'm afraid that ATI will not address the questions I raised in this thread, because the reasons for the differences between RivaTuner's and the control panel's temperature monitoring are definitely not XT-family-friendly. So I'll do it myself.
I hate to say it, but it seems that XT boards don't have an on-die thermal diode. I'm now almost sure that R(V)360 VPUs have no temperature-monitoring improvements compared to previous VPUs, and that the R(V)360's thermal diode is just a myth from the PR guys. The control panel's temperature-monitoring implementation proves it. The LM63's remote temperature (i.e. the core temperature monitored by RivaTuner) appears to be read from an external diode-connected transistor, so the sensor's D+/D- pins are not directly connected to the ASIC's thermal diode (probably because the ASIC simply doesn't have one). The control panel reads _the_same_ temperature from the hardware sensor and then simply adds a 20°C compensation to it. The code responsible for the compensation is located in the AtiEDUGetAdapterTemperatureOffset function exported by the driver's ati2edxx.dll binary: it simply returns 14h (20 decimal) if compensation is needed, otherwise it returns 0. Anyone with basic RE skills can examine the code to verify this.
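To make the mechanism concrete, here is a minimal Python sketch of the behaviour Unwinder describes: the raw LM63 reading plus a fixed 14h offset. The offset function mirrors the DLL export he names; the raw-temperature input is a hypothetical stand-in for the actual sensor read, which the source does not show.

```python
def ati_edu_get_adapter_temperature_offset(needs_compensation: bool) -> int:
    """Mimics the behaviour attributed to AtiEDUGetAdapterTemperatureOffset in
    ati2edxx.dll: returns 14h (20 decimal) if compensation is needed, else 0."""
    return 0x14 if needs_compensation else 0

def control_panel_temperature(raw_lm63_temp: int, needs_compensation: bool) -> int:
    """What the Catalyst 4.1 control panel reportedly displays: the raw LM63
    remote-sensor value plus the constant offset. (Hypothetical name.)"""
    return raw_lm63_temp + ati_edu_get_adapter_temperature_offset(needs_compensation)

# For a raw reading of 47 °C, RivaTuner would show 47, the control panel 67.
print(control_panel_temperature(47, True))   # 67
print(control_panel_temperature(47, False))  # 47
```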
So the final verdict is:
RivaTuner monitors the original temperature reported by the LM63 hardware sensor.
The Catalyst 4.1 control panel reads the original temperature reported by the LM63 hardware sensor, then simply adds a constant 20°C compensation to it and displays the corrected value. I can see only two possible explanations for such a correction:
1) The control panel incorrectly detects boards with an on-die thermal diode, and AtiEDUGetAdapterTemperatureOffset erroneously adds compensation to a temperature that was already read from an on-die thermal diode.
2) 9800XTs don't have an on-die thermal diode, and the 20°C compensation is added to approximate the real VPU temperature, given that the reading comes from an external source. And that is _really_ bad, because a constant compensation will never be accurate, especially considering that the compensation value is rather high.
Anyway, even in the second case, RivaTuner has displayed, displays and will continue to display the original (non-compensated) temperatures retrieved directly from the sensor. I'll probably add the ability to set a custom compensation for the displayed temperature in RivaTuner, but it will not apply any compensation by default. The driver uses original (non-compensated) temperatures for core clock frequency adjustment when Overdrive is activated (even the publicly available temperature / clock-frequency ranges contain original, non-compensated temperatures). The LM63's temperature-controlled PWM fan-speed lookup is programmed with original (non-compensated) temperatures too. So a lot of people (including me) prefer to see the original value instead of a "t = t + 20" approximation. IMHO it's a _very_ rough way to measure temperature, because the compensation depends on a lot of factors and is definitely not a constant value across all boards. From my point of view, the Catalyst control panel developers should either make the compensation adjustable or remove it altogether, because a constant temperature compensation is too inaccurate.
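The adjustable compensation Unwinder proposes for RivaTuner could look something like the following sketch: show the raw LM63 reading by default (offset 0) and let the user opt in to a custom offset. All names here are illustrative assumptions, not RivaTuner's actual code.

```python
DEFAULT_COMPENSATION = 0.0  # raw sensor values by default, as Unwinder prefers

def displayed_temperature(raw_temp: float,
                          user_offset: float = DEFAULT_COMPENSATION) -> float:
    """Raw LM63 reading plus an optional, user-chosen compensation.
    A user could set 20.0 to mimic the Catalyst control panel's behaviour."""
    return raw_temp + user_offset

print(displayed_temperature(47.0))        # 47.0 (raw, the default)
print(displayed_temperature(47.0, 20.0))  # 67.0 (Catalyst-style +20 °C)
```

The point of the default being zero is exactly Unwinder's argument: the tool reports what the sensor measures, and any approximation is an explicit user choice.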
And 'thank you' very much for saying that RivaTuner reads wrong temperatures, ATI. It's much easier to say that than to confirm that your control panel simply applies a rough +20°C compensation for reasons unknown. It's time to tell us the truth about the on-die thermal diode on the XTs, ATI. Is it a myth? Please do.
ATI's reply to Unwinder:
The 9600XTs have an on-die thermal diode, and we are using it both for thermal checks and as the reported temperature.
The 9800XTs do not have an on-die thermal diode. They have an on-package thermistor and an on-board thermal sensor chip. The thermistor is very close to the ASIC die and the thermal transfer is very nearly constant, so a constant delta is fine.
We've done hardware qualification testing on 9800XTs in various configurations, and we've found that there is a very consistent delta between the reported thermal data and the measured on-die temperature.
We can use the reported thermal data from both sources for overclocking calibration.
The 20-degree offset is an approximation of the temperature differential on the 9800s, and it is only used in the control panel's temperature report. This temperature does not need to be exact, just roughly accurate, which it is, to within a degree or two. This was experimentally gathered information.
Future changes might refine this process as we deem fit. We are constantly developing this technology: from the 9600 onward we plan to have (though we're not guaranteeing it) on-die thermal diodes in all of our products, and, in the future, other, much more complex on-die thermal logic.