1.7V is not a permissible load voltage.
Intel's specs clearly state that 1.72V is the absolute maximum permissible voltage with VDroop, and the maximum permissible voltage without VDroop is 1.52V.
In other words, under real load the voltage is always below 1.52V; these figures are the upper limits of what is permissible.
VDroop protects the CPU from voltage spikes, so a higher IDLE voltage with a lot of VDroop is better for the CPU than setting hardly any/no VDroop, where voltage peaks can exceed the specified limits under hard load changes.
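As a rough illustration of that load-release argument, here is a toy Python sketch. All numbers in it (idle voltages, the 200 A load step, the 0.3 mohm parasitic impedance) are assumptions for illustration, not Intel figures or measurements:

```python
# Toy model: a VRM whose programmed loadline (VDroop) is at least as large as the
# parasitic board impedance absorbs a sudden load release, so the peak is just the
# no-load voltage. A flat/weak loadline leaves part of the step uncompensated,
# and that part appears as an overshoot above the setpoint.

def release_peak(v_noload, loadline_mohm, step_a, parasitic_mohm=0.3):
    """Approximate peak voltage after a sudden load release (step_a -> 0 A)."""
    if loadline_mohm >= parasitic_mohm:
        return v_noload  # the droop window swallows the transient
    # uncompensated portion of the load step shows up as overshoot
    return v_noload + ((parasitic_mohm - loadline_mohm) / 1000.0) * step_a

print(release_peak(1.45, 1.6, 200))  # plenty of droop: peak ~1.45 V (the idle voltage)
print(release_peak(1.45, 0.0, 200))  # no droop: peak ~1.51 V, above the setpoint
```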
P.S.
There are dud chips that sit at 1.6V when idle; I have never seen one at 1.7V. Under load the CPUs are usually between 1.3 and 1.4V.
It's a bit more complicated than this. But then again, no one cares what I say because no one believes me.
1.52v isn't even a safe IDLE voltage with LAPTOP SPEC LLC (1.1 mohm loadline calibration!). I tested this on a 13900K: 1.51v set in BIOS, LLC Level 3, no load, idle for a week. It degraded from that (it was already somewhat degraded from many Stockfish chess abuse tests), so that in CB R23 it went from being able to do 1.375v set + LLC6 at 5.7 GHz to crashing at 1.39v set + LLC6, just from the "1.51v + LLC3" one-week idle test.
Intel doesn't even know what their chips can safely do anymore. I've known about this for years now.
The 1.72v thing--I remember how this first appeared.
It was set only by a VRM register, "33h", and it appeared **ONLY** on the 9900K as a spec, enabling something called "offset capability" (NOT adaptive mode!!!). It DID NOT EXIST on the 8700K / Z370 chipset!! The original Intel 8th-generation spec sheet showed a max VID of 1.52v only for the 6-core SKU, with 1.6 mohm AC/DC Loadline.
When 9th gen came out and the spec sheet was updated, "1.520v + 200mv = 1.720v" was added for the 8-core SKU ONLY (9900K), but was not listed for the 6-core SKU (8700K and below).
I do not know what this was for or why it was added. But Gigabyte boards DISABLED this by default.
PLZ let me explain.
This "offset mode" allowed MAX VID to exceed 1.520v cap (as allowed by AC Loadline being the INVERSE of vdroop LLC (aka "predicted" current")--the max ACLL was allowed to set native VID to (before vdroop) was 1,520v.
Offset mode was set by a VRM register, I believe the bit value in hexadecimal was "33 hex" It's listed in the 9th generation spec sheet. Also found this register on some old IR 35201 datasheet as well, just labeled as "SVID OFFSET"
Formula was:
Vcore = Native VID + ((AC Loadline mohm * loadstep IOUT) - (VRM Loadline LLC mohm * IOUT)) + vOffset
CPU VID = Native VID + ((AC Loadline mohm * loadstep IOUT) - (DC Loadline mohm * IOUT)) + vOffset
Loadstep IOUT was known as "dI", the difference between two load points:
dI = I1 - I0.
vOffset was the offset voltage from adaptive mode (NOT related to the "VRM offset capability"!!)
This formula for vcore applied only to adaptive or auto vcore, not to manual VRM override settings.
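To make that concrete, here is a minimal Python sketch of the two expressions above. The plugged-in values (native VID, loadlines, the 100 A load step) are hypothetical and only illustrate the arithmetic:

```python
# Sketch of the adaptive/auto-vcore math described above (resistances in milliohms).
# All example numbers are assumptions, not measured or Intel-documented values.

def adaptive_vcore(native_vid, acll_mohm, vrm_llc_mohm, di_a, v_offset=0.0):
    """Vcore = Native VID + (ACLL * dI) - (VRM loadline * dI) + vOffset."""
    return native_vid + (acll_mohm / 1000.0) * di_a - (vrm_llc_mohm / 1000.0) * di_a + v_offset

def cpu_vid(native_vid, acll_mohm, dcll_mohm, di_a, v_offset=0.0):
    """Reported CPU VID uses the DC Loadline instead of the VRM loadline."""
    return native_vid + (acll_mohm / 1000.0) * di_a - (dcll_mohm / 1000.0) * di_a + v_offset

# Example: native VID 1.35 V, ACLL = DCLL = 1.6 mohm, VRM LLC 0.8 mohm, dI = 100 A.
print(adaptive_vcore(1.35, 1.6, 0.8, 100))  # 1.43 V target at the socket
print(cpu_vid(1.35, 1.6, 1.6, 100))         # 1.35 V -- ACLL and DCLL cancel here
```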
NOW:
GIGABYTE Z390 boards had this register disabled by default!!!!!
It was called "SVID OFFSET" on those boards, so max VID was 1.520v.
Asus, however, enabled it (with no way to disable it), so their VIDs could exceed 1.520v. That was the beginning of the "Asus Z390 boards overvolting past 1.5v" rumors from so long ago.
SVID OFFSET was VERY BUGGY on Gigabyte Z390 boards. It did not function as expected and could cause very erratic or failed operation. That is why it was disabled.
I talked to a Gigabyte BIOS engineer about this long ago. They sent me two Aorus Master test BIOSes, "T0d" and "T1d". These allowed SVID OFFSET to function correctly without errors, similar to the Asus Z390 series, and when I tested it, CPU VID with the AC Loadline boost could indeed go higher than 1.52v (before vdroop was applied). But there was still a hardwired bug: if CPU VCORE was set manually to 1.20v (the Gigabyte "white BIOS lettering" default) and SVID OFFSET was enabled, the CPU would get 0 volts and the POST code display would be blank (nothing), with all fans at 100% and a black screen.
Gigabyte then told me that SVID OFFSET was not doing something correctly, or that they still didn't understand the function, and they disabled this "test" feature in the regular BIOS. However, I did also help them fix a bug with DVID offsets at the same time.
On Z490, all boards from all ODMs had "offset mode" (VRM) permanently enabled at all times, and max VID was now 1.72v (AC Loadline boost). But on Z490 the ACLL function was also changed: it was no longer the inverse of "predicted" vdroop, but a strange "offset" added to the native CPU VID (with the TVB boost from temperature added on top afterwards). That is why, if you set ACLL manually to 1.1 mohm (= SVID "Intel Fail Safe"), you will get 1.65v IDLE in Windows at the max turbo multiplier. (On Z390 and older, 1.6 mohm ACLL would be the inverse of amps, so you would just get the native VID, 1.3-1.4v.)
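A rough numeric sketch of that behavioral difference, in Python. The native VID, idle current, and the scaling constant used for the Z490 "offset" behavior are assumptions chosen only so the output lands near the values mentioned above; they are not documented Intel numbers:

```python
# Contrast of the two ACLL behaviors described above (all constants assumed).

def z390_idle_vcore(native_vid, acll_mohm, dcll_mohm, idle_current_a=5):
    # Pre-Z490: ACLL boost scales with predicted current, so at idle (a few amps)
    # the boost is tiny and you essentially see the native VID.
    return native_vid + ((acll_mohm - dcll_mohm) / 1000.0) * idle_current_a

def z490_idle_vcore(native_vid, acll_mohm, ref_current_a=230):
    # Z490 onward (as described above): ACLL acts more like a flat offset on top of
    # native VID; ref_current_a is an assumed scaling constant, not an Intel figure.
    return native_vid + (acll_mohm / 1000.0) * ref_current_a

print(z390_idle_vcore(1.40, 1.6, 1.6))  # ~1.40 V idle -- just the native VID
print(z490_idle_vcore(1.40, 1.1))       # ~1.65 V idle with 1.1 mohm ACLL
```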
This was 14nm....14nm.... The 1.52v VID has been the spec since ***SANDY BRIDGE***, and SB was not 14nm!!!! It was a bigger node!
Bottom line: Intel doesn't even know the reason for their own spec!! 1.52v idle is NOT SAFE on 10nm. They are using an 18nm(?) max VID on 10nm..... no wonder this company is doing so badly!!!!!
Of course NO ONE CARES WHAT I SAY.