Forgive me if I misunderstand some basic EE principles -- I'm a CS guy. After some googling, I couldn't find anyone who really explains how the chip ends up running faster. I understand the voltage must increase, per this related article.
Do we actually increase the frequency at which the chip operates?
A CPU runs at a certain frequency, which is programmed into its registers. This frequency can be adjusted at any time to compensate for drift, e.g. drift caused by the distance between the crystal and the CPU. (This is from memory, from years ago -- chances are every assumption here is inaccurate.)
So, the original frequency is defined by the crystal which, by its nature, oscillates within a fixed, narrow frequency band. This in turn is programmed into the CPU, which goes on to perform X calculations per millisecond.
So do overclockers manipulate the drift compensation that is programmed into the CPU? My gut tells me the drift registers can't raise the frequency enough to be relevant. So where does the increase in frequency come from?
Something that has just occurred to me: maybe just by applying more voltage, the 'bits' actually move around faster... but that wouldn't mean an increase in frequency, right?
Answer
Do we actually increase the frequency at which the chip operates?
Yes, we do!
Modern CPUs have a unit called a PLL (Phase-Locked Loop) which generates the multi-GHz core clock from a relatively cheap crystal running at something like 33.3 or 100 MHz. These units are programmable across a wide range of output frequencies. That is used to slow the core(s) down when there is less work to do, to save power -- or to overclock them.
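As a rough sketch of the arithmetic (the multiplier values below are illustrative, not from any real datasheet -- real CPUs set them through vendor-specific registers), the PLL output is just the crystal's reference frequency times a programmable multiplier:

```python
# Toy model of a PLL clock generator. Illustrative only: real hardware
# programs these values via chipset/BIOS registers, not Python.

CRYSTAL_HZ = 100_000_000  # cheap 100 MHz reference crystal

def pll_output_hz(ref_hz: int, multiplier: int, divider: int = 1) -> float:
    """Output frequency = reference * multiplier / divider."""
    return ref_hz * multiplier / divider

# Stock setting: 100 MHz x 35 = 3.5 GHz core clock.
print(pll_output_hz(CRYSTAL_HZ, 35) / 1e9, "GHz")  # 3.5

# Overclocking = raising the multiplier: 100 MHz x 45 = 4.5 GHz.
print(pll_output_hz(CRYSTAL_HZ, 45) / 1e9, "GHz")  # 4.5

# Power saving = lowering it: 100 MHz x 8 = 0.8 GHz when idle.
print(pll_output_hz(CRYSTAL_HZ, 8) / 1e9, "GHz")   # 0.8
```

So the answer to "where does the increase come from?" is: not from drift registers, but from reprogramming the PLL's multiplier.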
You can increase the clock frequency further when the voltage is higher -- but at the price of massive additional heat. And the silicon will "wear out" faster, as harmful effects like electromigration increase too.
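The heat penalty follows from the classic first-order CMOS dynamic power relation, P ~ C * V^2 * f. A quick back-of-the-envelope ratio (the voltage and frequency figures here are made-up but plausible examples):

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# Capacitance C cancels when comparing two settings on the same chip,
# so only the ratio matters.

def dynamic_power_ratio(v1: float, f1: float, v2: float, f2: float) -> float:
    """Relative dynamic power of setting 2 vs setting 1."""
    return (v2 / v1) ** 2 * (f2 / f1)

# Stock: 3.5 GHz at 1.20 V.  Overclocked: 4.5 GHz at 1.35 V.
ratio = dynamic_power_ratio(1.20, 3.5e9, 1.35, 4.5e9)
print(f"~{ratio:.2f}x the dynamic power")  # ~1.63x
```

A ~29% frequency bump with a modest voltage increase already costs over 60% more dynamic power, all of which ends up as heat.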