Tuesday 25 February 2014

frequency - Why does a faster clock require more power?


If you overclock a microcontroller, it gets hot.


If you overclock a microcontroller, it needs more voltage.


In some abstract way it makes sense: it is doing more computation, so it needs more energy (and being less than perfect, some of that energy dissipates as heat).


However, at the level of just plain old Ohm's-law electricity and magnetism, what is going on?



Why does the clock frequency have anything to do with power dissipation or voltage?




As far as I know, the frequency of AC has nothing to do with its voltage or power, and a clock is just a superposition of a DC signal and a (square-wave) AC signal. Frequency doesn't affect the DC part.



Is there some equation relating clock frequency and voltage or clock frequency and power?



I mean, does a high-speed oscillator need more voltage or power than a low-speed one?



Answer



The required voltage is affected by significantly more than clock speed, but you are correct: in general, higher speeds need higher voltages.


Why does power consumption increase?


The reality is a lot messier than a simple circuit, but you can think of the chip as behaving like an RC circuit.



RC circuit equivalent


At DC, an RC circuit consumes no power. At infinite frequency (not attainable, but you can always solve it theoretically), the capacitor acts as a short and you are left with just the resistor, a simple resistive load. In between, as frequency decreases, the capacitor spends more of each cycle merely storing and returning charge, so less power is dissipated overall.
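To put a number on that, the standard first-order formula for the power burned switching a capacitance is P = α·C·V²·f: power grows linearly with frequency and with the square of the supply voltage. Here is a minimal sketch in Python; the 10 pF load, 3.3 V supply, and activity factor are made-up figures for illustration only.

    # Rough sketch: dynamic (switching) power of a capacitive load,
    # using the standard CMOS formula P = alpha * C * V^2 * f.
    # All component values are invented for illustration.

    def dynamic_power(c_load, v_dd, f_clk, alpha=1.0):
        """Average power spent charging/discharging a capacitance.

        c_load : switched capacitance in farads
        v_dd   : supply voltage in volts
        f_clk  : switching frequency in hertz
        alpha  : activity factor (fraction of cycles the node toggles)
        """
        return alpha * c_load * v_dd**2 * f_clk

    # Example: 10 pF of switched capacitance at 3.3 V.
    for f_mhz in (1, 8, 16):
        p = dynamic_power(10e-12, 3.3, f_mhz * 1e6)
        print(f"{f_mhz:3d} MHz -> {p * 1e3:.3f} mW")

Doubling the clock doubles the power, which matches the intuition above.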


What is a microcontroller?


Inside, it is made up of many, many MOSFETs in a configuration we call CMOS.


When you change the value on the gate of a MOSFET, you are just charging or discharging a capacitor. This is a concept I have a hard time explaining to students: the transistor does a lot, but from the gate it just looks like a capacitor. This means that, in a model, a CMOS stage always drives a capacitive load.


Wikipedia has an image of a CMOS inverter that I will reference.



CMOS Inverter Schematic



The CMOS inverter has an output labeled Q. Inside a microcontroller, your output will be driving other CMOS logic gates. When your input A changes from low to high, the capacitance on Q must be discharged through the transistor on the bottom; when A goes back low, Q is charged through the transistor on top. Every time you charge or discharge that capacitance, you dissipate power. You can see this on Wikipedia under power: switching and leakage.
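A well-known result makes this concrete: charging a capacitor C to a voltage V through any resistance dissipates ½CV² in that resistance, and discharging it burns the stored ½CV² in the pull-down path, so every full output toggle costs CV² no matter how fast it happens. A small sketch, with an invented gate capacitance:

    # Sketch: energy burned per logic transition. Charging C to V
    # through any resistance dissipates C*V^2/2 in that resistance;
    # discharging burns the stored C*V^2/2. Values are illustrative.

    C_GATE = 5e-15   # ~5 fF of load on Q (made-up figure)
    V_DD   = 3.3     # supply voltage in volts

    e_charge    = 0.5 * C_GATE * V_DD**2   # lost in the PMOS pulling Q high
    e_discharge = 0.5 * C_GATE * V_DD**2   # lost in the NMOS pulling Q low

    print(f"energy per full toggle: {(e_charge + e_discharge) * 1e15:.1f} fJ")

Multiply that fixed per-toggle energy by toggles per second and you recover the linear dependence on clock frequency.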



Why does voltage have to go up?


As your supply voltage increases, it becomes easier to drive the capacitance past the logic threshold. I know this seems like a simplistic answer, but it is that simple.


When I say it is easier to drive the capacitance, I mean that it will be driven between the thresholds faster. As mazurnification put it:



With increased supply, the drive capability of the MOS transistor also increases (bigger Vgs). That means the actual R from the RC decreases, and that is why the gate is faster.
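You can capture that comment in a toy model: treat the driving transistor as a resistor whose value falls as the overdrive (Vdd − Vt) rises, then compute the RC time for the output to cross the Vdd/2 logic threshold. This is a sketch under invented constants, not a real transistor equation:

    # Toy model: why a higher supply switches a gate faster. The
    # driving transistor is modeled as a resistor that shrinks with
    # overdrive (V_DD - V_T); an RC node crosses the V_DD/2 threshold
    # after R*C*ln(2). All constants here are invented.
    import math

    C_LOAD = 10e-15   # load capacitance in farads (illustrative)
    V_T    = 0.7      # threshold voltage in volts (illustrative)
    K      = 1e-4     # transistor "strength" constant (illustrative)

    def gate_delay(v_dd):
        r_on = 1.0 / (K * (v_dd - V_T))      # crude: R falls with overdrive
        return r_on * C_LOAD * math.log(2)   # time to reach V_DD / 2

    for v in (1.8, 3.3, 5.0):
        print(f"V_DD = {v:.1f} V -> delay ~ {gate_delay(v) * 1e12:.1f} ps")

Note that the threshold itself scales with Vdd here, so the speedup comes entirely from the lower effective R.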



In relation to power consumption: because transistors are so small, there is significant leakage through the gate capacitance. Mark had a bit to add about this:



Higher voltage results in higher leakage current. In high-transistor-count devices like a modern desktop CPU, leakage current can account for the majority of power dissipation. As process size gets smaller and transistor counts rise, leakage current becomes more and more the critical power-usage statistic.
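Putting the two effects together, total power is roughly dynamic (switching) power plus static (leakage) power, and the leakage term also grows with supply voltage. The split below uses desktop-CPU-flavored numbers that are pure stand-ins:

    # Sketch: total power as dynamic + static parts. The leakage model
    # (current scaling linearly with voltage) is a crude stand-in, not
    # a real device equation; all numbers are invented.

    def total_power(c_sw, v_dd, f_clk, i_leak_ref, v_ref=1.0):
        dynamic = c_sw * v_dd**2 * f_clk              # switching power
        static  = v_dd * i_leak_ref * (v_dd / v_ref)  # crude leakage scaling
        return dynamic, static

    dyn, stat = total_power(1e-9, 1.2, 2e9, 5.0)   # stand-in CPU figures
    print(f"dynamic: {dyn:.2f} W, leakage: {stat:.2f} W")

With numbers like these, the static term dominates, which is exactly the point Mark is making.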



