I've been mulling over Ohm's law for hours today. It makes sense to me that current grows as voltage rises, since the voltage supplies more electrons and current is the number of electrons flowing. What confuses me is the idea that as resistance drops, current increases even when the voltage stays constant. For example:
$$I = \frac{V}{R}$$ $$\frac{1\,\text{V}}{0.0001\,\Omega} = 10{,}000\,\text{A} = 10{,}000{,}000\,\text{mA}!!$$
That's a big current from such a small voltage!
My mental picture of current is more electrons flowing. Electrons are supplied by the voltage, right? If we start with a small voltage, say 1 V, and keep decreasing the resistance, we get a bigger and bigger current. How can this be? Isn't current just the number of electrons flowing? With fewer electrons (because there's less voltage), how can lowering the resistance increase the current without limit?
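To make the scenario concrete, here is a minimal sketch (a Python illustration of my own, not from any reference) that just evaluates $I = V/R$ for a fixed 1 V source while the resistance shrinks. It only restates the arithmetic above: the computed current grows without bound as R approaches zero.

```python
# Ohm's law: I = V / R, evaluated for a fixed voltage and shrinking resistance.
# This only illustrates the arithmetic in the question; it says nothing about
# where the "extra" electrons come from -- that's the physics being asked about.

V = 1.0  # volts, held constant

for R in [1.0, 0.1, 0.01, 0.001, 0.0001]:  # ohms
    I = V / R  # amperes
    print(f"R = {R:>8} ohm  ->  I = {I:>10.1f} A  ({I * 1000:,.0f} mA)")
```

The last line of output is 10,000 A, i.e. the 10,000,000 mA figure worked out above.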