Posted by: uniscom July 25, 2008
A well-known formula:
P = V * I
So, if the power of an appliance is constant,
V * I = K (constant)
or, V ∝ 1/I
e.g. P = 1000 W
if V = 200 V, it will draw 5 A of current
if V = 100 V, I will be 10 A
Therefore, for a device with constant power consumption, if we decrease the voltage, it will draw more current, and ultimately the power consumption will stay the same.
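The arithmetic above can be sketched in a few lines of Python (a minimal illustration, assuming an ideal constant-power load where I = P / V holds exactly):

```python
# Current drawn by a constant-power appliance at different supply voltages.
# From P = V * I, the current is I = P / V.
P = 1000  # rated power in watts

for V in (200, 100):
    I = P / V
    print(f"V = {V} V -> I = {I} A, power = {V * I} W")
# → V = 200 V -> I = 5.0 A, power = 1000.0 W
# → V = 100 V -> I = 10.0 A, power = 1000.0 W
```

Halving the voltage doubles the current, while the product V * I stays at 1000 W, matching the example in the post.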