A $$220 \ V$$ and $$100 \ W$$ lamp is connected to $$220 \ V$$ power supply. What is the power consumed?
Correct option is A. $$100$$W
The power consumed is $$100 \ W$$.
This is because the lamp is operated at its rated voltage $$V = 220 \ V$$, so the power consumed equals its power rating, i.e. $$100 \ W$$.
Let us verify this with a calculation.
We know, power $$P=\dfrac{V^2}{R}$$
Substituting $$P=100 \ W$$ and $$V =220 \ V$$:
$$\Rightarrow R=\dfrac{(220)^2}{100}=484 \ \Omega$$
By Ohm's law,
Current in the circuit $$I=\dfrac{V}{R}$$, where $$V$$ is the supply voltage and $$R$$ is the resistance.
$$\Rightarrow I=\dfrac{220}{484} \ A$$
The power consumed $$P$$ can then be calculated as $$I^2R$$:
$$P={\left( \dfrac{220}{484}\right)}^2 \times 484=\dfrac{(220)^2}{484}=100 \ W$$
Hence, option (A) is correct.
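The arithmetic above can be checked with a short script, a minimal sketch that follows the same three steps (resistance from the rating, current from Ohm's law, then power from $$I^2R$$):

```python
# Rated values from the problem statement.
V_rated = 220.0   # rated voltage in volts
P_rated = 100.0   # rated power in watts

# Step 1: resistance from P = V^2 / R  =>  R = V^2 / P
R = V_rated**2 / P_rated          # 484 ohms

# Step 2: the supply voltage equals the rated voltage here,
# so Ohm's law gives the current drawn from the supply.
V_supply = 220.0
I = V_supply / R                  # 220/484 A, about 0.4545 A

# Step 3: power consumed, P = I^2 * R
P = I**2 * R

print(R)  # 484.0
print(P)  # ≈ 100.0
```

Since the supply voltage matches the rated voltage, the computed power comes out to the rated $$100 \ W$$, as expected.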