A beam of light is converging towards a point $$I$$ on a screen. A plane glass plate of thickness $$t$$ (measured along the beam) and refractive index $$\mu$$ is introduced in the path of the beam. The convergence point is shifted by
A
$$t\left(1+\dfrac {1}{\mu}\right)$$ away
B
$$t\left(1-\dfrac {1}{\mu}\right)$$ away
C
$$t\left(1-\dfrac {1}{\mu}\right)$$ nearer
D
$$t\left(1+\dfrac {1}{\mu}\right)$$ nearer
Correct option is B. $$t\left(1-\dfrac {1}{\mu}\right)$$ away
The normal shift produced by a slab is $$\Delta x=\left(1-\dfrac 1 \mu \right)t,$$
and the shift takes place in the direction of ray propagation. Since the rays travel towards $$I$$, the convergence point moves farther along the beam, i.e., away.
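As a quick numerical check of the formula above, here is a minimal sketch (function name and sample values are illustrative, not from the problem):

```python
# Minimal sketch: longitudinal shift of a convergence point caused by a
# plane glass slab of thickness t and refractive index mu.
# Shift = t * (1 - 1/mu), directed along the ray (away from the source).

def slab_shift(t: float, mu: float) -> float:
    """Return the normal shift t * (1 - 1/mu) of the convergence point."""
    return t * (1.0 - 1.0 / mu)

# Example (assumed values): a 3 cm thick slab with mu = 1.5
# shifts the convergence point by 1 cm away.
print(slab_shift(3.0, 1.5))  # 1.0
```

Note that the shift depends only on the slab's thickness and index, not on where the slab sits in the converging beam.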