1.

A satellite is observed from two points A and B on the Earth's surface, a distance 1.3 × 10⁷ m apart. If the angle subtended at the satellite is 1°30′, find the distance of the satellite from the Earth.

Answer:

We have,

θ = 1°30′ = (60′ + 30′) = 90′

= (90 × 60)″

= 5400″

= (5400)(4.85 × 10⁻⁶) rad [∵ 1″ = 4.85 × 10⁻⁶ rad]

and b = 1.3 × 10⁷ m.

Using the parallax relation \(D= \frac{b}{\theta}\) (valid for small θ in radians), we get

\(D = \frac{1.3\times10^7}{(5400(4.85\times10^{-6}))}\)

\(= \frac{1.3\times10^{11}}{54\times4.85}\)

= 4.96 × 10⁸ m
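The computation above can be checked with a short script. This is a minimal sketch that follows the same steps: convert the angle to arcseconds, apply the textbook's approximation 1″ ≈ 4.85 × 10⁻⁶ rad, then divide the baseline by the angle.

```python
# Parallax distance estimate: D = b / theta (theta in radians, small-angle approx.)

b = 1.3e7                      # baseline AB between observation points, in metres
theta_arcsec = 90 * 60         # 1°30' = 90' = 5400 arcseconds
theta_rad = theta_arcsec * 4.85e-6  # using the approximation 1" = 4.85e-6 rad

D = b / theta_rad              # distance to the satellite, in metres
print(f"D = {D:.2e} m")        # ≈ 4.96e8 m, matching the worked answer
```

Using the exact conversion (1″ = π/648000 rad ≈ 4.8481 × 10⁻⁶ rad) gives a slightly larger value, about 4.97 × 10⁸ m; the small difference comes entirely from rounding in the 4.85 × 10⁻⁶ factor.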


