A satellite passes across the sky, sending out radio signals at a constant but
inaccurately known frequency \(\omega_{s}\). Assume that the satellite altitude
is small compared to the earth's radius and hence that the trajectory can be
considered to be a straight line at constant altitude above a flat earth. By
beating the signals against a known standard frequency and measuring the
difference frequency, an observer on the earth can accurately measure the
received frequency, with its Doppler shift, as a function of time, with \(\cos
\theta_{s}\) in \((5.8.3)\) [or \((5.8.5)\), with \(\gamma=1\), since \(v \ll c\)]
ranging from 1 to \(-1\). Show how the observations can be made to yield a value
of \(v\), the velocity of the satellite, and a value of \(D\), the closest
distance of approach of the satellite.
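
A minimal sketch of one route to the answer, assuming \((5.8.3)\) reduces at first order in \(v/c\) to \(\omega_{r}=\omega_{s}\left(1+\frac{v}{c} \cos \theta_{s}\right)\), and choosing \(t=0\) at closest approach (the symbols \(\omega_{r}(t)\) and \(\Delta \omega\) below are introduced here for illustration):
\[
\cos \theta_{s}=\frac{-v t}{\sqrt{D^{2}+v^{2} t^{2}}}, \qquad
\omega_{r}(t)=\omega_{s}\left(1-\frac{v^{2} t / c}{\sqrt{D^{2}+v^{2} t^{2}}}\right).
\]
Three features of the measured curve \(\omega_{r}(t)\) then fix the unknowns. The frequency at the inflection point gives \(\omega_{s}\) directly, since \(\omega_{r}(0)=\omega_{s}\); the total swing between the early and late asymptotes gives \(v\), since
\[
\Delta \omega \equiv \omega_{r}(-\infty)-\omega_{r}(+\infty)=\frac{2 \omega_{s} v}{c};
\]
and the maximum slope, attained at \(t=0\), then gives \(D\), since
\[
\left.\frac{d \omega_{r}}{d t}\right|_{t=0}=-\frac{\omega_{s} v^{2}}{c D}.
\]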