A Community Devoted to the Preservation and Practice of Celestial Navigation and Other Methods of Traditional Wayfinding
From: Frank Reed
Date: 2018 Feb 26, 09:30 -0800
Antoine, I think I made my special case "too special".
If I express this in terms of amplitudes, it's easier. If a star rises 25° north of east, we say that its celestial "amplitude" is 25°. If the declination of the body doesn't change (as is the case for a true star), then its amplitude when setting will also be 25°. In other words, it sets 25° north of west. If you mark those directions on the ground, then even without knowing the specific value of the amplitude, you can determine north by splitting the difference between them. That's the standard procedure: you watch any bright star rise and note its direction. Later that night or even weeks later, you observe the same star setting and note that direction. Split the difference between the two directions and you have a north-south line.
Suppose I work this trick to find north using two stars with slightly different declinations, or suppose I use the Sun on some date well away from the solstices. The declination changes during the day, so the rise and set amplitudes won't be identical. At sunrise, the Sun's amplitude (north of east) might be 25°10'. At sunset the same day, it might be 25°30' (north of west). If I split the difference between those two directions, my north line would be offset from true north by 10', half the 20' change in amplitude. So how does this offset depend on declination and latitude?
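As a quick numerical check of the split-the-difference arithmetic, here is a short Python sketch using the hypothetical sunrise/sunset amplitudes above (25°10' and 25°30'). The bearings-true convention is my own bookkeeping choice, not anything from the procedure itself:

```python
# Sunrise bearing: 90° (due east) minus the amplitude north of east.
rise_az = 90.0 - (25 + 10/60)    # 64°50' true

# Sunset bearing: 270° (due west) plus the amplitude north of west.
set_az = 270.0 + (25 + 30/60)    # 295°30' true

# Split the difference: the bisector of the two marked directions.
mid = (rise_az + set_az) / 2.0

# Departure of that bisector from due south (180°), in arcminutes.
offset_arcmin = (mid - 180.0) * 60.0
print(offset_arcmin)
```

The bisector lands 10' east of due south, so the resulting "north" line is offset by 10', as claimed.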
The amplitude of any object, when its true altitude is zero, can be calculated from
sin(Amp) = sin(Dec) / cos(Lat).
If I differentiate at constant latitude, that implies
cos(Amp)·dAmp = cos(Dec)·dDec / cos(Lat)
dAmp = dDec · cos(Dec) / (cos(Lat) · cos(Amp)).
There might be a more convenient form, but that's good enough for now.
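A minimal sketch of both formulas in Python, with illustrative values of my own choosing (Lat 40°N, Dec 10°N, and declination increasing 20' over the day, roughly the Sun near an equinox). It checks the differential estimate against a direct difference of two amplitude calculations:

```python
from math import sin, cos, asin, radians, degrees

def amplitude(dec_deg, lat_deg):
    """Rising/setting amplitude in degrees, from sin(Amp) = sin(Dec) / cos(Lat)."""
    return degrees(asin(sin(radians(dec_deg)) / cos(radians(lat_deg))))

def damp_per_ddec(dec_deg, lat_deg):
    """Sensitivity dAmp/dDec = cos(Dec) / (cos(Lat) * cos(Amp))."""
    amp = amplitude(dec_deg, lat_deg)
    return cos(radians(dec_deg)) / (cos(radians(lat_deg)) * cos(radians(amp)))

lat, dec = 40.0, 10.0
ddec = 20.0 / 60.0  # 20' of declination change between rise and set

exact = amplitude(dec + ddec, lat) - amplitude(dec, lat)  # direct difference
approx = damp_per_ddec(dec, lat) * ddec                   # differential estimate

# The north-line offset is half the amplitude change.
offset_arcmin = approx / 2.0 * 60.0
print(exact, approx, offset_arcmin)
```

With these numbers the amplitude changes by about 26', so the split-the-difference north line would be off by about 13'; the factor 1/(cos(Lat)·cos(Amp)) shows the error growing in higher latitudes.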
Yes, I think I answered my own question by writing up a clearer statement of what the question was in the first place. Oh well.