|If I have a point on a map, defined by the latitude and the longitude, and I want to be able to determine the latitude and longitude at any point on a circle with a radius of 10 nautical miles from that point, what is the equation that gives me the point?
|Question Date: 2001-04-13|
I'm not sure if this is exactly what you are after, but since nobody appears to have given you an answer yet, I thought I'd try and take a stab at this one. This isn't a "pretty" analytical expression, but at least should be able to give you something. For this, I'll make the approximation that the earth is spherical. Now, if we adopt the spherical coordinate system, then latitude and longitude are described by our two polar angles (we'll call them j and k). Now, we can find the shortest distance between any two points on a sphere described by (R, j, k), where R is the radius of the sphere (in this case the earth) by finding their separation on a "great circle." The shortest distance between two points on a sphere (R, j1, k1) and (R, j2, k2) is thus given by
d = R*arccos[sin(j1)*sin(j2) + cos(j1)*cos(j2)*cos(k1-k2)]
By picking any distance d (you had asked about 10 nautical miles) and any reference point (R, j1, k1), we are left with two unknowns, j2 and k2. At this point, one can (numerically) solve this equation for the pairs (j2, k2) that satisfy it. Since for any given latitude there are two values of longitude (and vice versa), you need to make sure you find all the solutions when solving. This can be done by bounding the search according to the maximum displacements in latitudinal or longitudinal position.
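The search described above can be sketched in a short script. This is a minimal sketch, not the original answerer's code: the function name `circle_points` and the mean earth radius of 3443.9 nautical miles are my assumptions. For each candidate latitude j2 within the angular radius of the circle, the distance equation is solved for cos(k1 - k2), which yields the two longitudes (east and west of the center) mentioned in the answer.

```python
import math

R_NM = 3443.9  # assumed mean earth radius in nautical miles

def circle_points(lat1_deg, lon1_deg, d_nm, n=360):
    """Points whose great-circle distance from (lat1, lon1) is d_nm.

    Solves d = R*arccos[sin(j1)sin(j2) + cos(j1)cos(j2)cos(k1-k2)]
    for k2 over a range of candidate latitudes j2.
    """
    j1 = math.radians(lat1_deg)
    k1 = math.radians(lon1_deg)
    delta = d_nm / R_NM  # angular radius of the circle
    pts = []
    # latitudes on the circle lie within +/- delta of the center latitude
    for i in range(n + 1):
        j2 = j1 - delta + 2 * delta * i / n
        # rearrange the distance equation for cos(k1 - k2)
        c = (math.cos(delta) - math.sin(j1) * math.sin(j2)) / (
            math.cos(j1) * math.cos(j2))
        if abs(c) > 1:
            continue  # no real solution at this latitude (rounding at the edges)
        dk = math.acos(c)
        # two longitudes for each latitude: east and west of the center
        pts.append((math.degrees(j2), math.degrees(k1 + dk)))
        pts.append((math.degrees(j2), math.degrees(k1 - dk)))
    return pts
```

Every returned point satisfies the distance equation exactly (up to floating-point rounding), which can be checked by plugging each point back into the formula above.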
The short answer is that the amount by which the circles of latitude shrink over a distance of 10 miles is, away from the poles, so small that one does not have to worry about it.
In general this distance is going to be small unless one is flying north or south over long distances.
The long answer is that one should slice the earth perpendicular to the line that runs from the south pole to the north pole. Elementary trigonometry shows that the radii of these slices shrink according to the formula,
radius of slice = radius of earth times cosine of the latitude.
The length of the circumference of the slice is then 2 pi times the radius of the slice.
The radius of the earth is 3963 miles, and the difference in the angle of latitude for a distance of ten miles will be: 360 degrees times 10 miles divided by (2 times pi times 3963 miles), which is about 0.14 degrees. This is a very small angle, so its cosine will be close to one. This means that the radii of two slices at latitudes 10 miles apart will be very close. To see how close, one computes the ratio of the cosines of the two latitudes.
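The arithmetic above is quick to check numerically. This sketch uses the 3963-mile radius from the answer and picks 34 degrees as an example latitude (that choice is mine, not the answerer's):

```python
import math

R_MILES = 3963.0  # earth radius in statute miles, as given in the answer

# difference in latitude angle for a 10-mile north-south displacement
dlat_deg = 360.0 * 10.0 / (2.0 * math.pi * R_MILES)  # about 0.145 degrees

# ratio of slice radii (ratio of cosines) at an example latitude of 34 degrees
lat = math.radians(34.0)
ratio = math.cos(lat + math.radians(dlat_deg)) / math.cos(lat)
```

The ratio comes out within a fraction of a percent of one, confirming that the shrinkage over 10 miles is negligible away from the poles.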
Actually, because the earth is not a perfect sphere, the distance between lines of latitude does change over the earth (smallest at the equator and largest at the poles), but the difference between the shortest and longest degrees of latitude varies by less than a standard mile (~0.7 miles). The distance between lines of longitude can vary from 0 to 69.17 standard miles (or 0 to 60 nautical miles).
The equations you are looking for do exist, obviously, and are used all the time. For example, the authors of the "distance calculator" at http://www.indo.com/distance/ have used a set of equations to solve a similar problem: figuring out the distance between two geographic points, given any two locations on the globe. [Note that to use the "distance calculator", you must be careful to use the right lat/lon notation (e.g., for 40 degrees 26 minutes 26 seconds north latitude, enter 40:26:26N).]
For your problem, you would need to manipulate the equation used on the web site to figure out a geographic point, given a fixed distance (10 nautical miles) and an initial location. As you suggest, the hard part is figuring out the distance between any two lines of longitude at a given latitude. An estimate can be calculated using the following equation:
y = (-0.0071 * x^2) - (0.1539 * x) + 70.295,
where "y" is the distance in standard miles spanned by one degree of longitude at latitude "x" (in degrees). Knowing that a degree of latitude is roughly 69 standard miles anywhere on the globe, you could use Pythagoras' theorem and the above equation to figure out a geographic location that is 10 nautical miles (11.5 standard miles) from a given location.
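The conversion just described can be sketched as follows. The helper names `miles_per_deg_lon` and `offset_point` are hypothetical (mine, not from the answer); the quadratic fit and the 69-miles-per-degree figure are the ones given above:

```python
def miles_per_deg_lon(lat_deg):
    """Approximate miles spanned by one degree of longitude (quadratic fit from the answer)."""
    x = lat_deg
    return -0.0071 * x**2 - 0.1539 * x + 70.295

MILES_PER_DEG_LAT = 69.0  # roughly constant anywhere on the globe

def offset_point(lat_deg, lon_deg, north_miles, east_miles):
    """Hypothetical helper: shift a point by small north/east displacements in miles."""
    return (lat_deg + north_miles / MILES_PER_DEG_LAT,
            lon_deg + east_miles / miles_per_deg_lon(lat_deg))
```

Pythagoras enters when the 11.5-mile displacement is split into north and east components: for a bearing theta, the north component is 11.5*cos(theta) and the east component is 11.5*sin(theta), and the two components recombine to 11.5 miles by the theorem.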
One could easily write a MatLab script that would step through this equation repeatedly to generate a SERIES of points exactly 10 nautical miles from a specified starting point (you could even set up your script so that it would prompt the user to enter the starting point). Given enough points, this series would trace out a circle. The MatLab script could even calculate the equation for the circle and plot it on a map, together with the starting location.
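In Python rather than MatLab, such a script might look like the sketch below. The function name `circle_of_points` and the choice of 36 bearings are my assumptions; the constants are the ones from the answer above:

```python
import math

def circle_of_points(lat0, lon0, d_miles=11.5, n=36):
    """Step through n bearings, producing points ~d_miles from (lat0, lon0).

    Uses the flat-earth approximation described above: 69 miles per degree
    of latitude and the quadratic fit for miles per degree of longitude.
    """
    miles_per_lon = -0.0071 * lat0**2 - 0.1539 * lat0 + 70.295
    pts = []
    for i in range(n):
        theta = 2 * math.pi * i / n  # bearing, measured from north
        pts.append((lat0 + d_miles * math.cos(theta) / 69.0,
                    lon0 + d_miles * math.sin(theta) / miles_per_lon))
    return pts
```

Plotting the returned latitude/longitude pairs would trace out the circle described in the answer; a plotting library could then overlay it on a map with the starting location.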
Computers can do this easily enough, but in the old days when sailors had to make such calculations rapidly, they depended on charts with this information, covering every area of the ocean.
All this is more than I could easily do myself, but it is a suggestion for someone with the time and/or MatLab experience.
Copyright © 2020 The Regents of the University of California,
All Rights Reserved.