Arduino

Fixed azimuth: north is 0°.
Edgar Bonet

The computation is not difficult. Since your drone is unlikely to travel thousands of kilometers, you can assume the Earth is locally flat and treat latitude and longitude as an orthogonal coordinate system. First, compute the vector to the next waypoint in Cartesian coordinates (dx, dy, dz):

// Defined at global scope.
const float radians_per_degree = M_PI / 180;
const float degrees_per_radians = 180 / M_PI;
const float meters_per_degree = 1e7 / 90;  // 10,000 km from equator to pole
// For every reading of the GPS:
float dx = (waypoint.longitude - current.longitude) * meters_per_degree
 * cos(current.latitude * radians_per_degree);
float dy = (waypoint.latitude - current.latitude) * meters_per_degree;
float dz = waypoint.height - current.height;

Then, from this vector you can get the required angles:

float azimuth = atan2(dx, dy) * degrees_per_radians;  // north = 0°, east = 90°
float pitch_angle = atan(dz/sqrt(dx*dx + dy*dy)) * degrees_per_radians;

Note that the approximation breaks down if the drone has to travel for a distance that is a significant fraction of the distance to the closest pole. Note also that the approximation errors are inconsequential if you periodically update your estimate of azimuth and pitch angle: the errors will be corrected along the way as the drone gets closer and closer to the waypoint.

Edit: timemage posted a very interesting comment, and I would like to expand on the precision issue. AVR floating-point support is indeed limited to single precision. For latitudes and longitudes in the range shown in the question, the numerical resolution (formally, the unit in the last place) is (3.8×10−6)° for the latitude and (7.6×10−6)° for the longitude. This translates to about 42 cm and 66 cm respectively on the surface of the Earth. That may be just good enough to hit the waypoint to within 4 m.

You could increase the resolution by storing the latitude and longitude in microdegrees, as 32-bit integers. This should be easy if the GPS always gives 6 digits after the decimal point: just ignore that decimal point. Then, once the quantities waypoint.longitude - current.longitude and waypoint.latitude - current.latitude are computed, you can safely use floating point for the rest of the computations.
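A minimal sketch of that idea (the field and function names are made up for illustration): store the coordinates as int32_t microdegrees, do the subtractions in exact integer arithmetic, and only then move to float:

```cpp
#include <stdint.h>
#include <math.h>

const float radians_per_degree = M_PI / 180;
// One microdegree of latitude, in meters: (1e7 / 90) * 1e-6.
const float meters_per_microdegree = 1e7 / 90 * 1e-6;

// Hypothetical fix type: coordinates in microdegrees, height in meters.
struct Fix {
    int32_t latitude_udeg;
    int32_t longitude_udeg;
    float height;
};

// Exact integer subtraction first; the small differences then convert
// to float without losing the microdegree resolution.
void waypoint_vector(const Fix &current, const Fix &waypoint,
                     float &dx, float &dy, float &dz) {
    int32_t dlon = waypoint.longitude_udeg - current.longitude_udeg;
    int32_t dlat = waypoint.latitude_udeg - current.latitude_udeg;
    dx = dlon * meters_per_microdegree
         * cos(current.latitude_udeg * 1e-6 * radians_per_degree);
    dy = dlat * meters_per_microdegree;
    dz = waypoint.height - current.height;
}
```

The angle computations then proceed exactly as before on dx, dy and dz.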
